---
title: Directory synchronization with Azure Active Directory
description: Architecture guidance for achieving directory synchronization with Azure Active Directory.
services: active-directory
author: BarbaraSelden
manager: daveba
ms.service: active-directory
ms.workload: identity
ms.subservice: fundamentals
ms.topic: conceptual
ms.date: 10/10/2020
ms.author: baselden
ms.reviewer: ajburnle
ms.custom: it-pro, seodec18
ms.collection: M365-identity-device-management
ms.openlocfilehash: 748f91b2fe77667969e9736f8084a9dd24018425
ms.sourcegitcommit: d22a86a1329be8fd1913ce4d1bfbd2a125b2bcae
ms.translationtype: MT
ms.contentlocale: zh-TW
ms.lasthandoff: 11/26/2020
ms.locfileid: "96172464"
---
# <a name="directory-synchronization"></a>Directory synchronization
Many organizations have a hybrid infrastructure that includes both on-premises and cloud components. Synchronizing users' identities between local and cloud directories lets users access resources with a single set of credentials.
Synchronization is the process of:
* creating objects based on specific conditions,
* keeping those objects up to date, and
* removing objects when the conditions are no longer met.
On-premises provisioning involves provisioning from an on-premises source (such as Active Directory) to Azure Active Directory (Azure AD).
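The bullets above describe a generic reconciliation cycle. The following is only a conceptual sketch of that create/update/remove loop in plain Python; it is not Azure AD Connect code, and `source`, `target`, and `in_scope` are hypothetical names standing in for the connected directories and sync rules.

```python
# Hypothetical directory snapshots keyed by user ID.
source = {
    "alice": {"displayName": "Alice", "department": "Sales"},
    "bob": {"displayName": "Bob", "department": "HR"},
}
target = {
    "bob": {"displayName": "Bobby", "department": "HR"},
    "carol": {"displayName": "Carol", "department": "IT"},  # no longer present in source
}

def in_scope(attrs):
    # Stand-in for the conditions that decide whether an object is synchronized.
    return True

def reconcile(source, target):
    """Return the create/update/remove actions one sync cycle would apply."""
    actions = []
    for uid, attrs in source.items():
        if not in_scope(attrs):
            continue
        if uid not in target:
            actions.append(("create", uid))   # object meets the conditions, not yet in target
        elif target[uid] != attrs:
            actions.append(("update", uid))   # keep the object up to date
    for uid in target:
        if uid not in source or not in_scope(source[uid]):
            actions.append(("remove", uid))   # conditions no longer met
    return actions

print(reconcile(source, target))
# -> [('create', 'alice'), ('update', 'bob'), ('remove', 'carol')]
```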
## <a name="use-when"></a>Use when
You need to synchronize identity data from your on-premises Active Directory environment to Azure AD.

## <a name="components-of-system"></a>Components of the system
* **User**: Uses Azure AD to access an application.
* **Web browser**: The component the user interacts with to reach the application's external URL.
* **Application**: A web application that relies on Azure AD for authentication and authorization.
* **Azure AD**: Synchronizes identity information from the organization's on-premises directory via Azure AD Connect.
* **Azure AD Connect**: A tool for connecting on-premises identity infrastructure to Microsoft Azure AD. The wizard and guided experiences help you deploy and configure the prerequisites and components required for the connection, including synchronization and sign-in from Active Directory to Azure AD.
* **Active Directory**: Active Directory is a directory service included with most Windows Server operating systems. Servers that run Active Directory Domain Services (AD DS) are called domain controllers. They authenticate and authorize all users and computers in the domain.
## <a name="implement-directory-synchronization-with-azure-ad"></a>Implement directory synchronization with Azure AD
* [What is provisioning?](../cloud-provisioning/what-is-provisioning.md)
* [Hybrid identity directory integration tools comparison](../hybrid/plan-hybrid-identity-design-considerations-tools-comparison.md)
* [Azure AD Connect installation roadmap](../hybrid/how-to-connect-install-roadmap.md) |
module Schools
class Attendance
include ActiveModel::Model
attr_accessor :bookings, :bookings_params, :updated_bookings
def initialize(bookings:, bookings_params:)
self.bookings = bookings
self.bookings_params = bookings_params
end
def save
@updated_bookings = []
bookings_params.each do |booking_id, attended|
fetch(booking_id).tap do |booking|
begin
booking.attended = ActiveModel::Type::Boolean.new.cast(attended)
booking.save!(context: :attendance)
@updated_bookings << booking.id
rescue ActiveRecord::RecordInvalid => e
errors.add :bookings_params,
"Unable to set attendance for #{booking.date.to_formatted_s(:govuk)}"
update_error e
end
end
end
errors.empty?
end
def update_gitis
bookings_params.slice(*updated_bookings).each do |booking_id, _attended|
fetch(booking_id).tap do |booking|
Bookings::Gitis::EventLogger.write_later \
booking.contact_uuid, :attendance, booking
end
end
end
private
def indexed_bookings
@indexed_bookings ||= self.bookings.index_by(&:id)
end
def fetch(id)
indexed_bookings.fetch(id)
end
def update_error(exception)
ExceptionNotifier.notify_exception(exception)
Raven.capture_exception(exception)
end
end
end
|
class SchoolAdminMailer < SchoolMailer
# @param school_admin [SchoolAdmin] Existing school admin
# @param new_school_admin [SchoolAdmin] Newly created school admin
def school_admin_added(school_admin, new_school_admin)
@school_admin = school_admin
@new_school_admin = new_school_admin
@school = school_admin.user.school
simple_roadie_mail(school_admin.email, "New School Admin Added")
end
end
|
require "akchabar/version"
require 'net/http'
require 'json'
require 'bigdecimal'
module Akchabar
def self.rates
JSON.parse(Net::HTTP.get(URI("http://rates.akchabar.kg/get.json")))
end
def self.btc_rate
BigDecimal(self.rates["rates"]["btc"])
end
end
|
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.SetDebugField(System.String,System.String)
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.#ctor
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.ErrorType
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.RelatedPart
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.Node
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.RelatedNode
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.Id
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.Path
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.Part
ms.author: "soliver"
manager: "soliver"
---
---
uid: DocumentFormat.OpenXml.Validation.ValidationErrorInfo.Description
ms.author: "soliver"
manager: "soliver"
---
|
extern crate byteorder;
extern crate rand;
extern crate crypto;
extern crate num_bigint;
#[macro_use]
extern crate log;
#[cfg(target_os = "redox")]
extern crate syscall;
#[cfg(not(target_os = "redox"))]
extern crate libc;
mod error;
mod algorithm;
mod packet;
mod message;
mod connection;
mod key_exchange;
mod encryption;
mod mac;
mod channel;
pub mod public_key;
pub mod server;
#[cfg(target_os = "redox")]
#[path = "sys/redox.rs"]
pub mod sys;
#[cfg(not(target_os = "redox"))]
#[path = "sys/unix.rs"]
pub mod sys;
pub use self::server::{Server, ServerConfig};
|
//------------------------------------------------------------------------------
// <copyright file="MainWindow.xaml.cs" company="Microsoft">
// Copyright (c) Microsoft Corporation. All rights reserved.
// </copyright>
//------------------------------------------------------------------------------
namespace Microsoft.Samples.Kinect.BodyIndexBasics
{
using System;
using System.ComponentModel;
using System.Diagnostics;
using System.Globalization;
using System.IO;
using System.Runtime.InteropServices;
using System.Windows;
using System.Windows.Media;
using System.Windows.Media.Imaging;
using Microsoft.Kinect;
/// <summary>
/// Interaction logic for the MainWindow
/// </summary>
public partial class MainWindow : Window, INotifyPropertyChanged
{
/// <summary>
/// Size of the RGB pixel in the bitmap
/// </summary>
private const int BytesPerPixel = 4;
/// <summary>
/// Collection of colors to be used to display the BodyIndexFrame data.
/// </summary>
private static readonly uint[] BodyColor =
{
0x0000FF00,
0x00FF0000,
0xFFFF4000,
0x40FFFF00,
0xFF40FF00,
0xFF808000,
};
/// <summary>
/// Active Kinect sensor
/// </summary>
private KinectSensor kinectSensor = null;
/// <summary>
/// Reader for body index frames
/// </summary>
private BodyIndexFrameReader bodyIndexFrameReader = null;
/// <summary>
/// Description of the data contained in the body index frame
/// </summary>
private FrameDescription bodyIndexFrameDescription = null;
/// <summary>
/// Bitmap to display
/// </summary>
private WriteableBitmap bodyIndexBitmap = null;
/// <summary>
/// Intermediate storage for frame data converted to color
/// </summary>
private uint[] bodyIndexPixels = null;
/// <summary>
/// Current status text to display
/// </summary>
private string statusText = null;
/// <summary>
/// Initializes a new instance of the MainWindow class.
/// </summary>
public MainWindow()
{
// get the kinectSensor object
this.kinectSensor = KinectSensor.GetDefault();
// open the reader for the depth frames
this.bodyIndexFrameReader = this.kinectSensor.BodyIndexFrameSource.OpenReader();
// wire handler for frame arrival
this.bodyIndexFrameReader.FrameArrived += this.Reader_FrameArrived;
this.bodyIndexFrameDescription = this.kinectSensor.BodyIndexFrameSource.FrameDescription;
// allocate space to put the pixels being converted
this.bodyIndexPixels = new uint[this.bodyIndexFrameDescription.Width * this.bodyIndexFrameDescription.Height];
// create the bitmap to display
this.bodyIndexBitmap = new WriteableBitmap(this.bodyIndexFrameDescription.Width, this.bodyIndexFrameDescription.Height, 96.0, 96.0, PixelFormats.Bgr32, null);
// set IsAvailableChanged event notifier
this.kinectSensor.IsAvailableChanged += this.Sensor_IsAvailableChanged;
// open the sensor
this.kinectSensor.Open();
// set the status text
this.StatusText = this.kinectSensor.IsAvailable ? Properties.Resources.RunningStatusText
: Properties.Resources.NoSensorStatusText;
// use the window object as the view model in this simple example
this.DataContext = this;
// initialize the components (controls) of the window
this.InitializeComponent();
}
/// <summary>
/// INotifyPropertyChanged PropertyChanged event to allow window controls to bind to changeable data
/// </summary>
public event PropertyChangedEventHandler PropertyChanged;
/// <summary>
/// Gets the bitmap to display
/// </summary>
public ImageSource ImageSource
{
get
{
return this.bodyIndexBitmap;
}
}
/// <summary>
/// Gets or sets the current status text to display
/// </summary>
public string StatusText
{
get
{
return this.statusText;
}
set
{
if (this.statusText != value)
{
this.statusText = value;
// notify any bound elements that the text has changed
if (this.PropertyChanged != null)
{
this.PropertyChanged(this, new PropertyChangedEventArgs("StatusText"));
}
}
}
}
/// <summary>
/// Execute shutdown tasks
/// </summary>
/// <param name="sender">object sending the event</param>
/// <param name="e">event arguments</param>
private void MainWindow_Closing(object sender, System.ComponentModel.CancelEventArgs e)
{
if (this.bodyIndexFrameReader != null)
{
// remove the event handler
this.bodyIndexFrameReader.FrameArrived -= this.Reader_FrameArrived;
// BodyIndexFrameReader is IDisposable
this.bodyIndexFrameReader.Dispose();
this.bodyIndexFrameReader = null;
}
if (this.kinectSensor != null)
{
this.kinectSensor.Close();
this.kinectSensor = null;
}
}
/// <summary>
/// Handles the user clicking on the screenshot button
/// </summary>
/// <param name="sender">object sending the event</param>
/// <param name="e">event arguments</param>
private void ScreenshotButton_Click(object sender, RoutedEventArgs e)
{
if (this.bodyIndexBitmap != null)
{
// create a png bitmap encoder which knows how to save a .png file
BitmapEncoder encoder = new PngBitmapEncoder();
// create frame from the writable bitmap and add to encoder
encoder.Frames.Add(BitmapFrame.Create(this.bodyIndexBitmap));
string time = System.DateTime.UtcNow.ToString("hh'-'mm'-'ss", CultureInfo.CurrentUICulture.DateTimeFormat);
string myPhotos = Environment.GetFolderPath(Environment.SpecialFolder.MyPictures);
string path = Path.Combine(myPhotos, "KinectScreenshot-BodyIndex-" + time + ".png");
// write the new file to disk
try
{
// FileStream is IDisposable
using (FileStream fs = new FileStream(path, FileMode.Create))
{
encoder.Save(fs);
}
this.StatusText = string.Format(CultureInfo.CurrentCulture, Properties.Resources.SavedScreenshotStatusTextFormat, path);
}
catch (IOException)
{
this.StatusText = string.Format(CultureInfo.CurrentCulture, Properties.Resources.FailedScreenshotStatusTextFormat, path);
}
}
}
/// <summary>
/// Handles the body index frame data arriving from the sensor
/// </summary>
/// <param name="sender">object sending the event</param>
/// <param name="e">event arguments</param>
private void Reader_FrameArrived(object sender, BodyIndexFrameArrivedEventArgs e)
{
bool bodyIndexFrameProcessed = false;
using (BodyIndexFrame bodyIndexFrame = e.FrameReference.AcquireFrame())
{
if (bodyIndexFrame != null)
{
// the fastest way to process the body index data is to directly access
// the underlying buffer
using (Microsoft.Kinect.KinectBuffer bodyIndexBuffer = bodyIndexFrame.LockImageBuffer())
{
// verify data and write the color data to the display bitmap
if (((this.bodyIndexFrameDescription.Width * this.bodyIndexFrameDescription.Height) == bodyIndexBuffer.Size) &&
(this.bodyIndexFrameDescription.Width == this.bodyIndexBitmap.PixelWidth) && (this.bodyIndexFrameDescription.Height == this.bodyIndexBitmap.PixelHeight))
{
this.ProcessBodyIndexFrameData(bodyIndexBuffer.UnderlyingBuffer, bodyIndexBuffer.Size);
bodyIndexFrameProcessed = true;
}
}
}
}
if (bodyIndexFrameProcessed)
{
this.RenderBodyIndexPixels();
}
}
/// <summary>
/// Directly accesses the underlying image buffer of the BodyIndexFrame to
/// create a displayable bitmap.
/// This function requires the /unsafe compiler option as we make use of direct
/// access to the native memory pointed to by the bodyIndexFrameData pointer.
/// </summary>
/// <param name="bodyIndexFrameData">Pointer to the BodyIndexFrame image data</param>
/// <param name="bodyIndexFrameDataSize">Size of the BodyIndexFrame image data</param>
private unsafe void ProcessBodyIndexFrameData(IntPtr bodyIndexFrameData, uint bodyIndexFrameDataSize)
{
byte* frameData = (byte*)bodyIndexFrameData;
// convert body index to a visual representation
for (int i = 0; i < (int)bodyIndexFrameDataSize; ++i)
{
// the BodyColor array has been sized to match
// BodyFrameSource.BodyCount
if (frameData[i] < BodyColor.Length)
{
// this pixel is part of a player,
// display the appropriate color
this.bodyIndexPixels[i] = BodyColor[frameData[i]];
}
else
{
// this pixel is not part of a player
// display black
this.bodyIndexPixels[i] = 0x00000000;
}
}
}
/// <summary>
/// Renders color pixels into the writeableBitmap.
/// </summary>
private void RenderBodyIndexPixels()
{
this.bodyIndexBitmap.WritePixels(
new Int32Rect(0, 0, this.bodyIndexBitmap.PixelWidth, this.bodyIndexBitmap.PixelHeight),
this.bodyIndexPixels,
this.bodyIndexBitmap.PixelWidth * (int)BytesPerPixel,
0);
}
/// <summary>
/// Handles the event raised when the sensor becomes unavailable (e.g. paused, closed, unplugged).
/// </summary>
/// <param name="sender">object sending the event</param>
/// <param name="e">event arguments</param>
private void Sensor_IsAvailableChanged(object sender, IsAvailableChangedEventArgs e)
{
// on failure, set the status text
this.StatusText = this.kinectSensor.IsAvailable ? Properties.Resources.RunningStatusText
: Properties.Resources.SensorNotAvailableStatusText;
}
}
}
|
package gamemsg
import (
"LollipopGo2.8x/conf/g"
. "LollipopGo2.8x/proto/sr_proto"
"LollipopGo2.8x/tables"
. "github.com/Golangltd/Twlib/proto"
. "github.com/Golangltd/Twlib/user"
)
/*
Message for the student-record (SR) interface
*/
type SRInterfaceMsg struct {
Protocol int //main protocol
Protocol2 int //sub-protocol
UserName string //user name
StudentID int64 //student ID
GradeInfo *GradeMsg //grade info
Association string //association
BattlePower int //battle power
Colleges []*CollegeData //college list
}
func NewSRInterfaceMsg() *SRInterfaceMsg {
m := &SRInterfaceMsg{}
m.Protocol = GGameBattleProto
m.Protocol2 = S2CSRInterfaceMsgProto2
return m
}
//College data sent to the client
type CollegeData struct {
ID int //college ID
CollegeType int //college type
Level int //college level
}
func newCollegeData(info *CollegeInfo) *CollegeData {
m := &CollegeData{}
m.ID = info.CollegeID //college ID
m.Level = tables.CollegeTable[m.ID].CollegeLevel //college level
m.CollegeType = tables.CollegeTable[m.ID].CollegeType //college type
return m
}
func (m *SRInterfaceMsg) InitFieldsData(st *UserSt) {
m.UserName = st.RoleName
m.StudentID = st.RoleUid
m.GradeInfo = newGradeMsg(st) //build grade info
m.Association = st.Association
m.BattlePower = st.TotalPower
m.InitColleges(st.CollegesInfo)
}
//Initialize the college list
func (m *SRInterfaceMsg) InitColleges(collegesInfo map[g.CollegeID]*CollegeInfo) {
m.Colleges = make([]*CollegeData, 0)
for _, c := range collegesInfo {
m.Colleges = append(m.Colleges, newCollegeData(c))
}
m.sortColleges()
}
//Sort the college list by ID (simple insertion sort)
func (m *SRInterfaceMsg) sortColleges() {
if len(m.Colleges) <= 1 {
return
}
for i := 0; i < len(m.Colleges)-1; i++ {
for j := i + 1; j > 0; j-- {
if m.Colleges[j].ID < m.Colleges[i].ID {
m.Colleges[j], m.Colleges[i] = m.Colleges[i], m.Colleges[j]
} else {
break
}
}
}
}
|
module Atom
class BufProcess
include Native
def initialize(command, args, stdout, exitcb)
options = `{ command: #{command}, args: #{args}, stdout: #{stdout}, exit: #{exitcb} }`
other_atom = `require("atom")`
super(`new #{other_atom}.BufferedProcess(#{options})`)
end
alias_native :kill
end
end
|
// Given an array of integers out of order, determine the bounds of the smallest window that must be sorted
// in order for the entire array to be sorted.
// For example, given [3, 7, 5, 6, 9], you should return (1, 3).
const sortingWindow = arr => {
  let startIndex = 0
  let endIndex = 0

  // Left-to-right pass: an element smaller than the running maximum is out
  // of order, so the window must extend at least to that index.
  let runningMax = arr[0]
  for (let i = 1; i < arr.length; i++) {
    if (arr[i] < runningMax) {
      endIndex = i
    } else {
      runningMax = arr[i]
    }
  }

  // Right-to-left pass: an element larger than the running minimum is out
  // of order, so the window must start at or before that index.
  let runningMin = arr[arr.length - 1]
  for (let i = arr.length - 2; i >= 0; i--) {
    if (arr[i] > runningMin) {
      startIndex = i
    } else {
      runningMin = arr[i]
    }
  }

  return [startIndex, endIndex]
}
console.log(sortingWindow([3, 7, 5, 6, 9])) |
/*!
* It's me!
* https://github.com/mugetsu/its-me
* based on my gist (https://gist.github.com/mugetsu/5dcd567208b38d15e69b)
* @author Randell Quitain (https://github.com/mugetsu)
* @version 0.0.1
* Copyright 2015. MIT licensed.
*/
(function($){
var children = 5,
saveThem;
function eightyThreeOrEightySeven(min, max) {
return Math.random() * (max - min) + min;
}
setInterval(function() {
var incidents = eightyThreeOrEightySeven(19,87),
springsuits = 2;
saveThem = $("<div class='its-me animated fadeInDown' style='left:"
+ eightyThreeOrEightySeven(0,screen.width) + "px; top:"
+ eightyThreeOrEightySeven(0,screen.height) + "px; font-size:"
+ incidents + "px; line-height:"
+ (incidents - springsuits) + "px;'>it's me</div>");
$('body').append(saveThem);
var youCant = saveThem;
var birthday = setTimeout(function() {
$(youCant).removeClass('fadeInDown').addClass('fadeOut');
var failure = setTimeout(function() {
$(youCant).remove();
clearTimeout(failure);
}, 1982);
clearTimeout(birthday);
}, 1983);
}, 1987);
})(jQuery); |
#include "Value.hxx"
#include "Array.hxx"
#include "Object.hxx"
namespace GDN {
void Value::accept(Visitor & visitor) {
visitor.visitValue(*this);
}
// constructors
Value::Value(Object val){
this->value = std::make_shared<Object>(val);
}
Value::Value(Array val){
this->value = std::make_shared<Array>(val);
}
Value::Value(long val, IntFormat format){
this->value = IntPack{val, format};
}
Value::Value(double val){
this->value = val;
}
Value::Value(bool val){
this->value = val;
}
Value::Value(std::string val){
this->value = val;
}
Value::Value(const char * val) : Value(std::string(val)){
// only delegating
}
Value::Value(){
this->value = Nil();
}
// read functions
std::optional<double> Value::getFloat() const{
if(auto val = std::get_if<double>(&this->value)){
return *val;
}
//automatic conversion from int(long) to float(double)
//every int can also be read as float
else if (auto val = this->getInt()){
return static_cast<double>(val->val);
}
else{
return std::optional<double>();
}
}
std::optional<Value::IntPack> Value::getInt() const{
if(auto val = std::get_if<IntPack>(&this->value)){
return *val;
}
else{
return std::optional<IntPack>();
}
}
std::optional<std::string> Value::getString() const{
if(auto val = std::get_if<std::string>(&this->value)){
return *val;
}
else{
return std::optional<std::string>();
}
}
std::optional<bool> Value::getBool() const {
if(auto val = std::get_if<bool>(&this->value)){
return *val;
}
else{
return std::optional<bool>();
}
}
std::optional<Value::Nil> Value::getNull() const{
if(auto val = std::get_if<Nil>(&this->value)){
return *val;
}
else{
return std::optional<Nil>();
}
}
const Object * Value::getObject()const{
if(auto val = std::get_if<std::shared_ptr<Object>>(&this->value)){
return val->get();
}
else{
return nullptr;
}
}
const Array * Value::getArray()const{
if(auto val = std::get_if<std::shared_ptr<Array>>(&this->value)){
return val->get();
}
else{
return nullptr;
}
}
Object * Value::getObject() {
if(auto val = std::get_if<std::shared_ptr<Object>>(&this->value)){
return val->get();
}
else{
return nullptr;
}
}
Array * Value::getArray() {
if(auto val = std::get_if<std::shared_ptr<Array>>(&this->value)){
return val->get();
}
else{
return nullptr;
}
}
} |
package com.example.arca
import android.content.Intent
import android.content.res.Configuration
import android.support.v7.app.AppCompatActivity
import android.os.Bundle
import android.widget.Button
import android.widget.ImageView
import android.widget.Toast
import com.synnapps.carouselview.CarouselView
import com.synnapps.carouselview.ImageListener
class FTBActivity : AppCompatActivity() {
var land = 0
var sampleImages = intArrayOf(
R.drawable.car0,
R.drawable.car1,
R.drawable.car2
)
var sampleImagesLand = intArrayOf(
// R.drawable.carLand0,
R.drawable.ma,
R.drawable.ia
)
/* THE LANDSCAPE FUNCTIONALITY IS NOT YET FINISHED */
override fun onConfigurationChanged(newConfig: Configuration) {
super.onConfigurationChanged(newConfig)
var temp=land;
// Checks the orientation of the screen
if (newConfig.orientation == Configuration.ORIENTATION_LANDSCAPE) {
Toast.makeText(this, "landscape", Toast.LENGTH_SHORT).show()
temp++
}
land=temp
}
override fun onCreate(savedInstanceState: Bundle?) {
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_ftb)
val btnNext = findViewById<Button>(R.id.ftb_next)
val carouselView = findViewById<CarouselView>(R.id.carouselView)
if (land==0) {
carouselView.setPageCount(sampleImages.size)
carouselView.setImageListener(imageListener)
}else if (land!=0) {
carouselView.setPageCount(sampleImagesLand.size)
carouselView.setImageListener(imageListenerLand)
}
btnNext.setOnClickListener {
/* An Intent is the glue that lets us navigate from one activity to another. */
val intent = Intent(this, MainActivity::class.java)
/*
intent.putExtra("EXTRA_SESSION_ID", sessionId)
intent2.putExtra("totalCA", "$labelTotalE")
setResult(Activity.RESULT_OK, intent2)
startActivityForResult(intent2 , MY_REQUEST_ID);
finish()
*/
startActivity(intent) // startActivity moves the user to the next activity
}
}
var imageListener: ImageListener = object : ImageListener {
override fun setImageForPosition(position: Int, imageView: ImageView){
//Can also add Glide or Picasso here
//Picasso.get().load(sampleImages[position]).into(imageView)
imageView.setImageResource(sampleImages[position])
}
}
var imageListenerLand: ImageListener = object : ImageListener {
override fun setImageForPosition(position: Int, imageView: ImageView){
imageView.setImageResource(sampleImagesLand[position])
}
}
}
|
=head1 PURPOSE
Basic MooX::Struct usage.
=head1 AUTHOR
Toby Inkster E<lt>[email protected]E<gt>.
=head1 COPYRIGHT AND LICENCE
This software is copyright (c) 2012 by Toby Inkster.
This is free software; you can redistribute it and/or modify it under
the same terms as the Perl 5 programming language system itself.
=cut
use strict;
use Test::More tests => 12;
use MooX::Struct
Organisation => [qw/ name employees /, company_number => [is => 'rw']],
Person => [qw/ name /];
my $alice = Person->new(name => 'Alice');
my $bob = Person->new(name => 'Bob');
my $acme = Organisation->new(name => 'ACME', employees => [$alice, $bob]);
note sprintf("Person class: %s", Person);
note sprintf("Organisation class: %s", Organisation);
is(
ref($alice),
ref($bob),
'Alice and Bob are in the same class',
);
isnt(
ref($alice),
ref($acme),
'Alice and ACME are not in the same class',
);
isa_ok($_, 'MooX::Struct', '$'.lc($_->name)) for ($alice, $bob, $acme);
is($alice->name, 'Alice', '$alice is called Alice');
is($bob->name, 'Bob', '$bob is called Bob');
is($acme->name, 'ACME', '$acme is called ACME');
ok !eval {
$acme->name('Acme Inc'); 1
}, 'accessors are read-only by default';
$acme->company_number(12345);
is($acme->company_number, 12345, 'accessors can be made read-write');
can_ok $alice => 'OBJECT_ID';
isnt($alice->OBJECT_ID, $bob->OBJECT_ID, 'OBJECT_ID is unique identifier');
|
/*
* Input.hpp
*
* Created on: Feb 11, 2020
* Author: jmbae
*/
#ifndef INPUT_HPP_
#define INPUT_HPP_
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <math.h>
#include <iostream>
#include <vector>
#include <unistd.h>
using namespace std;
struct inputParameter{
inputParameter(){
dirPath = new char[100];
c1Path = new char[100];
c1SidsetPath = new char[100];
c2Path = new char[100];
c3Path = new char[100];
c4Path1 = new char[100];
c4Path2 = new char[100];
vIdPath = new char[100];
oIdPath = new char[100];
numOfGPUs = 8;
numOfThreads = 20;
isWriteOutput = 0;
hostIdx = 0;
minLen = 18;
maxLen = 30;
minGC = 30;
maxGC = 80;
minTM = 64.0;
maxTM = 70.0;
maxSC = 7;
endMaxSC = 7;
endMinDG = -9;
maxHP = 5;
contiguous = 6;
lenDiff = 5;
TMDiff=5;
minPS=60;
maxPS=500;
maxPC=9;
endMaxPC=7;
DGlen = 5;
}
~inputParameter(){
delete[] dirPath;
delete[] c1SidsetPath;
delete[] c1Path;
delete[] c2Path;
delete[] c3Path;
delete[] c4Path1;
delete[] c4Path2;
delete[] vIdPath;
delete[] oIdPath;
}
void printInputParameter(){
cout << "<Files>" << endl;
cout << "input: " << inputPath << ", C1: " << c1Path <<
", C1': " << c1SidsetPath << ", C2: " << c2Path << ", C3: " << c3Path
<< ", C4(k=1): " << c4Path1 << ", C4(k=2): " << c4Path2
<< ", output " << outputPath << "\n\n";
cout << "<Settings>" << endl;
cout << "numOfGPUs: " << numOfGPUs << ", numOfThreads: " << numOfThreads << "\n\n";
cout << "<Parameters for single filtering>" << endl;
cout << "minLen: " << minLen << ", maxLen: " << maxLen
<< ", minGC: " << minGC << ", maxGC: " << maxGC
<< ", minTM: " << minTM << ", maxTM: " << maxTM
<< ", maxSC: " << maxSC << ", endMaxSC: " << endMaxSC
<< ", endMinDG: " << endMinDG
<< ", maxHP: " << maxHP << ", contiguous: " << contiguous << "\n\n";
cout << "<Parameters for pair filtering>" << endl;
cout << "lenDiff: " << lenDiff << ", TMDiff: " << TMDiff
<< ", minPS: " << minPS << ", maxPS: " << maxPS
<< ", maxPC: " << maxPC << ", endMaxPC: " << endMaxPC << "\n\n";
}
char* oIdPath;
char* vIdPath;
char* dirPath;
char* inputPath;
char* c1Path;
char* c1SidsetPath;
char* c2Path;
char* c3Path;
char* c4Path1;
char* c4Path2;
char* outputPath;
int numOfGPUs;
int numOfThreads;
int isWriteOutput;
int minLen;
int maxLen;
float minGC;
float maxGC;
float minTM;
float maxTM;
int maxSC;
int endMaxSC;
int endMinDG;
int maxHP;
int contiguous;
int lenDiff;
int TMDiff;
int minPS;
int maxPS;
int maxPC;
int endMaxPC;
int hostIdx;
int DGlen;
};
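// Hypothetical example invocation (the program name and file names below are
// placeholders only; the flags match those listed in the usage message printed
// when run with no arguments):
//   ./tool -i sequences.fa -o output.txt -d ./workdir -s 100 -og organism.db -v sidvid.db -t 20 -g 2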
bool readInputParameter(int argc, char*argv[], inputParameter* input) {
if (argc == 1)
{
cout << "-og <organism-vid mapping DB (necessary)>" << endl;
cout << "-v <sid-vid mapping DB (necessary)>" << endl;
cout << "-i <input sequence DB (necessary)>" <<endl;
cout << "-o <final output path (necessary)>" << endl;
cout << "-d <working and storage directory (necessary)>" << endl;
cout << "-s <the maximum sid of host (necessary)>" << endl;
cout << "-t <number of CPU threads (in default 20)>" << endl;
cout << "-g <number of GPUs (in default 1)>" << endl;
cout << "-w <write intermediate output (0: no, 1: yes, in default 0)>"<<endl;
cout << "-p1 <change the parameter about single filtering (0: no, 1: yes, in default 0)>" << endl;
cout << "-p2 <change the parameter about pair filtering (0: no, 1: yes, in default 0)>" << endl;
return false;
}
int argnr = 0;
bool inputCheck = false;
bool dirCheck = false;
bool outputCheck = false;
bool sidCheck = false;
bool ogCheck = false;
bool vCheck = false;
while (++argnr < argc)
{
if (!strcmp(argv[argnr], "-i")) {
input->inputPath = argv[++argnr];
inputCheck = true;
}
else if(!strcmp(argv[argnr], "-v")){
string tmpFname = argv[++argnr];
strcpy(input->vIdPath, tmpFname.c_str());
vCheck = true;
}
else if (!strcmp(argv[argnr], "-og")) {
string tmpFname = argv[++argnr];
strcpy(input->oIdPath, tmpFname.c_str());
ogCheck = true;
}
else if(!strcmp(argv[argnr], "-d")){
string dir = argv[++argnr];
string tmpFname;
strcpy(input->dirPath, dir.c_str());
tmpFname = dir + "/C1.txt";
strcpy(input->c1Path, tmpFname.c_str());
tmpFname = dir + "/C1_sidset.txt";
strcpy(input->c1SidsetPath, tmpFname.c_str());
tmpFname = dir + "/C2.txt";
strcpy(input->c2Path, tmpFname.c_str());
tmpFname = dir + "/C3.txt";
strcpy(input->c3Path, tmpFname.c_str());
tmpFname = dir + "/C4_1.txt";
strcpy(input->c4Path1, tmpFname.c_str());
tmpFname = dir + "/C4_2.txt";
strcpy(input->c4Path2, tmpFname.c_str());
dirCheck = true;
}
else if(!strcmp(argv[argnr], "-o")){
input->outputPath = argv[++argnr];
outputCheck = true;
}
else if(!strcmp(argv[argnr], "-s")){
input->hostIdx = stoi(argv[++argnr]);
sidCheck = true;
}
else if (!strcmp(argv[argnr], "-t")){
input->numOfThreads = stoi(argv[++argnr]);
}
else if (!strcmp(argv[argnr], "-g")){
input->numOfGPUs = stoi(argv[++argnr]);
}
else if (!strcmp(argv[argnr], "-w")){
input->isWriteOutput = stoi(argv[++argnr]);
}
else if(!strcmp(argv[argnr], "-p1")){
int tmpInt; float tmpFloat;
if(stoi(argv[++argnr]) == 1){
cout << "## Parameters for single filtering" << endl;
cout << "The minimum length of primer (in default 19): ";
cin >> tmpInt;
input->minLen = tmpInt;
cout << "The maximum length of primer (in default 23): ";
cin >> tmpInt;
input->maxLen = tmpInt;
cout << "The minimum GC ratios (in default 40): ";
cin >> tmpInt;
input->minGC = tmpInt;
cout << "The maximum GC ratios (in default 60): ";
cin >> tmpInt;
input->maxGC = tmpInt;
cout << "The minimum primer melting temperatures (in default 58.0): ";
cin >> tmpFloat;
input->minTM = tmpFloat;
cout << "The maximum primer melting temperatures (in default 62.0): ";
cin >> tmpFloat;
input->maxTM = tmpFloat;
cout << "The maximum self-complementarity (in default 5): ";
cin >> tmpInt;
input->maxSC = tmpInt;
cout << "The maximum 3' end self-complementarity (in default 4): ";
cin >> tmpInt;
input->endMaxSC = tmpInt;
cout << "The maximum contiguous residues (in default 6): ";
cin >> tmpInt;
input->contiguous = tmpInt;
cout << "The maximum end stability (in default -9): ";
cin >> tmpInt;
input->endMinDG = tmpInt;
cout << "The maximum hairpin (in default 3): ";
cin >> tmpInt;
input->maxHP = tmpInt;
cout << endl;
}
}
else if(!strcmp(argv[argnr], "-p2")){
int tmpInt;
if(stoi(argv[++argnr]) == 1){
cout << "## Parameters for pair filtering" << endl;
cout << "The maximum length difference (in default 5): ";
cin >> tmpInt;
input->lenDiff = tmpInt;
cout << "The maximum temperature difference (in default 3): ";
cin >> tmpInt;
input->TMDiff = tmpInt;
cout << "The minimum PCR amplicon size (in default 100): ";
cin >> tmpInt;
input->minPS = tmpInt;
cout << "The maximum PCR amplicon size (in default 250): ";
cin >> tmpInt;
input->maxPS = tmpInt;
cout << "The maximum pair-complementarity (in default 5): ";
cin >> tmpInt;
input->maxPC = tmpInt;
cout << "The maximum 3'end pair-complementarity (in default 4): ";
cin >> tmpInt;
input->endMaxPC = tmpInt;
}
}
}
if(!inputCheck){
cout << "There is no input sequence file. It is essential." << endl;
return false;
}
else if(!outputCheck){
cout << "There is no output path. It is essential." << endl;
return false;
}
else if(!dirCheck){
cout << "There is no storage directory. It is essential." << endl;
return false;
}
else if(!sidCheck){
cout << "There is no maximum sid of host." << endl;
return false;
}
else if(!ogCheck){
cout << "There is no organism-vid mapping DB." << endl;
return false;
}
else if(!vCheck){
cout << "There is no sid-vid mapping DB" << endl;
return false;
}
input->printInputParameter();
return true;
}
#endif /* INPUT_HPP_ */
|
# Does CSBS Support Cross-Region Backup for ECSs?<a name="EN-US_TOPIC_0056584614"></a>
No. Currently, CSBS supports backup and restoration only within a region, not across regions.
|
package com.github.onotoliy.opposite.treasure.pages.deposit
import com.github.onotoliy.opposite.treasure.store.reducers.State
import kotlinx.coroutines.CoroutineScope
import react.RClass
import react.RProps
import react.invoke
import react.redux.rConnect
import react.router.dom.RouteResultHistory
interface DepositViewPageContainerProps : RProps {
var scope: CoroutineScope
var history: RouteResultHistory
var person: String
}
val depositViewPageContainer: RClass<DepositViewPageContainerProps> =
rConnect<State, DepositViewPageContainerProps, DepositViewPageProps>(
{ state, ownProps ->
scope = ownProps.scope
person = ownProps.person
deposit = state.deposits.deposit
contributions = state.transactions.transactions
debts = state.events.debts
}
)(DepositViewPage::class.js.unsafeCast<RClass<DepositViewPageProps>>())
|
import { defHttp } from '/@/utils/http/axios';
enum ImageApi {
IMG_LIST = '/images/list',
}
export const getImgListApi = (params: any) =>
defHttp.post<any>({
url: ImageApi.IMG_LIST,
params,
});
|
/**
* @class Oskari.statistics.statsgrid.ColorService
*/
Oskari.clazz.define('Oskari.statistics.statsgrid.ColorService',
/**
* @method create called automatically on construction
* @static
*
*/
function () {
var me = this;
this.colorsets.forEach(function(item) {
me.limits.name.push(item.name);
});
this.limits.defaultType = this.colorsets[0].type;
this.limits.defaultName = this.colorsets[0].name;
}, {
__name: "StatsGrid.ColorService",
__qname: "Oskari.statistics.statsgrid.ColorService",
getQName: function () {
return this.__qname;
},
getName: function () {
return this.__name;
},
// Limits should be used when creating UI for color selection
limits : {
type : ['div', 'seq', 'qual'],
// defaultType is set in constructor
defaultType : undefined,
// names are populated in constructor
name : [],
// defaultName is set in constructor
defaultName : undefined,
count : {
min : 2,
// some colorsets have 11 values, some only go to 9
// Should take out the extras if we only provide 9
max : 9
}
},
/**
* Tries to return an array of colors whose length equals the count parameter.
* Returns null if no array with the requested count is available.
* @param {Number} count number of colors requested
* @param {String} type optional type, supports 'div', 'seq' or 'qual', defaults to 'div'
* @param {String} name optional name, defaults to 'BrBG'
* @return {String[]} array of hex-strings as colors like ["d8b365","5ab4ac"]
*/
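// Usage sketch (values taken from the 'Blues' colorset defined below):
//   getColorset(5, 'seq', 'Blues') -> ['eff3ff', 'bdd7e7', '6baed6', '3182bd', '08519c']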
getColorset : function(count, type, name) {
type = type || this.limits.defaultType;
name = name || this.limits.defaultName;
var log = Oskari.log('StatsGrid.ColorService');
var getArray = function(item) {
// 2 colors is the first set and index starts at 0 -> -2
var index = count - 2;
if(index < 0 || index >= item.colors.length) {
// might want to throw an exception here
return null;
}
return item.colors[index].split(',');
}
var value;
var typeMatch;
var nameMatch;
this.colorsets.forEach(function(item) {
if(item.name === name && item.type === type) {
value = item;
}
if(!typeMatch && item.type === type) {
typeMatch = item;
}
if(!nameMatch && item.name === name) {
nameMatch = item;
}
});
if(value) {
var result = getArray(value);
log.debug('Requested set found, requested colors found: ' + !!result);
// found requested item, check if it has the colorset for requested count
return result;
}
// TODO: get first to match type?
log.warn('Requested set not found, using type matching');
if(typeMatch) {
var result = getArray(typeMatch);
log.debug('Type matched set found, requested colors found: ' + !!result);
// found requested item, check if it has the colorset for requested count
return result;
}
log.warn('Requested set not found, using name matching');
if(nameMatch) {
var result = getArray(nameMatch);
log.debug('Name matched set found, requested colors found: ' + !!result);
// found requested item, check if it has the colorset for requested count
return result;
}
// no matches, just use the first one
return getArray(this.colorsets[0]);
},
colorsets : [{
'name': 'BrBG',
'type': 'div',
'colors': [
'd8b365,5ab4ac',
'd8b365,f5f5f5,5ab4ac',
'a6611a,dfc27d,80cdc1,018571',
'a6611a,dfc27d,f5f5f5,80cdc1,018571',
'8c510a,d8b365,f6e8c3,c7eae5,5ab4ac,01665e',
'8c510a,d8b365,f6e8c3,f5f5f5,c7eae5,5ab4ac,01665e',
'8c510a,bf812d,dfc27d,f6e8c3,c7eae5,80cdc1,35978f,01665e',
'8c510a,bf812d,dfc27d,f6e8c3,f5f5f5,c7eae5,80cdc1,35978f,01665e',
'543005,8c510a,bf812d,dfc27d,f6e8c3,c7eae5,80cdc1,35978f,01665e,003c30',
'543005,8c510a,bf812d,dfc27d,f6e8c3,f5f5f5,c7eae5,80cdc1,35978f,01665e,003c30'
]
}, {
'name': 'PiYG',
'type': 'div',
'colors': [
'e9a3c9,a1d76a',
'e9a3c9,f7f7f7,a1d76a',
'd01c8b,f1b6da,b8e186,4dac26',
'd01c8b,f1b6da,f7f7f7,b8e186,4dac26',
'c51b7d,e9a3c9,fde0ef,e6f5d0,a1d76a,4d9221',
'c51b7d,e9a3c9,fde0ef,f7f7f7,e6f5d0,a1d76a,4d9221',
'c51b7d,de77ae,f1b6da,fde0ef,e6f5d0,b8e186,7fbc41,4d9221',
'c51b7d,de77ae,f1b6da,fde0ef,f7f7f7,e6f5d0,b8e186,7fbc41,4d9221',
'8e0152,c51b7d,de77ae,f1b6da,fde0ef,e6f5d0,b8e186,7fbc41,4d9221,276419',
'8e0152,c51b7d,de77ae,f1b6da,fde0ef,f7f7f7,e6f5d0,b8e186,7fbc41,4d9221,276419'
]
}, {
'name': 'PRGn',
'type': 'div',
'colors': [
'af8dc3,7fbf7b',
'af8dc3,f7f7f7,7fbf7b',
'7b3294,c2a5cf,a6dba0,008837',
'7b3294,c2a5cf,f7f7f7,a6dba0,008837',
'762a83,af8dc3,e7d4e8,d9f0d3,7fbf7b,1b7837',
'762a83,af8dc3,e7d4e8,f7f7f7,d9f0d3,7fbf7b,1b7837',
'762a83,9970ab,c2a5cf,e7d4e8,d9f0d3,a6dba0,5aae61,1b7837',
'762a83,9970ab,c2a5cf,e7d4e8,f7f7f7,d9f0d3,a6dba0,5aae61,1b7837',
'40004b,762a83,9970ab,c2a5cf,e7d4e8,d9f0d3,a6dba0,5aae61,1b7837,00441b',
'40004b,762a83,9970ab,c2a5cf,e7d4e8,f7f7f7,d9f0d3,a6dba0,5aae61,1b7837,00441b'
]
}, {
'name': 'PuOr',
'type': 'div',
'colors': [
'f1a340,998ec3',
'f1a340,f7f7f7,998ec3',
'e66101,fdb863,b2abd2,5e3c99',
'e66101,fdb863,f7f7f7,b2abd2,5e3c99',
'b35806,f1a340,fee0b6,d8daeb,998ec3,542788',
'b35806,f1a340,fee0b6,f7f7f7,d8daeb,998ec3,542788',
'b35806,e08214,fdb863,fee0b6,d8daeb,b2abd2,8073ac,542788',
'b35806,e08214,fdb863,fee0b6,f7f7f7,d8daeb,b2abd2,8073ac,542788',
'7f3b08,b35806,e08214,fdb863,fee0b6,d8daeb,b2abd2,8073ac,542788,2d004b',
'7f3b08,b35806,e08214,fdb863,fee0b6,f7f7f7,d8daeb,b2abd2,8073ac,542788,2d004b'
]
}, {
'name': 'RdBu',
'type': 'div',
'colors': [
'ef8a62,67a9cf',
'ef8a62,f7f7f7,67a9cf',
'ca0020,f4a582,92c5de,0571b0',
'ca0020,f4a582,f7f7f7,92c5de,0571b0',
'b2182b,ef8a62,fddbc7,d1e5f0,67a9cf,2166ac',
'b2182b,ef8a62,fddbc7,f7f7f7,d1e5f0,67a9cf,2166ac',
'b2182b,d6604d,f4a582,fddbc7,d1e5f0,92c5de,4393c3,2166ac',
'b2182b,d6604d,f4a582,fddbc7,f7f7f7,d1e5f0,92c5de,4393c3,2166ac',
'67001f,b2182b,d6604d,f4a582,fddbc7,d1e5f0,92c5de,4393c3,2166ac,053061',
'67001f,b2182b,d6604d,f4a582,fddbc7,f7f7f7,d1e5f0,92c5de,4393c3,2166ac,053061'
]
}, {
'name': 'RdGy',
'type': 'div',
'colors': [
'ef8a62,999999',
'ef8a62,ffffff,999999',
'ca0020,f4a582,bababa,404040',
'ca0020,f4a582,ffffff,bababa,404040',
'b2182b,ef8a62,fddbc7,e0e0e0,999999,4d4d4d',
'b2182b,ef8a62,fddbc7,ffffff,e0e0e0,999999,4d4d4d',
'b2182b,d6604d,f4a582,fddbc7,e0e0e0,bababa,878787,4d4d4d',
'b2182b,d6604d,f4a582,fddbc7,ffffff,e0e0e0,bababa,878787,4d4d4d',
'67001f,b2182b,d6604d,f4a582,fddbc7,e0e0e0,bababa,878787,4d4d4d,1a1a1a',
'67001f,b2182b,d6604d,f4a582,fddbc7,ffffff,e0e0e0,bababa,878787,4d4d4d,1a1a1a'
]
}, {
'name': 'RdYlBu',
'type': 'div',
'colors': [
'fc8d59,91bfdb',
'fc8d59,ffffbf,91bfdb',
'd7191c,fdae61,abd9e9,2c7bb6',
'd7191c,fdae61,ffffbf,abd9e9,2c7bb6',
'd73027,fc8d59,fee090,e0f3f8,91bfdb,4575b4',
'd73027,fc8d59,fee090,ffffbf,e0f3f8,91bfdb,4575b4',
'd73027,f46d43,fdae61,fee090,e0f3f8,abd9e9,74add1,4575b4',
'd73027,f46d43,fdae61,fee090,ffffbf,e0f3f8,abd9e9,74add1,4575b4',
'a50026,d73027,f46d43,fdae61,fee090,e0f3f8,abd9e9,74add1,4575b4,313695',
'a50026,d73027,f46d43,fdae61,fee090,ffffbf,e0f3f8,abd9e9,74add1,4575b4,313695'
]
}, {
'name': 'RdYlGn',
'type': 'div',
'colors': [
'fc8d59,91cf60',
'fc8d59,ffffbf,91cf60',
'd7191c,fdae61,a6d96a,1a9641',
'd7191c,fdae61,ffffbf,a6d96a,1a9641',
'd73027,fc8d59,fee08b,d9ef8b,91cf60,1a9850',
'd73027,fc8d59,fee08b,ffffbf,d9ef8b,91cf60,1a9850',
'd73027,f46d43,fdae61,fee08b,d9ef8b,a6d96a,66bd63,1a9850',
'd73027,f46d43,fdae61,fee08b,ffffbf,d9ef8b,a6d96a,66bd63,1a9850',
'a50026,d73027,f46d43,fdae61,fee08b,d9ef8b,a6d96a,66bd63,1a9850,006837',
'a50026,d73027,f46d43,fdae61,fee08b,ffffbf,d9ef8b,a6d96a,66bd63,1a9850,006837'
]
}, {
'name': 'Spectral',
'type': 'div',
'colors': [
'fc8d59,99d594',
'fc8d59,ffffbf,99d594',
'd7191c,fdae61,abdda4,2b83ba',
'd7191c,fdae61,ffffbf,abdda4,2b83ba',
'd53e4f,fc8d59,fee08b,e6f598,99d594,3288bd',
'd53e4f,fc8d59,fee08b,ffffbf,e6f598,99d594,3288bd',
'd53e4f,f46d43,fdae61,fee08b,e6f598,abdda4,66c2a5,3288bd',
'd53e4f,f46d43,fdae61,fee08b,ffffbf,e6f598,abdda4,66c2a5,3288bd',
'9e0142,d53e4f,f46d43,fdae61,fee08b,e6f598,abdda4,66c2a5,3288bd,5e4fa2',
'9e0142,d53e4f,f46d43,fdae61,fee08b,ffffbf,e6f598,abdda4,66c2a5,3288bd,5e4fa2'
]
}, {
'name': 'Blues',
'type': 'seq',
'colors': [
'deebf7,3182bd',
'deebf7,9ecae1,3182bd',
'eff3ff,bdd7e7,6baed6,2171b5',
'eff3ff,bdd7e7,6baed6,3182bd,08519c',
'eff3ff,c6dbef,9ecae1,6baed6,3182bd,08519c',
'eff3ff,c6dbef,9ecae1,6baed6,4292c6,2171b5,084594',
'f7fbff,deebf7,c6dbef,9ecae1,6baed6,4292c6,2171b5,084594',
'f7fbff,deebf7,c6dbef,9ecae1,6baed6,4292c6,2171b5,08519c,08306b'
]
}, {
'name': 'BuGn',
'type': 'seq',
'colors': [
'e5f5f9,2ca25f',
'e5f5f9,99d8c9,2ca25f',
'edf8fb,b2e2e2,66c2a4,238b45',
'edf8fb,b2e2e2,66c2a4,2ca25f,006d2c',
'edf8fb,ccece6,99d8c9,66c2a4,2ca25f,006d2c',
'edf8fb,ccece6,99d8c9,66c2a4,41ae76,238b45,005824',
'f7fcfd,e5f5f9,ccece6,99d8c9,66c2a4,41ae76,238b45,005824',
'f7fcfd,e5f5f9,ccece6,99d8c9,66c2a4,41ae76,238b45,006d2c,00441b'
]
}, {
'name': 'BuPu',
'type': 'seq',
'colors': [
'e0ecf4,8856a7',
'e0ecf4,9ebcda,8856a7',
'edf8fb,b3cde3,8c96c6,88419d',
'edf8fb,b3cde3,8c96c6,8856a7,810f7c',
'edf8fb,bfd3e6,9ebcda,8c96c6,8856a7,810f7c',
'edf8fb,bfd3e6,9ebcda,8c96c6,8c6bb1,88419d,6e016b',
'f7fcfd,e0ecf4,bfd3e6,9ebcda,8c96c6,8c6bb1,88419d,6e016b',
'f7fcfd,e0ecf4,bfd3e6,9ebcda,8c96c6,8c6bb1,88419d,810f7c,4d004b'
]
}, {
'name': 'GnBu',
'type': 'seq',
'colors': [
'e0f3db,43a2ca',
'e0f3db,a8ddb5,43a2ca',
'f0f9e8,bae4bc,7bccc4,2b8cbe',
'f0f9e8,bae4bc,7bccc4,43a2ca,0868ac',
'f0f9e8,ccebc5,a8ddb5,7bccc4,43a2ca,0868ac',
'f0f9e8,ccebc5,a8ddb5,7bccc4,4eb3d3,2b8cbe,08589e',
'f7fcf0,e0f3db,ccebc5,a8ddb5,7bccc4,4eb3d3,2b8cbe,08589e',
'f7fcf0,e0f3db,ccebc5,a8ddb5,7bccc4,4eb3d3,2b8cbe,0868ac,084081'
]
}, {
'name': 'Greens',
'type': 'seq',
'colors': [
'e5f5e0,31a354',
'e5f5e0,a1d99b,31a354',
'edf8e9,bae4b3,74c476,238b45',
'edf8e9,bae4b3,74c476,31a354,006d2c',
'edf8e9,c7e9c0,a1d99b,74c476,31a354,006d2c',
'edf8e9,c7e9c0,a1d99b,74c476,41ab5d,238b45,005a32',
'f7fcf5,e5f5e0,c7e9c0,a1d99b,74c476,41ab5d,238b45,005a32',
'f7fcf5,e5f5e0,c7e9c0,a1d99b,74c476,41ab5d,238b45,006d2c,00441b'
]
}, {
'name': 'Greys',
'type': 'seq',
'colors': [
'f0f0f0,636363',
'f0f0f0,bdbdbd,636363',
'f7f7f7,cccccc,969696,525252',
'f7f7f7,cccccc,969696,636363,252525',
'f7f7f7,d9d9d9,bdbdbd,969696,636363,252525',
'f7f7f7,d9d9d9,bdbdbd,969696,737373,525252,252525',
'ffffff,f0f0f0,d9d9d9,bdbdbd,969696,737373,525252,252525',
'ffffff,f0f0f0,d9d9d9,bdbdbd,969696,737373,525252,252525,000000'
]
}, {
'name': 'Oranges',
'type': 'seq',
'colors': [
'fee6ce,e6550d',
'fee6ce,fdae6b,e6550d',
'feedde,fdbe85,fd8d3c,d94701',
'feedde,fdbe85,fd8d3c,e6550d,a63603',
'feedde,fdd0a2,fdae6b,fd8d3c,e6550d,a63603',
'feedde,fdd0a2,fdae6b,fd8d3c,f16913,d94801,8c2d04',
'fff5eb,fee6ce,fdd0a2,fdae6b,fd8d3c,f16913,d94801,8c2d04',
'fff5eb,fee6ce,fdd0a2,fdae6b,fd8d3c,f16913,d94801,a63603,7f2704'
]
}, {
'name': 'OrRd',
'type': 'seq',
'colors': [
'fee8c8,e34a33',
'fee8c8,fdbb84,e34a33',
'fef0d9,fdcc8a,fc8d59,d7301f',
'fef0d9,fdcc8a,fc8d59,e34a33,b30000',
'fef0d9,fdd49e,fdbb84,fc8d59,e34a33,b30000',
'fef0d9,fdd49e,fdbb84,fc8d59,ef6548,d7301f,990000',
'fff7ec,fee8c8,fdd49e,fdbb84,fc8d59,ef6548,d7301f,990000',
'fff7ec,fee8c8,fdd49e,fdbb84,fc8d59,ef6548,d7301f,b30000,7f0000'
]
}, {
'name': 'PuBu',
'type': 'seq',
'colors': [
'ece7f2,2b8cbe',
'ece7f2,a6bddb,2b8cbe',
'f1eef6,bdc9e1,74a9cf,0570b0',
'f1eef6,bdc9e1,74a9cf,2b8cbe,045a8d',
'f1eef6,d0d1e6,a6bddb,74a9cf,2b8cbe,045a8d',
'f1eef6,d0d1e6,a6bddb,74a9cf,3690c0,0570b0,034e7b',
'fff7fb,ece7f2,d0d1e6,a6bddb,74a9cf,3690c0,0570b0,034e7b',
'fff7fb,ece7f2,d0d1e6,a6bddb,74a9cf,3690c0,0570b0,045a8d,023858'
]
}, {
'name': 'PuBuGn',
'type': 'seq',
'colors': [
'ece2f0,1c9099',
'ece2f0,a6bddb,1c9099',
'f6eff7,bdc9e1,67a9cf,02818a',
'f6eff7,bdc9e1,67a9cf,1c9099,016c59',
'f6eff7,d0d1e6,a6bddb,67a9cf,1c9099,016c59',
'f6eff7,d0d1e6,a6bddb,67a9cf,3690c0,02818a,016450',
'fff7fb,ece2f0,d0d1e6,a6bddb,67a9cf,3690c0,02818a,016450',
'fff7fb,ece2f0,d0d1e6,a6bddb,67a9cf,3690c0,02818a,016c59,014636'
]
}, {
'name': 'PuRd',
'type': 'seq',
'colors': [
'e7e1ef,dd1c77',
'e7e1ef,c994c7,dd1c77',
'f1eef6,d7b5d8,df65b0,ce1256',
'f1eef6,d7b5d8,df65b0,dd1c77,980043',
'f1eef6,d4b9da,c994c7,df65b0,dd1c77,980043',
'f1eef6,d4b9da,c994c7,df65b0,e7298a,ce1256,91003f',
'f7f4f9,e7e1ef,d4b9da,c994c7,df65b0,e7298a,ce1256,91003f',
'f7f4f9,e7e1ef,d4b9da,c994c7,df65b0,e7298a,ce1256,980043,67001f'
]
}, {
'name': 'Purples',
'type': 'seq',
'colors': [
'efedf5,756bb1',
'efedf5,bcbddc,756bb1',
'f2f0f7,cbc9e2,9e9ac8,6a51a3',
'f2f0f7,cbc9e2,9e9ac8,756bb1,54278f',
'f2f0f7,dadaeb,bcbddc,9e9ac8,756bb1,54278f',
'f2f0f7,dadaeb,bcbddc,9e9ac8,807dba,6a51a3,4a1486',
'fcfbfd,efedf5,dadaeb,bcbddc,9e9ac8,807dba,6a51a3,4a1486',
'fcfbfd,efedf5,dadaeb,bcbddc,9e9ac8,807dba,6a51a3,54278f,3f007d'
]
}, {
'name': 'RdPu',
'type': 'seq',
'colors': [
'fde0dd,c51b8a',
'fde0dd,fa9fb5,c51b8a',
'feebe2,fbb4b9,f768a1,ae017e',
'feebe2,fbb4b9,f768a1,c51b8a,7a0177',
'feebe2,fcc5c0,fa9fb5,f768a1,c51b8a,7a0177',
'feebe2,fcc5c0,fa9fb5,f768a1,dd3497,ae017e,7a0177',
'fff7f3,fde0dd,fcc5c0,fa9fb5,f768a1,dd3497,ae017e,7a0177',
'fff7f3,fde0dd,fcc5c0,fa9fb5,f768a1,dd3497,ae017e,7a0177,49006a'
]
}, {
'name': 'Reds',
'type': 'seq',
'colors': [
'fee0d2,de2d26',
'fee0d2,fc9272,de2d26',
'fee5d9,fcae91,fb6a4a,cb181d',
'fee5d9,fcae91,fb6a4a,de2d26,a50f15',
'fee5d9,fcbba1,fc9272,fb6a4a,de2d26,a50f15',
'fee5d9,fcbba1,fc9272,fb6a4a,ef3b2c,cb181d,99000d',
'fff5f0,fee0d2,fcbba1,fc9272,fb6a4a,ef3b2c,cb181d,99000d',
'fff5f0,fee0d2,fcbba1,fc9272,fb6a4a,ef3b2c,cb181d,a50f15,67000d'
]
}, {
'name': 'YlGn',
'type': 'seq',
'colors': [
'f7fcb9,31a354',
'f7fcb9,addd8e,31a354',
'ffffcc,c2e699,78c679,238443',
'ffffcc,c2e699,78c679,31a354,006837',
'ffffcc,d9f0a3,addd8e,78c679,31a354,006837',
'ffffcc,d9f0a3,addd8e,78c679,41ab5d,238443,005a32',
'ffffe5,f7fcb9,d9f0a3,addd8e,78c679,41ab5d,238443,005a32',
'ffffe5,f7fcb9,d9f0a3,addd8e,78c679,41ab5d,238443,006837,004529'
]
}, {
'name': 'YlGnBu',
'type': 'seq',
'colors': [
'edf8b1,2c7fb8',
'edf8b1,7fcdbb,2c7fb8',
'ffffcc,a1dab4,41b6c4,225ea8',
'ffffcc,a1dab4,41b6c4,2c7fb8,253494',
'ffffcc,c7e9b4,7fcdbb,41b6c4,2c7fb8,253494',
'ffffcc,c7e9b4,7fcdbb,41b6c4,1d91c0,225ea8,0c2c84',
'ffffd9,edf8b1,c7e9b4,7fcdbb,41b6c4,1d91c0,225ea8,0c2c84',
'ffffd9,edf8b1,c7e9b4,7fcdbb,41b6c4,1d91c0,225ea8,253494,081d58'
]
}, {
'name': 'YlOrBr',
'type': 'seq',
'colors': [
'fff7bc,d95f0e',
'fff7bc,fec44f,d95f0e',
'ffffd4,fed98e,fe9929,cc4c02',
'ffffd4,fed98e,fe9929,d95f0e,993404',
'ffffd4,fee391,fec44f,fe9929,d95f0e,993404',
'ffffd4,fee391,fec44f,fe9929,ec7014,cc4c02,8c2d04',
'ffffe5,fff7bc,fee391,fec44f,fe9929,ec7014,cc4c02,8c2d04',
'ffffe5,fff7bc,fee391,fec44f,fe9929,ec7014,cc4c02,993404,662506'
]
}, {
'name': 'YlOrRd',
'type': 'seq',
'colors': [
'ffeda0,f03b20',
'ffeda0,feb24c,f03b20',
'ffffb2,fecc5c,fd8d3c,e31a1c',
'ffffb2,fecc5c,fd8d3c,f03b20,bd0026',
'ffffb2,fed976,feb24c,fd8d3c,f03b20,bd0026',
'ffffb2,fed976,feb24c,fd8d3c,fc4e2a,e31a1c,b10026',
'ffffcc,ffeda0,fed976,feb24c,fd8d3c,fc4e2a,e31a1c,b10026',
'ffffcc,ffeda0,fed976,feb24c,fd8d3c,fc4e2a,e31a1c,bd0026,800026'
]
}, {
'name': 'Accent',
'type': 'qual',
'colors': [
'7fc97f,fdc086',
'7fc97f,beaed4,fdc086',
'7fc97f,beaed4,fdc086,ffff99',
'7fc97f,beaed4,fdc086,ffff99,386cb0',
'7fc97f,beaed4,fdc086,ffff99,386cb0,f0027f',
'7fc97f,beaed4,fdc086,ffff99,386cb0,f0027f,bf5b17',
'7fc97f,beaed4,fdc086,ffff99,386cb0,f0027f,bf5b17,666666'
]
}, {
'name': 'Dark2',
'type': 'qual',
'colors': [
'1b9e77,7570b3',
'1b9e77,d95f02,7570b3',
'1b9e77,d95f02,7570b3,e7298a',
'1b9e77,d95f02,7570b3,e7298a,66a61e',
'1b9e77,d95f02,7570b3,e7298a,66a61e,e6ab02',
'1b9e77,d95f02,7570b3,e7298a,66a61e,e6ab02,a6761d',
'1b9e77,d95f02,7570b3,e7298a,66a61e,e6ab02,a6761d,666666'
]
}, {
'name': 'Paired',
'type': 'qual',
'colors': [
'a6cee3,b2df8a',
'a6cee3,1f78b4,b2df8a',
'a6cee3,1f78b4,b2df8a,33a02c',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f,ff7f00',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f,ff7f00,cab2d6',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f,ff7f00,cab2d6,6a3d9a',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f,ff7f00,cab2d6,6a3d9a,ffff99',
'a6cee3,1f78b4,b2df8a,33a02c,fb9a99,e31a1c,fdbf6f,ff7f00,cab2d6,6a3d9a,ffff99,b15928'
]
}, {
'name': 'Pastel1',
'type': 'qual',
'colors': [
'fbb4ae,ccebc5',
'fbb4ae,b3cde3,ccebc5',
'fbb4ae,b3cde3,ccebc5,decbe4',
'fbb4ae,b3cde3,ccebc5,decbe4,fed9a6',
'fbb4ae,b3cde3,ccebc5,decbe4,fed9a6,ffffcc',
'fbb4ae,b3cde3,ccebc5,decbe4,fed9a6,ffffcc,e5d8bd',
'fbb4ae,b3cde3,ccebc5,decbe4,fed9a6,ffffcc,e5d8bd,fddaec',
'fbb4ae,b3cde3,ccebc5,decbe4,fed9a6,ffffcc,e5d8bd,fddaec,f2f2f2'
]
}, {
'name': 'Pastel2',
'type': 'qual',
'colors': [
'b3e2cd,cbd5e8',
'b3e2cd,fdcdac,cbd5e8',
'b3e2cd,fdcdac,cbd5e8,f4cae4',
'b3e2cd,fdcdac,cbd5e8,f4cae4,e6f5c9',
'b3e2cd,fdcdac,cbd5e8,f4cae4,e6f5c9,fff2ae',
'b3e2cd,fdcdac,cbd5e8,f4cae4,e6f5c9,fff2ae,f1e2cc',
'b3e2cd,fdcdac,cbd5e8,f4cae4,e6f5c9,fff2ae,f1e2cc,cccccc'
]
}, {
'name': 'Set1',
'type': 'qual',
'colors': [
'e41a1c,4daf4a',
'e41a1c,377eb8,4daf4a',
'e41a1c,377eb8,4daf4a,984ea3',
'e41a1c,377eb8,4daf4a,984ea3,ff7f00',
'e41a1c,377eb8,4daf4a,984ea3,ff7f00,ffff33',
'e41a1c,377eb8,4daf4a,984ea3,ff7f00,ffff33,a65628',
'e41a1c,377eb8,4daf4a,984ea3,ff7f00,ffff33,a65628,f781bf',
'e41a1c,377eb8,4daf4a,984ea3,ff7f00,ffff33,a65628,f781bf,999999'
]
}, {
'name': 'Set2',
'type': 'qual',
'colors': [
'66c2a5,8da0cb',
'66c2a5,fc8d62,8da0cb',
'66c2a5,fc8d62,8da0cb,e78ac3',
'66c2a5,fc8d62,8da0cb,e78ac3,a6d854',
'66c2a5,fc8d62,8da0cb,e78ac3,a6d854,ffd92f',
'66c2a5,fc8d62,8da0cb,e78ac3,a6d854,ffd92f,e5c494',
'66c2a5,fc8d62,8da0cb,e78ac3,a6d854,ffd92f,e5c494,b3b3b3'
]
}, {
'name': 'Set3',
'type': 'qual',
'colors': [
'8dd3c7,bebada',
'8dd3c7,ffffb3,bebada',
'8dd3c7,ffffb3,bebada,fb8072',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69,fccde5',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69,fccde5,d9d9d9',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69,fccde5,d9d9d9,bc80bd',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69,fccde5,d9d9d9,bc80bd,ccebc5',
'8dd3c7,ffffb3,bebada,fb8072,80b1d3,fdb462,b3de69,fccde5,d9d9d9,bc80bd,ccebc5,ffed6f'
]
}]
}, {
'protocol': ['Oskari.mapframework.service.Service']
});
|
<h1 align="center">👩⚕️ Movie Recommendation System</h1>
# ✅ Objective of this project
- The objective of this project is to recommend movies.
- We use content-based filtering to recommend movies.
# 📚 Procedure :-
## ✔ Working on the dataset
- We get the column names and data shape.
- We perform exploratory data analysis
- We plot number of viewers vs. rating to identify an important subset of data for making predictions.
- We group the content by Title and Ratings
- We sort the ratings in descending order
## ✔ Build recommendation model for one movie
- We get the ratings for one movie and put them into a new dataframe
- Now we make a correlation matrix between new dataframe and movie matrix
- We arrange the movie matrix in descending order on the basis of correlation score
- The most correlated movies show up on top
## ✔ Generalize the model for all the movies now
- We input movie name from user
- We make the correlation matrix
- We sort the correlation score in descending order
- Our predictions are ready (a minimal sketch of this procedure is shown below).
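A minimal sketch of the correlation approach described above, assuming a ratings DataFrame with hypothetical `user_id`, `title`, and `rating` columns (the column names and data are placeholders, not the project's actual dataset):

```python
import pandas as pd

# Placeholder ratings data; in the project this comes from the movie dataset.
ratings = pd.DataFrame({
    "user_id": [1, 1, 2, 2, 3, 3, 3],
    "title":   ["Movie A", "Movie B", "Movie A", "Movie C", "Movie A", "Movie B", "Movie C"],
    "rating":  [5, 4, 4, 2, 5, 5, 1],
})

# User x movie matrix of ratings (the "movie matrix" used above).
movie_matrix = ratings.pivot_table(index="user_id", columns="title", values="rating")

def recommend(movie_name, min_ratings=1):
    """Movies whose ratings correlate most strongly with `movie_name`."""
    target = movie_matrix[movie_name]                     # ratings for one movie
    similar = movie_matrix.corrwith(target)               # correlation with every other movie
    similar = similar.dropna().sort_values(ascending=False)
    counts = ratings.groupby("title")["rating"].count()   # number of viewers per movie
    return similar[counts[similar.index] >= min_ratings]  # drop rarely rated movies

print(recommend("Movie A"))
```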
|
package domain
import monix.eval.Task
trait ResasRepository {
def getPrefectures(): Task[Seq[Prefecture]]
def getCities(prefCode: Int): Task[Seq[City]]
}
|
package controllers
// IndexController main controller
type IndexController struct {
BaseController
}
// Index main page
func (c *IndexController) Index() {
c.Data["menu"] = "main"
c.TplName = "index.html"
}
|
# Read Me
Website created using Beautiful Jekyll
*Copyright 2016 [Dean Attali](http://deanattali.com)*
Instructions for Building Locally:
1. Install Jekyll and download github repository
2. Make changes locally
3. Run "bundle exec jekyll build" to build website
4. Run "bundle exec jekyll serve" to see website
5. Push changes
|
package de.rki.coronawarnapp.dccticketing.core.server
import com.google.gson.annotations.SerializedName
data class AccessTokenRequest(
@SerializedName("service")
val service: String,
@SerializedName("pubKey")
val pubKey: String
)
|
const precioOriginal=100;
const descuento=15;
function calcularPrecioConDescuento(precio, descuento){
const porcentajePrecioConDescuento=100-descuento;
const precioConDescuento=(precio*porcentajePrecioConDescuento)/100;
return precioConDescuento;
}
function onClickButtonPriceDiscount(){
const precio=document.getElementById('inputPrice').value;
const discount=document.getElementById('inputDiscount').value;
const result=document.getElementById('resultP');
result.innerText='El precio con descuento son $ '+calcularPrecioConDescuento(precio,discount);
//alert(calcularPrecioConDescuento(precio,discount));
}
/*
console.log({
precioOriginal,
descuento,
porcentajePrecioConDescuento,
precioConDescuento
})*/ |
## About the Bill of Rights
- [What does it say?](#)
- [How did it happen?](#)
- [How was it made?](#)
- [Where can I learn more?](#)
- [Where can I see it?](#) |
from ._base import *
class FundamentalTrader(ABCTrader):
internal_rate: float
def __init__(self, player):
super().__init__(player)
self.update_internal_rate()
def update_internal_rate(self):
self.internal_rate = 0.1
def tick(self, cid):
big_c = self.big_c(cid)
exchange_price = self._exchange_price(cid)
if big_c.asks: # selling, already in balance
if big_c.amount > 0:
logger.info("there is some part not selling")
if not big_c.bids:
logger.info("and bids not existing")
self._fast_forward(cid, exchange_price)
self._output_balanced(cid)
elif big_c.asks[0].price != exchange_price:
logger.info(f"exchange price not match ({big_c.asks[0].price} != {exchange_price})")
self._output_balanced(cid)
# otherwise, it must be unchanged
elif big_c.amount > 0:
logger.info("new stock")
self._fast_seller(cid, low=self._exchange_price(cid))
if big_c.amount or big_c.asks:
self._output_balanced(cid)
elif big_c.bids:
if big_c.bids[0].price == 2.0 and big_c.bids[0].amount == 2:
logger.info("forced view")
self._fast_forward(cid, self._exchange_price(cid))
self._output_balanced(cid)
elif max(bid.price for bid in big_c.bids_all) <= big_c.initial_price_rounded:
logger.info("justice! no under initial")
self._output_balanced(cid)
else:
logger.info("forget it")
big_c.ensure_bids([])
# otherwise nothing
def _exchange_price(self, cid):
big_c = self.big_c(cid)
return max([big_c.initial_price_rounded, self._fundamental(cid)])
def _fundamental(self, cid):
big_c = self.big_c(cid)
return round(big_c.rate / self.internal_rate, 2)
def _fast_forward(self, cid, price=None):
logger.debug(f"fast forward #{cid:<5} | {price}")
big_c = self.big_c(cid)
price = price or self._exchange_price(cid)
amount = 100
big_c.ensure_bids([])
big_c.update_user_character(ignore_throttle=True)
while not big_c.bids:
big_c.ensure_bids([TBid(Price=price, Amount=amount)])
big_c.update_user_character(ignore_throttle=True)
amount *= 2
big_c.ensure_bids([TBid(Price=price, Amount=100)])
big_c.update_user_character(ignore_throttle=True)
def _fast_seller(self, cid, amount=None, low=10, high=100000):
logger.debug(f"fast seller #{cid:<5} | ({low}-{high}) / {amount}")
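# Probe between `low` and `high` with a golden-section split (0.618/0.382):
# place a one-unit ask at the probe price, lower `high` if it rests unsold,
# raise `low` (and count one unit sold) if it is taken, so the loop homes in
# on the highest accepted price; any leftover amount is dumped at `low` below.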
big_c = self.big_c(cid)
big_c.ensure_bids([], force_updates='before')
big_c.ensure_asks([], force_updates='after')
if amount is None:
amount = big_c.amount
while amount:
pin = round(0.618 * high + 0.382 * low, 2)
if pin == high or pin == low:
break
big_c.ensure_asks([TAsk(Price=pin, Amount=1)], force_updates='after')
if big_c.asks:
big_c.ensure_asks([], force_updates='after')
high = pin
else:
low = pin
amount -= 1
if amount:
big_c.ensure_asks([TAsk(Price=low, Amount=amount)], force_updates='after')
def _output_balanced(self, cid):
exchange_price = self._exchange_price(cid)
logger.debug(f"output balanced #{cid:<5} | {exchange_price}")
big_c = self.big_c(cid)
big_c.update_user_character(ignore_throttle=True)
if big_c.total_holding:
big_c.ensure_asks([TAsk(Price=exchange_price, Amount=big_c.total_holding)])
big_c.ensure_bids([TBid(Price=exchange_price, Amount=100)], force_updates='after')
|
export const SET_CURRENT_USER = "SET_CURRENT_USER";
export const SET_ERRORS = "SET_ERRORS";
export const FETCH_CHANNELS = "FETCH_CHANNELS";
export const FILTER_CHANNELS = "FILTER_CHANNELS";
export const FETCH_CHANNEL_DETAIL = "FETCH_CHANNEL_DETAIL";
export const SEND_MESSAGE = "SEND_MESSAGE";
// export const FETCH_L_MESSAGE = "SEND_MESSAGE";
export const SET_MESSAGES_LOADING = "SET_MESSAGES_LOADING";
export const SET_CHANNELS_LOADING = "SET_CHANNELS_LOADING";
// export const FILTER_MESSAGES = "FILTER_MESSAGES";
export const POST_CHANNEL = "POST_CHANNEL" |
class SearchDevelopersJob < Que::Job
def run(search_cache_key)
params = Rails.cache.read(search_cache_key)
api = Github::Api.new(params[:access_token])
logins = api.search(params)
logins.each do |login|
FetchDeveloperJob.enqueue(login, params[:access_token])
end
end
end
|
# frozen_string_literal: true
module Unparser
class CLI
# Source representation for CLI sources
#
# ignore :reek:TooManyMethods
class Source
include AbstractType, Adamantium::Flat, NodeHelpers
# Source state generated after first unparse
class Generated
include Concord::Public.new(:source, :ast, :error)
# Test if source was generated successfully
#
# @return [Boolean]
#
# @api private
#
def success?
!error
end
# Build generated source
#
# @param [Parser::AST::Node] ast
#
# @api private
#
def self.build(ast)
source = Unparser.unparse(ast)
new(source, ast, nil)
rescue StandardError => exception
new(nil, ast, exception)
end
end
# Test if source could be unparsed successfully
#
# @return [Boolean]
#
# @api private
#
def success?
generated.success? && original_ast && generated_ast && original_ast.eql?(generated_ast)
end
# Return error report
#
# @return [String]
#
# @api private
#
def report
if original_ast && generated_ast
report_with_ast_diff
elsif !original_ast
report_original
elsif !generated.success?
report_unparser
elsif !generated_ast
report_generated
else
raise
end
end
memoize :report
private
# Return generated source
#
# @return [String]
#
# @api private
#
def generated
Source::Generated.build(original_ast)
end
memoize :generated
# Return stripped source
#
# @param [String] source
#
# @return [String]
#
# @api private
#
# ignore :reek:UtilityFunction
def strip(source)
source = source.rstrip
indent = source.scan(/^\s*/).first
source.gsub(/^#{indent}/, '')
end
# Return error report for parsing original
#
# @return [String]
#
# @api private
#
def report_original
strip(<<-MESSAGE)
Parsing of original source failed:
#{original_source}
MESSAGE
end
# Report unparser bug
#
# @return [String]
#
# @api private
#
def report_unparser
message = ['Unparsing parsed AST failed']
error = generated.error
message << error
error.backtrace.take(20).each(&message.method(:<<))
message << 'Original-AST:'
message << original_ast.inspect
message.join("\n")
end
# Return error report for parsing generated
#
# @return [String]
#
# @api private
#
def report_generated
strip(<<-MESSAGE)
Parsing of generated source failed:
Original-source:
#{original_source}
Original-AST:
#{original_ast.inspect}
Source:
#{generated.source}
MESSAGE
end
# Return error report with AST difference
#
# @return [String]
#
# @api private
#
def report_with_ast_diff
strip(<<-MESSAGE)
#{ast_diff}
Original-Source:\n#{original_source}
Original-AST:\n#{original_ast.inspect}
Generated-Source:\n#{generated.source}
Generated-AST:\n#{generated_ast.inspect}
MESSAGE
end
# Return ast diff
#
# @return [String]
#
# @api private
#
def ast_diff
Differ.call(
original_ast.inspect.lines.map(&:chomp),
generated_ast.inspect.lines.map(&:chomp)
)
end
# Return generated AST
#
# @return [Parser::AST::Node]
#   if parser was successful for generated ast
#
# @return [nil]
# otherwise
#
# @api private
#
def generated_ast
generated.success? && Preprocessor.run(Unparser.parse(generated.source))
rescue Parser::SyntaxError
nil
end
memoize :generated_ast
# Return original AST
#
# @return [Parser::AST::Node]
#
# @api private
#
def original_ast
Preprocessor.run(Unparser.parse(original_source))
rescue Parser::SyntaxError
nil
end
memoize :original_ast
# CLI source from string
class String < self
include Concord.new(:original_source)
# Return identification
#
# @return [String]
#
# @api private
#
def identification
'(string)'
end
end # String
# CLI source from file
class File < self
include Concord.new(:file_name)
# Return identification
#
# @return [String]
#
# @api private
#
def identification
"(#{file_name})"
end
private
# Return original source
#
# @return [String]
#
# @api private
#
def original_source
::File.read(file_name)
end
memoize :original_source
end # File
# Source passed in as node
class Node < self
include Concord.new(:original_ast)
# Return original source
#
# @return [String]
#
# @api private
#
def original_source
Unparser.unparse(original_ast)
end
memoize :original_source
end # Node
end # Source
end # CLI
end # Unparser
|
// Copyright © 2018 Drew J. Sonne <[email protected]>
//
// This program is free software: you can redistribute it and/or modify
// it under the terms of the GNU Lesser General Public License as published by
// the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
//
// This program is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU Lesser General Public License for more details.
//
// You should have received a copy of the GNU Lesser General Public License
// along with this program. If not, see <http://www.gnu.org/licenses/>.
package cmd
import (
"fmt"
"os"
"context"
apiclient "github.com/getlunaform/lunaform/client"
"github.com/go-openapi/runtime"
httptransport "github.com/go-openapi/runtime/client"
"github.com/go-openapi/strfmt"
"github.com/mitchellh/go-homedir"
"github.com/spf13/cobra"
jww "github.com/spf13/jwalterweatherman"
"github.com/spf13/viper"
"strings"
)
const (
TERRAFORM_SERVER_TYPE_V1 = "application/vnd.lunaform.v1+json"
TERRAFORM_SERVER_AUTH_HEADER = "X-Lunaform-Auth"
)
var cfgFile string
var useHal bool
var lunaformClient *apiclient.Lunaform
var config Configuration
var version string
var authHandler runtime.ClientAuthInfoWriterFunc
var logLevelMapping = map[string]jww.Threshold{
"TRACE": jww.LevelTrace,
"DEBUG": jww.LevelDebug,
"INFO": jww.LevelInfo,
"WARN": jww.LevelWarn,
"ERROR": jww.LevelError,
"CRITICAL": jww.LevelCritical,
"FATAL": jww.LevelFatal,
}
type Configuration struct {
Host string
Port string
Schemes []string
Log struct {
Level string
}
ApiKey string
}
// rootCmd represents the base command when called without any subcommands
var rootCmd = &cobra.Command{
Use: "lunaform",
Short: "A commandline application to interact with lunaform",
Long: `A commandline client to perform operations on 'lunaform'.
These include module, and stack deployment, as well as user and permission management.
For example:
$ lunaform auth users list
$ lunaform tf modules list
$ lunaform tf modules create \
--name my-module \
--type git \
--source [email protected]:zeebox/my-module.git
`,
Version: version,
}
// Execute adds all child commands to the root command and sets flags appropriately.
// This is called by main.main(). It only needs to happen once to the rootCmd.
func Execute() {
if err := rootCmd.Execute(); err != nil {
fmt.Println(err)
os.Exit(1)
}
}
func init() {
cobra.OnInitialize(initConfig)
cobra.OnInitialize(initLogging)
cobra.OnInitialize(initGocdClient)
cobra.OnInitialize(initAuthHandler)
// Here you will define your flags and configuration settings.
// Cobra supports persistent flags, which, if defined here,
// will be global for your application.
rootCmd.PersistentFlags().StringVar(&cfgFile, "config", "", "config file (default is $HOME/.config/lunaform.yaml)")
rootCmd.PersistentFlags().BoolVar(&useHal, "hal", false, "draw HAL elements in response")
}
// initConfig reads in config file and ENV variables if set.
func initConfig() {
if cfgFile != "" {
// Use config file from the flag.
viper.SetConfigFile(cfgFile)
} else {
// Find home directory.
home, err := homedir.Dir()
if err != nil {
fmt.Println(err)
os.Exit(1)
}
// Search config in home directory with name ".client" (without extension).
viper.AddConfigPath(home + "/.config/")
viper.SetConfigName("lunaform")
}
viper.AutomaticEnv() // read in environment variables that match
// If a config file is found, read it in.
if err := viper.ReadInConfig(); err == nil {
if doLogging() {
fmt.Println("Using config file:", viper.ConfigFileUsed())
}
}
viper.Unmarshal(&config)
}
func initLogging() {
if doLogging() {
jww.SetLogThreshold(logLevel())
}
}
func initGocdClient() {
cfg := apiclient.DefaultTransportConfig().
WithHost(config.Host + ":" + config.Port).
WithSchemes(config.Schemes)
transport := httptransport.New(cfg.Host, cfg.BasePath, cfg.Schemes)
transport.Context = context.Background()
if doLogging() {
if logLevel() < jww.LevelDebug {
transport.Debug = true
}
}
transport.Producers[TERRAFORM_SERVER_TYPE_V1] = runtime.JSONProducer()
transport.Consumers[TERRAFORM_SERVER_TYPE_V1] = runtime.JSONConsumer()
lunaformClient = apiclient.New(transport, strfmt.Default)
}
func initAuthHandler() {
authHandler = func(request runtime.ClientRequest, reg strfmt.Registry) (err error) {
return request.SetHeaderParam(TERRAFORM_SERVER_AUTH_HEADER, config.ApiKey)
}
}
func logLevel() jww.Threshold {
return getJwwLogLevel(config.Log.Level)
}
func doLogging() bool {
return config.Log.Level != ""
}
func getJwwLogLevel(level string) jww.Threshold {
	return logLevelMapping[strings.ToUpper(level)]
}
|
using System;
namespace Roslyn.Utilities
{
internal static class NumericExtensions
{
// Suggested by Jon Skeet.
private static readonly long DoubleNegativeZeroBits = BitConverter.DoubleToInt64Bits(-0d);
public static bool IsNegativeZero(this double d)
{
return BitConverter.DoubleToInt64Bits(d) == DoubleNegativeZeroBits;
}
public static bool IsNegativeZero(this float f)
{
return BitConverter.DoubleToInt64Bits((double)f) == DoubleNegativeZeroBits;
}
public static bool IsNegativeZero(this decimal d)
{
return BitConverter.DoubleToInt64Bits((double)d) == DoubleNegativeZeroBits;
}
}
} |
package fiofoundation.io.fiosdk.models.fionetworkprovider.request
import com.google.gson.annotations.SerializedName
class GetFeeRequest (@field:SerializedName("end_point") var endPoint: String,
@field:SerializedName("fio_address") var fioAddress: String) : FIORequest() |
use std::env;
use std::fs::OpenOptions;
use std::io;
use std::io::prelude::*;
use std::path::PathBuf;
use std::process::exit;
use std::process::Command;
// ====================
// === WrapLauncher ===
// ====================
/// Run the wrapped launcher overriding its reported version to the provided version.
///
/// The launcher's executable location is also overridden to point to this executable. The launcher
/// is passed all the original arguments plus the arguments that handle the version and location
/// override. The location of the original launcher executable that is wrapped is determined by the
/// environment variable `ENSO_LAUNCHER_LOCATION` that should be set at build-time.
///
/// Additionally, the wrapper appends to a log file called `.launcher_version_log` a line containing
/// the version string that was launched (creating the file if necessary). This can be used by tests
/// to verify the order of launched versions.
pub fn wrap_launcher(version: impl AsRef<str>) {
let args: Vec<String> = env::args().collect();
let missing_location_message = "`ENSO_LAUNCHER_LOCATION` is not defined.";
let launcher_location = env::var("ENSO_LAUNCHER_LOCATION").expect(missing_location_message);
let current_exe_path = env::current_exe().expect("Cannot get current executable path.");
let exe_location = match current_exe_path.to_str() {
Some(str) => str,
None => {
eprintln!("Path {} is invalid.", current_exe_path.to_string_lossy());
exit(1)
}
};
let missing_directory_message = "Executable path should have a parent directory.";
let parent_directory = current_exe_path.parent().expect(missing_directory_message);
let log_name = ".launcher_version_log";
let log_path = parent_directory.join(log_name);
append_to_log(log_path, version.as_ref().to_string()).expect("Cannot write to log.");
let override_args = [
String::from("--internal-emulate-version"),
version.as_ref().to_string(),
String::from("--internal-emulate-location"),
String::from(exe_location),
];
let modified_args = [&override_args[..], &args[1..]].concat();
let exit_status = Command::new(launcher_location).args(modified_args).status();
let exit_code = match exit_status {
Ok(status) =>
if let Some(code) = status.code() {
code
} else {
eprintln!("Process terminated by signal.");
exit(1)
},
Err(error) => {
eprintln!("{}", error);
exit(1)
}
};
exit(exit_code)
}
// === Log ===
/// Appends a line to the file located at the provided path.
pub fn append_to_log(path: PathBuf, line: impl AsRef<str>) -> io::Result<()> {
let mut log_file = OpenOptions::new().create(true).write(true).append(true).open(path)?;
writeln!(log_file, "{}", line.as_ref())?;
Ok(())
}
|
# ApplicationResource is similar to ApplicationRecord - a base class that
# holds configuration/methods for subclasses.
# All Resources should inherit from ApplicationResource.
class ApplicationResource < Graphiti::Resource
# Use the ActiveRecord Adapter for all subclasses.
# Subclasses can still override this default.
self.abstract_class = true
self.adapter = Graphiti::Adapters::ActiveRecord
self.base_url = Rails.application.routes.default_url_options[:host]
self.endpoint_namespace = '/api/v1'
  # Endpoint validation does not play well with nested routes,
  # so it is disabled here; everything works fine without it.
self.validate_endpoints = false
end
|
# ve-tartufo
An Italian version of Truffle, made for VeChain.
## Installation
```sh
$ npm install --save git+ssh://[email protected]/efebia-com/ve-tartufo.git
```
## API
```js
const { init, compile, deploy } = require('ve-tartufo');
```
### How to use it:
```js
const { init, compile, deploy } = require("ve-tartufo");
//remember to use our Web3: the web3 version suggested for Thor is "1.0.0-beta.37"
const main = async _ => {
try {
//give the path to your contract source and compile it
const fullPath = 'contracts/MainContract.sol';
const { bytecode, abi } = compile(fullPath);
//connect to your local VeChain
const blockchainURL = "http://localhost:8669";
const web3 = init(blockchainURL);
//add one of the 10 wallets available
const privateKey = "0xdce1443bd2ef0c2631adc1c67e5c93f13dc23a41c18b536effbbdcbcdb96fb65";
const newAccount = web3.eth.accounts.privateKeyToAccount(privateKey);
web3.eth.accounts.wallet.add(newAccount);
//deploy
const from = web3.eth.accounts.wallet[0].address;
const { options: { address } } = await deploy({ web3, bytecode, abi, from, gas: 10000000, gasPrice: 1 });
console.log("deployed:", address);
//call your smart contract methods!
const contract = new web3.eth.Contract(abi, address);
const r1 = await contract.methods.yourMethod().call();
const r2 = await contract.methods.yourMethod('whatever').send({from, gas: 1000000, gasPrice: 1 });
} catch ({ stack }) {
console.log(stack);
}
}
main();
```
## License
[MIT](LICENSE)
|
import 'package:cinema_ticket_maker/api/settings.dart';
import 'package:flutter/cupertino.dart';
import 'package:flutter/material.dart';
import 'package:flutter/services.dart';
Widget newRowForLayoutDialog(BuildContext context) {
final TextEditingController rowIdentifier = TextEditingController();
final TextEditingController numberOfSeats = TextEditingController(text: "1");
return Scaffold(
appBar: AppBar(
title: const Text("Add new row"),
),
body: SafeArea(
child: Column(
children: [
const SizedBox(
height: 80,
),
const Text("Row identifier"),
TextField(
controller: rowIdentifier,
textAlign: TextAlign.center,
),
const SizedBox(
height: 80,
),
const Text("Number of seats"),
TextField(
controller: numberOfSeats,
textAlign: TextAlign.center,
inputFormatters: [FilteringTextInputFormatter.digitsOnly],
keyboardType: TextInputType.number,
),
],
),
),
bottomSheet: Padding(
padding: const EdgeInsets.all(20),
child: Row(
mainAxisAlignment: MainAxisAlignment.spaceAround,
children: [
TextButton(
style: TextButton.styleFrom(
backgroundColor: Colors.red,
),
onPressed: () {
Navigator.of(context).pop();
},
child: const Text(
"Cancel",
style: TextStyle(color: Colors.black),
),
),
TextButton(
style: TextButton.styleFrom(
backgroundColor: Colors.green,
),
onPressed: () async {
if (rowIdentifier.text.isEmpty) {
ScaffoldMessenger.of(context).showSnackBar(
const SnackBar(
content: Text("The row identifier cannot be empty"),
backgroundColor: Colors.red,
),
);
return;
}
              final nS = int.tryParse(numberOfSeats.text);
              if (nS == null || nS <= 0) {
ScaffoldMessenger.of(context).showSnackBar(
const SnackBar(
content:
Text("The number of seats cannot be empty or <= 0"),
backgroundColor: Colors.red,
),
);
return;
}
Settings.cinemaLayout.addRow(
rowIdentifier.text,
nS,
);
await Settings.updateCinemaLayout();
Navigator.of(context).pop();
},
child: const Text(
"Add",
style: TextStyle(color: Colors.black),
),
),
],
),
),
);
}
Widget deleteRowForLayoutDialog(BuildContext context) {
return Scaffold(
appBar: AppBar(
title: const Text("Delete a row"),
),
body: SafeArea(
child: StatefulBuilder(
builder: (context, setState) => Settings.cinemaLayout.rows.isEmpty
? const Center(
child: Text("Oops there aren't any rows"),
)
: ListView(
padding: const EdgeInsets.all(20),
children: Settings.cinemaLayout.rows
.map(
(e) => Row(
mainAxisAlignment: MainAxisAlignment.spaceBetween,
children: [
Text("Row identifier: ${e.rowIdentifier}"),
Text("Row length: ${e.length}"),
IconButton(
onPressed: () async {
Settings.cinemaLayout.rows.remove(e);
await Settings.updateCinemaLayout();
setState(() {});
},
icon: const Icon(Icons.delete),
),
],
),
)
.toList(),
),
),
),
);
}
|
export default {
oUserInfo: state => state.user.oUserInfo,
oCompanyInfo: state => state.info.oCompanyInfo,
oBasicInfo: state => state.info.oBasicInfo,
oFileInfo: state => state.info.oFileInfo
}; |
import axios from 'axios'
import getCurrentCredentials from './getCredentials'
const baseURL = process.env.REACT_APP_API_URL
const toBase64 = file => new Promise((resolve, reject) => {
const reader = new FileReader()
reader.readAsDataURL(file)
reader.onload = () => resolve(reader.result)
reader.onerror = error => reject(error)
})
const fetchRecipes = async () => {
let response = await axios.get(baseURL + 'recipes')
return response.data.recipes
}
const searchRecipes = async (query) => {
try {
let response = await axios.post(baseURL + 'search',
{
q: query
}
)
return response.data.recipes
} catch(error) {
return {
errorMessage: error.response.data.message
}
}
}
const fetchCurrentUsersRecipes = async () => {
let response = await axios.get(baseURL + 'recipes?user_recipe=true',
{
headers: getCurrentCredentials()
}
)
return response.data.recipes
}
const submitRecipe = async (title, description, ingredients, directions, image) => {
try {
let encodedImage, recipeParams
recipeParams = {
title: title,
description: description,
ingredients: ingredients,
directions: directions
}
if (image) {
encodedImage = await toBase64(image)
recipeParams.image = encodedImage
}
let response = await axios.post(baseURL + 'recipes',
{
recipe: recipeParams
},
{
headers: getCurrentCredentials()
}
)
return {
message: response.data.message,
}
} catch(error) {
return {
error: error.response.data.error_message || error.message
}
}
}
const editRecipe = async (title, description, ingredients, directions, image, recipeId) => {
try {
let encodedImage, recipeParams
recipeParams = {
title: title,
description: description,
ingredients: ingredients,
directions: directions
}
if (image) {
encodedImage = await toBase64(image)
recipeParams.image = encodedImage
}
let response = await axios.put(`${baseURL}/recipes/${recipeId}`,
{
recipe: recipeParams
},
{
headers: getCurrentCredentials()
}
)
return {
message: response.data.message,
}
} catch(error) {
return {
error: error.response.data.error_message || error.message
}
}
}
const forkRecipe = async (title, description, ingredients, directions, image, recipeId) => {
try {
let encodedImage, recipeParams
recipeParams = {
title: title,
description: description,
ingredients: ingredients,
directions: directions
}
if (image) {
encodedImage = await toBase64(image)
recipeParams.image = encodedImage
}
let response = await axios.post(`${baseURL}/recipes/${recipeId}/fork`,
{
recipe: recipeParams
},
{
headers: getCurrentCredentials()
}
)
return {
recipeId : response.data.forked_recipe_id,
message: response.data.message
}
} catch(error) {
return {
error: error.response.data.error_message || error.message
}
}
}
const getSingleRecipe = async (recipeId) => {
try {
let response = await axios.get(`${baseURL}/recipes/${recipeId}`,
{
headers: getCurrentCredentials()
}
)
return {
recipe: response.data.recipe
}
} catch(error) {
return {
error: error.response.data.error_message
}
}
}
export { fetchRecipes, submitRecipe, getSingleRecipe, editRecipe, forkRecipe, fetchCurrentUsersRecipes, searchRecipes } |
import sbt.Keys.libraryDependencies
name := "schedule"
version := "0.1"
scalaVersion := "2.12.8"
lazy val domian = project
.settings(name := "domian", set)
.dependsOn(macros)
lazy val macros = project
.settings(name := "macros", set)
lazy val parser = project
.settings(name := "parser", set)
.dependsOn(domian)
.enablePlugins(BuildInfoPlugin)
.enablePlugins(LauncherJarPlugin)
.settings(
buildInfoKeys := Seq[BuildInfoKey](name, version, scalaVersion, sbtVersion),
buildInfoPackage := "space.littleinferno.schedule.parser"
)
lazy val server = project
.settings(name := "server", set)
.dependsOn(domian)
.enablePlugins(JavaAppPackaging)
.enablePlugins(DockerPlugin)
lazy val set = Seq(
libraryDependencies ++= Dependencies.circe,
libraryDependencies ++= Dependencies.circeConfig,
libraryDependencies ++= Dependencies.enumeratum,
libraryDependencies ++= Dependencies.cats,
libraryDependencies ++= Dependencies.derevo,
libraryDependencies ++= Dependencies.doobie,
libraryDependencies ++= Dependencies.osLib,
libraryDependencies ++= Dependencies.apacheIO,
libraryDependencies += "ru.tinkoff" %% "typed-schema" % "0.11.0-RC1",
libraryDependencies += "de.heikoseeberger" %% "akka-http-circe" % "1.25.2",
libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.3.0-alpha4",
libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.9.2",
libraryDependencies += "com.pauldijou" %% "jwt-core" % "3.0.0",
libraryDependencies += "com.pauldijou" %% "jwt-circe" % "3.0.0",
addCompilerPlugin("org.scalamacros" % "paradise" % "2.1.1" cross CrossVersion.full),
libraryDependencies += "io.estatico" %% "newtype" % "0.4.3",
libraryDependencies ++= Seq("org.apache.poi" % "poi" % "4.1.0", "org.apache.poi" % "poi-ooxml" % "4.1.0"),
libraryDependencies += "org.parboiled" %% "parboiled" % "2.1.7",
libraryDependencies ++= Seq(
"org.backuity.clist" %% "clist-core" % "3.5.1",
"org.backuity.clist" %% "clist-macros" % "3.5.1" % "provided"
),
libraryDependencies += "org.backuity" %% "ansi-interpolator" % "1.1.0" % "provided",
libraryDependencies += "io.7mind.izumi" %% "fundamentals-bio" % "0.8.6",
resolvers += Resolver.sonatypeRepo("releases"),
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.8",
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.8" % "test",
libraryDependencies += "org.scalaj" %% "scalaj-http" % "2.4.2",
libraryDependencies += "org.webjars.npm" % "swagger-ui-dist" % "3.23.3",
libraryDependencies += "dev.zio" %% "zio" % "1.0.0-RC11-1",
libraryDependencies += "dev.zio" %% "zio-interop-cats" % "2.0.0.0-RC2",
libraryDependencies += "de.vandermeer" % "asciitable" % "0.3.2"
)
enablePlugins(JavaAppPackaging)
//libraryDependencies in ThisBuild ++= (macroParadise ++ betterFor).map(compilerPlugin)
|
package login
import android.content.Context
import android.content.Intent
import com.google.android.gms.auth.api.signin.GoogleSignIn
import com.google.android.gms.auth.api.signin.GoogleSignInAccount
import com.google.android.gms.auth.api.signin.GoogleSignInClient
import com.google.android.gms.auth.api.signin.GoogleSignInOptions
import com.google.android.gms.common.api.ApiException
import com.google.firebase.auth.FirebaseAuth
import com.google.firebase.auth.GoogleAuthProvider
import com.onurcan.exovideoreference.R
import com.onurcan.exovideoreference.helper.Constants
import com.onurcan.exovideoreference.ui.contracts.*
import com.onurcan.exovideoreference.utils.showLogError
import com.onurcan.exovideoreference.utils.showLogInfo
import com.onurcan.exovideoreference.utils.showToast
import com.onurcan.exovideoreference.utils.toSafeString
class HmsGmsLoginHelper(context: Context) : LoginHelper {
private val googleRequestCode = 1994
private val cntx = context
private val gso: GoogleSignInOptions by lazy {
GoogleSignInOptions.Builder(GoogleSignInOptions.DEFAULT_SIGN_IN)
.requestIdToken(context.getString(R.string.default_web_client_id))
.requestId()
.requestEmail()
.build()
}
private val googleSignInClient: GoogleSignInClient by lazy {
GoogleSignIn.getClient(context, gso)
}
private lateinit var loginHelperCallback: LoginHelperCallback
override fun onLoginClick(loginType: LoginType) {
when (loginType) {
LoginType.GOOGLE_HUAWEI -> onGoogleLogInClick()
LoginType.EMAIL -> onEmailLogInClick()
else -> onGoogleLogInClick()
}
}
private fun onGoogleLogInClick() {
loginHelperCallback.redirectToSignIn(googleSignInClient.signInIntent, googleRequestCode)
}
private fun onEmailLogInClick() {
loginHelperCallback.redirectToEmail()
}
override fun checkSilentSignIn() {
checkGoogleSilentSignIn()
}
private fun checkGoogleSilentSignIn() {
googleSignInClient.silentSignIn().addOnSuccessListener { googleAccount ->
firebaseAuthWithGoogle(googleAccount, isSilentLogin = true)
}.addOnCanceledListener {
loginHelperCallback.onSilentSignInFail()
}
}
private fun firebaseAuthWithGoogle(acct: GoogleSignInAccount, isSilentLogin: Boolean = false) {
val auth = FirebaseAuth.getInstance()
val credential = GoogleAuthProvider.getCredential(acct.idToken, null)
        auth.signInWithCredential(credential).addOnCompleteListener { task ->
if (task.isSuccessful) {
val user = auth.currentUser
val loginUserData = LoginUserData(
userId = user?.uid.toSafeString(),
loginType = LoginType.GOOGLE
)
val loginUserInfoData = LoginUserInfoData(
nameSurname = user?.displayName.toSafeString(),
email = user?.email.toSafeString(),
photoUrl = user?.photoUrl.toString(),
userId = user?.uid.toSafeString()
)
showLogInfo(Constants.mHmsLogin, "SignInWithCredential:success")
if (isSilentLogin)
loginHelperCallback.onSilentSignInSuccess(loginUserData)
else
loginHelperCallback.onLoginSuccess(loginUserData, loginUserInfoData)
} else {
loginHelperCallback.onLoginFail("Firebase Google Auth Fail")
}
}
}
override fun onDataReceived(requestCode: Int, resultCode: Int, data: Intent?) {
when (requestCode) {
googleRequestCode -> onGoogleSignInDataReceived(data)
}
}
override fun setCallback(loginHelperCallback: LoginHelperCallback) {
this.loginHelperCallback = loginHelperCallback
}
override fun sendVerificationCode(email: String, buttonString: String, interval: Int) {
val btn = buttonString
val auth = FirebaseAuth.getInstance()
val user = auth.currentUser
user?.sendEmailVerification()?.addOnCompleteListener {
if (it.isSuccessful) {
                showToast(cntx, "Please wait for the verification code, then type it and press Login")
}
}
}
override fun onLoginEmail(email: String, password: String, verifyCode: String) {
}
private fun onGoogleSignInDataReceived(data: Intent?) {
val task = GoogleSignIn.getSignedInAccountFromIntent(data)
try {
val account = task.getResult(ApiException::class.java)
firebaseAuthWithGoogle(account!!)
} catch (e: ApiException) {
showLogError(Constants.mHmsLogin, "Google SignIn Failed: $e")
}
}
} |
import * as actionTypes from '../constants/index';
const setBreadCrumb = data => {
return {
type: actionTypes.SET_BREADCRUMB,
data
};
};
const setTags = data => {
return {
type: actionTypes.SET_TAGS,
data
};
};
const setTheme = data => {
return {
type: actionTypes.SET_THEME,
data
};
};
const setCollapse = data => {
return {
type: actionTypes.SET_COLLAPSE,
data
};
};
export { setBreadCrumb, setTags, setTheme, setCollapse };
|
module Llama
class Component
def respond(message)
      message = case self
                when Llama::Producer::Base then produce(message)
                when Llama::Consumer::Base then consume(message)
                else process(message)
                end
return message
end
def producer?
false
end
def consumer?
false
end
def polling?
false
end
def evented?
false
end
end
end
|
using LinqInfer.Data.Serialisation;
using LinqInfer.Maths;
using System;
using System.Collections.Generic;
using System.Threading.Tasks;
namespace LinqInfer.Learning.Classification.NeuralNetworks
{
public interface INetworkSignalFilter : IImportableFromDataDocument, IExportableAsDataDocument, IPropagatedOutput
{
        string Id { get; }
/// <summary>
/// Returns all input sources (reoccurring and predecessors)
/// </summary>
IEnumerable<INetworkSignalFilter> Inputs { get; }
/// <summary>
/// Executes an action through the network, pushing it forward to successive modules
/// </summary>
void ForwardPropagate(Action<INetworkSignalFilter> work);
/// <summary>
/// Executes an action through the network, pushing it backward to previous modules
/// </summary>
Task BackwardPropagate(Vector error);
/// <summary>
/// Enqueues input to be processed
/// </summary>
void Receive(IVector input);
/// <summary>
/// Resets the state back to empty (clears the last outputs)
/// </summary>
void Reset();
}
} |
/// TICKSPERQUARTERNOTE is the number of "ticks" (time measurement in the MIDI file) that
/// corresponds to one quarter note. This number is somewhat arbitrary, but should
/// be chosen to provide adequate temporal resolution.
const int TICKSPERQUARTERNOTE = 960;
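// Illustrative helper (an assumption for this sketch, not used elsewhere in this
// file): convert a duration expressed in quarter notes into ticks at the resolution
// above; for example, an eighth note (0.5 quarter notes) maps to 480 ticks.
int quarterNotesToTicks(double quarterNotes) =>
    (quarterNotes * TICKSPERQUARTERNOTE).round();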
Map controllerEventTypes = {'pan': 0x0a};
/// Accidental mode (see AccidentalMode below)
const int MAJOR = 0;
/// Accidental mode (see AccidentalMode below)
const int MINOR = 1;
/// Accidental type (see AccidentalType below)
const int SHARPS = 1;
/// Accidental type (see AccidentalType below)
const int FLATS = -1;
class AccidentalType {
static int SHARPS = 1;
static int FLATS = -1;
}
class AccidentalMode {
static int MAJOR = 0;
static int MINOR = 1;
}
|
#!/bin/sh
BASEDIR=$(dirname "$0")
nodeLocation="$BASEDIR/data/bin/node"
jsLoc="$BASEDIR/data/xbmc_remote.js"
"$nodeLocation" --harmony "$jsLoc"
|
# Muzich
Web app where friends or like-minded people can join in to share and listen to their favourite 'muzich' in real time.
|
declare module Qowaiv {
class Guid implements IEquatable, IFormattable, IJsonStringifyable {
constructor();
private v;
toString(): string;
format(f?: string): string;
toJSON(): string;
version(): number;
equals(other: any): boolean;
static fromJSON(s: string): Guid;
static isValid(s: string): boolean;
static parse(s: string): Guid;
private static strip;
static empty(): Guid;
static newGuid(seed?: Guid): Guid;
private static rndGuid;
}
}
interface IEquatable {
equals(other: any): boolean;
}
interface IFormattable {
toString(): string;
format(f: string): string;
}
interface IJsonStringifyable {
toJSON(): string;
}
declare module Qowaiv {
class TimeSpan implements IEquatable, IFormattable, IJsonStringifyable {
private static pattern;
private v;
constructor(d?: number, h?: number, m?: number, s?: number, f?: number);
private num;
getDays(): number;
getHours(): number;
getMinutes(): number;
getSeconds(): number;
getMilliseconds(): number;
getTotalDays(): number;
getTotalHours(): number;
getTotalMinutes(): number;
getTotalSeconds(): number;
getTotalMilliseconds(): number;
multiply(factor: number): TimeSpan;
divide(factor: number): TimeSpan;
toString(): string;
format(format?: string): string;
toJSON(): string;
static fromJSON(s: string): TimeSpan;
equals(other: any): boolean;
static isValid(s: string): boolean;
static parse(str: string): TimeSpan;
static fromSeconds(seconds: number): TimeSpan;
}
}
|
export const pageTheme = {
darkMode: false,
};
export const taskStore = {
tasks: [],
};
|
package compose.examples
import compose.core._
import scalajs.js.annotation.JSExport
trait TwelveBarBlues {
import Score._
import Pitch._
@JSExport
val twelveBarBlues = {
val bar =
( E3.q | B3.q ) ~
( E3.s | B3.s ) ~
( E3.q | Cs4.q ) ~
( E3.s | B3.s )
(bar transpose 0 repeat 4) ~
(bar transpose 5 repeat 2) ~
(bar transpose 0 repeat 2) ~
(bar transpose 7 repeat 1) ~
(bar transpose 5 repeat 1) ~
(bar transpose 0 repeat 2)
}
}
|
import Position.BOTTOM
import Position.BOTTOM_LEFT
import Position.BOTTOM_RIGHT
import Position.MIDDLE
import Position.TOP
import Position.TOP_LEFT
import Position.TOP_RIGHT
import java.io.File
import java.util.EnumSet
fun main() {
val records = File("input/1.txt").readLines().map { line ->
line.split(" | ", limit = 2).map { it.split(' ') }.let { Record(it[0], it[1]) }
}
part1(records)
part2(records)
}
fun part1(records: List<Record>) {
@Suppress("ConvertLambdaToReference")
val segmentCountToDigits = DIGITS
.map { it.value to it.positions.size }
.groupBy { it.second }
.mapValues { it.value.map { e -> e.first } }
val uniqueSegmentCountToDigits = segmentCountToDigits
.filterValues { it.size == 1 }
.mapValues { it.value.single() }
val simpleDigitCount = records.sumOf { r -> r.output.count { it.length in uniqueSegmentCountToDigits } }
println("Total appearances of 1,4,7,8: $simpleDigitCount")
}
fun part2(records: List<Record>) {
println("Total output: ${records.sumOf(Record::deduceOutput)}")
}
data class Digit(val value: Int, val positions: EnumSet<Position>)
enum class Position {
TOP,
TOP_LEFT,
TOP_RIGHT,
MIDDLE,
BOTTOM_LEFT,
BOTTOM_RIGHT,
BOTTOM
}
private val DIGITS = listOf(
Digit(0, EnumSet.of(TOP, TOP_LEFT, TOP_RIGHT, BOTTOM_LEFT, BOTTOM_RIGHT, BOTTOM)),
Digit(1, EnumSet.of(TOP_RIGHT, BOTTOM_RIGHT)),
Digit(2, EnumSet.of(TOP, TOP_RIGHT, MIDDLE, BOTTOM_LEFT, BOTTOM)),
Digit(3, EnumSet.of(TOP, TOP_RIGHT, MIDDLE, BOTTOM_RIGHT, BOTTOM)),
Digit(4, EnumSet.of(TOP_LEFT, TOP_RIGHT, MIDDLE, BOTTOM_RIGHT)),
Digit(5, EnumSet.of(TOP, TOP_LEFT, MIDDLE, BOTTOM_RIGHT, BOTTOM)),
Digit(6, EnumSet.of(TOP, TOP_LEFT, MIDDLE, BOTTOM_LEFT, BOTTOM_RIGHT, BOTTOM)),
Digit(7, EnumSet.of(TOP, TOP_RIGHT, BOTTOM_RIGHT)),
Digit(8, EnumSet.of(TOP, TOP_LEFT, TOP_RIGHT, MIDDLE, BOTTOM_LEFT, BOTTOM_RIGHT, BOTTOM)),
Digit(9, EnumSet.of(TOP, TOP_LEFT, TOP_RIGHT, MIDDLE, BOTTOM_RIGHT, BOTTOM)),
)
data class Record(val patterns: List<String>, val output: List<String>) {
private fun deduceMapping(): Map<Char, Position> {
val mapping = mutableMapOf<Char, Position>()
// The digit 1 is made of the top right and bottom right segment.
val patternFor1 = patterns.first { it.length == 2 }
// We can deduce the mapping for both right segments by counting how many times each segment
// appears among all of the digits (top right appears in 8 digits, bottom right in 9 digits).
if (patterns.count { patternFor1[0] in it } == 8) {
mapping[patternFor1[0]] = TOP_RIGHT
mapping[patternFor1[1]] = BOTTOM_RIGHT
}
else {
mapping[patternFor1[1]] = TOP_RIGHT
mapping[patternFor1[0]] = BOTTOM_RIGHT
}
// The digit 7 is made of both right segments, and the top segment.
val patternFor7 = patterns.first { it.length == 3 }
// We can deduce the mapping for the top segment,
// since it is the only segment which is not shared with the digit 1.
mapping[patternFor7.first { it !in patternFor1 }] = TOP
// The digit 4 is made of both right segments, the top left segment, and the middle segment.
// Both right segments are already deduced, so we only want the remaining two segments.
val patternFor4MinusDeduced = patterns.first { it.length == 4 }.toSet().minus(patternFor1.toSet()).toTypedArray()
// We can deduce the mapping for the top left and middle segments by counting how many times each segment
// appears among all of the digits (top left appears in 6 digits, middle in 7 digits).
if (patterns.count { patternFor4MinusDeduced[0] in it } == 6) {
mapping[patternFor4MinusDeduced[0]] = TOP_LEFT
mapping[patternFor4MinusDeduced[1]] = MIDDLE
}
else {
mapping[patternFor4MinusDeduced[1]] = TOP_LEFT
mapping[patternFor4MinusDeduced[0]] = MIDDLE
}
// The digit 8 uses all seven segments, so we use it and remove the five segments we already know,
// keeping the remaining two segments (bottom left, and bottom).
val remainingSegments = patterns.first { it.length == 7 }.toSet().minus(mapping.keys.toSet()).toTypedArray()
// We can deduce the mapping for the bottom left and bottom segments by counting how many times each segment
// appears among all of the digits (bottom left appears in 4 digits, bottom in 7 digits).
if (patterns.count { remainingSegments[0] in it } == 4) {
mapping[remainingSegments[0]] = BOTTOM_LEFT
mapping[remainingSegments[1]] = BOTTOM
}
else {
mapping[remainingSegments[1]] = BOTTOM_LEFT
mapping[remainingSegments[0]] = BOTTOM
}
return mapping
}
fun deduceOutput(): Int {
val mapping = deduceMapping()
return output.fold(0) { total, digit ->
val positions = digit.map(mapping::getValue).toSet()
val value = DIGITS.first { it.positions == positions }.value
(total * 10) + value
}
}
}
|
### To recreate the Blue Button client certs for HTTP TLS:
```
mkcert -cert-file lfh-bluebutton-client.pem -key-file lfh-bluebutton-client.key localhost 127.0.0.1
```
|
import 'dart:ui';
import 'package:frontend/data/api.dart';
import 'package:frontend/data/node_api.dart';
import 'package:frontend/data/node_model.dart';
class Repository {
List<NodeModel> _nodes;
Future<List<NodeModel>> getNodes() async {
if (_nodes == null) {
_nodes = (await fetchNodes()).nodes;
}
return _nodes;
}
Future<List<String>> getNodeEffects() async {
var nodes = await getNodes();
if (nodes.isNotEmpty) {
return nodes[0].effects;
} else {
return [];
}
}
Future<List<String>> getNodePalettes() async {
var nodes = await getNodes();
if (nodes.isNotEmpty) {
return nodes[0].palettes;
} else {
return [];
}
}
void selectNodeEffect(int effectId) async {
(await getNodes()).where((node) => node.isSelected).forEach((node) {
node.setEffect(effectId);
});
}
void selectNodePalette(int paletteId) async {
(await getNodes()).where((node) => node.isSelected).forEach((node) {
node.setPalette(paletteId);
});
}
void setNodeColor(Color color) async {
(await getNodes()).where((node) => node.isSelected).forEach((node) {
node.setColor(color);
});
}
}
|
package com.tvd12.ezyhttp.server.core.constant;
public final class PropertyNames {
public static final String ASYNC_DEFAULT_TIMEOUT = "async.default_timeout";
public static final String DEBUG = "server.debug";
public static final String ALLOW_OVERRIDE_URI = "server.allow_override_uri";
public static final String SERVER_PORT = "server.port";
public static final String MANAGEMENT_ENABLE = "management.enable";
public static final String MANAGEMENT_PORT = "management.port";
public static final String MANAGEMENT_URIS_EXPOSE = "management.uris_expose";
public static final String RESOURCE_ENABLE = "resources.enable";
public static final String RESOURCE_LOCATIONS = "resources.locations";
public static final String RESOURCE_LOCATION = "resources.location";
public static final String RESOURCE_PATTERN = "resources.pattern";
public static final String RESOURCE_DOWNLOAD_CAPACITY = "resources.download.capacity";
public static final String RESOURCE_DOWNLOAD_THREAD_POOL_SIZE =
"resources.download.thread_pool_size";
public static final String RESOURCE_DOWNLOAD_BUFFER_SIZE = "resources.download.buffer_size";
public static final String RESOURCE_UPLOAD_ENABLE = "resources.upload.enable";
public static final String RESOURCE_UPLOAD_CAPACITY = "resources.upload.capacity";
public static final String RESOURCE_UPLOAD_THREAD_POOL_SIZE =
"resources.upload.thread_pool_size";
public static final String RESOURCE_UPLOAD_BUFFER_SIZE = "resources.upload.buffer_size";
public static final String VIEW_TEMPLATE_MODE = "view.template.mode";
public static final String VIEW_TEMPLATE_PREFIX = "view.template.prefix";
public static final String VIEW_TEMPLATE_SUFFIX = "view.template.suffix";
public static final String VIEW_TEMPLATE_CACHE_TTL_MS = "view.template.cache_ttl_ms";
public static final String VIEW_TEMPLATE_CACHEABLE = "view.template.cacheable";
public static final String VIEW_TEMPLATE_MESSAGES_LOCATION = "view.template.messages_location";
private PropertyNames() {}
}
|
'use strict';
Object.defineProperty(exports, '__esModule', {
value: true
});
function _interopRequireDefault(obj) { return obj && obj.__esModule ? obj : { 'default': obj }; }
var _FuncCurry = require('../Func/curry');
var _FuncCurry2 = _interopRequireDefault(_FuncCurry);
var _scan = require('./scan');
var _scan2 = _interopRequireDefault(_scan);
var _head = require('./head');
var _head2 = _interopRequireDefault(_head);
var _tail = require('./tail');
var _tail2 = _interopRequireDefault(_tail);
exports['default'] = (0, _FuncCurry2['default'])(function (fn, xs) {
return !xs.length ? undefined : (0, _scan2['default'])(fn, (0, _head2['default'])(xs), (0, _tail2['default'])(xs));
});
module.exports = exports['default']; |
-module(cr_app).
-behaviour(application).
-export([start/2, stop/1]).
-copyright('Maxim Sokhatsky').
-include("rafter_opts.hrl").
-compile(export_all).
tcp(Name,Port,Mod,Nodes) -> {Name,{cr_tcp,start_link,
[Name,Port,Mod,Nodes]},
permanent,2000,worker,[cr_tcp]}.
pool(SupName) -> {SupName,{supervisor,start_link,
[{local,SupName},cr_connection,[]]},
permanent,infinity,supervisor,[]}.
vnode({I,N}) -> {I,{cr_vnode,start_link,
[{I,N},cr_kvs]},
permanent,2000,worker,[cr_vnode]}.
heart(Nodes) -> {heart,{cr_heart,start_link,
["heart",Nodes]},
permanent,2000,worker,[cr_heart]}.
log({I,N},Nodes) -> {cr_log:logname(N),{cr_log,start_link,
[N,#rafter_opts{cluster=Nodes}]},
permanent,2000,worker,[cr_log]}.
rafter({I,N},Nodes) -> {N,{cr_rafter,start_link,
[{I,N},#rafter_opts{state_machine=cr_replication,cluster=Nodes}]},
permanent,2000,worker,[cr_rafter]}.
init([Nodes,Opts]) ->
{ok, {{one_for_one, 5, 60},
lists:flatten([ log({0,N},Nodes) || {N,_,_,_} <- Nodes, N == cr:node()]
++ [ rafter({0,N},Nodes) || {N,_,_,_} <- Nodes, N == cr:node()]
++ [ protocol(O,Nodes) || O<-Opts ]
++ [ pool(heart_sup) ]
++ [ pool(vnode_sup) ]) }}.
stop(_) -> ok.
start() -> start(normal,[]).
start(_,_) ->
io:format("Node: ~p~n",[cr:node()]),
{ok,Peers}=application:get_env(cr,peers),
{N,P1,P2,P3} = lists:keyfind(cr:node(),1,Peers),
{_,VNodes} = cr:ring(),
kvs:join(),
Sup = supervisor:start_link({local, cr_sup}, ?MODULE,
[ Peers, [ { interconnect, P1, cr_interconnect },
{ ping, P2, cr_ping },
{ client, P3, cr_client } ]]),
io:format("Supervision: ~p~n",[supervisor:which_children(cr_sup)]),
[ start_vnode({Index,Node},Peers) || {Index,Node} <- VNodes, Node == cr:nodex(cr:node()) ],
spawn(fun() -> supervisor:start_child(heart_sup,heart(Peers)) end),
Sup.
protocol({Name,Port,Mod},Nodes) ->
SupName = list_to_atom(lists:concat([Name,'_',sup])),
[ tcp(Name,Port,Mod,Nodes), % TCP listener gen_server
pool(SupName) ]. % Accepted Clients Supervisor
start_vnode({0,_Name},_Peers) -> skip;
start_vnode({Index,Name},_ ) -> supervisor:start_child(vnode_sup,vnode({Index,Name})).
|
import re
import requests
from bs4 import BeautifulSoup
import json
import jsonlines
# The slice indices below are written in an ugly way, but they are correct; the staff-related data is still painful to extract.
# The <h2> headings are not consistent, and some pages use <p> tags just for spacing, so find_all("<p>") ends up misaligned.
# The movie dict pre-defines fields such as director, but some pages lack that information, so reconciling the two is tedious.
# Maybe this should have been done with Scrapy; brute-forcing it this far may have been too reckless.
# Everything except the staff section is basically extracted (subtitle data is not collected, but that is a separate problem).
all_movie = []
# for num in range(0, 6237//25):  # 6237 is the current number of movies; fetch it dynamically if this should scale automatically
with jsonlines.open('jfdb_title.jsonlines', mode='w') as writer:
for num in range(0, 251):
print(num)
url = f"https://jfdb.jp/search?KW=&PAGE={num}"
res = requests.get(url)
# print(res.text)
soup = BeautifulSoup(res.content, "html.parser")
        elems = soup.find_all(href=re.compile(r"/title/\d+"))  # this regex is rough
# print("####################################################################################")
        for i in range(1, len(elems)+1, 2):  # breaks when the count reaches three digits around page 214 -- needs fixing (was originally 50)
print(" ", i)
movie = {
"タイトル": "",
}
title_name = ""
# for j in range(len(elems[i].contents[0].string.split(","))):
# # print(elems[i].contents[0].string.split(","))
# for k in range(len(elems[i].contents[0].string.split(",")[j].split())):
# title_name += elems[i].contents[0].string.split(",")[
# j].split()[k]
# if k+1 != len(elems[i].contents[0].string.split(",")[j].split()):
# title_name += " "
            movie["タイトル"] = elems[i].contents[0][9:-9]  # strip the surrounding whitespace
all_movie.append(movie)
writer.write(movie)
# Load the JSON file
with open('jfdb.jsonlines', 'r', encoding="utf-8_sig") as f:
json_output = json.load(f)
# print(Details_elems)
# print(len(alltitle))
# print(len(alltitle))
|
require "helper/integration_command_test_case"
class IntegrationCommandTestRepository < IntegrationCommandTestCase
def test_repository
assert_match HOMEBREW_REPOSITORY.to_s,
cmd("--repository")
assert_match "#{HOMEBREW_LIBRARY}/Taps/foo/homebrew-bar",
cmd("--repository", "foo/bar")
end
end
|
package parser
import (
"bufio"
"context"
"fmt"
"github.com/brandesign/arma-go-parser/command"
"io"
)
type Handlers map[string]func() interface{}
type Subscriber interface {
GetSubscriptions() []*Subscription
}
func NewParser(r io.Reader, handlers Handlers, subscribers ...Subscriber) (*Parser, error) {
p := &Parser{
reader: r,
listeners: map[string][]func(evt interface{}) error{},
handlers: handlers,
}
for _, subscriber := range subscribers {
subs := subscriber.GetSubscriptions()
for _, sub := range subs {
if err := p.addListener(sub.name, sub.listener); err != nil {
return nil, err
}
}
}
return p, nil
}
type Parser struct {
reader io.Reader
listeners map[string][]func(evt interface{}) error
handlers Handlers
}
func (p *Parser) Run(ctx context.Context) {
scanner := bufio.NewScanner(p.reader)
for {
select {
case <-ctx.Done():
command.Logf("context done")
return
default:
if scanner.Scan() {
l := NewLine(scanner.Text())
if err := p.handleLine(l); err != nil {
command.Logf("cannot handle Line: %s, error: %v\n", l, err)
}
}
}
}
}
func (p *Parser) addListener(evtName string, l func(evt interface{}) error) error {
ls, ok := p.listeners[evtName]
if !ok {
p.listeners[evtName] = []func(evt interface{}) error{l}
return command.Rawf("LADDERLOG_WRITE_%s 1", evtName)
} else {
p.listeners[evtName] = append(ls, l)
}
return nil
}
func (p *Parser) handleLine(l Line) error {
listeners, ok := p.listeners[l.eventName]
if !ok {
return nil
}
evt, err := p.handlers.handleLine(l)
if err != nil {
return err
}
for _, ls := range listeners {
if err := ls(evt); err != nil {
return err
}
}
return nil
}
func NewSubscription(evt string, l func(evt interface{}) error) *Subscription {
return &Subscription{
name: evt,
listener: l,
}
}
type Subscription struct {
name string
listener func(evt interface{}) error
}
func (h Handlers) handleLine(l Line) (interface{}, error) {
f, ok := h[l.eventName]
if !ok {
return nil, fmt.Errorf("no event factory found for %s", l.eventName)
}
evt := f()
if err := l.Scan(evt); err != nil {
return nil, err
}
return evt, nil
}
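// exampleSubscriber is an illustrative sketch (not part of the original file): it
// shows how a Subscriber can register a listener for a single ladderlog event name.
// The event name and payload handling here are assumptions for this example.
type exampleSubscriber struct{}

func (exampleSubscriber) GetSubscriptions() []*Subscription {
	return []*Subscription{
		NewSubscription("ROUND_COMMENCING", func(evt interface{}) error {
			command.Logf("round commencing: %v", evt)
			return nil
		}),
	}
}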
|
<?php
namespace Database\Seeders;
use Illuminate\Database\Seeder;
use App\Models\User;
use App\Models\Estado;
use App\Models\Municipio;
use App\Models\Parroquia;
use Spatie\Permission\Models\Role;
use Spatie\Permission\Models\Permission;
use Illuminate\Support\Facades\DB;
class DatabaseSeeder extends Seeder
{
/**
* Seed the application's database.
*
* @return void
*/
public function run()
{
$tables = [
'users',
'estados',
'municipios',
'parroquias',
'permissions',
];
$this->truncateTables($tables);
$this->call(SeederUser::class);
$this->call(SeederEstados::class);
$this->call(SeederMunicipios::class);
$this->call(SeederParroquias::class);
$this->call(SeederPermissionsTable::class);
}
protected function truncateTables(Array $tables) {
        // Disable foreign key checks in the database.
DB::statement('SET FOREIGN_KEY_CHECKS = 0;');
foreach($tables as $table) {
            // Remove all data from each table...
DB::table($table)->truncate();
}
        // Re-enable foreign key checks in the database.
DB::statement('SET FOREIGN_KEY_CHECKS = 1;');
}
}
|
package com.fabirt.debty.ui.people.detail
import android.view.LayoutInflater
import android.view.ViewGroup
import androidx.core.content.ContextCompat
import androidx.recyclerview.widget.RecyclerView
import com.fabirt.debty.databinding.ViewItemMovementBinding
import com.fabirt.debty.domain.model.Movement
import com.fabirt.debty.ui.common.MovementClickListener
import com.fabirt.debty.util.toCurrencyString
import com.fabirt.debty.util.toDateString
import java.text.SimpleDateFormat
import kotlin.math.absoluteValue
class MovementViewHolder(
private val binding: ViewItemMovementBinding
) : RecyclerView.ViewHolder(binding.root) {
companion object {
fun from(parent: ViewGroup): MovementViewHolder {
val inflater = LayoutInflater.from(parent.context)
val binding = ViewItemMovementBinding.inflate(inflater, parent, false)
return MovementViewHolder(binding)
}
}
fun bind(movement: Movement) {
val amountColor = ContextCompat.getColor(itemView.context, movement.type.color)
binding.tvDate.text = movement.date.toDateString(SimpleDateFormat.SHORT)
binding.tvAmount.text = movement.amount.absoluteValue.toCurrencyString()
binding.tvAmount.setTextColor(amountColor)
binding.tvDescription.text = movement.description
binding.tvMovementType.text = itemView.context.getString(movement.type.name)
}
fun setOnClickListener(movement: Movement, l: MovementClickListener) {
binding.container.setOnClickListener { l(movement) }
}
fun setOnLongClickListener(movement: Movement, l: MovementClickListener) {
binding.container.setOnLongClickListener {
l(movement)
true
}
}
} |
using SpiraAPI.Client.Connection;
using SpiraAPI.Client.Middleware;
namespace SpiraAPI.Client.Client
{
public interface ISpiraClientFactory
{
ISpiraClient Create(ISpiraCredentials credentials);
}
    public sealed class SpiraClientFactory : ISpiraClientFactory
{
private readonly string _endpoint;
public SpiraClientFactory(string endpoint)
{
_endpoint = endpoint;
}
public ISpiraClient Create(ISpiraCredentials credentials)
{
var connection = new SpiraConnection(_endpoint, credentials);
return new SpiraClient(connection);
}
}
}
|
# vim: fileencoding=utf-8 ts=2 sts=2 sw=2 et si ai :
module Helpers
def generate_document_key(min=1, max=999)
"spec_pk_#{Random.rand(min..max)}"
end
def write(driver)
time1 = Time.mktime(2013, 6, 1, 11, 22, 33)
time2 = Time.mktime(2013, 6, 1, 11, 22, 35)
record1 = {'a' => 10, 'b' => 'Tesla'}
record2 = {'a' => 20, 'b' => 'Edison'}
record1_for_id = {'a' => 10, 'b' => 'Tesla', 'tag' => 'test', 'time' => time1.to_i, :ttl => 10}
record2_for_id = {'a' => 20, 'b' => 'Edison', 'tag' => 'test', 'time' => time2.to_i, :ttl => 10}
id1 = Digest::MD5.hexdigest(record1_for_id.to_s)
id2 = Digest::MD5.hexdigest(record2_for_id.to_s)
# store both records in an array to aid with verification
test_records = [record1, record2]
test_times = [time1, time2]
test_records.each_with_index do |rec, idx|
allow(Time).to receive(:now).and_return(test_times[idx])
driver.emit(rec)
end
driver.run # persists to couchbase
# query couchbase to verify data was correctly persisted
db_records = driver.instance.connection.get(id1, id2)
db_records.count.should eq(test_records.count)
db_records.each_with_index do |db_record, idx| # records should be sorted by row_key asc
test_record = test_records[idx]
db_record['tag'].should eq(test_record['tag'])
db_record['time'].should eq(test_record['time'])
db_record['a'].should eq(test_record['a'])
db_record['b'].should eq(test_record['b'])
if driver.instance.include_ttl
db_record['ttl'].should_not be_nil
else
db_record['ttl'].should be_nil
end
end
end # def write
end # module Helpers
|
{% if include.header %}
{% assign header = include.header %}
{% else %}
{% assign header = "###" %}
{% endif %}
This command packages a chart into a versioned chart archive file. If a path
is given, this will look at that path for a chart (which must contain a
Chart.yaml file) and then package that directory.
Versioned chart archives are used by Helm package repositories.
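For example, assuming a chart directory named ./mychart, a plain (unsigned) archive
can be produced in a ./dist directory with:
$ werf helm package ./mychart --version 1.2.3 --destination ./dist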
To sign a chart, use the '--sign' flag. In most cases, you should also
provide '--keyring path/to/secret/keys' and '--key keyname'.
$ helm package --sign ./mychart --key mykey --keyring ~/.gnupg/secring.gpg
If '--keyring' is not specified, Helm usually defaults to the public keyring
unless your environment is otherwise configured.
{{ header }} Syntax
```shell
werf helm package [CHART_PATH] [...] [flags] [options]
```
{{ header }} Options
```shell
--app-version=''
set the appVersion on the chart to this version
-u, --dependency-update=false
update dependencies from "Chart.yaml" to dir "charts/" before packaging
-d, --destination='.'
location to write the chart.
--key=''
name of the key to use when signing. Used if --sign is true
--keyring='~/.gnupg/pubring.gpg'
location of a public keyring
--sign=false
use a PGP private key to sign this package
--version=''
set the version on the chart to this semver version
```
{{ header }} Options inherited from parent commands
```shell
--hooks-status-progress-period=5
Hooks status progress period in seconds. Set 0 to stop showing hooks status progress.
Defaults to $WERF_HOOKS_STATUS_PROGRESS_PERIOD_SECONDS or status progress period value
--kube-config=''
Kubernetes config file path (default $WERF_KUBE_CONFIG or $WERF_KUBECONFIG or
$KUBECONFIG)
--kube-config-base64=''
Kubernetes config data as base64 string (default $WERF_KUBE_CONFIG_BASE64 or
$WERF_KUBECONFIG_BASE64 or $KUBECONFIG_BASE64)
--kube-context=''
Kubernetes config context (default $WERF_KUBE_CONTEXT)
--log-color-mode='auto'
Set log color mode.
Supported on, off and auto (based on the stdout’s file descriptor referring to a
terminal) modes.
Default $WERF_LOG_COLOR_MODE or auto mode.
--log-debug=false
Enable debug (default $WERF_LOG_DEBUG).
--log-pretty=true
Enable emojis, auto line wrapping and log process border (default $WERF_LOG_PRETTY or
true).
--log-quiet=false
Disable explanatory output (default $WERF_LOG_QUIET).
--log-terminal-width=-1
Set log terminal width.
Defaults to:
* $WERF_LOG_TERMINAL_WIDTH
* interactive terminal width or 140
--log-verbose=false
Enable verbose output (default $WERF_LOG_VERBOSE).
-n, --namespace=''
namespace scope for this request
--status-progress-period=5
Status progress period in seconds. Set -1 to stop showing status progress. Defaults to
$WERF_STATUS_PROGRESS_PERIOD_SECONDS or 5 seconds
```
|
package me.mocha.spongeplugin.seotda.wool
import org.spongepowered.api.data.key.Keys
import org.spongepowered.api.item.inventory.ItemStack
import org.spongepowered.api.text.Text
import org.spongepowered.api.text.format.TextColors
object WoolRoulette {
val itemName = Text.of(
TextColors.RED, "r",
TextColors.GOLD, "a",
TextColors.YELLOW, "n",
TextColors.GREEN, "d",
TextColors.AQUA, "o",
TextColors.DARK_BLUE, "m",
TextColors.DARK_PURPLE, "!"
)
fun createItemStack(wool: SeotdaWool, quantity: Int = 1) = wool.createItemStack(quantity).apply {
offer(Keys.DISPLAY_NAME, itemName)
}
fun isRoulette(item: ItemStack) = item.get(Keys.DISPLAY_NAME).orElseGet { Text.of() } == itemName
} |
require_relative "classes.rb"
require_relative "help.rb"
require "artii"
require "tty-prompt"
require "tty-file"
require "tty-table"
require "colorize"
args = ARGV
if args.empty?
title = Artii::Base.new
puts title.asciify("Terminal Notes").colorize(:light_blue)
StartMenu.new
else
case args[0].downcase
when "new"
noteboard = Noteboard.new(new_noteboard)
noteboard.noteboard_add(add_note)
when "display"
noteboard_menu
when "h"
help_info
else
puts "That is not a valid argument, please pick one from below"
    # TODO: extract the valid options into a method and display them here
end
end
|
import styled from 'styled-components'
export const Section = styled.section`
padding: 4rem 0;
margin: 0 auto;
`
|
# react-native role
__WARNING__ This role is in development and is not ready to be used at large.
Installs [react-native](https://facebook.github.io/react-native/) for building iOS and Android
applications. All the installation steps for building Android apps on Linux from the
[getting started](https://facebook.github.io/react-native/docs/getting-started) page are performed,
except for those that involve starting Android Studio and downloading the various SDK
components. Start Android Studio and complete those steps manually.
## Role Variables
None.
## Example Playbook
To install the latest version:
```yaml
- hosts: all
become: yes
become_user: root
roles:
- role: tools/react-native
```
|
package takenoko.objectives.amenagement;
/**
 * The available improvements (aménagements) are the following:
* <ul>
* <li>ENCLOS</li>
* <li>ENGRAIS</li>
* <li>BASSIN</li>
* </ul>
 * NON is used when the tile has no improvement.
*/
public enum Amenagement {
NON("NON"),ENCLOS("ENCLOS"),ENGRAIS("ENGRAIS"),BASSIN("BASSIN");
private String string;
Amenagement(String string) {
this.string = string;
}
public String getString() {
return string;
}
}
|
{-# LANGUAGE OverloadedStrings #-}
module SimpleServer
( runServer
) where
import Control.Monad.Trans.Reader (runReaderT)
import Data.Text.Lazy (Text)
import Network.Wai.Middleware.RequestLogger (logStdoutDev)
import Web.Scotty.Trans (ScottyT, defaultHandler, get, middleware, scottyT)
import Database.Bolt (BoltCfg)
import Routes
import Data
type Port = Int
-- |Run server with connection pool as a state
runServer :: Port -> BoltCfg -> IO ()
runServer port config = do state <- constructState config
scottyT port (`runReaderT` state) $ do
middleware logStdoutDev
get "/" mainR
get "/graph" graphR
get "/search" searchR
get "/movie/:title" movieR
get "/demo-cl" demoInfoR
get "/demo-cl-zip" demoZipR
get "/demo-cl-sc" demoWebdriverR
get "/demo-ctrl-rest" demoRestReqR
|
from very_first_example import Person
class Student(Person):
def __init__(self, name, age, major):
super().__init__(name, age)
self.major = major
def print(self):
super().print()
print(f"Major in {self.major}")
s1 = Student("Catherine Zeta Jones", 54, "Arts")
s1.print() |
#!/usr/bin/env bash
rm -r output
mkdir output
#python v31_validation.py > output/v31_validation.naf 2> output/v31_validation.err
python v31_raw_text_terms.py > output/v31_raw_text_terms.naf 2> output/v31_raw_text_terms.err
#python v31_add_entity.py > output/v31_add_entity.naf 2> output/v31_add_entity.err
python v31_raw_text_terms_deps.py > output/v31_raw_text_terms_deps.naf 2> output/v31_raw_text_terms_deps.err
python v31_raw_text_terms_deps_mw.py > output/v31_raw_text_terms_deps_mw.naf 2> output/v31_raw_text_terms_deps_mw.err
#python v31_it_raw_text_terms.py
#python v31_raw_text_no_cdata.py > output/v31_raw_text_no_cdata.naf 2> output/v31_raw_text_no_cdata.err
|
CREATE OR REPLACE PACKAGE BODY HR.PACK
AS
PROCEDURE PROC1
AS
BEGIN
NULL;
END;
END; |
// Auto-generated via `yarn build:interfaces`, do not edit
/* eslint-disable @typescript-eslint/no-empty-interface */
import { Struct, Vec } from '@polkadot/types/codec';
import { Bytes, u32 } from '@polkadot/types/primitive';
import { BlockNumber, Signature } from '@polkadot/types/interfaces/runtime';
import { AuthorityId } from '@polkadot/types/interfaces/consensus';
import { SessionIndex } from '@polkadot/types/interfaces/session';
/** u32 */
export interface AuthIndex extends u32 {}
/** Signature */
export interface AuthoritySignature extends Signature {}
/** Struct */
export interface Heartbeat extends Struct {
/** BlockNumber */
readonly blockNumber: BlockNumber;
/** OpaqueNetworkState */
readonly networkState: OpaqueNetworkState;
/** SessionIndex */
readonly sessionIndex: SessionIndex;
/** AuthorityId */
readonly authorityId: AuthorityId;
}
/** Bytes */
export interface OpaqueMultiaddr extends Bytes {}
/** Struct */
export interface OpaqueNetworkState extends Struct {
/** OpaquePeerId */
readonly peerId: OpaquePeerId;
/** Vec<OpaqueMultiaddr> */
readonly externalAddresses: Vec<OpaqueMultiaddr>;
}
/** Bytes */
export interface OpaquePeerId extends Bytes {}
|
// Package s3log aids in parsing and generating logs in Amazon S3's Server Access Log Format
package s3log
import (
"net"
"regexp"
"time"
)
// A Entry is a structured log entry that describes a S3 request
type Entry struct {
Owner string // user ID of bucket owner
Bucket string // bucket name
	Time       time.Time // time when request was received
Remote net.IP // IP address of requester
Requester string // user ID of requester
RequestID string // request ID
Operation string
Key string // key requested from bucket
RequestURI string
Status int // HTTP status code
Error string // Error code
Bytes int64 // Bytes sent to requester
Size int64 // size of object requested
Total time.Duration // time spent serving request
Turnaround time.Duration // time spent handling request before response is sent
	Referrer   string // HTTP Referer
UserAgent string // HTTP UserAgent
Version string // Request version ID
}
var logLine = regexp.MustCompile(`[^" ]+|("[^"]*")`)
var brackets = regexp.MustCompile(`[\[\]]`)
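// The helper below is a sketch added for illustration and is not part of the
// original package; its name is an assumption. It shows how the logLine
// regexp above can split a raw S3 access-log line into whitespace-separated
// tokens while keeping quoted fields (request URI, referrer, user agent) intact.
func tokenizeLogLine(line string) []string {
	return logLine.FindAllString(line, -1)
}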
|
/*
* Copyright (c) 2015-2018, Cloudera, Inc. All Rights Reserved.
*
* Cloudera, Inc. licenses this file to you under the Apache License,
* Version 2.0 (the "License"). You may not use this file except in
* compliance with the License. You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* This software is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
* CONDITIONS OF ANY KIND, either express or implied. See the License for
* the specific language governing permissions and limitations under the
* License.
*/
package com.cloudera.labs.envelope.repetition;
import com.cloudera.labs.envelope.output.BulkOutput;
import com.cloudera.labs.envelope.plan.MutationType;
import com.google.common.collect.Lists;
import com.google.common.collect.Sets;
import com.typesafe.config.Config;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import scala.Tuple2;
import java.util.List;
import java.util.Set;
public class DummyBatchOutput implements BulkOutput {
private static final Logger LOG = LoggerFactory.getLogger(DummyBatchOutput.class);
private static List<Row> outputs = Lists.newArrayList();
@Override
public Set<MutationType> getSupportedBulkMutationTypes() {
return Sets.newHashSet(MutationType.INSERT);
}
@Override
public void applyBulkMutations(List<Tuple2<MutationType, Dataset<Row>>> planned) {
for (Tuple2<MutationType, Dataset<Row>> type : planned) {
LOG.info("Adding {} outputs", type._2().count());
outputs.addAll(type._2().collectAsList());
}
}
@Override
public void configure(Config config) {
}
public static List<Row> getOutputs() {
return outputs;
}
}
|
package net.onrc.onos.api.newintent;
import com.google.common.base.Objects;
import static com.google.common.base.Preconditions.checkNotNull;
/**
* A class to represent an intent related event.
*/
public class IntentEvent {
// TODO: determine a suitable parent class; if one does not exist, consider introducing one
private final long time;
private final Intent intent;
private final IntentState state;
private final IntentState previous;
/**
* Creates an event describing a state change of an intent.
*
* @param intent subject intent
* @param state new intent state
* @param previous previous intent state
* @param time time the event created in milliseconds since start of epoch
* @throws NullPointerException if the intent or state is null
*/
public IntentEvent(Intent intent, IntentState state, IntentState previous, long time) {
this.intent = checkNotNull(intent);
this.state = checkNotNull(state);
this.previous = previous;
this.time = time;
}
/**
* Constructor for serializer.
*/
protected IntentEvent() {
this.intent = null;
this.state = null;
this.previous = null;
this.time = 0;
}
/**
* Returns the state of the intent which caused the event.
*
* @return the state of the intent
*/
public IntentState getState() {
return state;
}
/**
* Returns the previous state of the intent which caused the event.
*
* @return the previous state of the intent
*/
public IntentState getPreviousState() {
return previous;
}
/**
* Returns the intent associated with the event.
*
* @return the intent
*/
public Intent getIntent() {
return intent;
}
/**
* Returns the time at which the event was created.
*
* @return the time in milliseconds since start of epoch
*/
public long getTime() {
return time;
}
@Override
public boolean equals(Object o) {
if (this == o) {
return true;
}
if (o == null || getClass() != o.getClass()) {
return false;
}
IntentEvent that = (IntentEvent) o;
return Objects.equal(this.intent, that.intent)
&& Objects.equal(this.state, that.state)
&& Objects.equal(this.previous, that.previous)
&& Objects.equal(this.time, that.time);
}
@Override
public int hashCode() {
return Objects.hashCode(intent, state, previous, time);
}
@Override
public String toString() {
return Objects.toStringHelper(getClass())
.add("intent", intent)
.add("state", state)
.add("previous", previous)
.add("time", time)
.toString();
}
}
|
#!/bin/bash
python openmm_adaptive_sim.py input.pdb input.coor >log.txt 2>&1
# get rid of the input file created by htmd for reading with mdtraj
rm input_coor.dcd
# need to do this for now, as htmd appears to only copy xtc files over to data.
mdconvert -f traj.dcd -o traj.xtc
# remove the dcd file, as the xtc has everything we need.
rm traj.dcd
|
#ifndef DATA_TRAITS_HPP
#define DATA_TRAITS_HPP
/// COMPONENT
#include <csapex/utility/tmp.hpp>
#include <csapex/utility/semantic_version.h>
/// SYSTEM
#include <memory>
namespace csapex
{
HAS_MEM_FUNC(makeEmpty, has_make_empty);
template <typename T, typename std::enable_if<has_make_empty<T, std::shared_ptr<T> (*)()>::value, int>::type = 0>
inline std::shared_ptr<T> makeEmpty()
{
return T::makeEmpty();
}
template <typename T, typename std::enable_if<!has_make_empty<T, std::shared_ptr<T> (*)()>::value, int>::type = 0>
inline std::shared_ptr<T> makeEmpty()
{
return std::make_shared<T>();
}
// semantic version of token data
template <typename T>
struct semantic_version
{
// default
static constexpr SemanticVersion value{ 0, 0, 0 };
};
} // namespace csapex
#endif // DATA_TRAITS_HPP
|
import com.typesafe.plugin._
import play.api.Play.current
import play.api.test._
import play.api.test.Helpers._
import services.MailService
import org.specs2.mutable.Specification
import scala.concurrent.Await
import scala.concurrent.duration.Duration
/**
* Specs for MailService
*/
class MailServiceSpec extends Specification {
"MailService#sendMail" should {
"send email" in new EmbedSMTPContext {
override val withoutPlugins = Seq("play.modules.reactivemongo.ReactiveMongoPlugin")
running(fakeApp) {
val message = "Hello World, this is the MailService !"
MailService.sendMail(message, "Subject", List("[email protected]"), "[email protected]")(use[MailerPlugin].email)
Await.result(lastReceivedMessage, Duration.Inf) must contain(message)
}
}
}
}
|
import time
# demo1
def process_bar(name,percent,total_length=25):
    bar = ''.join(["▮"] * int(percent * total_length))
bar = '\r' + '[' + \
bar.ljust(total_length) + \
' {:0>4.1f}%|'.format(percent*100) +'100%,'+name+']'
print(bar, end='', flush=True)
for i in range(101):
time.sleep(0.1)
end_str = '100%'
process_bar('do what',i/100)
|
package com.utils
import android.content.Context
import android.util.AttributeSet
import android.util.Log
import android.widget.LinearLayout
class MyLinearLayout @JvmOverloads constructor(
context: Context, attrs: AttributeSet? = null, defStyleAttr: Int = 0
) : LinearLayout(context, attrs, defStyleAttr) {
override fun onAttachedToWindow() {
super.onAttachedToWindow()
log("onAttachedToWindow")
}
override fun onMeasure(widthMeasureSpec: Int, heightMeasureSpec: Int) {
super.onMeasure(widthMeasureSpec, heightMeasureSpec)
log("onMeasure")
}
override fun onLayout(changed: Boolean, l: Int, t: Int, r: Int, b: Int) {
super.onLayout(changed, l, t, r, b)
log("onLayout")
}
override fun onWindowFocusChanged(hasWindowFocus: Boolean) {
super.onWindowFocusChanged(hasWindowFocus)
log("onWindowFocusChanged")
}
override fun onDetachedFromWindow() {
super.onDetachedFromWindow()
log("onDetachedFromWindow")
}
override fun onFinishInflate() {
super.onFinishInflate()
log("onFinishInflate")
}
override fun requestLayout() {
super.requestLayout()
log("requestLayout")
}
private fun log(msg: String) {
Log.i(TAG, "$msg width = $width, height = $height")
}
companion object {
private const val TAG = "MyLinearLayout"
}
} |
#Untitled - By: HP - Mon Mar 1 2021
import sensor, image, time
sensor.reset()
sensor.set_pixformat(sensor.RGB565)
sensor.set_framesize(sensor.QQVGA)
sensor.skip_frames(time = 2000)
clock = time.clock()
KMAN = 0.3 # constant for exposure setting
autoExposureSum = 0
readExposureNum = 10
for i in range(readExposureNum):
autoExposureSum += sensor.get_exposure_us()
autoExposure = autoExposureSum/readExposureNum
manualExposure = int(autoExposure * KMAN) # scale factor for decreasing autoExposure
sensor.set_auto_exposure(False, manualExposure)
yellow_threshold_BBG = [(25, 88, -128, 24, 1, 127)] #Dark Background
yellow_threshold_WBG = [(30, 100, -20, 127, 20, 127)] #Light Background
while(True):
clock.tick()
img = sensor.snapshot()
for blobs in img.find_blobs(yellow_threshold_BBG): #change threshold depending on which background
if(blobs.roundness() > 0.77 and blobs.pixels() > 250):
img.draw_rectangle(blobs.x(), blobs.y(), blobs.w(), blobs.h(), (255, 0, 0), 2)
# TODO: Remeasure threshold and pixels in cafe
|
import java.io.BufferedReader;
import java.io.FileReader;
import org.culpan.hct.backend.Backend;
import org.culpan.hct.backend.BackendFactory;
import org.culpan.hct.frontend.FrontendFactory;
import org.culpan.hct.frontend.Parser;
import org.culpan.hct.frontend.Source;
import org.culpan.hct.intermediate.*;
import org.culpan.hct.message.*;
/**
* <h1>Pascal</h1>
*
* <p>Compile or interpret a Pascal source program.</p>
*
* <p>Copyright (c) 2009 by Ronald Mak</p>
* <p>For instructional purposes only. No warranties.</p>
*/
public class Pascal
{
private Parser parser; // language-independent parser
private Source source; // language-independent scanner
private ICode iCode; // generated intermediate code
private SymTab symTab; // generated symbol table
private Backend backend; // backend
/**
* Compile or interpret a Pascal source program.
* @param operation either "compile" or "execute".
* @param filePath the source file path.
* @param flags the command line flags.
*/
public Pascal(String operation, String filePath, String flags)
{
try {
boolean intermediate = flags.indexOf('i') > -1;
boolean xref = flags.indexOf('x') > -1;
source = new Source(new BufferedReader(new FileReader(filePath)));
source.addMessageListener(new SourceMessageListener());
parser = FrontendFactory.createParser("Pascal", "top-down", source);
parser.addMessageListener(new ParserMessageListener());
backend = BackendFactory.createBackend(operation);
backend.addMessageListener(new BackendMessageListener());
parser.parse();
source.close();
iCode = parser.getICode();
symTab = parser.getSymTab();
backend.process(iCode, symTab);
}
catch (Exception ex) {
System.out.println("***** Internal translator error. *****");
ex.printStackTrace();
}
}
private static final String FLAGS = "[-ix]";
private static final String USAGE =
"Usage: Pascal execute|compile " + FLAGS + " <source file path>";
/**
* The main method.
* @param args command-line arguments: "compile" or "execute" followed by
* optional flags followed by the source file path.
*/
public static void main(String args[])
{
try {
String operation = args[0];
// Operation.
if (!( operation.equalsIgnoreCase("compile")
|| operation.equalsIgnoreCase("execute"))) {
throw new Exception();
}
int i = 0;
String flags = "";
// Flags.
while ((++i < args.length) && (args[i].charAt(0) == '-')) {
flags += args[i].substring(1);
}
// Source path.
if (i < args.length) {
String path = args[i];
new Pascal(operation, path, flags);
}
else {
throw new Exception();
}
}
catch (Exception ex) {
System.out.println(USAGE);
}
}
private static final String SOURCE_LINE_FORMAT = "%03d %s";
/**
* Listener for source messages.
*/
private class SourceMessageListener implements MessageListener
{
/**
* Called by the source whenever it produces a message.
* @param message the message.
*/
public void messageReceived(Message message)
{
MessageType type = message.getType();
Object body[] = (Object []) message.getBody();
switch (type) {
case SOURCE_LINE: {
int lineNumber = (Integer) body[0];
String lineText = (String) body[1];
System.out.println(String.format(SOURCE_LINE_FORMAT,
lineNumber, lineText));
break;
}
}
}
}
private static final String PARSER_SUMMARY_FORMAT =
"\n%,20d source lines." +
"\n%,20d syntax errors." +
"\n%,20.2f seconds total parsing time.\n";
/**
* Listener for parser messages.
*/
private class ParserMessageListener implements MessageListener
{
/**
* Called by the parser whenever it produces a message.
* @param message the message.
*/
public void messageReceived(Message message)
{
MessageType type = message.getType();
switch (type) {
case PARSER_SUMMARY: {
Number body[] = (Number[]) message.getBody();
int statementCount = (Integer) body[0];
int syntaxErrors = (Integer) body[1];
float elapsedTime = (Float) body[2];
System.out.printf(PARSER_SUMMARY_FORMAT,
statementCount, syntaxErrors,
elapsedTime);
break;
}
}
}
}
private static final String INTERPRETER_SUMMARY_FORMAT =
"\n%,20d statements executed." +
"\n%,20d runtime errors." +
"\n%,20.2f seconds total execution time.\n";
private static final String COMPILER_SUMMARY_FORMAT =
"\n%,20d instructions generated." +
"\n%,20.2f seconds total code generation time.\n";
/**
* Listener for back end messages.
*/
private class BackendMessageListener implements MessageListener
{
/**
* Called by the back end whenever it produces a message.
* @param message the message.
*/
public void messageReceived(Message message)
{
MessageType type = message.getType();
switch (type) {
case INTERPRETER_SUMMARY: {
Number body[] = (Number[]) message.getBody();
int executionCount = (Integer) body[0];
int runtimeErrors = (Integer) body[1];
float elapsedTime = (Float) body[2];
System.out.printf(INTERPRETER_SUMMARY_FORMAT,
executionCount, runtimeErrors,
elapsedTime);
break;
}
case COMPILER_SUMMARY: {
Number body[] = (Number[]) message.getBody();
int instructionCount = (Integer) body[0];
float elapsedTime = (Float) body[1];
System.out.printf(COMPILER_SUMMARY_FORMAT,
instructionCount, elapsedTime);
break;
}
}
}
}
}
|
using System;
using System.Threading.Tasks;
using Slack2Display.Server.Hubs;
using Microsoft.AspNetCore.SignalR;
namespace Slack2Display.Server.Services
{
public class CommandService : ICommandService
{
private IHubContext<CommandHub, ICommandClient> HubContext { get; }
public event EventHandler<CommandEventArgs> CommandAdded;
public CommandService(IHubContext<CommandHub, ICommandClient> hubContext)
{
HubContext = hubContext ?? throw new ArgumentNullException(nameof(hubContext));
}
public void AddCommand(ICommandModel command)
{
if (command == null)
{
throw new ArgumentNullException(nameof(command));
}
HubContext.Clients.All.ReceiveCommand(command);
if (CommandAdded != null)
{
CommandAdded.Invoke(this, new CommandEventArgs(command));
}
}
public async Task AddCommandAsync(ICommandModel command)
{
if (command == null)
{
throw new ArgumentNullException(nameof(command));
}
await HubContext.Clients.All.ReceiveCommand(command);
if (CommandAdded != null)
{
CommandAdded.Invoke(this, new CommandEventArgs(command));
}
}
}
}
|
/*
Catalog event - transfer and publishing
Copyright 2014 Commons Machinery http://commonsmachinery.se/
Distributed under an AGPL_v3 license, please see LICENSE in the top dir.
*/
'use strict';
var debug = require('debug')('catalog:event:transfer'); // jshint ignore:line
// External libs
var util = require('util');
var EventEmitter = require('events').EventEmitter;
var Promise = require('bluebird');
var zmq = require('zmq');
var _ = require('underscore');
// Common libs
var config = require('../../../lib/config');
var mongo = require('../../../lib/mongo');
// Event libs
var db = require('./db');
/*! Transfer events from a source collection into the event log.
*
* Emits `event` on successful transfer.
*/
var Transfer = function(sourceCollection, destCollection) {
EventEmitter.call(this);
this.sourceCollection = sourceCollection;
this.destCollection = destCollection;
this.cursor = null;
this.lastEvent = null;
};
util.inherits(Transfer, EventEmitter);
Transfer.prototype.start = function() {
var self = this;
// Find the last event, or something close enough. Since
    // ObjectIds start with a timestamp, this gets us something at the
// very end of the event log.
this.destCollection.findOne({}).sort({ _id: -1 }).exec()
.then(
function(ev) {
if (ev) {
debug('last logged event: %s %s', ev.id, ev.date);
self.lastEvent = ev;
}
if (!self.cursor) {
self._openCursor();
}
},
function(err) {
debug('error finding last logged event: %s', err);
}
);
};
Transfer.prototype._openCursor = function() {
// TODO: this will need to be more stable by looking further back
// in time, but for initial coding we can take it easy
var query;
if (this.lastEvent) {
debug('opening cursor tailing after %s', this.lastEvent.date);
query = { date: { $gt: this.lastEvent.date }};
}
else {
debug('opening cursor tailing entire collection');
query = {};
}
this.cursor = this.sourceCollection.find(query, {
tailable: true,
awaitdata: true,
numberOfRetries: 0, // Hack: disable tailable cursor timeouts
hint: { $natural: 1 },
// TODO: set readPreference
});
this._nextEvent();
};
Transfer.prototype._nextEvent = function() {
var self = this;
this.cursor.nextObject(function(err, item) {
if (err) {
self._handleCursorError(err);
}
else {
self._handleEvent(item);
}
});
};
Transfer.prototype._handleCursorError = function(err) {
var self = this;
debug('transfer cursor error: %s', err);
this.cursor.close(function() {
debug('transfer cursor closed');
self.cursor = null;
// Retry after a delay
setTimeout(function() { self._openCursor(); }, 5000);
});
};
Transfer.prototype._handleEvent = function(sourceEvent) {
var self = this;
if (!sourceEvent) {
this._handleCursorError('empty collection');
return;
}
// Create an event log version and store it.
this.destCollection.create(sourceEvent)
.then(
function onResolve(logEvent) {
self.lastEvent = logEvent;
self.emit('event', logEvent);
self._nextEvent();
},
// These are not Bluebird Promises, so there's no catch (...)
function onReject(err) {
if (err.code === 11000 || err.code === 11001) {
debug('skipping already logged event: %s', sourceEvent._id);
self._nextEvent();
}
else {
debug('error logging event: %s', err);
self._handleCursorError(err);
}
}
);
};
var Publisher = function() {
this._socket = null;
};
/*! Open socket, returning a promise that resolves when done.
*/
Publisher.prototype.open = function() {
this._socket = zmq.socket('pub');
console.log('publishing events at %s', config.event.pubAddress);
var bindAsync = Promise.promisify(this._socket.bind, this._socket);
return bindAsync(config.event.pubAddress);
};
/*! Publish an event batch */
Publisher.prototype.publish = function(batch) {
var scope = JSON.stringify(_.pick(
batch, 'user', 'date', 'type', 'object', 'version'));
var i;
for (i = 0; i < batch.events.length; i++) {
var ev = batch.events[i];
var msg = [ev.event, scope, JSON.stringify(ev.param)];
debug('publishing: %s', msg);
this._socket.send(msg);
}
};
exports.start = function() {
var pub = new Publisher();
// We need connections to all the staging databases, as well as
// our normal database connection.
Promise.props({
core: mongo.createConnection(config.core.db),
event: db.connect(),
search: mongo.createConnection(config.search.db),
pub: pub.open(),
})
.then(function(conns) {
console.log('event transfer starting');
var coreTransfer = new Transfer(conns.core.collection('coreevents'), db.EventBatch);
coreTransfer.start();
coreTransfer.on('event', function(batch) {
pub.publish(batch);
});
var searchTransfer = new Transfer(conns.search.collection('searchevents'), db.SearchEventBatch);
searchTransfer.start();
searchTransfer.on('event', function(batch) {
pub.publish(batch);
});
})
.catch(function(err) {
console.error('error starting event transfer: %s', err);
});
};
|
# An implementation of the MigrationChecker API
# This class should be added to the puppet context under the key :migration_checker
# when the future parser and evaluator are operating, in order to get callbacks to the methods
# of this class.
#
# When the transaction using the MigrationChecker has finished, the collected
# diagnostics can be obtained by getting the Acceptor and asking it for all diagnostics,
# errors, warnings etc.
#
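# A minimal usage sketch (added for illustration; not part of the original file,
# and the exact call shown is an assumption): the checker is typically placed in
# the Puppet context so the evaluator can call back into it, and the collected
# diagnostics are read from its acceptor afterwards:
#
#   checker = PuppetX::Puppetlabs::Migration::MigrationChecker.new
#   Puppet.override(:migration_checker => checker) do
#     # compile / evaluate manifests here
#   end
#   diagnostics = checker.acceptor.diagnostics
#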
class PuppetX::Puppetlabs::Migration::MigrationChecker < Puppet::Pops::Migration::MigrationChecker
Issues = PuppetX::Puppetlabs::Migration::MigrationIssues
# The MigrationChecker's severity producer makes all issues have
# warning severity by default.
#
class SeverityProducer < Puppet::Pops::Validation::SeverityProducer
def initialize
super(:warning)
# TODO: TEMPLATE CODE - REMOVE BEFORE RELEASE
# Example of configuring issues to not be a warning
# p = self
# p[Issues::EMPTY_RESOURCE_SPECIALIZATION] = :deprecation
end
end
# An acceptor of migration issues that only accepts diagnostics
# for a given [file, line, pos, issue, severity] once in its lifetime
#
class MigrationIssueAcceptor < Puppet::Pops::Validation::Acceptor
attr_reader :diagnostics
def initialize
@reported = Set.new
super
end
def accept(diagnostic)
# Only accept unique diagnostics (unique == same issue, file, line, pos and severity)
return unless @reported.add?(diagnostic)
# Collect the diagnostic per severity and remember
super diagnostic
end
end
attr_reader :diagnostic_producer
attr_reader :acceptor
def initialize
@acceptor = MigrationIssueAcceptor.new
@diagnostic_producer = Puppet::Pops::Validation::DiagnosticProducer.new(
@acceptor,
SeverityProducer.new,
Puppet::Pops::Model::ModelLabelProvider.new)
end
# @param issue [Puppet::Pops::Issue] the issue to report
# @param semantic [Puppet::Pops::ModelPopsObject] the object for which evaluation failed in some way. Used to determine origin.
# @param options [Hash] hash of optional named data elements for the given issue
# @return [nil] this method does not return a meaningful value
# @raise [Puppet::ParseError] an evaluation error initialized from the arguments (TODO: Change to EvaluationError?)
#
def report(issue, semantic, options={}, except=nil)
diagnostic_producer.accept(issue, semantic, options, except)
end
private :report
def report_ambiguous_integer(o)
radix = o.radix
return unless radix == 8 || radix == 16
report(Issues::MIGRATE4_AMBIGUOUS_INTEGER, o, {:value => o.value, :radix => radix})
end
def report_ambiguous_float(o)
report(Issues::MIGRATE4_AMBIGUOUS_FLOAT, o, {:value => o.value })
end
def report_empty_string_true(value, o)
return unless value == ''
report(Issues::MIGRATE4_EMPTY_STRING_TRUE, o)
end
def report_uc_bareword_type(value, o)
return unless value.is_a?(Puppet::Pops::Types::PAnyType)
return unless o.is_a?(Puppet::Pops::Model::QualifiedReference)
report(Issues::MIGRATE4_UC_BAREWORD_IS_TYPE, o, {:type => value.to_s })
end
def report_equality_type_mismatch(left, right, o)
return unless is_type_diff?(left, right)
report(Issues::MIGRATE4_EQUALITY_TYPE_MISMATCH, o, {:left => left, :right => right })
end
def report_option_type_mismatch(test_value, option_value, option_expr, matching_expr)
return unless is_type_diff?(test_value, option_value) || is_match_diff?(test_value, option_value)
report(Issues::MIGRATE4_OPTION_TYPE_MISMATCH, matching_expr, {:left => test_value, :right => option_value, :option_expr => option_expr})
end
# Helper method used by equality and case option to determine if a diff in type may cause difference between 3.x and 4.x
# @return [Boolean] true if diff should be reported
#
def is_type_diff?(left, right)
l_class = left.class
r_class = right.class
if left.nil? && r_class == String && right.empty? || right.nil? && l_class == String && left.empty?
# undef vs. ''
true
elsif l_class <= Puppet::Pops::Types::PAnyType && r_class <= String || r_class <= Puppet::Pops::Types::PAnyType && l_class <= String
# Type vs. Numeric (caused by uc bare word being a type and not a string)
true
elsif l_class <= Numeric && r_class <= String || r_class <= Numeric && l_class <= String
# String vs. Numeric
true
else
# hash, array, booleans and regexp, etc are only true if compared against same type - no difference between 3x. and 4.x
# or this is a same type comparison (also the same in 3.x. and 4.x)
false
end
end
private :is_type_diff?
def is_match_diff?(left, right)
l_class = left.class
r_class = right.class
return l_class == Regexp && r_class != String || r_class == Regexp && l_class != String
end
private :is_match_diff?
def report_in_expression(o)
report(Issues::MIGRATE4_REVIEW_IN_EXPRESSION, o)
end
def report_array_last_in_block(o)
return unless o.is_a?(Puppet::Pops::Model::LiteralList)
report(Issues::MIGRATE4_ARRAY_LAST_IN_BLOCK, o)
end
end |
# -*- coding: utf-8 -*-
from datetime import datetime
from zipfile import ZipFile
import dateparser
from lxml import etree
from health_stats.database import DBSession
from health_stats.models.events import *
from health_stats.parsers.log_parser import LogParser
# Healthkit constants we care about
# See: https://developer.apple.com/library/watchos/documentation/HealthKit/Reference/HealthKit_Constants/index.html
TYPE_GLUCOSE = "HKQuantityTypeIdentifierBloodGlucose"
TYPE_BP_DIASTOLIC = "HKQuantityTypeIdentifierBloodPressureDiastolic"
TYPE_BP_SYSTOLIC = "HKQuantityTypeIdentifierBloodPressureSystolic"
TYPE_CARBS = "HKQuantityTypeIdentifierDietaryCarbohydrates"
TYPE_DISTANCE_WALKING_RUNNING = "HKQuantityTypeIdentifierDistanceWalkingRunning"
TYPE_FLIGHTS_CLIMBED = "HKQuantityTypeIdentifierFlightsClimbed"
TYPE_STEP_COUNT = "HKQuantityTypeIdentifierStepCount"
class HealthkitLogParser(LogParser):
""" Parser for Apple Healthkit data """
SOURCE = SOURCE_APPLE_HEALTH
def parse_log(self, path):
session = DBSession()
# This file is big enough (and compressed) that we might as well just parse
# it once and worry about saving memory if/when that becomes an issue.
hk_events = []
with ZipFile(path, 'r') as zfile:
xfile = zfile.open('apple_health_export/export.xml')
tree = etree.parse(xfile)
root = tree.getroot()
for rnum, record in enumerate(root.iterfind('.//Record'), start=1):
event = self.parse_record(record)
if event:
hk_events.append(event)
# find earliest/latest and delete any existing rows from this range
times = [e.time for e in hk_events]
self._flush_old_data(session, self.SOURCE, min(times), max(times))
session.commit()
# Now we can restart the csv reader to actually load the data
for event in hk_events:
session.merge(event)
print("Adding {} events".format(len(hk_events)))
session.commit()
def parse_record(self, record):
"""
Parse the values in an xml record and return a LogEvent
"""
# Determine if we want this record
source_name = record.attrib['sourceName']
# source_version = record.attrib.get('sourceVersion')
        # todo: We will probably need special handling for sourceName values that we know about from other importers.
# todo: e.g. OneTouch should be converted into OneTouch source events (or ignored, since csv data is richer).
# Parse out other basics about the record
record_type = record.attrib['type']
unit = record.attrib['unit']
value = record.attrib['value']
# Parse the dates and pick the most appropriate one.
# Note: too many other formats are always in "local time", so don't bother trying to deal with tz for now.
event_time = datetime.strptime(record.attrib['endDate'][:19], '%Y-%m-%d %H:%M:%S')
if not event_time:
event_time = datetime.strptime(record.attrib['startDate'][:19], '%Y-%m-%d %H:%M:%S')
if not event_time:
event_time = datetime.strptime(record.attrib['creationDate'][:19], '%Y-%m-%d %H:%M:%S')
# Find the wanted metadata fields
metadata = {}
for meta in record.iterfind('.//MetadataEntry'):
metadata[meta.attrib['key']] = meta.attrib['value']
# Deal with various record types
# if record_type == TYPE_GLUCOSE:
# TODO: re-enable this once we can exclude sources we already get from elsewhere
# if unit != 'mg/dL':
# raise ValueError("Unrecognized unit in record {}".format(', '.join(record)))
# tags = []
# if int(metadata.get('HKWasUserEntered', 0)) == 1:
# tags.append('Manual')
# if metadata.get('Tag Type', 'None') != 'None': # yes, it's stored as the string 'None'
# tags.append(metadata['Tag Type'])
# return GlucoseEvent(
# source=self.SOURCE,
# time=event_time,
# value=value,
# unit=GlucoseEvent.UNIT_MGDL,
# notes=None,
# tags=', '.join(tags),
# )
# elif record_type == TYPE_BP_DIASTOLIC:
# if unit != 'mmHg':
# raise ValueError("Unrecognized unit in record {}".format(', '.join(record)))
# TODO: Implement this if there are no better sources
# elif record_type == TYPE_BP_SYSTOLIC:
# if unit != 'mmHg':
# raise ValueError("Unrecognized unit in record {}".format(', '.join(record)))
# TODO: Implement this if there are no better sources
# elif record_type == TYPE_CARBS:
# if unit != 'g':
# raise ValueError("Unrecognized unit in record {}".format(', '.join(record)))
# TODO: Implement this if there are no better sources
        if record_type == TYPE_STEP_COUNT:
            # attribute values are strings, so convert before comparing numerically
            if float(value) < 1:
                return None
            if unit != 'count':
                raise ValueError("Unrecognized unit in record {}".format(dict(record.attrib)))
return StepsEvent(
source=self.SOURCE,
time=event_time,
value=value,
unit='count',
)
|
echo "Updating Pulumi Stack"
# Download dependencies and build
npm install
# Update the stack
pulumi stack select dev
pulumi up --yes
|
{-# LANGUAGE TypeFamilies, FlexibleInstances, PostfixOperators #-}
{-# OPTIONS_HADDOCK hide #-}
-----------------------------------------------------------------------------
-- |
-- Module : ForSyDe.MoC.SDF
-- Copyright : (c) George Ungureanu, KTH/ICT/E 2015;
-- SAM Group, KTH/ICT/ECS 2007-2008
-- License : BSD-style (see the file LICENSE)
--
-- Maintainer : [email protected]
-- Stability : experimental
-- Portability : portable
--
-- The synchronous dataflow (SDF) library defines process constructors, processes and a signal conduit
-- for the SDF computational model. A process constructor is a
-- higher order function which together with combinational function(s)
-- and values as arguments constructs a process.
-----------------------------------------------------------------------------
module ForSyDe.Atom.MoC.SDF.Core where
import ForSyDe.Atom.MoC
import ForSyDe.Atom.MoC.Stream
import ForSyDe.Atom.Utility.Tuple
-- | Type synonym for consumption rate
type Cons = Int
-- | Type synonym for production rate
type Prod = Int
-- | Type synonym for an SDF signal, i.e. "a signal of SDF events"
type Signal a = Stream (SDF a)
-- | The SDF event. It identifies a synchronous dataflow signal, and
-- wraps only a value.
newtype SDF a = SDF { val :: a }
-- | Implements the SDF semantics for the MoC atoms.
instance MoC SDF where
type Fun SDF a b = (Cons, [a] -> b)
type Ret SDF a = (Prod, [a])
---------------------
_ -.- NullS = NullS
(c,f) -.- s = (comb c f . map val . fromStream) s
where comb c f l = let x' = take c l
xs' = drop c l
in if length x' == c
then SDF (f x') :- comb c f xs'
else NullS
---------------------
cfs -*- s = (comb2 cfs . map val . fromStream) s
where comb2 NullS _ = NullS
comb2 (SDF (c,f):-fs) l = let x' = take c l
xs' = drop c l
in if length x' == c
then SDF (f x') :- comb2 fs xs'
else NullS
---------------------
(-*) NullS = NullS
(-*) ((SDF (p,r)):-xs)
| length r == p = stream (map SDF r) +-+ (xs -*)
| otherwise = error "[MoC.SDF] Wrong production"
---------------------
(-<-) = (+-+)
---------------------
(-&-) _ a = a
---------------------
-- | Allows for mapping of functions on a SDF event.
instance Functor SDF where
fmap f (SDF a) = SDF (f a)
-- | Allows for lifting functions on a pair of SDF events.
instance Applicative SDF where
pure = SDF
(SDF a) <*> (SDF b) = SDF (a b)
instance Foldable SDF where
foldr f z (SDF x) = f x z
foldl f z (SDF x) = f z x
instance Traversable SDF where
traverse f (SDF x) = SDF <$> f x
-- | Shows the value wrapped
instance Show a => Show (SDF a) where
showsPrec _ (SDF x) = (++) (show x)
-- | Reads the value wrapped
instance Read a => Read (SDF a) where
readsPrec _ s = [(SDF x, r) | (x, r) <- reads s]
-----------------------------------------------------------------------------
-- | Transforms a list of values into a SDF signal with only one
-- partition, i.e. all events share the same (initial) tag.
signal :: [a] -> Signal a
signal l = stream (SDF <$> l)
signal2 (l1,l2) = (signal l1, signal l2)
signal3 (l1,l2,l3) = (signal l1, signal l2, signal l3)
signal4 (l1,l2,l3,l4) = (signal l1, signal l2, signal l3, signal l4)
-- | Transforms a signal back to a list
fromSignal :: Signal a -> [a]
fromSignal = fromStream . fmap (\(SDF a) -> a)
-- | Reads a signal from a string. Like with the @read@ function from
-- @Prelude@, you must specify the type of the signal.
--
-- >>> readSignal "{1,2,3,4,5}" :: Signal Int
-- {1,2,3,4,5}
readSignal :: Read a => String -> Signal a
readSignal = read
----------------------------------------------------------------------
scen11 (c,p,f) = ctxt11 c p f
scen12 (c,p,f) = ctxt12 c p f
scen13 (c,p,f) = ctxt13 c p f
scen14 (c,p,f) = ctxt14 c p f
scen21 (c,p,f) = ctxt21 c p f
scen22 (c,p,f) = ctxt22 c p f
scen23 (c,p,f) = ctxt23 c p f
scen24 (c,p,f) = ctxt24 c p f
scen31 (c,p,f) = ctxt31 c p f
scen32 (c,p,f) = ctxt32 c p f
scen33 (c,p,f) = ctxt33 c p f
scen34 (c,p,f) = ctxt34 c p f
scen41 (c,p,f) = ctxt41 c p f
scen42 (c,p,f) = ctxt42 c p f
scen43 (c,p,f) = ctxt43 c p f
scen44 (c,p,f) = ctxt44 c p f
scen51 (c,p,f) = ctxt51 c p f
scen52 (c,p,f) = ctxt52 c p f
scen53 (c,p,f) = ctxt53 c p f
scen54 (c,p,f) = ctxt54 c p f
scen61 (c,p,f) = ctxt61 c p f
scen62 (c,p,f) = ctxt62 c p f
scen63 (c,p,f) = ctxt63 c p f
scen64 (c,p,f) = ctxt64 c p f
scen71 (c,p,f) = ctxt71 c p f
scen72 (c,p,f) = ctxt72 c p f
scen73 (c,p,f) = ctxt73 c p f
scen74 (c,p,f) = ctxt74 c p f
scen81 (c,p,f) = ctxt81 c p f
scen82 (c,p,f) = ctxt82 c p f
scen83 (c,p,f) = ctxt83 c p f
scen84 (c,p,f) = ctxt84 c p f
|
<?php
namespace Tests\guest;
use App\Models\ActivityLogEntry;
use App\Models\Exhibition;
use Tests\TestCase;
use App\Models\Guest;
use App\Models\User;
/**
* - /guests/$id/exit:post
*/
class ExitTest extends TestCase {
/**
     * Test for exiting an exhibition
     * The exhibition the Guest is currently visiting is updated
     * An ActivityLog entry is created
*/
public function testExit() {
$user = User::factory()->permission('exhibition')->has(Exhibition::factory())->create();
$guest = Guest::factory()->state(['exhibition_id'=>$user->id])->create();
$this->actingAs($user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $user->id]
);
$this->assertResponseOk();
$raw_guest = json_decode($this->response->getContent());
$this->assertNull($raw_guest->exhibition_id);
$this->assertTrue(
ActivityLogEntry::query()
->where('guest_id', $guest->id)
->where('exhibition_id', $user->exhibition->id)
->exists()
);
}
/**
     * The Exhibition does not exist
     * EXHIBITION_NOT_FOUND
*/
public function testExhibitionNotFound() {
$user = User::factory()->permission('exhibition')->create();
$guest = Guest::factory()->create();
$this->actingAs($user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $user->id]
);
$this->assertResponseStatus(400);
$this->assertJson($this->response->getContent());
$code = json_decode($this->response->getContent())->error_code;
$this->assertEquals('EXHIBITION_NOT_FOUND', $code);
}
/**
     * Permission checks
     * - Users with executive permission or no permission cannot perform this action
     * - Users with admin permission can specify any exhibition
     * - Users with only exhibition permission can specify only their own exhibition
     * Verify that a 403 is returned when the rules above are violated, and that the request is handled correctly otherwise
*/
public function testPermission() {
$not_permitted_users[] = User::factory()->permission('executive')->create();
$not_permitted_users[] = User::factory()->create();
foreach ($not_permitted_users as $user) {
$this->actingAs($user)->post("/guests/GB_00000000/exit");
$this->assertResponseStatus(403);
}
$admin_users[] = User::factory()->permission('admin')->has(Exhibition::factory())->create();
$admin_users[] = User::factory()->permission('admin', 'exhibition')->has(Exhibition::factory())->create();
$other_exhibition = Exhibition::factory()->create();
foreach ($admin_users as $user) {
foreach ([true, false] as $mode) {
$guest = Guest::factory()->create();
$exh_id = $mode === true ? $user->id : $other_exhibition->id;
$this->actingAs($user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $exh_id]
);
$this->assertResponseOk();
}
}
$exhibition_user = User::factory()->permission('exhibition')->has(Exhibition::factory())->create();
$guest = Guest::factory()->create();
$this->actingAs($exhibition_user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $exhibition_user->id]
);
$this->assertResponseOk();
$guest = Guest::factory()->create();
$this->actingAs($exhibition_user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $other_exhibition->id]
);
$this->assertResponseStatus(403);
}
/**
     * The Guest does not exist
     * GUEST_NOT_FOUND
*/
public function testGuestNotFound() {
$user = User::factory()->permission('exhibition')->has(Exhibition::factory())->create();
Guest::factory()->create();
$this->actingAs($user)->post(
"/guests/GB-00000/exit",
['exhibition_id' => $user->id]
);
$this->assertResponseStatus(404);
$this->assertJson($this->response->getContent());
$code = json_decode($this->response->getContent())->error_code;
$this->assertEquals('GUEST_NOT_FOUND', $code);
}
/**
     * The Guest has already checked out
     * GUEST_ALREADY_CHECKED_OUT
*/
public function testAlreadyExited() {
$user = User::factory()->permission('exhibition')->has(Exhibition::factory())->create();
$guest = Guest::factory()->revoked()->create();
$this->actingAs($user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $user->id]
);
$this->assertResponseStatus(400);
$this->assertJson($this->response->getContent());
$code = json_decode($this->response->getContent())->error_code;
$this->assertEquals('GUEST_ALREADY_CHECKED_OUT', $code);
}
/**
     * Executes correctly no matter which exhibition the Guest is currently in
     * Only the exit log for the new exhibition is created
*/
public function testNoMatterWhereGuestIsIn() {
$user = User::factory()
->permission('admin')
->has(Exhibition::factory())
->create();
$exhibition_id = Exhibition::factory()->create()->id;
$guests[] = Guest::factory()->state(['exhibition_id' => $exhibition_id])->create();
$guests[] = Guest::factory()->state(['exhibition_id' => $user->id])->create();
$guests[] = Guest::factory()->state(['exhibition_id' => null])->create();
foreach ($guests as $guest) {
$this->actingAs($user)->post(
"/guests/{$guest->id}/exit",
['exhibition_id' => $user->id]
);
$this->assertResponseOk();
            // An exit log entry has been created
$this->assertTrue(
ActivityLogEntry::query()
->where('guest_id', $guest->id)
->where('exhibition_id', $user->id)
->where('log_type', 'exit')
->exists()
);
}
}
/**
     * Login check
     * A 401 is returned when the user is not logged in
*/
public function testGuest() {
$guest_id = Guest::factory()->create()->id;
$this->post("/guests/{$guest_id}/exit");
$this->assertResponseStatus(401);
}
}
|
<?php
namespace App\Models;
use Illuminate\Database\Eloquent\Factories\HasFactory;
use Illuminate\Database\Eloquent\Model;
class UsersImage extends Model
{
use HasFactory;
protected $fillable =
[
'image_name',
'image_path',
'pet_id',
'business_profile_id',
];
/**
     * Get the business profile associated with the UsersImage
*
* @return \Illuminate\Database\Eloquent\Relations\HasOne
*/
public function businessProfile()
{
return $this->hasOne(BusinessProfile::class );
}
/**
     * Get the pet associated with the UsersImage
*
* @return \Illuminate\Database\Eloquent\Relations\HasOne
*/
public function pet()
{
return $this->hasOne(Pet::class);
}
}
|
package com.java.demo.designpattern.strategy;
/**
* @author: xinyuan.ymm
* @create: 2017-03-11 上午11:14
*/
public class FlyWithWing implements FlyBehavior {
@Override
public void fly() {
System.out.println("i fly wing");
}
}
|
require 'spec_helper'
RSpec.describe 'fluentd' do
shared_examples 'works' do
it { is_expected.to compile.with_all_deps }
it { is_expected.to contain_class('fluentd') }
it { is_expected.to contain_class('fluentd::install') }
it { is_expected.to contain_class('fluentd::service') }
end
context 'with debian', :debian do
include_examples 'works'
end
context 'with redhat', :redhat do
include_examples 'works'
end
context 'with plugins', :redhat do
let(:params) { { plugins: { plugin_name => plugin_params } } }
let(:plugin_name) { 'fluent-plugin-http' }
let(:plugin_params) { { 'plugin_ensure' => '0.1.0' } }
it { is_expected.to contain_fluentd__plugin(plugin_name).with(plugin_params) }
end
context 'with configs', :redhat do
let(:params) { { configs: { config_name => config_params } } }
let(:config_name) { '100_fwd.conf' }
let(:config_params) { { 'config' => { 'source' => { 'type' => 'forward' } } } }
it { is_expected.to contain_fluentd__config(config_name).with(config_params) }
end
end
|
<?php
namespace RuchJow\TerritorialUnitsBundle\Entity;
use Doctrine\ORM\EntityRepository;
class DistrictRepository extends EntityRepository
{
/**
* @param $name
* @param int $limit
*
* @return District[]
*/
public function findDistrictsByName($name, $limit = 0) {
$name = trim(preg_replace('/\s+/', ' ', $name));
$nameParts = explode(' ', $name);
$qb = $this->createQueryBuilder('d');
$qb->join('d.region', 'r')
->addSelect('r')
->orderBy('d.name');
foreach ($nameParts as $key => $part) {
$qb->andWhere($qb->expr()->like('d.name', ':part_' . $key))
->setParameter('part_' . $key, '%' . $part . '%');
}
if ($limit) {
$qb->setMaxResults($limit);
}
return $qb->getQuery()->getResult();
}
} |
function get_vosa_file() {
local file=${vosa_dir}/${instance}/$vosa_input_file
if [ -e ${file} ]; then
echo $file
else
file=${vosa_dir}/common/$vosa_input_file
if [ -e ${file} ]; then
echo $file
fi
fi
}
function cat_vosa_file() {
local file=$(get_vosa_file)
if [ -e ${file} ]; then
echo "Contents of $file"
echo "===="
cat $file
echo "===="
else
echo "Couldn't find $vosa_input_file in any of the vosa file layers"
exit 1
fi
}
|
#!/bin/bash
CONTAINER=toger
PG_CID=$(docker create -p 5432:5432 --rm --name $CONTAINER \
-e POSTGRES_USER=postgres \
-e POSTGRES_PASSWORD=postgres \
postgres:latest)
docker start $PG_CID |
# no-unused-stylesheet
## Rule Details
This rule aims to find unused SX stylesheet definitions.
Examples of **incorrect** code for this rule:
```jsx
import sx from '@adeira/sx';
export default function MyComponent() {
return null;
}
// Unused ⚠️
const styles = sx.create({
aaa: { color: 'red' },
});
```
```jsx
import sx from '@adeira/sx';
export default function MyComponent() {
return <div className={styles('aaa')} />;
}
const styles = sx.create({
aaa: { color: 'red' },
bbb: { color: 'blue' }, // Unused ⚠️
ccc: { color: 'green' }, // Unused ⚠️
});
```
Examples of **correct** code for this rule:
```jsx
import sx from '@adeira/sx';
export default function MyComponent() {
return <div className={styles('aaa')} />;
}
const styles = sx.create({
aaa: { color: 'red' },
});
```
### Options
_none_
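If this rule is distributed through an ESLint plugin, enabling it could look roughly like the sketch below; the plugin reference here is a placeholder, not the actual package name:
```js
// .eslintrc.js — sketch only; substitute the real plugin that ships this rule.
module.exports = {
  plugins: ['sx'],
  rules: {
    'sx/no-unused-stylesheet': 'error',
  },
};
```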
## When Not To Use It
There should be no valid reason to turn this rule off. It helps with dead code elimination.
|
# Prerequisites:
# 1) gem install xml-simple
# 2) gem install restr
gem 'xml-simple'
gem 'restr'
require 'restr'
require 'logger'
logger = Logger.new('restr.log')
logger.level = Logger::DEBUG
Restr.logger = logger
u0 = "http://localhost:3301/sessions.xml"
o = { :username=>'admin', :password=>'camping'}
p0=Restr.post(u0,o)
u1 = "http://localhost:3301/posts/1.xml"
p = Restr.get(u1,o)
# Modify the title
p['title']='HOT off the presses: ' + p['title']
# Update the resource
p2=Restr.put(u1,p,o)
u3="http://localhost:3301/posts.xml"
p3={ :title=>'Brand new REST-issued post', :body=>'RESTstop makes it happen!!!'}
p4=Restr.post(u2,p3)
u3="http://localhost:3301/posts/4.xml"
p5=Restr.delete(u3) |
import pkg_resources
import sys
import warnings
if (3, 5) <= sys.version_info < (3, 6):
warnings.warn(
"Support for Python 3.5 will be removed in web3.py v5",
category=DeprecationWarning,
stacklevel=2)
if sys.version_info < (3, 5):
raise EnvironmentError(
"Python 3.5 or above is required. "
"Note that support for Python 3.5 will be remove in web3.py v5")
from eth_account import Account # noqa: E402
from web3.main import Web3 # noqa: E402
from web3.providers.rpc import ( # noqa: E402
HTTPProvider,
)
from web3.providers.eth_tester import ( # noqa: E402
EthereumTesterProvider,
)
from web3.providers.ipc import ( # noqa: E402
IPCProvider,
)
from web3.providers.websocket import ( # noqa: E402
WebsocketProvider,
)
__version__ = pkg_resources.get_distribution("web3").version
__all__ = [
"__version__",
"Web3",
"HTTPProvider",
"IPCProvider",
"WebsocketProvider",
"TestRPCProvider",
"EthereumTesterProvider",
"Account",
]
|