Dataset schema (column: type, range or cardinality):
branch_name: stringclasses, 15 values
target: stringlengths, 26 to 10.3M
directory_id: stringlengths, 40 to 40
languages: sequencelengths, 1 to 9
num_files: int64, 1 to 1.47k
repo_language: stringclasses, 34 values
repo_name: stringlengths, 6 to 91
revision_id: stringlengths, 40 to 40
snapshot_id: stringlengths, 40 to 40
input: stringclasses, 1 value
refs/heads/master
<repo_name>adriancervero/fake-news-detector<file_sep>/README.md
# Fake News Detector

This project consists of building a machine learning model capable of detecting fake news. The model is deployed as a web application using Flask.

## Dataset

The corpus for model training was obtained from the following [article](https://www.researchgate.net/publication/333098973_Detection_of_fake_news_in_a_new_corpus_for_the_Spanish_language):

<NAME>., <NAME>., <NAME>., & <NAME>. (2019). Detection of fake news in a new corpus for the Spanish language. Journal of Intelligent & Fuzzy Systems, 36(5), 4869-4876.

It is available in this repository: https://github.com/jpposadas/FakeNewsCorpusSpanish

## Preprocessing

First, I created a preprocessing function to prepare the text before feeding it to the model. This function removes punctuation and tokenizes the words of each text; optionally, it can also remove stopwords and lemmatize the text via parameters. This function is the first step in the pipeline. The next step is the vectorizer, which transforms the text into an array of counts of each word in the corpus, with a vocabulary of 3000 words. After this process the data is ready for the model.

## Model Building

In order to find the best model for this purpose I tried several classification models, such as Logistic Regression, Support Vector Machine, Random Forest and Boosting. I also evaluated different feature representations:

* Bag of words
* N-grams (bigrams and trigrams)

I tried all possible combinations of these setups to find the best accuracy on the training set, using GridSearchCV from the Scikit-Learn Python module.

## Results

|                     | LogReg | SVM  | RF   | Boosting |
|---------------------|--------|------|------|----------|
| CV Score (Accuracy) | 0.82   | 0.77 | 0.79 | 0.80     |

Test Score (using LogReg): 0.78

## Deployment

I used Flask for the web application and Heroku for deploying it. Available on: https://fake-news-detector-by-acervero.herokuapp.com/
<file_sep>/preprocessing.py
from sklearn.base import BaseEstimator, TransformerMixin
import spacy
import numpy as np
from spacy.lang.en.stop_words import STOP_WORDS

nlp = spacy.load('./models/es_core_news_sm-2.3.1')

class TextPreprocessing(BaseEstimator, TransformerMixin):
    def __init__(self, remove_punctuation=True, remove_stopwords=True,
                 lemmatization=True, lowercase=True):
        self.remove_punctuation = remove_punctuation
        self.remove_stopwords = remove_stopwords
        self.lemmatization = lemmatization
        self.lowercase = lowercase

    def fit(self, X, y=None):
        return self

    def transform(self, X, y=None):
        X_transformed = []
        for idx, text in enumerate(X):
            prepared_text = ""
            if self.lowercase:
                text = text.lower()
            doc = nlp(text, disable=['tagger', 'parser', 'ner', 'textcat'])
            for word in doc:
                if self.remove_stopwords and word.is_stop:
                    continue
                if self.remove_punctuation and word.is_punct:
                    continue
                if self.lemmatization:
                    word = word.lemma_
                prepared_text += " " + str(word)
            X_transformed.append(prepared_text)
        return np.array(X_transformed)
<file_sep>/app.py
from flask import Flask, request, url_for, redirect, render_template, jsonify
from preprocessing import TextPreprocessing
import pickle

app = Flask(__name__)

def predict_news(text):
    y = pipeline.predict([text])
    return y

def loadModels(name_file):
    with open('models/' + name_file + '.tk', 'rb') as pickle_file:
        model = pickle.load(pickle_file)
    return model

pipeline = loadModels('pipeline')

@app.route('/')
def home():
    return render_template("home.html")

@app.route('/predict', methods=['POST'])
def predict():
    new_request = request.form
    y = predict_news(new_request['News'])
    return render_template('home.html', pred='News is {}'.format(y))

if __name__ == '__main__':
    app.run(debug=True)
<file_sep>/requirements.txt
Flask==1.1.2
numpy==1.19.4
scikit_learn==0.23.2
spacy==2.3.4
gunicorn==20.0.4
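The preprocessing described in the README (lowercasing, punctuation removal, optional stopword filtering) can be approximated without spaCy. This is a dependency-free sketch, not the project's actual transformer: the stopword set here is a tiny illustrative sample, and the real pipeline also lemmatizes via spaCy.

```python
import string

# Illustrative Spanish stopwords only; the real pipeline uses spaCy's list.
SAMPLE_STOPWORDS = frozenset({"de", "la", "el", "en", "y", "que"})

def preprocess(text, remove_punctuation=True, remove_stopwords=True,
               stopwords=SAMPLE_STOPWORDS):
    """Rough approximation of the TextPreprocessing transformer:
    lowercase, strip punctuation, tokenize on whitespace, drop stopwords."""
    text = text.lower()
    if remove_punctuation:
        text = text.translate(str.maketrans("", "", string.punctuation))
    tokens = text.split()
    if remove_stopwords:
        tokens = [t for t in tokens if t not in stopwords]
    return " ".join(tokens)

# Example: "La noticia en el diario!" -> "noticia diario"
```

The output string then feeds directly into a count vectorizer, as in the pipeline described above.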
ea7180d93e66741685c08e4b98c2e02505e0af9a
[ "Markdown", "Python", "Text" ]
4
Markdown
adriancervero/fake-news-detector
e49b569a719939829e2783923d6b792d2543742d
3c854822ac6c6b9eff355708994fc1d81f850a70
refs/heads/master
<repo_name>TanguyMonteiro/foot<file_sep>/src/Controller/FootballeurController.php
<?php

namespace App\Controller;

use App\Entity\Footballeur;
use App\Repository\FootballeurRepository;
use Doctrine\ORM\EntityManagerInterface;
use Symfony\Bundle\FrameworkBundle\Controller\AbstractController;
use Symfony\Component\HttpFoundation\Request;
use Symfony\Component\HttpFoundation\Response;
use Symfony\Component\Routing\Annotation\Route;
use Symfony\Component\Serializer\SerializerInterface;

/**
 * @Route("/footAPI")
 */
class FootballeurController extends AbstractController
{
    /**
     * @Route("/footballeur", name="footballeur")
     */
    public function index(FootballeurRepository $repository): Response
    {
        $footballeurs = $repository->findAll();
        return $this->json($footballeurs);
    }

    /**
     * @Route("/delete/{id}", name="delete_footballeur")
     */
    public function delete(Footballeur $footballeur, EntityManagerInterface $manager)
    {
        $manager->remove($footballeur);
        $manager->flush();
        $message = "deleted successfully";
        return $this->json($message);
    }

    /**
     * @Route("/create", name="create_footballeur")
     */
    public function create(EntityManagerInterface $manager, Request $request, SerializerInterface $serializer)
    {
        $json = $request->getContent();
        $footballeur = $serializer->deserialize($json, Footballeur::class, 'json');
        $manager->persist($footballeur);
        $manager->flush();
        return $this->json($footballeur);
    }
}
009f2d803f8bf3cbe9c65d6114e8de63dcdb1dd4
[ "PHP" ]
1
PHP
TanguyMonteiro/foot
eae03e1c102ce9d5556405ba2e5fe810efdcbdf2
cd10c294eaa21e1f414628c46078b55450f64c7d
refs/heads/master
<repo_name>oxfordyang2016/youngmenpackage<file_sep>/youngmenpackage/increbackup.sh
#!/bin/bash
/usr/bin/mysqladmin flush-logs -uroot -pyangmingtestmysql
/usr/bin/python /tmp/goodday/ftpuploadincre.py
#/usr/bin/scp /var/lib/mysql/yangming-bin.00001* [email protected]:/tmp/goodday/
<file_sep>/youngmenpackage/increbackup.py
import ftplib
import sys
import re
import os  # required by listfile()
from ctypes import CDLL, create_string_buffer, pointer, sizeof  # required by getuserinfo()

sqlfilelist = []
binfilelist = []
barebinfilelist = []

def getuserinfo(macget_lib_position):
    authlib = CDLL(macget_lib_position)
    mac = create_string_buffer(288)
    authlib.authGetFingerprint(pointer(mac), sizeof(mac))
    mac = mac.value  # this is the MAC code
    username = 'u' + mac[0:6]
    password = 'p' + mac[-6:]
    return [username, password]

# ----------------- list a directory to collect bin and sql file names -----
def listfile(dirname):
    filenamedir = os.listdir(dirname)
    for filename in filenamedir:
        if "sql" in filename:
            sqlfilelist.append(dirname + filename)
        elif "bin" in filename:  # binlog branch (reconstructed; garbled in extraction)
            binfilelist.append(dirname + filename)
            barebinfilelist.append(filename)
    return [sqlfilelist, binfilelist]
# your path format is below
# getdirfilelist = listfile('/var/lib/mysql/')

# print sys.argv[1]  ## python ftpupload.py -u will print -u

try:
    from configparser import ConfigParser
except ImportError:
    from ConfigParser import ConfigParser  # ver. < 3.0

# instantiate
config = ConfigParser()
# parse existing file
config.read(u'/tmp/goodday/position.ini')
# read pos values from a section to finish the upload
currentpos = config.get('pos', 'cupos')
# print currentpos
'''
username = config.get('server', 'user')
password = config.get('server', '<PASSWORD>')
ftp_connection = ftplib.FTP(server, username, password)
'''
userinfo = getuserinfo('/tmp/goodday/libauth.so')
server = '192.168.1.211'
#ftp_connection = ftplib.FTP(server, userinfo[0], userinfo[1])
ftp_connection = ftplib.FTP(server, 'u123456', 'p654321')
uploadbinfile_list = listfile('/var/lib/mysql/')[1]
uploadbinfile_path = (sorted(uploadbinfile_list))[-3]
print sorted(barebinfilelist)
detect_list = sorted(barebinfilelist)
'''
if detect_list[-3] != currentpos:
    oldpos = re.split(r'\.', currentpos)
    oldpos2 = int(oldpos[1])
    uploadposition = re.split(r'\.', detect_list[-3])
    uploadposition2 = int(uploadposition[1])
    while oldpos2 != uploadposition2:
        newposition = uploadposition[0] + "." + (str(oldpos2).zfill(6))
        print "i will upload %s" % newposition
        #ftp_connection.storbinary('STOR /tmp/goodday/purebackdata/%s' % newposition, fh)
        ftp_connection.storbinary('STOR %s' % newposition, fh)
        oldpos2 = oldpos2 + 1
'''
fh = open(uploadbinfile_path, 'rb')  # open the binlog to upload
ftp_connection.storbinary('STOR %s' % (sorted(barebinfilelist))[-3], fh)
uploadposition = re.split(r'\.', detect_list[-3])
uploadposition2 = int(uploadposition[1])
uploadposition3 = uploadposition2 + 1
finalposition = uploadposition[0] + "." + (str(uploadposition3).zfill(6))
config.set('pos', 'cupos', finalposition)
# save to a file
with open(u'/tmp/goodday/position.ini', 'w') as configfile:
    config.write(configfile)
#ftp_connection.storbinary('STOR /tmp/goodday/purebackdata/backupdir/%s' % (sorted(barebinfilelist))[-3], fh)
fh.close()
'''
# ------------------- detect if the client uploaded successfully ------------------------
# update existing value
# yangming-bin.000515
oldpos = re.split(r'\.', currentpos)
oldpos2 = int(oldpos[1])
newpos = oldpos2 + 1
newposition = "yangming-bin." + str(newpos).zfill(6)
config.set('pos', 'cupos', newposition)
# save to a file
with open(u'/tmp/goodday/position.ini', 'w') as configfile:
    config.write(configfile)
# ---------------------- get /var/lib/mysql/ bin file ----------------------
'''
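The position-tracking arithmetic in increbackup.py, incrementing the zero-padded numeric suffix of the current binlog name and writing it back under `[pos] cupos`, can be isolated from the FTP logic. A minimal Python 3 sketch; the section and option names are taken from the script, the file name is illustrative:

```python
from configparser import ConfigParser

def next_binlog_name(current):
    """Return the successor of a MySQL binlog file name,
    e.g. 'yangming-bin.000515' -> 'yangming-bin.000516',
    mirroring the zfill(6) suffix arithmetic in increbackup.py."""
    base, num = current.rsplit(".", 1)
    return "%s.%06d" % (base, int(num) + 1)

# Update the [pos] cupos value the same way the script does
# (kept in memory here; the script persists it to position.ini).
config = ConfigParser()
config["pos"] = {"cupos": "yangming-bin.000515"}
config["pos"]["cupos"] = next_binlog_name(config["pos"]["cupos"])
```

Keeping this pure function separate makes the off-by-one handling in the commented-out catch-up loop much easier to test.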
0925a39989030474109541b34fd30a9ba85df3a2
[ "Python", "Shell" ]
2
Shell
oxfordyang2016/youngmenpackage
e4a7152b15232b94de4c1aa1b61a4a58ca87b0ac
23fa987b4499023d5f4bc6794165cf1ae05515dc
refs/heads/main
<file_sep>#ifndef BACKEND_H #define BACKEND_H #include <QObject> #include <contactsmodel.h> #include <QTcpServer> #include <QTcpSocket> #include <QtSql/QSqlDatabase> class Backend : public QObject { Q_OBJECT public: explicit Backend(QObject *parent = nullptr); ~Backend(); ContactsModel* contactsModel; ChatModel* chatModel; Q_INVOKABLE void on_connectionDialog_closed(const QString ipAddress); Q_INVOKABLE void on_sendButton_clicked(QString message); Q_INVOKABLE void openContact(QString ipAddress); Q_INVOKABLE void removeContact(QString ipAddress); Q_INVOKABLE void reconnectContact(QString ipAddress); private slots: void incomming_connection(); void ready_read(); void on_successful_connection(); void ready_read_connection_params(); void on_disconnected(); private: const quint16 defaultPort = 25565; QTcpServer* server; QList<QTcpSocket*> connections; QString _userName; QSqlDatabase _database; void make_connection(const QString ipAddress); void getUserName(); void startServer(); bool checkIpAddress(QString ipAddress); QTcpSocket* findSocket(QString ipAddress); QTcpSocket* getCurrentSocketSender(); bool openDatabase(QString database, QString databaseName); bool checkTables(QList<QString>& tables) const; bool createDatabaseTables(); ContactsModel* loadDataFromDatabase(); bool saveMessage(Message mes, QString ipAddress); bool saveContact(Contact* contact); bool updateUserName(QString ipAddress,QString name); bool deleteContact(QString ipAddress); }; #endif // BACKEND_H <file_sep>#ifndef CONTACTSTMODEL_H #define CONTACTSTMODEL_H #include <QAbstractListModel> #include <QVector> #include <QMap> #include "contact.h" #include "chatmodel.h" class ContactsModel : public QAbstractListModel { Q_OBJECT public: ContactsModel(); ContactsModel(QObject *parent = nullptr); enum{ IpAddressRole = Qt::UserRole, UserNameRole, IsActiveRole, }; void addContact(Contact* contact); void removeContact(QString ipAddress); Contact* getContact(QString contactIp); void setUserName(QString 
contactIp,QString name); void setActiveContact(QString contactIp); void setActiveContact(Contact* contact); ChatModel* getChatModel(); void addMessage(QString contactIp, Message mes); private: QMap<QString,int> ipAssociation; QVector<Contact*> model; ChatModel* chatModel; // QAbstractItemModel interface public: QVariant data(const QModelIndex &index, int role) const; int rowCount(const QModelIndex &parent) const; QHash<int, QByteArray> roleNames() const; Qt::ItemFlags flags(const QModelIndex &index) const; }; #endif // CONTACTSTMODEL_H <file_sep>#include "backend.h" #include "contact.h" #include <QTcpServer> #include <QTcpSocket> #include <QProcessEnvironment> #include <QDebug> #include <QTime> #include <QtSql/QSqlDatabase> #include <QtSql/QSqlError> #include <QtSql/QSqlQuery> Backend::Backend(QObject *parent) : QObject(parent) { getUserName(); if(openDatabase("QSQLITE", "chat.db")) { qDebug() << "success open database"; QList<QString> tables; //Можно как-то получше этот участок сделать? 
tables.append("contacts"); tables.append("messages"); if(checkTables(tables)) { contactsModel = loadDataFromDatabase(); } else { if(!createDatabaseTables()) { qDebug() << "Exception: not full created database"; _database.close(); } contactsModel = new ContactsModel(this); } } startServer(); connect(server, &QTcpServer::newConnection, this, &Backend::incomming_connection); } Backend::~Backend() { } void Backend::on_connectionDialog_closed(const QString ipAddress) { if(checkIpAddress(ipAddress)) make_connection(ipAddress); } void Backend::incomming_connection() { qDebug() << "new connection from user"; QTcpSocket* newConnection = server->nextPendingConnection(); connections.append(newConnection); Contact* contact = new Contact(newConnection->peerAddress().toString()); contactsModel->addContact(contact); if(!saveContact(contact)) qDebug() << "Failed save contact: " << contact->name << "ip: " << contact->ipAddress; connect(newConnection, &QTcpSocket::readyRead, this, &Backend::ready_read_connection_params); newConnection->write(_userName.toUtf8()); } void Backend::ready_read() { QTcpSocket * socket = getCurrentSocketSender(); qDebug() << "Новое сообщение от " << socket->peerAddress(); QString message(socket->readAll()); qDebug() << message; Message mes(Message(message, false,QTime::currentTime().toString("h:m:s ap"))); contactsModel->addMessage(socket->peerAddress().toString(), mes); if(!saveMessage(mes, socket->peerAddress().toString())) qDebug() << "Exception failed save message for contact: " << socket->peerAddress().toString(); } void Backend::on_successful_connection() { QTcpSocket * socket = getCurrentSocketSender(); connections.append(socket); socket->write(_userName.toUtf8()); connect(socket, &QTcpSocket::readyRead, this, &Backend::ready_read_connection_params); //disconnect(socket, &QTcpSocket::connected, this, &Backend::on_successful_connection); } void Backend::ready_read_connection_params() { QTcpSocket * socket = getCurrentSocketSender(); QString 
userName(socket->readAll()); contactsModel->getChatModel()->setConnectionStatus("Подключено"); contactsModel->setUserName(socket->peerAddress().toString(), userName); if(!updateUserName(socket->peerAddress().toString(), userName)) qDebug() << "Exception: failed update username ip:" << socket->peerAddress().toString() << " new name: " << userName; disconnect(socket, &QTcpSocket::readyRead, this, &Backend::ready_read_connection_params); connect(socket, &QTcpSocket::readyRead, this, &Backend::ready_read); connect(socket, &QTcpSocket::disconnected, this, &Backend::on_disconnected); } void Backend::on_disconnected() { QTcpSocket* socket = getCurrentSocketSender(); if(connections.removeOne(socket)) { qDebug() << "disconnected"; if(contactsModel->getChatModel()->getContact() == contactsModel->getContact(socket->peerAddress().toString())) //Открыт ли чат contactsModel->getChatModel()->setConnectionStatus("Отключено"); disconnect(socket, &QTcpSocket::readyRead, this, &Backend::ready_read); //delete socket; Почему из-за этого вылет?! 
} } void Backend::make_connection(const QString ipAddress) { if(!contactsModel->getContact(ipAddress)) { QTcpSocket* connection = new QTcpSocket(this); Contact* newContact = new Contact(ipAddress); contactsModel->addContact(newContact); contactsModel->getChatModel()->setConnectionStatus("Соединение..."); if(!saveContact(newContact)) qDebug() << "Failed save contact: " << newContact->name << "ip: " << newContact->ipAddress; connection->connectToHost(ipAddress, defaultPort); connect(connection, &QTcpSocket::connected, this, &Backend::on_successful_connection); } } void Backend::getUserName() { QString name = QProcessEnvironment::systemEnvironment().value("USER"); if(name.isEmpty()) name = QProcessEnvironment::systemEnvironment().value("USERNAME"); _userName = name.toUtf8(); } void Backend::startServer() { server = new QTcpServer(this); bool res = server->listen(QHostAddress::AnyIPv4, defaultPort); qDebug() << "server listen = " << res; if(!res) { qDebug() << "repeat server listen = " << server->listen(QHostAddress::AnyIPv4, defaultPort+1); } } bool Backend::checkIpAddress(QString ipAddress) { QRegExp re("(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)"); qDebug() << ipAddress; return re.exactMatch(ipAddress); } QTcpSocket *Backend::findSocket(QString ipAddress) { if(connections.empty()) return nullptr; for(auto it = connections.cbegin(); it != connections.end(); ++it) if((*it)->peerAddress().toString() == ipAddress) return *it; return nullptr; } QTcpSocket *Backend::getCurrentSocketSender() { QObject * object = QObject::sender(); if (!object) return nullptr; return static_cast<QTcpSocket *>(object); } bool Backend::openDatabase(QString database, QString databaseName) { _database = QSqlDatabase::addDatabase(database,"mainbase"); //Почему нет указателя??? 
_database.setDatabaseName(databaseName); if(!_database.open()){ qDebug() << "can't open database: " << database << " name:" << databaseName; qCritical() << _database.lastError().text(); return false; } return true; } bool Backend::checkTables(QList<QString> &tables) const { auto databaseTables = _database.tables(); for(auto table = tables.cbegin(); table != tables.cend(); ++table) { if(!databaseTables.contains(*table)) return false; } return true; } bool Backend::createDatabaseTables() { QSqlQuery query(_database); if(!query.exec("PRAGMA foreign_keys=on")) qDebug() << "Failed set up foreign_keys"; qDebug() << "Creating table contacts..."; if(!query.exec("CREATE TABLE contacts(" "ip_address varchar(15) NOT NULL PRIMARY KEY," "name VARCHAR(32) NOT NULL);")) { qDebug() << "failed create table contacts"; return false; } qDebug() << "Creating table messages..."; if(!query.exec("CREATE TABLE messages(" "message_PK INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT," "message TEXT NOT NULL," "is_my_message Boolean NOT NULL," "contact_FK varchar(15) NOT NULL," "time TEXT NOT NULL," "FOREIGN KEY(contact_FK) REFERENCES contacts(ip_address) On DELETE CASCADE);")) { qDebug() << "failed create table messages"; return false; } return true; } ContactsModel *Backend::loadDataFromDatabase() { QSqlQuery contactQuery(_database); QSqlQuery messageQuery(_database); messageQuery.prepare("SELECT message, is_my_message, time FROM messages WHERE contact_FK = ?"); contactQuery.exec("SELECT ip_address, ip_address, name FROM contacts"); ContactsModel* contacts = new ContactsModel(this); while (contactQuery.next()) { Contact* contact = new Contact(contactQuery.value("ip_address").toString(), contactQuery.value("name").toString()); messageQuery.addBindValue(contactQuery.value("ip_address").toString()); messageQuery.exec(); while(messageQuery.next()) { Message mes(messageQuery.value("message").toString(), messageQuery.value("is_my_message").toBool(), messageQuery.value("time").toString()); 
contact->AddMessage(mes); } contacts->addContact(contact); } return contacts; } bool Backend::saveMessage(Message mes, QString ipAddress) { if(!_database.isOpen()) return false; QSqlQuery query(_database); qDebug() << "Saving message"; query.prepare("INSERT INTO messages(message, is_my_message, contact_fk, time) VALUES (?,?,?,?)"); query.addBindValue(mes.text); query.addBindValue(mes.isMyMessage); query.addBindValue(ipAddress); query.addBindValue(mes.time); return query.exec(); } bool Backend::saveContact(Contact *contact) { if(!_database.isOpen()) return false; QSqlQuery query(_database); qDebug() << "Saving contact"; query.prepare("INSERT INTO contacts (ip_address, name) VALUES (?,?)"); query.addBindValue(contact->ipAddress); query.addBindValue(contact->name); return query.exec(); } bool Backend::updateUserName(QString ipAddress, QString name) { if(!_database.isOpen()) return false; QSqlQuery query(_database); qDebug() << "Updating name: " << name << " ip:" << ipAddress; query.prepare("UPDATE contacts SET name = ? 
WHERE ip_address = ?"); query.addBindValue(name); query.addBindValue(ipAddress); return query.exec(); } bool Backend::deleteContact(QString ipAddress) { if(!_database.isOpen()) return false; QSqlQuery query(_database); qDebug() << "Deliting contact ip address: " << ipAddress; query.prepare("DELETE FROM contacts WHERE ip_address == ?"); query.addBindValue(ipAddress); return query.exec(); } void Backend::on_sendButton_clicked(QString message) { QString ip = contactsModel->getChatModel()->getContact()->ipAddress; auto socketReciever = findSocket(ip); if(socketReciever) { socketReciever->write(message.toUtf8()); Message mes(Message(message, true,QTime::currentTime().toString("h:m:s ap"))); contactsModel->getChatModel()->addMessage(mes); if(!saveMessage(mes, ip)) qDebug() << "Exception failed save message for contact: " << ip; } } void Backend::openContact(QString ipAddress) { Contact* contact = contactsModel->getContact(ipAddress); ChatModel* chatmodel = contactsModel->getChatModel(); if(contact == chatmodel->getContact()) return; if(contact) //Хотя это вроде невозможно { contactsModel->setActiveContact(contact); QTcpSocket* contactSocket = findSocket(ipAddress); if(contactSocket) { if(contactSocket->isOpen()) chatmodel->setConnectionStatus("Подключено"); else chatmodel->setConnectionStatus("Соединение..."); } else { qDebug() << "Socket doesn't exist"; chatmodel->setConnectionStatus("Не подключено (Архивная запись)"); } } } void Backend::removeContact(QString ipAddress) { QTcpSocket* socket = findSocket(ipAddress); if(socket) { if(socket->isOpen()) socket->disconnected(); connections.removeOne(socket); } Contact* contact = contactsModel->getChatModel()->getContact(); if(contact) { if(contact->ipAddress == ipAddress) { contactsModel->getChatModel()->setContact(nullptr); } } contactsModel->removeContact(ipAddress); if(deleteContact(ipAddress)) qDebug() << "Contact: " << ipAddress << " deleted"; } void Backend::reconnectContact(QString ipAddress) { return; } 
<file_sep>#include "message.h" Message::Message() { } Message::Message(QString text, bool isMyMessage, QString time) { this->text = text; this->isMyMessage = isMyMessage; this->time = time; } <file_sep>#ifndef CONTACT_H #define CONTACT_H #include <QVector> #include <message.h> class Contact { public: Contact(QString ipAddress = QString(), QString name = "Unknown"); QString ipAddress; QString name; QVector<Message> chat; void AddMessage(Message mes); }; #endif // CONTACT_H <file_sep>#ifndef MESSAGE_H #define MESSAGE_H #include <QString> class Message { public: Message(); Message(QString text, bool isMyMessage, QString time); QString text; bool isMyMessage; QString time; }; #endif // MESSAGE_H <file_sep>#include "contactsmodel.h" #include "contact.h" #include <QtDebug> ContactsModel::ContactsModel() { chatModel = new ChatModel(this); } ContactsModel::ContactsModel(QObject *parent): QAbstractListModel(parent) { chatModel = new ChatModel(this); } void ContactsModel::addContact(Contact* contact) { beginInsertRows(QModelIndex(), model.size(), model.size()); model.append(contact); ipAssociation.insert(contact->ipAddress, model.size()-1); //chatModel->setContact(contact); endInsertRows(); if(model.size() < 2) //Да, это костыль, но оно того стоит chatModel->setContact(contact); else setActiveContact(contact); } void ContactsModel::removeContact(QString ipAddress) { int index = ipAssociation[ipAddress]; beginRemoveRows(QModelIndex(),index,index); model.remove(index); ipAssociation.remove(ipAddress); for(auto it = ipAssociation.begin(); it != ipAssociation.end(); ++it) { if((*it) > index) --(*it); } endRemoveRows(); } Contact* ContactsModel::getContact(QString contactIp) { if(ipAssociation.contains(contactIp)) return model[ipAssociation[contactIp]]; else return nullptr; } void ContactsModel::setUserName(QString contactIp, QString name) { model[ipAssociation[contactIp]]->name = name; if(model[ipAssociation[contactIp]] == chatModel->getContact()) chatModel->userNameChanged(); 
QModelIndex changedIndex = createIndex(ipAssociation[contactIp],0); dataChanged(changedIndex,changedIndex); } void ContactsModel::setActiveContact(QString contactIp) { QModelIndex oldIndex; if(chatModel->getContact()) oldIndex = createIndex(ipAssociation[chatModel->getContact()->ipAddress],0); else oldIndex = createIndex(model.size(),0); chatModel->setContact(model[ipAssociation[contactIp]]); QModelIndex newIndex = createIndex(ipAssociation[contactIp],0); if(oldIndex < newIndex) dataChanged(oldIndex,newIndex); else dataChanged(newIndex,oldIndex); } void ContactsModel::setActiveContact(Contact *contact) { QModelIndex oldIndex; if(chatModel->getContact()) oldIndex = createIndex(ipAssociation[chatModel->getContact()->ipAddress],0); else oldIndex = createIndex(model.size(),0); chatModel->setContact(contact); QModelIndex newIndex = createIndex(ipAssociation[contact->ipAddress],0); if(oldIndex < newIndex) dataChanged(oldIndex,newIndex); else dataChanged(newIndex,oldIndex); } ChatModel *ContactsModel::getChatModel() { return chatModel; } void ContactsModel::addMessage(QString contactIp, Message mes) { if(model[ipAssociation[contactIp]] == chatModel->getContact()) chatModel->addMessage(mes); else model[ipAssociation[contactIp]]->AddMessage(mes); } QVariant ContactsModel::data(const QModelIndex &index, int role) const { if(!index.isValid()) return QVariant(); switch (role) { case IpAddressRole: return QVariant(model.at(index.row())->ipAddress); case UserNameRole: return QVariant(model.at(index.row())->name); case IsActiveRole: return chatModel->getContact() == model[index.row()]; } return QVariant(); } int ContactsModel::rowCount(const QModelIndex &parent) const { Q_UNUSED(parent) return model.size(); } QHash<int, QByteArray> ContactsModel::roleNames() const { QHash<int, QByteArray> names; names[IpAddressRole] = "ipAddress"; names[UserNameRole] = "userName"; names[IsActiveRole] = "isActive"; return names; } Qt::ItemFlags ContactsModel::flags(const QModelIndex &index) const 
{ return Qt::NoItemFlags; } <file_sep>#include "chatmodel.h" #include "contact.h" ChatModel::ChatModel(QObject* parent): QAbstractListModel(parent) { } ChatModel::ChatModel(Contact *model, QObject *parent):QAbstractListModel(parent) { _contactModel = model; } void ChatModel::setContact(Contact *model) { beginResetModel(); _contactModel = model; endResetModel(); userNameChanged(); } Contact *ChatModel::getContact() { return _contactModel; } void ChatModel::setConnectionStatus(QString connectionStatus) { _connectionStatus = connectionStatus; connectionStatusChanged(); } void ChatModel::setUserName(QString name) { _contactModel->name = name; userNameChanged(); } QString ChatModel::ipAddress() { if(_contactModel) return _contactModel->ipAddress; else return ""; } QString ChatModel::userName() { if(_contactModel) return _contactModel->name; else return ""; } QString ChatModel::connectionStatus() { return _connectionStatus; } void ChatModel::addMessage(Message mes) { if(_contactModel) { beginInsertRows(QModelIndex(), _contactModel->chat.size(), _contactModel->chat.size()); _contactModel->chat.append(mes); endInsertRows(); } } int ChatModel::rowCount(const QModelIndex &parent) const { if(_contactModel) return _contactModel->chat.size(); else return 0; } QVariant ChatModel::data(const QModelIndex &index, int role) const { if(!index.isValid() || !_contactModel) return QVariant(); switch (role) { case TextRole: return _contactModel->chat.at(index.row()).text; case IsMyMessage: return _contactModel->chat.at(index.row()).isMyMessage; case TimeRole: return _contactModel->chat.at(index.row()).time; } return QVariant(); } QHash<int, QByteArray> ChatModel::roleNames() const { QHash<int, QByteArray> names; names[TextRole] = "text"; names[IsMyMessage] = "isMyMessage"; names[TimeRole] = "time"; return names; } Qt::ItemFlags ChatModel::flags(const QModelIndex &index) const { return Qt::NoItemFlags; } <file_sep>#ifndef CHATMODEL_H #define CHATMODEL_H #include <QAbstractListModel> 
#include <QVector> #include <message.h> #include <contact.h> class ChatModel : public QAbstractListModel { Q_OBJECT Q_PROPERTY(QString userName READ userName NOTIFY userNameChanged) Q_PROPERTY(QString ipAddress READ ipAddress NOTIFY ipAddressChanged) Q_PROPERTY(QString connectionStatus READ connectionStatus NOTIFY connectionStatusChanged) public: ChatModel(QObject* parent = nullptr); ChatModel(Contact* model, QObject* parent = nullptr); enum{ TextRole = Qt::UserRole, IsMyMessage, TimeRole, }; void setContact(Contact* model); Contact* getContact(); void setConnectionStatus(QString connectionStatus); void setUserName(QString name); QString ipAddress(); QString userName(); QString connectionStatus(); void addMessage(Message mes); signals: void userNameChanged(); void ipAddressChanged(); void connectionStatusChanged(); private: Contact* _contactModel; QString _connectionStatus = ""; // QAbstractItemModel interface public: int rowCount(const QModelIndex &parent) const; QVariant data(const QModelIndex &index, int role) const; QHash<int, QByteArray> roleNames() const; Qt::ItemFlags flags(const QModelIndex &index) const; }; #endif // CHATMODEL_H <file_sep>#include "contact.h" Contact::Contact(QString ipAddress, QString name) { this->ipAddress = ipAddress; this->name = name; } void Contact::AddMessage(Message mes) { chat.append(mes); }
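The `Backend::createDatabaseTables` and `deleteContact` code above relies on SQLite enforcing `ON DELETE CASCADE` between `contacts` and `messages` once `PRAGMA foreign_keys` is switched on for the connection. The same schema can be exercised standalone; this sketch uses Python's sqlite3 module for brevity (the app itself goes through Qt's QSqlDatabase):

```python
import sqlite3

# In-memory database; foreign-key enforcement is per-connection in SQLite,
# which is why the C++ code issues the PRAGMA right after opening.
con = sqlite3.connect(":memory:")
con.execute("PRAGMA foreign_keys=ON")
con.execute("CREATE TABLE contacts("
            "ip_address VARCHAR(15) NOT NULL PRIMARY KEY,"
            "name VARCHAR(32) NOT NULL)")
con.execute("CREATE TABLE messages("
            "message_PK INTEGER NOT NULL PRIMARY KEY AUTOINCREMENT,"
            "message TEXT NOT NULL,"
            "is_my_message BOOLEAN NOT NULL,"
            "contact_FK VARCHAR(15) NOT NULL,"
            "time TEXT NOT NULL,"
            "FOREIGN KEY(contact_FK) REFERENCES contacts(ip_address)"
            " ON DELETE CASCADE)")

con.execute("INSERT INTO contacts VALUES ('127.0.0.1', 'alice')")
con.execute("INSERT INTO messages(message, is_my_message, contact_FK, time)"
            " VALUES ('hi', 1, '127.0.0.1', '10:00')")

# Deleting the contact cascades to its messages, so removeContact()
# never leaves orphaned rows behind.
con.execute("DELETE FROM contacts WHERE ip_address = '127.0.0.1'")
remaining = con.execute("SELECT COUNT(*) FROM messages").fetchone()[0]
```

Without the PRAGMA the cascade is silently skipped, which would leave orphaned messages after `deleteContact`.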
6c1a37399a5479f25c4d84b5d46e8919b4ebda1c
[ "C++" ]
10
C++
Desfirit/OnlineChat
1ef54a9d8df441df4e7d1a75a714f5ac92213ebd
f834bfc7c184d071bbc81c1d3d3e78fe7d2c17b1
refs/heads/master
<repo_name>williash23/UMT-BrodieLab-GlobalDiv<file_sep>/FunctionalDiversity190528_funs.R # Functions used in Functional Diversity analysis # Function to prepare mammal range polygons (from IUCN shapefile) for use within # FD analysis. # Must supply file name of mammal or bird range file prep_range_polys <- function(fn = NULL, ctrs = NULL){ poly_tmp <- st_read(fn) %>% dplyr::filter(legend != "Extinct") %>% ## Now removes extinct species dplyr::select(binomial) %>% tidyr::separate(binomial, c("genus", "spp"), " ") %>% dplyr::mutate(species = paste(genus, spp, sep = "_")) %>% dplyr::select(species) poly <- st_transform(poly_tmp, st_crs(ctrs)) # Match countries sf object return(poly) } # Example function calls: #mam <- prep_range_polys(fn = "TERRESTRIAL_MAMMALS.shp", ctrs = ctrs) # Function to calculate FD per community. From: <NAME> # https://github.com/opetchey/ttl-resources/blob/master/functional_diversity/FD.example.1.r Getlength <- function(xtree, comp=NA){ if(!is.data.frame(comp)) result <- Getlength.inner(xtree) if(is.data.frame(comp)){ S <- tapply(comp[,2], comp[,1], function(x) length(x)) FD <- tapply(comp[,2], comp[,1], function(x) Getlength.inner(list(xtree[[1]], xtree[[2]][!is.na(match(dimnames(xtree[[2]])[[1]], x)),]))) FD <- FD/Getlength.inner(xtree) result <- data.frame(S=S, FD.new=FD) } result } Getlength.inner <- function(xtree){ if(!is.matrix(xtree[[2]])) result <- 0 if(is.matrix(xtree[[2]])) result = sum(xtree[[1]][colSums(xtree[[2]]) != 0 & colSums(xtree[[2]]) < length(xtree[[2]][,1])]) result } ## 17/1/03. Written by <NAME>. Please acknowledge as appropriate. 
Xtree <- function(h)
## evaluate species branch matrix (sensu Petchey & Gaston) from a dendrogram
## tested for results of hclust and agnes
## hclust - hierarchical clustering
## agnes - agglomerative clustering
## used components:
## merge - history of cluster merging
## height - actual heights at merging
## order - permutation to achieve nice output (needed only for agnes)
{
  species.names <- h$labels
  H1 <- matrix(0, length(h$order), 2 * length(h$order) - 2)
  l <- vector("numeric", 2 * length(h$order) - 2)
  for(i in 1:(length(h$order) - 1)) {
    # evaluate branch lengths
    if(h$merge[i, 1] < 0) {
      l[2 * i - 1] <- h$height[order(h$height)[i]]
      H1[-h$merge[i, 1], 2 * i - 1] <- 1
    } else {
      l[2 * i - 1] <- h$height[order(h$height)[i]] - h$height[order(h$height)[h$merge[i, 1]]]
      H1[, 2 * i - 1] <- H1[, 2 * h$merge[i, 1] - 1] + H1[, 2 * h$merge[i, 1]]
    }
    if(h$merge[i, 2] < 0) {
      l[2 * i] <- h$height[order(h$height)[i]]
      H1[-h$merge[i, 2], 2 * i] <- 1
    } else {
      l[2 * i] <- h$height[order(h$height)[i]] - h$height[order(h$height)[h$merge[i, 2]]]
      H1[, 2 * i] <- H1[, 2 * h$merge[i, 2] - 1] + H1[, 2 * h$merge[i, 2]]
    }
  }
  dimnames(H1) <- list(species.names, NULL)
  list(h2.prime = l, H1 = H1)
  ## l contains the length of all the tiny branches
  ## H1: each row represents one species, each column represents one branch
  ## 1 indicates that a branch is part of the pathway from species to top of the dendrogram
  ## 0 otherwise
}
<file_sep>/FD_PD_190721.R
#setwd("D:/Box Sync/Projects/Projects (active)/Functional diversity/Analysis")
setwd("C:/Users/JB/Box Sync/Projects/Projects (active)/Functional diversity/Analysis")
#setwd("C:/Users/saraw/Documents/FD")

library(dplyr)
library(tidyr)
library(sf)
library(picante)
library(raster)
library(ape)

rm(list = ls())

# ----------------------- USER-DEFINED PARAMETERS ----------------------------------------------------------------
# Load functions
source("FunctionalDiversity190528_funs.R")
st_erase = function(x, y) st_difference(x, st_union(st_combine(y)))

# Desired spatial projection
prj <- "+proj=cea +lon_0=0 +lat_ts=30 +x_0=0 +y_0=0 +datum=WGS84 +units=m +no_defs"

# Number of randomization iterations to calculate bias in funct & phylogen loss
numrands <- 501 # 1 will be discarded

# Bonferroni correction
Ntot <- 232 # number of countries for which we calculate FD and PD (i.e. where SR0 > 1)
bc <- (0.05/Ntot)/2 # two-tailed; family-wise alpha = 0.05

# Parameters for Petchey's functional diversity calculation
Distance.method <- "euclidean"
Cluster.method <- "average"

#----------------------- SPECIES DATA -------------------------------------------------
dat1 <- data.frame(read.csv("Data/Raw/mammal_threats_traits.csv", header=T))
dat1$ExceptionsToDecline[dat1$ExceptionsToDecline == "na"] <- NA
spptree <- read.tree("Data/Raw/mammaltree.tree")
dat1$species <- as.character(dat1$species)

#----------------------- COUNTRIES AND SPP RANGE PREP ---------------------------------
ctrs <- st_read("Data/Raw/UIA_World_Countries_Boundaries.shp") %>%
  dplyr::select(Country, geometry) %>%
  st_transform(st_crs(prj)) %>%
  mutate(ctry_sq_km = as.numeric(st_area(.))/1000000)
ctrs_area <- ctrs %>% as.data.frame() %>% dplyr::select(Country, ctry_sq_km, -geometry)
ctrs_df <- ctrs %>% as.data.frame() %>% dplyr::select(-geometry)
ctrs_df$Country <- as.character(ctrs_df$Country)

# Run functions to prepare range polygons
# removes mammals that are listed as extinct
mam <- prep_range_polys(fn = "Data/Raw/TERRESTRIAL_MAMMALS.shp", ctrs = ctrs)
# This data set was obtained from the IUCN website:
# https://www.iucnredlist.org/resources/spatial-data-download
# Downloaded on 2017-06-15
# Processing function found in script: FunctionalDiversity190528_funs.R

#----------------------- SPECIES - COUNTRIES INTERSECTION ---------------------------------
polys_ctrs_int <- st_intersects(mam, ctrs, sparse = FALSE)
# Dimensions: nrow = number of spp (12112 for mammals), ncol = number of countries (254)

spp_int_ls <- list()
for(i in 1:nrow(ctrs_df)){
  polys_ctrs_int_vec <-
    which(polys_ctrs_int[,i] == TRUE)
  spp_list_ctr <- mam[polys_ctrs_int_vec, 1]
  st_geometry(spp_list_ctr) <- NULL
  spp_int_ls[[i]] <- spp_list_ctr
}

#----------------------- MANUALLY ADD CERTAIN SPECIES ---------------------------------
# Species present in IUCN range descriptions but not IUCN range maps
# Maldives
add <- data.frame(species = "Myotis_blythii")
spp_int_ls[[59]] <- rbind(spp_int_ls[[59]], add)
# Turkey
add <- data.frame(species = "Miniopterus_schreibersii")
spp_int_ls[[16]] <- rbind(spp_int_ls[[16]], add)
# France
add <- data.frame(species = "Speothos_venaticus")
spp_int_ls[[34]] <- rbind(spp_int_ls[[34]], add)
# Sri Lanka
add <- data.frame(species = "Myotis_blythii")
spp_int_ls[[63]] <- rbind(spp_int_ls[[63]], add)
# Bangladesh
add <- data.frame(species = "Myotis_blythii")
spp_int_ls[[80]] <- rbind(spp_int_ls[[80]], add)
# Bhutan
add <- data.frame(species = "Myotis_blythii")
spp_int_ls[[81]] <- rbind(spp_int_ls[[81]], add)
# Israel
add <- data.frame(species = "Gazella_dorcas")
spp_int_ls[[224]] <- rbind(spp_int_ls[[224]], add)
# Croatia
add <- data.frame(species = "Miniopterus_schreibersii")
spp_int_ls[[248]] <- rbind(spp_int_ls[[248]], add)
# Bulgaria
add <- data.frame(species = "Miniopterus_schreibersii")
spp_int_ls[[254]] <- rbind(spp_int_ls[[254]], add)

#----------------------- EXCEPTIONS TO DECLINE ---------------------------------
exc_tmp <- dat1 %>% dplyr::select(species, ExceptionsToDecline)
exc_tmp$ExceptionsToDecline <- as.character(exc_tmp$ExceptionsToDecline)

# Do first one to start data frame
spp <- exc_tmp$species[1]
tmp <- exc_tmp$ExceptionsToDecline[1]
tmp_df <- as.data.frame(tmp) %>%
  tidyr::separate(tmp, paste("Country", 1:22, sep="_"), sep = ", ")
exc_df <- cbind(spp, tmp_df)
for(i in 1:nrow(exc_tmp)){
  spp <- exc_tmp$species[i]
  tmp <- exc_tmp$ExceptionsToDecline[i]
  tmp_df <- as.data.frame(tmp) %>%
    tidyr::separate(tmp, paste("Country", 1:22, sep="_"), sep = ", ")
  new_row <- cbind(spp, tmp_df)
  exc_df <- rbind(exc_df, new_row)
}
exc_df <-
  exc_df[rowSums(is.na(exc_df[,2:23])) != 22,]
exc_df <- exc_df %>% mutate_if(is.factor, as.character)
dat1 <- subset(dat1, select = -c(ExceptionsToDecline))

#----------------------- SAVE DATA FOR LATER USE / LOAD PREVIOUSLY PREPPED DATA ----------------
#save(spp_int_ls, file="Data/spp_int_ls.Rdata")
#save(dat1, file="Data/dat1.Rdata")
#save(ctrs, file="Data/ctrs.Rdata")
#save(ctrs_df, file="Data/ctrs_df.Rdata")
#save.image("working")
#load("working")

#----------------------- ASSESS THREATS PER COUNTRY ------------------------------
### 'FOR' LOOP, RUNNING THROUGH EACH COUNTRY ONE BY ONE
# Number of countries to analyze
numcountries <- length(spp_int_ls)
out_mat <- matrix(NA, nrow = numcountries, ncol = 170)
start <- Sys.time()
start
for(i in 1:numcountries){
  tryCatch({
    # Exceptions to decline for country i
    country_name <- as.character(ctrs_df[i,1])
    keep_exc <- which(apply(exc_df, 1, function(r) any(r %in% country_name)))
    spp_exc <- exc_df[keep_exc,1]
    spp_exc_df <- as.data.frame(spp_exc)
    colnames(spp_exc_df) <- "species"
    # Species list in country i
    spp_list <- spp_int_ls[[i]]
    spp_list <- unique(spp_list)
    datXY <- dplyr::left_join(spp_list, dat1, by="species")
    datXY <- datXY[complete.cases(datXY),]
    all_spp <- as.data.frame(datXY$species)
    colnames(all_spp)[1] <- "species"
    SR0 <- nrow(datXY) # Taxonomic diversity (species richness)
    t0.01 <- mean(datXY$DietInv)
    t0.02 <- mean(datXY$DietVert)
    t0.03 <- mean(datXY$DietFish)
    t0.04 <- mean(datXY$DietScav)
    t0.05 <- mean(datXY$DietFruit)
    t0.06 <- mean(datXY$DietNect)
    t0.07 <- mean(datXY$DietSeed)
    t0.08 <- mean(datXY$DietHerb)
    t0.09 <- mean(datXY$BodyMass)
    t0.10 <- mean(datXY$Ground)
    t0.11 <- mean(datXY$Climbing)
    t0.12 <- mean(datXY$Volant)

    #----- Current diversity
    if(SR0 < 2){
      FD0 <- NA; FD_r <- rep(NA, 11);
      FDrand_allthreats <- NA; FDrand_hab1 <- NA; FDrand_hunt1 <- NA; FDrand_clim1 <- NA;
      FDrand_con1 <- NA; FDrand_nonnat1 <- NA; FDrand_pol1 <- NA; FDrand_hyb1 <- NA;
      FDrand_prey1 <- NA; FDrand_dis1 <- NA; FDrand_inb1 <- NA;
      FDrandLO_allthreats <- NA; FDrandLO_hab1 <- NA; FDrandLO_hunt1 <- NA; FDrandLO_clim1 <- NA;
      FDrandLO_con1 <- NA; FDrandLO_nonnat1 <- NA; FDrandLO_pol1 <- NA; FDrandLO_hyb1 <- NA;
      FDrandLO_prey1 <- NA; FDrandLO_dis1 <- NA; FDrandLO_inb1 <- NA;
      FDrandHI_allthreats <- NA; FDrandHI_hab1 <- NA; FDrandHI_hunt1 <- NA; FDrandHI_clim1 <- NA;
      FDrandHI_con1 <- NA; FDrandHI_nonnat1 <- NA; FDrandHI_pol1 <- NA; FDrandHI_hyb1 <- NA;
      FDrandHI_prey1 <- NA; FDrandHI_dis1 <- NA; FDrandHI_inb1 <- NA;
      FDrandLObonferroni_allthreats <- NA; FDrandLObonferroni_hab1 <- NA; FDrandLObonferroni_hunt1 <- NA;
      FDrandLObonferroni_clim1 <- NA; FDrandLObonferroni_con1 <- NA; FDrandLObonferroni_nonnat1 <- NA;
      FDrandLObonferroni_pol1 <- NA; FDrandLObonferroni_hyb1 <- NA; FDrandLObonferroni_prey1 <- NA;
      FDrandLObonferroni_dis1 <- NA; FDrandLObonferroni_inb1 <- NA;
      FDrandHIbonferroni_allthreats <- NA; FDrandHIbonferroni_hab1 <- NA; FDrandHIbonferroni_hunt1 <- NA;
      FDrandHIbonferroni_clim1 <- NA; FDrandHIbonferroni_con1 <- NA; FDrandHIbonferroni_nonnat1 <- NA;
      FDrandHIbonferroni_pol1 <- NA; FDrandHIbonferroni_hyb1 <- NA; FDrandHIbonferroni_prey1 <- NA;
      FDrandHIbonferroni_dis1 <- NA; FDrandHIbonferroni_inb1 <- NA;
      PD0 <- NA;
      PD_allthreats <- NA; PD_hab1 <- NA; PD_hunt1 <- NA; PD_clim1 <- NA; PD_con1 <- NA;
      PD_nonnat1 <- NA; PD_pol1 <- NA; PD_hyb1 <- NA; PD_prey1 <- NA; PD_dis1 <- NA; PD_inb1 <- NA;
      PDrand_allthreats <- NA; PDrand_hab1 <- NA; PDrand_hunt1 <- NA; PDrand_clim1 <- NA;
      PDrand_con1 <- NA; PDrand_nonnat1 <- NA; PDrand_pol1 <- NA; PDrand_hyb1 <- NA;
      PDrand_prey1 <- NA; PDrand_dis1 <- NA; PDrand_inb1 <- NA;
      PDrandLO_allthreats <- NA; PDrandLO_hab1 <- NA; PDrandLO_hunt1 <- NA; PDrandLO_clim1 <- NA;
      PDrandLO_con1 <- NA; PDrandLO_nonnat1 <- NA; PDrandLO_pol1 <- NA; PDrandLO_hyb1 <- NA;
      PDrandLO_prey1 <- NA; PDrandLO_dis1 <- NA; PDrandLO_inb1 <- NA;
      PDrandHI_allthreats <- NA; PDrandHI_hab1 <- NA; PDrandHI_hunt1 <- NA; PDrandHI_clim1 <- NA;
      PDrandHI_con1 <- NA; PDrandHI_nonnat1 <- NA; PDrandHI_pol1 <- NA; PDrandHI_hyb1 <- NA;
      PDrandHI_prey1 <- NA; PDrandHI_dis1 <- NA; PDrandHI_inb1 <- NA;
      PDrandLObonferroni_allthreats <- NA; PDrandLObonferroni_hab1 <- NA; PDrandLObonferroni_hunt1 <- NA;
      PDrandLObonferroni_clim1 <- NA; PDrandLObonferroni_con1 <- NA; PDrandLObonferroni_nonnat1 <- NA;
      PDrandLObonferroni_pol1 <- NA; PDrandLObonferroni_hyb1 <- NA; PDrandLObonferroni_prey1 <- NA;
      PDrandLObonferroni_dis1 <- NA; PDrandLObonferroni_inb1 <- NA;
      PDrandHIbonferroni_allthreats <- NA; PDrandHIbonferroni_hab1 <- NA; PDrandHIbonferroni_hunt1 <- NA;
      PDrandHIbonferroni_clim1 <- NA; PDrandHIbonferroni_con1 <- NA; PDrandHIbonferroni_nonnat1 <- NA;
      PDrandHIbonferroni_pol1 <- NA; PDrandHIbonferroni_hyb1 <- NA; PDrandHIbonferroni_prey1 <- NA;
      PDrandHIbonferroni_dis1 <- NA; PDrandHIbonferroni_inb1 <- NA;

      allthreats_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing")) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_allthreats <- nrow(allthreats_spp)
      hab1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & habitat == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hab1 <- nrow(hab1_spp)
      hunt1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & hunting == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hunt1 <- nrow(hunt1_spp)
      clim1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & climate == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_clim1 <- nrow(clim1_spp)
      con1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & conflict == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_con1 <- nrow(con1_spp)
      nonnat1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & NonNatives == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_nonnat1 <- nrow(nonnat1_spp)
      pol1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & pollution == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_pol1 <- nrow(pol1_spp)
      hyb1_spp <-
        datXY %>%
        dplyr::filter(!(trend == "Decreasing" & hybrid == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hyb1 <- nrow(hyb1_spp)
      prey1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & prey == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_prey1 <- nrow(prey1_spp)
      dis1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & disease == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_dis1 <- nrow(dis1_spp)
      inb1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & inbreeding == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_inb1 <- nrow(inb1_spp)
      hab1_tmp <- NA; hunt1_tmp <- NA
      thab.01 <- NA; thab.02 <- NA; thab.03 <- NA; thab.04 <- NA; thab.05 <- NA; thab.06 <- NA
      thab.07 <- NA; thab.08 <- NA; thab.09 <- NA; thab.10 <- NA; thab.11 <- NA; thab.12 <- NA
      thunt.01 <- NA; thunt.02 <- NA; thunt.03 <- NA; thunt.04 <- NA; thunt.05 <- NA; thunt.06 <- NA
      thunt.07 <- NA; thunt.08 <- NA; thunt.09 <- NA; thunt.10 <- NA; thunt.11 <- NA; thunt.12 <- NA
    } else {
      #----- Standing diversity
      # Functional diversity
      # Trait dendrogram for all species at the grid point
      species.traits <- cbind.data.frame(species=datXY$species, DietInv=datXY$DietInv, DietVert=datXY$DietVert,
                                         DietFish=datXY$DietFish, DietScav=datXY$DietScav, DietFruit=datXY$DietFruit,
                                         DietNect=datXY$DietNect, DietSeed=datXY$DietSeed, DietHerb=datXY$DietHerb,
                                         BodyMass=datXY$BodyMass, Ground=datXY$Ground, Climbing=datXY$Climbing,
                                         Volant=datXY$Volant)
      colnames(species.traits)[1] <- "species"
      dimnames(species.traits) <- list("species"=as.character(species.traits[,1]), "traits"=dimnames(species.traits)[[2]])
      distances <- dist(species.traits[,-1], method=Distance.method)
      tree <- hclust(distances, method=Cluster.method)
      xtree <- Xtree(tree)
      i.prime <- ifelse(colSums(xtree$H1)>0, 1, 0)
      FD0 <- sum(i.prime * xtree$h2.prime)

      # Phylogenetic diversity
      phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(datXY)))
      names(phymat) <-
        datXY$species
      PD0 <- picante::pd(phymat, spptree, include.root=TRUE)$PD

      #----- Spp impacts: All threats
      # Removing species undergoing population decline
      allthreats_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing")) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_allthreats <- nrow(allthreats_spp)
      numspploss_allthreats <- SR0 - nrow(allthreats_spp)
      if(SR_allthreats == 0){PD_allthreats <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(allthreats_spp)))
        names(phymat) <- allthreats_spp$species
        PD_allthreats <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_allthreats < 2){
        FDrand_allthreats <- NA; FDrandLO_allthreats <- NA; FDrandHI_allthreats <- NA;
        FDrandLObonferroni_allthreats <- NA; FDrandHIbonferroni_allthreats <- NA;
        PDrand_allthreats <- NA; PDrandLO_allthreats <- NA; PDrandHI_allthreats <- NA;
        PDrandLObonferroni_allthreats <- NA; PDrandHIbonferroni_allthreats <- NA
      } else if(SR_allthreats == SR0){
        FDrand_allthreats <- 1; FDrandLO_allthreats <- 1; FDrandHI_allthreats <- 1;
        FDrandLObonferroni_allthreats <- 1; FDrandHIbonferroni_allthreats <- 1;
        PDrand_allthreats <- 1; PDrandLO_allthreats <- 1; PDrandHI_allthreats <- 1;
        PDrandLObonferroni_allthreats <- 1; PDrandHIbonferroni_allthreats <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_allthreats)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_allthreats <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_allthreats <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_allthreats <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_allthreats <- mean(PDr, na.rm=T)
        PDrandLO_allthreats <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_allthreats <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_allthreats <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_allthreats <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_allthreats <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_allthreats <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Habitat loss
      # Removing species where the threat is major (1) and causing population decline
      hab1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & habitat == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hab1 <- nrow(hab1_spp)
      numspploss_hab1 <- SR0 - nrow(hab1_spp)
      hab1_tmp <- dplyr::left_join(hab1_spp, dat1, by="species")
      hab1_tmp <- hab1_tmp[complete.cases(hab1_tmp),]
      thab.01 <- mean(hab1_tmp$DietInv)
      thab.02 <- mean(hab1_tmp$DietVert)
      thab.03 <- mean(hab1_tmp$DietFish)
      thab.04 <- mean(hab1_tmp$DietScav)
      thab.05 <- mean(hab1_tmp$DietFruit)
      thab.06 <- mean(hab1_tmp$DietNect)
      thab.07 <- mean(hab1_tmp$DietSeed)
      thab.08 <- mean(hab1_tmp$DietHerb)
      thab.09 <- mean(hab1_tmp$BodyMass)
      thab.10 <- mean(hab1_tmp$Ground)
      thab.11 <- mean(hab1_tmp$Climbing)
      thab.12 <- mean(hab1_tmp$Volant)
      if(SR_hab1 == 0){PD_hab1 <- 0}else{
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(hab1_spp)))
        names(phymat) <- hab1_spp$species
        PD_hab1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_hab1 < 2){
        FDrand_hab1 <- NA; FDrandLO_hab1 <- NA; FDrandHI_hab1 <- NA;
        FDrandLObonferroni_hab1 <- NA; FDrandHIbonferroni_hab1 <- NA;
        PDrand_hab1 <- NA; PDrandLO_hab1 <- NA; PDrandHI_hab1 <- NA;
        PDrandLObonferroni_hab1 <- NA; PDrandHIbonferroni_hab1 <- NA
      } else if(SR_hab1 == SR0){
        FDrand_hab1 <- 1; FDrandLO_hab1 <- 1; FDrandHI_hab1 <- 1;
        FDrandLObonferroni_hab1 <- 1; FDrandHIbonferroni_hab1 <- 1;
        PDrand_hab1 <- 1; PDrandLO_hab1 <- 1; PDrandHI_hab1 <- 1;
        PDrandLObonferroni_hab1 <- 1; PDrandHIbonferroni_hab1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_hab1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_hab1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_hab1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_hab1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_hab1 <- mean(PDr, na.rm=T)
        PDrandLO_hab1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_hab1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_hab1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_hab1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_hab1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_hab1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Hunting
      # Removing species where the threat is major (1) and causing population decline
      hunt1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & hunting == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hunt1 <- nrow(hunt1_spp)
      numspploss_hunt1 <- SR0 - nrow(hunt1_spp)
      hunt1_tmp <- dplyr::left_join(hunt1_spp, dat1, by="species")
      hunt1_tmp <- hunt1_tmp[complete.cases(hunt1_tmp),]
      thunt.01 <- mean(hunt1_tmp$DietInv)
      thunt.02 <- mean(hunt1_tmp$DietVert)
      thunt.03 <- mean(hunt1_tmp$DietFish)
      thunt.04 <- mean(hunt1_tmp$DietScav)
      thunt.05 <- mean(hunt1_tmp$DietFruit)
      thunt.06 <- mean(hunt1_tmp$DietNect)
      thunt.07 <- mean(hunt1_tmp$DietSeed)
      thunt.08 <- mean(hunt1_tmp$DietHerb)
      thunt.09 <- mean(hunt1_tmp$BodyMass)
      thunt.10 <- mean(hunt1_tmp$Ground)
      thunt.11 <- mean(hunt1_tmp$Climbing)
      thunt.12 <- mean(hunt1_tmp$Volant)
      if(SR_hunt1 == 0){PD_hunt1 <- 0}else{
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(hunt1_spp)))
        names(phymat) <- hunt1_spp$species
        PD_hunt1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_hunt1 < 2){
        FDrand_hunt1 <- NA; FDrandLO_hunt1 <- NA; FDrandHI_hunt1 <- NA;
        FDrandLObonferroni_hunt1 <- NA; FDrandHIbonferroni_hunt1 <- NA;
        PDrand_hunt1 <- NA; PDrandLO_hunt1 <- NA; PDrandHI_hunt1 <- NA;
        PDrandLObonferroni_hunt1 <- NA; PDrandHIbonferroni_hunt1 <- NA
      } else if(SR_hunt1 == SR0){
        FDrand_hunt1 <- 1; FDrandLO_hunt1 <- 1; FDrandHI_hunt1 <- 1;
        FDrandLObonferroni_hunt1 <- 1; FDrandHIbonferroni_hunt1 <- 1;
        PDrand_hunt1 <- 1; PDrandLO_hunt1 <- 1; PDrandHI_hunt1 <- 1;
        PDrandLObonferroni_hunt1 <- 1; PDrandHIbonferroni_hunt1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_hunt1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_hunt1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_hunt1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_hunt1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_hunt1 <- mean(PDr, na.rm=T)
        PDrandLO_hunt1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_hunt1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_hunt1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_hunt1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_hunt1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_hunt1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Climate change
      # Removing species where the threat is major (1) and causing population decline
      clim1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & climate == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_clim1 <- nrow(clim1_spp)
      numspploss_clim1 <- SR0 - nrow(clim1_spp)
      if(SR_clim1 == 0){PD_clim1 <- 0}else{
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(clim1_spp)))
        names(phymat) <- clim1_spp$species
        PD_clim1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_clim1 < 2){
        FDrand_clim1 <- NA; FDrandLO_clim1 <- NA; FDrandHI_clim1 <- NA;
        FDrandLObonferroni_clim1 <- NA; FDrandHIbonferroni_clim1 <- NA;
        PDrand_clim1 <- NA; PDrandLO_clim1 <- NA; PDrandHI_clim1 <- NA;
        PDrandLObonferroni_clim1 <- NA; PDrandHIbonferroni_clim1 <- NA
      } else if(SR_clim1 == SR0){
        FDrand_clim1 <- 1; FDrandLO_clim1 <- 1; FDrandHI_clim1 <- 1;
        FDrandLObonferroni_clim1 <- 1; FDrandHIbonferroni_clim1 <- 1;
        PDrand_clim1 <- 1; PDrandLO_clim1 <- 1; PDrandHI_clim1 <- 1;
        PDrandLObonferroni_clim1 <- 1; PDrandHIbonferroni_clim1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_clim1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_clim1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_clim1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_clim1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_clim1 <- mean(PDr, na.rm=T)
        PDrandLO_clim1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_clim1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_clim1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_clim1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_clim1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_clim1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Human-wildlife conflict
      # Removing species where the threat is major (1) and causing population decline
      con1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & conflict == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_con1 <- nrow(con1_spp)
      numspploss_con1 <- SR0 - nrow(con1_spp)
      if(SR_con1 == 0){PD_con1 <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(con1_spp)))
        names(phymat) <- con1_spp$species
        PD_con1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_con1 < 2){
        FDrand_con1 <- NA; FDrandLO_con1 <- NA; FDrandHI_con1 <- NA;
        FDrandLObonferroni_con1 <- NA; FDrandHIbonferroni_con1 <- NA;
        PDrand_con1 <- NA; PDrandLO_con1 <- NA; PDrandHI_con1 <- NA;
        PDrandLObonferroni_con1 <- NA; PDrandHIbonferroni_con1 <- NA
      } else if(SR_con1 == SR0){
        FDrand_con1 <- 1; FDrandLO_con1 <- 1; FDrandHI_con1 <- 1;
        FDrandLObonferroni_con1 <- 1; FDrandHIbonferroni_con1 <- 1;
        PDrand_con1 <- 1; PDrandLO_con1 <- 1; PDrandHI_con1 <- 1;
        PDrandLObonferroni_con1 <- 1; PDrandHIbonferroni_con1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_con1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_con1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_con1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_con1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_con1 <- mean(PDr, na.rm=T)
        PDrandLO_con1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_con1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_con1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_con1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_con1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_con1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Non-natives
      # Removing species where the threat is major (1) and causing population decline
      nonnat1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & NonNatives == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_nonnat1 <- nrow(nonnat1_spp)
      numspploss_nonnat1 <- SR0 - nrow(nonnat1_spp)
      if(SR_nonnat1 == 0){PD_nonnat1 <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(nonnat1_spp)))
        names(phymat) <- nonnat1_spp$species
        PD_nonnat1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_nonnat1 < 2){
        FDrand_nonnat1 <- NA; FDrandLO_nonnat1 <- NA; FDrandHI_nonnat1 <- NA;
        FDrandLObonferroni_nonnat1 <- NA; FDrandHIbonferroni_nonnat1 <- NA;
        PDrand_nonnat1 <- NA; PDrandLO_nonnat1 <- NA; PDrandHI_nonnat1 <- NA;
        PDrandLObonferroni_nonnat1 <- NA; PDrandHIbonferroni_nonnat1 <- NA
      } else if(SR_nonnat1 == SR0){
        FDrand_nonnat1 <- 1; FDrandLO_nonnat1 <- 1; FDrandHI_nonnat1 <- 1;
        FDrandLObonferroni_nonnat1 <- 1; FDrandHIbonferroni_nonnat1 <- 1;
        PDrand_nonnat1 <- 1; PDrandLO_nonnat1 <- 1; PDrandHI_nonnat1 <- 1;
        PDrandLObonferroni_nonnat1 <- 1; PDrandHIbonferroni_nonnat1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_nonnat1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_nonnat1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_nonnat1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_nonnat1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_nonnat1 <- mean(PDr, na.rm=T)
        PDrandLO_nonnat1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_nonnat1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_nonnat1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_nonnat1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_nonnat1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_nonnat1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Pollution
      # Removing species where the threat is major (1) and causing population decline
      pol1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & pollution == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_pol1 <- nrow(pol1_spp)
      numspploss_pol1 <- SR0 - nrow(pol1_spp)
      if(SR_pol1 == 0){PD_pol1 <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(pol1_spp)))
        names(phymat) <- pol1_spp$species
        PD_pol1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_pol1 < 2){
        FDrand_pol1 <- NA; FDrandLO_pol1 <- NA; FDrandHI_pol1 <- NA;
        FDrandLObonferroni_pol1 <- NA; FDrandHIbonferroni_pol1 <- NA;
        PDrand_pol1 <- NA; PDrandLO_pol1 <- NA; PDrandHI_pol1 <- NA;
        PDrandLObonferroni_pol1 <- NA; PDrandHIbonferroni_pol1 <- NA
      } else if(SR_pol1 == SR0){
        FDrand_pol1 <- 1; FDrandLO_pol1 <- 1; FDrandHI_pol1 <- 1;
        FDrandLObonferroni_pol1 <- 1; FDrandHIbonferroni_pol1 <- 1;
        PDrand_pol1 <- 1; PDrandLO_pol1 <- 1; PDrandHI_pol1 <- 1;
        PDrandLObonferroni_pol1 <- 1; PDrandHIbonferroni_pol1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_pol1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_pol1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_pol1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_pol1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_pol1 <- mean(PDr, na.rm=T)
        PDrandLO_pol1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_pol1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_pol1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_pol1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_pol1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_pol1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Hybridization
      # Removing species where the threat is major (1) and causing population decline
      hyb1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & hybrid == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_hyb1 <- nrow(hyb1_spp)
      numspploss_hyb1 <- SR0 - nrow(hyb1_spp)
      if(SR_hyb1 == 0){PD_hyb1 <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(hyb1_spp)))
        names(phymat) <- hyb1_spp$species
        PD_hyb1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_hyb1 < 2){
        FDrand_hyb1 <- NA; FDrandLO_hyb1 <- NA; FDrandHI_hyb1 <- NA;
        FDrandLObonferroni_hyb1 <- NA; FDrandHIbonferroni_hyb1 <- NA;
        PDrand_hyb1 <- NA; PDrandLO_hyb1 <- NA; PDrandHI_hyb1 <- NA;
        PDrandLObonferroni_hyb1 <- NA; PDrandHIbonferroni_hyb1 <- NA
      } else if(SR_hyb1 == SR0){
        FDrand_hyb1 <- 1; FDrandLO_hyb1 <- 1; FDrandHI_hyb1 <- 1;
        FDrandLObonferroni_hyb1 <- 1; FDrandHIbonferroni_hyb1 <- 1;
        PDrand_hyb1 <- 1; PDrandLO_hyb1 <- 1; PDrandHI_hyb1 <- 1;
        PDrandLObonferroni_hyb1 <- 1; PDrandHIbonferroni_hyb1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_hyb1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
        colnames(community.composition)[1] <- "community"
        FDrand <- Getlength(xtree, community.composition)
        FDrand <- FDrand[-1,]
        FDrand_hyb1 <- mean(FDrand$FD.new, na.rm=T)
        FDrandLO_hyb1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T)
        FDrandHI_hyb1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T)
        PDrand_hyb1 <- mean(PDr, na.rm=T)
        PDrandLO_hyb1 <- quantile(PDr, probs=c(0.025), na.rm=T)
        PDrandHI_hyb1 <- quantile(PDr, probs=c(0.975), na.rm=T)
        FDrandLObonferroni_hyb1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T)
        FDrandHIbonferroni_hyb1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T)
        PDrandLObonferroni_hyb1 <- quantile(PDr, probs=c(bc), na.rm=T)
        PDrandHIbonferroni_hyb1 <- quantile(PDr, probs=c(1-bc), na.rm=T)
      }
      rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm)

      #----- Spp impacts: Prey depletion
      # Removing species where the threat is major (1) and causing population decline
      prey1_spp <- datXY %>%
        dplyr::filter(!(trend == "Decreasing" & prey == 1)) %>%
        dplyr::select(species) %>%
        rbind(spp_exc_df) %>%
        distinct()
      SR_prey1 <- nrow(prey1_spp)
      numspploss_prey1 <- SR0 - nrow(prey1_spp)
      if(SR_prey1 == 0){PD_prey1 <- 0} else {
        phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(prey1_spp)))
        names(phymat) <- prey1_spp$species
        PD_prey1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD
      }
      if(SR_prey1 < 2){
        FDrand_prey1 <- NA; FDrandLO_prey1 <- NA; FDrandHI_prey1 <- NA;
        FDrandLObonferroni_prey1 <- NA; FDrandHIbonferroni_prey1 <- NA;
        PDrand_prey1 <- NA; PDrandLO_prey1 <- NA; PDrandHI_prey1 <- NA;
        PDrandLObonferroni_prey1 <- NA; PDrandHIbonferroni_prey1 <- NA
      } else if(SR_prey1 == SR0){
        FDrand_prey1 <- 1; FDrandLO_prey1 <- 1; FDrandHI_prey1 <- 1;
        FDrandLObonferroni_prey1 <- 1; FDrandHIbonferroni_prey1 <- 1;
        PDrand_prey1 <- 1; PDrandLO_prey1 <- 1; PDrandHI_prey1 <- 1;
        PDrandLObonferroni_prey1 <- 1; PDrandHIbonferroni_prey1 <- 1
      } else {
        rep_comm_1 <- rep(1, nrow(datXY))
        community.composition <- as.data.frame(cbind(rep_comm_1, all_spp))
        names(community.composition)[1] <- "comm_idx"
        PDr <- matrix(, nrow=numrands, 1)
        for(r in 1:numrands){
          sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_prey1)),]) # random sample of spp
          names(sppr) <- names(all_spp)
          rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr))
          names(rand_comm)[1] <- "comm_idx"
          community.composition <- rbind(community.composition, rand_comm)
          # Phyl div in the community given the same amount of spp loss, but random spp
          phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr)))
          names(phymat) <- sppr$species
          PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD
        }
colnames(community.composition)[1] <- "community" FDrand <- Getlength(xtree, community.composition) FDrand <- FDrand[-1,] FDrand_prey1 <- mean(FDrand$FD.new, na.rm=T) FDrandLO_prey1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T) FDrandHI_prey1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T) PDrand_prey1 <- mean(PDr, na.rm=T); PDrandLO_prey1 <- quantile(PDr, probs=c(0.025), na.rm=T) PDrandHI_prey1 <- quantile(PDr, probs=c(0.975), na.rm=T) FDrandLObonferroni_prey1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T) FDrandHIbonferroni_prey1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T) PDrandLObonferroni_prey1 <- quantile(PDr, probs=c(bc), na.rm=T) PDrandHIbonferroni_prey1 <- quantile(PDr, probs=c(1-bc), na.rm=T) } rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm) #----- Spp impacts: Disease # Removing species where the threat is major (1) and causing population decline dis1_spp <- datXY %>% dplyr::filter(!(trend == "Decreasing" & disease == 1)) %>% dplyr::select(species) %>% rbind(spp_exc_df) %>% distinct() SR_dis1 <- nrow(dis1_spp) numspploss_dis1 <- SR0 - nrow(dis1_spp) if(SR_dis1 == 0){PD_dis1 <- 0} else { phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(dis1_spp))) names(phymat) <- dis1_spp$species PD_dis1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD } if(SR_dis1 < 2){FDrand_dis1 <- NA; FDrandLO_dis1 <- NA; FDrandHI_dis1 <- NA; FDrandLObonferroni_dis1 <- NA; FDrandHIbonferroni_dis1 <- NA; PDrand_dis1 <- NA; PDrandLO_dis1 <- NA; PDrandHI_dis1 <- NA; PDrandLObonferroni_dis1 <- NA; PDrandHIbonferroni_dis1 <- NA} else if(SR_dis1 == SR0){FDrand_dis1 <- 1; FDrandLO_dis1 <- 1; FDrandHI_dis1 <- 1; FDrandLObonferroni_dis1 <- 1; FDrandHIbonferroni_dis1 <- 1; PDrand_dis1 <- 1; PDrandLO_dis1 <- 1; PDrandHI_dis1 <- 1; PDrandLObonferroni_dis1 <- 1; PDrandHIbonferroni_dis1 <- 1} else { rep_comm_1 <- rep(1, nrow(datXY)) community.composition <- as.data.frame(cbind(rep_comm_1, all_spp)) names(community.composition)[1] <- "comm_idx" 
PDr <- matrix(, nrow=numrands, 1) for(r in 1:numrands){ sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_dis1)),]) # random sample of spp names(sppr) <- names(all_spp) rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr)) names(rand_comm)[1] <- "comm_idx" community.composition <- rbind(community.composition, rand_comm) # Phyl div in the community given the same amount of spp loss, but random spp phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr))) names(phymat) <- sppr$species PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD } colnames(community.composition)[1] <- "community" FDrand <- Getlength(xtree, community.composition) FDrand <- FDrand[-1,] FDrand_dis1 <- mean(FDrand$FD.new, na.rm=T) FDrandLO_dis1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T) FDrandHI_dis1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T) PDrand_dis1 <- mean(PDr, na.rm=T); PDrandLO_dis1 <- quantile(PDr, probs=c(0.025), na.rm=T) PDrandHI_dis1 <- quantile(PDr, probs=c(0.975), na.rm=T) FDrandLObonferroni_dis1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T) FDrandHIbonferroni_dis1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T) PDrandLObonferroni_dis1 <- quantile(PDr, probs=c(bc), na.rm=T) PDrandHIbonferroni_dis1 <- quantile(PDr, probs=c(1-bc), na.rm=T) } rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm) #----- Spp impacts: Inbreeding # Removing species where the threat is major (1) and causing population decline inb1_spp <- datXY %>% dplyr::filter(!(trend == "Decreasing" & inbreeding == 1)) %>% dplyr::select(species) %>% rbind(spp_exc_df) %>% distinct() SR_inb1 <- nrow(inb1_spp) numspploss_inb1 <- SR0 - nrow(inb1_spp) if(SR_inb1 == 0){PD_inb1 <- 0} else { phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(inb1_spp))) names(phymat) <- inb1_spp$species PD_inb1 <- picante::pd(phymat, spptree, include.root=TRUE)$PD } if(SR_inb1 < 2){FDrand_inb1 <- NA; FDrandLO_inb1 <- NA; FDrandHI_inb1 <- NA; FDrandLObonferroni_inb1 <- NA; 
FDrandHIbonferroni_inb1 <- NA; PDrand_inb1 <- NA; PDrandLO_inb1 <- NA; PDrandHI_inb1 <- NA; PDrandLObonferroni_inb1 <- NA; PDrandHIbonferroni_inb1 <- NA} else if(SR_inb1 == SR0){FDrand_inb1 <- 1; FDrandLO_inb1 <- 1; FDrandHI_inb1 <- 1; FDrandLObonferroni_inb1 <- 1; FDrandHIbonferroni_inb1 <- 1; PDrand_inb1 <- 1; PDrandLO_inb1 <- 1; PDrandHI_inb1 <- 1; PDrandLObonferroni_inb1 <- 1; PDrandHIbonferroni_inb1 <- 1} else { rep_comm_1 <- rep(1, nrow(datXY)) community.composition <- as.data.frame(cbind(rep_comm_1, all_spp)) names(community.composition)[1] <- "comm_idx" PDr <- matrix(, nrow=numrands, 1) for(r in 1:numrands){ sppr <- data.frame(all_spp[sample(nrow(all_spp), (SR0-numspploss_inb1)),]) # random sample of spp names(sppr) <- names(all_spp) rand_comm <- as.data.frame(cbind(rep(r, nrow(sppr)), sppr)) names(rand_comm)[1] <- "comm_idx" community.composition <- rbind(community.composition, rand_comm) # Phyl div in the community given the same amount of spp loss, but random spp phymat <- data.frame(matrix(1, nrow=1, ncol=nrow(sppr))) names(phymat) <- sppr$species PDr[r,1] <- picante::pd(phymat, spptree, include.root=TRUE)$PD } colnames(community.composition)[1] <- "community" FDrand <- Getlength(xtree, community.composition) FDrand <- FDrand[-1,] FDrand_inb1 <- mean(FDrand$FD.new, na.rm=T) FDrandLO_inb1 <- quantile(FDrand$FD.new, probs=c(0.025), na.rm=T) FDrandHI_inb1 <- quantile(FDrand$FD.new, probs=c(0.975), na.rm=T) PDrand_inb1 <- mean(PDr, na.rm=T); PDrandLO_inb1 <- quantile(PDr, probs=c(0.025), na.rm=T) PDrandHI_inb1 <- quantile(PDr, probs=c(0.975), na.rm=T) FDrandLObonferroni_inb1 <- quantile(FDrand$FD.new, probs=c(bc), na.rm=T) FDrandHIbonferroni_inb1 <- quantile(FDrand$FD.new, probs=c(1-bc), na.rm=T) PDrandLObonferroni_inb1 <- quantile(PDr, probs=c(bc), na.rm=T) PDrandHIbonferroni_inb1 <- quantile(PDr, probs=c(1-bc), na.rm=T) } rm(phymat, FDrand, rep_comm_1, PDr, community.composition, sppr, rand_comm) #----- Any communities completely gone (0 spp)? 
FD_new_0s <- c(nrow(allthreats_spp), nrow(hab1_spp), nrow(hunt1_spp), nrow(con1_spp), nrow(clim1_spp),
               nrow(nonnat1_spp), nrow(pol1_spp), nrow(hyb1_spp), nrow(prey1_spp), nrow(dis1_spp),
               nrow(inb1_spp))
FD_new_0s_vec <- which(FD_new_0s == 0)

#----- New functional diversity
# Generate community composition matrix for each scenario
comm_comp_spp <- rbind(all_spp, allthreats_spp, hab1_spp, hunt1_spp, clim1_spp, con1_spp,
                       nonnat1_spp, pol1_spp, hyb1_spp, prey1_spp, dis1_spp, inb1_spp)
comm_comp_idx <- c(rep(1, nrow(datXY)), rep(2, nrow(allthreats_spp)), rep(3, nrow(hab1_spp)),
                   rep(4, nrow(hunt1_spp)), rep(5, nrow(clim1_spp)), rep(6, nrow(con1_spp)),
                   rep(7, nrow(nonnat1_spp)), rep(8, nrow(pol1_spp)), rep(9, nrow(hyb1_spp)),
                   rep(10, nrow(prey1_spp)), rep(11, nrow(dis1_spp)), rep(12, nrow(inb1_spp)))
community.composition <- as.data.frame(cbind(comm_comp_idx, comm_comp_spp))
colnames(community.composition)[1] <- "community"
FD <- Getlength(xtree, community.composition)
FD_c <- FD[2]
FD_r_tmp1 <- t(FD_c)
FD_r_tmp2 <- FD_r_tmp1[-1]
if(any(FD_new_0s == 0)){
  ins_vals <- rep(0, length(FD_new_0s_vec))
  FD_r <- R.utils::insert(FD_r_tmp2, ats = FD_new_0s_vec, values = ins_vals)
} else {
  FD_r <- FD_r_tmp2
}
} # end 'Current diversity' if loop

# Assemble the results vector for iteration i
res_vec <- c(SR0, SR_allthreats, SR_hab1, SR_hunt1, SR_clim1, SR_con1, SR_nonnat1,
             SR_pol1, SR_hyb1, SR_prey1, SR_dis1, SR_inb1,
             FD0, FD_r, # length 11
             FDrand_allthreats, FDrand_hab1, FDrand_hunt1, FDrand_clim1, FDrand_con1,
             FDrand_nonnat1, FDrand_pol1, FDrand_hyb1, FDrand_prey1, FDrand_dis1, FDrand_inb1,
             FDrandLO_allthreats, FDrandLO_hab1, FDrandLO_hunt1, FDrandLO_clim1, FDrandLO_con1,
             FDrandLO_nonnat1, FDrandLO_pol1, FDrandLO_hyb1, FDrandLO_prey1, FDrandLO_dis1, FDrandLO_inb1,
             FDrandHI_allthreats, FDrandHI_hab1, FDrandHI_hunt1, FDrandHI_clim1, FDrandHI_con1,
             FDrandHI_nonnat1, FDrandHI_pol1, FDrandHI_hyb1, FDrandHI_prey1, FDrandHI_dis1, FDrandHI_inb1,
             FDrandLObonferroni_allthreats, FDrandLObonferroni_hab1, FDrandLObonferroni_hunt1,
             FDrandLObonferroni_clim1, FDrandLObonferroni_con1, FDrandLObonferroni_nonnat1,
             FDrandLObonferroni_pol1, FDrandLObonferroni_hyb1, FDrandLObonferroni_prey1,
             FDrandLObonferroni_dis1, FDrandLObonferroni_inb1,
             FDrandHIbonferroni_allthreats, FDrandHIbonferroni_hab1, FDrandHIbonferroni_hunt1,
             FDrandHIbonferroni_clim1, FDrandHIbonferroni_con1, FDrandHIbonferroni_nonnat1,
             FDrandHIbonferroni_pol1, FDrandHIbonferroni_hyb1, FDrandHIbonferroni_prey1,
             FDrandHIbonferroni_dis1, FDrandHIbonferroni_inb1,
             PD0, PD_allthreats/PD0, PD_hab1/PD0, PD_hunt1/PD0, PD_clim1/PD0, PD_con1/PD0,
             PD_nonnat1/PD0, PD_pol1/PD0, PD_hyb1/PD0, PD_prey1/PD0, PD_dis1/PD0, PD_inb1/PD0,
             PDrand_allthreats/PD0, PDrand_hab1/PD0, PDrand_hunt1/PD0, PDrand_clim1/PD0,
             PDrand_con1/PD0, PDrand_nonnat1/PD0, PDrand_pol1/PD0, PDrand_hyb1/PD0,
             PDrand_prey1/PD0, PDrand_dis1/PD0, PDrand_inb1/PD0,
             PDrandLO_allthreats/PD0, PDrandLO_hab1/PD0, PDrandLO_hunt1/PD0, PDrandLO_clim1/PD0,
             PDrandLO_con1/PD0, PDrandLO_nonnat1/PD0, PDrandLO_pol1/PD0, PDrandLO_hyb1/PD0,
             PDrandLO_prey1/PD0, PDrandLO_dis1/PD0, PDrandLO_inb1/PD0,
             PDrandHI_allthreats/PD0, PDrandHI_hab1/PD0, PDrandHI_hunt1/PD0, PDrandHI_clim1/PD0,
             PDrandHI_con1/PD0, PDrandHI_nonnat1/PD0, PDrandHI_pol1/PD0, PDrandHI_hyb1/PD0,
             PDrandHI_prey1/PD0, PDrandHI_dis1/PD0, PDrandHI_inb1/PD0,
             PDrandLObonferroni_allthreats/PD0, PDrandLObonferroni_hab1/PD0, PDrandLObonferroni_hunt1/PD0,
             PDrandLObonferroni_clim1/PD0, PDrandLObonferroni_con1/PD0, PDrandLObonferroni_nonnat1/PD0,
             PDrandLObonferroni_pol1/PD0, PDrandLObonferroni_hyb1/PD0, PDrandLObonferroni_prey1/PD0,
             PDrandLObonferroni_dis1/PD0, PDrandLObonferroni_inb1/PD0,
             PDrandHIbonferroni_allthreats/PD0, PDrandHIbonferroni_hab1/PD0, PDrandHIbonferroni_hunt1/PD0,
             PDrandHIbonferroni_clim1/PD0, PDrandHIbonferroni_con1/PD0, PDrandHIbonferroni_nonnat1/PD0,
             PDrandHIbonferroni_pol1/PD0, PDrandHIbonferroni_hyb1/PD0, PDrandHIbonferroni_prey1/PD0,
             PDrandHIbonferroni_dis1/PD0, PDrandHIbonferroni_inb1/PD0,
             thab.01-t0.01, thab.02-t0.02, thab.03-t0.03, thab.04-t0.04, thab.05-t0.05, thab.06-t0.06,
             thab.07-t0.07, thab.08-t0.08, thab.09-t0.09, thab.10-t0.10, thab.11-t0.11, thab.12-t0.12,
             thunt.01-t0.01, thunt.02-t0.02, thunt.03-t0.03, thunt.04-t0.04, thunt.05-t0.05, thunt.06-t0.06,
             thunt.07-t0.07, thunt.08-t0.08, thunt.09-t0.09, thunt.10-t0.10, thunt.11-t0.11, thunt.12-t0.12)
if(res_vec[1] == 0){res_vec[] <- 0}
out_mat[i,] <- res_vec

rm("datXY", "all_spp", "country_name", "keep_exc", "spp_exc", "spp_exc_df", "spp_list", "SR0",
   "allthreats_spp", "hab1_spp", "hunt1_spp", "clim1_spp", "con1_spp", "nonnat1_spp", "pol1_spp",
   "prey1_spp", "hyb1_spp", "dis1_spp", "inb1_spp",
   "numspploss_allthreats", "numspploss_hab1", "numspploss_hunt1", "numspploss_clim1",
   "numspploss_con1", "numspploss_nonnat1", "numspploss_pol1", "numspploss_prey1",
   "numspploss_hyb1", "numspploss_dis1", "numspploss_inb1",
   "SR_allthreats", "SR_hab1", "SR_hunt1", "SR_clim1", "SR_con1", "SR_nonnat1", "SR_pol1",
   "SR_hyb1", "SR_prey1", "SR_dis1", "SR_inb1",
   "i.prime", "tree", "xtree", "distances", "comm_comp_idx", "comm_comp_spp",
   "community.composition", "species.traits",
   "FD0", "FD_r", "FD_r_tmp1", "FD_r_tmp2", "FD_c", "FD", "FD_new_0s", "FD_new_0s_vec",
   "FDrand_allthreats", "FDrand_hab1", "FDrand_hunt1", "FDrand_clim1", "FDrand_con1",
   "FDrand_nonnat1", "FDrand_pol1", "FDrand_hyb1", "FDrand_prey1", "FDrand_dis1", "FDrand_inb1",
   "FDrandLO_allthreats", "FDrandLO_hab1", "FDrandLO_hunt1", "FDrandLO_clim1", "FDrandLO_con1",
   "FDrandLO_nonnat1", "FDrandLO_pol1", "FDrandLO_hyb1", "FDrandLO_prey1", "FDrandLO_dis1",
   "FDrandLO_inb1",
   "FDrandHI_allthreats", "FDrandHI_hab1", "FDrandHI_hunt1", "FDrandHI_clim1", "FDrandHI_con1",
   "FDrandHI_nonnat1", "FDrandHI_pol1", "FDrandHI_hyb1", "FDrandHI_prey1", "FDrandHI_dis1",
   "FDrandHI_inb1",
   "FDrandLObonferroni_allthreats", "FDrandLObonferroni_hab1", "FDrandLObonferroni_hunt1",
   "FDrandLObonferroni_clim1", "FDrandLObonferroni_con1", "FDrandLObonferroni_nonnat1",
   "FDrandLObonferroni_pol1", "FDrandLObonferroni_hyb1", "FDrandLObonferroni_prey1",
   "FDrandLObonferroni_dis1", "FDrandLObonferroni_inb1",
   "FDrandHIbonferroni_allthreats", "FDrandHIbonferroni_hab1", "FDrandHIbonferroni_hunt1",
   "FDrandHIbonferroni_clim1", "FDrandHIbonferroni_con1", "FDrandHIbonferroni_nonnat1",
   "FDrandHIbonferroni_pol1", "FDrandHIbonferroni_hyb1", "FDrandHIbonferroni_prey1",
   "FDrandHIbonferroni_dis1", "FDrandHIbonferroni_inb1",
   "PD0", "PD_allthreats", "PD_hab1", "PD_hunt1", "PD_clim1", "PD_con1", "PD_nonnat1",
   "PD_pol1", "PD_hyb1", "PD_prey1", "PD_dis1", "PD_inb1",
   "PDrand_allthreats", "PDrand_hab1", "PDrand_hunt1", "PDrand_clim1", "PDrand_con1",
   "PDrand_nonnat1", "PDrand_pol1", "PDrand_hyb1", "PDrand_prey1", "PDrand_dis1", "PDrand_inb1",
   "PDrandLO_allthreats", "PDrandLO_hab1", "PDrandLO_hunt1", "PDrandLO_clim1", "PDrandLO_con1",
   "PDrandLO_nonnat1", "PDrandLO_pol1", "PDrandLO_hyb1", "PDrandLO_prey1", "PDrandLO_dis1",
   "PDrandLO_inb1",
   "PDrandHI_allthreats", "PDrandHI_hab1", "PDrandHI_hunt1", "PDrandHI_clim1", "PDrandHI_con1",
   "PDrandHI_nonnat1", "PDrandHI_pol1", "PDrandHI_hyb1", "PDrandHI_prey1", "PDrandHI_dis1",
   "PDrandHI_inb1",
   "PDrandLObonferroni_allthreats", "PDrandLObonferroni_hab1", "PDrandLObonferroni_hunt1",
   "PDrandLObonferroni_clim1", "PDrandLObonferroni_con1", "PDrandLObonferroni_nonnat1",
   "PDrandLObonferroni_pol1", "PDrandLObonferroni_hyb1", "PDrandLObonferroni_prey1",
   "PDrandLObonferroni_dis1", "PDrandLObonferroni_inb1",
   "PDrandHIbonferroni_allthreats", "PDrandHIbonferroni_hab1", "PDrandHIbonferroni_hunt1",
   "PDrandHIbonferroni_clim1", "PDrandHIbonferroni_con1", "PDrandHIbonferroni_nonnat1",
   "PDrandHIbonferroni_pol1", "PDrandHIbonferroni_hyb1", "PDrandHIbonferroni_prey1",
   "PDrandHIbonferroni_dis1", "PDrandHIbonferroni_inb1",
   "t0.01", "t0.02", "t0.03", "t0.04", "t0.05", "t0.06", "t0.07", "t0.08", "t0.09", "t0.10",
   "t0.11", "t0.12",
   "thab.01", "thab.02", "thab.03", "thab.04", "thab.05", "thab.06", "thab.07", "thab.08",
   "thab.09", "thab.10", "thab.11", "thab.12",
   "thunt.01", "thunt.02", "thunt.03", "thunt.04", "thunt.05", "thunt.06", "thunt.07", "thunt.08",
   "thunt.09", "thunt.10", "thunt.11", "thunt.12",
   "hab1_tmp", "hunt1_tmp", "res_vec")
}, error=function(e){}) # end of the tryCatch function
} # end i loop

#----------- ASSEMBLE AND SAVE THE RESULTS MATRIX -------------------------------
out_df <- as.data.frame(out_mat)
res_df <- cbind.data.frame(ctrs_area, out_df)
names(res_df) <- c("Country", "Area_sq_km",
  "SR0", "SR.allthreats", "SR.habitat1", "SR.hunting1", "SR.climate1", "SR.conflict1",
  "SR.nonNatives1", "SR.pollution1", "SR.hybrid1", "SR.prey1", "SR.disease1", "SR.inbreeding1",
  "FD0", "FD.allthreats", "FD.habitat1", "FD.hunting1", "FD.climate1", "FD.conflict1",
  "FD.nonNatives1", "FD.pollution1", "FD.hybrid1", "FD.prey1", "FD.disease1", "FD.inbreeding1",
  "FDrand.allthreats", "FDrand.habitat1", "FDrand.hunting1", "FDrand.climate1", "FDrand.conflict1",
  "FDrand.nonNatives1", "FDrand.pollution1", "FDrand.hybrid1", "FDrand.prey1", "FDrand.disease1",
  "FDrand.inbreeding1",
  "FDrandLO.allthreats", "FDrandLO.habitat1", "FDrandLO.hunting1", "FDrandLO.climate1",
  "FDrandLO.conflict1", "FDrandLO.nonNatives1", "FDrandLO.pollution1", "FDrandLO.hybrid1",
  "FDrandLO.prey1", "FDrandLO.disease1", "FDrandLO.inbreeding1",
  "FDrandHI.allthreats", "FDrandHI.habitat1", "FDrandHI.hunting1", "FDrandHI.climate1",
  "FDrandHI.conflict1", "FDrandHI.nonNatives1", "FDrandHI.pollution1", "FDrandHI.hybrid1",
  "FDrandHI.prey1", "FDrandHI.disease1", "FDrandHI.inbreeding1",
  "FDrandLObonferroni.allthreats", "FDrandLObonferroni.habitat1", "FDrandLObonferroni.hunting1",
  "FDrandLObonferroni.climate1", "FDrandLObonferroni.conflict1", "FDrandLObonferroni.nonNatives1",
  "FDrandLObonferroni.pollution1", "FDrandLObonferroni.hybrid1", "FDrandLObonferroni.prey1",
  "FDrandLObonferroni.disease1", "FDrandLObonferroni.inbreeding1",
  "FDrandHIbonferroni.allthreats", "FDrandHIbonferroni.habitat1",
  "FDrandHIbonferroni.hunting1", "FDrandHIbonferroni.climate1", "FDrandHIbonferroni.conflict1",
  "FDrandHIbonferroni.nonNatives1", "FDrandHIbonferroni.pollution1", "FDrandHIbonferroni.hybrid1",
  "FDrandHIbonferroni.prey1", "FDrandHIbonferroni.disease1", "FDrandHIbonferroni.inbreeding1",
  "PD0", "PD.allthreats", "PD.habitat1", "PD.hunting1", "PD.climate1", "PD.conflict1",
  "PD.nonNatives1", "PD.pollution1", "PD.hybrid1", "PD.prey1", "PD.disease1", "PD.inbreeding1",
  "PDrand.allthreats", "PDrand.habitat1", "PDrand.hunting1", "PDrand.climate1", "PDrand.conflict1",
  "PDrand.nonNatives1", "PDrand.pollution1", "PDrand.hybrid1", "PDrand.prey1", "PDrand.disease1",
  "PDrand.inbreeding1",
  "PDrandLO.allthreats", "PDrandLO.habitat1", "PDrandLO.hunting1", "PDrandLO.climate1",
  "PDrandLO.conflict1", "PDrandLO.nonNatives1", "PDrandLO.pollution1", "PDrandLO.hybrid1",
  "PDrandLO.prey1", "PDrandLO.disease1", "PDrandLO.inbreeding1",
  "PDrandHI.allthreats", "PDrandHI.habitat1", "PDrandHI.hunting1", "PDrandHI.climate1",
  "PDrandHI.conflict1", "PDrandHI.nonNatives1", "PDrandHI.pollution1", "PDrandHI.hybrid1",
  "PDrandHI.prey1", "PDrandHI.disease1", "PDrandHI.inbreeding1",
  "PDrandLObonferroni.allthreats", "PDrandLObonferroni.habitat1", "PDrandLObonferroni.hunting1",
  "PDrandLObonferroni.climate1", "PDrandLObonferroni.conflict1", "PDrandLObonferroni.nonNatives1",
  "PDrandLObonferroni.pollution1", "PDrandLObonferroni.hybrid1", "PDrandLObonferroni.prey1",
  "PDrandLObonferroni.disease1", "PDrandLObonferroni.inbreeding1",
  "PDrandHIbonferroni.allthreats", "PDrandHIbonferroni.habitat1", "PDrandHIbonferroni.hunting1",
  "PDrandHIbonferroni.climate1", "PDrandHIbonferroni.conflict1", "PDrandHIbonferroni.nonNatives1",
  "PDrandHIbonferroni.pollution1", "PDrandHIbonferroni.hybrid1", "PDrandHIbonferroni.prey1",
  "PDrandHIbonferroni.disease1", "PDrandHIbonferroni.inbreeding1",
  "TraitChange.hab1.DietInv", "TraitChange.hab1.DietVert", "TraitChange.hab1.DietFish",
  "TraitChange.hab1.DietScav", "TraitChange.hab1.DietFruit",
  "TraitChange.hab1.DietNect", "TraitChange.hab1.DietSeed", "TraitChange.hab1.DietHerb",
  "TraitChange.hab1.BodyMass", "TraitChange.hab1.Ground", "TraitChange.hab1.Climbing",
  "TraitChange.hab1.Volant",
  "TraitChange.hunt1.DietInv", "TraitChange.hunt1.DietVert", "TraitChange.hunt1.DietFish",
  "TraitChange.hunt1.DietScav", "TraitChange.hunt1.DietFruit", "TraitChange.hunt1.DietNect",
  "TraitChange.hunt1.DietSeed", "TraitChange.hunt1.DietHerb", "TraitChange.hunt1.BodyMass",
  "TraitChange.hunt1.Ground", "TraitChange.hunt1.Climbing", "TraitChange.hunt1.Volant")

# Add columns for 'bias' in FD and PD loss
# Value explanations:
#  1: loss is significantly biased towards functionally (or phylogenetically) unique species
#  0: loss is unbiased
# -1: loss is significantly biased towards functionally (or phylogenetically) redundant species
dat0 <- res_df

#--- Functional diversity
# FD: all threats
dat0$tmp <- ifelse(dat0$FD.allthreats <= dat0$FDrandLO.allthreats, 1, 0)
dat0$tmp <- ifelse(dat0$FD.allthreats >= dat0$FDrandHI.allthreats, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.allthreats, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.allthreats <- as.factor(dat0$tmp)
# FD: habitat 1
dat0$tmp <- ifelse(dat0$FD.habitat1 <= dat0$FDrandLO.habitat1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.habitat1 >= dat0$FDrandHI.habitat1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.habitat1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.habitat1 <- as.factor(dat0$tmp)
# FD: hunting 1
dat0$tmp <- ifelse(dat0$FD.hunting1 <= dat0$FDrandLO.hunting1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.hunting1 >= dat0$FDrandHI.hunting1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hunting1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.hunting1 <- as.factor(dat0$tmp)
# FD: climate 1
dat0$tmp <- ifelse(dat0$FD.climate1 <= dat0$FDrandLO.climate1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.climate1 >= dat0$FDrandHI.climate1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.climate1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.climate1 <- as.factor(dat0$tmp)
# FD: conflict 1
dat0$tmp <- ifelse(dat0$FD.conflict1 <= dat0$FDrandLO.conflict1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.conflict1 >= dat0$FDrandHI.conflict1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.conflict1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.conflict1 <- as.factor(dat0$tmp)
# FD: non-natives 1
dat0$tmp <- ifelse(dat0$FD.nonNatives1 <= dat0$FDrandLO.nonNatives1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.nonNatives1 >= dat0$FDrandHI.nonNatives1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.nonNatives1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.nonNatives1 <- as.factor(dat0$tmp)
# FD: pollution 1
dat0$tmp <- ifelse(dat0$FD.pollution1 <= dat0$FDrandLO.pollution1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.pollution1 >= dat0$FDrandHI.pollution1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.pollution1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.pollution1 <- as.factor(dat0$tmp)
# FD: hybridization 1
dat0$tmp <- ifelse(dat0$FD.hybrid1 <= dat0$FDrandLO.hybrid1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.hybrid1 >= dat0$FDrandHI.hybrid1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hybrid1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.hybrid1 <- as.factor(dat0$tmp)
# FD: prey depletion 1
dat0$tmp <- ifelse(dat0$FD.prey1 <= dat0$FDrandLO.prey1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.prey1 >= dat0$FDrandHI.prey1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.prey1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.prey1 <- as.factor(dat0$tmp)
# FD: disease 1
dat0$tmp <- ifelse(dat0$FD.disease1 <= dat0$FDrandLO.disease1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.disease1 >= dat0$FDrandHI.disease1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.disease1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.disease1 <- as.factor(dat0$tmp)
# FD: inbreeding 1
dat0$tmp <- ifelse(dat0$FD.inbreeding1 <= dat0$FDrandLO.inbreeding1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.inbreeding1 >= dat0$FDrandHI.inbreeding1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.inbreeding1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbias.inbreeding1 <- as.factor(dat0$tmp)

#--- Phylogenetic diversity
# PD: all threats
dat0$tmp <- ifelse(dat0$PD.allthreats <= dat0$PDrandLO.allthreats, 1, 0)
dat0$tmp <- ifelse(dat0$PD.allthreats >= dat0$PDrandHI.allthreats, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.allthreats, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.allthreats <- as.factor(dat0$tmp)
# PD: habitat 1
dat0$tmp <- ifelse(dat0$PD.habitat1 <= dat0$PDrandLO.habitat1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.habitat1 >= dat0$PDrandHI.habitat1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.habitat1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.habitat1 <- as.factor(dat0$tmp)
# PD: hunting 1
dat0$tmp <- ifelse(dat0$PD.hunting1 <= dat0$PDrandLO.hunting1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.hunting1 >= dat0$PDrandHI.hunting1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hunting1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.hunting1 <- as.factor(dat0$tmp)
# PD: climate 1
dat0$tmp <- ifelse(dat0$PD.climate1 <= dat0$PDrandLO.climate1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.climate1 >= dat0$PDrandHI.climate1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.climate1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.climate1 <- as.factor(dat0$tmp)
# PD: conflict 1
dat0$tmp <- ifelse(dat0$PD.conflict1 <= dat0$PDrandLO.conflict1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.conflict1 >= dat0$PDrandHI.conflict1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.conflict1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.conflict1 <- as.factor(dat0$tmp)
# PD: non-natives 1
dat0$tmp <- ifelse(dat0$PD.nonNatives1 <= dat0$PDrandLO.nonNatives1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.nonNatives1 >= dat0$PDrandHI.nonNatives1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.nonNatives1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.nonNatives1 <- as.factor(dat0$tmp)
# PD: pollution 1
dat0$tmp <- ifelse(dat0$PD.pollution1 <= dat0$PDrandLO.pollution1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.pollution1 >= dat0$PDrandHI.pollution1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.pollution1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.pollution1 <- as.factor(dat0$tmp)
# PD: hybridization 1
dat0$tmp <- ifelse(dat0$PD.hybrid1 <= dat0$PDrandLO.hybrid1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.hybrid1 >= dat0$PDrandHI.hybrid1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hybrid1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.hybrid1 <- as.factor(dat0$tmp)
# PD: prey depletion 1
dat0$tmp <- ifelse(dat0$PD.prey1 <= dat0$PDrandLO.prey1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.prey1 >= dat0$PDrandHI.prey1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.prey1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.prey1 <- as.factor(dat0$tmp)
# PD: disease 1
dat0$tmp <- ifelse(dat0$PD.disease1 <= dat0$PDrandLO.disease1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.disease1 >= dat0$PDrandHI.disease1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.disease1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.disease1 <- as.factor(dat0$tmp)
# PD: inbreeding 1
dat0$tmp <- ifelse(dat0$PD.inbreeding1 <= dat0$PDrandLO.inbreeding1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.inbreeding1 >= dat0$PDrandHI.inbreeding1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.inbreeding1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbias.inbreeding1 <- as.factor(dat0$tmp)

#--- Functional diversity --- with Bonferroni correction
# FD: all threats
dat0$tmp <- ifelse(dat0$FD.allthreats <= dat0$FDrandLObonferroni.allthreats, 1, 0)
dat0$tmp <- ifelse(dat0$FD.allthreats >= dat0$FDrandHIbonferroni.allthreats, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.allthreats, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.allthreats <- as.factor(dat0$tmp)
# FD: habitat 1
dat0$tmp <- ifelse(dat0$FD.habitat1 <= dat0$FDrandLObonferroni.habitat1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.habitat1 >= dat0$FDrandHIbonferroni.habitat1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.habitat1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.habitat1 <- as.factor(dat0$tmp)
# FD: hunting 1
dat0$tmp <- ifelse(dat0$FD.hunting1 <= dat0$FDrandLObonferroni.hunting1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.hunting1 >= dat0$FDrandHIbonferroni.hunting1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hunting1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.hunting1 <- as.factor(dat0$tmp)
# FD: climate 1
dat0$tmp <- ifelse(dat0$FD.climate1 <= dat0$FDrandLObonferroni.climate1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.climate1 >= dat0$FDrandHIbonferroni.climate1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.climate1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.climate1 <- as.factor(dat0$tmp)
# FD: conflict 1
dat0$tmp <- ifelse(dat0$FD.conflict1 <= dat0$FDrandLObonferroni.conflict1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.conflict1 >= dat0$FDrandHIbonferroni.conflict1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.conflict1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.conflict1 <- as.factor(dat0$tmp)
# FD: non-natives 1
dat0$tmp <- ifelse(dat0$FD.nonNatives1 <= dat0$FDrandLObonferroni.nonNatives1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.nonNatives1 >= dat0$FDrandHIbonferroni.nonNatives1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.nonNatives1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.nonNatives1 <- as.factor(dat0$tmp)
# FD: pollution 1
dat0$tmp <- ifelse(dat0$FD.pollution1 <= dat0$FDrandLObonferroni.pollution1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.pollution1 >= dat0$FDrandHIbonferroni.pollution1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.pollution1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.pollution1 <- as.factor(dat0$tmp)
# FD: hybridization 1
dat0$tmp <- ifelse(dat0$FD.hybrid1 <= dat0$FDrandLObonferroni.hybrid1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.hybrid1 >= dat0$FDrandHIbonferroni.hybrid1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hybrid1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.hybrid1 <- as.factor(dat0$tmp)
# FD: prey depletion 1
dat0$tmp <- ifelse(dat0$FD.prey1 <= dat0$FDrandLObonferroni.prey1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.prey1 >= dat0$FDrandHIbonferroni.prey1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.prey1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.prey1 <- as.factor(dat0$tmp)
# FD: disease 1
dat0$tmp <- ifelse(dat0$FD.disease1 <= dat0$FDrandLObonferroni.disease1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.disease1 >= dat0$FDrandHIbonferroni.disease1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.disease1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.disease1 <- as.factor(dat0$tmp)
# FD: inbreeding 1
dat0$tmp <- ifelse(dat0$FD.inbreeding1 <= dat0$FDrandLObonferroni.inbreeding1, 1, 0)
dat0$tmp <- ifelse(dat0$FD.inbreeding1 >= dat0$FDrandHIbonferroni.inbreeding1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.inbreeding1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$FDbiasbonferroni.inbreeding1 <- as.factor(dat0$tmp)

#--- Phylogenetic diversity --- with Bonferroni correction
# PD: all threats
dat0$tmp <- ifelse(dat0$PD.allthreats <= dat0$PDrandLObonferroni.allthreats, 1, 0)
dat0$tmp <- ifelse(dat0$PD.allthreats >= dat0$PDrandHIbonferroni.allthreats, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.allthreats, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.allthreats <- as.factor(dat0$tmp)
# PD: habitat 1
dat0$tmp <- ifelse(dat0$PD.habitat1 <= dat0$PDrandLObonferroni.habitat1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.habitat1 >= dat0$PDrandHIbonferroni.habitat1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.habitat1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.habitat1 <- as.factor(dat0$tmp)
# PD: hunting 1
dat0$tmp <- ifelse(dat0$PD.hunting1 <= dat0$PDrandLObonferroni.hunting1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.hunting1 >= dat0$PDrandHIbonferroni.hunting1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hunting1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.hunting1 <- as.factor(dat0$tmp)
# PD: climate 1
dat0$tmp <- ifelse(dat0$PD.climate1 <= dat0$PDrandLObonferroni.climate1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.climate1 >= dat0$PDrandHIbonferroni.climate1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.climate1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.climate1 <- as.factor(dat0$tmp)
# PD: conflict 1
dat0$tmp <- ifelse(dat0$PD.conflict1 <= dat0$PDrandLObonferroni.conflict1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.conflict1 >= dat0$PDrandHIbonferroni.conflict1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.conflict1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.conflict1 <- as.factor(dat0$tmp)
# PD: non-natives 1
dat0$tmp <- ifelse(dat0$PD.nonNatives1 <= dat0$PDrandLObonferroni.nonNatives1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.nonNatives1 >= dat0$PDrandHIbonferroni.nonNatives1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.nonNatives1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.nonNatives1 <- as.factor(dat0$tmp)
# PD: pollution 1
dat0$tmp <- ifelse(dat0$PD.pollution1 <= dat0$PDrandLObonferroni.pollution1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.pollution1 >= dat0$PDrandHIbonferroni.pollution1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.pollution1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.pollution1 <- as.factor(dat0$tmp)
# PD: hybridization 1
dat0$tmp <- ifelse(dat0$PD.hybrid1 <= dat0$PDrandLObonferroni.hybrid1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.hybrid1 >= dat0$PDrandHIbonferroni.hybrid1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.hybrid1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.hybrid1 <- as.factor(dat0$tmp)
# PD: prey depletion 1
dat0$tmp <- ifelse(dat0$PD.prey1 <= dat0$PDrandLObonferroni.prey1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.prey1 >= dat0$PDrandHIbonferroni.prey1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.prey1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.prey1 <- as.factor(dat0$tmp)
# PD: disease 1
dat0$tmp <- ifelse(dat0$PD.disease1 <= dat0$PDrandLObonferroni.disease1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.disease1 >= dat0$PDrandHIbonferroni.disease1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.disease1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.disease1 <- as.factor(dat0$tmp)
# PD: inbreeding 1
dat0$tmp <- ifelse(dat0$PD.inbreeding1 <= dat0$PDrandLObonferroni.inbreeding1, 1, 0)
dat0$tmp <- ifelse(dat0$PD.inbreeding1 >= dat0$PDrandHIbonferroni.inbreeding1, -1, dat0$tmp)
dat0$tmp <- ifelse(dat0$SR0 == dat0$SR.inbreeding1, 0, dat0$tmp)
dat0$tmp[is.na(dat0$tmp)] <- 0
dat0$PDbiasbonferroni.inbreeding1 <- as.factor(dat0$tmp)

dat0 <- subset(dat0, select = -c(tmp))

#------------------- SAVE FILE ------------------------------------------------------
write.csv(dat0, "FuncPhylo_mammals190816.csv", row.names=F)

end <- Sys.time()
end
elapsed <- end - start
elapsed # 2.3 days to run on my laptop with numrand=301; 3.5 days on desktop with numrand=501
<file_sep>/0_DataPrep_190730.R
######################### Prepare inputs for analysis ################################################

# Laptop
#setwd("C:/Users/JB/Box Sync/Projects/Projects (active)/Functional diversity/Analysis")
# Desktop
#setwd("D:/Box Sync/Projects/Projects (active)/Functional diversity/Analysis")
# Sara
#setwd("C:/Users/saraw/Desktop/FD")

library(dplyr)
#library(tidyr)
#library(sf)
#library(picante)
#library(raster)
#library(treeman)
library(ape)
library(phytools)

rm(list = ls())

#----------------------- LOAD AND CLEAN DATA -------------------------------------------------

Threats0 <- data.frame(read.csv("Data/Raw/Threats_mammals.csv", header=T))
Traits <- data.frame(read.csv("Data/Raw/Traits_mammals.csv", header=T)) # Wilman (2014) trait data (see "Articles/Measuring functional diversity" folder)
spptree0 <- read.tree("Data/Raw/Uyeda_etal_tetrapods.tree") # This tree was shared with <NAME> privately and so has not been made publicly available here

# Clean some of the data
names(Traits)[names(Traits) == 'Scientific'] <- 'binary'
Threats0$binary <- stringr::str_replace_all(Threats0$binary, c(" " = "_"))
Threats <- Threats0
dim(Threats) # 5674 species

# Remove marine species
Threats <- subset(Threats, order != "Sirenia")
Threats <- subset(Threats, binary != "Enhydra_lutris")
Threats <- subset(Threats, family != "Eschrichtiidae")
Threats <- subset(Threats, family != "Monodontidae")
Threats <- subset(Threats, family != "Iniidae")
Threats <- subset(Threats, family != "Delphinidae")
Threats <- subset(Threats, family != "Lipotidae")
Threats <- subset(Threats, family != "Phocoenidae")
Threats <- subset(Threats, family != "Pontoporiidae")
Threats <- subset(Threats, family != "Balaenidae")
Threats <- subset(Threats, family != "Balaenopteridae")
Threats <- subset(Threats, family != "Ziphiidae")
Threats <- subset(Threats, family != "Neobalaenidae")
Threats <- subset(Threats, family != "Physeteridae")
Threats <- subset(Threats, family != "Platanistidae")

# Add back in the river dolphins
riv01 <- subset(Threats0, binary == "Platanista_gangetica")
riv02 <- subset(Threats0, binary == "Inia_geoffrensis")
riv03 <- subset(Threats0, binary == "Inia_araguaiaensis")
riv04 <- subset(Threats0, binary == "Inia_boliviensis")
riv05 <- subset(Threats0, binary == "Pontoporia_blainvillei")
riv06 <- subset(Threats0, binary == "Lipotes_vexillifer")
rivdol <- rbind(riv01, riv02, riv03, riv04, riv05, riv06)
Threats <- rbind(Threats, rivdol)
dim(Threats) # 5583 species (removed 91 spp)

# Remove extinct species
Threats <- subset(Threats, risk != "Extinct")
dim(Threats) # 5501 species (removed 82 spp)

#----------------------- LINK TO SPECIES TRAITS -------------------------------------------

#----- Combine threats and traits

# Link the traits and threats databases
Dat0 <- suppressWarnings(left_join(Threats, Traits, by="binary"))

# Combine vertebrate feeding categories
Dat0$Vert <- Dat0$Diet.Vend + Dat0$Diet.Vect + Dat0$Diet.Vunk # vertebrate endotherms, ectotherms, & vertebrate unknown

# Foraging strata
Dat0$foraging <- Dat0$ForStrat.Value
levels(Dat0$foraging)[levels(Dat0$foraging)=="S"] <- "Ar" # lumping scansorial with arboreal
levels <- levels(Dat0$foraging)
levels[length(levels) + 1] <- "None"
Dat0$foraging <- factor(Dat0$foraging, levels = levels)
Dat0$foraging[is.na(Dat0$foraging)] <- "None"
levels(Dat0$foraging)[levels(Dat0$foraging)=="None"] <- "G"
Dat0$Ground <- ifelse(Dat0$foraging=="G", 1, 0)
Dat0$Climbing <- ifelse(Dat0$foraging=="Ar", 1, 0)
Dat0$Volant <- ifelse(Dat0$foraging=="A", 1, 0)

# Activity time
# Not currently used. It's not clear that activity time would strongly affect ecological function

# Data frame of species threats and traits
dat0 <- cbind.data.frame(rowID=Dat0$rowID, species=Dat0$binary, family=Dat0$family, order=Dat0$order,
                         binary=Dat0$binary, trend=Dat0$trend, ExceptionsToDecline=Dat0$ExceptionsToDecline,
                         habitat=Dat0$habitat, hunting=Dat0$hunting, conflict=Dat0$conflict, climate=Dat0$climate,
                         NonNatives=Dat0$NonNatives, pollution=Dat0$pollution, hybrid=Dat0$hybrid, prey=Dat0$prey,
                         disease=Dat0$disease, inbreeding=Dat0$inbreeding, DietInv=Dat0$Diet.Inv, DietVert=Dat0$Vert,
                         DietFish=Dat0$Diet.Vfish, DietScav=Dat0$Diet.Scav, DietFruit=Dat0$Diet.Fruit,
                         DietNect=Dat0$Diet.Nect, DietSeed=Dat0$Diet.Seed, DietHerb=Dat0$Diet.PlantO,
                         BodyMass=Dat0$BodyMass.Value, Ground=Dat0$Ground, Climbing=Dat0$Climbing, Volant=Dat0$Volant)
dat0$species <- as.character(dat0$species)
dat0$DietInv <- as.numeric(dat0$DietInv)
dat0$DietVert <- as.numeric(dat0$DietVert)
dat0$DietFish <- as.numeric(dat0$DietFish)
dat0$DietScav <- as.numeric(dat0$DietScav)
dat0$DietFruit <- as.numeric(dat0$DietFruit)
dat0$DietNect <- as.numeric(dat0$DietNect)
dat0$DietSeed <- as.numeric(dat0$DietSeed)
dat0$DietHerb <- as.numeric(dat0$DietHerb)

dat1 <- cbind.data.frame(rowID=dat0$rowID, species=dat0$species, family=dat0$family, order=dat0$order,
                         trend=dat0$trend, ExceptionsToDecline=dat0$ExceptionsToDecline,
                         habitat=dat0$habitat, hunting=dat0$hunting, conflict=dat0$conflict, climate=dat0$climate,
                         NonNatives=dat0$NonNatives, pollution=dat0$pollution, hybrid=dat0$hybrid, prey=dat0$prey,
                         disease=dat0$disease, inbreeding=dat0$inbreeding, DietInv=dat0$DietInv,
                         DietVert=dat0$DietVert, DietFish=dat0$DietFish, DietScav=dat0$DietScav,
                         DietFruit=dat0$DietFruit, DietNect=dat0$DietNect, DietSeed=dat0$DietSeed,
                         DietHerb=dat0$DietHerb, BodyMass=dat0$BodyMass)
dat1$Ground <- dat0$Ground
dat1$Climbing <- dat0$Climbing
dat1$Volant <- dat0$Volant

#----- Deal with missing traits

dim(dat1) # We're at 5501 species

# Add a "genus" column to dat1
tmp1 <- data.frame(species = dat1$species)
tmp1$species <- as.character(tmp1$species)
tmp2 <- t(data.frame(strsplit(tmp1$species, "_")))
tmp2 <- data.frame(tmp2)
names(tmp2) <- c("genus", "epithet")
dat1$genus <- tmp2$genus

# Find out which species are missing trait data
MissingTraits <- dat1[rowSums(is.na(dat1)) > 0,] # rows of dat1 that are missing trait data
dim(MissingTraits) # Missing trait data for 745 species

# Remove those species from dat1
dat2 <- dplyr::anti_join(dat1, MissingTraits, by="species")

# Use genus-average trait data for each missing species
NewTraits <- MissingTraits
for(i in 1:nrow(MissingTraits)){
  #i=745
  dattmp <- subset(dat2, genus==NewTraits$genus[i])
  NewTraits$DietInv[i] <- mean(dattmp$DietInv, na.rm=T)
  NewTraits$DietVert[i] <- mean(dattmp$DietVert, na.rm=T)
  NewTraits$DietFish[i] <- mean(dattmp$DietFish, na.rm=T)
  NewTraits$DietScav[i] <- mean(dattmp$DietScav, na.rm=T)
  NewTraits$DietFruit[i] <- mean(dattmp$DietFruit, na.rm=T)
  NewTraits$DietNect[i] <- mean(dattmp$DietNect, na.rm=T)
  NewTraits$DietSeed[i] <- mean(dattmp$DietSeed, na.rm=T)
  NewTraits$DietHerb[i] <- mean(dattmp$DietHerb, na.rm=T)
  NewTraits$BodyMass[i] <- mean(dattmp$BodyMass, na.rm=T)
  NewTraits$Ground[i] <- max(dattmp$Ground, na.rm=T)
  NewTraits$Climbing[i] <- max(dattmp$Climbing, na.rm=T)
  NewTraits$Volant[i] <- max(dattmp$Volant, na.rm=T)
}
summary(NewTraits) # 212 NA's
# Genus-level averaging worked for 745-212=533 species
# Still missing traits for 212 species.

# Use family-level averaging for species that are still missing trait data
StillMissingTraits <- NewTraits[rowSums(is.na(NewTraits)) > 0,] # rows of NewTraits that are missing trait data
NewTraits <- dplyr::anti_join(NewTraits, StillMissingTraits, by="species") # remove those species from NewTraits
NewTraits2 <- StillMissingTraits
for(i in 1:nrow(StillMissingTraits)){
  #i=1
  dattmp <- subset(dat2, family==NewTraits2$family[i])
  NewTraits2$DietInv[i] <- mean(dattmp$DietInv, na.rm=T)
  NewTraits2$DietVert[i] <- mean(dattmp$DietVert, na.rm=T)
  NewTraits2$DietFish[i] <- mean(dattmp$DietFish, na.rm=T)
  NewTraits2$DietScav[i] <- mean(dattmp$DietScav, na.rm=T)
  NewTraits2$DietFruit[i] <- mean(dattmp$DietFruit, na.rm=T)
  NewTraits2$DietNect[i] <- mean(dattmp$DietNect, na.rm=T)
  NewTraits2$DietSeed[i] <- mean(dattmp$DietSeed, na.rm=T)
  NewTraits2$DietHerb[i] <- mean(dattmp$DietHerb, na.rm=T)
  NewTraits2$BodyMass[i] <- mean(dattmp$BodyMass, na.rm=T)
  NewTraits2$Ground[i] <- max(dattmp$Ground, na.rm=T)
  NewTraits2$Climbing[i] <- max(dattmp$Climbing, na.rm=T)
  NewTraits2$Volant[i] <- max(dattmp$Volant, na.rm=T)
}
summary(NewTraits2) # 1 NA
# Family-level averaging worked for 212-1=211 species
# Still missing traits for 1 species.

# Find species that is *still* missing trait data
StillStillMissing <- NewTraits2[rowSums(is.na(NewTraits2)) > 0,]
NewTraits2 <- dplyr::anti_join(NewTraits2, StillStillMissing, by="species") # remove those species from NewTraits
NewTraits3 <- StillStillMissing
NewTraits3$species # Just one critter still missing: Laonastes_aenigmamus!

# Enter data for Laonastes manually, based on IUCN Red List (accessed 6 May 2019)
NewTraits3$DietInv <- 10
NewTraits3$DietVert <- 0
NewTraits3$DietFish <- 0
NewTraits3$DietScav <- 0
NewTraits3$DietFruit <- 0
NewTraits3$DietNect <- 0
NewTraits3$DietSeed <- 30
NewTraits3$DietHerb <- 60
rats <- subset(dat1, genus == "Rattus")
NewTraits3$BodyMass <- mean(rats$BodyMass, na.rm=T)
NewTraits3$Ground <- 1
NewTraits3$Climbing <- 1
NewTraits3$Volant <- 0

# Assemble the new data
dat3 <- rbind.data.frame(dat2, NewTraits, NewTraits2, NewTraits3)
# Now we have trait data for all 5501 species

#----- End of section on 'dealing with missing traits'

#----- Standardize data for continuous traits
dat1 <- dat3
dat1$DietInv <- as.numeric(scale(dat1$DietInv, scale=TRUE, center=TRUE))
dat1$DietVert <- as.numeric(scale(dat1$DietVert, scale=TRUE, center=TRUE))
dat1$DietFish <- as.numeric(scale(dat1$DietFish, scale=TRUE, center=TRUE))
dat1$DietScav <- as.numeric(scale(dat1$DietScav, scale=TRUE, center=TRUE))
dat1$DietFruit <- as.numeric(scale(dat1$DietFruit, scale=TRUE, center=TRUE))
dat1$DietNect <- as.numeric(scale(dat1$DietNect, scale=TRUE, center=TRUE))
dat1$DietSeed <- as.numeric(scale(dat1$DietSeed, scale=TRUE, center=TRUE))
dat1$DietHerb <- as.numeric(scale(dat1$DietHerb, scale=TRUE, center=TRUE))
dat1$BodyMass <- as.numeric(scale(dat1$BodyMass, scale=TRUE, center=TRUE))

#----- Save file
dat1 <- subset(dat1, select = -c(genus))
write.csv(dat1, "Data/Raw/mammal_threats_traits.csv", row.names=F)
# This file is then used as input in the main processing script: FD_PD_190721.R

#----------------- PHYLOGENY -------------------------------------
ape::is.ultrametric(spptree0)
ape::is.rooted(spptree0)
ape::is.binary.tree(spptree0)

#---- Remove genera from the phylogeny that aren't in the threats database
#---- (this massively cuts down the size of the phylogeny file, while keeping genera we'll need later)

# add a "genus" column back to dat1
tmp1 <- data.frame(species = dat1$species)
tmp1$species <- as.character(tmp1$species)
tmp2 <- t(data.frame(strsplit(tmp1$species, "_")))
tmp2 <- data.frame(tmp2)
names(tmp2) <- c("genus", "epithet")
dat1$genus <- tmp2$genus

# genera in the phylogeny
tmp <- spptree0$tip.label
test <- sub("_[^_]+$", "", tmp)
test <- sub("_[^_]+$", "", test)
gentree <- data.frame(test)
gentree <- data.frame(genus=unique(gentree[,1]))
gentree$genus <- as.character(gentree$genus)

# list of genera to remove from phylogeny
genremove <- suppressWarnings(dplyr::anti_join(gentree, dat1, by = "genus"))

# remove those genera from the phylogeny
spptree <- spptree0
for(i in 1:nrow(genremove)){
  spptree <- drop.tip(spptree, tip = grep(genremove[i,1], spptree$tip.label, value=T))
}

# check that that worked
tmp <- spptree$tip.label
test <- sub("_[^_]+$", "", tmp)
test <- sub("_[^_]+$", "", test)
gentree2 <- data.frame(test)
gentree2 <- data.frame(genus = unique(gentree2[,1]))
gentree2$genus <- as.character(gentree2$genus)
genremove2 <- suppressWarnings(dplyr::anti_join(gentree2, dat1, by = "genus"))
nrow(genremove2) # should be 0

# find out which species are in the threats/traits database but not the phylogeny
speciesthreats <- data.frame(species = unique(dat1$species))
speciesthreats$species <- as.character(speciesthreats$species)
speciestree <- data.frame(species = spptree$tip.label)
speciestree$species <- as.character(speciestree$species)
SppMissing <- suppressWarnings(dplyr::anti_join(speciesthreats, speciestree, by = "species"))
NumSppNotInPhylo <- nrow(SppMissing)
NumSppNotInPhylo # 1664 species

# add a 'genus' column to SppMissing
tmp <- SppMissing$species
test <- sub("_[^_]+$", "", tmp)
gen <- data.frame(test)
SppMissing$genus <- test

# remove species from SppMissing that don't have a genus in the phylogeny
SppMissing2 <- suppressWarnings(dplyr::inner_join(SppMissing, gentree2, by = "genus"))
nrow(SppMissing2) # able to add 1383 of the missing species to the phylogeny at the root node of their genus
NumSppCantBeAddedToPhylo <- NumSppNotInPhylo - nrow(SppMissing2)
NumSppCantBeAddedToPhylo # 281 species from the threats data can't be added to the phylogeny

#---- Make the tree ultrametric
#l <- seq(0, 100, by=1)
#LL.out <- NULL
#start <- Sys.time()
#for(i in 1:length(l)){
#  LL.out[i] <- attributes(chronos(spptree, lambda=l[i]))$ploglik
#}
#Sys.time() - start
#write.csv(LL.out, "LLout.csv", row.names=F)

#--- Find a good lambda value
LL.out <- matrix(, nrow=6, ncol=2)
LL.out <- data.frame(LL.out)
names(LL.out) <- c("lambda", "loglik")
LL.out$lambda <- c(0, 0.5, 1, 2, 0.25, 0.75)

start <- Sys.time()
LL.out$loglik[1] <- attributes(chronos(spptree, lambda = 0))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

start <- Sys.time()
LL.out$loglik[2] <- attributes(chronos(spptree, lambda = 0.5))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

start <- Sys.time()
LL.out$loglik[3] <- attributes(chronos(spptree, lambda = 1.0))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

start <- Sys.time()
LL.out$loglik[4] <- attributes(chronos(spptree, lambda = 2.0))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

start <- Sys.time()
LL.out$loglik[5] <- attributes(chronos(spptree, lambda = 0.25))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

start <- Sys.time()
LL.out$loglik[6] <- attributes(chronos(spptree, lambda = 0.75))$ploglik
Sys.time() - start
write.csv(LL.out, "LLout.csv", row.names=F)

#--- Make the phylogeny ultrametric using best lambda value
# log likelihoods decline linearly and precipitously from 0 to 2.
spptree1 <- ape::chronos(spptree, lambda = 0)
ape::is.ultrametric(spptree1)

#-------- Add species to the phylogeny (at the root node of the appropriate genus) that are in the threats database but not the phylogeny
spptree2 <- spptree1
start <- Sys.time()
for(i in 1:nrow(SppMissing2)){
  tryCatch({
    #i=9
    spptree2 <- phytools::add.species.to.genus(spptree2, SppMissing2[i,1], where = "root")
  }, error=function(e){}) # end of the tryCatch function
}
Sys.time() - start

# were we able to add all missing spp (that had genera in the phylogeny) to the phylogeny?
nrow(SppMissing2) == length(spptree2$tip.label) - length(spptree1$tip.label) # if 'TRUE', we've added them all

# save phylogeny
write.tree(spptree2, file = "mammaltree.tree")
<file_sep>/README.md
# UMT-BrodieLab-GlobalDiv
<file_sep>/PostAnalysis190724_graphs_stats.R
library(ggplot2)
library(dplyr)
library(sf)
library(sp)
library(cowplot)

# desktop
#setwd('D:/Box Sync/Projects/Projects (active)/Functional diversity/Analysis')
# laptop
#setwd('C:/Users/JB/Box Sync/Projects/Projects (active)/Functional diversity/Analysis')

############## DATA ######################################################################

rm(list = ls())
dat0 <- data.frame(read.csv("FuncPhylo_mammals190816.csv", header=T)) # Output from main analysis

############## SOME SIMPLE CORRELATIONS ########################################################

dat1 <- cbind.data.frame(SR.0=dat0$SR0, FD.0=dat0$FD0, PD.0=dat0$PD0)
dat1 <- dat1[complete.cases(dat1),]
cor(dat1)
cor.test(dat1$SR.0, dat1$FD.0)
cor.test(dat1$SR.0, dat1$PD.0)
cor.test(dat1$FD.0, dat1$PD.0)

############## HOW THE DIFFERENT THREATS CHANGE FD AND PD ########################################

#----------- CREATES THE BOX PLOTS USED IN FIG 1 OF THE MANUSCRIPT ---------------------------------

#------- Change in FD
dat1 <- cbind.data.frame(FD.change.habitat = dat0$FD.habitat1-1,
                         FD.change.hunting = dat0$FD.hunting1-1,
                         FD.change.climate = dat0$FD.climate1-1,
                         FD.change.pollution = dat0$FD.pollution1-1,
                         FD.change.prey = dat0$FD.prey1-1,
                         FD.change.conflict = dat0$FD.conflict1-1,
                         FD.change.nonnatives = dat0$FD.nonNatives1-1,
                         FD.change.hybrid = dat0$FD.hybrid1-1,
                         FD.change.disease = dat0$FD.disease1-1,
                         FD.change.inbreeding = dat0$FD.inbreeding1-1)
datstack <- stack(dat1)
names(datstack) <- c("diversityloss", "threat")

# one-way ANOVA
resaov <- aov(diversityloss ~ threat, data=datstack)
summary(resaov)
TukeyHSD(resaov)

# box plot
plotfd <- ggplot(datstack, aes(x=threat, y=diversityloss)) +
  geom_boxplot(outlier.shape=NA, fill='#A4A4A4', color="black") +
  ylim(-0.25, 0) +
  theme_classic() +
  geom_hline(yintercept=0, linetype="dashed", color="black") +
  geom_point(aes())

# how many data points per category?
dattmp <- na.omit(datstack)
summary(dattmp)

#------- Change in PD
dat1 <- cbind.data.frame(PD.change.habitat = dat0$PD.habitat1-1,
                         PD.change.hunting = dat0$PD.hunting1-1,
                         PD.change.climate = dat0$PD.climate1-1,
                         PD.change.pollution = dat0$PD.pollution1-1,
                         PD.change.prey = dat0$PD.prey1-1,
                         PD.change.conflict = dat0$PD.conflict1-1,
                         PD.change.nonnatives = dat0$PD.nonNatives1-1,
                         PD.change.hybrid = dat0$PD.hybrid1-1,
                         PD.change.disease = dat0$PD.disease1-1,
                         PD.change.inbreeding = dat0$PD.inbreeding1-1)
datstack <- stack(dat1)
names(datstack) <- c("diversityloss", "threat")

# one-way ANOVA
resaov <- aov(diversityloss ~ threat, data=datstack)
summary(resaov)
TukeyHSD(resaov)

# box plot
plotpd <- ggplot(datstack, aes(x=threat, y=diversityloss)) +
  geom_boxplot(outlier.shape=NA, fill='#A4A4A4', color="black") +
  ylim(-0.25, 0) +
  theme_classic() +
  geom_hline(yintercept=0, linetype="dashed", color="black") +
  geom_point(aes())

# how many data points per category?
dattmp <- na.omit(datstack)
summary(dattmp)

#------ Multi-panel figure
plot_grid(plotfd, plotpd, labels = "AUTO", ncol = 1, nrow = 2, label_size = 11)
ggsave("Results/Graphs/ThreatImpacts_A_fd_B_pd.jpg")

#------ Some associated stats
# "...on average, the impacts of habitat loss and harvest... exceed those of climate change by..."

# factor by which habitat loss impacts on FD exceed those of climate change
(1 - mean(dat0$FD.habitat1, na.rm=T)) / (1 - mean(dat0$FD.climate1, na.rm=T))

# factor by which harvest impacts on FD exceed those of climate change
(1 - mean(dat0$FD.hunting1, na.rm=T)) / (1 - mean(dat0$FD.climate1, na.rm=T))

# factor by which habitat loss impacts on PD exceed those of climate change
(1 - mean(dat0$PD.habitat1, na.rm=T)) / (1 - mean(dat0$PD.climate1, na.rm=T))
# impacts on PD do not differ significantly between climate and harvest

############## CHANGES IN WHICH TRAITS ARE DRIVING DECLINES IN FD? ########################################
# "...Across countries, reductions in mammal functional diversity driven by habitat loss and harvest were correlated with nationwide
# declines in frugivores..."

#--- habitat
DivChange <- (1 - dat0$FD.habitat1)
traitcols <- seq(149, 160, by=1) # columns of dat0 for trait changes
dat1 <- dat0[traitcols]
dat1 <- cbind.data.frame(DivChange, dat1)
dat1 <- dat1[complete.cases(dat1),]
corrs <- cor(dat1)
corrs[,1]

#--- harvest
DivChange <- (1 - dat0$FD.hunting1)
traitcols <- seq(161, 172, by=1) # columns of dat0 for trait changes
dat1 <- dat0[traitcols]
dat1 <- cbind.data.frame(DivChange, dat1)
dat1 <- dat1[complete.cases(dat1),]
corrs <- cor(dat1)
corrs[,1]

############## COMPARE PROPORTIONAL LOSS RATES OF SR, FD, & PD ######################################################################
# "...average losses of taxonomic, functional, and phylogenetic diversity across the world from all threats combined were..."

#----------------- FD vs SR ----------------------------------------

#--- habitat ---
dat1 <- dat0
dat1$SR.loss.prop <- (dat1$SR0 - dat1$SR.habitat1) / dat1$SR0
dat1$loss.prop <- (1 - dat1$FD.habitat1)
dat2 <- cbind.data.frame(country = dat1$Country, area = dat1$Area_sq_km, SR0 = dat1$SR0,
                         SR.loss.prop = dat1$SR.loss.prop, loss.prop = dat1$loss.prop)
dat2$diff <- dat2$SR.loss.prop - dat2$loss.prop
#dat2 <- subset(dat2, area >= 50000)

# paired t test
#dat3 <- cbind.data.frame(SR.loss.prop = dat2$SR.loss.prop, loss.prop = dat2$loss.prop)
#t.test(dat3$SR.loss.prop, dat3$loss.prop, paired = TRUE, alternative = "two.sided")

# regression of standing diversity vs the diff between SR.loss.prop and loss.prop
m1 <- glm(diff ~ SR0, data=dat2)
summary(m1)

# regression of countries' areas vs the diff between SR.loss.prop and loss.prop
m2 <- glm(diff ~ area, data=dat2)
summary(m2)

# plotting
newdat <- data.frame(SR0 = seq(50, 700, by=50))
newdat$Ypred <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$fit
newdat$YpredSE <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$se.fit
newdat$YpredLO <- newdat$Ypred - (1.96 * newdat$YpredSE)
newdat$YpredHI <- newdat$Ypred + (1.96 * newdat$YpredSE)
newdat

#--- hunting ---
dat1 <- dat0
dat1$SR.loss.prop <- (dat1$SR0 - dat1$SR.hunting1) / dat1$SR0
dat1$loss.prop <- (1 - dat1$FD.hunting1)
dat2 <- cbind.data.frame(country = dat1$Country, area = dat1$Area_sq_km, SR0 = dat1$SR0,
                         SR.loss.prop = dat1$SR.loss.prop, loss.prop = dat1$loss.prop)
dat2$diff <- dat2$SR.loss.prop - dat2$loss.prop
#dat2 <- subset(dat2, area >= 50000)

# paired t test
dat3 <- cbind.data.frame(SR.loss.prop = dat2$SR.loss.prop, loss.prop = dat2$loss.prop)
t.test(dat3$SR.loss.prop, dat3$loss.prop, paired = TRUE, alternative = "two.sided")

# regression of standing diversity vs the diff between SR.loss.prop and loss.prop
m1 <- glm(diff ~ SR0, data=dat2)
summary(m1)

# regression of countries' areas vs the diff between SR.loss.prop and loss.prop
m2 <- glm(diff ~ area, data=dat2)
summary(m2)

# plotting
newdat <- data.frame(SR0 = seq(50, 700, by=50))
newdat$Ypred <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$fit
newdat$YpredSE <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$se.fit
newdat$YpredLO <- newdat$Ypred - (1.96 * newdat$YpredSE)
newdat$YpredHI <- newdat$Ypred + (1.96 * newdat$YpredSE)
newdat

#----------------- PD vs SR ----------------------------------------

#--- habitat ---
dat1 <- dat0
dat1$SR.loss.prop <- (dat1$SR0 - dat1$SR.habitat1) / dat1$SR0
dat1$loss.prop <- (1 - dat1$PD.habitat1)
dat2 <- cbind.data.frame(country = dat1$Country, area = dat1$Area_sq_km, SR0 = dat1$SR0,
                         SR.loss.prop = dat1$SR.loss.prop, loss.prop = dat1$loss.prop)
dat2$diff <- dat2$SR.loss.prop - dat2$loss.prop
#dat2 <- subset(dat2, area >= 50000)

# paired t test
#dat3 <- cbind.data.frame(SR.loss.prop = dat2$SR.loss.prop, loss.prop = dat2$loss.prop)
#t.test(dat3$SR.loss.prop, dat3$loss.prop, paired = TRUE, alternative = "two.sided")

# regression of standing diversity vs the diff between SR.loss.prop and loss.prop
m1 <- glm(diff ~ SR0, data=dat2)
summary(m1)

# regression of countries' areas vs the diff between SR.loss.prop and loss.prop
m2 <- glm(diff ~ area, data=dat2)
summary(m2)

# plotting
newdat <- data.frame(SR0 = seq(50, 700, by=50))
newdat$Ypred <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$fit
newdat$YpredSE <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$se.fit
newdat$YpredLO <- newdat$Ypred - (1.96 * newdat$YpredSE)
newdat$YpredHI <- newdat$Ypred + (1.96 * newdat$YpredSE)
newdat

#--- hunting ---
dat1 <- dat0
dat1$SR.loss.prop <- (dat1$SR0 - dat1$SR.hunting1) / dat1$SR0
dat1$loss.prop <- (1 - dat1$PD.hunting1)
dat2 <- cbind.data.frame(country = dat1$Country, area = dat1$Area_sq_km, SR0 = dat1$SR0,
                         SR.loss.prop = dat1$SR.loss.prop, loss.prop = dat1$loss.prop)
dat2$diff <- dat2$SR.loss.prop - dat2$loss.prop
#dat2 <- subset(dat2, area >= 50000)

# paired t test
#dat3 <- cbind.data.frame(SR.loss.prop = dat2$SR.loss.prop, loss.prop = dat2$loss.prop)
#t.test(dat3$SR.loss.prop, dat3$loss.prop, paired = TRUE, alternative = "two.sided")

# regression of standing diversity vs the diff between SR.loss.prop and loss.prop
m1 <- glm(diff ~ SR0, data=dat2)
summary(m1)

# regression of countries' areas vs the diff between SR.loss.prop and loss.prop
m2 <- glm(diff ~ area, data=dat2)
summary(m2)

# plotting
newdat <- data.frame(SR0 = seq(50, 700, by=50))
newdat$Ypred <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$fit
newdat$YpredSE <- predict(m1, newdata=newdat, type="response", se.fit=T, na.omit=T)$se.fit
newdat$YpredLO <- newdat$Ypred - (1.96 * newdat$YpredSE)
newdat$YpredHI <- newdat$Ypred + (1.96 * newdat$YpredSE)
newdat
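The bias-flag blocks in the main analysis script all apply the same three-way rule: flag 1 when the observed diversity falls at or below the low randomization bound, -1 when it falls at or above the high bound, and 0 when national species richness is unchanged or the value is missing. A minimal Python sketch of that classification (the function and argument names are illustrative, not part of these scripts):

```python
def classify_bias(observed, rand_lo, rand_hi, sr_before, sr_after):
    """Mirror the repeated ifelse() chain used to build the bias flags.

    Returns 1 if observed diversity is at or below the low randomization
    bound, -1 if at or above the high bound, and 0 otherwise (inside the
    envelope, no species lost, or missing values).
    """
    if observed is None or rand_lo is None or rand_hi is None:
        return 0  # NA values are coerced to 0 at the end of each R block
    if sr_before == sr_after:
        return 0  # no species lost from this country under this threat
    if observed >= rand_hi:
        return -1  # the >= check overrides the <= check, as in the R code
    if observed <= rand_lo:
        return 1
    return 0
```

In R, the same factoring would let a single helper, applied once per threat column, replace the dozens of near-identical five-line blocks.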
<file_sep>
import sys
import os

import django
from django.conf import settings
from django.core.management import call_command

opts = {
    'INSTALLED_APPS': ['pgqueue'],
    'DATABASES': {
        'default': {
            'ENGINE': 'django.db.backends.postgresql',
            'HOST': '127.0.0.1',
            'PORT': '5432',
            'NAME': 'pgqueue',
            'USER': 'postgres',
            'PASSWORD': '',
        },
    },
}

if __name__ == '__main__':
    sys.path.insert(0, os.path.abspath(os.path.dirname(__file__)))
    settings.configure(**opts)
    django.setup()
    call_command('test', 'pgqueue')
<file_sep>
# Generated by Django 2.0.2 on 2018-03-19 13:01
import django.contrib.postgres.fields.jsonb
import django.contrib.postgres.functions
from django.db import migrations, models


class Migration(migrations.Migration):

    initial = True

    dependencies = [
    ]

    operations = [
        migrations.CreateModel(
            name='Job',
            fields=[
                ('id', models.BigAutoField(primary_key=True, serialize=False)),
                ('task', models.CharField(max_length=255)),
                ('created_at', models.DateTimeField(default=django.contrib.postgres.functions.TransactionNow)),
                ('execute_at', models.DateTimeField(default=django.contrib.postgres.functions.TransactionNow)),
                ('priority', models.PositiveIntegerField(default=0)),
                ('kwargs', django.contrib.postgres.fields.jsonb.JSONField(blank=True, default={})),
            ],
            options={
                'abstract': False,
            },
        ),
        migrations.AddIndex(
            model_name='job',
            index=models.Index(fields=['-priority', 'created_at'], name='pgqueue_job_priorit_c6c76e_idx'),
        ),
    ]
<file_sep>
from django.contrib.postgres.functions import TransactionNow
from django.contrib.postgres.fields import JSONField
from django.db import models


def lazy_empty_dict():
    return {}


class BaseJob(models.Model):
    id = models.BigAutoField(primary_key=True)
    task = models.CharField(max_length=255)
    created_at = models.DateTimeField(default=TransactionNow)
    execute_at = models.DateTimeField(default=TransactionNow)
    priority = models.PositiveIntegerField(default=0)
    context = JSONField(default=lazy_empty_dict, blank=True)
    kwargs = JSONField(default=lazy_empty_dict, blank=True)

    class Meta:
        indexes = [
            models.Index(fields=['-priority', 'created_at']),
        ]
        abstract = True

    def __str__(self):
        return '{} {}'.format(self.task, self.kwargs)

    def as_dict(self):
        return {
            'id': self.id,
            'created_at': self.created_at,
            'execute_at': self.execute_at,
            'priority': self.priority,
            'kwargs': self.kwargs,
            'task': self.task,
        }


class Job(BaseJob):
    pass
<file_sep>/README.md
# Django PGQueue

The project was initially forked from [django-postgres-queue][dpq] for internal use in some of my projects. After some changes and refactoring, plus adding tests and features like warm shutdown, I decided to put it on GitHub and PyPI as a standalone package, just in case.

## Installation

```
pip install django-pgqueue
```

Then add `pgqueue` into `INSTALLED_APPS` and run `manage.py migrate` to create the jobs table.

## Usage

Instantiate a queue object with defined tasks. This can go wherever you like and can be called whatever you like. For example:

```
# someapp/task_queue.py

from pgqueue.queue import Queue


def say_hello(queue, job):
    name = job.kwargs['name']
    print('Hello, {}!'.format(name))


task_queue = Queue(
    tasks={
        'say_hello': say_hello,
    },
    notify_channel='someapp_task_queue',
)
```

Now define the worker command:

```
# someapp/management/commands/pgqueue_worker.py

from pgqueue.worker import WorkerCommand
from someapp.task_queue import task_queue


class Command(WorkerCommand):
    queue = task_queue
```

And call the task this way:

```
from someapp.task_queue import task_queue

task_queue.enqueue('say_hello', {'name': 'Django'})
```

Please note that only primitives can be used in a job's arguments, as they are stored as JSON in the database. When you try to pass any complex non-JSON-serializable object, you will get an error saying `object is not json serializable`.

## Periodic tasks

There is no built-in way to run jobs periodically, like celerybeat in Celery, but you can still use cron. For example, you can create a universal command to execute any task. Something like this:

```
import json

from django.core.management import BaseCommand

from someapp.task_queue import task_queue


class Command(BaseCommand):
    def add_arguments(self, parser):
        parser.add_argument('task_name')
        parser.add_argument('task_kwargs')

    def handle(self, task_name, task_kwargs, **options):
        task_queue.enqueue(task_name, json.loads(task_kwargs))
```

And then put it into your cron records:

```
0 0 * * * /path/to/python manage.py run_task say_hello '{"name": "Django!"}'
```

[dpq]: https://github.com/gavinwahl/django-postgres-queue
<file_sep>
import select
import logging
import time

from django.utils import translation
from django.db import connection

from .models import Job


class Queue:
    job_model = Job
    logger = logging.getLogger(__name__)

    def __init__(self, tasks, notify_channel):
        self.tasks = tasks
        self.notify_channel = notify_channel

    def notify(self, job):
        with connection.cursor() as cur:
            cur.execute('NOTIFY "{}", %s;'.format(self.notify_channel), [str(job.pk)])

    def get_job_context(self):
        language = translation.get_language()
        return {'language': language}

    def in_atomic_block(self):
        return connection.in_atomic_block

    def enqueue(self, task, kwargs=None, execute_at=None, priority=0, context=None):
        assert not self.in_atomic_block(), 'Task cannot be executed inside a transaction'
        assert task in self.tasks, 'Task "{}" not found in the task list'.format(task)

        kwargs = kwargs or {}
        job_context = self.get_job_context()
        if context is not None:
            job_context.update(context)

        job = self.job_model(
            task=task,
            kwargs=kwargs,
            priority=priority,
            context=job_context,
        )
        if execute_at:
            job.execute_at = execute_at
        job.save()

        if self.notify_channel:
            self.notify(job)

        self.logger.info('New job scheduled %r', job)
        return job

    def enqueue_once(self, task, kwargs=None, *args_, **kwargs_):
        job = self.job_model.objects.filter(task=task, kwargs=kwargs or {}).first()
        if job is not None:
            return job
        return self.enqueue(task, kwargs, *args_, **kwargs_)

    def listen(self):
        with connection.cursor() as cur:
            cur.execute('LISTEN "{}";'.format(self.notify_channel))

    def wait(self, timeout):
        connection.connection.poll()
        notifies = self._get_notifies()
        if notifies:
            return notifies
        select.select([connection.connection], [], [], timeout)
        connection.connection.poll()
        return self._get_notifies()

    def _get_notifies(self):
        notifies = [
            i for i in connection.connection.notifies
            if i.channel == self.notify_channel]
        connection.connection.notifies = [
            i for i in connection.connection.notifies
            if i.channel != self.notify_channel]
        return notifies

    def dequeue(self):
        query = """
            DELETE FROM {table}
            WHERE id = (
                SELECT id
                FROM {table}
                WHERE execute_at <= NOW()
                ORDER BY priority DESC, created_at, id
                FOR UPDATE SKIP LOCKED
                LIMIT 1
            )
            RETURNING *;
        """
        query = query.format(table=self.job_model._meta.db_table)
        results = list(self.job_model.objects.raw(query))
        assert len(results) <= 1
        if not results:
            return None
        return results[0]

    def run_job(self, job):
        language = job.context.get('language')
        with translation.override(language):
            return self._run_job(job)

    def _run_job(self, job):
        task = self.tasks[job.task]
        start_time = time.time()
        retval = task(self, job)
        self.logger.info(
            'Processing %r took %0.4f seconds. Task returned %r.',
            job, time.time() - start_time, retval,
            extra={
                'data': {
                    'job': job.as_dict(),
                    'retval': retval,
                },
            },
        )
        return retval
<file_sep>
# Generated by Django 2.0.5 on 2018-11-04 22:13
import django.contrib.postgres.fields.jsonb
from django.db import migrations

import pgqueue.models


class Migration(migrations.Migration):

    dependencies = [
        ('pgqueue', '0002_job_context'),
    ]

    operations = [
        migrations.AlterField(
            model_name='job',
            name='context',
            field=django.contrib.postgres.fields.jsonb.JSONField(blank=True, default=pgqueue.models.lazy_empty_dict),
        ),
        migrations.AlterField(
            model_name='job',
            name='kwargs',
            field=django.contrib.postgres.fields.jsonb.JSONField(blank=True, default=pgqueue.models.lazy_empty_dict),
        ),
    ]
<file_sep>
import logging
import signal
import sys
import os

from django.core.management import BaseCommand


class Worker:
    logger = logging.getLogger(__name__)
    wait_timeout = 10
    queue = None

    def __init__(self):
        self._shutdown = False
        self._is_waiting = False

    def handle_shutdown(self, sig, frame):
        if self._is_waiting:
            # If there're no active tasks let's exit immediately.
sys.exit(0) self.logger.info('Waiting for active tasks to finish...') self._shutdown = True def bind_signals(self): """Handle the signals for warm shutdown.""" signal.signal(signal.SIGINT, self.handle_shutdown) signal.signal(signal.SIGTERM, self.handle_shutdown) def start(self): self.bind_signals() self.queue.listen() self.logger.info('Worker is started with pid {}'.format(os.getpid())) while True: if self._shutdown: return job = self.queue.dequeue() if job is None: self.wait() continue try: self.queue.run_job(job) except Exception as e: self.logger.exception('Error in %r: %r.', job, e, extra={ 'data': { 'job': job.as_dict(), }, }) def wait(self): self._is_waiting = True self.queue.wait(self.wait_timeout) self._is_waiting = False class WorkerCommand(Worker, BaseCommand): def before_start(self): pass def handle(self, *args, **options): self.before_start() self.start() <file_sep># -*- coding: utf-8 -*- # Generated by Django 1.11 on 2018-05-03 17:09 from __future__ import unicode_literals import django.contrib.postgres.fields.jsonb from django.db import migrations class Migration(migrations.Migration): dependencies = [ ('pgqueue', '0001_initial'), ] operations = [ migrations.AddField( model_name='job', name='context', field=django.contrib.postgres.fields.jsonb.JSONField(blank=True, default={}), ), ] <file_sep>import time from contextlib import contextmanager from datetime import timedelta from threading import Thread from unittest import mock from django.test import SimpleTestCase from django.utils import timezone from .decorators import retry, repeat from .models import Job from .queue import Queue from .worker import Worker def demo_task(queue, job): return job.kwargs['value'] class QueueTest(SimpleTestCase): allow_database_queries = True def setUp(self): self.db_objects = [] self.queue = Queue( tasks={'demo_task': demo_task}, notify_channel='demo_queue', ) def tearDown(self): for obj in self.db_objects: obj.delete() def test_enqueue(self): 
self.queue.enqueue('demo_task', kwargs={'value': 'val1'}) job = Job.objects.first() self.db_objects.append(job) self.assertEqual(job.task, 'demo_task') self.assertEqual(job.kwargs, {'value': 'val1'}) def test_enqueue_once(self): old_job = Job.objects.create(task='demo_task', kwargs={'value': 'val1'}) job_1 = self.queue.enqueue_once('demo_task', {'value': 'val1'}) self.assertEqual(job_1.id, old_job.id) job_2 = self.queue.enqueue_once('demo_task', {'value': 'val2'}) self.assertNotEqual(job_2.id, old_job.id) def test_dequeue(self): job1 = Job.objects.create(task='demo_task') job2 = Job.objects.create(task='demo_task') self.db_objects += [job1, job2] self.assertEqual(job1.id, self.queue.dequeue().id) self.assertEqual(job2.id, self.queue.dequeue().id) def test_run_job(self): job = Job.objects.create(task='demo_task', kwargs={'value': 'hello'}) self.db_objects.append(job) retval = self.queue.run_job(job) self.assertEqual(retval, 'hello') def test_listen_notify(self): self.queue.listen() job = Job.objects.create(task='demo_task', kwargs={'value': 'hello'}) self.db_objects.append(job) self.queue.notify(job) notifies = self.queue.wait(1) notify = notifies[0] self.assertEqual(notify.channel, 'demo_queue') self.assertEqual(notify.payload, str(job.id)) class WorkerTest(SimpleTestCase): allow_database_queries = True class DemoWorker(Worker): wait_timeout = 1 def __init__(self, queue, *args, **kwargs): super().__init__(*args, **kwargs) self.queue = queue def start(self): super().start() # Close database connection when worker is stopped # to prevent "database is being accessed by other users". from django.db import connection connection.close() def stop(self): self._shutdown = True def bind_signals(self): # Signals don't work if the worker is not running in # the main thread, and there are no signals during # test runs anyway, so let's ignore them. 
pass def setUp(self): self.db_objects = [] def tearDown(self): for obj in self.db_objects: obj.delete() @contextmanager def start_worker(self, queue): worker = self.DemoWorker(queue) thread = Thread(target=worker.start, daemon=True) thread.start() yield worker time.sleep(1) # wait worker to finish the tasks worker.stop() thread.join() def test_enqueue(self): queue = Queue( tasks={ 'task1': mock.Mock(), 'task2': mock.Mock(), }, notify_channel='test_channel', ) with self.start_worker(queue): job1 = queue.enqueue('task1') job2 = queue.enqueue('task2') self.db_objects += [job1, job2] queue.tasks['task1'].assert_called() queue.tasks['task2'].assert_called() class RepeatDecoratorTest(SimpleTestCase): allow_database_queries = True def tearDown(self): Job.objects.all().delete() def test_repeat(self): task = mock.Mock() wrapped_task = repeat(timedelta(minutes=1))(task) queue = Queue(tasks={'task': wrapped_task}, notify_channel='test_channel') job = Job(task='task') # job should not be saved in database to simulate dequeue call wrapped_task(queue, job) new_job = Job.objects.last() delta = new_job.execute_at - timezone.now() self.assertAlmostEqual(delta.total_seconds(), timedelta(minutes=1).total_seconds(), places=1) class RetryDecoratorTest(SimpleTestCase): allow_database_queries = True def tearDown(self): Job.objects.all().delete() def test_retry(self): task = mock.Mock() task.side_effect = Exception() wrapped_task = retry([Exception], delay=timedelta(seconds=1), max_attempts=3)(task) queue = Queue(tasks={'task': task}, notify_channel='test_channel') job = Job.objects.create(task='task') wrapped_task(queue, job) retry_job = Job.objects.last() self.assertEqual(retry_job.task, 'task') self.assertEqual(retry_job.context['retry_attempt'], 1) def test_retry_delay(self): task = mock.Mock() task.side_effect = Exception() wrapped_task = retry([Exception], delay=timedelta(seconds=10), max_attempts=10)(task) queue = Queue(tasks={'task': task}, notify_channel='test_channel') job = 
Job.objects.create(task='task') for n in range(10): wrapped_task(queue, job) job = Job.objects.order_by('created_at').last() delta = job.execute_at - timezone.now() self.assertAlmostEqual(delta.total_seconds(), timedelta(seconds=10).total_seconds(), places=1) jobs = Job.objects.all() self.assertEqual(jobs.count(), 11) def test_retry_max_attempts(self): task = mock.Mock() task.side_effect = Exception() wrapped_task = retry([Exception], delay=timedelta(seconds=1), max_attempts=10)(task) queue = Queue(tasks={'task': task}, notify_channel='test_channel') job = Job.objects.create(task='task') for n in range(10): wrapped_task(queue, job) job = Job.objects.order_by('created_at').last() self.assertEqual(job.context['retry_attempt'], n + 1) with self.assertRaises(Exception): wrapped_task(queue, job) jobs = Job.objects.all() self.assertEqual(jobs.count(), 11) <file_sep>from functools import wraps import copy from django.utils import timezone RETRY_ATTEMPT = 'retry_attempt' def repeat(interval): """ Endlessly repeats a task, every `interval` (a timedelta). @repeat(datetime.timedelta(minutes=5)) def task(queue, job): pass This will run `task` every 5 minutes. It's up to you to kick off the first task, though. """ def wrapper(func): @wraps(func) def decorator(queue, job): queue.enqueue_once(job.task, job.kwargs, execute_at=(timezone.now() + interval)) return func(queue, job) return decorator return wrapper def retry(exceptions, delay, max_attempts): """ Retries a failed task every `delay` (a timedelta). @retry([Exception], delay=timedelta(seconds=30), max_attempts=10) def task(queue, job): pass This will repeat `task` every 30 seconds if it fails with `Exception`, but not more than ten times. When `max_attempts` is reached, the original exception will be raised. 
""" def wrapper(func): @wraps(func) def decorator(queue, job): try: return func(queue, job) except Exception as e: if not isinstance(e, tuple(exceptions)) \ or queue is None: # for testing reasons raise attempt = job.context.get(RETRY_ATTEMPT, 0) + 1 if max_attempts is not None and attempt > max_attempts: raise context = copy.copy(job.context) context[RETRY_ATTEMPT] = attempt execute_at = timezone.now() + delay queue.enqueue(job.task, job.kwargs, context=context, execute_at=execute_at) return decorator return wrapper def job_kwargs(func): @wraps(func) def wrapper(queue, job): return func(**job.kwargs) return wrapper <file_sep>from unittest import mock def call_task(task, kwargs): job = mock.Mock() job.kwargs = kwargs queue = mock.Mock() return task(queue, job) <file_sep>from django.contrib import admin from .models import Job @admin.register(Job) class JobAdmin(admin.ModelAdmin): list_display = ('id', 'task', 'execute_at', 'priority') ordering = ('-priority', 'execute_at')
dd723a51f5f964180b7a014177b8a531d42becc0
[ "Markdown", "Python" ]
12
Python
maxpoletaev/django-pgqueue
1979cf93f44d3a4cd52dbba082e9660a4bfafadd
2ea550fbc75672643ab082729f1340f4fea15803
refs/heads/master
<repo_name>gebiWangshushu/Hei.PrometheusFileBaseServiceDiscover<file_sep>/prometheus-filebase-servicediscover.lua ngx.header.content_type = 'application/json' local targetFilePath = '/home/website/prometheus/targets.yml' --prometheus target config file; needs 777 permissions local json = require('cjson') --common methods local function fileRead(path) local file = io.open(path, 'r') local content = file:read('*a') file:close() return content end local function fileWrite(path, content) local file = io.open(path, 'w+') file:write(content) file:close() end local function success(msg) ngx.say(string.format([[{"code":%d,"message":"%s"}]], 200, msg)) end local function fail(msg) ngx.say(string.format([[{"code":%d,"message":"%s"}]], 300, msg)) end local function tableKeyFind(tbl, key) if tbl == nil then return false end for k, v in pairs(tbl) do if k == key then return true end end return false end local function tableValueFind(tbl, value) if tbl == nil then return false end for k, v in pairs(tbl) do if v == value then return true end end return false end local function appFind(configObj, postConfigObj) if configObj == nil or postConfigObj == nil then print('configObj or postConfigObj can not be nil') return false end for i, v in ipairs(configObj) do if v.labels.app == postConfigObj.labels.app then return i end end return 0 end local function registeConfig(configObj, postConfigObj, targetFile) local index = appFind(configObj, postConfigObj) if index > 0 then for i, target in ipairs(postConfigObj.targets) do if tableValueFind(configObj[index].targets, target) == false then table.insert(configObj[index].targets, target) end end for key, value in pairs(postConfigObj.labels) do if tableKeyFind(configObj[index].labels, key) == false then configObj[index].labels[key] = value end end else local config = {targets = postConfigObj.targets, labels = postConfigObj.labels} table.insert(configObj, config) end local newConfig = json.encode(configObj) fileWrite(targetFile, newConfig) print('config updated successfully', newConfig) return 
true end local function deregisteConfig(configObj, postConfigObj, targetFile) local index = appFind(configObj, postConfigObj) if index > 0 then for i = table.getn(configObj[index].targets), 1, -1 do if tableValueFind(postConfigObj.targets, configObj[index].targets[i]) then table.remove(configObj[index].targets, i) end end local len = table.getn(configObj[index].targets) if len == 0 then table.remove(configObj, index) end local newConfig = json.encode(configObj) fileWrite(targetFile, table.getn(configObj) > 0 and newConfig or '') print('config deregistered successfully', newConfig) return true else print('deregistration failed, config does not exist:', postConfigObj.labels.app) return false end end --common methods end --main chunk --read and validate the posted json config ngx.req.read_body() local params = ngx.req.get_body_data() local postConfigObj = json.decode(params) print('postConfigObj.targets:', type(postConfigObj.targets) == 'userdata') if (postConfigObj.labels == nil) or (type(postConfigObj.labels) == 'userdata') or (type(postConfigObj.targets) == 'userdata') or (postConfigObj.labels.app) == nil then return fail('labels.app and targets must not be empty') end --init target config local configFile = fileRead(targetFilePath) if configFile == nil or configFile == '' then configFile = '[]' end print('previous config', configFile) --update config local configObj = json.decode(configFile) local result = false if postConfigObj.type == 'deregiste' then result = deregisteConfig(configObj, postConfigObj, targetFilePath) if result then return success('Config deregistered successfully') end return fail('Deregistration failed, please check that the app exists') else result = registeConfig(configObj, postConfigObj, targetFilePath) return success('Config registered successfully') end <file_sep>/README.md # Hei.PrometheusFileBaseServiceDiscover A file-based service discovery script for the Prometheus monitoring stack, implemented with OpenResty + Lua. # Background Prometheus can do service discovery through Consul, Kubernetes and similar systems, or through file-based configuration. File-based service discovery normally means editing the target file by hand and letting Prometheus reload it. With many services it is more convenient to modify the file through a REST POST endpoint instead: a program calls the register endpoint on startup and the deregister endpoint on shutdown, which gives a rough version of service discovery. OpenResty + Lua is used for the implementation. # Deployment ## Prometheus configuration ``` scrape_configs: - job_name: 'file_ds' 
file_sd_configs: - refresh_interval: 10s # refresh every 10 seconds files: /etc/prometheus/*.yml ``` ## OpenResty ``` location /prometheus { content_by_lua_file /home/website/prometheus/prometheus-filebase-servicediscover.lua; } error_log /var/log/nginx/prometheus.error.log debug; # enable debug logging when first setting this up ``` # Usage ## Register a service ``` curl --location --request POST 'http://your_ip:port/prometheus' \ --header 'Content-Type: application/json' \ --data-raw ' { "type":"registe", "targets": ["172.16.3.119:91012", "172.16.3.117:91221"], "labels": { "env": "dev", "app": "container3" } }' ``` Response: ``` { "code": 200, "message": "Config registered successfully" } ``` ## Deregister a service ``` curl --location --request POST 'http://your_ip:port/prometheus' \ --header 'Content-Type: application/json' \ --data-raw ' { "type":"deregiste", "targets": ["172.16.3.119:91012", "172.16.3.117:91221"], "labels": { "env": "dev", "app": "container3" } }' ``` Response: ``` { "code": 200, "message": "Config deregistered successfully" } ```
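A client can build the JSON body this endpoint expects before POSTing it with any HTTP library. The helper below is a hypothetical illustration in Python, not part of the repo; note that it keeps the endpoint's own action spellings `registe`/`deregiste`, which are what the Lua script checks for:

```python
import json

def make_registration(app, env, targets, action='registe'):
    """Build the request body for the Lua service-discovery endpoint.

    `action` is 'registe' to add targets or 'deregiste' to remove them,
    matching the spellings the Lua script expects.
    """
    return json.dumps({
        'type': action,
        'targets': list(targets),
        'labels': {'env': env, 'app': app},
    })

body = make_registration('container3', 'dev', ['172.16.3.119:91012'])
print(body)
```

Sending it would then be a plain POST with a `Content-Type: application/json` header, e.g. via `urllib.request` or any HTTP client of your choice.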
2ddb64f81f40d5ba9f273e0cad1370d00a216899
[ "Markdown", "Lua" ]
2
Lua
gebiWangshushu/Hei.PrometheusFileBaseServiceDiscover
18993a59e350cde515da32f79dc44f0a952e40fb
5a8cfe10ed1003341b271662c0ff50124a3954f7
refs/heads/master
<file_sep>import Vue from 'vue' import Router from 'vue-router' import Indexs from '../pages/index' import Content from '../pages/content' import Mine from '../pages/mine' Vue.use(Router) export default new Router({ routes: [ { path: '/', name: 'index', component: Indexs }, { path: '/pages/content', name: 'content', component: Content }, { path: '/pages/mine', name: 'mine', component: Mine } ] })
0e448262500ebd0c73e22744817428e826d4e5dd
[ "JavaScript" ]
1
JavaScript
xintianyou/pro2
a123ea58fc38824ce4c6489d2276d4e3e90a73d4
332425d74c49166c0e8311c6043a8fb97ab304cf
refs/heads/master
<file_sep>import { Injectable } from "@angular/core"; import { HttpClient } from "@angular/common/http"; import { User } from "../model/user.model"; import { Observable } from 'rxjs/Observable'; import 'rxjs/add/observable/throw'; import 'rxjs/add/operator/catch'; import 'rxjs/add/operator/do'; import 'rxjs/add/operator/map'; // Observables and error handling are not used here because the API does not behave as expected @Injectable() export class UserService { API_URL = "https://jsonplaceholder.typicode.com/users/"; constructor(private _http: HttpClient) {} getUsers() { return this._http.get(`${this.API_URL}`); } deleteUser(userID) { return this._http.delete(`${this.API_URL}${userID}`); } createUser(user: User) { return this._http.post(`${this.API_URL}`, user); } updateUser(user: User, userID) { return this._http.put(`${this.API_URL}${userID}`, user); } getuserByID(userID) { return this._http.get(`${this.API_URL}${userID}`); } } <file_sep>import { Pipe, PipeTransform } from "@angular/core"; @Pipe({ name: "userFilter" }) export class UserFilterPipe implements PipeTransform { transform(users, searchTerm: string, filterBy: string) { if (!users || !searchTerm) { return users; } if (filterBy == "company") { return users.filter( user => user.company.name.toLowerCase().indexOf(searchTerm.toLowerCase()) !== -1 ); } else if (filterBy == "email") { return users.filter( user => user.email.toLowerCase().indexOf(searchTerm.toLowerCase()) !== -1 ); } else { return users.filter( user => user.name.toLowerCase().indexOf(searchTerm.toLowerCase()) !== -1 ); } } } <file_sep>import { Component, OnInit } from "@angular/core"; import { ActivatedRoute, Router } from "@angular/router"; import { User } from "../model/user.model"; import { UserService } from "../services/user.service"; @Component({ selector: "app-update-user", templateUrl: "./update-user.component.html", styleUrls: ["./update-user.component.css"] }) export class UpdateUserComponent implements OnInit { userID; // The typed User model was removed at the last minute because something was missed; needs checking user: any = { id: null, name: null, username: null, email: null, address: null, phone: null, website: null, company: null }; constructor( private route: ActivatedRoute, private _userService: UserService, private _routeNavigaor: Router ) { this.userID = route.snapshot.params["id"]; this._userService.getuserByID(this.userID).subscribe((data) => { this.user = data; this.user.companyName = this.user.company.name; this.user.city = this.user.address.city; }); } ngOnInit() {} updateUser(userForm) { if (userForm.valid) { // subscribe so the HttpClient PUT request actually fires this._userService.updateUser(this.user, this.userID).subscribe(); alert('Info Updated, Redirecting to home page') this._routeNavigaor.navigate(['/home']) } } } <file_sep>import { BrowserModule } from "@angular/platform-browser"; import { NgModule } from "@angular/core"; import { AppRoutingModule } from "./app-routing.module"; import { AppComponent } from "./app.component"; import { UserService } from "./services/user.service"; import { DashboardComponent } from "./dashboard/dashboard.component"; import { UpdateUserComponent } from "./update-user/update-user.component"; import { AdduserComponent } from "./adduser/adduser.component"; import { HttpClientModule } from "@angular/common/http"; import { UserListComponent } from './user-list/user-list.component'; import { FormsModule } from '@angular/forms'; import { UserFilterPipe } from './pipes/user-filter.pipe'; @NgModule({ declarations: [ AppComponent, DashboardComponent, UpdateUserComponent, AdduserComponent, UserListComponent, UserFilterPipe ], imports: [BrowserModule, AppRoutingModule, HttpClientModule, FormsModule], providers: [UserService], bootstrap: [AppComponent] }) export class AppModule {} <file_sep>import { Component, OnInit } from "@angular/core"; import { User } from "../model/user.model"; import { UserService } from "../services/user.service"; @Component({ selector: "app-adduser", templateUrl: "./adduser.component.html", styleUrls: 
["./adduser.component.css"] }) export class AdduserComponent implements OnInit { user: User = { id: null, name: null, username: null, email: null, address: null, phone: null, website: null, company: null }; constructor(private _userService: UserService) {} ngOnInit() {} addUser(userForm) { if (userForm.valid) { this._userService.createUser(this.user); } } _fnisActionHidden(){ return true; } } <file_sep>import { Component, OnInit, Input } from "@angular/core"; import { UserService } from "../services/user.service"; import { Router } from "@angular/router"; @Component({ selector: "app-user-list", templateUrl: "./user-list.component.html", styleUrls: ["./user-list.component.css"] }) export class UserListComponent implements OnInit { @Input() hideActions: boolean = false; searchTerm: ''; filterby: string = 'name'; users=[]; valueChanges; constructor(private _userService: UserService, private _router: Router) {} ngOnInit() { this.getUsers(); } getUsers() { this.valueChanges = this._userService .getUsers() .subscribe((data: Array<object>) => { this.users = data; }); } deleteUser(uesrID,index) { this.users.splice(index, 1); // this._userService.deleteUser(uesrID); //Facing issue to get updated array } updateUser(uesrID) { this._router.navigate(["/updateUser", uesrID]); } ngOnDestroy() { this.valueChanges.unsubscribe(); } }
778ed565cb23684de6a8d67b8c0f1f4cc0e7c822
[ "TypeScript" ]
6
TypeScript
swanand6300/CRUD
0829825329021b63458305554d69c298d5d66a32
f3cb8e3c6fbeb9184134195831428338518af49c
refs/heads/master
<repo_name>jcaquino0945/FilRobo<file_sep>/src/app/app-routing.module.ts import { NgModule } from '@angular/core'; import { RouterModule, Routes } from '@angular/router'; import { NavbarComponent } from './navbar/navbar.component'; import { HomeComponent } from './home/home.component'; import { PagenotfoundComponent } from './pagenotfound/pagenotfound.component'; import { AquadroneComponent } from './projects/aquadrone/aquadrone.component'; import { ProjectsComponent } from './projects/projects.component'; import { LisaRobotComponent } from './projects/lisa-robot/lisa-robot.component'; import { RobotractorComponent } from './projects/robotractor/robotractor.component'; import { AboutUsComponent } from './about-us/about-us.component'; import { TutorialsComponent } from './tutorials/tutorials.component'; import { ArduinoComponent } from './tutorials/arduino/arduino.component'; import { HowtoflyComponent } from './tutorials/howtofly/howtofly.component'; import { RobotOSComponent } from './tutorials/robot-os/robot-os.component'; const routes: Routes = [ { path: 'home', component: HomeComponent }, { path: 'projects', component: ProjectsComponent }, { path: 'about-us', component: AboutUsComponent }, { path: 'tutorials', component: TutorialsComponent }, { path: 'projects/aquadrone', component: AquadroneComponent }, { path: 'projects/lisaRobot', component: LisaRobotComponent }, { path: 'projects/robotractor', component: RobotractorComponent }, { path: 'tutorials/arduino', component: ArduinoComponent }, { path: 'tutorials/howtofly', component: HowtoflyComponent }, { path: 'tutorials/robot-os', component: RobotOSComponent }, { path: '', redirectTo: '/home', pathMatch: 'full' }, // redirect to `Home` { path: '**', component: PagenotfoundComponent }, ]; @NgModule({ imports: [RouterModule.forRoot(routes,{ anchorScrolling: 'enabled'})], exports: [RouterModule] }) export class AppRoutingModule { } <file_sep>/src/app/projects/aquadrone/aquadrone.component.spec.ts import { 
ComponentFixture, TestBed } from '@angular/core/testing'; import { AquadroneComponent } from './aquadrone.component'; describe('AquadroneComponent', () => { let component: AquadroneComponent; let fixture: ComponentFixture<AquadroneComponent>; beforeEach(async () => { await TestBed.configureTestingModule({ declarations: [ AquadroneComponent ] }) .compileComponents(); }); beforeEach(() => { fixture = TestBed.createComponent(AquadroneComponent); component = fixture.componentInstance; fixture.detectChanges(); }); it('should create', () => { expect(component).toBeTruthy(); }); }); <file_sep>/src/app/app.module.ts import { NgModule } from '@angular/core'; import { BrowserModule } from '@angular/platform-browser'; import { AppRoutingModule } from './app-routing.module'; import { AppComponent } from './app.component'; import { NavbarComponent } from './navbar/navbar.component'; import { HomeComponent } from './home/home.component'; import { PagenotfoundComponent } from './pagenotfound/pagenotfound.component'; import { AquadroneComponent } from './projects/aquadrone/aquadrone.component'; import { ProjectsComponent } from './projects/projects.component'; import { TutorialsComponent } from './tutorials/tutorials.component'; import { LisaRobotComponent } from './projects/lisa-robot/lisa-robot.component'; import { RobotractorComponent } from './projects/robotractor/robotractor.component'; import { AboutUsComponent } from './about-us/about-us.component'; import { ArduinoComponent } from './tutorials/arduino/arduino.component'; import { HowtoflyComponent } from './tutorials/howtofly/howtofly.component'; import { RobotOSComponent } from './tutorials/robot-os/robot-os.component'; @NgModule({ declarations: [ AppComponent, NavbarComponent, HomeComponent, PagenotfoundComponent, AquadroneComponent, ProjectsComponent, TutorialsComponent, LisaRobotComponent, RobotractorComponent, AboutUsComponent, ArduinoComponent, HowtoflyComponent, RobotOSComponent ], imports: [ BrowserModule, 
AppRoutingModule, ], providers: [], bootstrap: [AppComponent] }) export class AppModule { }
90b962c08f0df1d3b4e13afe705cae1bcd1056ce
[ "TypeScript" ]
3
TypeScript
jcaquino0945/FilRobo
283220854b6f269e556a42940b29b36086ee0880
e3486f1c0e725a2b673736cd2f49fdaf1df7f651
refs/heads/main
<file_sep>package com.srinivart.model; import lombok.AllArgsConstructor; import lombok.Data; import lombok.NoArgsConstructor; import org.springframework.data.cassandra.core.mapping.PrimaryKey; import org.springframework.data.cassandra.core.mapping.Table; @Table @Data @NoArgsConstructor @AllArgsConstructor public class Product { @PrimaryKey private int id; private String name; public String getName() { return name; } public void setName(String name) { this.name = name; } } <file_sep>package com.srinivart.controller; import com.srinivart.ResouceNotFoundException; import com.srinivart.model.Product; import com.srinivart.repository.ProductRepository; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.http.ResponseEntity; import org.springframework.web.bind.annotation.*; import java.util.List; @RestController @RequestMapping("/api") public class ProductController { @Autowired ProductRepository productRepository; @PostMapping("/products") public Product addProduct(@RequestBody Product product){ productRepository.save(product); return product; } @GetMapping("/products/{id}") public ResponseEntity<Product> findById(@PathVariable("id") Integer productId){ Product product=productRepository.findById(productId).orElseThrow( () -> new ResouceNotFoundException("Product not found" + productId)); return ResponseEntity.ok().body(product); } @GetMapping("/products") public List<Product> getProducts(){ return productRepository.findAll(); } @PutMapping("products/{id}") public ResponseEntity<Product> updateProduct(@PathVariable(value = "id") Integer productId, @RequestBody Product productDetails) { Product product = productRepository.findById(productId) .orElseThrow(() -> new ResouceNotFoundException("Product not found for this id :: " + productId)); product.setName(productDetails.getName()); final Product updatedProduct = productRepository.save(product); return ResponseEntity.ok(updatedProduct); } @DeleteMapping("products/{id}") public 
ResponseEntity<Void> deleteProduct(@PathVariable(value = "id") Integer productId) { Product product = productRepository.findById(productId).orElseThrow( () -> new ResouceNotFoundException("Product not found::: " + productId)); productRepository.delete(product); return ResponseEntity.ok().build(); } }
797301fb1cbc5c225ea413e6dd08911206cdf7ae
[ "Java" ]
2
Java
srinivart/Springboot-cassandra
5c63d1f713bf35ccd8da431eaaeaefb942e968fb
5434dee20bc14e84d0846aa3fc5fddc767fc466a
refs/heads/main
<repo_name>mahabubdev/myuid<file_sep>/README.md # myUID - Simple UID generator Just install it and use it to generate simple UIDs. # Installation `npm install myuid --save` <br> or <br> `yarn add myuid` Then... ``` const myuid = require('myuid'); myuid(); // returns the generated id ``` in ES6: ``` import myuid from 'myuid'; myuid(); // returns the generated id ``` # Options ``` /*----------------------------------------* * by default the base number is 16. * supports only 2->32 as the base number. *-----------------------------------------*/ myuid(6); // computing with base number 6 myuid(8); // computing with base number 8 myuid(12); // computing with base number 12 ``` Sample output: ``` // in base 16 (by default) 26893d11f7d624 c8bf02c8a2088 b3b27c18df4a3 // in base 32 ct59nvd0sco 710cfjcob40 // in base 24 7bkacdi4dek8 a8h1ida53d98 // in base 6 155442533105314350430 222504154304155553310 // in base 8 222504154304155553310 222504154304155553310 // in base 12 b114b296405a840 5a7a2b8a1242b32 ``` <file_sep>/index.js function myUID (baseNum = 16) { /**--------------------------------------* * baseNum is only ranged for 2 -> 32 * baseNum can accept only Integer value *---------------------------------------*/ var timeNum = new Date().getTime(); // Math.random takes no arguments; it returns a float in [0, 1) var randNum = Math.floor( Math.random() * 10000 ); // prepare unique number var uNum = timeNum * randNum || timeNum; // generate the id key var idKey = uNum.toString(baseNum); // return the result return idKey; } module.exports = myUID;
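The generator multiplies the millisecond timestamp by a random factor and renders the product in the requested base. The same idea, sketched in Python purely for illustration (the `to_base` helper and the names here are not part of the package):

```python
import random
import time

DIGITS = '0123456789abcdefghijklmnopqrstuvwxyz'

def to_base(n, base):
    """Render a non-negative integer in the given base (2..36)."""
    if n == 0:
        return '0'
    out = []
    while n:
        n, r = divmod(n, base)
        out.append(DIGITS[r])
    return ''.join(reversed(out))

def myuid(base=16):
    # mirror index.js: milliseconds since epoch times a random factor
    ms = int(time.time() * 1000)
    rand = random.randint(1, 9999)
    return to_base(ms * rand, base)

print(myuid())    # value varies per call
print(myuid(32))
```

As with the JS version, uniqueness is only probabilistic: two calls in the same millisecond can collide if they draw the same random factor.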
8bacecd5e73581b93f32d69092b09ac3ca0f6117
[ "Markdown", "JavaScript" ]
2
Markdown
mahabubdev/myuid
90e84ade59ad36a7ae0514d974b04f7ec24d10bb
6df1443192971c15a8ea3e6a7b44525060564b97
refs/heads/master
<repo_name>davidgabriel42/CS691_PA2<file_sep>/q3/q3.py import pandas as pd import numpy as np import re import findspark findspark.init('/home/dave/spark-2.4.1-bin-hadoop2.7/') import pyspark from pyspark.sql.functions import * from pyspark.sql.types import * #import regexp_extract, col, lit from pyspark.sql import * import sys import os fn1 = sys.argv[1] if os.path.exists(fn1): tweetsFile = fn1 fn2 = sys.argv[2] if os.path.exists(fn2): usersFile = fn2 spark = SparkSession.builder.appName("Q1").getOrCreate() tweets = spark.read.text(tweetsFile) users = spark.read.text(usersFile) #split DF into Cols: uid, tid, tweet split_col = pyspark.sql.functions.split(tweets['value'], '\t') tweets = tweets.withColumn('uid', split_col.getItem(0)) tweets = tweets.withColumn('tid', split_col.getItem(1)) tweets = tweets.withColumn('tweet', split_col.getItem(2)) split_col = pyspark.sql.functions.split(tweets['value'], '(\t)(?:20)') tweets = tweets.withColumn('date', split_col.getItem(1)) tweets = tweets.select('uid', 'tid', 'tweet', 'date') #tweets.show() #process users to DF split_users = pyspark.sql.functions.split(users['value'], '\t') users = users.withColumn('uid', split_users.getItem(0)) users = users.withColumn('loc', split_users.getItem(1)) users = users.select('uid','loc') LAPattern = '(Los\sAngeles)' #extract mentions and create new DF users = users.filter(users.loc.rlike(LAPattern)) #users.printSchema() #users.show(n=20) #09/16/2009 - 09/20/2009 extract tweets from these dates datePattern = '(09-09-)(16|17|18|19|20)' tweets = tweets.filter(tweets.date.rlike(datePattern)) #tweets.show() #join on user id users = users.select(col('uid').alias('uid1'), col('loc')) join = tweets.join(users, tweets.uid == users.uid1) #join.show() LA = join.groupBy("uid").count().sort(desc("count")) #redirect stdio and print result to file sys.stdout = open('most_tweeted_users.txt', 'w') print("Q3: 10 most tweeted users by uid from LA 9/16-9/20/2009") LA.show(n=10) spark.stop() 
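The two filters q3.py applies (Los Angeles users, and tweets dated 9/16-9/20 once the split on `(\t)(?:20)` has stripped the leading "20" from the year) can be sanity-checked with plain `re` before a Spark run. A small illustrative sketch, not part of the repo:

```python
import re

# the same patterns q3.py passes to rlike()
LA_PATTERN = r'(Los\sAngeles)'
DATE_PATTERN = r'(09-09-)(16|17|18|19|20)'

rows = [
    ('Los Angeles, CA', '09-09-17 21:43:32'),
    ('New York, NY', '09-09-17 09:00:00'),
    ('Los Angeles', '09-09-25 12:00:00'),
]

# keep rows that pass both the location filter and the date-range filter
kept = [row for row in rows
        if re.search(LA_PATTERN, row[0]) and re.search(DATE_PATTERN, row[1])]
print(kept)  # only the first row passes both filters
```

Checking patterns locally like this is much faster than re-running the Spark job against the full tweet corpus.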
<file_sep>/readme.txt
CS691 PROJECT 2
==========================================================
Set up:
I followed this to set up Jupyter, which also set up PySpark (included). I think Jupyter is an excellent tool for working out problems once you get used to it. I highly recommend it.
https://blog.sicara.com/get-started-pyspark-jupyter-guide-tutorial-ae2fe84f594f
I did not have to do anything else to run the scripts through the Python interpreter.
==========================================================
How to run:
All .py files can be run from the Python interpreter as follows:
$python q1.py /[PATHTOFILE]/training_set_tweets.txt /[PATHTOFILE]/training_set_users.txt
==========================================================
Versions used to build:
#python
$python --version
Python 3.6.5 :: Anaconda, Inc.
#spark
Spark version 2.4.1, using Scala version 2.11.12 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_201)
#java (does it matter?
probably) $ java -version java version "1.8.0_201" Java(TM) SE Runtime Environment (build 1.8.0_201-b09) Java HotSpot(TM) 64-Bit Server VM (build 25.201-b09, mixed mode) <file_sep>/q2/q2.py import pandas as pd import numpy as np import re import findspark findspark.init('/home/dave/spark-2.4.1-bin-hadoop2.7/') import pyspark from pyspark.sql.functions import * from pyspark.sql.types import * #import regexp_extract, col, lit from pyspark.sql import * import sys import os fn1 = sys.argv[1] if os.path.exists(fn1): tweetsFile = fn1 fn2 = sys.argv[2] if os.path.exists(fn2): usersFile = fn2 spark = SparkSession.builder.appName("q2").getOrCreate() tweets = spark.read.text(tweetsFile) users = spark.read.text(usersFile) #split DF into Cols: uid, tid, tweet split_col = pyspark.sql.functions.split(tweets['value'], '\t') tweets = tweets.withColumn('uid', split_col.getItem(0)) tweets = tweets.withColumn('tid', split_col.getItem(1)) tweets = tweets.withColumn('tweet', split_col.getItem(2)) split_col = pyspark.sql.functions.split(tweets['value'], '(\t)(?:20)') tweets = tweets.withColumn('date', split_col.getItem(1)) tweets = tweets.select('uid', 'tid', 'tweet', 'date') #tweets.show() #regex pattern for retweets RTPattern = '(?:RT\s)(@\w+)' #extract retweets and create new DF retweets = tweets.withColumn('Retweets', regexp_extract(col('tweet'), RTPattern, 1)) retweets = retweets.groupBy("Retweets").count().sort(desc("count")) #clear null tuples retweets = retweets.filter(retweets.Retweets.rlike('@')) #redirect stdio and print result to file sys.stdout = open('most_retweeted_users.txt', 'w') print("Q2: 10 most retweeted users") retweets.show(n=10) spark.stop() <file_sep>/q1/q1.py import pandas as pd import numpy as np import re import findspark findspark.init('/home/dave/spark-2.4.1-bin-hadoop2.7/') import pyspark from pyspark.sql.functions import * from pyspark.sql.types import * #import regexp_extract, col, lit from pyspark.sql import * import sys import os fn1 = sys.argv[1] if 
os.path.exists(fn1): tweetsFile = fn1 fn2 = sys.argv[2] if os.path.exists(fn2): usersFile = fn2 spark = SparkSession.builder.appName("q1").getOrCreate() tweets = spark.read.text(tweetsFile) users = spark.read.text(usersFile) #split DF into Cols: uid, tid, tweet split_col = pyspark.sql.functions.split(tweets['value'], '\t') tweets = tweets.withColumn('uid', split_col.getItem(0)) tweets = tweets.withColumn('tid', split_col.getItem(1)) tweets = tweets.withColumn('tweet', split_col.getItem(2)) split_col = pyspark.sql.functions.split(tweets['value'], '(\t)(?:20)') tweets = tweets.withColumn('date', split_col.getItem(1)) tweets = tweets.select('uid', 'tid', 'tweet', 'date') #tweets.show() #regex pattern for mentions mentionPattern = '(?<!RT\s)(@\w+)' #extract mentions and create new DF mentions = tweets.withColumn('Mentions', regexp_extract(col('tweet'), mentionPattern, 1)) mentions = mentions.groupBy("Mentions").count().sort(desc("count")) #clear null tuples mentions = mentions.filter(mentions.Mentions.rlike('@')) #redirect stdout and print results sys.stdout = open('popular_mentions.txt', 'w') print("Q1: 20 most mentions") mentions.show(n=20) spark.stop()
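q1.py and q2.py differ only in the regex handed to `regexp_extract`: q1's pattern uses a negative lookbehind to exclude the `RT @user` prefix when counting mentions, while q2's pattern requires that prefix when counting retweets. The behaviour can be checked with plain `re` (sample tweet texts are invented for illustration):

```python
import re

mention_pattern = r'(?<!RT\s)(@\w+)'  # @user NOT preceded by "RT "
retweet_pattern = r'(?:RT\s)(@\w+)'   # @user inside an "RT @user" prefix

m = re.search(mention_pattern, 'nice talk @alice, thanks!')
assert m.group(1) == '@alice'

# A retweet is not counted as a mention, and vice versa.
assert re.search(mention_pattern, 'RT @bob check this out') is None
assert re.search(retweet_pattern, 'RT @bob check this out').group(1) == '@bob'
assert re.search(retweet_pattern, 'nice talk @alice, thanks!') is None
```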
73ac771a5ca51dfc48312f0f1a58952edd8f458d
[ "Python", "Text" ]
4
Python
davidgabriel42/CS691_PA2
55802eca38a54be804db4e3a3fa3491343f278bc
94ca0a1d4ec104da75482a04957e57085746ced3
refs/heads/master
<file_sep>from anytree import NodeMixin, RenderTree from itertools import combinations_with_replacement from collections import OrderedDict import numpy as np from math import sqrt, log class Statistics: """Maintains the sufficient statistics for a cluster""" def __init__(self, ts_quantity): self.reset_sufficient_statistics(ts_quantity) self.cluster_diameter = None self.dist_dict_coef = {} self.hoeffding_bound = None def reset_sufficient_statistics(self, ts_quantity): self.sum_dict = {} # key: cluster number self.prd_dict = {} # key: tuple of two cluster numbers self.corr_dict = {} # key: tuple of two cluster numbers self.rnomc_dict = {} # key: tuple of two cluster numbers self.n_of_instances = 0 for i in range(ts_quantity): self.sum_dict[i] = 0. for j in range(ts_quantity): if j >= i: self.prd_dict[(i,j)] = 0. if j > i: self.corr_dict[(i,j)] = 0. self.rnomc_dict[(i,j)] = 0. def print(self): print("# n_of_instances = {}".format(self.n_of_instances)) class Cluster: """Maintains the state of a cluster""" def __init__(self, confidence_level = 0.9, n_min = 5, tau = 0.1): self.active_cluster = True self.statistics = None self.confidence_level = confidence_level self.n_min = n_min self.tau = tau def set_cluster_timeseries(self,list_ts): self.list_of_timeseries = OrderedDict(sorted(list_ts.items(), \ key=lambda t: t[0])) cluster_size = len(self.list_of_timeseries) self.statistics = Statistics(cluster_size) self.update_statistics(init=True) # only calculates matrices def get_cluster_timeseries(self): return self.list_of_timeseries def list_timeseries_names(self): return list(self.list_of_timeseries.keys()) def calcula_sum_dict(self,init=False): for k in self.statistics.sum_dict: self.statistics.sum_dict[k] = self.statistics.sum_dict[k] \ + list(self.list_of_timeseries.values())[k].current_value return self.statistics.sum_dict def calcula_prod_dict(self,init=False): for k in self.statistics.prd_dict: self.statistics.prd_dict[k] = self.statistics.prd_dict[k] \ + ( 
list(self.list_of_timeseries.values())[k[0]].current_value \ * list(self.list_of_timeseries.values())[k[1]].current_value ) return self.statistics.prd_dict def calcula_corr_dict(self,init=False): for k in self.statistics.corr_dict: i = k[0] j = k[1] p = self.statistics.prd_dict[(i,j)] a = self.statistics.sum_dict[i] a2 = self.statistics.prd_dict[(i,i)] b = self.statistics.sum_dict[j] b2 = self.statistics.prd_dict[(j,j)] n = self.statistics.n_of_instances term_p = p - ((a*b)/n) term_a = sqrt(a2 - ((a*a)/n)) term_b = sqrt(b2 - ((b*b)/n)) self.statistics.corr_dict[(i,j)] = term_p/(term_a*term_b) return self.statistics.corr_dict def calcula_rnomc_dict(self,init=False): max_rnomc = None for k in self.statistics.rnomc_dict: self.statistics.rnomc_dict[k] = \ sqrt( (1-self.statistics.corr_dict[k]) / 2 ) if max_rnomc is None or self.statistics.rnomc_dict[k] > max_rnomc: max_rnomc = self.statistics.rnomc_dict[k] self.cluster_diameter = max_rnomc return self.statistics.rnomc_dict def calcula_distances_coefficients(self): """Calc dist pars needed in 3.4.1 and 3.4.3 of the paper""" if len(self.statistics.rnomc_dict) == 0: return rnorm_copy = self.statistics.rnomc_dict.copy() self.statistics.dist_dict_coef['avg'] = sum(rnorm_copy.values()) / len(rnorm_copy) d0_pair = min(rnorm_copy, key=rnorm_copy.get) d0 = rnorm_copy[min(rnorm_copy, key=rnorm_copy.get)] self.statistics.dist_dict_coef['d0_val'] = d0 self.statistics.dist_dict_coef['d0_pair'] = d0_pair d1_pair = max(rnorm_copy, key=rnorm_copy.get) d1 = rnorm_copy[max(rnorm_copy, key=rnorm_copy.get)] self.statistics.dist_dict_coef['d1_val'] = d1 self.statistics.dist_dict_coef['d1_pair'] = d1_pair rnorm_copy.pop(d1_pair, None) if (rnorm_copy): # in case rnorm dict has only one or 2 elements. in that case # we are calculating this but we are not allowed to split. 
d2_pair = max(rnorm_copy, key=rnorm_copy.get) d2 = rnorm_copy[max(rnorm_copy, key=rnorm_copy.get)] self.statistics.dist_dict_coef['d2_val'] = d2 self.statistics.dist_dict_coef['d2_pair'] = d2_pair self.statistics.dist_dict_coef['delta'] = d1-d2 else: self.statistics.dist_dict_coef['d2_val'] = None self.statistics.dist_dict_coef['d2_pair'] = None self.statistics.dist_dict_coef['delta'] = None def calcula_hoeffding_bound(self,init=False): """Calc hoeffding bound (epsilon) as proposed in 3.4.1""" r_sqrd = 1 # because the data is normalized self.statistics.hoeffding_bound = \ sqrt(r_sqrd * log(1/self.confidence_level) \ / (2 * self.statistics.n_of_instances)) return self.statistics.hoeffding_bound def update_statistics(self , init=False): if init == False: self.get_new_timeseries_values() self.statistics.n_of_instances += 1 self.calcula_sum_dict() self.calcula_prod_dict() if self.statistics.n_of_instances >= self.n_min: self.calcula_corr_dict() self.calcula_rnomc_dict() self.calcula_hoeffding_bound() self.calcula_distances_coefficients() def get_new_timeseries_values(self): for ts in self.list_of_timeseries.values(): ts.next_val() def get_smaller_distance_with_pivot(self, pivot_1, pivot_2, current): """Look for the distance in rnomc_dict By using 2 pivots and an index """ ## Following the rule that the dict key present a tuple(x,y) where x < y ## here the method max and min are used to find the correct poition dist_1 = self.statistics.rnomc_dict[(min(pivot_1,current), \ max(pivot_1,current))] dist_2 = self.statistics.rnomc_dict[(min(pivot_2,current), \ max(pivot_2,current))] return 2 if dist_1 >= dist_2 else 1 def split_this_cluster(self, pivot_1, pivot_2): pivot_1_list = {} temp_1 = list(self.list_of_timeseries.items())[pivot_1] pivot_1_list[temp_1[0]] = temp_1[1] pivot_2_list = {} temp_2 = list(self.list_of_timeseries.items())[pivot_2] pivot_2_list[temp_2[0]] = temp_2[1] for i in range(len(self.list_of_timeseries.values())): if (i != pivot_1) & (i != pivot_2): 
cluster = self.get_smaller_distance_with_pivot(pivot_1,pivot_2,i) if cluster == 1: temp_1 = list(self.list_of_timeseries.items())[i] pivot_1_list[temp_1[0]] = temp_1[1] else: #2 temp_2 = list(self.list_of_timeseries.items())[i] pivot_2_list[temp_2[0]] = temp_2[1] ### After creating 2 lists based on 2 pivots, it's time ### to creates the new nodes and set self as the parent node if(self.name == 'root_node'): new_name = "1" else: new_name = str(int(self.name[8:]) + 1) cluster_child_1 = Node_of_tree('CH1_LVL_'+new_name, parent=self) cluster_child_1.set_cluster_timeseries(pivot_1_list) cluster_child_2 = Node_of_tree('CH2_LVL_'+new_name, parent=self) cluster_child_2.set_cluster_timeseries(pivot_2_list) ### Finally, the current cluster is deactivated. self.active_cluster = False def aggregate_this_cluster(self): self.statistics.reset_sufficient_statistics(len(self.list_of_timeseries)) self.children = [] self.active_cluster = True def test_split(self): """Test if splitting is necessary and perform split""" if (self.statistics.n_of_instances >= self.n_min and self.statistics.dist_dict_coef.get('d2_val') is not None): d0 = float(self.statistics.dist_dict_coef['d0_val']) d1 = float(self.statistics.dist_dict_coef['d1_val']) d2 = float(self.statistics.dist_dict_coef['d2_val']) avg = float(self.statistics.dist_dict_coef['avg']) t = float(self.tau) e = float(self.statistics.hoeffding_bound) ### following 3.4.4 Split Algorithm if ( (d1 - d2) > e ) | ( t > e ) : if ( (d1 - d0) * abs( (d1 - avg) - (avg - d0)) ) > e: x1 = self.statistics.dist_dict_coef['d1_pair'][0] y1 = self.statistics.dist_dict_coef['d1_pair'][1] print("#################") print("##### SPLIT #####") print("#################") print(" >>> PIVOTS: x1={} y1={}".format(x1,y1)) print("#################") self.split_this_cluster(pivot_1 = x1, pivot_2 = y1) return True return False def test_aggregate(self): """Test if aggregating is necessary and perform aggregation""" if (self.parent is not None and 
self.statistics.n_of_instances >= self.n_min and self.statistics.dist_dict_coef.get('d1_val') is not None): if (self.statistics.dist_dict_coef['d1_val'] - self.parent.statistics.dist_dict_coef['d1_val'] > max(self.statistics.hoeffding_bound, self.parent.statistics.hoeffding_bound)): print("#################") print("##### AGGR. #####") print("#################") print(">>> VARIABLES: c_k={} c_j={} e_k={} e_j={}".format( \ self.statistics.dist_dict_coef['d1_val'], \ self.parent.statistics.dist_dict_coef['d1_val'], \ self.statistics.hoeffding_bound, \ self.parent.statistics.hoeffding_bound)) print("#################") self.parent.aggregate_this_cluster() return True return False class Node_of_tree(Cluster, NodeMixin): """Extension to class Cluster to use tree structure""" def __init__(self, name, parent=None, children=None): super(Node_of_tree, self).__init__() self.name = name self.parent = parent if children: self.children = children def print(self): for pre, fill, node in RenderTree(self): print("%s%s %s %s" % ( \ pre, \ node.name, \ node.statistics.dist_dict_coef.get('d1_val'), \ node.list_timeseries_names() if node.active_cluster \ else " [NOT ACTIVE]" )) <file_sep>## ODAC - Online Divisive Agglomerative Clustering ### Requirements - Python3.5 - Virtual Env (venv) ### Execution 1 - Clone the repository 2 - Create the environment: ``` python3 -m venv env ``` 3 - Source into the environment: ``` source env/bin/activate ``` 4 - Install dependencies: ``` pip install --upgrade pip pip install -r requirements.txt ``` 5 - Execute: ```python python main.py ``` ### Reference ODAC algorithm as described in: - Rodrigues, <NAME>, <NAME>, and <NAME>. "Hierarchical clustering of time-series data streams." IEEE transactions on knowledge and data engineering 20.5 (2008): 615-627. 
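The split/aggregate tests in tree.py hinge on the Hoeffding bound epsilon = sqrt(R^2 * ln(1/delta) / (2n)) from Section 3.4.1 of the referenced paper. Note that `calcula_hoeffding_bound` plugs `confidence_level` (0.9) directly into the 1/delta term, whereas in the paper delta is a failure probability; whether that is intentional is unclear from the code. A standalone sketch of the formula (the function name and sample values below are mine, not the repo's API):

```python
from math import log, sqrt

def hoeffding_bound(n, delta, r_squared=1.0):
    """epsilon = sqrt(R^2 * ln(1/delta) / (2n)); R^2 = 1 for normalized data."""
    return sqrt(r_squared * log(1.0 / delta) / (2.0 * n))

eps = hoeffding_bound(n=1000, delta=0.1)
# epsilon shrinks as n grows, so split/aggregate decisions firm up over time
assert abs(eps - 0.0339) < 1e-3
assert abs(hoeffding_bound(4000, 0.1) - eps / 2) < 1e-12  # halves when n quadruples
```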
<file_sep>df <- data.frame( read.csv("S0.csv"), read.csv("S1.csv"), read.csv("S2.csv"), read.csv("S3.csv"), read.csv("S4.csv"), read.csv("S5.csv"), read.csv("S6.csv"), read.csv("S7.csv") ) names(df) <- c("S0","S1","S2","S3","S4","S5","S6","S7") plot.ts(df, nc=1) abline(v = 1000, col="red") abline(v = 2000, col="red") <file_sep>from skmultiflow.data.regression_generator import RegressionGenerator import numpy as np class SinGenerator: """Generator for a sine wave, with 10% random noise""" def __init__(self, start = 0, inc = 0.1, noise = 0.02): self.state = start self.inc = inc self.noise = noise def next_val(self): self.state = self.state + self.inc return np.sin(self.state) + np.random.randn()*self.noise class Timeseries: """A single timeseries, with an associated name and generator""" def __init__(self, name, generator): self.name = name self.current_value = None self.generator = generator #self.file = open(self.name + ".csv","w") self.next_val() def next_val(self): self.current_value = self.generator.next_val() #self.file.write("{}\r\n".format(self.current_value)) return self.current_value <file_sep>"""Demonstration for clustering time series via ODAC algorithm""" import skmultiflow as sk from timeseries import Timeseries, SinGenerator from tree import Node_of_tree from anytree.search import findall from anytree import RenderTree import numpy as np import math print("#################") print("##### INIT. 
#####") print("#################") # generate some timeseries # S0-S1 with sine waves # S2-S7 with sine waves, moved by 180° np.random.seed(seed=42) series = {} series['S0'] = Timeseries('S0', SinGenerator()) series['S1'] = Timeseries('S1', SinGenerator()) series['S2'] = Timeseries('S2', SinGenerator(start=math.pi)) series['S3'] = Timeseries('S3', SinGenerator(start=math.pi)) series['S4'] = Timeseries('S4', SinGenerator(start=math.pi)) series['S5'] = Timeseries('S5', SinGenerator(start=math.pi)) series['S6'] = Timeseries('S6', SinGenerator(start=math.pi)) series['S7'] = Timeseries('S7', SinGenerator(start=math.pi)) # Initialize root node of the tree (the initial cluster) and set # timeseries to initial cluster root_node = Node_of_tree('root_node') root_node.set_cluster_timeseries(series) # initial run: let the tree grow to cluster the timeseries # for each active cluster - get next value for each series # inside the cluster and calculate and update statistics for i in range(1000): for active_cluster in \ findall(root_node,filter_=lambda node: node.active_cluster is True): active_cluster.update_statistics() if active_cluster.test_split() or active_cluster.test_aggregate(): print("tree at observation #{}".format(i)) root_node.print() # we can observe, that the algorithm correctly clusters the time series # one cluster with S0-S1 # one cluster with S2-S7 print('#################') root_node.print() # Now, let's introduce a concept drift, that the algorithm has to adjust print("#################") print("##### DRIFT 1 ###") print("#################") # We now change the type of S6-S7 to be like S0-S1: series['S6'].generator = \ SinGenerator(start=series['S6'].generator.state + math.pi) series['S7'].generator = \ SinGenerator(start=series['S7'].generator.state + math.pi) for i in range(1000, 2000): for active_cluster in \ findall(root_node,filter_=lambda node: node.active_cluster is True): active_cluster.update_statistics() if active_cluster.test_split() or 
active_cluster.test_aggregate(): print("tree at observation #{}".format(i)) root_node.print() # The algorithm split cluster S2-S7 into two clusters # one with S2-S5 # one with S6-S7 # We know, that S6-S7 are now of the same generating structure as S0-S1 # still, we are missing one criteria for aggregation: # the cluster size of the child cluster has to grow bigger than the # parent cluster print('#################') root_node.print() # Let's now change S6-S7 back to it's original generating function # that S2-S7 would be of the same shape # but in order for the aggregation function to apply, we need to # increase the cluster sizes # that it is bigger than their parent clusters size + hoeffding bound print("#################") print("##### DRIFT 2 ###") print("#################") # We do this by adding some dissimilarity within the clusters series['S2'].generator = SinGenerator(start=series['S2'].generator.state - 0.4) series['S3'].generator = SinGenerator(start=series['S3'].generator.state - 0.2) series['S4'].generator = SinGenerator(start=series['S4'].generator.state + 0) series['S5'].generator = SinGenerator(start=series['S5'].generator.state + 0.2) series['S6'].generator = SinGenerator(start=series['S6'].generator.state \ - math.pi + 0.2) series['S7'].generator = SinGenerator(start=series['S7'].generator.state \ - math.pi - 0.3) for i in range(2000,3000): for active_cluster in \ findall(root_node,filter_=lambda node: node.active_cluster is True): active_cluster.update_statistics() if active_cluster.test_split() or active_cluster.test_aggregate(): print("tree at observation #{}".format(i)) root_node.print() # We can see, that the cluster readjusts to model the right structure # one cluster with S0-S1 # one cluster with S2-S7 print('#################') root_node.print()
0d731076fe82cd23e0a713c85e97e877b6f9fe0a
[ "Markdown", "Python", "R" ]
5
Python
rodrigoejcm/odac
e1143b844d1515a19dfea037bb4717f6118fc78f
22cc4c7c0ee744f51387b14e469ff11321091f84
refs/heads/master
<file_sep># SkyHawkModel A4-D Skykhawk model. Used to simulate the vehicle's response to different flight conditions. ## Description of directories - **Main** is the home .py file where the user interacts with the program. - **Classes** contain .py files that define Mission statements, Vehicle definitions, and Calculation. - **Data** contains trajectory information as well as the physical attributes of different aircrafts. Contains an output file that writes simulation data to excel. <file_sep>class Vehicle: num_of_Vehicles = 0 def __init__(self,name,excel): Vehicle.num_of_Vehicles += 1 Data = excel.parse("Attributes") self.name = name self.number = Vehicle.num_of_Vehicles self.Data = Data def get_name(self): return '{}'.format(self.name) def vehicle_geometry(self): return float(self.Data.wingarea),float(self.Data.span) def vehicle_mass(self): return float(self.Data.mass) def momentCoeff(self): Ixx = float(self.Data.Ixx) Izz = float(self.Data.Izz) Iyy = float(self.Data.Iyy) Ixz = float(self.Data.Ixz) # Moment Coefficient Calculations (only calculating moments for hw3 scenario) c0 = 1.0/(Ixx*Izz - Ixz**2) c3 = c0*Izz c4 = c0*Ixz c10 = c0*Ixx c = [c0,c3,c4,c10] return c def aero_coefficients(self): coeff = [float(self.Data.cyb),float(self.Data.cydr),float(self.Data.clb),float(self.Data.clp),float(self.Data.clr),\ float(self.Data.cldr),float(self.Data.cnb),float(self.Data.cnp),float(self.Data.cnr),float(self.Data.cndr)] return coeff <file_sep># Skyhawk 4D model import numpy as np import matplotlib.pyplot as plt import pandas as pd class Vehicle: num_of_Vehicles = 0 def __init__(self,name,excel): Vehicle.num_of_Vehicles += 1 Data = excel.parse("Attributes") self.name = name self.number = Vehicle.num_of_Vehicles self.Data = Data def get_name(self): return '{}'.format(self.name) def vehicle_geometry(self): return float(self.Data.wingarea),float(self.Data.span) def vehicle_mass(self): return float(self.Data.mass) def momentCoeff(self): Ixx = float(self.Data.Ixx) Izz = 
float(self.Data.Izz) Iyy = float(self.Data.Iyy) Ixz = float(self.Data.Ixz) # Moment Coefficient Calculations (only calculating moments for hw3 scenario) c0 = (Ixx*Izz - Ixz**2)**-1 c3 = c0*Izz c4 = c0*Ixz c10 = c0*Ixx c = [c0,c3,c4,c10] return c def aero_coefficients(self): coeff = [float(self.Data.cyb),float(self.Data.cydr),float(self.Data.clb),float(self.Data.clp),float(self.Data.clr),\ float(self.Data.cldr),float(self.Data.cnb),float(self.Data.cnp),float(self.Data.cnr),float(self.Data.cndr)] return coeff class Mission(): # default attributes beta = 0 p = 0 r = 0 phi = 0 psi = 0 r_def = 0 rudder_in = 0 velocity = 0 altitude = 0 density = 0 qbar = 0 name = ' ' # note: need to calc density based on alt def __init__(self,name): self.name = name @staticmethod def get_name(self): return self.name # Sets custom initial conditions @classmethod def set_init_cond(cls,velocity,altitude,density): cls.velocity = velocity cls.altitude = altitude cls.density = density cls.qbar = 0.5*density*velocity**2 def return_velocity(self): return self.velocity @classmethod def apply_sensor_data(cls,rudder_data): Data = rudder_data.parse("Sheet1") time = Data.time_s.tolist() rudder_deg = Data.rudder_deg.tolist() yaw_rate = Data.yaw_rate_deg_s.tolist() cls.time = time cls.rudder_deg = rudder_deg cls.yaw_rate = yaw_rate @classmethod def return_sensor_data(cls): return cls.time,cls.rudder_deg,cls.yaw_rate # updates vehicle position, to use during calculation @classmethod def update_state(cls,state): cls.beta = state[0] cls.p = state[1] cls.r = state[2] cls.phi = state[3] cls.psi = state[4] # Provides orientation data @classmethod def state(cls): state = [cls.beta,cls.p,cls.r,cls.phi,cls.psi] return state # Run Simulation # can run different planes @classmethod def simulate(cls,vehicle,rudder_in): cls.apply_sensor_data(rudder_in) #state = cls.state() v = cls.return_velocity(cls) yaw = [] i = 0 time = cls.time for x in cls.time: #print("state",cls.state()) r_def = cls.rudder_deg[i]*(np.pi/180) 
aero = Calc.aeroC(vehicle,cls,r_def,v) forces = Calc.forceCalc(vehicle,cls.qbar,aero) rates = Calc.EOM(forces,vehicle,v,cls) new_state = Calc.RK4(cls.state(),rates) cls.update_state(new_state) yaw.append(new_state[2]*(180/np.pi)) i += 1 fig = plt.subplot() fig.plot(time,rudder_deg,label='rudder in') fig.plot(time,yaw,label= '{}'.format(Vehicle.get_name(vehicle)) ) #fig.plot(time,yaw2,label='bond') plt.ylabel('yaw rate [deg/s]') plt.title('{} Simulation'.format(Vehicle.get_name(vehicle))) # Add simulation name plt.xlabel('time [s]') fig.legend() plt.show() return yaw class Calc(): # 4th Order Runge-Kutta # x: value, rates: rate-of-change @staticmethod def RK4(x, rates): rn = [] dt = 0.05 # need user to set interval for i in range(len(x)): rn.append(x[i]) r1 = rates for i in range(len(x)): rn[i] = x[i]+0.5*dt*r1[i] r2 = rates for i in range(len(x)): rn[i] = x[i]+0.5*dt*r2[i] r3 = rates for i in range(len(x)): rn[i] = x[i]+dt*r3[i] r4 = rates for i in range(len(x)): rn[i] = x[i] + (dt/6.0)*(r1[i]+2.0*r2[i]+2.0*r3[i]+r4[i]) return rn # Cy,Cl,Cn calculator @staticmethod def aeroC(vehicle,mission,r_def,v): area,b = vehicle.vehicle_geometry() # area, and span c = vehicle.aero_coefficients() state = mission.state() beta = state[0] p = state[1] r = state[2] Cy = c[0]*beta + c[1]*r_def Cl = c[2]*beta + c[3]*(p*b/(2*v)) + c[4]*(r*b/(2*v)) + c[5]*r_def Cn = c[6]*beta + c[7]*(p*b/(2*v)) + c[8]*(r*b/(2*v)) + c[9]*r_def aero = [Cy,Cl,Cn] return aero # calculates forces and moments @staticmethod def forceCalc(vehicle,qbar,aero): S,b = vehicle.vehicle_geometry() # Forces fy = qbar*S*aero[0] # Moments L = qbar*S*b*aero[1] N = qbar*S*b*aero[2] forces = [fy,L,N] return forces # Equations of motion calculation @staticmethod def EOM(forces,vehicle,v,mission): f = forces g = 9.81 x = mission.state() c = vehicle.momentCoeff() m = vehicle.vehicle_mass() # x = [beta,p,r,phi,psi] # c = [c0,c3,c4,c10] betaDot = (1/v)*(f[0]/m + g*np.sin(x[3]))-x[2] pAcc = c[1]*f[1] + c[2]*f[2] rAcc = c[2]*f[1] 
+ c[3]*f[2] phiDot = x[1] psiDot = x[2]*np.cos(x[3]) rates = [betaDot,pAcc,rAcc,phiDot,psiDot] return rates # math class that takes care of runge kutta? hawk1 = pd.ExcelFile("Data/Skyhawk_Attributes.xlsx") hawk2 = pd.ExcelFile("Data/Skyhawk_Attributes2.xlsx") rudder_in = pd.ExcelFile("Data/rudder_input.xlsx") yuh = rudder_in.parse("Sheet1") rudder_deg = yuh.rudder_deg.tolist() time = yuh.time_s.tolist() yaw_real = yuh.yaw_rate_deg_s.tolist() #qbar = 13898.5 # plotting class? james = Vehicle('James',hawk1) bond = Vehicle('bond',hawk2) test = Mission("One") test.apply_sensor_data(rudder_in) test.set_init_cond(190,4500,0.77) yaw = test.simulate(james,rudder_in) yaw2 = test.simulate(bond,rudder_in) <file_sep>from Classes.Vehicle import Vehicle from Classes.Mission import Mission import pandas as pd from pandas import ExcelWriter from pandas import ExcelFile # Import vehicle Data hawk1 = pd.ExcelFile("Data/Skyhawk_Attributes.xlsx") hawk2 = pd.ExcelFile("Data/Skyhawk_Attributes2.xlsx") # Enter Rudder Data rudder_in = pd.ExcelFile("Data/rudder_test.xlsx") # Create Vehicles james = Vehicle('James',hawk1) # Create Mission test = Mission("One") test.apply_sensor_data(rudder_in) test.set_init_cond(190,4500,0.77) # Run Simulations # No control System test.control_system(False) yaw = test.simulate(james,rudder_in) # Write Data to excel pd.DataFrame(yaw).to_excel('Data/output.xlsx',header=False,index=False) # Control System active test.control_system(True) yaw = test.simulate(james,rudder_in) <file_sep>import numpy as np class Calc(): @staticmethod def RK4(x, rates,dt): rn = [] #dt = dt # need user to set interval for i in range(len(x)): rn.append(x[i]) r1 = rates for i in range(len(x)): rn[i] = x[i]+0.5*dt*r1[i] r2 = rates for i in range(len(x)): rn[i] = x[i]+0.5*dt*r2[i] r3 = rates for i in range(len(x)): rn[i] = x[i]+dt*r3[i] r4 = rates for i in range(len(x)): rn[i] = x[i] + (dt/6.0)*(r1[i]+2.0*r2[i]+2.0*r3[i]+r4[i]) return rn # Cy,Cl,Cn calculator @staticmethod def 
aeroC(vehicle,mission,r_def,v): area,b = vehicle.vehicle_geometry() # area, and span c = vehicle.aero_coefficients() state = mission.state() beta = state[0] p = state[1] r = state[2] Cy = c[0]*beta + c[1]*r_def Cl = c[2]*beta + c[3]*(p*b/(2*v)) + c[4]*(r*b/(2*v)) + c[5]*r_def Cn = c[6]*beta + c[7]*(p*b/(2*v)) + c[8]*(r*b/(2*v)) + c[9]*r_def aero = [Cy,Cl,Cn] return aero # calculates forces and moments @staticmethod def forceCalc(vehicle,qbar,aero): S,b = vehicle.vehicle_geometry() # Forces fy = qbar*S*aero[0] # Moments L = qbar*S*b*aero[1] N = qbar*S*b*aero[2] forces = [fy,L,N] return forces # Equations of motion calculation @staticmethod def EOM(forces,vehicle,v,mission): f = forces g = 9.81 x = mission.state() c = vehicle.momentCoeff() m = vehicle.vehicle_mass() # x = [beta,p,r,phi,psi] # c = [c0,c3,c4,c10] betaDot = (1/v)*(f[0]/m + g*np.sin(x[3]))-x[2] pAcc = c[1]*f[1] + c[2]*f[2] rAcc = c[2]*f[1] + c[3]*f[2] phiDot = x[1] psiDot = x[2]*np.cos(x[3]) rDefDot = 0 rates = [betaDot,pAcc,rAcc,phiDot,psiDot,rDefDot] return rates <file_sep>from Classes.Calc import Calc from Classes.Vehicle import Vehicle import numpy as np import matplotlib.pyplot as plt import pandas as pd class Mission(): # default attributes beta = 0 p = 0 r = 0 phi = 0 psi = 0 #r_def = 0 rudder_in = 0 velocity = 0 altitude = 0 density = 0 qbar = 0 name = ' ' control = False # Better way? 
# note: need to calc density based on alt def __init__(self,name): self.name = name def get_name(self): return self.name @classmethod def control_system(cls,value): cls.control = value # Sets custom initial conditions @classmethod def set_init_cond(cls,velocity,altitude,density): cls.velocity = velocity cls.altitude = altitude cls.density = density cls.qbar = 0.5*density*velocity**2 def return_velocity(self): return self.velocity @classmethod def apply_sensor_data(cls,rudder_data): Data = rudder_data.parse("Sheet1") time = Data.time_s.tolist() rudder_deg = Data.rudder_deg.tolist() yaw_rate = Data.yaw_rate_deg_s.tolist() cls.time = time cls.rudder_deg = rudder_deg cls.yaw_rate = yaw_rate @classmethod def return_sensor_data(cls): return cls.time,cls.rudder_deg,cls.yaw_rate # updates vehicle position, to use during calculation @classmethod def update_state(cls,state): cls.beta = state[0] cls.p = state[1] cls.r = state[2] cls.phi = state[3] cls.psi = state[4] # Provides orientation data @classmethod def state(cls): state = [cls.beta,cls.p,cls.r,cls.phi,cls.psi] return state # Run Simulation # can run different planes @classmethod def simulate(cls,vehicle,rudder_in): cls.apply_sensor_data(rudder_in) #state = cls.state() v = cls.return_velocity(cls) yaw = [] #testing #aero = [] #forces = [] #rates = [] #testing i = 0 time = cls.time kr = -0.4 dt = cls.time[1]-cls.time[0] r_def = 0 for x in cls.time: # Control system check if (cls.control == True): state = cls.state() z = np.exp(-1)*state[2] #zdot = cls.r - z # change in time constant?? 
r_def = cls.rudder_deg[i]*(np.pi/180) r_def = r_def + kr*(z-state[2]) else: r_def = cls.rudder_deg[i]*(np.pi/180) #r_return.append(r_def) aero = Calc.aeroC(vehicle,cls,r_def,v) forces = Calc.forceCalc(vehicle,cls.qbar,aero) rates = Calc.EOM(forces,vehicle,v,cls) new_state = Calc.RK4(cls.state(),rates,dt) cls.update_state(new_state) yaw.append(new_state[2]*(180/np.pi)) i += 1 fig = plt.subplot() fig.plot(time,cls.rudder_deg,label='rudder in') fig.plot(time,yaw,label= '{}'.format(Vehicle.get_name(vehicle)) ) fig.plot(time,cls.yaw_rate,label='Flight Data') plt.ylabel('yaw rate [deg/s]') plt.title('{},Control:{}'.format(Vehicle.get_name(vehicle),cls.control)) # Add simulation name plt.xlabel('time [s]') plt.grid(True) fig.legend() plt.show() return yaw
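One thing worth flagging in `Calc.RK4` (both copies): `rates` is computed once by the caller and reused for r1 through r4, so the scheme appears to collapse to a single Euler-style update rather than a true 4th-order step. A classical RK4 re-evaluates the derivative at each intermediate stage; a minimal standalone sketch (hypothetical helper, not the repo's API, which would need the derivative function itself rather than a precomputed rate list):

```python
import math

def rk4_step(f, t, y, dt):
    # Classical 4th-order Runge-Kutta: f is re-evaluated at each stage.
    k1 = f(t, y)
    k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, y + 0.5 * dt * k2)
    k4 = f(t + dt, y + dt * k3)
    return y + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Sanity check: integrate dy/dt = y from t=0 to t=1 (exact answer: e).
y, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    y = rk4_step(lambda t, y: y, t, y, dt)
    t += dt
assert abs(y - math.e) < 1e-4  # 4th-order accuracy; actual error is far smaller
```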
8b1e08906464d421767d3449f2abe1810a1dfe16
[ "Markdown", "Python" ]
6
Markdown
FernCarrera/SkyHawkModel
a0b4718774e5b82866a5c051d2ce2a99a2984f2a
97e5e1c878bb6225368dbd090f22c51a9e27c0ae
refs/heads/master
<file_sep>public class hello {
    public static void main(String[] args) {
        System.out.println("Hello friends");
        System.out.println("Aytekin abi ");
        System.out.println("Efendi abi ");
        System.out.println("Ferhat abi ");
        System.out.println("Hasan abi ");
        System.out.println("Mehmet abi ");
        System.out.println("Murat abi");
        System.out.println("Naci abi ");
        System.out.println("Nuri abi ");
        System.out.println("Zafer abi ");
    }
}
e6bcf1b0cda4bdaa17df240b1760dcd5d890bad0
[ "Java" ]
1
Java
zfrkcdg/friends
d4005d6dfbd5211e66a59156e35bd73b347f4c6b
6f5439820f06d0afe0d4bd9bedf2ab0934e52228
refs/heads/master
<repo_name>1tiktak/MatchFireStore<file_sep>/MatchFireStore/Controller/HomeController.swift
//
//  ViewController.swift
//  MatchFireStore
//
//  Created by <NAME> on 5/11/19.
//  Copyright © 2019 <NAME>. All rights reserved.
//

import UIKit
import Firebase
import JGProgressHUD

class HomeController: UIViewController, SettingsControllerDelegate {

    let topStackView = TopNavagationStackView()
    let cardsDeckView = UIView()
    let bottomControls = HomeBottomControlsStackView()

    var cardViewModels = [CardViewModel]()

    override func viewDidLoad() {
        super.viewDidLoad()
        topStackView.settingsBUtton.addTarget(self, action: #selector(handleSettings), for: .touchUpInside)
        bottomControls.refreshButton.addTarget(self, action: #selector(handleRefresh), for: .touchUpInside)
        setUpLayout()
        fetchCurrentUser()
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        print("HomeController Did Appear 🙋‍♂️")
        // If user is not logged in Kick the user out to the sign/login VC
        if Auth.auth().currentUser == nil {
            let registrationController = RegistrationController()
            let navController = UINavigationController(rootViewController: registrationController)
            present(navController, animated: true)
        }
    }

    fileprivate var user: User?

    fileprivate func fetchCurrentUser() {
        guard let uid = Auth.auth().currentUser?.uid else { return }
        Firestore.firestore().collection("Users").document(uid).getDocument { (snapshot, err) in
            if let err = err {
                print(err)
                return
            }
            // we have fetched our user here
            guard let dictionary = snapshot?.data() else { return }
            self.user = User(dictionary: dictionary)
            self.fetchUsersFromFirestore()
        }
    }

    @objc fileprivate func handleRefresh() {
        fetchUsersFromFirestore()
    }

    var lastFetchedUser: User?

    fileprivate func fetchUsersFromFirestore() {
        guard let minAge = user?.minSeekingAge, let maxAge = user?.maxSeekingAge else { return }
        let hud = JGProgressHUD(style: .dark)
        hud.textLabel.text = "Reeling in Couples"
        hud.show(in: view)
        let query = Firestore.firestore().collection("Users").whereField("age", isGreaterThanOrEqualTo: minAge).whereField("age", isLessThanOrEqualTo: maxAge)
        query.getDocuments { (snapshot, err) in
            hud.dismiss(animated: true)
            if let err = err {
                print("Failed to fetch users 😭", err)
                return
            }
            snapshot?.documents.forEach({ (documentSnapshot) in
                let userDictionary = documentSnapshot.data()
                let user = User(dictionary: userDictionary)
                self.cardViewModels.append(user.toCardViewModel())
                self.lastFetchedUser = user
                self.setupCardFromUser(user: user)
            })
        }
    }

    fileprivate func setupCardFromUser(user: User) {
        let cardView = CardView(frame: .zero)
        cardView.cardViewModel = user.toCardViewModel()
        cardsDeckView.addSubview(cardView)
        cardsDeckView.sendSubviewToBack(cardView)
        cardView.fillSuperview()
    }

    @objc func handleSettings() {
        let settingsController = SettingsController()
        settingsController.delegate = self
        let navController = UINavigationController(rootViewController: settingsController)
        present(navController, animated: true)
    }

    func didSaveSettings() {
        print("Notified of dismissal")
        fetchCurrentUser()
    }

    fileprivate func setupFireStoreUserCards() {
        cardViewModels.forEach { (cardVM) in
            let cardView = CardView(frame: .zero)
            cardView.cardViewModel = cardVM
            cardsDeckView.addSubview(cardView)
            cardView.fillSuperview()
        }
    }

    // MARK: - FilePrivate

    fileprivate func setUpLayout() {
        view.backgroundColor = .white
        let overallStackView = UIStackView(arrangedSubviews: [topStackView, cardsDeckView, bottomControls])
        overallStackView.axis = .vertical
        view.addSubview(overallStackView)
        overallStackView.anchor(top: view.safeAreaLayoutGuide.topAnchor, leading: view.leadingAnchor, bottom: view.bottomAnchor, trailing: view.trailingAnchor)
        overallStackView.isLayoutMarginsRelativeArrangement = true
        overallStackView.layoutMargins = .init(top: 0, left: 8, bottom: 0, right: 12)
        overallStackView.bringSubviewToFront(cardsDeckView)
    }
}
<file_sep>/Podfile
platform :ios, '9.0'

target 'MatchFireStore' do
  # Comment the next line if you're not using Swift and don't want to use dynamic frameworks
  use_frameworks!

  # Pods for MatchFireStore
  pod 'Firebase/Firestore', '~>5.10.0'
  pod 'Firebase/Auth'
  pod 'Firebase/Storage'
  pod 'Firebase/Core'
  pod 'SDWebImage', '~>4.4.2'
  pod 'JGProgressHUD', '~>2.0.3'
  pod 'GoogleAppMeasurement', '~> 5.2.0'
end
3322e767432a40036ae191a2e10306d6bd829931
[ "Swift", "Ruby" ]
2
Swift
1tiktak/MatchFireStore
7bc889a667adf071e3a7cadac5649d0c6ddee451
2d61f06258efeb615af8cc74d7363b2deabdd27b
refs/heads/master
<repo_name>somenasty/Echo<file_sep>/FuncInsertInfo.py
import os
from InsertInfo import InsertInfo
from StcokInfo import StockInfo


class FuncInsertInfo:
    @staticmethod
    def main_function(code_name, price, amount, flag, csv_file_name):
        # try:
        single_stock_info = StockInfo(code_name, price, amount, flag)
        single_stock_arr = single_stock_info.transfer_to_list()
        insert_stock_info = InsertInfo()
        insert_stock_info.insert_to_csv(single_stock_arr, csv_file_name)
        csv_file_path = os.getcwd() + "/" + csv_file_name
        print(csv_file_path + " has been updated!")
        # except Exception as e:
        #     print("Error Message:", e)
<file_sep>/featuretest.py
import tkinter as tk
import csv
from FuncInsertInfo import FuncInsertInfo
from InsertInfo import InsertInfo


class featuretest(tk.Frame):
    flag = 0
    csv_file_name = "test.csv"
    headers = ["日期", "股票名称", "股票价格", "股票数量", "股票总价"]
    headers_flag = True
    while flag != 1:
        with open(csv_file_name, "r", newline="", encoding='utf-8-sig') as f1:
            read_csv = csv.reader(f1)
            headers_flag = [x for x in read_csv]
            if len(headers_flag) == 0:
                headers_flag = False
                insert_headers = InsertInfo()
                insert_headers.add_headers(headers, csv_file_name)

    def __init__(self, master=None):
        super().__init__(master)
        self.master = master
        self.pack()
        self.create_widgets(csv_file_name='test.csv')

    def create_widgets(self, csv_file_name):
        self.menu = tk.Button(self)
        self.menu["text"] = "录入"
        self.menu["command"] = FuncInsertInfo.main_function()
        self.menu.pack(side="top")
        self.quit = tk.Button(self, text="exit", fg="red", command=self.master.destroy)
        self.quit.pack(side="bottom")


root = tk.Tk()
app = featuretest(master=root)
app.mainloop()
<file_sep>/DeleteInfo.py
class DeleteInfo:
    def __init__(self):
        print("to do")
<file_sep>/README.md
# Echo
Insert/query/delete/sum the info of stock

__main__.py is a GUI. To start the project, just cd to the file path, then execute the command as follows:
> python __main__.py

test.csv is the file which stores the users' input. This file is encoded in GBK.

test.py interacts from the command line.
<file_sep>/test.py
import csv
import os
from DeleteInfo import *
from GetProfit import GetProfit
from InsertInfo import *
from QueryInfo import *
from StcokInfo import *

flag = 0
csv_file_name = "test.csv"
headers = ["日期", "股票名称", "股票价格", "股票数量", "股票总价"]
headers_flag = True
while flag != 1:
    with open(csv_file_name, "r", newline="") as f1:
        read_csv = csv.reader(f1)
        headers_flag = [x for x in read_csv]
        if len(headers_flag) == 0:
            headers_flag = False
            insert_headers = InsertInfo()
            insert_headers.add_headers(headers, csv_file_name)
    get_choice = input("1.录入 2.查询 3.结算 4.退出\n")
    if get_choice == '1':
        get_input = input("股票名称 股票价格 股票数量 (买入0/卖出1)\n")
        try:
            get_list = get_input.split(" ")
            single_stock_info = StockInfo(get_list[0], get_list[1], get_list[2], get_list[3])
            single_stock_arr = single_stock_info.transfer_to_list()
            insert_stock_info = InsertInfo()
            insert_stock_info.insert_to_csv(single_stock_arr, csv_file_name)
            csv_file_path = os.getcwd() + "/" + csv_file_name
            print(csv_file_path + " has been updated!")
        except Exception as e:
            print("Error Message:", e)
    elif get_choice == '2':
        cost_list = []
        cost_amount_list = []
        cost_sum = 0
        cost_amount_sum = 0
        get_second_choice = input("1.成本 2.明细\n")
        get_code_name = input("股票名称\n")
        result = QueryInfo.get_info_from_csv(get_code_name, csv_file_name)
        if get_second_choice == '1':
            for item in result:
                if float(item[4]) < 0:
                    cost_list.append(float(item[4]))
                    cost_amount_list.append(float(item[3]))
            for cost in cost_list:
                cost_sum = cost_sum + cost
            for amount in cost_amount_list:
                cost_amount_sum = cost_amount_sum + amount
            print("成本价格:")
            print(round((cost_sum/cost_amount_sum), 2))
        elif get_second_choice == '2':
            for row in result:
                print(row)
        else:
            print("error")
    elif get_choice == '3':
        get_code_name = input("股票名称\n")
        result = QueryInfo.get_info_from_csv(get_code_name, csv_file_name)
        # print(result)
        get_profit_info = GetProfit.count_profit(csv_file_name, result)
        print("获得利润:")
        print(get_profit_info)
    elif get_choice == '4':
        flag = 1
    else:
        print("error")
<file_sep>/QueryInfo.py
import csv


class QueryInfo:
    @staticmethod
    def get_info_from_csv(code_name, csv_file_name):
        result = []
        with open(csv_file_name, "r", newline="", encoding='utf-8-sig') as f1:
            f_csv = csv.reader(f1)
            for row in f_csv:
                if row[1] == code_name:
                    result.append(row)
        return result
<file_sep>/StcokInfo.py
import time


class StockInfo:
    def __init__(self, code_name, price, amount, flag):
        self.datetime = time.strftime("%Y%m%d %H:%M", time.localtime())
        self.code_name = code_name
        self.amount = amount
        self.price = price
        self.total = float(self.price) * int(self.amount)
        if flag == '0':
            self.total = -(self.total + self.total*0.0008 + self.total * 0.00002)
        elif flag == '1':
            self.total = self.total - self.total*0.0008 - self.total * 0.00002 - self.total * 0.001
        else:
            print("error")
        self.total = round(self.total, 2)

    def debug_print(self):
        print("code name=>", self.code_name, "amount=>", self.amount, "price", self.price, "total", self.total)

    def transfer_to_list(self):
        single_stock = [self.datetime, self.code_name, self.price, self.amount, self.total]
        return single_stock
<file_sep>/InsertInfo.py
import csv


class InsertInfo:
    @staticmethod
    def add_headers(headers, csv_file_name):
        with open(csv_file_name, "a", newline="", encoding='utf-8-sig') as f:
            f_csv = csv.writer(f)
            f_csv.writerow(headers)
            f.close()

    @staticmethod
    def insert_to_csv(single_stock, csv_file_name):
        headers = ["日期", "股票名称", "股票价格", "股票数量", "股票总价"]
        temp_list = []
        with open(csv_file_name, "r", newline="", encoding='utf-8-sig') as f1:
            f1_csv = csv.reader(f1)
            for row in f1_csv:
                temp_list.append(row)
            if len(temp_list) == 0:
                InsertInfo.add_headers(headers, csv_file_name)
        with open(csv_file_name, "a", newline="", encoding='utf-8-sig') as f:
            f_csv = csv.writer(f)
            f_csv.writerow(single_stock)
            f.close()
<file_sep>/__main__.py
from tkinter import *
from tkinter import messagebox
import os
from FuncInsertInfo import FuncInsertInfo
from GetProfit import GetProfit
from QueryInfo import QueryInfo

csv_file_name = "test.csv"
file_flag = 0
headers = ["日期", "股票名称", "股票价格", "股票数量", "股票总价"]
get_path = os.getcwd()
get_dir_list = os.listdir(get_path)
file_flag = get_dir_list.count(csv_file_name)
if file_flag == 0:
    file = open(get_path+r'\\'+csv_file_name, "w", newline="")


def submitted(e1, e2, e3, var, txt):
    code_name = e1.get()
    price = e2.get()
    amount = e3.get()
    flag = var.get()
    txt.insert(END, code_name)
    txt.insert(END, " ")
    txt.insert(END, price)
    txt.insert(END, " ")
    txt.insert(END, amount)
    txt.insert(END, " ")
    txt.insert(END, flag)
    txt.insert(END, "\n")
    FuncInsertInfo.main_function(code_name, price, amount, flag, csv_file_name)


def create():
    top = Toplevel()
    top.title('InsertInfo')
    top.geometry('600x600')
    var = StringVar()
    var1 = StringVar()
    show_panel = Text(top)
    lab_e1 = Label(top, text="股票名称")
    lab_e2 = Label(top, text="股票价格")
    lab_e3 = Label(top, text="股票数量")
    e1 = Entry(top)
    e2 = Entry(top)
    e3 = Entry(top)
    # b2 = Button(top, text='返回', command=back_to_menu)
    rd1 = Radiobutton(top, text="买入", variable=var, value=0)
    rd2 = Radiobutton(top, text="卖出", variable=var, value=1)
    lab_e1.grid(row=1, column=1)
    lab_e2.grid(row=2, column=1)
    lab_e3.grid(row=3, column=1)
    e1.grid(row=1, column=2)
    e2.grid(row=2, column=2)
    e3.grid(row=3, column=2)
    rd1.grid(row=4, column=2)
    rd2.grid(row=5, column=2)
    b1 = Button(top, text='提交', command=lambda: submitted(e1, e2, e3, var, show_panel))
    b1.grid(row=8, column=1)
    show_panel.grid(row=15, column=2, ipadx=1, ipady=1)


def query_cost(e4, text):
    cost_list = []
    cost_amount_list = []
    cost_sum = 0
    cost_amount_sum = 0
    result = QueryInfo.get_info_from_csv(e4.get(), csv_file_name)
    for item in result:
        if float(item[4]) < 0:
            cost_list.append(float(item[4]))
            cost_amount_list.append(float(item[3]))
    for cost in cost_list:
        cost_sum = cost_sum + cost
    for amount in cost_amount_list:
        cost_amount_sum = cost_amount_sum + amount
    # print("成本价格:")
    # print(round((cost_sum / cost_amount_sum), 2))
    cost_info = round((cost_sum / cost_amount_sum), 2)
    text.insert(END, chars=e4.get())
    text.insert(END, chars=' 成本价格: ')
    text.insert(END, chars=cost_info)
    text.insert(END, chars="\n")


def query_detail(e4, text):
    result = QueryInfo.get_info_from_csv(e4.get(), csv_file_name)
    for row in result:
        text.insert(END, chars=row)
        text.insert(END, '\n')


def clear_text_info(txt):
    txt.delete('1.0', END)


def query_menu():
    query_menu_window = Toplevel()
    query_menu_window.geometry('700x400')
    query_menu_window.title('Query')
    lab_e4 = Label(query_menu_window, text='股票名称')
    e4 = Entry(query_menu_window)
    show_cost_info = Text(query_menu_window)
    b2 = Button(query_menu_window, text='成本', command=lambda: query_cost(e4, show_cost_info))
    b3 = Button(query_menu_window, text='明细', command=lambda: query_detail(e4, show_cost_info))
    b4 = Button(query_menu_window, text='清空', command=lambda: clear_text_info(show_cost_info))
    lab_e4.grid(row=0, column=0)
    e4.grid(row=0, column=1)
    b2.grid(row=1, column=0)
    b3.grid(row=1, column=1)
    b4.grid(row=1, column=2)
    show_cost_info.grid(row=4, column=1)
    show_cost_info.insert(END, chars=' '.join(headers))
    show_cost_info.insert(END, chars='\n')


def get_balance(e5, txt):
    get_code_name = e5.get()
    result = QueryInfo.get_info_from_csv(get_code_name, csv_file_name)
    get_profit_info = GetProfit.count_profit(csv_file_name, result)
    if get_profit_info == 0:
        messagebox.askquestion(title='错误信息', message='没有找到购买记录')
    else:
        txt.insert(END, chars=get_code_name)
        txt.insert(END, chars=" 获得利润: ")
        txt.insert(END, chars=get_profit_info)
        txt.insert(END, chars='\n')


def balance_menu():
    balance_menu_panel = Toplevel()
    balance_menu_panel.geometry('600x600')
    balance_menu_panel.title('Balance')
    lab_e5 = Label(balance_menu_panel, text='股票名称')
    e5 = Entry(balance_menu_panel)
    show_balance_info = Text(balance_menu_panel)
    b4 = Button(balance_menu_panel, text='结算', command=lambda: get_balance(e5, show_balance_info))
    lab_e5.grid(row=0, column=3)
    e5.grid(row=0, column=4)
    b4.grid(row=1, column=3)
    show_balance_info.grid(row=2, column=4, ipadx=1, ipady=1)


root = Tk()
root.title('CountingSystem')
root.geometry('600x600')
Button(root, text='录入', command=create).grid(row=1, column=1, padx=1, pady=1)
Button(root, text='查询', command=query_menu).grid(row=1, column=2, padx=1, pady=1)
Button(root, text='结算', command=balance_menu).grid(row=1, column=3, padx=1, pady=1)
Button(root, text='退出', command=lambda: root.quit()).grid(row=1, column=4, padx=1, pady=1)
mainloop()
<file_sep>/GetProfit.py
class GetProfit:
    @staticmethod
    def count_profit(csv_file_name, result_list):
        profit_sum = 0
        for row in result_list:
            profit_sum = round((float(row[4]) + profit_sum), 2)
        return profit_sum
489ebb0a91bb7a27915953b290f5c090986a71d4
[ "Markdown", "Python" ]
10
Python
somenasty/Echo
e59c9746f198396b0fc9950d525d032b21663ac4
cf4966856deea311bb1f828faa74e3f75d6682cf
refs/heads/master
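The `StcokInfo.py` file in the record above folds a commission (0.0008), a transfer fee (0.00002), and, on sells only, stamp duty (0.001) into one signed total: negative cash flow for buys, positive for sells. A minimal standalone sketch of that fee model (the rates are copied from the repo's code; real brokerage fee schedules vary):

```python
# Standalone sketch of the signed-total fee model used in StcokInfo.py.
# Rates are taken from that file; actual brokerage fees differ by broker.
COMMISSION = 0.0008   # broker commission rate
TRANSFER = 0.00002    # transfer fee rate
STAMP_DUTY = 0.001    # stamp duty rate, charged on sells only

def signed_total(price, amount, is_sell):
    """Return the cash effect of a trade: negative for buys, positive for sells."""
    gross = float(price) * int(amount)
    if is_sell:
        # Sell proceeds net of commission, transfer fee, and stamp duty
        net = gross - gross * COMMISSION - gross * TRANSFER - gross * STAMP_DUTY
    else:
        # Buy cost plus commission and transfer fee, stored as a negative number
        net = -(gross + gross * COMMISSION + gross * TRANSFER)
    return round(net, 2)
```

For a 100-share buy at 10, the gross is 1000 and the signed total is -1000.82; the matching sell at the same price nets 998.18, so the repo's `count_profit` (which just sums column 4) would report the round trip as a small loss from fees.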
<repo_name>sushantmzjn/ESoftwarica<file_sep>/app/src/main/java/com/sushant/esoftwarica/adapter/UserAdapter.java
package com.sushant.esoftwarica.adapter;

import android.content.Context;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Button;
import android.widget.TextView;

import androidx.annotation.NonNull;
import androidx.recyclerview.widget.RecyclerView;

import com.sushant.esoftwarica.R;
import com.sushant.esoftwarica.model.User;

import java.util.List;

import de.hdodenhof.circleimageview.CircleImageView;

public class UserAdapter extends RecyclerView.Adapter<UserAdapter.UserViewHolder> {

    Context context;
    List<User> userList;
    int imgId;

    public UserAdapter(Context context, List<User> userList) {
        this.context = context;
        this.userList = userList;
    }

    @NonNull
    @Override
    public UserViewHolder onCreateViewHolder(@NonNull ViewGroup parent, int viewType) {
        View view = LayoutInflater.from(parent.getContext())
                .inflate(R.layout.list_view, parent, false);
        return new UserViewHolder(view);
    }

    @Override
    public void onBindViewHolder(@NonNull UserViewHolder holder, final int position) {
        final User user = userList.get(position);
        String gender = user.getGender();
        if (gender.equals("male")) {
            imgId = R.drawable.male;
        } else {
            imgId = R.drawable.female;
        }
        holder.imgview.setImageResource(imgId);
        holder.Uname.setText(user.getName());
        holder.add.setText(user.getAddress());
        holder.Age.setText(user.getAge());
        holder.gen.setText(user.getGender());
        holder.btndelete.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                userList.remove(position);
                notifyDataSetChanged();
            }
        });
    }

    @Override
    public int getItemCount() {
        return userList.size();
    }

    public class UserViewHolder extends RecyclerView.ViewHolder {
        private CircleImageView imgview;
        private TextView Uname, add, Age, gen;
        private Button btndelete;

        public UserViewHolder(@NonNull View itemView) {
            super(itemView);
            imgview = itemView.findViewById(R.id.imageView);
            Uname = itemView.findViewById(R.id.Title);
            add = itemView.findViewById(R.id.address);
            Age = itemView.findViewById(R.id.age);
            gen = itemView.findViewById(R.id.gender);
            btndelete = itemView.findViewById(R.id.delete);
        }
    }
}
<file_sep>/app/src/main/java/com/sushant/esoftwarica/model/User.java
package com.sushant.esoftwarica.model;

import java.util.ArrayList;
import java.util.List;

public class User {

    private int imageid;
    private String name, address, age, gender;
    static List<User> userList = new ArrayList<>();

    public User(String name, String address, String age, String gender) {
        this.imageid = imageid;
        this.name = name;
        this.address = address;
        this.age = age;
        this.gender = gender;
    }

    public static List<User> getUserList() {
        return userList;
    }

    public static void setUserList(List<User> userList) {
        User.userList = userList;
    }

    public String getName() {
        return name;
    }

    public void setName(String name) {
        this.name = name;
    }

    public String getAddress() {
        return address;
    }

    public void setAddress(String address) {
        this.address = address;
    }

    public String getAge() {
        return age;
    }

    public void setAge(String age) {
        this.age = age;
    }

    public String getGender() {
        return gender;
    }

    public void setGender(String gender) {
        this.gender = gender;
    }
}
<file_sep>/app/src/main/java/com/sushant/esoftwarica/MainActivity.java
package com.sushant.esoftwarica;

import android.content.Intent;
import android.os.Bundle;
import android.text.TextUtils;
import android.view.Gravity;
import android.view.View;
import android.widget.Button;
import android.widget.EditText;
import android.widget.Toast;

import androidx.appcompat.app.AppCompatActivity;

public class MainActivity extends AppCompatActivity {

    private EditText etU, etP;
    private Button btnL;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.activity_main);

        etU = findViewById(R.id.etusername);
        etP = findViewById(R.id.etpassword);
        btnL = findViewById(R.id.btnLogin);

        btnL.setOnClickListener(new View.OnClickListener() {
            @Override
            public void onClick(View v) {
                String user = etU.getText().toString().trim();
                String pass = etP.getText().toString().trim();
                if (TextUtils.isEmpty(etU.getText().toString())) {
                    etU.setError("Input Username");
                    return;
                }
                if (TextUtils.isEmpty(etP.getText().toString())) {
                    etP.setError("Input Password");
                    return;
                }
                if (user.equals("sushant") && pass.equals("<PASSWORD>")) {
                    Toast toast = Toast.makeText(getApplicationContext(), "Login Successful", Toast.LENGTH_SHORT);
                    toast.setGravity(Gravity.TOP | Gravity.CENTER_HORIZONTAL, 0, 0);
                    toast.show();
                    Intent intent = new Intent(MainActivity.this, Dashboard.class);
                    startActivity(intent);
                } else {
                    etP.setError("Incorrect Password");
                    return;
                }
                etP.setText(null);
                etU.setText(null);
            }
        });
    }
}
<file_sep>/app/src/main/java/com/sushant/esoftwarica/ui/add/AddFragment.java
package com.sushant.esoftwarica.ui.add;

import android.os.Bundle;
import android.text.TextUtils;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;
import android.widget.Button;
import android.widget.EditText;
import android.widget.RadioButton;
import android.widget.RadioGroup;
import android.widget.Toast;

import androidx.annotation.NonNull;
import androidx.fragment.app.Fragment;

import com.sushant.esoftwarica.R;
import com.sushant.esoftwarica.model.User;

import java.util.ArrayList;
import java.util.List;

public class AddFragment extends Fragment implements View.OnClickListener {

    private EditText sname, saddress, sage;
    private Button Abtn;
    private RadioGroup radioGroup;
    private RadioButton rmale, rfemale;
    String gender, name, address, Age;
    static List<User> userList = new ArrayList<>();

    public View onCreateView(@NonNull LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        final View root = inflater.inflate(R.layout.fragment_add, container, false);
        sname = root.findViewById(R.id.txtname);
        saddress = root.findViewById(R.id.txtaddress);
        sage = root.findViewById(R.id.txtage);
        Abtn = root.findViewById(R.id.btnadd);
        rmale = root.findViewById(R.id.rbm);
        rfemale = root.findViewById(R.id.rbf);
        radioGroup = root.findViewById(R.id.genderG);
        Abtn.setOnClickListener(this);
        return root;
    }

    @Override
    public void onClick(View v) {
        switch (v.getId()) {
            case R.id.btnadd:
                name = sname.getText().toString().trim();
                address = saddress.getText().toString().trim();
                Age = sage.getText().toString().trim();
                if (TextUtils.isEmpty(name)) {
                    sname.setError("Enter Name");
                    return;
                }
                if (TextUtils.isEmpty(address)) {
                    saddress.setError("Enter Address");
                    return;
                }
                if (TextUtils.isEmpty(Age)) {
                    sage.setError("Enter Age");
                    return;
                }
                if (rmale.isChecked()) {
                    gender = "male";
                }
                if (rfemale.isChecked()) {
                    gender = "female";
                }
                User user = new User(name, address, Age, gender);
                userList = user.getUserList();
                userList.add(user);
                user.setUserList(userList);
                Toast.makeText(getActivity(), "User added Successfully", Toast.LENGTH_SHORT).show();
                sname.setText("");
                saddress.setText("");
                sage.setText("");
                rfemale.setChecked(false);
                rmale.setChecked(false);
                break;
        }
    }
}
<file_sep>/app/src/main/java/com/sushant/esoftwarica/ui/home/HomeFragment.java
package com.sushant.esoftwarica.ui.home;

import android.os.Bundle;
import android.view.LayoutInflater;
import android.view.View;
import android.view.ViewGroup;

import androidx.annotation.NonNull;
import androidx.fragment.app.Fragment;
import androidx.recyclerview.widget.LinearLayoutManager;
import androidx.recyclerview.widget.RecyclerView;

import com.sushant.esoftwarica.R;
import com.sushant.esoftwarica.adapter.UserAdapter;
import com.sushant.esoftwarica.model.User;

import java.util.ArrayList;
import java.util.List;

public class HomeFragment extends Fragment {

    private RecyclerView recyclerView;
    static List<User> userList = new ArrayList<>();

    public View onCreateView(@NonNull LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
        View root = inflater.inflate(R.layout.fragment_home, container, false);
        recyclerView = root.findViewById(R.id.recyclerview);
        User user = new User("sushant", "ktm", "22", "male");
        userList = user.getUserList();
        if (userList.isEmpty()) {
            userList.add(new User("sushant", "ktm", "22", "male"));
            userList.add(new User("sony", "ktm", "22", "female"));
            user.setUserList(userList);
        }
        UserAdapter adapter = new UserAdapter(getActivity(), userList);
        recyclerView.setAdapter(adapter);
        recyclerView.setLayoutManager(new LinearLayoutManager(getActivity()));
        return root;
    }
}
b2720ed88b6584be02dd59051371c79223097d7d
[ "Java" ]
5
Java
sushantmzjn/ESoftwarica
7845fb086c17b607934dfe3ff7d6706c2b1addd3
bd59bd3e6970ee505dd863cf490760bd21a752e8
refs/heads/master
<file_sep>'use strict';
import { Client } from "discord.js";
import pkg from 'node-schedule';
const { RecurrenceRule, scheduleJob, Range } = pkg;

var bot = new Client();
const prefix = "manu"
const commands = {
    "ping": "Checks Latency",
    "sum": "Adds args; eg: manusum 10 20 21"
}

bot.on('ready', () => {
    // change channel name to name of channel or just set to equal the channel ID if you already know it.
    var channel = '803275269007212604';
    var rule = new RecurrenceRule(); // Creates new Recurrence Rule
    rule.dayOfWeek = [new Range(1, 5)];
    rule.hour = 14;
    rule.minute = [55, 57, 59];
    rule.second = [0, 30];
    scheduleJob(rule, function () {
        bot.channels.cache.get(`803275269007212604`).send("Among us at 3pm? @everyone");
    })
    console.log("Bot is ready.");
});

bot.on("message", function (message) {
    if (message.author.bot) return; // to check if the author of the bot is a bot
    if (!message.content.toLowerCase().startsWith(prefix)) return;

    const commandBody = message.content.slice(prefix.length);
    const args = commandBody.split(' '); // splits and arguments args = 10 15 20 command = sum
    const command = args.shift().toLowerCase(); // removes command name from const args and assigns it to const command

    if (command === "ping") {
        const timeTaken = Date.now() - message.createdTimestamp;
        message.reply(`Pong! This message had a latency of ${timeTaken}ms.`);
    } else if (command === "sum") {
        const numArgs = args.map(x => parseFloat(x));
        const sum = numArgs.reduce((counter, x) => counter += x);
        message.reply(`The sum of all the arguments you provided is ${sum}!`);
    } else if (command === "help") {
        var helpText = 'Prefix: manu\n';
        for (var commandName in commands) {
            helpText += `${commandName}: ${commands[commandName]}\n`;
        }
        message.reply(helpText);
    }
});

//login
bot.login('<KEY>');
ecb5d0d8a4699ec0ee97c6b756bc65cadfc7f28d
[ "JavaScript" ]
1
JavaScript
Manu1ND/discordBot
0d0a7221f12ecf8ebac3a92951015b6f7812d99c
5a4d307ad8c641d41e2a1341bf47b31cd8bf5c1d
refs/heads/master
<file_sep># Make a class LatLon that can be passed parameters `lat` and `lon` to the
# constructor

class LatLon:
    def __init__(self, lat, lon):
        self.lat = lat
        self.lon = lon

#%%
# Make a class Waypoint that can be passed parameters `name`, `lat`, and `lon` to the
# constructor. It should inherit from LatLon. Look up the `super` method.

class Waypoint(LatLon):
    def __init__(self, name, lat, lon):
        super().__init__(lat, lon)
        self.name = name

    def __str__(self):
        return '''The waypoint {self.name} is located at Lat{self.lat}, Lon{self.lon}'''.format(self=self)

#%%
# Make a class Geocache that can be passed parameters `name`, `difficulty`,
# `size`, `lat`, and `lon` to the constructor. What should it inherit from?

class Geocache(Waypoint):
    def __init__(self, name, difficulty, size, lat, lon):
        super().__init__(name, lat, lon)
        self.difficulty = difficulty
        self.size = size

    def __str__(self):
        return '''The cache {self.name} of size {self.size} is located at Lat{self.lat}, Lon{self.lon} with a difficulty of {self.difficulty}'''.format(self=self)

#%%
# Make a new waypoint and print it out: "Catacombs", 41.70505, -121.51521
Catacombs = Waypoint("Catacombs", 41.70505, -121.51521)
print(Catacombs.name, Catacombs.lat, Catacombs.lon)

#%%
# Without changing the following line, how can you make it print into something
# more human-readable? Hint: Look up the `object.__str__` method
waypoint = Catacombs
print(waypoint)

# Make a new geocache "Newberry Views", diff 1.5, size 2, 44.052137, -121.41556
Newberry = Geocache("Newberry Views", 1.5, 2, 44.052137, -121.41556)

# Print it--also make this print more nicely
geocache = Newberry.__str__()
print(geocache)
<file_sep>"""
Python makes performing file I/O simple.

Take a look at how to read and write to files here:
https://docs.python.org/3/tutorial/inputoutput.html#reading-and-writing-files
"""

# Open up the "foo.txt" file (which already exists) for reading
# Print all the contents of the file, then close the file
# Note: pay close attention to your current directory when trying to open "foo.txt"
foo = open('foo.txt', 'r')
print(foo.read(-1))
foo.close()

#%%
# Open up a file called "bar.txt" (which doesn't exist yet) for
# writing. Write three lines of arbitrary content to that file,
# then close the file. Open up "bar.txt" and inspect it to make
# sure that it contains what you expect it to contain
bar = open("bar.txt", 'w')
bar.write(
    """
This is an exercise in writing onto a text file.
I think this is interesting to do without modules.
Python has more power than I previously knew.
""")
bar.close()

rbar = open('bar.txt', 'r')
print(rbar.read(-1))
rbar.close()
2f859fdf27f9f5633b3828ca5fbcfac54d67e03f
[ "Python" ]
2
Python
Witterone/Intro-Python-I
f7797ac3fdbe1f54602a8bcf82142b154c37e5ff
864809d35d53fd389f985e9cda574718598a73ee
refs/heads/master
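The exercise file in the record above turns on two ideas: calling the parent initializer through `super()`, and letting `print` pick up a custom `object.__str__`. A compact self-contained illustration of both (the `Point`/`NamedPoint` names are invented for this sketch, not part of the repo):

```python
# Small illustration of the super()/__str__ pattern used in the exercises above.
# The Point and NamedPoint class names are made up for this example.
class Point:
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __str__(self):
        # print(point) and str(point) both route through this method
        return f"Point({self.x}, {self.y})"

class NamedPoint(Point):
    def __init__(self, name, x, y):
        super().__init__(x, y)   # run the parent initializer first
        self.name = name

    def __str__(self):
        # Reuse the parent's formatting rather than duplicating it
        return f"{self.name} at {super().__str__()}"
```

With this in place, `print(NamedPoint("home", 1, 2))` produces a readable line instead of the default `<__main__.NamedPoint object at 0x...>` representation, which is exactly what the exercise's hint about `object.__str__` is driving at.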
<repo_name>alun/vitrina<file_sep>/main/src/java/com/katlex/vitrina/GoodsNavigationException.java
package com.katlex.vitrina;

public class GoodsNavigationException extends RuntimeException {
}
<file_sep>/main/grails-app/services/com/katlex/vitrina/goods/IGoodsNavigationService.java
package com.katlex.vitrina.goods;

/**
 * Service for navigating the goods list
 */
interface IGoodsNavigationService {
    /**
     * Returns the id of the currently viewed goods item
     */
    Long getCurrentGoodsId();

    /**
     * Makes the goods item with id == value the current one.
     * If no item with that id is present in the list,
     * the call returns an error and the current item
     * remains unchanged.
     */
    void setCurrentGoodsId(Long value);

    /**
     * Gets the id of the next goods item in the list
     */
    Long nextGoodsId();

    /**
     * Gets the id of the previous goods item in the list
     */
    Long prevGoodsId();

    /**
     * Gets the total number of goods items in the list
     */
    Long goodsTotal();

    /**
     * Gets the position number of the current goods item in the list
     */
    Long goodsCurrentPosition();

    /**
     * Gets the id of the goods item at the given position in the list
     */
    Long goodsGetIdAt(Long pos);

    /**
     * Removes the current goods item, or the item goodsId, from the goods list
     */
    void removeGoodsFromList(Long goodsId);

    Long removeCurrentGoodsFromList();
}
<file_sep>/main/grails-app/services/com/katlex/vitrina/goods/IGoodsOperationsService.java
package com.katlex.vitrina.goods;

import com.katlex.vitrina.domain.Goods;

interface IGoodsOperationsService {
    Goods saveGoods(Goods goods);

    void deleteGoods(Goods goods);
}
<file_sep>/main/grails-app/services/com/katlex/vitrina/goods/IGoodsListService.java
package com.katlex.vitrina.goods;

import java.util.List;

interface IGoodsListService {
    /**
     * Sets the list of all goods.
     * For a regular (ANONYMOUS) user, the list of all goods consists of
     * moderator-approved items belonging to VALIDATED users.
     * For REGISTERED users, the list additionally includes the items owned
     * by the current user, regardless of their approval status.
     * For MODERATOR users, the list consists of every possible item in the system.
     */
    void allGoods();

    /**
     * Sets the list of unapproved goods.
     * For moderators, sets a list consisting only of unapproved items.
     * For everyone else - SecurityException
     */
    void unApprovedGoods();

    /**
     * For REGISTERED users, sets a goods list consisting only of the items
     * owned by the current user.
     * For ANONYMOUS - SecurityException
     */
    void myGoods();

    /**
     * Adds a filter to the current goods list, possibly changing the list itself
     */
    // void addListFilter( Filter filter);

    // Below are the functions for saving/loading goods lists
    // List names are local to each user
    // For loading, the fully qualified list name "user_name.list_name" may be used
    // A fully qualified name cannot be used when saving

    /**
     * Checks whether a goods list with the given name exists
     */
    Boolean listNameExists(String listName);

    /**
     * Saves the list under the given name. Throws an exception if a list
     * with that name exists and force == false.
     * If force == true, the old list is overwritten
     */
    void saveList(String listName, Boolean force);

    /**
     * Loads the list with the given name. Throws an exception if no such
     * list exists.
     */
    void loadList(String listName);

    /**
     * Gets the names of all saved lists starting with startWith;
     * includes other users' open lists if open == true
     */
    List<String> listNames(String startWith, Boolean open);

    /**
     * Makes the list available to other users
     */
    void setListWorldOpen(String listName, Boolean open);
}
<file_sep>/main/grails-app/services/UserAdministrationService.java
public class UserAdministrationService {
}
7c67b0458c36daa9d199a377ef768489cad8ab50
[ "Java" ]
5
Java
alun/vitrina
3b68dbd62baaf356246f55ba2ea942ec230df811
5935c4445ce59a31feed6a0b02bdda16f6d2ffe8
refs/heads/master
<file_sep>using System; using System.Linq; using System.Collections.Generic; namespace Chains { public static partial class Chains { /// <summary> /// Yields the values in an <see cref="IEnumerable{TSource}"/> /// infinitely, starting over again if the end is reached. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// An infinitely repeating version of the source enumerable. /// </returns> public static IEnumerable<TSource> Cycle<TSource>( this IEnumerable<TSource> source) { source.EnsureNotNull(nameof(source)); if (!source.Any()) yield break; while (true) foreach (var item in source) yield return item; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ScanTests { public const string ExpectedEmptySequence = "Expected sequence to be empty."; public const string ExpectedNonEmptySequence = "Expected sequence not to be empty."; public const string ExpectedSizeMismatch = "Sequence sizes differed."; public const string ExpectedElementMismatch = "Sequence element did not match expectation."; [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan2_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence .Scan((x, y) => x) .ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan2_Should_ThrowIfAccumulatorIsNull() { var sequence = Enumerable.Range(1, 3); var actual = sequence .Scan((Func<int, int, int>)null) .ToList(); } [TestMethod] public void Scan2_Should_ProduceAnEmptySequenceIfSourceIsEmpty() { var sequence = Enumerable.Empty<object>(); var actual = sequence .Scan((x, y) => x);
Assert.IsFalse(actual.Any(), ExpectedEmptySequence); } [TestMethod] public void Scan2_Should_ProduceTheExpectedElements() { var sequence = new[]{ 1, 2, 3 }; var expected = new[]{ 1, 3, 6 }; var actual = sequence .Scan((x, y) => x + y); Assert.AreEqual(expected.Length, actual.Count(), ExpectedSizeMismatch); Assert.IsTrue(actual.SequenceEqual(expected), ExpectedElementMismatch); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan3_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence .Scan(new object(), (x, y) => x) .ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan3_Should_ThrowIfAccumulatorIsNull() { var sequence = Enumerable.Range(1, 3); var actual = sequence .Scan(0, (Func<int, int, int>)null) .ToList(); } [TestMethod] public void Scan3_Should_YieldTheSeedValueIfSourceIsEmpty() { var sequence = Enumerable.Empty<int>(); var seed = 1; var actual = sequence .Scan(seed, (x, y) => x); Assert.AreEqual(1, actual.Count(), ExpectedNonEmptySequence); Assert.AreEqual(seed, actual.First(), ExpectedElementMismatch); } [TestMethod] public void Scan3_Should_ProduceTheExpectedElements() { var sequence = new[]{ 1, 2, 3 }; var seed = 0; var expected = new[]{ seed, 1, 3, 6 }; var actual = sequence .Scan(seed, (x, y) => x + y); Assert.AreEqual(expected.Length, actual.Count(), ExpectedSizeMismatch); Assert.IsTrue(actual.SequenceEqual(expected), ExpectedElementMismatch); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan4_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence .Scan(new object(), (x, y) => x, x => x) .ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Scan4_Should_ThrowIfAccumulatorIsNull() { var sequence = Enumerable.Range(1, 3); var actual = sequence .Scan(0, (Func<int, int, int>)null, x => x) .ToList(); } [TestMethod] 
[ExpectedException(typeof(ArgumentNullException))] public void Scan4_Should_ThrowIfTransformerIsNull() { var sequence = Enumerable.Range(1, 3); var actual = sequence .Scan(0, (x, y) => x, (Func<int, int>)null) .ToList(); } [TestMethod] public void Scan4_Should_YieldTheTransformedSeedValueIfSourceIsEmpty() { var sequence = Enumerable.Empty<int>(); var seed = 1; var expected = 3; var actual = sequence .Scan(seed, (x, y) => x, x => expected); Assert.AreEqual(1, actual.Count(), ExpectedNonEmptySequence); Assert.AreEqual(expected, actual.First(), ExpectedElementMismatch); } [TestMethod] public void Scan4_Should_ProduceTheExpectedElements() { var sequence = new[]{ 1, 2, 3 }; var seed = 0; var offset = 2; var expected = new[]{ offset, 3, 5, 8 }; var actual = sequence .Scan(seed, (x, y) => x + y, x => x + offset); Assert.AreEqual(expected.Length, actual.Count(), ExpectedSizeMismatch); Assert.IsTrue(actual.SequenceEqual(expected), ExpectedElementMismatch); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; namespace Chains { public static partial class Chains { /// <summary> /// Compares the values in two <see cref="IEnumerable{TSource}"/> /// instances, element-by-element. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerables. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="other">The enumerable to compare to.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The other is null. /// </exception> /// <returns> /// The first non-zero value of element-by-element comparison, or the /// value -1 if the first sequence is shorter, the value +1 if the /// second sequence is shorter, or finally the value 0. 
/// </returns> public static int CompareWith<TSource>( this IEnumerable<TSource> source, IEnumerable<TSource> other) where TSource : IComparable<TSource> { source.EnsureNotNull(nameof(source)); other.EnsureNotNull(nameof(other)); var leftEnumerator = source.GetEnumerator(); var rightEnumerator = other.GetEnumerator(); var leftMoved = false; var rightMoved = false; while (true) { leftMoved = leftEnumerator.MoveNext(); rightMoved = rightEnumerator.MoveNext(); if (!leftMoved && !rightMoved) return 0; else if (!leftMoved && rightMoved) return -1; else if (leftMoved && !rightMoved) return 1; var comparison = leftEnumerator.Current .CompareTo(rightEnumerator.Current); if (comparison != 0) return comparison; } } /// <summary> /// Compares the values in two <see cref="IEnumerable{TSource}"/> /// instances, element-by-element, using a key-selection function. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerables. /// </typeparam> /// <typeparam name="TKey"> /// The type of the key. /// This type must implement <see cref="IComparable{T}" />. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="other">The enumerable to compare to.</param> /// <param name="selector">The key-selection function.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The other is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The key-selection function is null. /// </exception> /// <returns> /// The first non-zero value of element-by-element comparison after /// applying the key-selection function, or the value -1 if the first /// sequence is shorter, the value +1 if the second sequence is shorter, /// or finally the value 0. 
/// </returns> public static int CompareWith<TSource, TKey>( this IEnumerable<TSource> source, IEnumerable<TSource> other, Func<TSource, TKey> selector) where TKey : IComparable<TKey> { selector.EnsureNotNull(nameof(selector)); var source2 = source.Select(selector); var other2 = other.Select(selector); return source2.CompareWith(other2); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class CompareWithTests { private const string ExpectedNegative = "Expected a negative integer."; private const string ExpectedPositive = "Expected a positive integer."; private const string ExpectedZero = "Expected zero."; [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void CompareWith_Should_ThrowIfSourceIsNull() { IEnumerable<int> first = null; IEnumerable<int> second = Enumerable.Range(0, 3); var result = first.CompareWith(second); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void CompareWith_Should_ThrowIfOtherIsNull() { IEnumerable<int> first = Enumerable.Range(0, 3); IEnumerable<int> second = null; var result = first.CompareWith(second); } [TestMethod] public void CompareWith_Should_ReturnNegativeIfFirstSequenceIsLesser() { var first = new List<int> { 1, 2, 3, 0 }; var second = new List<int> { 1, 2, 3, 4 }; Assert.IsTrue(first.CompareWith(second) < 0, ExpectedNegative); } [TestMethod] public void CompareWith_Should_ReturnNegativeIfFirstSequenceIsShorter() { var first = new List<int> { 1, 2, 3 }; var second = new List<int> { 1, 2, 3, 4 }; Assert.IsTrue(first.CompareWith(second) < 0, ExpectedNegative); } [TestMethod] public void CompareWith_Should_ReturnPositiveIfSecondSequenceIsLesser() { var first = new List<int> { 1, 2, 3, 4 }; var second = new List<int> { 1, 2, 3, 0 }; Assert.IsTrue(first.CompareWith(second) > 0, ExpectedPositive); } [TestMethod] public void 
CompareWith_Should_ReturnPositiveIfSecondSequenceIsShorter() { var first = new List<int> { 1, 2, 3, 4 }; var second = new List<int> { 1, 2, 3 }; Assert.IsTrue(first.CompareWith(second) > 0, ExpectedPositive); } [TestMethod] public void CompareWith_Should_ReturnZeroIfSequencesAreEqual() { var first = new List<int> { 1, 2, 3, 4 }; var second = new List<int> { 1, 2, 3, 4 }; Assert.IsTrue(first.CompareWith(second) == 0, ExpectedZero); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void CompareWithFunc_Should_ThrowIfSourceIsNull() { IEnumerable<int> first = null; IEnumerable<int> second = Enumerable.Range(0, 3); var result = first.CompareWith(second, x => (int)x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void CompareWithFunc_Should_ThrowIfOtherIsNull() { IEnumerable<int> first = Enumerable.Range(0, 3); IEnumerable<int> second = null; var result = first.CompareWith(second, x => (int)x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void CompareWithFunc_Should_ThrowIfSelectorIsNull() { IEnumerable<int> first = Enumerable.Range(0, 3); IEnumerable<int> second = Enumerable.Range(0, 3); Func<int, int> selector = null; var result = first.CompareWith(second, selector); } [TestMethod] public void CompareWithFunc_Should_ReturnNegativeIfFirstSequenceIsLesser() { var first = new List<string> { "1", "2", "3", "0" }; var second = new List<string> { "1", "2", "3", "4" }; Assert.IsTrue(first.CompareWith(second, int.Parse) < 0, ExpectedNegative); } [TestMethod] public void CompareWithFunc_Should_ReturnNegativeIfFirstSequenceIsShorter() { var first = new List<string> { "1", "2", "3" }; var second = new List<string> { "1", "2", "3", "4" }; Assert.IsTrue(first.CompareWith(second, int.Parse) < 0, ExpectedNegative); } [TestMethod] public void CompareWithFunc_Should_ReturnPositiveIfSecondSequenceIsLesser() { var first = new List<string> { "1", "2", "3", "4" }; var second = new List<string> { "1", "2", "3", 
"0" }; Assert.IsTrue(first.CompareWith(second, int.Parse) > 0, ExpectedPositive); } [TestMethod] public void CompareWithFunc_Should_ReturnPositiveIfSecondSequenceIsShorter() { var first = new List<string> { "1", "2", "3", "4" }; var second = new List<string> { "1", "2", "3" }; Assert.IsTrue(first.CompareWith(second, int.Parse) > 0, ExpectedPositive); } [TestMethod] public void CompareWithFunc_Should_ReturnZeroIfSequencesAreEqual() { var first = new List<string> { "1", "2", "3", "4" }; var second = new List<string> { "1", "2", "3", "4" }; Assert.IsTrue(first.CompareWith(second, int.Parse) == 0, ExpectedZero); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; namespace Chains { public static partial class Chains { /// <summary> /// Produces a sequence of fixed-size windows over the input sequence. /// Windows are produced until there are not enough remaining elements /// to fill an entire window. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="size">The number of elements in each window.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentOutOfRangeException"> /// The size is less than one. /// </exception> /// <returns> /// A sequence of windows over the original sequence. 
/// </returns> public static IEnumerable<IEnumerable<TSource>> Windowed<TSource>( this IEnumerable<TSource> source, int size) { source.EnsureNotNull(nameof(source)); size.EnsureGreaterThan(0, nameof(size)); if (!source.Any()) yield break; var seq = source; var iter = seq.Skip(size - 1).GetEnumerator(); while (iter.MoveNext()) { yield return seq.Take(size); seq = seq.Skip(1); } yield break; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; namespace Chains { public static partial class Chains { /// <summary> /// Applies an accumulator function over a sequence of values, yielding /// a state value each iteration. The seed is used as the initial /// accumulator value, and each intermediate state is transformed using /// a transformation function. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <typeparam name="TState"> /// The type of the intermediate state. /// </typeparam> /// <typeparam name="TResult"> /// The type of the transformed state. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="seed">The initial accumulator value.</param> /// <param name="accumulator">The accumulator function.</param> /// <param name="transformer">The transformation function.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The accumulator function is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The transformation function is null. /// </exception> /// <returns> /// The sequence of transformed intermediate states of the accumulator. 
/// </returns> public static IEnumerable<TResult> Scan<TSource, TState, TResult>( this IEnumerable<TSource> source, TState seed, Func<TState, TSource, TState> accumulator, Func<TState, TResult> transformer) { return source .Scan(seed, accumulator) .Select(transformer); } /// <summary> /// Applies an accumulator function over a sequence of values, yielding /// a state value each iteration. The seed is used as the initial /// accumulator value. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <typeparam name="TState"> /// The type of the intermediate state. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="seed">The initial accumulator value.</param> /// <param name="accumulator">The accumulator function.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The accumulator function is null. /// </exception> /// <returns> /// The sequence of intermediate states of the accumulator. /// </returns> public static IEnumerable<TState> Scan<TSource, TState>( this IEnumerable<TSource> source, TState seed, Func<TState, TSource, TState> accumulator) { source.EnsureNotNull(nameof(source)); accumulator.EnsureNotNull(nameof(accumulator)); yield return seed; var last = seed; foreach (var item in source) { last = accumulator(last, item); yield return last; } } /// <summary> /// Applies an accumulator function over a sequence of values, yielding /// a state value each iteration. The initial accumulator value is the /// first value in the sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="accumulator">The accumulator function.</param> /// <exception cref="ArgumentNullException"> /// The source is null. 
/// </exception> /// <exception cref="ArgumentNullException"> /// The accumulator function is null. /// </exception> /// <returns> /// The sequence of intermediate states of the accumulator. /// </returns> public static IEnumerable<TSource> Scan<TSource>( this IEnumerable<TSource> source, Func<TSource, TSource, TSource> accumulator) { source.EnsureNotNull(nameof(source)); if (!source.Any()) return Enumerable.Empty<TSource>(); var first = source.First(); var rest = source.Skip(1); return rest.Scan(first, accumulator); } } } <file_sep>using System; using System.Collections.Generic; namespace Chains { public static partial class Chains { /// <summary> /// Yields the distinct values in an <see cref="IEnumerable{TSource}"/> /// according to a key-selection function. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <typeparam name="TKey"> /// The type of the key. /// This type must implement <see cref="IEquatable{T}" />. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The key-selection function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <returns> /// A sequence of values distinct by the key-selection function. /// </returns> public static IEnumerable<TSource> DistinctBy<TSource, TKey>( this IEnumerable<TSource> source, Func<TSource, TKey> selector) where TKey : IEquatable<TKey> { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); var hashSet = new HashSet<TKey>(); foreach (var item in source) if (hashSet.Add(selector(item))) yield return item; } } } <file_sep># Chains.NET Spice up your enumerables with extra method chains. ## About This library provides some extra niceties to the `IEnumerable<T>` interface. 
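As a quick illustration, the operators defined in this repository chain naturally alongside standard LINQ. This is a usage sketch written for this README, not part of the shipped samples; the `Scan`, `DistinctBy`, and `Windowed` operators it calls are the ones defined in the source files above.

```csharp
using System.Linq;
using Chains;

// Running totals of 1..5, then keep the first value seen for each parity:
var totals = Enumerable.Range(1, 5)
    .Scan(0, (acc, x) => acc + x)   // 0, 1, 3, 6, 10, 15
    .DistinctBy(x => x % 2)         // 0, 1
    .ToList();

// Sliding windows of size 2 over a sequence:
var windows = new[] { 1, 2, 3 }.Windowed(2);  // [1, 2], [2, 3]
```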
Many .NET developers enjoy using the standard library features in `System.Linq`. Chains.NET aims to make that experience even better by providing more operations on sequences, inspired by a myriad of functional languages. *This project is a work-in-progress. One day I hope to be able to offer it on NuGet.* ## FAQ **Q:** Why "Chains.NET"? **A:** This library provides chainable (a.k.a. fluent) methods, on top of what LINQ offers, and lots of open source projects in the .NET ecosystem use ".NET" in their name. Finally, as a software engineer, by default I am not good at naming things. <file_sep>using System; using System.Collections.Generic; namespace Chains { public static partial class Chains { /// <summary> /// Performs a given action on each element in an /// <see cref="IEnumerable{T}" />. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="action">The action to perform on each element.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The action is null.
/// </exception> public static void ForEach<TSource>( this IEnumerable<TSource> source, Action<TSource> action) { source.EnsureNotNull(nameof(source)); action.EnsureNotNull(nameof(action)); foreach (var element in source) action(element); } } } <file_sep>using System; namespace Chains { internal static class RequirementHelper { private const string ArgumentTooSmallMessage = "Given argument was less than the specified minimum of {0}."; private const string ArgumentTooLargeMessage = "Given argument was greater than the specified maximum of {0}."; internal static TSource EnsureNotNull<TSource>(this TSource source, string name) { if (source == null) throw new ArgumentNullException(name); return source; } internal static TSource EnsureBetweenInclusive<TSource, TCompare>(this TSource source, TCompare min, TCompare max, string name) where TSource : IComparable<TCompare> { if (source.CompareTo(min) < 0) throw new ArgumentOutOfRangeException(name, source, string.Format(ArgumentTooSmallMessage, min)); else if (source.CompareTo(max) > 0) throw new ArgumentOutOfRangeException(name, source, string.Format(ArgumentTooLargeMessage, max)); return source; } internal static TSource EnsureGreaterThan<TSource, TCompare>(this TSource source, TCompare other, string name) where TSource : IComparable<TCompare> { if (source.CompareTo(other) <= 0) throw new ArgumentOutOfRangeException(name, source, string.Format(ArgumentTooSmallMessage, other)); return source; } internal static TSource EnsureLessThan<TSource, TCompare>(this TSource source, TCompare other, string name) where TSource : IComparable<TCompare> { if (source.CompareTo(other) >= 0) throw new ArgumentOutOfRangeException(name, source, string.Format(ArgumentTooLargeMessage, other)); return source; } internal static TSource Ensure<TSource>(this TSource source, Func<TSource, bool> requirement, string name, string message) { if (!requirement(source)) throw new ArgumentException(message, name); return source; } } } <file_sep>using System;
using System.Linq; using System.Collections.Generic; namespace Chains { public static partial class Chains { /// <summary> /// Yields the values in an <see cref="IEnumerable{TSource}"/> /// except the first element. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="InvalidOperationException"> /// The source is empty. /// </exception> /// <returns> /// An enumerable containing all except the first element in the source. /// </returns> public static IEnumerable<TSource> ExceptFirst<TSource>( this IEnumerable<TSource> source) { source.EnsureNotNull(nameof(source)); // TODO: Replace with helper method if (!source.Any()) throw new InvalidOperationException(); foreach (var item in source.Skip(1)) yield return item; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ExceptFirstTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ExceptFirst_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence.ExceptFirst().ToList(); } [TestMethod] [ExpectedException(typeof(InvalidOperationException))] public void ExceptFirst_Should_ThrowIfSourceIsEmpty() { var sequence = Enumerable.Empty<object>(); var actual = sequence.ExceptFirst().ToList(); } [TestMethod] public void ExceptFirst_Should_YieldAllElementsExceptTheFirst() { var sequence = Enumerable.Range(1, 3); var expected = new[]{ 2, 3 }; var actual = sequence.ExceptFirst(); Assert.IsTrue( actual.SequenceEqual(expected), "Actual and expected sequences did not match."); } [TestMethod] public void ExceptFirst_Should_BeTheInverseOfFirst() { var sequence = Enumerable.Range(1, 3); var 
first = sequence.First(); var exceptFirst = sequence.ExceptFirst(); var concat = exceptFirst.Prepend(first); Assert.IsTrue( concat.SequenceEqual(sequence), "Inverse constraint failed."); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ForEachTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ForEach_Should_ThrowIfSourceIsNull() { IEnumerable<int> seq = null; seq.ForEach(Console.WriteLine); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ForEach_Should_ThrowIfActionIsNull() { var seq = Enumerable.Range(0, 10); Action<int> action = null; seq.ForEach(action); } [TestMethod] public void ForEach_Should_ExecuteTheActionForEachItem() { var count = 0; var expected = 6; var seq = Enumerable.Range(0, 4); void Increment(int i) => count += i; seq.ForEach(Increment); Assert.AreEqual(expected, count, "Did not produce expected value."); } } } <file_sep>using System; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class RequirementHelperTests { private const string DidNotReturnSameReference = "Did not return the same reference."; private const string DidNotReturnEqualValue = "Did not return equal value."; [TestMethod] public void EnsureNotNull_Should_ReturnSameObjectIfNotNull() { var expected = "Hello, world!"; var actual = expected.EnsureNotNull(nameof(expected)); Assert.AreSame(expected, actual, DidNotReturnSameReference); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void EnsureNotNull_Should_ThrowIfNull() { var expected = (string)null; var actual = expected.EnsureNotNull(nameof(expected)); } [TestMethod] public void EnsureBetweenInclusive_Should_ReturnSameObjectIfBetweenInclusive() { var expected = 5; var min = 0; var max = 10; var actual = expected.EnsureBetweenInclusive(min, max, 
nameof(expected)); Assert.AreEqual(expected, actual, DidNotReturnEqualValue); } [TestMethod] public void EnsureBetweenInclusive_Should_NotThrowIfEqualToMinimum() { var expected = 0; var min = 0; var max = 10; var actual = expected.EnsureBetweenInclusive(min, max, nameof(expected)); Assert.AreEqual(expected, actual, DidNotReturnEqualValue); } [TestMethod] public void EnsureBetweenInclusive_Should_NotThrowIfEqualToMaximum() { var expected = 10; var min = 0; var max = 10; var actual = expected.EnsureBetweenInclusive(min, max, nameof(expected)); Assert.AreEqual(expected, actual, DidNotReturnEqualValue); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureBetweenInclusive_Should_ThrowIfLessThanMinimum() { var expected = -1; var min = 0; var max = 10; var actual = expected.EnsureBetweenInclusive(min, max, nameof(expected)); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureBetweenInclusive_Should_ThrowIfGreaterThanMaximum() { var expected = 11; var min = 0; var max = 10; var actual = expected.EnsureBetweenInclusive(min, max, nameof(expected)); } [TestMethod] public void EnsureGreaterThan_Should_ReturnSameObjectIfGreaterThanOther() { var expected = "abc"; var other = "aaa"; var actual = expected.EnsureGreaterThan(other, nameof(expected)); Assert.AreSame(expected, actual, DidNotReturnSameReference); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureGreaterThan_Should_ThrowIfEqualToOther() { var expected = "abc"; var other = "abc"; var actual = expected.EnsureGreaterThan(other, nameof(expected)); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureGreaterThan_Should_ThrowIfLesserThanOther() { var expected = "abc"; var other = "zzz"; var actual = expected.EnsureGreaterThan(other, nameof(expected)); } [TestMethod] public void EnsureLessThan_Should_ReturnSameObjectIfLesserThanOther() { var expected = "abc"; var other =
"zzz"; var actual = expected.EnsureLessThan(other, nameof(expected)); Assert.AreSame(expected, actual, DidNotReturnSameReference); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureLessThan_Should_ThrowIfEqualToOther() { var expected = "abc"; var other = "abc"; var actual = expected.EnsureLessThan(other, nameof(expected)); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void EnsureLessThan_Should_ThrowIfGreaterThanOther() { var expected = "abc"; var other = "aaa"; var actual = expected.EnsureLessThan(other, nameof(expected)); } [TestMethod] public void Ensure_Should_ReturnSameInstanceIfRequirmentPassed() { var expected = "Hello, world."; var actual = expected.Ensure(s => true, nameof(expected), "Did not pass expectation."); Assert.AreSame(expected, actual, DidNotReturnEqualValue); } [TestMethod] [ExpectedException(typeof(ArgumentException))] public void Ensure_Should_ThrowIfRequirementFailed() { var expected = "Hello, world!"; var actual = expected.Ensure(s => false, nameof(expected), "Passed expectation."); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class WindowedTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Windowed_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence.Windowed(1).ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void Windowed_Should_ThrowIfSizeIsZero() { var sequence = Enumerable.Range(0, 10); var size = 0; var actual = sequence.Windowed(size).ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentOutOfRangeException))] public void Windowed_Should_ThrowIfSizeIsNegative() { var sequence = Enumerable.Range(0, 10); var size = -1; var actual = sequence.Windowed(size).ToList(); } [TestMethod] public void 
Windowed_Should_ProduceAnEmptySequenceIfNotEnoughElements() { var sequence = Enumerable.Range(1, 1); var size = 2; var actual = sequence.Windowed(size); Assert.IsFalse(actual.Any(), "Expected empty sequence."); } [TestMethod] public void Windowed_Should_ProduceFixedSizeWindowsUntilEnd() { var sequence = new[]{ 1, 2, 3, 4, 5 }; var size = 3; var expected = new List<int[]> { new[]{ 1, 2, 3 }, new[]{ 2, 3, 4 }, new[]{ 3, 4, 5 } }; var actual = sequence.Windowed(size); Assert.AreEqual(expected.Count, actual.Count(), "Mismatched counts at top level."); foreach (var window in actual) Assert.AreEqual(size, window.Count(), "Mismatched counts at window level."); Assert.IsTrue( expected .SelectMany(x => x) .SequenceEqual(actual.SelectMany(x => x)), "Expected sequences to match."); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Tests { [TestClass] public class ProductTests { [TestMethod] public void ProductOfInts_Should_ComputeTheProduct() { var ints = new[]{ 1, 2, 3 }; var expected = 1 * 2 * 3; var actual = ints.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfInts_Should_ReturnOneForEmptySequence() { var ints = Enumerable.Empty<int>(); var expected = 1; var actual = ints.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfInts_Should_ThrowForNullSequence() { var ints = (IEnumerable<int>)null; var actual = ints.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfInts_Should_ThrowOnOverflow() { var ints = new[]{ int.MaxValue, 10 }; var actual = ints.Product(); } [TestMethod] public void ProductOfNullableInts_Should_ComputeTheProduct() { var ints = new int?[]{ 1, 2, 3 }; var expected = (int?)(1 * 2 * 3); var actual = ints.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void 
ProductOfNullableInts_Should_IgnoreNullValues() { var ints = new int?[]{ 1, null, 2, null, 3, null }; var expected = (int?)(1 * 2 * 3); var actual = ints.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableInts_Should_ReturnOneForEmptySequence() { var ints = Enumerable.Empty<int?>(); var expected = (int?)1; var actual = ints.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfNullableInts_Should_ThrowForNullSequence() { var ints = (IEnumerable<int?>)null; var actual = ints.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfNullableInts_Should_ThrowOnOverflow() { var ints = new int?[]{ int.MaxValue, 10 }; var actual = ints.Product(); } [TestMethod] public void ProductOfLongs_Should_ComputeTheProduct() { var longs = new[]{ 1L, 2L, 3L }; var expected = 1L * 2L * 3L; var actual = longs.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfLongs_Should_ReturnOneForEmptySequence() { var longs = Enumerable.Empty<long>(); var expected = 1L; var actual = longs.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfLongs_Should_ThrowForNullSequence() { var longs = (IEnumerable<long>)null; var actual = longs.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfLongs_Should_ThrowOnOverflow() { var longs = new[]{ long.MaxValue, 10L }; var actual = longs.Product(); } [TestMethod] public void ProductOfNullableLongs_Should_ComputeTheProduct() { var longs = new long?[]{ 1L, 2L, 3L }; var expected = (long?)(1L * 2L * 3L); var actual = longs.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableLongs_Should_IgnoreNullValues() { var longs = new long?[]{ 1L, null, 2L, null, 3L, null }; var expected = (long?)(1L * 2L * 3L); var actual = 
longs.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableLongs_Should_ReturnOneForEmptySequence() { var longs = Enumerable.Empty<long?>(); var expected = (long?)1L; var actual = longs.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfNullableLongs_Should_ThrowForNullSequence() { var longs = (IEnumerable<long?>)null; var actual = longs.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfNullableLongs_Should_ThrowOnOverflow() { var longs = new long?[]{ long.MaxValue, 10L }; var actual = longs.Product(); } [TestMethod] public void ProductOfFloats_Should_ComputeTheProduct() { var floats = new[]{ 1.0F, 2.0F, 3.0F }; var expected = 1.0F * 2.0F * 3.0F; var actual = floats.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfFloats_Should_ReturnOneForEmptySequence() { var floats = Enumerable.Empty<float>(); var expected = 1.0F; var actual = floats.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfFloats_Should_ThrowForNullSequence() { var floats = (IEnumerable<float>)null; var actual = floats.Product(); } [TestMethod] public void ProductOfNullableFloats_Should_ComputeTheProduct() { var floats = new float?[]{ 1.0F, 2.0F, 3.0F }; var expected = (float?)(1.0F * 2.0F * 3.0F); var actual = floats.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableFloats_Should_IgnoreNullValues() { var floats = new float?[]{ 1.0F, null, 2.0F, null, 3.0F, null }; var expected = (float?)(1.0F * 2.0F * 3.0F); var actual = floats.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableFloats_Should_ReturnOneForEmptySequence() { var floats = Enumerable.Empty<float?>(); var expected = (float?)1.0F; var actual = floats.Product(); Assert.AreEqual(expected, 
actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfNullableFloats_Should_ThrowForNullSequence() { var floats = (IEnumerable<float?>)null; var actual = floats.Product(); } [TestMethod] public void ProductOfDoubles_Should_ComputeTheProduct() { var doubles = new[]{ 1.0, 2.0, 3.0 }; var expected = 1.0 * 2.0 * 3.0; var actual = doubles.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfDoubles_Should_ReturnOneForEmptySequence() { var doubles = Enumerable.Empty<double>(); var expected = 1.0; var actual = doubles.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfDoubles_Should_ThrowForNullSequence() { var doubles = (IEnumerable<double>)null; var actual = doubles.Product(); } [TestMethod] public void ProductOfNullableDoubles_Should_ComputeTheProduct() { var doubles = new double?[]{ 1.0, 2.0, 3.0 }; var expected = (double?)(1.0 * 2.0 * 3.0); var actual = doubles.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableDoubles_Should_IgnoreNullValues() { var doubles = new double?[]{ 1.0, null, 2.0, null, 3.0, null }; var expected = (double?)(1.0 * 2.0 * 3.0); var actual = doubles.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableDoubles_Should_ReturnOneForEmptySequence() { var doubles = Enumerable.Empty<double?>(); var expected = (double?)1.0; var actual = doubles.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfNullableDoubles_Should_ThrowForNullSequence() { var doubles = (IEnumerable<double?>)null; var actual = doubles.Product(); } [TestMethod] public void ProductOfDecimals_Should_ComputeTheProduct() { var decimals = new[]{ 1.0M, 2.0M, 3.0M }; var expected = 1.0M * 2.0M * 3.0M; var actual = decimals.Product(); Assert.AreEqual(expected, actual); } [TestMethod] 
public void ProductOfDecimals_Should_ReturnOneForEmptySequence() { var decimals = Enumerable.Empty<decimal>(); var expected = 1.0M; var actual = decimals.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfDecimals_Should_ThrowForNullSequence() { var decimals = (IEnumerable<decimal>)null; var actual = decimals.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfDecimals_Should_ThrowOnOverflow() { var decimals = new[]{ decimal.MaxValue, 10.0M }; var actual = decimals.Product(); } [TestMethod] public void ProductOfNullableDecimals_Should_ComputeTheProduct() { var decimals = new decimal?[]{ 1.0M, 2.0M, 3.0M }; var expected = (decimal?)(1.0M * 2.0M * 3.0M); var actual = decimals.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableDecimals_Should_IgnoreNullValues() { var decimals = new decimal?[]{ 1.0M, null, 2.0M, null, 3.0M, null }; var expected = (decimal?)(1.0M * 2.0M * 3.0M); var actual = decimals.Product(); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfNullableDecimals_Should_ReturnOneForEmptySequence() { var decimals = Enumerable.Empty<decimal?>(); var expected = (decimal?)1.0M; var actual = decimals.Product(); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfNullableDecimals_Should_ThrowForNullSequence() { var decimals = (IEnumerable<decimal?>)null; var actual = decimals.Product(); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfNullableDecimals_Should_ThrowOnOverflow() { var decimals = new decimal?[]{ decimal.MaxValue, 10.0M }; var actual = decimals.Product(); } [TestMethod] public void ProductOfTransformedInts_Should_ComputeTheProduct() { var ints = new[]{ 1, 2, 3 }; var expected = 1 * 2 * 3; var actual = ints.Product(x => x); Assert.AreEqual(expected, actual); } 
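// NOTE: Hedged addition, not part of the original suite. The transformed-product
// tests in this class all use the identity selector (x => x), so they would also
// pass if the selector argument were silently ignored. This hypothetical test
// sketches a selector that actually changes the values, assuming the same
// Product<TSource>(source, Func<TSource, int>) overload defined in Chains.
[TestMethod] public void ProductOfTransformedInts_Should_ApplyTheSelector() { var ints = new[]{ 1, 2, 3 }; var expected = 2 * 4 * 6; var actual = ints.Product(x => x * 2); Assert.AreEqual(expected, actual); }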
[TestMethod] public void ProductOfTransformedInts_Should_ReturnOneForEmptySequence() { var ints = Enumerable.Empty<int>(); var expected = 1; var actual = ints.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedInts_Should_ThrowForNullSequence() { var ints = (IEnumerable<int>)null; var actual = ints.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedInts_Should_ThrowForNullSelector() { var ints = new[]{ 1, 2, 3 }; var actual = ints.Product((Func<int, int>)null); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfTransformedInts_Should_ThrowOnOverflow() { var ints = new[]{ int.MaxValue, 10 }; var actual = ints.Product(x => x); } [TestMethod] public void ProductOfTransformedNullableInts_Should_ComputeTheProduct() { var ints = new int?[]{ 1, 2, 3 }; var expected = (int?)(1 * 2 * 3); var actual = ints.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableInts_Should_IgnoreNullValues() { var ints = new int?[]{ 1, null, 2, null, 3, null }; var expected = (int?)(1 * 2 * 3); var actual = ints.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableInts_Should_ReturnOneForEmptySequence() { var ints = Enumerable.Empty<int?>(); var expected = (int?)1; var actual = ints.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableInts_Should_ThrowForNullSequence() { var ints = (IEnumerable<int?>)null; var actual = ints.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableInts_Should_ThrowForNullSelector() { var ints = new int?[]{ 1, 2, 3 }; var actual = ints.Product((Func<int?, int?>)null); } [TestMethod] 
[ExpectedException(typeof(OverflowException))] public void ProductOfTransformedNullableInts_Should_ThrowOnOverflow() { var ints = new int?[]{ int.MaxValue, 10 }; var actual = ints.Product(x => x); } [TestMethod] public void ProductOfTransformedLongs_Should_ComputeTheProduct() { var longs = new[]{ 1L, 2L, 3L }; var expected = 1L * 2L * 3L; var actual = longs.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedLongs_Should_ReturnOneForEmptySequence() { var longs = Enumerable.Empty<long>(); var expected = 1L; var actual = longs.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedLongs_Should_ThrowForNullSequence() { var longs = (IEnumerable<long>)null; var actual = longs.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedLongs_Should_ThrowForNullSelector() { var longs = new[]{ 1L, 2L, 3L }; var actual = longs.Product((Func<long, long>)null); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfTransformedLongs_Should_ThrowOnOverflow() { var longs = new[]{ long.MaxValue, 10L }; var actual = longs.Product(x => x); } [TestMethod] public void ProductOfTransformedNullableLongs_Should_ComputeTheProduct() { var longs = new long?[]{ 1L, 2L, 3L }; var expected = (long?)(1L * 2L * 3L); var actual = longs.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableLongs_Should_IgnoreNullValues() { var longs = new long?[]{ 1L, null, 2L, null, 3L, null }; var expected = (long?)(1L * 2L * 3L); var actual = longs.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableLongs_Should_ReturnOneForEmptySequence() { var longs = Enumerable.Empty<long?>(); var expected = (long?)1L; var actual = longs.Product(x => x); Assert.AreEqual(expected, actual); } 
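// NOTE: Hedged addition, not part of the original suite. The selector overload
// is generic in TSource, so the projection may come from a non-numeric element
// type; this hypothetical test sketches that usage by multiplying string lengths.
[TestMethod] public void ProductOfTransformedStrings_Should_ProjectBeforeMultiplying() { var words = new[]{ "a", "bb", "ccc" }; var expected = 1 * 2 * 3; var actual = words.Product(s => s.Length); Assert.AreEqual(expected, actual); }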
[TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableLongs_Should_ThrowForNullSequence() { var longs = (IEnumerable<long?>)null; var actual = longs.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableLongs_Should_ThrowForNullSelector() { var longs = new long?[]{ 1L, 2L, 3L }; var actual = longs.Product((Func<long?, long?>)null); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfTransformedNullableLongs_Should_ThrowOnOverflow() { var longs = new long?[]{ long.MaxValue, 10L }; var actual = longs.Product(x => x); } [TestMethod] public void ProductOfTransformedFloats_Should_ComputeTheProduct() { var floats = new[]{ 1.0F, 2.0F, 3.0F }; var expected = 1.0F * 2.0F * 3.0F; var actual = floats.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedFloats_Should_ReturnOneForEmptySequence() { var floats = Enumerable.Empty<float>(); var expected = 1.0F; var actual = floats.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedFloats_Should_ThrowForNullSequence() { var floats = (IEnumerable<float>)null; var actual = floats.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedFloats_Should_ThrowForNullSelector() { var floats = new[]{ 1.0F, 2.0F, 3.0F }; var actual = floats.Product((Func<float, float>)null); } [TestMethod] public void ProductOfTransformedNullableFloats_Should_ComputeTheProduct() { var floats = new float?[]{ 1.0F, 2.0F, 3.0F }; var expected = (float?)(1.0F * 2.0F * 3.0F); var actual = floats.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableFloats_Should_IgnoreNullValues() { var floats = new float?[]{ 1.0F, null, 2.0F, null, 3.0F, null }; var expected = 
(float?)(1.0F * 2.0F * 3.0F); var actual = floats.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableFloats_Should_ReturnOneForEmptySequence() { var floats = Enumerable.Empty<float?>(); var expected = (float?)1.0F; var actual = floats.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableFloats_Should_ThrowForNullSequence() { var floats = (IEnumerable<float?>)null; var actual = floats.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableFloats_Should_ThrowForNullSelector() { var floats = new float?[]{ 1.0F, 2.0F, 3.0F }; var actual = floats.Product((Func<float?, float?>)null); } [TestMethod] public void ProductOfTransformedDoubles_Should_ComputeTheProduct() { var doubles = new[]{ 1.0, 2.0, 3.0 }; var expected = 1.0 * 2.0 * 3.0; var actual = doubles.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedDoubles_Should_ReturnOneForEmptySequence() { var doubles = Enumerable.Empty<double>(); var expected = 1.0; var actual = doubles.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedDoubles_Should_ThrowForNullSequence() { var doubles = (IEnumerable<double>)null; var actual = doubles.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedDoubles_Should_ThrowForNullSelector() { var doubles = new[]{ 1.0, 2.0, 3.0 }; var actual = doubles.Product((Func<double, double>)null); } [TestMethod] public void ProductOfTransformedNullableDoubles_Should_ComputeTheProduct() { var doubles = new double?[]{ 1.0, 2.0, 3.0 }; var expected = (double?)(1.0 * 2.0 * 3.0); var actual = doubles.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void 
ProductOfTransformedNullableDoubles_Should_IgnoreNullValues() { var doubles = new double?[]{ 1.0, null, 2.0, null, 3.0, null }; var expected = (double?)(1.0 * 2.0 * 3.0); var actual = doubles.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableDoubles_Should_ReturnOneForEmptySequence() { var doubles = Enumerable.Empty<double?>(); var expected = (double?)1.0; var actual = doubles.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableDoubles_Should_ThrowForNullSequence() { var doubles = (IEnumerable<double?>)null; var actual = doubles.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableDoubles_Should_ThrowForNullSelector() { var doubles = new double?[]{ 1.0, 2.0, 3.0 }; var actual = doubles.Product((Func<double?, double?>)null); } [TestMethod] public void ProductOfTransformedDecimals_Should_ComputeTheProduct() { var decimals = new[]{ 1.0M, 2.0M, 3.0M }; var expected = 1.0M * 2.0M * 3.0M; var actual = decimals.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedDecimals_Should_ReturnOneForEmptySequence() { var decimals = Enumerable.Empty<decimal>(); var expected = 1.0M; var actual = decimals.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedDecimals_Should_ThrowForNullSequence() { var decimals = (IEnumerable<decimal>)null; var actual = decimals.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedDecimals_Should_ThrowForNullSelector() { var decimals = new[]{ 1.0M, 2.0M, 3.0M }; var actual = decimals.Product((Func<decimal, decimal>)null); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void 
ProductOfTransformedDecimals_Should_ThrowOnOverflow() { var decimals = new[]{ decimal.MaxValue, 10.0M }; var actual = decimals.Product(x => x); } [TestMethod] public void ProductOfTransformedNullableDecimals_Should_ComputeTheProduct() { var decimals = new decimal?[]{ 1.0M, 2.0M, 3.0M }; var expected = (decimal?)(1.0M * 2.0M * 3.0M); var actual = decimals.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableDecimals_Should_IgnoreNullValues() { var decimals = new decimal?[]{ 1.0M, null, 2.0M, null, 3.0M, null }; var expected = (decimal?)(1.0M * 2.0M * 3.0M); var actual = decimals.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] public void ProductOfTransformedNullableDecimals_Should_ReturnOneForEmptySequence() { var decimals = Enumerable.Empty<decimal?>(); var expected = (decimal?)1.0M; var actual = decimals.Product(x => x); Assert.AreEqual(expected, actual); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableDecimals_Should_ThrowForNullSequence() { var decimals = (IEnumerable<decimal?>)null; var actual = decimals.Product(x => x); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ProductOfTransformedNullableDecimals_Should_ThrowForNullSelector() { var decimals = new decimal?[]{ 1.0M, 2.0M, 3.0M }; var actual = decimals.Product((Func<decimal?, decimal?>)null); } [TestMethod] [ExpectedException(typeof(OverflowException))] public void ProductOfTransformedNullableDecimals_Should_ThrowOnOverflow() { var decimals = new decimal?[]{ decimal.MaxValue, 10.0M }; var actual = decimals.Product(x => x); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ChunkByTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ChunkBy_Should_ThrowIfSourceIsNull() { var 
seq = ((IEnumerable<int>)null).ChunkBy(1).ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentException))] public void ChunkBy_Should_ThrowIfSizeIsNegative() { var seq = new List<int> { 1, 2, 3 }; var newSeq = seq.ChunkBy(-1).ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentException))] public void ChunkBy_Should_ThrowIfSizeIsZero() { var seq = new List<int> { 1, 2, 3 }; var newSeq = seq.ChunkBy(0).ToList(); } [TestMethod] public void ChunkBy_Should_ReturnAnEmptySequenceIfSourceIsEmpty() { var seq = Enumerable.Empty<int>(); var size = 2; var actual = seq.ChunkBy(size); Assert.IsFalse(actual.Any(), "Expected empty sequence."); } [TestMethod] public void ChunkBy_Should_CreateChunksNoLargerThanSize() { var seq = Enumerable.Range(0, 10); var size = 2; var actual = seq.ChunkBy(size); foreach (var item in actual) { var chunkSize = item.Count(); Assert.IsTrue( chunkSize <= size, $"Expected chunk size <= {size}, size was {chunkSize}."); } } [TestMethod] public void ChunkBy_Should_CreateNoMoreChunksThanNecessary() { var seq = Enumerable.Range(0, 10); var size = 2; var numberOfChunks = 5; var actual = seq.ChunkBy(size); Assert.AreEqual( numberOfChunks, actual.Count(), "Incorrect number of chunks."); } [TestMethod] public void ChunkBy_Should_CreateSmallerChunkIfNotEnoughElementsRemain() { var seq = Enumerable.Range(0, 10); var size = 3; var actual = seq.ChunkBy(size); Assert.IsTrue( actual.Last().Count() < size, $"Expected last chunk size strictly less than {size}."); } [TestMethod] public void ChunkBy_Should_NotChangeTheOrderOfElements() { var seq = Enumerable.Range(0, 10); var size = 2; var actual = seq.ChunkBy(size).SelectMany(x => x); Assert.IsTrue( seq.SequenceEqual(actual), "Expected sequences to be equal."); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; namespace Chains { public static partial class Chains { /// <summary> /// Splits elements of an <see cref="IEnumerable{TSource}"/> into chunks /// with a given
maximum size. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="size">The maximum chunk size.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentException"> /// The size is not a positive integer. /// </exception> /// <returns> /// An enumerable of enumerables, where each one contains at most `size` /// number of elements. /// </returns> public static IEnumerable<IEnumerable<TSource>> ChunkBy<TSource>( this IEnumerable<TSource> source, int size) { source.EnsureNotNull(nameof(source)); size.Ensure(x => x > 0, nameof(size), "Must be a positive integer."); // An empty source produces no chunks at all, rather than one empty chunk. if (!source.Any()) { yield break; } var chunk = source.Take(size); var rest = source.Skip(size); while (chunk.Any()) { yield return chunk; chunk = rest.Take(size); rest = rest.Skip(size); } } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ExceptLastTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ExceptLast_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence.ExceptLast().ToList(); } [TestMethod] [ExpectedException(typeof(InvalidOperationException))] public void ExceptLast_Should_ThrowIfSourceIsEmpty() { var sequence = Enumerable.Empty<object>(); var actual = sequence.ExceptLast().ToList(); } [TestMethod] public void ExceptLast_Should_YieldAllElementsExceptTheLast() { var sequence = Enumerable.Range(1, 3); var expected = new[]{ 1, 2 }; var actual = sequence.ExceptLast(); Assert.IsTrue( actual.SequenceEqual(expected), "Actual and expected sequences did not match."); } [TestMethod] public void ExceptLast_Should_BeTheInverseOfLast() {
var sequence = Enumerable.Range(1, 3); var last = sequence.Last(); var exceptLast = sequence.ExceptLast(); var concat = exceptLast.Append(last); Assert.IsTrue( concat.SequenceEqual(sequence), "Inverse constraint failed."); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class DistinctByTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void DistinctBy_Should_ThrowIfSourceIsNull() { var seq = ((IEnumerable<int>)null).DistinctBy(x => x).ToList(); } [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void DistinctBy_Should_ThrowIfKeySelectorIsNull() { var seq = new List<int> { 1, 2, 3 }; var newSeq = seq.DistinctBy((Func<int, int>)null).ToList(); } [TestMethod] public void DistinctBy_Should_SelectValuesByKey() { var seq = new List<int> { 1, 2, 3, 4 }; var expected = new List<int> { 1, 2 }; var actual = seq.DistinctBy(x => x % 2); Assert.AreEqual( expected.Count, actual.Count(), "Mismatched number of distinct values."); foreach (var item in actual) Assert.IsTrue( expected.Contains(item), $"Expected {item} in sequence."); } } } <file_sep>using System; using System.Collections.Generic; namespace Chains { public static partial class Chains { /// <summary> /// Creates a histogram of a given <see cref="IEnumerable{T}" />. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// An <see cref="IDictionary{TKey,TValue}" /> mapping each element in /// the source enumerable to the number of times it appeared.
/// </returns> public static IDictionary<TSource, int> ToHistogram<TSource>( this IEnumerable<TSource> source) { source.EnsureNotNull(nameof(source)); var dict = new Dictionary<TSource, int>(); foreach (var element in source) { if (!dict.ContainsKey(element)) { dict[element] = 0; } dict[element]++; } return dict; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class ToHistogramTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void ToHistogram_Should_ThrowIfSourceIsNull() { var seq = ((IEnumerable<int>)null).ToHistogram(); } [TestMethod] public void ToHistogram_Should_ReturnAnEmptyDictionaryIfSourceIsEmpty() { var seq = Enumerable.Empty<int>(); var actual = seq.ToHistogram(); Assert.IsFalse(actual.Any(), "Expected empty dictionary."); } [TestMethod] public void ToHistogram_Should_GenerateCountsOfUniqueValues() { var seq = new[] { 1, 2, 3, 1, 2, 1 }; var actual = seq.ToHistogram(); Assert.IsTrue(actual.ContainsKey(1), "Expected key 1 to exist."); Assert.AreEqual(3, actual[1], "Expected 3 occurrences of key 1."); Assert.IsTrue(actual.ContainsKey(2), "Expected key 2 to exist."); Assert.AreEqual(2, actual[2], "Expected 2 occurrences of key 2."); Assert.IsTrue(actual.ContainsKey(3), "Expected key 3 to exist."); Assert.AreEqual(1, actual[3], "Expected 1 occurrence of key 3."); } } } <file_sep>using System; namespace Chains { /// <summary> /// Provides additional extension methods for <see cref="IEnumerable{T}" /> /// similar to LINQ. /// </summary> public static partial class Chains { } } <file_sep>using System; using System.Collections.Generic; using System.Linq; namespace Chains { public static partial class Chains { /// <summary> /// Computes the product of a sequence of <see cref="Int32" /> values.
/// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static int Product(this IEnumerable<int> source) { source.EnsureNotNull(nameof(source)); return source .Aggregate(1, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Int32" /> values, excluding null values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static int? Product(this IEnumerable<int?> source) { source.EnsureNotNull(nameof(source)); return source .OfType<int>() .Aggregate(1, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of <see cref="Int64" /> values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static long Product(this IEnumerable<long> source) { source.EnsureNotNull(nameof(source)); return source .Aggregate(1L, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Int64" /> values, excluding null values. 
/// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static long? Product(this IEnumerable<long?> source) { source.EnsureNotNull(nameof(source)); return source .OfType<long>() .Aggregate(1L, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of <see cref="Single" /> values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static float Product(this IEnumerable<float> source) { source.EnsureNotNull(nameof(source)); return source .Aggregate(1.0F, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Single" /> values, excluding null values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static float? Product(this IEnumerable<float?> source) { source.EnsureNotNull(nameof(source)); return source .OfType<float>() .Aggregate(1.0F, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of <see cref="Double" /> values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. 
/// </returns> public static double Product(this IEnumerable<double> source) { source.EnsureNotNull(nameof(source)); return source .Aggregate(1.0, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Double" /> values, excluding null values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static double? Product(this IEnumerable<double?> source) { source.EnsureNotNull(nameof(source)); return source .OfType<double>() .Aggregate(1.0, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of <see cref="Decimal" /> values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static decimal Product(this IEnumerable<decimal> source) { source.EnsureNotNull(nameof(source)); return source .Aggregate(1.0M, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Decimal" /> values, excluding null values. /// </summary> /// <param name="source">The source enumerable to operate on.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the values, or one if the sequence is empty. /// </returns> public static decimal? 
Product(this IEnumerable<decimal?> source) { source.EnsureNotNull(nameof(source)); return source .OfType<decimal>() .Aggregate(1.0M, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of <see cref="Int32" /> values, /// obtained by invoking a transform function on the elements of the /// input sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static int Product<TSource>( this IEnumerable<TSource> source, Func<TSource, int> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .Aggregate(1, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable <see cref="Int32" /> /// values, obtained by invoking a transform function on the elements /// of the input sequence, excluding null values. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. 
/// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static int? Product<TSource>( this IEnumerable<TSource> source, Func<TSource, int?> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .OfType<int>() .Aggregate(1, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of <see cref="Int64" /> values, /// obtained by invoking a transform function on the elements of the /// input sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static long Product<TSource>( this IEnumerable<TSource> source, Func<TSource, long> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .Aggregate(1L, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable <see cref="Int64" /> /// values, obtained by invoking a transform function on the elements /// of the input sequence, excluding null values. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. 
/// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static long? Product<TSource>( this IEnumerable<TSource> source, Func<TSource, long?> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .OfType<long>() .Aggregate(1L, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of <see cref="Single" /> values, /// obtained by invoking a transform function on the elements of the /// input sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. 
/// </returns> public static float Product<TSource>( this IEnumerable<TSource> source, Func<TSource, float> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .Aggregate(1.0F, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of nullable <see cref="Single" /> /// values, obtained by invoking a transform function on the elements /// of the input sequence, excluding null values. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static float? Product<TSource>( this IEnumerable<TSource> source, Func<TSource, float?> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .OfType<float>() .Aggregate(1.0F, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of <see cref="Double" /> values, /// obtained by invoking a transform function on the elements of the /// input sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. 
/// </returns> public static double Product<TSource>( this IEnumerable<TSource> source, Func<TSource, double> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .Aggregate(1.0, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of nullable <see cref="Double" /> /// values, obtained by invoking a transform function on the elements /// of the input sequence, excluding null values. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static double? Product<TSource>( this IEnumerable<TSource> source, Func<TSource, double?> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .OfType<double>() .Aggregate(1.0, (x, y) => x * y); } /// <summary> /// Computes the product of a sequence of <see cref="Decimal" /> values, /// obtained by invoking a transform function on the elements of the /// input sequence. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. 
/// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static decimal Product<TSource>( this IEnumerable<TSource> source, Func<TSource, decimal> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .Aggregate(1.0M, (x, y) => { checked { return x * y; } }); } /// <summary> /// Computes the product of a sequence of nullable /// <see cref="Decimal" /> values, obtained by invoking a transform /// function on the elements of the input sequence, excluding null /// values. /// </summary> /// <typeparam name="TSource"> /// The type of the items in the source enumerable. /// </typeparam> /// <param name="source">The source enumerable to operate on.</param> /// <param name="selector">The transformation function to use.</param> /// <exception cref="ArgumentNullException"> /// The source is null. /// </exception> /// <exception cref="ArgumentNullException"> /// The selector is null. /// </exception> /// <exception cref="OverflowException"> /// The product exceeds the maximum value. /// </exception> /// <returns> /// The product of the projected values, or one if the sequence is /// empty. /// </returns> public static decimal? 
Product<TSource>( this IEnumerable<TSource> source, Func<TSource, decimal?> selector) { source.EnsureNotNull(nameof(source)); selector.EnsureNotNull(nameof(selector)); return source .Select(selector) .OfType<decimal>() .Aggregate(1.0M, (x, y) => { checked { return x * y; } }); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using Microsoft.VisualStudio.TestTools.UnitTesting; using Chains; namespace Chains.Test { [TestClass] public class CycleTests { [TestMethod] [ExpectedException(typeof(ArgumentNullException))] public void Cycle_Should_ThrowIfSourceIsNull() { var sequence = (IEnumerable<object>)null; var actual = sequence.Cycle().ToList(); } [TestMethod] public void Cycle_Should_YieldTheElementsOfAnEnumerableInfinitely() { var sequence = Enumerable.Range(1, 3); var count = 5; var expected = new[]{ 1, 2, 3, 1, 2 }; var actual = sequence.Cycle().Take(count); Assert.IsTrue( actual.SequenceEqual(expected), "Actual and expected sequences did not match."); } [TestMethod] public void Cycle_Should_YieldNothingForAnEmptyEnumerable() { var sequence = Enumerable.Empty<object>(); var actual = sequence.Cycle(); Assert.IsFalse(actual.Any(), "Expected empty sequence."); } } } <file_sep>using System.Runtime.CompilerServices; [assembly:InternalsVisibleTo("Chains.Test")]
14fe6eafcaee99a863974bb44cfc848cb10976dd
[ "Markdown", "C#" ]
26
C#
SteveXCIV/Chains.NET
6f17cc9f11b269ed9b56d2c7147860e73bd0dc40
dc17e3b45998d038cdf8ac32750b993b9106af83
refs/heads/master
<repo_name>FrankieFabuloso/CMPS101_PA2<file_sep>/makefile #------------------------------------------------------------------------------ # Makefile for Lex # # make makes Lex # make clean removes object files # #------------------------------------------------------------------------------ FLAGS = -std=c99 -Wall SOURCES1 = List.c Lex.c SOURCES2 = List.c ListClient.c OBJECTS1 = List.o Lex.o OBJECTS2 = List.o ListClient.o HEADERS = List.h EXEBIN1 = Lex EXEBIN2 = ListClient LISTSTUFF = Lex.c ListClient.c List.h makefile README all: $(EXEBIN1) $(EXEBIN2) $(EXEBIN1) : $(OBJECTS1) $(HEADERS) gcc -o $(EXEBIN1) $(OBJECTS1) $(OBJECTS1) : $(SOURCES1) $(HEADERS) gcc -c $(FLAGS) $(SOURCES1) $(EXEBIN2) : $(OBJECTS2) $(HEADERS) gcc -o $(EXEBIN2) $(OBJECTS2) $(OBJECTS2) : $(SOURCES2) $(HEADERS) gcc -c $(FLAGS) $(SOURCES2) clean : rm -f $(EXEBIN1) $(OBJECTS1) $(EXEBIN2) $(OBJECTS2) submit : submit cmps101-pt.f14 pa2 $(SOURCES1) $(HEADERS) $(LISTSTUFF) check: valgrind --leak-check=full ./$(EXEBIN1) <file_sep>/List.c /*----------------------------------------------------------------------------- * List.c * * Name: <NAME> * User ID: frocha * Program name: pa2 * Source code: Professor Tantalo * *///--------------------------------------------------------------------------- #include <stdio.h> #include <stdlib.h> #include <string.h> #include <assert.h> #include"List.h" // private types and functions ------------------------------------------------ // NodeObj typedef struct NodeObj{ int data; struct NodeObj* prev; struct NodeObj* next; } NodeObj; // Node typedef NodeObj* Node; // newNode() // constructor for private Node type Node newNode(int x) { Node N = malloc(sizeof(NodeObj)); N->data = x; N->prev = N->next = NULL; return(N); } // freeNode() // destructor for the Node type void freeNode(Node* pN){ if( pN!=NULL && *pN!=NULL ){ free(*pN); *pN = NULL; } } // ListObj typedef struct ListObj{ Node front; Node back; Node cursor; int length; int index; } ListObj; // List // Exported reference 
type typedef struct ListObj* List; // Constructors-Destructors --------------------------------------------------- // newList() // constructor for the List type List newList(void){ List L; L = malloc(sizeof(ListObj)); assert(L!=NULL); L->front = L->back = L->cursor = NULL; L->length = 0; L->index = -1; return (L); } // freeList() // destructor for the List type void freeList(List* p){ if( p!=NULL && *p!=NULL ){ while( !isEmpty(*p) ){ deleteBack(*p); //empty the List } free(*p); *p = NULL; } } // Access functions ----------------------------------------------------------- // isEmpty() // returns 1 (true) if L is empty, 0 (false) otherwise // pre: none int isEmpty(List L){ if( L==NULL ){ fprintf(stderr, "List Error: calling isEmpty() on NULL List reference\n"); exit(1); } return(L->length==0); } // length() // returns the length of the List // pre: none int length(List L){ if (L==NULL){ fprintf(stderr, "List Error: calling length() on NULL List reference\n"); exit(1); } return(L->length); } // getIndex() // returns the location of the cursor // pre: none int getIndex(List L){ if (L==NULL){ fprintf(stderr, "List Error: calling getIndex() on NULL List reference\n"); exit(1); } return(L->index); } // front() // returns the front element of this List // pre: length()>0 int front(List L){ if(isEmpty(L)){ fprintf(stderr, "List Error: calling front() on NULL List reference"); exit(1); } return L->front->data; } // back() // returns the back element of this List // pre: length()>0 int back(List L){ if(L==NULL){ fprintf(stderr, "List Error: calling back() on NULL List reference"); exit(1); } return L->back->data; } // getElement() // returns the element at which the cursor is pointing to // pre: getIndex()>=0 int getElement(List L){ if(L==NULL){ fprintf(stderr, "List Error: calling getElement() on NULL List reference"); exit(1); } if(getIndex(L) < 0){ fprintf(stderr, "List Error: calling getElement() on out of bounds cursor reference"); exit(1); } if(getIndex(L) == 0) { return back(L); }else{ 
return L->cursor->data; } } // equals() // returns 1 if the two Lists are equal, 0 if they are not // pre: none int equals(List A, List B){ int flag = 1; Node N = NULL; Node M = NULL; N = A->front; M = B->front; if(length(A) == length(B)){ while(flag && N!=NULL){ flag = (N->data == M->data); N = N->next; M = M->next; } return flag; }else{ return 0; } } // Manipulation procedures ---------------------------------------------------- // clear() // pre: none void clear(List L){ while(L->length > 0){ deleteBack(L); } } // moveTo() // moves cursor to index wanted (given by i) // pre: none void moveTo(List L, int i){ L->index = i; L->cursor = L->back; int j; for(j=0; j<i; j++){ if(L->cursor->next == NULL){ L->index = -1; return; }else{ L->cursor = L->cursor->next; } } } // movePrev() // moves cursor towards the back of the List // pre: none void movePrev(List L){ if(L->cursor->prev == NULL ){ L->index = -1; }else{ moveTo(L, getIndex(L)-1); } } // moveNext() // moves cursor towards the front of the List // pre: none void moveNext(List L){ if(L->cursor == NULL || L->index == -1){ L->index = -1; }else{ moveTo(L, getIndex(L)+1); } } // prepend() // adds data to the back of this List // pre: none void prepend(List L, int data){ Node N = NULL; N = newNode(data); if(isEmpty(L)){ L->front = L->back = N; L->index = 0; }else{ N->next = L->back; L->back->prev = N; L->back = N; L->cursor = L->back; } L->length++; } // append() // adds data to the beginning of this List // pre: none void append(List L, int data){ Node N = NULL; N = newNode(data); if(isEmpty(L)){ L->front = L->back = N; L->index = 0; }else{ N->prev = L->front; L->front->next = N; L->front = N; } L->length++; } // insertBefore() // adds new element before the cursor // pre: length()>0, getIndex()>=0 void insertBefore(List L, int data){ Node before = NULL; Node N = newNode(data); if(L==NULL){ fprintf(stderr, "List Error: calling insertBefore() on NULL List reference"); exit(1); } if(getIndex(L) < 0){ fprintf(stderr, "List 
Error: calling insertBefore() on undefined cursor reference"); exit(1); } if(getIndex(L)==0){ prepend(L, data); return; } before = L->cursor->prev; before->next = N; N->prev = before; N->next = L->cursor; L->cursor->prev = N; L->length++; } // insertAfter() // inserts new element after cursor in this List // pre: length()>0, getIndex()>=0 void insertAfter(List L, int data){ Node after = NULL; Node N = newNode(data); if(L==NULL){ fprintf(stderr, "List Error: calling insertAfter() on NULL List reference"); exit(1); } if(getIndex(L) < 0){ fprintf(stderr, "List Error: calling insertAfter() on undefined cursor reference"); exit(1); } if(getIndex(L)==0){ append(L, data); return; } after = L->cursor->next; after->prev = N; N->next = after; N->prev = L->cursor; L->cursor->next = N; L->length++; } // deleteFront() // deletes the front element of the List // pre: length()>0 void deleteFront(List L){ Node N = NULL; if(L==NULL){ fprintf(stderr, "List Error: calling deleteFront() on NULL List reference"); exit(1); } if(isEmpty(L)){ printf("List Error: calling deleteFront() on empty List\n" ); exit(1); } N = L->front; if(L->length <= 1){ L->front = L->back = L->cursor = NULL; L->index = -1; L->length = 0; }else{ L->front = L->front->prev; L->front->next = NULL; N->prev = NULL; L->length--; } freeNode(&N); } // deleteBack() // deletes the back element of the List // pre: length()>0 void deleteBack(List L){ Node N = NULL; if(L==NULL){ fprintf(stderr, "List Error: calling deleteBack() on NULL List reference"); exit(1); } if(isEmpty(L)){ printf("List Error: calling deleteBack() on empty List\n" ); exit(1); } N = L->back; if(L->length == 1){ L->back = L->front = L->cursor = NULL; L->index = -1; L->length = 0; }else{ L->back = L->back->next; L->back->prev = NULL; N->next = NULL; L->length--; } freeNode(&N); } // delete() // deletes the cursor element in this List. Cursor is undefined after this op. 
// pre: length()>0, getIndex()>=0 void delete(List L){ Node F = NULL; Node B = NULL; Node N = NULL; if(L==NULL){ fprintf(stderr, "List Error: calling delete() on NULL List reference"); exit(1); } if(getIndex(L) < 0){ fprintf(stderr, "List Error: calling delete() on undefined cursor reference"); exit(1); } if(getIndex(L) == 0){ deleteBack(L); return; } if(getIndex(L) == L->length-1){ deleteFront(L); return; } N = L->cursor; F = L->cursor->next; B = L->cursor->prev; L->cursor->prev = NULL; L->cursor->next = NULL; L->cursor = NULL; B->next = F; F->prev = B; L->index = -1; L->length--; freeNode(&N); } // Other operations ----------------------------------------------------------- void printList(FILE* out, List L){ Node N = NULL; if( L==NULL){ fprintf(stderr, "List Error: calling printList() on NULL List reference\n" ); exit(1); } for(N=L->back; N!=NULL; N=N->next) fprintf(out, "%d ", N->data); fprintf(out, "\n"); } List copyList(List L){ List Q = newList(); Node N = L->back; while(N!=NULL){ append(Q, N->data); N = N->next; } Q->index = -1; return Q; } <file_sep>/Lex.c /*----------------------------------------------------------------------------- * Lex.c * * Name: <NAME> * User ID: frocha * Program name: pa2 * Source code: Professor Tantalo * *///--------------------------------------------------------------------------- #include<stdio.h> #include<stdlib.h> #include<string.h> #include "List.h" #define MAX_LEN 160 int main(int argc, char * argv[]){ int n, count=0; FILE *in, *out; char line[MAX_LEN]; char tokenlist[MAX_LEN]; char* token; // check command line for correct number of arguments if( argc != 3 ){ printf("Usage: %s <input file> <output file>\n", argv[0]); exit(1); } // open files for reading and writing in = fopen(argv[1], "r"); out = fopen(argv[2], "w"); if( in==NULL ){ printf("Unable to open file %s for reading\n", argv[1]); exit(1); } if( out==NULL ){ printf("Unable to open file %s for writing\n", argv[2]); exit(1); } /* read each line of input file, then count and 
print tokens */ while( fgets(line, MAX_LEN, in) != NULL) { count++; n = 0; token = strtok(line, " \n"); tokenlist[0] = '\0'; while( token!=NULL ){ strcat(tokenlist, " "); strcat(tokenlist, token); strcat(tokenlist, "\n"); n++; token = strtok(NULL, " \n"); } } fclose(in); char** words; words = malloc(sizeof(char*) * (count)); //made array in = fopen(argv[1], "r"); char wordsSize [MAX_LEN]; /* - -fills array with the file input - - */ int i; for(i=0; i<count; i++){ char* x = fgets(wordsSize, MAX_LEN, in); if(x == NULL){ break; } char* position = strchr(wordsSize, '\n'); if(position != NULL){ *position = '\0'; } size_t length = strlen(wordsSize); words[i] = calloc(length + 1, sizeof(char)); strcpy(words[i], wordsSize); } /* -- checking to see if array was made correctly -- * int k; for(k=0; k<count; k++){ fprintf(stdout, "%s", words[k]); } fprintf(out, "\n"); */ /*------using read in String[] named tokens alphabetize words in array-----*/ List new1 = newList(); int j; /* ------ sort the new list ----*/ prepend(new1, 0); for(j= 1; j<count; j++ ){ while(getIndex(new1) != -1 && strcmp(words[getElement(new1)], words[j]) < 0){ moveNext(new1); } if(getIndex(new1) == -1){ append(new1, j); moveTo(new1, 0); } else if(strcmp(words[getElement(new1)], words[j]) >= 0){ insertBefore(new1, j); moveTo(new1, 0); } } printList(stdout, new1); moveTo(new1, 0); while(getIndex(new1) > -1){ fprintf(out, "%s", words[getElement(new1)]); moveNext(new1); } for(j=0; j < count; j++){ free(words[j]); } clear(new1); free(words); freeList(&new1); fclose(in); fclose(out); return(0); }
907b66a73e8afc4b52cfb2715af6c11f92d79b90
[ "C", "Makefile" ]
3
Makefile
FrankieFabuloso/CMPS101_PA2
f7600248f064fc7aaa69240788ddcfc3f9753ef7
47c28be0a5e764f405825d4676a57c3ae5e76247
refs/heads/main
<file_sep># python_excel_app # dynamic add columns from csv to database [pandas] ```python import pandas as pd df = pd.read_csv('data.csv') #print(df['Duration']) db = [] #dynamic function not depend on static csv names for i in range(df.shape[0]): obj = {'duration': '', 'pulse': '', 'maxpulse': '', 'calories': ''} for x in df.columns: if x == "Duration": obj['duration'] = df[x].loc[i] if x == "Pulse": obj['pulse'] = df[x].loc[i] if x == "Maxpulse": obj['maxpulse'] = df[x].loc[i] if x == "Calories": obj['calories'] = df[x].loc[i] db.append(obj) print(db) print(df.loc[[0,1]]) ``` <file_sep>#load data into a DataFrame object: df = pd.DataFrame(data) valueslists = [] row1 = [column for column in df.columns] tupleRow = tuple(row1) tuplelist = [] lastindex = len(row1) - 1 query = "INSERT INTO Customers " + str(tupleRow) + " VALUES " for i, row in df.iterrows(): loopindex = 0 query += "(" for j, column in row.iteritems(): if lastindex != loopindex: query += "'%s', " else: query += "'%s'" tuplelist.append(column) loopindex += 1 query += "), " query = query[0:len(query)-2] query += ";" tupleTuple = tuple(tuplelist) query = query%tupleTuple print(query) <file_sep>import pwd_hasher import sqlite3 import datetime import subprocess import json import xmltodict import os import functions import ssl import logging import math import openpyxl import pandas as pd from flask import Flask, render_template, redirect, url_for, request, abort, session, jsonify, send_from_directory, Response, Blueprint, flash from flask import jsonify from postgres import connection, cursor from werkzeug.utils import secure_filename from sqlite3 import Error app = Flask(__name__) logging.basicConfig(filename='record.log', level=logging.DEBUG, format=f'%(levelname)s %(threadName)s : %(message)s') uploads_dir = os.path.join(app.static_folder, 'uploads') ssl._create_default_https_context = ssl._create_unverified_context ALLOWED_EXTENSIONS = set(['xlsx', 'xls', 'xlsm', 'xlsb', 'csv', 'xltx', 'xlam']) # 
inventory code 8/9/2021 # creates a new dir in the upload folder; each upload day gets its own dated folder def CreateNewDir(): UPLOAD_FOLDER = os.path.join(app.root_path, 'upload_dir\\') uploadDay = str(datetime.datetime.today().strftime("%Y-%m-%d")) UPLOAD_FOLDER = UPLOAD_FOLDER+uploadDay try: os.makedirs(UPLOAD_FOLDER) except FileExistsError: # directory already exists return UPLOAD_FOLDER return UPLOAD_FOLDER # check if file extension is in allowed extensions def allowed_file(filename): return '.' in filename and \ filename.rsplit('.', 1)[1].lower() in ALLOWED_EXTENSIONS @app.route('/uploadexcel', methods=['POST', 'GET']) def upload_file(): if request.method == 'POST': # check if the post request has the file part if 'file' not in request.files: flash('Upload Error', 'upload') flash('No file part', 'upload') return redirect(url_for('inventory')) file = request.files['file'] # if user does not select a file, the browser also # submits an empty part without a filename if file.filename == '': flash('Upload Error', 'upload') flash('No selected file', 'upload') return redirect(url_for('inventory')) if file and allowed_file(file.filename): filename = secure_filename(file.filename) UPLOAD_FOLDER = CreateNewDir() # handle the uploaded excel try: lastRecoredId = "SELECT * FROM master_db ORDER BY id DESC LIMIT 1;" unqiueId = cursor.execute(lastRecoredId) selectResult = cursor.fetchone() except: flash('System', 'upload') flash('master_db table does not exist', 'upload') return redirect(url_for('inventory')) file_extension = "" try: file_extension = "file" + str(selectResult[0]) + "_" except: file_extension = "" try: filename = file_extension + filename file_path = os.path.join(UPLOAD_FOLDER, filename) file.save(os.path.join(UPLOAD_FOLDER, filename)) empty_sheet_test = pd.read_excel(file_path) # check if sheet is empty; remove it and redirect with a message if empty_sheet_test.empty: os.remove(file_path) flash('System', 'upload') flash('A blank excel sheet is not 
sheet', 'upload') return redirect(url_for('inventory')) else: file_namedb = filename.split('.')[0] except: flash('System', 'upload') flash('Could Not Upload Excel File. Make sure the file is not opened', 'upload') return redirect(url_for('inventory')) """DROP insertSheetQuery = "DROP TABLE master_db;" insertSheetQuery1 = "DROP TABLE worksheets;" sheetQuery = cursor.execute(insertSheetQuery); sheetQuery = cursor.execute(insertSheetQuery1); """ # Insert Sheet Query In Master DB filerealname = functions.riskVar(filename) insertSheetQuery = "INSERT INTO master_db (filename, file_path, realname) VALUES ('%s' , '%s', '%s') RETURNING id, file_path;"%(file_namedb, file_path, filerealname) sheetQuery = cursor.execute(insertSheetQuery); sheetQueryResult = cursor.fetchone() sheet_id = int(sheetQueryResult[0]) sheet_path = sheetQueryResult[1] # Get All Sheets uploaded sheetHandler = pd.ExcelFile(sheet_path) worksheetsList = sheetHandler.sheet_names uploaded_sheets = [] # read sheets one by one for worksheet in worksheetsList: df = pd.read_excel(sheet_path, sheet_name=worksheet) df = df.loc[:, ~df.columns.str.contains('^Unnamed')] dataframe = pd.DataFrame(df) # uploaded_sheets.append(sheet_name) # skip empty sheets before anything else if df.empty or len(df.columns) == 0 or len(dataframe.columns) == 0: continue # HERE CREATE TABLES STEP sheetLastIdString = "SELECT * FROM master_db ORDER BY id DESC LIMIT 1;" sheet_unqiueId = "" try: sheetLastIdQuery = cursor.execute(sheetLastIdString) sheetLastIdResult = cursor.fetchone() sheet_unqiueId = str(sheetLastIdResult[0]) except: sheet_unqiueId = "1" # create new table query tablename = "sheet" + sheet_unqiueId + "_" + functions.filterVar(functions.betterVar(worksheet)) column_names = [functions.filterVar(functions.betterVar(column)) for column in dataframe.columns] # a sheet with only column titles needs no insert query if int(dataframe.size) == len(dataframe.columns): continue queryString = "CREATE TABLE IF NOT 
EXISTS %s (id serial PRIMARY KEY ,"%tablename for column in range(len(column_names)): columnName = column_names[column] if column == len(column_names) -1: queryString += " %s VARCHAR (255) NULL " %columnName else: queryString += " %s VARCHAR (255) NULL, " %columnName queryString += ");" # insert statement: loop over dataframe and insert values if len(column_names) == 1: columnsTitles = "(" + column_names[0] + ")" else: tupleRow = tuple(column_names) columnsTitles = str(tupleRow).replace("'", "") lastindex = len(column_names) - 1 insertQuery = "INSERT INTO " + tablename + " " + columnsTitles + " VALUES " valueslist = [] for i, row in df.iterrows(): loopindex = 0 insertQuery += "(" for j, column in row.iteritems(): if lastindex != loopindex: insertQuery += "'%s', " else: insertQuery += "'%s'" try: if math.isnan(column): column = '' except: column = column valueslist.append(column) loopindex += 1 insertQuery += "), " insertQuery = insertQuery[0:len(insertQuery)-2] insertQuery += ";" insertQuery = insertQuery%tuple(valueslist) createTableStatment = cursor.execute(queryString) insertDataQuery = cursor.execute(insertQuery) # Here Insert worksheets in worksheets tables MAIN TABLE sheetrealname = functions.riskVar(worksheet) insertWorkSheetQuery = "INSERT INTO worksheets (file_id, file_name, sheet_name, file_path, real_name) VALUES (%s, '%s', '%s', '%s', '%s') RETURNING id;"%(sheet_id, file_namedb, tablename, file_path, sheetrealname) insertWorkSheetResult = cursor.execute(insertWorkSheetQuery) workSheetFetch = cursor.fetchone() worksheetid = workSheetFetch[0] uploaded_sheets.append(worksheetid) flash('Success', 'upload') success_message = 'Sheet Added To system successfully %s'%file_path flash(success_message, 'upload') return redirect(url_for('inventory')) else: flash('Upload Error','upload') flash('The uploaded file does not have a valid Excel extension', 'upload') return redirect(url_for('inventory')) @app.route('/far') def inventory(): if 'user' not in session: return redirect(url_for('login')) else: # LIST Contains All Excel Files allfiles = 
functions.returnFileList() # List contains all excel worksheets (used in delete) allsheets = functions.returnWorksheetsList() sidebar = {'title': 'Autonet', 'menu': 'dashboard', 'submenu': ''} return render_template('inventory.html', session=session, sidebar=sidebar, allfiles=allfiles, allsheets=allsheets) # route that will delete the excel file and all tables and records for it + the excel file on the server @app.route('/delete_excelfile', methods=['GET', 'POST']) def delete_excelfile(): if request.method == 'POST': delete_counter = 0 fileToDeleteId = request.form.get('deleted_file') if fileToDeleteId == None: flash('danger', 'delete') flash('Your request could not be processed. Please select a file to delete.', 'delete') return redirect(url_for('inventory')) if fileToDeleteId == 'none': flash('danger', 'delete') flash('Your request could not be processed. Please select a file to delete.', 'delete') return redirect(url_for('inventory')) fileToDeleteId = int(fileToDeleteId) """ insertSheetQuery = "DROP TABLE master_db;" insertSheetQuery1 = "DROP TABLE worksheets;" sheetQuery = cursor.execute(insertSheetQuery); sheetQuery = cursor.execute(insertSheetQuery1); """ # delete the file from Server # get the filepath try: excelFileString = "SELECT file_path FROM master_db WHERE id=%s;"%fileToDeleteId excelFileQuery = cursor.execute(excelFileString) excelFilePathResult = cursor.fetchone() excelFilePath = excelFilePathResult[0] os.remove(excelFilePath) except: flash('danger', 'delete') flash('Could not delete Excel File from Server.', 'delete') return redirect(url_for('inventory')) # GET ALL worksheets for that file try: WorksheetsString = "SELECT id, sheet_name FROM worksheets WHERE file_id=%s;"%fileToDeleteId WorksheetsQuery = cursor.execute(WorksheetsString) WorksheetsResult = cursor.fetchall() except: flash('danger', 'delete') flash('Could not delete the selected file. Make sure it exists.', 'delete') return redirect(url_for('inventory')) # delete all tables for that file try: for 
sheet_table in WorksheetsResult: deleteTableString = "DROP TABLE IF EXISTS %s;"%sheet_table[1] deleteTableQuery = cursor.execute(deleteTableString) except: flash('danger', 'delete') flash('Could not delete the selected file because of a worksheet table problem (1). Contact the developer.', 'delete') return redirect(url_for('inventory')) # delete worksheet records try: for worksheet_table in WorksheetsResult: deleteWorksheetString = "DELETE FROM worksheets WHERE id=%s"%worksheet_table[0] deleteWorkSheetQuery = cursor.execute(deleteWorksheetString) delete_counter += 1 except: flash('danger', 'delete') flash('Could not delete the selected file because of a worksheet record problem (2). Contact the developer.', 'delete') return redirect(url_for('inventory')) # delete the Excel file entry from the master db try: excelFileStringDelete = "DELETE FROM master_db WHERE id=%s"%fileToDeleteId excelFileQuery = cursor.execute(excelFileStringDelete) except: flash('danger', 'delete') flash("Could not delete the file record from the system database.", "delete") return redirect(url_for('inventory')) # everything succeeded: the file and all of its data and tables have been deleted from the system flash('success', 'delete') flash("Successfully deleted a file with %s worksheets from the system"%delete_counter, "delete") return redirect(url_for('inventory')) # this route deletes a single worksheet; note: if only one sheet is left in the file, the whole file is deleted @app.route('/delete_worksheet', methods=["GET", "POST"]) def delete_worksheet(): if request.method == "POST": sheet_id = request.form.get("worksheet_delete") if sheet_id == None or sheet_id == 'none': flash('danger', 'delete') flash('Your request could not be processed. 
Please select a sheet to delete.', 'delete') return redirect(url_for('inventory')) sheet_id = int(sheet_id) # GET worksheet data try: worksheetString = "SELECT id, file_id, sheet_name, file_path, real_name FROM worksheets WHERE id=%s;"%sheet_id worksheetQuery = cursor.execute(worksheetString) worksheetResult = cursor.fetchone() except: flash('danger', 'delete') flash('Your request could not be processed. The worksheet could not be found.', 'delete') return redirect(url_for('inventory')) # GET File ID and path try: excelFileString = "SELECT id, file_path FROM master_db WHERE id=%s;"%worksheetResult[1] excelFileQuery = cursor.execute(excelFileString) excelFileResult = cursor.fetchone() except: flash('danger', 'delete') flash('Your request could not be processed. Excel file could not be found.', 'delete') return redirect(url_for('inventory')) # Get all the other worksheets in the file try: otherWorksheetsString = "SELECT id FROM worksheets WHERE file_id=%s AND id <> %s;"%(worksheetResult[1], sheet_id) otherWorksheetsQuery = cursor.execute(otherWorksheetsString) otherWorksheetsResult = cursor.fetchall() except: flash('danger', 'delete') flash('Your request could not be processed. 
An error (3) occurred on the server. Contact the developer.', 'delete') return redirect(url_for('inventory')) if not otherWorksheetsResult: try: excelFilePath = excelFileResult[1] os.remove(excelFilePath) excelFileDeleteString = "DELETE FROM master_db WHERE id=%s;"%worksheetResult[1] excelFileDeleteQuery = cursor.execute(excelFileDeleteString) except: flash("danger", "delete") flash("The Excel file for this worksheet could not be found.", "delete") return redirect(url_for('inventory')) else: worksheetRealName = functions.rishVarReverse(worksheetResult[4]) try: workbook=openpyxl.load_workbook(worksheetResult[3]) std=workbook.get_sheet_by_name(worksheetRealName) workbook.remove_sheet(std) workbook.save(worksheetResult[3]) except KeyError: flash("danger", "delete") flash("The selected worksheet could not be deleted from the Excel workbook.", "delete") return redirect(url_for('inventory')) # delete the worksheet table try: worksheetTableString = "DROP TABLE IF EXISTS %s;"%worksheetResult[2] worksheetTableQuery = cursor.execute(worksheetTableString) except: flash("danger", "delete") flash("The selected worksheet's table could not be deleted.", "delete") return redirect(url_for('inventory')) # delete the worksheet record from worksheets try: worksheetRecordString = "DELETE FROM worksheets WHERE id=%s;"%worksheetResult[0] worksheetRecordQuery = cursor.execute(worksheetRecordString) except: flash("danger", "delete") flash("The worksheet record could not be deleted.", "delete") return redirect(url_for('inventory')) flash("success", "delete") flash("The worksheet has been successfully deleted from the system.", "delete") return redirect(url_for('inventory')) return redirect(url_for('inventory')) @app.route('/tested') def mytest(): allfiles = functions.returnFileList() allsheets = functions.returnWorksheetsList() """ insertSheetQuery = "DROP TABLE master_db;" insertSheetQuery1 = "DROP TABLE worksheets;" sheetQuery = cursor.execute(insertSheetQuery); sheetQuery = 
cursor.execute(insertSheetQuery1); """ return str(allsheets) try: workbook=openpyxl.load_workbook('C:\\Users\\Mahmoud\\Downloads\\allvideos\\flask-app\\flask-app\\upload_dir\\2021-08-10\\file5_Inventory.xlsx') sheetnames = workbook.get_sheet_names() std=workbook.get_sheet_by_name('Spare SFPs') workbook.remove_sheet(std) sheetnames = workbook.get_sheet_names() workbook.save('C:\\Users\\Mahmoud\\Downloads\\allvideos\\flask-app\\flask-app\\upload_dir\\2021-08-10\\file5_Inventory.xlsx') except KeyError: return "Sheet Already Deleted" return str(sheetnames) # Inventory Code End 8/9/2021 @app.route('/') def dashboard(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Autonet', 'menu': 'dashboard', 'submenu': ''} return render_template('dashboard.html', session=session, sidebar=sidebar) @app.route('/pdc') def pdc(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Autonet', 'menu': 'pdc', 'submenu': ''} return render_template('pdc.html', session=session, sidebar=sidebar) @app.route('/ash') def ash(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Autonet', 'menu': 'ash', 'submenu': ''} return render_template('ash.html', session=session, sidebar=sidebar) @app.route('/mpls') def mpls(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Bgp & Color', 'menu': 'mpls', 'submenu': ''} return render_template('/mpls.html', session=session, sidebar=sidebar) @app.route('/nettools') def nettools(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Autonet', 'menu': 'nettools', 'submenu': ''} return render_template('nettools.html', session=session, sidebar=sidebar) @app.route('/netmikoroute', methods=['POST', 'GET']) def netmikoroute(): if request.method == 'GET': return render_template('nettools.html') else: routing = request.form['routing'] print(routing) return render_template('nettools.html') 
@app.route('/security') def security(): if 'user' not in session: return redirect(url_for('login')) else: sidebar = {'title': 'Autonet', 'menu': 'security', 'submenu': ''} return render_template('security.html', session=session, sidebar=sidebar) @app.route('/admin') def admin(): if 'user' not in session: return redirect(url_for('login')) else: if not session['user'][3] == 1: return redirect(url_for('error404')) table_names = functions.get_table_names() pci_rows = functions.get_pci() records = functions.get_record_time() sidebar = {'title': 'Autonet', 'menu': 'settings', 'submenu': 'admin'} return render_template('admin.html', session=session, sidebar=sidebar, db_rows=pci_rows, table_names=table_names) @app.route('/admin-xml', methods=['POST', 'OPTIONS']) def admin_xml(): if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) else: if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers try: try: file = request.files['xml_file'] xml_json_flag = 'xml' except Exception as xml_error: print("xml_error: ", xml_error) try: file = request.files['json_file'] xml_json_flag = 'json' except Exception as json_error: print('json_error: ', json_error) return jsonify({'status': 'error', 'message': 'File is not selected'}) if not file or not file.filename: return jsonify({'status': 'error', 'message': 'File is not selected'}) print(file.filename) table_name = request.form['table_name'] table_keys = json.loads(request.form['table_keys']) table_fields = json.loads(request.form['table_fields']) print("table info: ", xml_json_flag, table_name, table_keys, table_fields) filename = os.path.join(uploads_dir, file.filename) file.save(filename) print(filename) with open(filename) as file_data: if xml_json_flag == 'xml': data_dict = 
xmltodict.parse(file_data.read()) elif xml_json_flag == 'json': # json_dump = json.dumps(file_data) data_dict = json.load(file_data) else: return jsonify({'status': 'error', 'message': 'Undefined file format'}) file_data.close() # print(data_dict) check_vrf = '' try: vr_data = functions.custom_find_key(vr_key, data_dict) except Exception as error: print("vr_data error: ", error) check_vrf = 'N/A' vr_data = [] try: int_data = functions.custom_find_key(int_key, data_dict) except Exception as error: print("int_data error: ", error) int_data = data_dict['TABLE_intf']['ROW_intf'] conn = sqlite3.connect('database.db') cur = conn.cursor() table_name = table_name.replace('-', '_') cur.execute("SELECT count(name) FROM sqlite_master WHERE type='table' AND name='" + table_name + "'") check_table = cur.fetchone() if check_table[0] == 1 or check_table[0] == '1': print("Table already exist") return jsonify({'status': 'error', 'message': 'Table already exist'}) else: create_table_sql = 'CREATE TABLE ' + table_name + '(id INT NOT NULL, IP CHAR(256), Location CHAR(256), VRF CHAR(256), ' for item in table_fields: create_table_sql += item + " CHAR(256), " create_table_sql = create_table_sql[:-2] create_table_sql += ")" cur.execute(create_table_sql) # conn.commit() print("check_vrf: ", check_vrf) print("int_data: ", int_data) print("len(int_data): ", len(int_data)) for i in range(len(int_data)): if check_vrf != 'N/A': vr_item = vr_data[i]['ROW_vrf'] item_vrf = vr_item['vrf-name-out'] int_item = int_data[i]['ROW_intf'] else: int_item = int_data[i] item_vrf = int_item['vrf-name-out'] item_id = i try: item_ip = int_item['subnet'] + "/" + int_item['masklen'] except Exception as error: print("error item_ip: ", error) item_ip = 'N/A' item_location = table_name.replace('_', ' ') insert_sql = "INSERT INTO " + table_name + " (id, IP, Location, VRF" for field_item in table_fields: insert_sql += ", " + field_item insert_sql += ")" insert_sql += " VALUES (" + str(item_id) + ", '" + item_ip + "', 
'" + item_location + "', '" + item_vrf for field_key in table_keys: try: insert_sql += "', '" + int_item[field_key] except Exception as error: print("error tag: ", error) insert_sql += "', '" + 'N/A' insert_sql += "')" print(insert_sql) cur.execute(insert_sql) conn.commit() functions.db_record_time(table_name) return jsonify({'status': 'success', 'message': 'success added table', 'vr_data': vr_data, 'int_data': int_data}), 201 except Exception as err: print("error: ", err) return jsonify({'status': 'error', 'message': 'Failed to upload file'}) @app.route('/admin-xml-2', methods=['POST', 'OPTIONS']) def admin_xml_2(): if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers try: file = request.files['xml_file'] xml_json_flag = 'xml' except Exception as xml_error: print("xml_error: ", xml_error) try: file = request.files['json_file'] xml_json_flag = 'json' except Exception as json_error: print('json_error: ', json_error) return jsonify({'status': 'error', 'message': 'File is not selected'}) if not file or not file.filename: return jsonify({'status': 'error', 'message': 'File is not selected'}) table_name = request.form['table_name'] exclude_vrf = request.form['exclude_vrf'] exclude_ipnexthop = request.form['exclude_ipnexthop'] map_ipnexthop = request.form['map_ipnexthop'] table_keys = json.loads(request.form['table_keys']) table_fields = json.loads(request.form['table_fields']) filename = os.path.join(uploads_dir, file.filename) file.save(filename) table_name = table_name.replace('-', '_').strip() rt_res = functions.rt_to_db(xml_json_flag, filename, table_name, exclude_vrf, exclude_ipnexthop, map_ipnexthop, table_keys, table_fields) if rt_res['status'] == 
'error': return jsonify(rt_res) return jsonify({'status': 'success', 'message': 'success added table'}), 201 @app.route('/search-ip', methods=['POST', 'OPTIONS']) def search_nettools(): if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) else: if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers ip_array = request.get_json()['ip_array'] dbs = ['PDC_DCS_PRD_DSW1_CONNECTED', 'PDC_DCS_PRD_DSW2_CONNECTED', 'PDC_DCS_STG_DSW1_CONNECTED', 'PDC_DCS_STG_DSW2_CONNECTED', 'PDC_DCW_DSW1_CONNECTED', 'PDC_DCW_DSW2_CONNECTED', 'PDC_DCI_SW01_CONNECTED', 'PDC_DCI_SW02_CONNECTED', 'PDC_DCI_SW03_CONNECTED', 'PDC_DCI_SW04_CONNECTED', 'PDC_DCS_PRD_DSW1', 'PDC_DCS_PRD_DSW2', 'PDC_DCS_STG_DSW1', 'PDC_DCS_STG_DSW2', 'PDC_DCW_DSW1', 'PDC_DCW_DSW2', 'PDC_DCI_SW01', 'PDC_DCI_SW02', 'PDC_DCI_SW03', 'PDC_DCI_SW04', 'CDC_DCS_PRD_DSW1_CONNECTED', 'CDC_DCS_PRD_DSW2_CONNECTED', 'CDC_DCS_STG_DSW1_CONNECTED', 'CDC_DCS_STG_DSW2_CONNECTED', 'CDC_DCW_DSW1_CONNECTED', 'CDC_DCW_DSW2_CONNECTED', 'CDC_DCI_SW01_CONNECTED', 'CDC_DCI_SW02_CONNECTED', 'CDC_DCI_SW03_CONNECTED', 'CDC_DCI_SW04_CONNECTED', 'CDC_DCS_PRD_DSW1', 'CDC_DCS_PRD_DSW2', 'CDC_DCS_STG_DSW1', 'CDC_DCS_STG_DSW2', 'CDC_DCW_DSW1', 'CDC_DCW_DSW2', 'CDC_DCI_SW01', 'CDC_DCI_SW02', 'CDC_DCI_SW03', 'CDC_DCI_SW04', 'PHX_DCS_DSW1_CONNECTED', 'PHX_DCS_DSW2_CONNECTED', 'PHX_DCW_DSW1_CONNECTED', 'PHX_DCW_DSW2_CONNECTED', 'PHX_DCS_DSW1', 'PHX_DCS_DSW2', 'PHX_DCW_DSW1', 'PHX_DCW_DSW2', 'DFW_DCW_DSW1_CONNECTED', 'DFW_DCW_DSW2_CONNECTED', 'DFW_DCW_BSW01_CONNECTED', 'DFW_DCW_BSW02_CONNECTED', 'DFW_DCW_BSW03_CONNECTED', 'DFW_DCW_BSW04_CONNECTED', 'DFW_DCW_LSW01_CONNECTED', 'DFW_DCW_DSW1', 'DFW_DCW_DSW2', 'DFW_DCW_BSW01', 'DFW_DCW_BSW02', 'DFW_DCW_BSW03', 'DFW_DCW_BSW04', 'DFW_DCW_LSW01', 
'ASH_SW01_CONNECTED', 'ASH_SW02_CONNECTED', 'ASH_NX01_CONNECTED', 'ASH_NX02_CONNECTED', 'ASH_SW05_CONNECTED', 'ASH_SW06_CONNECTED', 'ASH_SW01', 'ASH_SW02', 'ASH_NX01', 'ASH_NX02', 'ASH_SW05', 'ASH_SW06', 'SJC_SW01_CONNECTED', 'SJC_SW02_CONNECTED', 'SJC_NX01_CONNECTED', 'SJC_NX02_CONNECTED', 'SJC_SW05_CONNECTED', 'SJC_SW06_CONNECTED', 'SJC_SW01', 'SJC_SW02', 'SJC_NX01', 'SJC_NX02', 'SJC_SW05', 'SJC_SW06', #IF NOT FOUND ON ABOVE DB's SEARCH THIS DB 'UVN' ] search_ips = functions.func_search_ip(ip_array, dbs) return jsonify({'status': 'success', 'message': search_ips}) @app.route('/manage-pci', methods=['POST', 'OPTIONS']) def manage_pci(): if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) else: if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers method_type = request.get_json()['method_type'] try: with sqlite3.connect(database='database.db') as conn: cur = conn.cursor() if method_type == 'add': item_ip = request.get_json()['ip'] item_status = request.get_json()['status'] insert_id = 1 last_sql = "SELECT * FROM DC_PCI ORDER BY id DESC LIMIT 1" cur.execute(last_sql) last_item = cur.fetchall() if len(last_item) > 0: insert_id = last_item[0][0] + 1 insert_sql = "INSERT INTO DC_PCI(id, IP, STATUS) VALUES (" + str(insert_id) + ", '" + item_ip + "', '" + item_status + "')" cur.execute(insert_sql) conn.commit() functions.db_record_time('DC_PCI') elif method_type == 'edit': item_id = request.get_json()['id'] item_ip = request.get_json()['ip'] item_status = request.get_json()['status'] update_sql = "UPDATE DC_PCI SET IP = '" + item_ip + "', STATUS = '" + item_status + "' WHERE id = '" + item_id + "'" cur.execute(update_sql) conn.commit() functions.db_record_time('DC_PCI') elif method_type == 'remove': item_id = 
request.get_json()['id'] delete_sql = "DELETE FROM DC_PCI WHERE id = " + item_id cur.execute(delete_sql) conn.commit() functions.db_record_time('DC_PCI') else: return jsonify({'status': 'error', 'message': 'Undefined method'}) except Exception as error: print(error) conn.rollback() finally: conn.close() return jsonify({'status': 'success', 'message': 'updated success'}) @app.route('/remove-table', methods=['POST', 'OPTIONS']) def remove_table(): if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) else: if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers table_name = request.get_json()['table_name'] conn = sqlite3.connect(database='database.db') cursor = conn.cursor() drop_sql = "DROP TABLE " + table_name cursor.execute(drop_sql) conn.commit() conn.close() return jsonify({'status': 'success', 'message': "Removed a table successfully"}) @app.route('/upload-pci', methods=['POST', 'OPTIONS']) def upload_pci(): if request.method == 'OPTIONS': headers = { 'Access-Control-Allow-Origin': '*', 'Access-Control-Allow-Methods': 'POST, GET, OPTIONS', 'Access-Control-Max-Age': 1000, 'Access-Control-Allow-Headers': 'origin, x-csrftoken, content-type, accept', } return '', 200, headers try: file = request.files['pci_file'] except Exception as pci_error: print("pci_error: ", pci_error) return jsonify({'status': 'error', 'message': 'Failed to upload file'}) filename = os.path.join(uploads_dir, file.filename) file.save(filename) print(filename) functions.excel_database(filename, 'DC_PCI') return jsonify({'status': 'success', 'message': 'success added table'}), 201 @app.route('/login', methods=['POST', 'GET']) def login(): if request.method == 'GET': if 'user' in session: return redirect(url_for('dashboard')) return 
render_template('login.html') else: email = request.form['email'] pwd = request.form['pwd'] print(email) if email == '' or pwd == '': return render_template('login.html', email=email, pwd=pwd) try: with sqlite3.connect(database='database.db') as conn: cur = conn.cursor() cur.execute("SELECT * FROM user WHERE email = ?", (email,)) users = cur.fetchall() if len(users) < 1: print("doesn't exist") return redirect(url_for('login')) user = users[0] stored_pwd = user[4] if not pwd_hasher.verify_password(stored_pwd, pwd): print('wrong password') return redirect(url_for('login')) print("Login success") session['user'] = user print('login ok') except sqlite3.Error as e: print(e) conn.rollback() finally: print("finally") conn.close() return redirect(url_for('dashboard')) @app.route('/user-manage', methods=['POST', 'GET']) def user_management(): if request.method == 'GET': if 'user' not in session: return redirect(url_for('login')) if not session['user'][3] == 1: return redirect(url_for('error404')) if session['user'][3] != 1: return redirect(url_for('error404')) sidebar = {'title': 'User Management', 'menu': 'settings', 'submenu': 'user-manage'} users = functions.db_manage_user('all', 'none', 'none') return render_template('user_management.html', session=session, sidebar=sidebar, users=users) else: if 'user' not in session: return jsonify({'status': 'error', 'message': 'You are not logged in'}) if session['user'][3] != 1: return jsonify({'status': 'error', 'message': 'Permission is not defined'}) method_type = request.get_json()['method_type'] if method_type == 'edit': user_id = request.get_json()['user_id'] user_role = request.get_json()['user_role'] functions.db_manage_user('edit', user_id, user_role) return jsonify({'status': 'success', 'message': 'User is updated successfully'}) elif method_type == 'remove': user_id = request.get_json()['user_id'] functions.db_manage_user('remove', user_id, 'none') return jsonify({'status': 'success', 'message': 'User is removed 
successfully'}) @app.route('/logout') def logout(): session.pop('user', None) return redirect(url_for('dashboard')) @app.route('/register', methods=['POST', 'GET']) def register(): if request.method == 'GET': if 'user' in session: return redirect(url_for('dashboard')) return render_template('register.html') else: email = request.form['email'] name = request.form['name'] pwd = pwd_hasher.hash_password(request.form['pwd']) # user role role = 2 try: with sqlite3.connect(database='database.db') as conn: cur = conn.cursor() cur.execute("INSERT INTO user (name, email, role, pwd) VALUES (?, ?, ?, ?)", (name, email, role, pwd)) conn.commit() functions.db_record_time('user') except: conn.rollback() finally: conn.close() return redirect(url_for('login')) @app.route('/404', methods=['GET']) def error404(): return "Not found page" if __name__ == '__main__': app.run('0.0.0.0', port=5001) #port can be anything higher than 5000.
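The inventory and admin routes above assemble every SQL statement with `%`-formatting or string concatenation, which leaves them open to SQL injection. A minimal sketch of the parameterized alternative, using Python's stdlib `sqlite3` and a simplified stand-in for the app's `worksheets` schema (`sqlite3` uses `?` placeholders; other DB-API drivers, such as MySQL's, use `%s` with a parameter tuple instead):

```python
import sqlite3

# Simplified stand-in for the app's worksheets table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE worksheets (id INTEGER, file_id INTEGER, sheet_name TEXT)")
cur.execute("INSERT INTO worksheets VALUES (?, ?, ?)", (1, 10, "Sheet1"))
cur.execute("INSERT INTO worksheets VALUES (?, ?, ?)", (2, 10, "Sheet2"))

# Values are bound as parameters, never interpolated into the SQL text,
# so user-supplied input cannot change the statement's structure.
sheet_id = 1
cur.execute("SELECT id, sheet_name FROM worksheets WHERE id = ?", (sheet_id,))
row = cur.fetchone()

cur.execute("DELETE FROM worksheets WHERE id = ?", (sheet_id,))
conn.commit()
cur.execute("SELECT COUNT(*) FROM worksheets")
remaining = cur.fetchone()[0]
conn.close()
```

Note that identifiers, such as the table names interpolated into `DROP TABLE` above, cannot be bound as parameters; those should instead be validated against a whitelist of known sheet-table names before being placed into the statement.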
<file_sep># Recaptured Screen Image Demoiréing (AMNet) This code is the official implementation of TCSVT 2020 paper "Recaptured Screen Image Demoiréing". Paper:<br> -------- https://ieeexplore.ieee.org/abstract/document/8972378<br> Environment:<br> -------- Windows 8 + Nvidia Titan X GPU <br> Python (version 3.6.4) + Tensorflow (version 1.10.0) <br> Network:<br> ------- <div align=center><img src="https://github.com/tju-maoyan/AMNet/blob/master/images/Network.png"></div><br> Fig. 1. The architecture of our AMNet: (a) the generator of our network, comprised of additive (circled by the purple rectangle) and multiplicative (circled by the green rectangle) modules, (b) the ASPP block in the generator network, (c) the multiplicative block in the generator network, and (d) the discriminator of our network. In particular, the “k” represents kernel size, the “n” represents the number of channels, the “s” represents stride size, and the “d” represents the dilation rate. The upsampling layer is realized by 2× nearest neighbor upsampling.<br> Results:<br> ------- <div align=center><img src="https://github.com/tju-maoyan/AMNet/blob/master/images/demoire_exp.png"></div><br> Fig. 2. The recaptured screen images (top row), our demoiréing results (the second row), and the corresponding screenshot images (bottom row). Please zoom in the figure for better observation.<br> <br> <div align=center><img src="https://github.com/tju-maoyan/AMNet/blob/master/images/SOTA.png"></div><br> Fig. 3. 
Visual quality comparisons for one image captured by Huawei Honor 6X with the screen Philips MWX12201.<br> Download pre-trained model:<br> -------- `VGG19:` https://pan.baidu.com/s/1YFbPiBYtdIa6ZDmWYJHZJQ (key:l6x1)<br> `trained model:` https://pan.baidu.com/s/1qvS04gnSSLbqvBCR9K3BAw (key:3kja)<br> Download dataset:<br> -------- `Training set:` https://pan.baidu.com/s/1Xn5YygDb9Eg5u5zL3plrsA (key:gpxd)<br> `Test set:` https://pan.baidu.com/s/1KCZciRYb-MP16u4W1w3X0Q (key:isn6)<br> Test:<br> ------- * Please download the pre-trained model and the test set. * Change the path in `test.py`. * Run: `python test.py`. Train:<br> -------- * Please download the training set. * Change the path in `main.py`. * Run: `python main.py`. Citation:<br> ------- If you find this work useful for your research, please cite:<br> ``` @article{Yue2020Recaptured, author = {<NAME> <NAME> <NAME>}, year = {2021}, title = {Recaptured Screen Image Demoir\'eing}, volume={31}, number={1}, pages={49-60}, journal = {IEEE Transactions on Circuits and Systems for Video Technology}, doi = {10.1109/TCSVT.2020.2969984} } ``` Contact:<br> ---------- If you have any questions, please contact me at <EMAIL>. 
<file_sep>import argparse from glob import glob import tensorflow as tf from model import demoire from utils import * import h5py import random import os os.environ["CUDA_VISIBLE_DEVICES"] = '0' parser = argparse.ArgumentParser(description='') parser.add_argument('--epoch', dest='epoch', type=int, default=20, help='# of epoch') parser.add_argument('--batch_size', dest='batch_size', type=int, default=16, help='# images in batch') parser.add_argument('--lr', dest='lr', type=float, default=0.0005, help='initial learning rate for adam') parser.add_argument('--use_gpu', dest='use_gpu', type=int, default=1, help='gpu flag, 1 for GPU and 0 for CPU') parser.add_argument('--phase', dest='phase', default='train', help='train or test or val or plot_feature_map') parser.add_argument('--checkpoint_dir', dest='ckpt_dir', default='./checkpoint', help='models are saved here') parser.add_argument('--sample_dir', dest='sample_dir', default='./sample', help='sample are saved here') parser.add_argument('--test_dir', dest='test_dir', default='./test', help='test sample are saved here') parser.add_argument('--file_in', dest='file_in', default='./trainset/datain.h5', help='datain are saved here') parser.add_argument('--file_out', dest='file_out', default='./trainset/dataout.h5', help='dataout are saved here') #parser.add_argument('--val_in', dest='val_in', default='../../data3/valin.h5', help='valin are saved here') #parser.add_argument('--val_out', dest='val_out', default='../../data3/valout.h5', help='valout are saved here') args = parser.parse_args() tf.reset_default_graph() class Sampler(object): def __init__(self): self.name = "demoire" [self.cur_batch_in, self.cur_batch_out] = self.load_new_data() self.train_batch_idx = 0 def load_new_data(self): moire_rgb_pt = h5py.File(args.file_in, 'r') gt_rgb_pt = h5py.File(args.file_out, 'r') for key1 in moire_rgb_pt.keys(): data_in = moire_rgb_pt[(key1)] for key2 in gt_rgb_pt.keys(): data_out = gt_rgb_pt[(key2)] return data_in, data_out def 
__call__(self, batch_size=args.batch_size): return self.cur_batch_in, self.cur_batch_out def data_shape(self): return self.cur_batch_in.shape[0] data = Sampler() def demoire_train(demoire): demoire.train(batch_size=args.batch_size, ckpt_dir=args.ckpt_dir, epoch=args.epoch, sample_dir=args.sample_dir) def main(_): if not os.path.exists(args.ckpt_dir): os.makedirs(args.ckpt_dir) if not os.path.exists(args.sample_dir): os.makedirs(args.sample_dir) if not os.path.exists(args.test_dir): os.makedirs(args.test_dir) gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.95) gpu_options.allow_growth = True with tf.Session(config=tf.ConfigProto(gpu_options=gpu_options)) as sess: model = demoire(sess, data, args) demoire_train(model) if __name__ == '__main__': tf.app.run() <file_sep>import time import numpy as np import tensorflow.contrib.slim as slim from utils import * import random import matplotlib.pyplot as plt import tensorflow as tf import tensorflow.contrib as tc import tensorflow.contrib.layers as tcl import scipy.io as sio import utils from scipy.misc import imread from PIL import Image import vgg def aspp(input, input_channels, reuse=False): with tf.variable_scope('Aspp') as scope: if reuse: scope.reuse_variables() p1 = tf.layers.conv2d(input, input_channels, 1, padding='same', name='aspp_p1') p1 = tf.nn.relu(p1) p2 = tf.layers.conv2d(input, input_channels, 3, padding='same', dilation_rate=(6, 6), name='aspp_p2') p2 = tf.nn.relu(p2) p3 = tf.layers.conv2d(input, input_channels, 3, padding='same', dilation_rate=(12, 12), name='aspp_p3') p3 = tf.nn.relu(p3) p4 = tf.layers.conv2d(input, input_channels, 3, padding='same', dilation_rate=(18, 18), name='aspp_p4') p4 = tf.nn.relu(p4) p5 = tf.nn.avg_pool(input, ksize=[1, input.shape[1], input.shape[2], 1], strides=[1, 1, 1, 1], padding='VALID', name='aspp_pool') p5 = tf.image.resize_images(p5, size=(input.shape[1], input.shape[2]), method=0) output = tf.concat([p1, p2, p3, p4, p5], 3) output = 
tf.layers.conv2d(output, input_channels, 1, padding='same', name='aspp_bottle')
    output = tf.nn.relu(output)
    return output


def generator(input, patch_size1, patch_size2, batch_size=8, is_training=True, reuse=False, output_channels=3):
    with tf.variable_scope('Generator') as scope:
        if reuse:
            scope.reuse_variables()
        with tf.variable_scope('conv_in'):
            conv_in = tf.layers.conv2d(input, 32, 7, padding='same', name='conv1_1')
            conv_in = tf.nn.relu(conv_in)  # bias
        # encoder
        with tf.variable_scope('conv1_1'):
            output = tf.layers.conv2d(conv_in, 32, 3, padding='same', name='conv1_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('conv1_2'):
            output = tf.layers.conv2d(output, 32, 3, padding='same', name='conv1_2')
            conv1_2 = tf.nn.relu(output)
        with tf.variable_scope('conv2_1'):
            output = tf.layers.conv2d(conv1_2, 64, 3, strides=(2, 2), padding='same', name='conv2_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('conv2_2'):
            output = tf.layers.conv2d(output, 64, 3, padding='same', name='conv2_2')
            conv2_2 = tf.nn.relu(output)
        with tf.variable_scope('conv3_1'):
            output = tf.layers.conv2d(conv2_2, 128, 3, strides=(2, 2), padding='same', name='conv3_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('conv3_2'):
            output = tf.layers.conv2d(output, 128, 3, padding='same', name='conv3_2')
            conv3_2 = tf.nn.relu(output)
        conv3_2 = aspp(conv3_2, 128, reuse)
        # decoder
        with tf.variable_scope('deconv3_1'):
            output = tf.layers.conv2d(conv3_2, 128, 3, padding='same', name='deconv3_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('deconv2_2'):
            output = tf.image.resize_images(output, size=(patch_size1 // 2, patch_size2 // 2), method=1)
            output = tf.layers.conv2d(output, 64, 3, padding='same', name='deconv2_2')
            output = tf.nn.relu(output)
            # output_b_aspp = output
            output += aspp(conv2_2, 64, reuse)
            # output_a_aspp = output
        with tf.variable_scope('deconv2_1'):
            output = tf.layers.conv2d(output, 64, 3, padding='same', name='deconv2_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('deconv1_2'):
            output = tf.image.resize_images(output, size=(patch_size1, patch_size2), method=1)
            output = tf.layers.conv2d(output, 32, 3, padding='same', name='deconv1_2')
            output = tf.nn.relu(output)
            # output_b_aspp = output
            output += aspp(conv1_2, 32, reuse)
            # output_a_aspp = output
        with tf.variable_scope('deconv1_1'):
            output = tf.layers.conv2d(output, 32, 3, padding='same', name='deconv1_1')
            output = tf.nn.relu(output)
        with tf.variable_scope('deconv1_0'):
            output = tf.layers.conv2d(output, output_channels, 3, padding='same', name='deconv1_0')
        output += input
        output_mid = output

        # multiply: squeeze-and-excitation style channel gating
        r = 16
        pool_size = [1, input.shape[1], input.shape[2], 1]
        with tf.variable_scope('multi_1'):
            multi_1 = tf.layers.conv2d(output, 64, 3, padding='same', name='multi_1')
            se = tf.nn.avg_pool(multi_1, pool_size, [1, 1, 1, 1], padding='VALID')
            # 64 // r: tf.layers.conv2d requires an integer filter count (64 / r is a float in Python 3)
            se = tf.layers.conv2d(se, 64 // r, 1, padding='same', use_bias=True)
            se = tf.nn.relu(se)
            se = tf.layers.conv2d(se, 64, 1, padding='same', use_bias=True)
            se = tf.sigmoid(se)
            multi_1 = se * multi_1
            multi_1 = tf.nn.relu(multi_1)
        with tf.variable_scope('multi_2'):
            multi_2 = tf.layers.conv2d(multi_1, 64, 3, padding='same', name='multi_2')
            se = tf.nn.avg_pool(multi_2, pool_size, [1, 1, 1, 1], padding='VALID')
            se = tf.layers.conv2d(se, 64 // r, 1, padding='same', use_bias=True)
            se = tf.nn.relu(se)
            se = tf.layers.conv2d(se, 64, 1, padding='same', use_bias=True)
            se = tf.sigmoid(se)
            multi_2 = se * multi_2
            multi_2 = tf.nn.relu(multi_2)
        with tf.variable_scope('multi_3'):
            multi_3 = tf.layers.conv2d(multi_2, 64, 3, padding='same', name='multi_3')
            se = tf.nn.avg_pool(multi_3, pool_size, [1, 1, 1, 1], padding='VALID')
            se = tf.layers.conv2d(se, 64 // r, 1, padding='same', use_bias=True)
            se = tf.nn.relu(se)
            se = tf.layers.conv2d(se, 64, 1, padding='same', use_bias=True)
            se = tf.sigmoid(se)
            multi_3 = se * multi_3
            multi_3 = tf.nn.relu(multi_3)
        with tf.variable_scope('multi_4'):
            multi_4 = tf.layers.conv2d(multi_3, 64, 3, padding='same', name='multi_4')
            se = tf.nn.avg_pool(multi_4, pool_size, [1, 1, 1, 1], padding='VALID')
            se = tf.layers.conv2d(se, 64 // r, 1, padding='same', use_bias=True)
            se = tf.nn.relu(se)
            se = tf.layers.conv2d(se, 64, 1, padding='same', use_bias=True)
            se = tf.sigmoid(se)
            multi_4 = se * multi_4
            multi_4 = tf.nn.relu(multi_4)
        with tf.variable_scope('multi_5'):
            multi_5 = tf.layers.conv2d(multi_4, output_channels, 3, padding='same', name='multi_5')
        tf.add_to_collection('conv_output', multi_5)
        tf.add_to_collection('conv_output', output)
        multi_out = tf.nn.relu(output * multi_5)
        tf.add_to_collection('conv_output', multi_out)
        return multi_out


def discriminator(input, is_training=True, reuse=False):
    with tf.variable_scope('Discriminator') as scope:
        if reuse:
            scope.reuse_variables()
        size = 64
        res = [input]
        d = tcl.conv2d(input, num_outputs=size, kernel_size=4, stride=2, padding='SAME',
                       weights_initializer=tf.random_normal_initializer(0, 0.02))
        d = tf.nn.leaky_relu(d, 0.2)
        res.append(d)
        d = tcl.conv2d(d, num_outputs=size * 2, kernel_size=4, stride=2, padding='SAME',
                       weights_initializer=tf.random_normal_initializer(0, 0.02))
        # d = tcl.instance_norm(d)
        d = tf.nn.leaky_relu(d, 0.2)
        res.append(d)
        d = tcl.conv2d(d, num_outputs=size * 4, kernel_size=4, stride=2, padding='SAME',
                       weights_initializer=tf.random_normal_initializer(0, 0.02))
        # d = tcl.instance_norm(d)
        d = tf.nn.leaky_relu(d, 0.2)
        res.append(d)
        d = tcl.conv2d(d, num_outputs=size * 8, kernel_size=4, stride=1, padding='SAME',
                       weights_initializer=tf.random_normal_initializer(0, 0.02))
        # d = tcl.instance_norm(d)
        d = tf.nn.leaky_relu(d, 0.2)
        res.append(d)
        d = tcl.conv2d(d, num_outputs=1, kernel_size=4, stride=1, padding='SAME',
                       weights_initializer=tf.random_normal_initializer(0, 0.02))
        res.append(d)
        return res[1:]


def criterionGAN(d_images, batch_size, target_bool):
    loss = 0
    d_images = d_images[-1]
    # print(d_images)
    # print(batch_size)
    if target_bool:
        for i in range(batch_size):
            d_image = d_images[i]
            # print(d_image.shape)
            loss += tf.nn.l2_loss(d_image - 1.0)
        return loss / batch_size
    else:
        for i in range(batch_size):
            d_image = d_images[i]
            loss += tf.nn.l2_loss(d_image)
        return loss / batch_size


class demoire(object):
    def __init__(self, sess, data, args, input_c_dim=3):
        self.sess = sess
        self.input_c_dim = input_c_dim
        self.patch_size = 128
        self.data = data
        self.args = args
        # build model
        self.X = tf.placeholder(tf.float32, [args.batch_size, self.patch_size, self.patch_size, self.input_c_dim],
                                name='moire_image')
        self.Y_ = tf.placeholder(tf.float32, [args.batch_size, self.patch_size, self.patch_size, self.input_c_dim],
                                 name='clean_image')
        self.is_training = tf.placeholder(tf.bool, name='is_training')
        self.Y = generator(self.X, self.patch_size, self.patch_size, args.batch_size, is_training=self.is_training)
        self.D_real = discriminator(self.Y_)
        self.D_fake = discriminator(self.Y, reuse=True)
        # calculate loss
        self.loss = (1.0 / args.batch_size) * tf.nn.l2_loss(self.Y_ - self.Y)  # MSE loss
        # define perceptual loss
        CONTENT_LAYER = 'relu5_4'
        vgg_dir = 'vgg_pretrained/imagenet-vgg-verydeep-19.mat'
        demoire_vgg = vgg.net(vgg_dir, vgg.preprocess(self.Y * 255))
        clean_vgg = vgg.net(vgg_dir, vgg.preprocess(self.Y_ * 255))
        content_size = 128 * 128 * 3 * args.batch_size / 16 / 16
        self.loss_content = 2 * tf.nn.l2_loss(demoire_vgg[CONTENT_LAYER] - clean_vgg[CONTENT_LAYER]) / content_size
        self.loss_sum = self.loss + self.loss_content
        # define gan loss
        self.loss_d_fake = criterionGAN(self.D_fake, args.batch_size, False)
        self.loss_d_real = criterionGAN(self.D_real, args.batch_size, True)
        self.G_loss = criterionGAN(self.D_fake, args.batch_size, True)
        self.D_loss = (self.loss_d_fake + self.loss_d_real) * 0.5
        # GAN feature matching loss
        self.loss_G_GAN_Feat = 0
        for i in range(len(self.D_fake) - 1):
            self.loss_G_GAN_Feat += tf.reduce_mean(abs(self.D_real[i] - self.D_fake[i])) / 4.0
        self.G_loss_sum = self.G_loss + self.loss_sum + self.loss_G_GAN_Feat * 1000
        self.G_vars = [var for var in tf.trainable_variables() if var.name.startswith('Generator')]
        self.D_vars = [var for var in tf.trainable_variables() if var.name.startswith('Discriminator')]
        self.lr = args.lr
        self.fig_count = 30
        update_ops = tf.get_collection(tf.GraphKeys.UPDATE_OPS)
        self.global_step = tf.Variable(initial_value=0, dtype=tf.int32, trainable=False)
        max_steps = args.epoch * int(self.data.data_shape() // args.batch_size)
        self.lr = tf.train.polynomial_decay(self.lr, self.global_step, max_steps, end_learning_rate=0.0, power=0.2)
        with tf.control_dependencies(update_ops):
            self.train_op = tf.train.AdamOptimizer(self.lr).minimize(self.loss_sum, self.global_step, self.G_vars)
            self.train_op_g = tf.train.AdamOptimizer(self.lr / 50.).minimize(self.G_loss_sum, var_list=self.G_vars)
            self.train_op_d = tf.train.AdamOptimizer(self.lr / 50.).minimize(self.D_loss, var_list=self.D_vars)
        init = tf.global_variables_initializer()
        self.sess.run(init)
        print("[*] Initialize model successfully...")

    def train(self, batch_size, ckpt_dir, epoch, sample_dir, eval_every_epoch=1):
        # assert data range is between 0 and 1
        dataShape = self.data.data_shape()
        numBatch = int(dataShape // batch_size)
        # load pretrained model
        load_model_status, iter_num = self.load(ckpt_dir)
        if load_model_status:
            start_epoch = iter_num // numBatch
            start_step = 0
            print("[*] Model restore success!")
            print("start epoch = %d, start iter_num = %d" % (start_epoch, iter_num))
        else:
            iter_num = 0
            start_epoch = 0
            start_step = 0
            print("[*] Not find pretrained model!")
        # make summary
        tf.summary.scalar('loss', self.loss)
        tf.summary.scalar('loss_content', self.loss_content)
        tf.summary.scalar('loss_sum', self.loss_sum)
        tf.summary.scalar('G_loss', self.G_loss)
        tf.summary.scalar('D_loss', self.D_loss)
        tf.summary.scalar('G_loss_sum', self.G_loss_sum)
        tf.summary.scalar('loss_G_GAN_Feat', self.loss_G_GAN_Feat)
        tf.summary.scalar('lr', self.lr)
        writer = tf.summary.FileWriter('./logs', self.sess.graph)
        merged = tf.summary.merge_all()
        print("[*] Start training, with start epoch %d start iter %d : " % (start_epoch, iter_num))
        start_time = time.time()
        for epoch in range(start_epoch, epoch):
            order = np.arange(0, numBatch, 1)
            random.shuffle(order)
            for batch_id in range(start_step, numBatch):
                [data_in, data_out] = self.data(batch_size)
                batch_in = data_in[order[batch_id] * batch_size:(order[batch_id] + 1) * batch_size, :, :, :]
                batch_out = data_out[order[batch_id] * batch_size:(order[batch_id] + 1) * batch_size, :, :, :]
                batch_in = batch_in / 255.0
                batch_out = batch_out / 255.0
                _, _, _, loss_content, loss, loss_sum, G_loss, D_loss, loss_G_GAN_Feat, G_loss_sum, summary = \
                    self.sess.run([self.train_op, self.train_op_g, self.train_op_d, self.loss_content, self.loss,
                                   self.loss_sum, self.G_loss, self.D_loss, self.loss_G_GAN_Feat, self.G_loss_sum,
                                   merged],
                                  feed_dict={self.X: batch_in, self.Y_: batch_out, self.is_training: True})
                print("Epoch: [%2d] [%4d/%4d] time: %4.4f loss: %.6f loss_content: %.6f loss_sum: %.6f "
                      "G_loss: %.6f D_loss: %.6f loss_G_GAN_Feat: %.6f G_loss_sum: %.6f"
                      % (epoch + 1, batch_id + 1, numBatch, time.time() - start_time, loss, loss_content, loss_sum,
                         G_loss, D_loss, loss_G_GAN_Feat, G_loss_sum))
                iter_num += 1
                writer.add_summary(summary, iter_num)
            if np.mod(epoch + 1, eval_every_epoch) == 0:
                # evaluate added
                eval_data_in = batch_in[:16, :, :, :]
                eval_data_out = batch_out[:16, :, :, :]
                samples = self.sess.run(self.Y, feed_dict={self.X: eval_data_in, self.is_training: False})
                eval_data_out = np.clip(255 * eval_data_out, 0, 255).astype('uint8')
                samples = np.clip(255 * samples, 0, 255).astype('uint8')
                fig = data2fig(eval_data_out)
                plt.savefig('{}/{}_gt.png'.format(sample_dir, str(self.fig_count).zfill(3)), bbox_inches='tight')
                plt.close(fig)
                eval_data_in = np.clip(255 * eval_data_in, 0, 255).astype('uint8')
                fig = data2fig(eval_data_in)
                plt.savefig('{}/{}_m.png'.format(sample_dir, str(self.fig_count).zfill(3)), bbox_inches='tight')
                plt.close(fig)
                fig = data2fig(samples)
                plt.savefig('{}/{}_dm.png'.format(sample_dir, str(self.fig_count).zfill(3)), bbox_inches='tight')
                plt.close(fig)
                self.fig_count += 1
                self.save(iter_num, ckpt_dir)
        self.global_step = iter_num
        print("[*] Finish training.")

    def save(self, iter_num, ckpt_dir, model_name='Demoire-tensorflow'):
        saver = tf.train.Saver()
        checkpoint_dir = ckpt_dir
        if not os.path.exists(checkpoint_dir):
            os.makedirs(checkpoint_dir)
        print("[*] Saving model...")
        saver.save(self.sess, os.path.join(checkpoint_dir, model_name), global_step=iter_num)

    def load(self, checkpoint_dir):
        print("[*] Reading checkpoint...")
        saver = tf.train.Saver()
        ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
        if ckpt and ckpt.model_checkpoint_path:
            full_path = tf.train.latest_checkpoint(checkpoint_dir)
            global_step = int(full_path.split('/')[-1].split('-')[-1])
            saver.restore(self.sess, full_path)
            return True, global_step
        else:
            return False, 0
<file_sep>
import argparse
from model import *
import tensorflow as tf
from glob import glob
import os
from utils import *

os.environ["CUDA_VISIBLE_DEVICES"] = "0"

parser = argparse.ArgumentParser(description='')
parser.add_argument('--checkpoint_dir', dest='ckpt_dir', default='./checkpoint', help='models are saved here')
parser.add_argument('--use_gpu', dest='use_gpu', type=int, default=1, help='gpu flag, 1 for GPU and 0 for CPU')
parser.add_argument('--test_dir', dest='test_dir', default='./test/text', help='test sample are saved here')
parser.add_argument('--test_set', dest='test_set', default='moire_rgb_text', help='dataset of base layer for testing')
args = parser.parse_args()


class Test():
    def __init__(self, sess, input_c_dim=3):
        self.sess = sess
        self.input_c_dim = input_c_dim
        # build model
        self.X = tf.placeholder(tf.float32, [None, 512, 512, self.input_c_dim], name='moire_image')
        self.patch_size = 512
        self.is_training = tf.placeholder(tf.bool, name='is_training')
        self.Y = generator(self.X, self.patch_size, self.patch_size, 1, is_training=self.is_training)
        print("[*] Initialize model successfully...")

    def load(self, checkpoint_dir):
        print("[*] Reading checkpoint...")
        saver = tf.train.Saver()
        ckpt = tf.train.get_checkpoint_state(checkpoint_dir)
        if ckpt and ckpt.model_checkpoint_path:
            # full_path = tf.train.latest_checkpoint(checkpoint_dir)
            full_path = './checkpoint/Demoire-tensorflow-69581'
            global_step = int(full_path.split('/')[-1].split('-')[-1])
            saver.restore(self.sess, full_path)
            return True, global_step
        else:
            return False, 0

    def main(self, test_files, ckpt_dir, save_dir):
        """Test"""
        # init variables (tf.initialize_all_variables() is deprecated)
        tf.global_variables_initializer().run()
        assert len(test_files) != 0, 'No testing data!'
        load_model_status, global_step = self.load(ckpt_dir)
        assert load_model_status == True, '[!] Load weights FAILED...'
        print(" [*] Load weights SUCCESS...")
        print("[*] " + " start testing...")
        for idx in range(len(test_files)):
            moire_image = load_images(test_files[idx]).astype(np.float32) / 255.0
            demoire_image = self.sess.run(self.Y, feed_dict={self.X: moire_image, self.is_training: False})
            print(test_files[idx])
            num_img = test_files[idx][43:-6]
            print("num_img=", num_img)
            moireimage = np.clip(255 * moire_image, 0, 255).astype('uint8')
            outputimage = np.clip(255 * demoire_image, 0, 255).astype('uint8')
            # save_images(os.path.join(save_dir, num_img + '_m.png'), moireimage)
            save_images(os.path.join(save_dir, num_img + '_dm.png'), outputimage)


if __name__ == '__main__':
    if not os.path.exists(args.ckpt_dir):
        os.makedirs(args.ckpt_dir)
    if not os.path.exists(args.test_dir):
        os.makedirs(args.test_dir)
    if args.use_gpu:
        # added to control the gpu memory
        print("GPU\n")
        gpu_options = tf.GPUOptions(per_process_gpu_memory_fraction=0.6)
        with tf.Session(config=tf.ConfigProto(gpu_options=gpu_options)) as sess:
            model = Test(sess)
            test_files = glob('../RAW_img_dm/data/testset3/{}/*.png'.format(args.test_set))
            model.main(test_files, ckpt_dir=args.ckpt_dir, save_dir=args.test_dir)
    else:
        print("CPU\n")
        with tf.Session() as sess:
            model = Test(sess)
            # the parser defines no --test_set_base option; use args.test_set here as well
            test_files_base = glob('../RAW_img_dm/data/testset3/{}/*.png'.format(args.test_set))
            model.main(test_files_base, ckpt_dir=args.ckpt_dir, save_dir=args.test_dir)
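The `criterionGAN` function above implements the least-squares GAN objective via `tf.nn.l2_loss`: real discriminator outputs are pushed toward 1 and fake outputs toward 0. A minimal NumPy sketch of the same per-batch computation (function name `lsgan_loss` is hypothetical, independent of the TensorFlow graph):

```python
import numpy as np

def lsgan_loss(d_out, target_is_real):
    """Least-squares GAN loss mirroring criterionGAN:
    tf.nn.l2_loss computes sum(t**2) / 2, and the result
    is averaged over the batch dimension."""
    target = 1.0 if target_is_real else 0.0
    per_sample = 0.5 * np.sum((d_out - target) ** 2,
                              axis=tuple(range(1, d_out.ndim)))
    return per_sample.mean()

# a perfect discriminator output on real data gives zero loss
d_real = np.ones((4, 8, 8, 1))
assert lsgan_loss(d_real, True) == 0.0
```

Compared with the non-saturating cross-entropy GAN loss, this least-squares form penalizes samples quadratically by their distance from the target label, which tends to give smoother gradients for the generator.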
tju-maoyan/AMNet
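The training code above anneals the learning rate with `tf.train.polynomial_decay(..., end_learning_rate=0.0, power=0.2)`. The closed form that call computes (with `cycle=False`) can be sketched in plain Python; the helper name is hypothetical:

```python
def polynomial_decay(lr0, step, max_steps, end_lr=0.0, power=0.2):
    """Mirrors tf.train.polynomial_decay with cycle=False:
    decayed = (lr0 - end_lr) * (1 - step/max_steps)**power + end_lr"""
    step = min(step, max_steps)  # TF clips the global step at decay_steps
    return (lr0 - end_lr) * (1.0 - step / max_steps) ** power + end_lr

# starts at lr0 and reaches end_lr exactly at max_steps
assert polynomial_decay(1e-4, 0, 1000) == 1e-4
assert polynomial_decay(1e-4, 1000, 1000) == 0.0
```

With `power=0.2` the curve is strongly concave: the rate stays near `lr0` for most of training and drops steeply only in the final steps.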
<repo_name>MicrochipTech/unicens-bare-metal-sam-v71<file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_print.h /*------------------------------------------------------------------------------------------------*/ /* UNICENS Stucture Printing module */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/ /*------------------------------------------------------------------------------------------------*/ #ifndef UCSI_PRINT_H_ #define UCSI_PRINT_H_ #ifdef __cplusplus extern "C" { #endif #include <stdint.h> #include <stdarg.h> #include "ucs_api.h" #include "ucs_cfg.h" #include "ucs_xrm_cfg.h" #define UCSI_PRINT_MAX_NODES (UCS_NUM_REMOTE_DEVICES + 1) #define UCSI_PRINT_MAX_RESOURCES (UCS_XRM_NUM_RESOURCES) typedef enum { ObjState_Unused, ObjState_Build, ObjState_Failed } UCSIPrint_ObjectState_t; typedef enum { NodeState_NotAvailable, NodeState_Ignored, NodeState_Available } UCSIPrint_NodeState_t; void UCSIPrint_Init(Ucs_Rm_Route_t *pRoutes, uint16_t routesSize, void *tag); void UCSIPrint_Service(uint32_t timestamp); void UCSIPrint_SetNetworkAvailable(bool available, uint8_t maxPos); void UCSIPrint_SetNodeAvailable(uint16_t nodeAddress, uint16_t nodePosAddr, UCSIPrint_NodeState_t nodeState); void UCSIPrint_SetRouteState(uint16_t routeId, bool isActive, uint16_t connectionLabel); void UCSIPrint_SetObjectState(Ucs_Xrm_ResObject_t *element, UCSIPrint_ObjectState_t state); void UCSIPrint_UnicensActivity(void); /** * \brief Callback when ever UNICENS_PRINT needs to be serviced. Call UCSIPrint_Service in next service cycle. * \param tag - user pointer given along with UCSIPrint_Init */ extern void UCSIPrint_CB_NeedService(void *tag); /** * \brief Callback when ever UNICENS_PRINT forms a human readable message. 
* \param tag - user pointer given along with UCSIPrint_Init * \param pMsg - zero terminated human readable string */ extern void UCSIPrint_CB_OnUserMessage(void *tag, const char pMsg[]); #ifdef __cplusplus } #endif #endif <file_sep>/audio-source/samv71-ucs/src/default_config.c /*------------------------------------------------------------------------------------------------*/ /* UNICENS Generated Network Configuration */ /* Generator: xml2struct for Windows V4.4.0 */ /*------------------------------------------------------------------------------------------------*/ #include "ucs_api.h" uint16_t PacketBandwidth = 12; uint16_t RoutesSize = 3; uint16_t NodeSize = 4; /* Route 1 from source-node=0x200 to sink-node=0x2B0 */ Ucs_Xrm_DefaultCreatedPort_t SrcOfRoute1_DcPort = { UCS_XRM_RC_TYPE_DC_PORT, UCS_XRM_PORT_TYPE_MLB, 0 }; Ucs_Xrm_MlbSocket_t SrcOfRoute1_MlbSocket = { UCS_XRM_RC_TYPE_MLB_SOCKET, &SrcOfRoute1_DcPort, UCS_SOCKET_DIR_INPUT, UCS_MLB_SCKT_SYNC_DATA, 4, 0x0A }; Ucs_Xrm_NetworkSocket_t SrcOfRoute1_NetworkSocket = { UCS_XRM_RC_TYPE_NW_SOCKET, 0x0D00, UCS_SOCKET_DIR_OUTPUT, UCS_NW_SCKT_SYNC_DATA, 4 }; Ucs_Xrm_SyncCon_t SrcOfRoute1_SyncCon = { UCS_XRM_RC_TYPE_SYNC_CON, &SrcOfRoute1_MlbSocket, &SrcOfRoute1_NetworkSocket, UCS_SYNC_MUTE_MODE_NO_MUTING, 0 }; Ucs_Xrm_ResObject_t *SrcOfRoute1_JobList[] = { &SrcOfRoute1_DcPort, &SrcOfRoute1_MlbSocket, &SrcOfRoute1_NetworkSocket, &SrcOfRoute1_SyncCon, NULL }; Ucs_Xrm_NetworkSocket_t SnkOfRoute1_NetworkSocket = { UCS_XRM_RC_TYPE_NW_SOCKET, 0x0D00, UCS_SOCKET_DIR_INPUT, UCS_NW_SCKT_SYNC_DATA, 4 }; Ucs_Xrm_DefaultCreatedPort_t SnkOfRoute1_DcPort = { UCS_XRM_RC_TYPE_DC_PORT, UCS_XRM_PORT_TYPE_MLB, 0 }; Ucs_Xrm_MlbSocket_t SnkOfRoute1_MlbSocket = { UCS_XRM_RC_TYPE_MLB_SOCKET, &SnkOfRoute1_DcPort, UCS_SOCKET_DIR_OUTPUT, UCS_MLB_SCKT_SYNC_DATA, 4, 0x0A }; Ucs_Xrm_SyncCon_t SnkOfRoute1_SyncCon = { UCS_XRM_RC_TYPE_SYNC_CON, &SnkOfRoute1_NetworkSocket, &SnkOfRoute1_MlbSocket, UCS_SYNC_MUTE_MODE_NO_MUTING, 0 }; 
Ucs_Xrm_ResObject_t *SnkOfRoute1_JobList[] = { &SnkOfRoute1_NetworkSocket, &SnkOfRoute1_DcPort, &SnkOfRoute1_MlbSocket, &SnkOfRoute1_SyncCon, NULL }; /* Route 2 from source-node=0x200 to sink-node=0x270 */ Ucs_Xrm_NetworkSocket_t SnkOfRoute2_NetworkSocket = { UCS_XRM_RC_TYPE_NW_SOCKET, 0x0D00, UCS_SOCKET_DIR_INPUT, UCS_NW_SCKT_SYNC_DATA, 4 }; Ucs_Xrm_StrmPort_t SnkOfRoute2_StrmPort0 = { UCS_XRM_RC_TYPE_STRM_PORT, 0, UCS_STREAM_PORT_CLK_CFG_64FS, UCS_STREAM_PORT_ALGN_LEFT16BIT }; Ucs_Xrm_StrmPort_t SnkOfRoute2_StrmPort1 = { UCS_XRM_RC_TYPE_STRM_PORT, 1, UCS_STREAM_PORT_CLK_CFG_WILD, UCS_STREAM_PORT_ALGN_LEFT16BIT }; Ucs_Xrm_StrmSocket_t SnkOfRoute2_StrmSocket = { UCS_XRM_RC_TYPE_STRM_SOCKET, &SnkOfRoute2_StrmPort0, UCS_SOCKET_DIR_OUTPUT, UCS_STREAM_PORT_SCKT_SYNC_DATA, 4, UCS_STREAM_PORT_PIN_ID_SRXA0 }; Ucs_Xrm_SyncCon_t SnkOfRoute2_SyncCon = { UCS_XRM_RC_TYPE_SYNC_CON, &SnkOfRoute2_NetworkSocket, &SnkOfRoute2_StrmSocket, UCS_SYNC_MUTE_MODE_NO_MUTING, 0 }; Ucs_Xrm_ResObject_t *SnkOfRoute2_JobList[] = { &SnkOfRoute2_NetworkSocket, &SnkOfRoute2_StrmPort0, &SnkOfRoute2_StrmPort1, &SnkOfRoute2_StrmSocket, &SnkOfRoute2_SyncCon, NULL }; /* Route 3 from source-node=0x200 to sink-node=0x240 */ Ucs_Xrm_NetworkSocket_t SnkOfRoute3_NetworkSocket = { UCS_XRM_RC_TYPE_NW_SOCKET, 0x0D00, UCS_SOCKET_DIR_INPUT, UCS_NW_SCKT_SYNC_DATA, 4 }; Ucs_Xrm_StrmPort_t SnkOfRoute3_StrmPort0 = { UCS_XRM_RC_TYPE_STRM_PORT, 0, UCS_STREAM_PORT_CLK_CFG_64FS, UCS_STREAM_PORT_ALGN_LEFT16BIT }; Ucs_Xrm_StrmPort_t SnkOfRoute3_StrmPort1 = { UCS_XRM_RC_TYPE_STRM_PORT, 1, UCS_STREAM_PORT_CLK_CFG_WILD, UCS_STREAM_PORT_ALGN_LEFT16BIT }; Ucs_Xrm_StrmSocket_t SnkOfRoute3_StrmSocket = { UCS_XRM_RC_TYPE_STRM_SOCKET, &SnkOfRoute3_StrmPort0, UCS_SOCKET_DIR_OUTPUT, UCS_STREAM_PORT_SCKT_SYNC_DATA, 4, UCS_STREAM_PORT_PIN_ID_SRXA1 }; Ucs_Xrm_SyncCon_t SnkOfRoute3_SyncCon = { UCS_XRM_RC_TYPE_SYNC_CON, &SnkOfRoute3_NetworkSocket, &SnkOfRoute3_StrmSocket, UCS_SYNC_MUTE_MODE_NO_MUTING, 0 }; Ucs_Xrm_ResObject_t 
*SnkOfRoute3_JobList[] = { &SnkOfRoute3_NetworkSocket, &SnkOfRoute3_StrmPort0, &SnkOfRoute3_StrmPort1, &SnkOfRoute3_StrmSocket, &SnkOfRoute3_SyncCon, NULL }; UCS_NS_CONST uint8_t PayloadRequest1ForNode270[] = { 0x00, 0x00, 0x01, 0x01 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request1ForNode270 = { 0x00, 0x01, 0x06C1, 0x02, 0x04, PayloadRequest1ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response1ForNode270 = { 0x00, 0x01, 0x06C1, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest2ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x1B, 0x80 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request2ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest2ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response2ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest3ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x11, 0xB8 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request3ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest3ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response3ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest4ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x12, 0x60 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request4ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest4ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response4ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest5ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x13, 0xA0 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request5ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest5ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response5ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest6ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x14, 0x48 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request6ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest6ForNode270 }; 
UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response6ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest7ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x05, 0x00, 0x64, 0x20, 0x00, 0x89, 0x77, 0x72 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request7ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0D, PayloadRequest7ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response7ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest8ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x06, 0x00 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request8ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest8ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response8ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest9ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x02, 0x00, 0x64, 0x05, 0x00 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request9ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0A, PayloadRequest9ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response9ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest10ForNode270[] = { 0x0F, 0x00, 0x00, 0x00, 0x2A, 0x03, 0x00, 0x64, 0x07, 0x01, 0x50 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request10ForNode270 = { 0x00, 0x01, 0x06C4, 0x02, 0x0B, PayloadRequest10ForNode270 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response10ForNode270 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST Ucs_Ns_Script_t ScriptsForNode270[] = { { 0, &Request1ForNode270, &Response1ForNode270 }, { 0, &Request2ForNode270, &Response2ForNode270 }, { 0, &Request3ForNode270, &Response3ForNode270 }, { 0, &Request4ForNode270, &Response4ForNode270 }, { 0, &Request5ForNode270, &Response5ForNode270 }, { 0, &Request6ForNode270, &Response6ForNode270 }, { 0, &Request7ForNode270, &Response7ForNode270 }, { 0, &Request8ForNode270, &Response8ForNode270 }, { 0, &Request9ForNode270, &Response9ForNode270 }, { 0, &Request10ForNode270, &Response10ForNode270 } }; 
UCS_NS_CONST uint8_t PayloadRequest1ForNode240[] = { 0x00, 0x00, 0x01, 0x01 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request1ForNode240 = { 0x00, 0x01, 0x06C1, 0x02, 0x04, PayloadRequest1ForNode240 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response1ForNode240 = { 0x00, 0x01, 0x06C1, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest2ForNode240[] = { 0x0F, 0x00, 0x02, 0x0A, 0x18, 0x03, 0x00, 0x64, 0x00, 0x0F, 0x02, 0x01, 0x00, 0x00, 0x02, 0xA5, 0xDF, 0x03, 0x3F, 0x3F, 0x04, 0x02, 0x02, 0x10, 0x40, 0x40, 0x11, 0x00, 0x00, 0x12, 0x00, 0x00, 0x13, 0x00, 0x00, 0x14, 0x00, 0x00 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request2ForNode240 = { 0x00, 0x01, 0x06C4, 0x02, 0x26, PayloadRequest2ForNode240 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response2ForNode240 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST uint8_t PayloadRequest3ForNode240[] = { 0x0F, 0x00, 0x02, 0x04, 0x18, 0x03, 0x00, 0x64, 0x20, 0x00, 0x00, 0x21, 0x00, 0x00, 0x22, 0x00, 0x00, 0x23, 0x00, 0x00 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Request3ForNode240 = { 0x00, 0x01, 0x06C4, 0x02, 0x14, PayloadRequest3ForNode240 }; UCS_NS_CONST Ucs_Ns_ConfigMsg_t Response3ForNode240 = { 0x00, 0x01, 0x06C4, 0x0C, 0x00, NULL }; UCS_NS_CONST Ucs_Ns_Script_t ScriptsForNode240[] = { { 0, &Request1ForNode240, &Response1ForNode240 }, { 0, &Request2ForNode240, &Response2ForNode240 }, { 0, &Request3ForNode240, &Response3ForNode240 } }; Ucs_Signature_t SignatureForNode200 = { 0x200 }; Ucs_Signature_t SignatureForNode2B0 = { 0x2B0 }; Ucs_Signature_t SignatureForNode270 = { 0x270 }; Ucs_Signature_t SignatureForNode240 = { 0x240 }; Ucs_Rm_Node_t AllNodes[] = { { &SignatureForNode200, NULL, 0 }, { &SignatureForNode2B0, NULL, 0 }, { &SignatureForNode270, ScriptsForNode270, 10 }, { &SignatureForNode240, ScriptsForNode240, 3 } }; Ucs_Rm_EndPoint_t SourceEndpointForRoute1 = { UCS_RM_EP_SOURCE, SrcOfRoute1_JobList, &AllNodes[0] }; Ucs_Rm_EndPoint_t SinkEndpointForRoute1 = { UCS_RM_EP_SINK, SnkOfRoute1_JobList, &AllNodes[1] }; Ucs_Rm_EndPoint_t 
SinkEndpointForRoute2 = { UCS_RM_EP_SINK, SnkOfRoute2_JobList, &AllNodes[2] }; Ucs_Rm_EndPoint_t SinkEndpointForRoute3 = { UCS_RM_EP_SINK, SnkOfRoute3_JobList, &AllNodes[3] }; Ucs_Rm_Route_t AllRoutes[] = { { &SourceEndpointForRoute1, &SinkEndpointForRoute1, 1, 0x0010 }, { &SourceEndpointForRoute1, &SinkEndpointForRoute2, 1, 0x0011 }, { &SourceEndpointForRoute1, &SinkEndpointForRoute3, 1, 0x0012 } }; <file_sep>/audio-source/samv71-ucs/src/gmac/gmac_init.h #ifndef _GMAC_INIT_H_ #define _GMAC_INIT_H_ #include "gmac.h" #include "gmac_init.h" #include "gmacb_phy.h" #include "gmacd.h" #include "gmii.h" #ifdef __cplusplus extern "C" { #endif void init_gmac(sGmacd *pGmacd); #ifdef __cplusplus } #endif #endif <file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_api.h /*------------------------------------------------------------------------------------------------*/ /* UNICENS Integration Helper Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. 
*/ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #ifndef UCSI_H_ #define UCSI_H_ #ifdef __cplusplus extern "C" { #endif #include "ucsi_cfg.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public API */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Initializes UNICENS Integration module. * \note Must be called before any other function of this component * * \param pPriv - External allocated memory area for this particular * instance (static allocated or allocated with malloc) * \param pTag - Pointer given by the integrator. This pointer will be * returned by any callback function of this component */ void UCSI_Init(UCSI_Data_t *pPriv, void *pTag); /** * \brief Runs in programming mode. All nodes will be treated as unknown. * If the node address of a node is not correct, it will be new programmed. * \note This is a special mode. All public functions of this component will not be * accepted if this mode is set. * Except: UCSI_ProcessRxData, UCSI_Service, UCSI_Timeout, UCSI_ProgramIdentStringRam, UCSI_ProgramIdentStringRom. 
* \note Call UCSI_Init again, to leave this mode. * * \param pPriv - private data section of this instance * \param amountOfNodes - Programming starts only if the found node count in the network is like the given value. */ bool UCSI_RunInProgrammingMode(UCSI_Data_t *pPriv, uint8_t amountOfNodes); /** * \brief Executes the given configuration. If already started, all * existing local and remote INIC resources will be destroyed * \note All given pointers must stay valid until this callback is * raised: "UCSI_CB_OnStop" * * \param pPriv - private data section of this instance * \param packetBw - The amount of bytes per frame, reserved for Ethernet channel. * \param pRoutesList - Reference to a list of routes * \param routesListSize - Number of routes in the list * \param pNodesList - Reference to the list of nodes * \param nodesListSize - Reference to a list of routes * \return true, configuration successfully enqueued, false otherwise */ bool UCSI_NewConfig(UCSI_Data_t *pPriv, uint16_t packetBw, Ucs_Rm_Route_t *pRoutesList, uint16_t routesListSize, Ucs_Rm_Node_t *pNodesList, uint16_t nodesListSize); /** * \brief Executes the given script. 
If already started, all * existing local and remote INIC resources will be destroyed * \note pScriptList pointer must stay valid until this callback is * raised: "UCSI_CB_OnStop" * \note UCSI_NewConfig must called first, before calling the function * * \param pPriv - private data section of this instance * \param targetAddress - targetAddress - The target node address * \param pScriptList - Pointer to the array of scripts * \param scriptListLength - Number of scripts in the array * \return true, script successfully enqueued, false otherwise */ bool UCSI_ExecuteScript(UCSI_Data_t *pPriv, uint16_t targetAddress, Ucs_Ns_Script_t *pScriptList, uint8_t scriptListLength); /** * \brief Offer the received control data from LLD to UNICENS * \note Call this function only from single context (not from ISR) * \note This function can be called repeated until it return false * * \param pPriv - private data section of this instance * \param pBuffer - Received bytes from MOST control channel * \param len - Length of the received data array * \return true, if the data could be enqueued for processing, remove * the data from LLD queue in this case. * false, data could not be processed due to lag of resources. * In this case do not discard the data. Offer the same * data again after UCSI_CB_OnServiceRequired was * raised or any time later. */ bool UCSI_ProcessRxData(UCSI_Data_t *pPriv, const uint8_t *pBuffer, uint32_t len); /** * \brief Gives UNICENS Integration module time to do its job * \note Call this function only from single context (not from ISR) * * \param pPriv - private data section of this instance */ void UCSI_Service(UCSI_Data_t *pPriv); /** * \brief Call after timer set by UCSI_CB_OnSetServiceTimer * expired. 
* \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance */ void UCSI_Timeout(UCSI_Data_t *pPriv); /** * \brief Sends an AMS message to the control channel * * \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance * \param msgId - The AMS message id * \param targetAddress - The node / group target address * \param pPayload - The AMS payload to be sent * \param payloadLen - The length of the AMS payload * * \return true, if the operation was successful. false if the message could not be sent. */ bool UCSI_SendAmsMessage(UCSI_Data_t *pPriv, uint16_t msgId, uint16_t targetAddress, uint8_t *pPayload, uint32_t payloadLen); /** * \brief Gets the queued AMS message from the UNICENS stack * * \note Call this function only from a single context (not from an ISR) * \note This function may be called cyclically or when UCSI_CB_OnAmsMessageReceived was raised * * \param pPriv - private data section of this instance * \param pMsgId - The received AMS message id will be written to this pointer * \param pSourceAddress - The received AMS source address will be written to this pointer * \param pPayload - The received AMS payload will be written to this pointer * \param pPayloadLen - The received AMS payload length will be written to this pointer * * \return true, if the operation was successful. false if no message could be retrieved. */ bool UCSI_GetAmsMessage(UCSI_Data_t *pPriv, uint16_t *pMsgId, uint16_t *pSourceAddress, uint8_t **pPayload, uint32_t *pPayloadLen); /** * \brief Releases the message memory returned by UCSI_GetAmsMessage. * * \note Call this function only from a single context (not from an ISR) * \note This function must be called when the data of UCSI_GetAmsMessage has been processed. * If this function is not called, UCSI_GetAmsMessage will always return the reference to the same data. * \note UCSI_Service may also free the data returned by UCSI_GetAmsMessage! 
* * \param pPriv - private data section of this instance */ void UCSI_ReleaseAmsMessage(UCSI_Data_t *pPriv); /** * \brief Enables or disables a route by the given routeId * \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance * \param routeId - identifier as given in the XML file along with the MOST socket (unique) * \param isActive - true, route will become active. false, route will be deallocated * * \return true, if the route was found and the specific command was enqueued to UNICENS. */ bool UCSI_SetRouteActive(UCSI_Data_t *pPriv, uint16_t routeId, bool isActive); /** * \brief Performs a remote I2C write command * \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance * \param targetAddress - The node / group target address * \param isBurst - true, write blockCount I2C telegrams of length dataLen with a single call. false, write a single I2C message. * \param blockCount - amount of blocks to write. Only used when isBurst is set to true. * \param slaveAddr - The I2C address. * \param timeout - Timeout in milliseconds. * \param dataLen - Amount of bytes to send via I2C * \param pData - The payload to be sent. * * \return true, if the I2C command was enqueued to UNICENS. */ bool UCSI_I2CWrite(UCSI_Data_t *pPriv, uint16_t targetAddress, bool isBurst, uint8_t blockCount, uint8_t slaveAddr, uint16_t timeout, uint8_t dataLen, const uint8_t *pData); /** * \brief Performs a remote I2C read command. * \note UCSI_CB_OnI2CRead will be called after this command has been executed * \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance * \param targetAddress - The node / group target address * \param slaveAddr - The I2C address. * \param timeout - Timeout in milliseconds. 
* \param dataLen - Amount of bytes to read via I2C * * \return true, if the I2C command was enqueued to UNICENS. */ bool UCSI_I2CRead(UCSI_Data_t *pPriv, uint16_t targetAddress, uint8_t slaveAddr, uint16_t timeout, uint8_t dataLen); /** * \brief Sets the state of a GPIO pin on the given node * \note Call this function only from a single context (not from an ISR) * * \param pPriv - private data section of this instance * \param targetAddress - The node / group target address * \param gpioPinId - INIC GPIO pin, starting with 0 for the first GPIO. * \param isHighState - true, high state = 3.3V. false, low state = 0V. * * \return true, if the GPIO command was enqueued to UNICENS. */ bool UCSI_SetGpioState(UCSI_Data_t *pPriv, uint16_t targetAddress, uint8_t gpioPinId, bool isHighState); /** * \brief Programs the IdentString into the RAM of the specified INIC * * \param pPriv - private data section of this instance * \param signature - The signature of the node to be programmed * \param newIdentString - The data to be programmed * * \return true, if the program-to-RAM command was enqueued to UNICENS. */ bool UCSI_ProgramIdentStringRam(UCSI_Data_t *pPriv, const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString); /** * \brief Programs the IdentString into the ROM of the specified INIC * * \param pPriv - private data section of this instance * \param signature - The signature of the node to be programmed * \param newIdentString - The data to be programmed * * \return true, if the program-to-ROM command was enqueued to UNICENS. */ bool UCSI_ProgramIdentStringRom(UCSI_Data_t *pPriv, const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString); /** * \brief Enables Promiscuous Mode on the given node. * * \param pPriv - private data section of this instance * \param targetAddress - The node / group target address * \param enablePromiscuous - true, the perfect match filter will be disabled. false, otherwise. 
* * \return true, if the command was enqueued to UNICENS. */ bool UCSI_EnablePromiscuousMode(UCSI_Data_t *pPriv, uint16_t targetAddress, bool enablePromiscuous); /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK SECTION */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Callback whenever one of the functions above was executed (reports success or failure). * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param command - Enumeration value identifying the used command * \param success - true, if the given command was successfully executed. false, either the direct call of the command or its async callback signaled an error * \param nodeAddress - the address of the node reporting this event. 0x1 in case of the local master node. 0xFFFF in case the node address is unknown. */ extern void UCSI_CB_OnCommandResult(void *pTag, UnicensCmd_t command, bool success, uint16_t nodeAddress); /** * \brief Callback whenever a timestamp is needed * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \return timestamp in milliseconds */ extern uint16_t UCSI_CB_OnGetTime(void *pTag); /** * \brief Callback when the integrator needs to arm a timer. * \note This function must be implemented by the integrator * \note After the timer expired, call UCSI_Timeout from the service * thread. (Not from the callback!) * \param pTag - Pointer given by the integrator by UCSI_Init * \param timeout - milliseconds from now on to call back. (0=disable) */ extern void UCSI_CB_OnSetServiceTimer(void *pTag, uint16_t timeout); /** * \brief Callback whenever the state of the network has changed. * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param isAvailable - true, if the network is operable. false, network is down. 
No message or stream can be sent or received. * \param packetBandwidth - The amount of bytes per frame reserved for the Ethernet channel. Must match the packetBw value passed to UCSI_NewConfig. * \param amountOfNodes - The amount of network devices found in the ring. */ extern void UCSI_CB_OnNetworkState(void *pTag, bool isAvailable, uint16_t packetBandwidth, uint8_t amountOfNodes); /** * \brief Callback whenever UNICENS forms a human-readable message. * These can be error events or, when enabled, also debug messages. * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param isError - true, if this message is an important error message. false, user/debug message, not important. * \param format - Zero terminated format string (following printf rules) * \param vargsCnt - Amount of parameters stored in "..." */ extern void UCSI_CB_OnUserMessage(void *pTag, bool isError, const char format[], uint16_t vargsCnt, ...); /** * \brief Callback when the overview with all nodes and routes is printed * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param pString - Zero terminated string forming a table with information */ extern void UCSI_CB_OnPrintRouteTable(void *pTag, const char pString[]); /** * \brief Callback whenever this instance needs to be serviced. 
* \note Call UCSI_Service from your scheduler at the next run * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init */ extern void UCSI_CB_OnServiceRequired(void *pTag); /** * \brief Callback whenever the INIC should be reset by the integration code * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init */ extern void UCSI_CB_OnResetInic(void *pTag); /** * \brief Callback whenever this instance of UNICENS wants to send control data to the LLD. * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param pPayload - Byte array to be sent on the INIC control channel * \param payloadLen - Length of pPayload in bytes */ extern void UCSI_CB_OnTxRequest(void *pTag, const uint8_t *pPayload, uint32_t payloadLen); /** * \brief Callback when the UNICENS instance has been started. * \note This event can be used to enable control message reception * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init */ extern void UCSI_CB_OnStart(void *pTag); /** * \brief Callback when the UNICENS instance has been stopped. * \note This event can be used to free the memory holding the resources * passed with UCSI_NewConfig * \note This event can be used to stop control message reception * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init */ extern void UCSI_CB_OnStop(void *pTag); /** * \brief Callback when the UNICENS instance has received an AMS message * \note This function must be implemented by the integrator * \note After this callback, call UCSI_GetAmsMessage indirectly by setting a flag * \param pTag - Pointer given by the integrator by UCSI_Init */ extern void UCSI_CB_OnAmsMessageReceived(void *pTag); /** * \brief Callback when a route becomes active / inactive. 
* \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param routeId - identifier as given in the XML file along with the MOST socket (unique) * \param isActive - true, if the route is now in use. false, the route is not established. * \param connectionLabel - The connection label used on the network. Only valid, if isActive=true */ extern void UCSI_CB_OnRouteResult(void *pTag, uint16_t routeId, bool isActive, uint16_t connectionLabel); /** * \brief Callback when an INIC GPIO changes its state * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param nodeAddress - Node address of the INIC sending the update. * \param gpioPinId - INIC GPIO pin, starting with 0 for the first GPIO. * \param isHighState - true, high state = 3.3V. false, low state = 0V. */ extern void UCSI_CB_OnGpioStateChange(void *pTag, uint16_t nodeAddress, uint8_t gpioPinId, bool isHighState); /** * \brief Callback when nodes are discovered or disappear * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param code - Report code * \param signature - Signature of the found device. May be NULL in error cases * \param pNode - Reference to the node structure found in the nodes list. May be NULL. */ extern void UCSI_CB_OnMgrReport(void *pTag, Ucs_MgrReport_t code, Ucs_Signature_t *signature, Ucs_Rm_Node_t *pNode); /** * \brief Callback when programming (collision resolving) is finished * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param changed - true, if at least one node was programmed with a new configuration. false, nothing was changed. 
*/ extern void UCSI_CB_OnProgrammingDone(void *pTag, bool changed); /** * \brief Callback when an I2C read (triggered by UCSI_I2CRead) command has been executed * \note This function must be implemented by the integrator * \param pTag - Pointer given by the integrator by UCSI_Init * \param success - true, if the data could be read. false, otherwise * \param targetAddress - The node / group target address * \param slaveAddr - The I2C address. * \param pBuffer - If success is set to true, then the payload in this buffer is valid * \param bufLen - Length of the buffer */ extern void UCSI_CB_OnI2CRead(void *pTag, bool success, uint16_t targetAddress, uint8_t slaveAddr, const uint8_t *pBuffer, uint32_t bufLen); #ifdef __cplusplus } #endif #endif /* UCSI_H_ */<file_sep>/README.md # UNICENS example for Atmel SAM V71 This project is a bare metal example integration of the [UNICENS library](https://github.com/MicrochipTech/unicens) for the Atmel SAM V71 controller. ### Needed components __Hardware:__ * [Atmel SAM V71 Xplained Ultra Evaluation Kit](http://www.microchip.com/DevelopmentTools/ProductDetails.aspx?PartNO=atsamv71-xult) * Atmel SAM-ICE (min. HW-Version 7) * OS81118 Phy+Board Variant 3 or OS81210/OS81212/OS81214 Phy+Board Variant 3 * Depending on your use case, you can start with any device of the [K2L Slim Board Family](https://www.k2l.de/products/34/MOST150%20Slim%20Board%20Family/) __Software:__ * Microsoft Windows 7 or newer * [Atmel Studio 7](http://www.microchip.com/development-tools/atmel-studio-7) * [Git for Windows](https://gitforwindows.org) To get the source code, enter: ```bash $ git clone --recurse-submodules https://github.com/MicrochipTech/unicens-bare-metal-sam-v71.git ``` ### Build * Open Atmel Studio 7 * __File -> Open -> Project/Solution... (Ctrl+Shift+O)__ * Select __[your-projects]\\audio-source\\SAMV71-UNICENS-example.atsln__ * Navigate to __Project -> Properties -> Tool__ and select your connected __SAM-ICE__. 
Use __SWD__ as interface and save the configuration file. * Build the project with __F7__ * Build the project, flash it to the SAM V71 and start debugging with __F5__ ### Change Network Configuration The configuration of the entire network is done via a single XML file. It's located at __[your-projects]\\audio-source\config.xml__ **Hint:** Edit the config.xml file within Atmel Studio. It will validate the file and help you fill in the correct tags and attributes by using the XML schema __unicens.xsd__ in the same folder. Once you have edited it, double-click the __[your-projects]\\audio-source\\convertXML.bat__ batch file. It will interpret the XML file and generate a static C source code file at __[your-projects]\\audio-source\\samv71-ucs\\src\default_config.c__ After successful conversion, you need to build the project again and download it to the hardware in order to apply the new network configuration.<file_sep>/audio-source/samv71-ucs/src/driver/dim2/dim2_lld.h /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. 
TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */ /* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */ /* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */ /* */ /* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. */ /*------------------------------------------------------------------------------------------------*/ #ifndef DIM2_LLD_H_ #define DIM2_LLD_H_ #ifdef __cplusplus extern "C" { #endif #include <stdint.h> #include <stdbool.h> typedef enum { ///MOST Control Channel (ADS, AMS) DIM2LLD_ChannelType_Control, ///MOST Async Channel (MEP, MAMAC, MHP) DIM2LLD_ChannelType_Async, ///Synchronous channel (PCM audio) DIM2LLD_ChannelType_Sync, ///Isochronous channel (TS video/audio multiplex) DIM2LLD_ChannelType_Isoc, ///Don't use BOUNDARY (internal use) DIM2LLD_ChannelType_BOUNDARY } DIM2LLD_ChannelType_t; typedef enum { ///From EHC to MOST DIM2LLD_ChannelDirection_TX, ///From MOST to EHC DIM2LLD_ChannelDirection_RX, ///Don't use BOUNDARY (internal use) DIM2LLD_ChannelDirection_BOUNDARY } DIM2LLD_ChannelDirection_t; /** \brief Initializes the DIM Low Level Driver * \return true, if the module could be initialized, false otherwise. */ bool DIM2LLD_Init(void); /** \brief Sets up a communication channel * \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. For Control and Async only one instance per direction is allowed. * \param channelAddress - The MLB channel address to use. This must be an even value! * \param bufferSize - The maximum amount of bytes which may be used by the DIM2 module to buffer data * \param subSize - This value is only used for Sync and Isoc data types (you may use 0 for Control and Async). 
It sets the amount of bytes of the smallest data chunk (4 bytes for 16-bit stereo, 188 bytes for TS). * \param numberOfBuffers - The maximum amount of messages which are stored in the LLD driver (DIM2 uses ping/pong buffers). * \param bufferOffset - If non zero, the given amount of bytes will be appended to the specific buffer. For RX, this area can be filled for example with header data and passed to different software stacks (e.g. TCP/IP). For TX this value will be ignored. * \return true, if the channel could be initialized, false otherwise. */ bool DIM2LLD_SetupChannel(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint16_t channelAddress, uint16_t bufferSize, uint16_t subSize, uint16_t numberOfBuffers, uint16_t bufferOffset); /** \brief Deinitializes the DIM Low Level Driver * */ void DIM2LLD_Deinit(void); /** \brief Must be called cyclically from task context * */ void DIM2LLD_Service(void); /** \brief Checks if the MLB and INIC have reached the locked state. * * \return true, if there is a lock, false otherwise. */ bool DIM2LLD_IsMlbLocked(void); /** \brief Returns the amount of available data buffers (max. "numberOfBuffers" passed with DIM2LLD_SetupChannel), which can be retrieved by calling DIM2LLD_GetRxData. * \note This is intended to provide health information of the system (debugging). * \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. For Control and Async only one instance per direction is allowed. * \return The amount of queued elements. */ uint32_t DIM2LLD_GetQueueElementCount(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance); /** \brief Retrieves received data from the given channel, if available. * \note The payload passed by pBuffer stays valid until the function DIM2LLD_ReleaseRxData is called. 
* \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. For Control and Async only one instance per direction is allowed. * \param pos - The position to read, starting with 0 for the oldest entry. Use DIM2LLD_GetQueueElementCount to get the maximum pos count (max - 1). * \param pBuffer - This function will deliver a pointer to the data. It may be used for further processing, don't forget to call DIM2LLD_ReleaseRxData afterwards! If there is no data available, this pointer will be set to NULL. * \param pOffset - To this pointer the offset value will be written. This must be exactly the same value as the one given with DIM2LLD_SetupChannel, parameter bufferOffset. The given buffer is extended by this size. * The user may write into the first pOffset bytes without destroying any information. The received data starts afterwards. May be left NULL. * \param pPacketCounter - To this pointer a unique packet counter will be written. The application can identify this buffer by this value in order to mark it as processed. May be left NULL. * \return Returns the amount of bytes which can be accessed via pBuffer. */ uint16_t DIM2LLD_GetRxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint32_t pos, const uint8_t **pBuffer, uint16_t *pOffset, uint8_t *pPacketCounter); /** \brief Releases the passed data from the DIM2LLD_GetRxData function call. Call DIM2LLD_ReleaseRxData only in case you received valid data from DIM2LLD_GetRxData (not if it returned NULL). * \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. 
For Control and Async only one instance per direction is allowed. */ void DIM2LLD_ReleaseRxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance); /** \brief Gives a pointer to an LLD buffer, if available. The user may fill the buffer asynchronously. Afterwards call DIM2LLD_SendTxData to finally send the data. * \note The payload passed by pBuffer stays valid until the function DIM2LLD_SendTxData is called. * \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. For Control and Async only one instance per direction is allowed. * \param pBuffer - This function will deliver a pointer to an empty buffer. It may be used for asynchronous filling with data. If there is no free buffer in the LLD module, this pointer is NULL and the return value is 0. * \return Returns the amount of bytes which can be filled into pBuffer. ==> Max buffer size */ uint16_t DIM2LLD_GetTxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint8_t **pBuffer); /** \brief Finally sends the passed data from the DIM2LLD_GetTxData function call. Call DIM2LLD_SendTxData only in case you got a valid buffer from DIM2LLD_GetTxData (not if it returned NULL). * \param cType - The data type which shall be used for this channel * \param dir - The direction for this unidirectional channel * \param instance - For Isoc or Sync channels multiple instances may be used, starting with 0 for the first instance. For Control and Async only one instance per direction is allowed. * \param payloadLength - The length of the data stored in the pBuffer returned by DIM2LLD_GetTxData. Make sure that the length is less than or equal to the return value of DIM2LLD_GetTxData! 
*/ void DIM2LLD_SendTxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint32_t payloadLength); #ifdef __cplusplus } #endif #endif /* DIM2_LLD_H_ */<file_sep>/audio-source/samv71-ucs/src/board_init.c /*------------------------------------------------------------------------------------------------*/ /* Board Init Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #include "board_init.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* FUNCTION PROTOTYPES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ void TCM_StackInit(void); /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* DEFINES AND LOCAL VARIABLES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ Twid twid; sGmacd gGmacd; GMacb gGmacb; sXdmad xdma; /** GMAC power control pin */ #if !defined(BOARD_GMAC_POWER_ALWAYS_ON) static const Pin gmacPwrDn[] = {BOARD_GMAC_PIN_PWRDN}; #endif /** The PINs for GMAC */ static const Pin gmacPins[] = {BOARD_GMAC_RUN_PINS}; static const Pin gmacResetPin = BOARD_GMAC_RESET_PIN; /** The PINs for TWI*/ static const Pin twiPins[] = PINS_TWI0; /** TWI clock frequency in Hz. 
*/ #define TWCK 400000 /** Slave address of twi_eeprom AT24MAC.*/ #define AT24MAC_SERIAL_NUM_ADD 0x5F /** Page size of an AT24MAC402 chip (in bytes)*/ #define PAGE_SIZE 16 /** Page numbers of an AT24MAC402 chip */ #define EEPROM_PAGES 16 /** EEPROM Pins definition */ #define BOARD_PINS_TWI_EEPROM PINS_TWI0 /** TWI0 peripheral ID for EEPROM device*/ #define BOARD_ID_TWI_EEPROM ID_TWIHS0 /** TWI0 base address for EEPROM device */ #define BOARD_BASE_TWI_EEPROM TWIHS0 /* Push button pin */ static const Pin pushbutton[] = {PIN_PUSHBUTTON_0, PIN_PUSHBUTTON_1}; /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PUBLIC FUNCTIONS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ void Board_Init() { /* Initialize the SAM system */ WDT_Disable(WDT); #ifdef ENABLE_TCM TCM_StackInit(); #endif SCB_EnableICache(); SCB_EnableDCache(); // Please edit __DCACHE_PRESENT in samv71q21.h to en/disable DCache // Configure systick for 1 ms if (TimeTick_Configure()) { assert(false); } LED_Configure(0); LED_Configure(1); LED_Clear(0); LED_Clear(1); // enable GMAC interrupts NVIC_ClearPendingIRQ(GMAC_IRQn); NVIC_EnableIRQ(GMAC_IRQn); // Configure Board Pushbuttons // SW1 is a ERASE system function, switch it to port function MATRIX->CCFG_SYSIO |= (1u << 12); // have to disable the pull down on PB12 for SW1 before the pull up can be enabled PIOB->PIO_PPDDR = 1 << 12; PIO_Configure(pushbutton, PIO_LISTSIZE(pushbutton)); // Adjust pio debounce filter parameters PIO_SetDebounceFilter(&pushbutton[0], 10); PIO_SetDebounceFilter(&pushbutton[1], 10); /* Configure TWI pins. 
*/ PIO_Configure(twiPins, ARRAY_SIZE(twiPins)); /* Enable TWI */ PMC_EnablePeripheral(BOARD_ID_TWI_EEPROM); TWI_ConfigureMaster(BOARD_BASE_TWI_EEPROM, TWCK, BOARD_MCK); TWID_Initialize(&twid, BOARD_BASE_TWI_EEPROM); NVIC_ClearPendingIRQ(TWIHS0_IRQn); NVIC_EnableIRQ(TWIHS0_IRQn); /* Setup and enable XDMAC */ XDMAD_Initialize(&xdma, 0); NVIC_ClearPendingIRQ(XDMAC_IRQn); NVIC_SetPriority(XDMAC_IRQn, 1); NVIC_EnableIRQ(XDMAC_IRQn); /* Initialize the hardware interface */ init_gmac(&gGmacd); /* Setup interrupts */ GMACB_Init(&gGmacb, &gGmacd, BOARD_GMAC_PHY_ADDR); GMACB_ResetPhy(&gGmacb); /* PHY initialize */ if (!GMACB_InitPhy(&gGmacb, BOARD_MCK, &gmacResetPin, 1, gmacPins, PIO_LISTSIZE(gmacPins))) { printf("PHY Initialize ERROR!\n\r"); assert(false); } if(!GMACB_PhySetSpeed100(&gGmacb, 1u)) { printf("Set Ethernet PHY Speed ERROR!\n\r"); assert(false); } #ifdef DEBUG //Phy takes a while until ready, do not wait for it in release variant Wait(5000); #endif } bool Board_IsButtonPressed(Board_Button_t button) { switch(button) { case BoardButton_SW0: return !PIO_Get(&pushbutton[0]); case BoardButton_SW1: return !PIO_Get(&pushbutton[1]); } assert(false); return false; } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* ISR HOOKS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** DMA handler **/ void XDMAC_Handler(void) { XDMAD_Handler(&xdma); } /** TWI handler **/ void TWIHS0_Handler(void) { TWID_Handler(&twid) ; } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK ERROR HOOKS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ void SystemHalt(const char *message) { printf("System halted by '%s'\r\n", message); while(1); } void NMI_Handler(void) { SystemHalt("NMI_Handler"); } void HardFault_Handler(void) { SystemHalt("HardFault_Handler"); } void MemManage_Handler(void) { SystemHalt("MemManage_Handler"); } void BusFault_Handler(void) { 
SystemHalt("BusFault_Handler"); } void UsageFault_Handler(void) { SystemHalt("UsageFault_Handler"); }<file_sep>/audio-source/samv71-ucs/src/gmac/gmac_init.c #include <assert.h> #include "gmac_init.h" #include <string.h> /** Enable/Disable CopyAllFrame */ #define GMAC_CAF_DISABLE 0 #define GMAC_CAF_ENABLE 1 /** Enable/Disable NoBroadCast */ #define GMAC_NBC_DISABLE 0 #define GMAC_NBC_ENABLE 1 #define PTP_RX_BUFFERS 16 /** Must be a power of 2 */ #define PTP_RX_BUFF_SIZE 256 /** Must be a power of 2 */ #define DUMMY_BUFFERS 4 /** Must be a power of 2 */ #define DUMMY_BUFF_SIZE 128 /** Must be a power of 2 */ #define AVB_RX_BUFFERS 8 #define AVB_TX_BUFFERS 8 #define AVB_BUFF_SIZE 1280 #define ETH_RX_BUFFERS 8 #define ETH_TX_BUFFERS 8 #define TSU_INCR_NS (6) #define TSU_INCR_SUBNS ((uint16_t)(0.66667f * 65536.f)) #define DESCRIPTOR COMPILER_SECTION(".ram_nocache") COMPILER_ALIGNED(8) #define GMACBUFFER COMPILER_SECTION(".ram_nocache") COMPILER_ALIGNED(DEFAULT_CACHELINE) DESCRIPTOR sGmacRxDescriptor gPtpRxDs[PTP_RX_BUFFERS]; DESCRIPTOR sGmacTxDescriptor gPtpTxDs[DUMMY_BUFFERS]; DESCRIPTOR sGmacRxDescriptor gAvbRxDs[AVB_RX_BUFFERS]; DESCRIPTOR sGmacTxDescriptor gAvbTxDs[AVB_TX_BUFFERS]; DESCRIPTOR sGmacRxDescriptor gEthRxDs[ETH_RX_BUFFERS]; DESCRIPTOR sGmacTxDescriptor gEthTxDs[ETH_TX_BUFFERS]; GMACBUFFER uint8_t gRxPtpBuffer[PTP_RX_BUFFERS * PTP_RX_BUFF_SIZE]; GMACBUFFER uint8_t gTxPtpBuffer[DUMMY_BUFFERS * DUMMY_BUFF_SIZE]; GMACBUFFER uint8_t gRxAvbBuffer[AVB_RX_BUFFERS * AVB_BUFF_SIZE]; GMACBUFFER uint8_t gTxAvbBuffer[AVB_TX_BUFFERS * AVB_BUFF_SIZE]; GMACBUFFER uint8_t gRxEthBuffer[ETH_RX_BUFFERS * ETH_BUFF_SIZE]; GMACBUFFER uint8_t gTxEthBuffer[ETH_TX_BUFFERS * ETH_BUFF_SIZE]; DESCRIPTOR fGmacdTransferCallback gPtpTxCbs[DUMMY_BUFFERS]; DESCRIPTOR fGmacdTransferCallback gAvbTxCbs[AVB_TX_BUFFERS]; DESCRIPTOR fGmacdTransferCallback gEthTxCbs[ETH_TX_BUFFERS]; DESCRIPTOR void *gPtpTxCbTags[DUMMY_BUFFERS]; DESCRIPTOR void *gAvbTxCbTags[AVB_TX_BUFFERS]; DESCRIPTOR void 
*gEthTxCbTags[ETH_TX_BUFFERS]; #define PTP_ETHER_TYPE (0x88F7u) #define AVB_ETHER_TYPE (0x22F0u) #define ARP_ETHER_TYPE (0x0806u) const gmacQueList_t PTP_QUEUE = GMAC_QUE_0; static void PtpDataReceived(uint32_t status, void *pTag); static void gmac_RxPtpEvtMsgIsrCB (ptpMsgType rxEvtMsg, uint32_t efrsh, uint32_t efrsl, uint32_t eftn); static void gmac_TxPtpEvtMsgIsrCB (ptpMsgType txEvtMsg, uint32_t eftsh, uint32_t eftsl, uint32_t eftn, uint16_t sequenceId); static sGmacd *spGmacd; static sGmacInit QuePTP; static sGmacInit QueAVB; static sGmacInit QueETH; void init_gmac(sGmacd *pGmacd) { assert(NULL != pGmacd); spGmacd = pGmacd; memset(&QuePTP, 0, sizeof(QuePTP)); memset(&QueAVB, 0, sizeof(QueAVB)); memset(&QueETH, 0, sizeof(QueETH)); /* Initialize GMAC driver structure */ QuePTP.bIsGem = 1; QuePTP.bDmaBurstLength = 4; QuePTP.pRxBuffer = gRxPtpBuffer; QuePTP.pRxD = gPtpRxDs; QuePTP.wRxBufferSize = PTP_RX_BUFF_SIZE; QuePTP.wRxSize = PTP_RX_BUFFERS; QuePTP.pTxBuffer = gTxPtpBuffer; QuePTP.pTxD = gPtpTxDs; QuePTP.wTxBufferSize = DUMMY_BUFF_SIZE; QuePTP.wTxSize = DUMMY_BUFFERS; QuePTP.pTxCb = gPtpTxCbs; QuePTP.pTxCbTag = gPtpTxCbTags; QueETH.bIsGem = 1; QueETH.bDmaBurstLength = 4; QueETH.pRxBuffer = gRxEthBuffer; QueETH.pRxD = gEthRxDs; QueETH.wRxBufferSize = ETH_BUFF_SIZE; QueETH.wRxSize = ETH_RX_BUFFERS; QueETH.pTxBuffer = gTxEthBuffer; QueETH.pTxD = gEthTxDs; QueETH.wTxBufferSize = ETH_BUFF_SIZE; QueETH.wTxSize = ETH_TX_BUFFERS; QueETH.pTxCb = gEthTxCbs; QueETH.pTxCbTag = gEthTxCbTags; QueAVB.bIsGem = 1; QueAVB.bDmaBurstLength = 4; QueAVB.pRxBuffer = gRxAvbBuffer; QueAVB.pRxD = gAvbRxDs; QueAVB.wRxBufferSize = AVB_BUFF_SIZE; QueAVB.wRxSize = AVB_RX_BUFFERS; QueAVB.pTxBuffer = gTxAvbBuffer; QueAVB.pTxD = gAvbTxDs; QueAVB.wTxBufferSize = AVB_BUFF_SIZE; QueAVB.wTxSize = AVB_TX_BUFFERS; QueAVB.pTxCb = gAvbTxCbs; QueAVB.pTxCbTag = gAvbTxCbTags; GMACD_Init(pGmacd, GMAC, ID_GMAC, GMAC_CAF_ENABLE, GMAC_NBC_DISABLE); GMACD_InitTransfer(pGmacd, &QueETH, GMAC_QUE_0); 
GMACD_InitTransfer(pGmacd, &QueAVB, GMAC_QUE_1); GMACD_InitTransfer(pGmacd, &QuePTP, GMAC_QUE_2); GMAC_SetTsuTmrIncReg(GMAC, TSU_INCR_NS, TSU_INCR_SUBNS); /* PTP events can only be registered to QUEUE0! */ /* The packets can be rerouted to any queue */ GMACD_RxPtpEvtMsgCBRegister (pGmacd, gmac_RxPtpEvtMsgIsrCB, PTP_QUEUE); GMACD_TxPtpEvtMsgCBRegister (pGmacd, gmac_TxPtpEvtMsgIsrCB, PTP_QUEUE); GMAC_EnableIt(GMAC, (GMAC_IER_SFR | GMAC_IER_PDRQFR | GMAC_IER_PDRSFR), PTP_QUEUE); GMAC_EnableIt(GMAC, (GMAC_IER_PDRQFT | GMAC_IER_PDRSFT), PTP_QUEUE ); /* QUE must match screener register configuration! */ GMACD_SetRxCallback(pGmacd, PtpDataReceived, GMAC_QUE_1); GMACD_RxPtpEvtMsgCBRegister(pGmacd, gmac_RxPtpEvtMsgIsrCB, PTP_QUEUE); GMACD_TxPtpEvtMsgCBRegister(pGmacd, gmac_TxPtpEvtMsgIsrCB, PTP_QUEUE); enum { AVB_ETHER_TYPE_REG_IDX = 0, PTP_ETHER_TYPE_REG_IDX, ARP_ETHER_TYPE_REG_IDX, unused, ETHER_TYPE_REG_MAX }; // AVB GMAC_WriteScreener2Reg(pGmacd->pHw, AVB_ETHER_TYPE_REG_IDX, ( GMAC_ST2RPQ_ETHE | GMAC_ST2RPQ_I2ETH(AVB_ETHER_TYPE_REG_IDX) | GMAC_ST2RPQ_QNB(GMAC_QUE_1) ) ); GMAC_WriteEthTypeReg(pGmacd->pHw, AVB_ETHER_TYPE_REG_IDX, GMAC_ST2ER_COMPVAL(AVB_ETHER_TYPE)); // PTP GMAC_WriteScreener2Reg(pGmacd->pHw, PTP_ETHER_TYPE_REG_IDX, ( GMAC_ST2RPQ_ETHE | GMAC_ST2RPQ_I2ETH(PTP_ETHER_TYPE_REG_IDX) | GMAC_ST2RPQ_QNB(GMAC_QUE_2) ) ); GMAC_WriteEthTypeReg(pGmacd->pHw, PTP_ETHER_TYPE_REG_IDX, GMAC_ST2ER_COMPVAL(PTP_ETHER_TYPE)); // ARP GMAC_WriteScreener2Reg(pGmacd->pHw, ARP_ETHER_TYPE_REG_IDX, ( GMAC_ST2RPQ_ETHE | GMAC_ST2RPQ_I2ETH(ARP_ETHER_TYPE_REG_IDX) | GMAC_ST2RPQ_QNB(GMAC_QUE_0) ) ); GMAC_WriteEthTypeReg(pGmacd->pHw, ARP_ETHER_TYPE_REG_IDX, GMAC_ST2ER_COMPVAL(ARP_ETHER_TYPE)); NVIC_ClearPendingIRQ(GMAC_IRQn); NVIC_EnableIRQ(GMAC_IRQn); NVIC_ClearPendingIRQ(GMACQ1_IRQn); NVIC_EnableIRQ(GMACQ1_IRQn); NVIC_ClearPendingIRQ(GMACQ2_IRQn); NVIC_EnableIRQ(GMACQ2_IRQn); if((CHIPID->CHIPID_CIDR & CHIPID_CIDR_VERSION_Msk) == 0x01) { //MRLB } } /** * Gmac interrupt handler */ void 
GMAC_Handler(void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_0); } void GMACQ1_Handler (void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_1); } void GMACQ2_Handler (void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_2); } void GMACQ3_Handler(void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_3); } void GMACQ4_Handler(void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_4); } void GMACQ5_Handler(void) { assert(NULL != spGmacd); GMACD_Handler(spGmacd, GMAC_QUE_5); } /* call back routine for PTP Queue */ static void PtpDataReceived(uint32_t status, void *pTag) { uint32_t buffIdx; uint32_t frmSize; uint8_t *msgPtr; ptpMsgType ptpMsg; pTag = pTag; TRACE_INFO("%u ETH_RXCB(%u)\n\r", (unsigned int)GetTicks(), (unsigned int)status); assert(NULL != spGmacd); while(GMACD_OK == GMACD_GetRxDIdx(spGmacd, &buffIdx, &frmSize, PTP_QUEUE)) { msgPtr = (uint8_t *)&gRxPtpBuffer[buffIdx * PTP_RX_BUFF_SIZE]; ptpMsg = (ptpMsgType)(msgPtr[14] & 0x0Fu); switch(ptpMsg) { case SYNC_MSG_TYPE: //gPtpRxdSyncMsg(msgPtr); break; case FOLLOW_UP_MSG_TYPE: //gPtpRxdFollowUpMsg(msgPtr); break; case PDELAY_REQ_TYPE: //gPtpRxdPdelayReqMsg(msgPtr); break; case PDELAY_RESP_TYPE: //gPtpRxdPdelayRespMsg(msgPtr); break; case PDELAY_RESP_FOLLOW_UP_MSG_TYPE: //gPtpRxdPdelayRespFollowUpMsg(msgPtr); break; default: break; }; /* switch (ptpMsg) */ GMACD_FreeRxDTail(spGmacd, PTP_QUEUE); } /* while () */ } static void gmac_RxPtpEvtMsgIsrCB(ptpMsgType rxEvtMsg, uint32_t efrsh, uint32_t efrsl, uint32_t efrn) { efrsh = efrsh; efrsl = efrsl; efrn = efrn; switch(rxEvtMsg) { case SYNC_MSG_TYPE: case PDELAY_REQ_TYPE: case PDELAY_RESP_TYPE: default: break; } } static void gmac_TxPtpEvtMsgIsrCB(ptpMsgType txEvtMsg, uint32_t eftsh, uint32_t eftsl, uint32_t eftn, uint16_t sequenceId) { eftsh = eftsh; eftsl = eftsl; eftn = eftn; sequenceId = sequenceId; switch(txEvtMsg) { case PDELAY_REQ_TYPE: case PDELAY_RESP_TYPE: break; case DELAY_REQ_MSG_TYPE: case 
FOLLOW_UP_MSG_TYPE: case DELAY_RESP_MSG_TYPE: default: break; } } <file_sep>/audio-source/samv71-ucs/libraries/console/Console.c /*------------------------------------------------------------------------------------------------*/ /* Console Print Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/
/*------------------------------------------------------------------------------------------------*/
#include <stdio.h>
#include <stdbool.h>
#include <stdlib.h>
#include <stdarg.h>
#include <string.h>
#include <assert.h>
#include "timetick.h"
#include "Console.h"

#define SEND_BUFFER         (4096)
#define ETHERNET_MAX_LEN    (1300)
#define SOURCE_PORT         (2033)
#define DESTINATION_PORT    (2033)
#define ETHERNET_HEADER     14u
#define IP_HEADER           20u
#define UDP_HEADER          8u
#define TOTAL_UDP_HEADER    (ETHERNET_HEADER + IP_HEADER + UDP_HEADER)
#define HB(value)           ((uint8_t)((uint16_t)(value) >> 8) & 0xFF)
#define LB(value)           ((uint8_t)(value) & 0xFF)

static bool initialized = false;
static ConsolePrio_t minPrio = PRIO_LOW;
static uint8_t ethBuffer[TOTAL_UDP_HEADER];
static char txBuffer[SEND_BUFFER];
static uint32_t txBufPosIn = 0;
static uint32_t txBufPosOut = 0;
static uint32_t txOverflow = 0;

static void InitUdpHeaders();
static bool SendUdp(uint8_t *pPayload, uint32_t payloadLen);

void ConsoleInit()
{
    InitUdpHeaders();
    initialized = true;
}

void ConsoleDeinit(void)
{
    initialized = false;
}

void ConsoleSetPrio(ConsolePrio_t prio)
{
    if (!initialized)
        return;
    minPrio = prio;
}

void ConsolePrintf(ConsolePrio_t prio, const char *statement, ...)
{
    va_list args;
    if (!initialized)
        return;
    if (prio < minPrio || NULL == statement)
        return;
    if (0 == txBufPosIn && 0 != txOverflow)
    {
        snprintf(txBuffer, sizeof(txBuffer), RED "!! UART TX overflowed %lu times, increase 'ETHERNET_MAX_LEN' !!" RESETCOLOR "\r\n", txOverflow);
        txBufPosIn = strlen(txBuffer);
        txOverflow = 0;
    }
    va_start(args, statement);
    vsnprintf(&txBuffer[txBufPosIn], (sizeof(txBuffer) - txBufPosIn), statement, args);
    va_end(args);
    txBufPosIn = strlen(txBuffer);
    assert(txBufPosIn < sizeof(txBuffer));
    if ((sizeof(txBuffer) - 1) == txBufPosIn)
    {
        ++txOverflow;
    }
    ConsoleCB_OnServiceNeeded();
}

void ConsoleService(void)
{
    if (!initialized)
        return;
    if (0 == txBufPosIn)
        return;
    do
    {
        uint32_t sendLen = txBufPosIn - txBufPosOut;
        if (sendLen > ETHERNET_MAX_LEN)
            sendLen = ETHERNET_MAX_LEN;
        assert(txBufPosIn >= txBufPosOut);
        if (SendUdp((uint8_t *)&txBuffer[txBufPosOut], sendLen))
        {
            txBufPosOut += sendLen;
            assert(txBufPosIn >= txBufPosOut);
            if (txBufPosIn == txBufPosOut)
            {
                txBufPosIn = 0;
                txBufPosOut = 0;
                break;
            }
        }
        else
        {
            ConsoleCB_OnServiceNeeded();
            break;
        }
    } while (true);
}

static void InitUdpHeaders()
{
    uint8_t *buff = ethBuffer;
    /* ---- Ethernet_HEADER ---- */
    //Destination MAC:
    /* 00 */ *buff++ = 0xffu;
    /* 01 */ *buff++ = 0xffu;
    /* 02 */ *buff++ = 0xffu;
    /* 03 */ *buff++ = 0xffu;
    /* 04 */ *buff++ = 0xffu;
    /* 05 */ *buff++ = 0xffu;
    //Source MAC:
    /* 06 */ *buff++ = 0x02u;
    /* 07 */ *buff++ = 0x00u;
    /* 08 */ *buff++ = 0x00u;
    /* 09 */ *buff++ = 0x01u;
    /* 10 */ *buff++ = 0x01u;
    /* 11 */ *buff++ = 0x01u;
    //Type
    /* 12 */ *buff++ = 0x08u;
    /* 13 */ *buff++ = 0x00u;
    /* ---- IP_HEADER ---- */
    /* 14 */ *buff++ = 0x45u; //Version
    /* 15 */ *buff++ = 0x00u; //Service Field
    /* 16, 17 will be filled by SendUdp() */ buff++; buff++;
    /* 18 */ *buff++ = 0x00u;
    /* 19 */ *buff++ = 0x0eu; //Identification
    /* 20 */ *buff++ = 0x00u;
    /* 21 */ *buff++ = 0x00u; //Flags & Fragment Offset
    /* 22 */ *buff++ = 0x40u; //TTL
    /* 23 */ *buff++ = 0x11u; //Protocol
    /* 24, 25 will be filled by SendUdp() */ buff++; buff++;
    //Source IP
    /* 26 */ *buff++ = 0x00u;
    /* 27 */ *buff++ = 0x00u;
    /* 28 */ *buff++ = 0x00u;
    /* 29 */ *buff++ = 0x00u;
    //Destination IP
    /* 30 */ *buff++ = 0xffu;
    /* 31 */ *buff++ = 0xffu;
    /* 32 */ *buff++ = 0xffu;
    /* 33
*/ *buff++ = 0xffu; //UDP_HEADER /* 34 */ *buff++ = HB(SOURCE_PORT); /* 35 */ *buff++ = LB(SOURCE_PORT); //Source Port /* 36 */ *buff++ = HB(DESTINATION_PORT); /* 37*/ *buff++ = LB(DESTINATION_PORT); //Destination Port /* 38, 39 will be filled by SendUdp() */ buff++; buff++; /* 40 */ *buff++ = 0x00u; /* 41 */ *buff++ = 0x00u; //Checksum (optional) } static bool SendUdp(uint8_t *pPayload, uint32_t payloadLen) { uint16_t ipLen = ETHERNET_HEADER + IP_HEADER + UDP_HEADER + payloadLen; uint32_t crcSum = 0ul; uint8_t crcCount = 0u; if (NULL == pPayload || 0 == payloadLen) return false; ethBuffer[16] = HB(ipLen - ETHERNET_HEADER); ethBuffer[17] = LB(ipLen - ETHERNET_HEADER); //Total Length //Header Checksum while (crcCount < 10u) { crcSum += (uint16_t)(ethBuffer[ETHERNET_HEADER + crcCount++] << 8u); crcSum += ethBuffer[ETHERNET_HEADER + crcCount++]; } crcSum = ~((crcSum & 0xfffful) + (crcSum >> 16u)); ethBuffer[24] = HB(crcSum); ethBuffer[25] = LB(crcSum); ethBuffer[38] = HB(UDP_HEADER + payloadLen); ethBuffer[39] = LB(UDP_HEADER + payloadLen); return ConsoleCB_SendDatagram(ethBuffer, sizeof(ethBuffer), pPayload, payloadLen); } <file_sep>/audio-source/samv71-ucs/libraries/console/Console.h /*------------------------------------------------------------------------------------------------*/ /* Console Print Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. 
Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ /*----------------------------------------------------------*/ /*! \file * \brief This file contains C-functions starting with "Console" to provide * process and thread safe access to the console output. */ /*----------------------------------------------------------*/ #ifndef _CONSOLE_H_ #define _CONSOLE_H_ #define ENABLE_COLOR #ifdef ENABLE_COLOR #define RESETCOLOR "\033[0m" #define GREEN "\033[0;32m" #define RED "\033[0;31m" #define YELLOW "\033[1;33m" #define BLUE "\033[0;34m" #else #define RESETCOLOR #define GREEN #define RED #define YELLOW #define BLUE #endif #include <stdbool.h> #include <stdint.h> #ifdef __cplusplus extern "C" { #endif typedef enum { PRIO_LOW = 0, PRIO_MEDIUM = 1, PRIO_HIGH = 2, PRIO_ERROR = 0xFF } ConsolePrio_t; /*----------------------------------------------------------*/ /*! 
\brief Initializes the resources needed to synchronize between processes and threads. * \note This function must be called before any other function of this component. * */ /*----------------------------------------------------------*/ void ConsoleInit( void ); /*----------------------------------------------------------*/ /*! \brief Destroys the resources needed to synchronize between processes and threads. * \note After this function, any other function (except ConsoleInit) must not be called. * */ /*----------------------------------------------------------*/ void ConsoleDeinit( void ); /*----------------------------------------------------------*/ /*! \brief Sets the minimum priority to be displayed. Lower priority messages are discarded * \param prio - The minimum priority to display */ /*----------------------------------------------------------*/ void ConsoleSetPrio( ConsolePrio_t prio ); /*----------------------------------------------------------*/ /*! \brief Uses the board specific PRINT mechanism and provides thread and process safety. * */ /*----------------------------------------------------------*/ void ConsolePrintf( ConsolePrio_t prio, const char *statement, ... ) __attribute__ ((format (gnu_printf, 2, 3))); /*----------------------------------------------------------*/ /*! \brief Call this function in main()-context after ConsoleCB_OnServiceNeeded was raised. * */ /*----------------------------------------------------------*/ void ConsoleService( void ); /*----------------------------------------------------------*/ /*! \brief Callback to integrators code whenever this component needs to be serviced. * \note Do not call ConsoleService() inside this function! Set flag, and call in main()-context. * */ /*----------------------------------------------------------*/ extern void ConsoleCB_OnServiceNeeded(void); /*----------------------------------------------------------*/ /*! \brief Callback to integrators code in order to send Ethernet datagram. 
* \return Integrator shall return true, if packet was enqueued successful. false, otherwise. * \note Buffer is not valid after this call. * */ /*----------------------------------------------------------*/ extern bool ConsoleCB_SendDatagram( uint8_t *pEthHeader, uint32_t ethLen, uint8_t *pPayload, uint32_t payloadLen ); #ifdef __cplusplus } #endif #endif //_CONSOLE_H_ <file_sep>/audio-source/samv71-ucs/src/task-unicens.c /*------------------------------------------------------------------------------------------------*/ /* UNICENS Daemon Task Implementation */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #include <stdio.h> #include <stdint.h> #include <stdbool.h> #include <stdlib.h> #include <unistd.h> #include <string.h> #include <assert.h> #include "Console.h" #include "ucsi_api.h" #include "default_config.h" #include "timetick.h" #include "dim2_lld.h" #include "task-unicens.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* USER ADJUSTABLE */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ #define ENABLE_PROMISCOUS_MODE (true) #define DEBUG_TABLE_PRINT_TIME_MS (250) /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* DEFINES AND LOCAL VARIABLES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ typedef struct { DIM2LLD_ChannelType_t cType; DIM2LLD_ChannelDirection_t dir; uint8_t instance; uint16_t channelAddress; uint16_t bufferSize; uint16_t subSize; uint16_t numberOfBuffers; uint16_t bufferOffset; } DIM2_Setup_t; typedef struct { bool allowRun; bool lldTrace; bool noRouteTable; UCSI_Data_t unicens; bool unicensRunning; uint32_t unicensTimeout; bool unicensTrigger; bool promiscuousMode; bool amsReceived; } LocalVar_t; static LocalVar_t m; static DIM2_Setup_t mlbConfig[] = { { .cType = DIM2LLD_ChannelType_Control, .dir = DIM2LLD_ChannelDirection_RX, .instance = 0, .channelAddress = 2, .bufferSize = 72, .subSize 
= 0,
        .numberOfBuffers = 8,
        .bufferOffset = 0
    },
    {
        .cType = DIM2LLD_ChannelType_Control,
        .dir = DIM2LLD_ChannelDirection_TX,
        .instance = 0,
        .channelAddress = 4,
        .bufferSize = 72,
        .subSize = 0,
        .numberOfBuffers = 8,
        .bufferOffset = 0
    },
    {
        .cType = DIM2LLD_ChannelType_Async,
        .dir = DIM2LLD_ChannelDirection_RX,
        .instance = 0,
        .channelAddress = 6,
        .bufferSize = 1522,
        .subSize = 0,
        .numberOfBuffers = 8,
        .bufferOffset = 0
    },
    {
        .cType = DIM2LLD_ChannelType_Async,
        .dir = DIM2LLD_ChannelDirection_TX,
        .instance = 0,
        .channelAddress = 8,
        .bufferSize = 1522,
        .subSize = 0,
        .numberOfBuffers = 8,
        .bufferOffset = 0
    },
    {
        .cType = DIM2LLD_ChannelType_Sync,
        .dir = DIM2LLD_ChannelDirection_TX,
        .instance = 0,
        .channelAddress = 10,
        .bufferSize = 512,
        .subSize = 4,
        .numberOfBuffers = 4,
        .bufferOffset = 0
    }
};
static const uint32_t mlbConfigSize = sizeof(mlbConfig) / sizeof(DIM2_Setup_t);

/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
/*                      PRIVATE FUNCTION PROTOTYPES                     */
/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
static void ServiceMostCntrlRx(void);

/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
/*                           PUBLIC FUNCTIONS                           */
/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
bool TaskUnicens_Init(void)
{
    m.promiscuousMode = ENABLE_PROMISCOUS_MODE;
    // Initialize MOST DIM2 driver
    DIM2LLD_Init();
    Wait(100);
    while (!DIM2LLD_IsMlbLocked())
    {
        ConsolePrintf(PRIO_ERROR, RED "MLB is not locked!"
RESETCOLOR "\r\n"); Wait(1000); } for (uint32_t i = 0; i < mlbConfigSize; i++) { if (!DIM2LLD_SetupChannel(mlbConfig[i].cType, mlbConfig[i].dir, mlbConfig[i].instance, mlbConfig[i].channelAddress, mlbConfig[i].bufferSize, mlbConfig[i].subSize, mlbConfig[i].numberOfBuffers, mlbConfig[i].bufferOffset)) { ConsolePrintf(PRIO_ERROR, "Failed to allocate MLB channel with address=0x%X\r\n", mlbConfig[i].channelAddress); assert(false); return false; } } /* Initialize UNICENS */ UCSI_Init(&m.unicens, &m); if (!UCSI_NewConfig(&m.unicens, PacketBandwidth, AllRoutes, RoutesSize, AllNodes, NodeSize)) { ConsolePrintf(PRIO_ERROR, RED "Could not enqueue new UNICENS config" RESETCOLOR "\r\n"); assert(false); return false; } m.allowRun = true; return true; } void TaskUnicens_Service(void) { uint32_t now; if (!m.allowRun) return; ServiceMostCntrlRx(); now = GetTicks(); /* UNICENS Service */ if (m.unicensTrigger) { m.unicensTrigger = false; UCSI_Service(&m.unicens); } if (0 != m.unicensTimeout && now >= m.unicensTimeout) { m.unicensTimeout = 0; UCSI_Timeout(&m.unicens); } if (m.amsReceived) { uint16_t amsId = 0xFFFF; uint16_t sourceAddress = 0xFFFF; uint8_t *pBuf = NULL; uint32_t len = 0; m.amsReceived = false; if (UCSI_GetAmsMessage(&m.unicens, &amsId, &sourceAddress, &pBuf, &len)) { ConsolePrintf(PRIO_HIGH, "Received AMS, id=0x%X, source=0x%X, len=%lu\r\n", amsId, sourceAddress, len); UCSI_ReleaseAmsMessage(&m.unicens); } else assert(false); } } bool TaskUnicens_SetRouteActive(uint16_t routeId, bool isActive) { return UCSI_SetRouteActive(&m.unicens, routeId, isActive); } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PRIVATE FUNCTION IMPLEMENTATIONS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ static void ServiceMostCntrlRx(void) { uint16_t bufLen; const uint8_t *pBuf; DIM2LLD_Service(); do { bufLen = DIM2LLD_GetRxData(DIM2LLD_ChannelType_Control, DIM2LLD_ChannelDirection_RX, 0, 0, &pBuf, NULL, NULL); if (0 != bufLen) { 
if (m.unicensRunning)
            {
                if (m.lldTrace)
                {
                    ConsolePrintf(PRIO_HIGH, BLUE "%08lu MSG_RX(%d): ", GetTicks(), bufLen);
                    for (int16_t i = 0; i < bufLen; i++)
                    {
                        ConsolePrintf(PRIO_HIGH, "%02X ", pBuf[i]);
                    }
                    ConsolePrintf(PRIO_HIGH, RESETCOLOR "\n");
                }
                if (!UCSI_ProcessRxData(&m.unicens, pBuf, bufLen))
                {
                    ConsolePrintf(PRIO_ERROR, "RX buffer overflow\r\n");
                    /* UNICENS is busy. Try to reactivate it by calling the service routine. */
                    m.unicensTrigger = true;
                    break;
                }
            }
            DIM2LLD_ReleaseRxData(DIM2LLD_ChannelType_Control, DIM2LLD_ChannelDirection_RX, 0);
        }
    } while (0 != bufLen);
}

/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
/*                    CALLBACK FUNCTIONS FROM UNICENS                   */
/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
void UCSI_CB_OnCommandResult(void *pTag, UnicensCmd_t command, bool success, uint16_t nodeAddress)
{
    if (!success)
        ConsolePrintf(PRIO_ERROR, RED "OnCommandResult, cmd=0x%X, node=0x%X failed" RESETCOLOR "\r\n", command, nodeAddress);
}

uint16_t UCSI_CB_OnGetTime(void *pTag)
{
    return GetTicks();
}

void UCSI_CB_OnSetServiceTimer(void *pTag, uint16_t timeout)
{
    if (0 == timeout)
        m.unicensTimeout = 0;
    else
        m.unicensTimeout = GetTicks() + timeout;
}

void UCSI_CB_OnNetworkState(void *pTag, bool isAvailable, uint16_t packetBandwidth, uint8_t amountOfNodes)
{
    pTag = pTag;
    ConsolePrintf(PRIO_HIGH, YELLOW "Network isAvailable=%s, packetBW=%d, nodeCount=%d" RESETCOLOR "\r\n",
        isAvailable ? "yes" : "no", packetBandwidth, amountOfNodes);
}

void UCSI_CB_OnUserMessage(void *pTag, bool isError, const char format[], uint16_t vargsCnt, ...)
{ va_list argptr; char outbuf[300]; pTag = pTag; va_start(argptr, vargsCnt); vsnprintf(outbuf, sizeof(outbuf), format, argptr); va_end(argptr); if (isError) ConsolePrintf(PRIO_ERROR, RED "%s" RESETCOLOR "\r\n", outbuf); else ConsolePrintf(PRIO_LOW, "%s\r\n", outbuf); } void UCSI_CB_OnPrintRouteTable(void *pTag, const char pString[]) { ConsolePrintf(PRIO_HIGH, "%s\r\n", pString); } void UCSI_CB_OnServiceRequired(void *pTag) { m.unicensTrigger = true; } void UCSI_CB_OnResetInic(void *pTag) { } void UCSI_CB_OnTxRequest(void *pTag, const uint8_t *pPayload, uint32_t payloadLen) { pTag = pTag; assert(pTag == &m); uint8_t *pBuf = NULL; if (m.lldTrace) { ConsolePrintf( PRIO_HIGH, BLUE "%08lu MSG_TX(%lu): ", GetTicks(), payloadLen); for ( uint32_t i = 0; i < payloadLen; i++ ) { ConsolePrintf( PRIO_HIGH, "%02X ", pPayload[i] ); } ConsolePrintf(PRIO_HIGH, RESETCOLOR "\n"); } uint32_t txMaxLen = 0; while (0 == txMaxLen) { txMaxLen = DIM2LLD_GetTxData(DIM2LLD_ChannelType_Control, DIM2LLD_ChannelDirection_TX, 0, &pBuf); } if (NULL == pBuf || txMaxLen < payloadLen) { ConsolePrintf(PRIO_ERROR, RED "UCSI_CB_SendMostMessage buffer is too small! %lu < %lu" RESETCOLOR "\r\n", txMaxLen, payloadLen); assert(false); return; } memcpy(pBuf, pPayload, payloadLen); DIM2LLD_SendTxData(DIM2LLD_ChannelType_Control, DIM2LLD_ChannelDirection_TX, 0, payloadLen); } void UCSI_CB_OnStart(void *pTag) { m.unicensRunning = true; } void UCSI_CB_OnStop(void *pTag) { m.unicensRunning = false; } void UCSI_CB_OnAmsMessageReceived(void *pTag) { m.amsReceived = true; } void UCSI_CB_OnRouteResult(void *pTag, uint16_t routeId, bool isActive, uint16_t connectionLabel) { ConsolePrintf(PRIO_MEDIUM, "Route id=0x%X isActive=%s ConLabel=0x%X\r\n", routeId, (isActive ? 
"true" : "false"), connectionLabel); TaskUnicens_CB_OnRouteResult(routeId, isActive, connectionLabel); } void UCSI_CB_OnGpioStateChange(void *pTag, uint16_t nodeAddress, uint8_t gpioPinId, bool isHighState) { ConsolePrintf(PRIO_HIGH, "GPIO state changed, nodeAddress=0x%X, gpioPinId=%d, isHighState=%s\r\n", nodeAddress, gpioPinId, isHighState ? "yes" : "no"); } void UCSI_CB_OnI2CRead(void *pTag, bool success, uint16_t targetAddress, uint8_t slaveAddr, const uint8_t *pBuffer, uint32_t bufLen) { if (!success) ConsolePrintf(PRIO_ERROR, RED "I2C read failed, node=0x%X" RESETCOLOR "\r\n", targetAddress); } void UCSI_CB_OnMgrReport(void *pTag, Ucs_MgrReport_t code, Ucs_Signature_t *signature, Ucs_Rm_Node_t *pNode) { pTag = pTag; if (m.promiscuousMode && NULL != signature && UCS_MGR_REP_AVAILABLE == code) { uint16_t targetAddr = signature->node_address; UCSI_EnablePromiscuousMode(&m.unicens, targetAddr, true); } } void UCSI_CB_OnProgrammingDone(void *pTag, bool changed) { pTag = pTag; if (changed) ConsolePrintf(PRIO_HIGH, YELLOW "Programming finished, terminating program.." RESETCOLOR "\r\n"); else ConsolePrintf(PRIO_HIGH, YELLOW "No programming needed (no collision), terminating program.." RESETCOLOR "\r\n"); exit(0); } <file_sep>/audio-source/samv71-ucs/src/task-unicens.h /*------------------------------------------------------------------------------------------------*/ /* UNICENS Daemon Task Implementation */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. 
Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #ifndef TASK_UNICENS_H_ #define TASK_UNICENS_H_ #ifdef __cplusplus extern "C" { #endif /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public API */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Initializes the UNICENS Task * \note Must be called before any other function of this component * \return true, if initialization was successful. 
false otherwise; in that case, do not call any other function of this component
 */
bool TaskUnicens_Init(void);

/**
 * \brief Gives the UNICENS Task time to maintain its service routines
 */
void TaskUnicens_Service(void);

/**
 * \brief Enables or disables a route by the given routeId
 *
 * \param routeId - identifier as given in XML file along with MOST socket (unique)
 * \param isActive - true, route will become active. false, route will be deallocated
 *
 * \return true, if route was found and the specific command was enqueued to UNICENS.
 */
bool TaskUnicens_SetRouteActive(uint16_t routeId, bool isActive);

/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
/*                           CALLBACK SECTION                           */
/*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/
/**
 * \brief Callback when a route becomes active / inactive.
 * \note This function must be implemented by the integrator
 * \param routeId - identifier as given in XML file along with MOST socket (unique)
 * \param isActive - true, if the route is now in use. false, the route is not established.
 * \param connectionLabel - The connection label used on the Network. Only valid, if isActive=true
 */
extern void TaskUnicens_CB_OnRouteResult(uint16_t routeId, bool isActive, uint16_t connectionLabel);

#ifdef __cplusplus
}
#endif

#endif /* TASK_UNICENS_H_ */<file_sep>/audio-source/samv71-ucs/utils/utility.h
/* ----------------------------------------------------------------------------
 * SAM Software Package License
 * ----------------------------------------------------------------------------
 * Copyright (c) 2014, Atmel Corporation
 *
 * All rights reserved.
 *
 * Redistribution and use in source and binary forms, with or without
 * modification, are permitted provided that the following conditions are met:
 *
 * - Redistributions of source code must retain the above copyright notice,
 * this list of conditions and the disclaimer below.
* * Atmel's name may not be used to endorse or promote products derived from * this software without specific prior written permission. * * DISCLAIMER: THIS SOFTWARE IS PROVIDED BY ATMEL "AS IS" AND ANY EXPRESS OR * IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF * MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT ARE * DISCLAIMED. IN NO EVENT SHALL ATMEL BE LIABLE FOR ANY DIRECT, INDIRECT, * INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT * LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, * OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF * LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING * NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, * EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. * ---------------------------------------------------------------------------- */ #ifndef UTILITY_H #define UTILITY_H #include "chip.h" #define RESET_CYCLE_COUNTER() do { \ CoreDebug->DEMCR = CoreDebug_DEMCR_TRCENA_Msk; \ __DSB(); DWT->LAR = 0xC5ACCE55; __DSB(); \ DWT->CTRL &= ~DWT_CTRL_CYCCNTENA_Msk; \ DWT->CYCCNT = 0; \ DWT->CTRL = DWT_CTRL_CYCCNTENA_Msk; \ }while(0) #define GET_CYCLE_COUNTER(x) x=DWT->CYCCNT; #define LockMutex(mut, timeout) get_lock(&mut, 1, &timeout) #define ReleaseMutex(mut) free_lock(&mut) #define GetResource(mut, max, timeout) get_lock(&mut, max, &timeout) #define FreeResource(mut) free_lock(&mut) __STATIC_INLINE uint8_t Is_LockFree(volatile uint8_t *Lock_Variable) { /* return Variable value*/ return __LDREXB(Lock_Variable); } __STATIC_INLINE uint8_t get_lock(volatile uint8_t *Lock_Variable, const uint8_t maxValue, volatile uint32_t *pTimeout) { while (*pTimeout) { if(__LDREXB(Lock_Variable) < maxValue) { /* Set the Variable */ while( __STREXB(((*Lock_Variable) + 1), Lock_Variable) ) { if(!(*pTimeout)--) { return 1; // quit if timeout } } /* Memory access barrier */ __DMB(); 
TRACE_DEBUG("Mutex locked "); return 0; } ((*pTimeout)--); } return 1; } __STATIC_INLINE uint8_t free_lock(volatile uint8_t *Lock_Variable) { /* Memory access barrier Ensure memory operations completed before releasing lock */ __DSB(); if(__LDREXB(Lock_Variable)) { __STREXB( ((*Lock_Variable) - 1), Lock_Variable); TRACE_DEBUG("Mutex freed "); __DSB(); __DMB(); // Ensure memory operations completed before return 0; } else { return 1; } } #endif /* UTILITY_H */ <file_sep>/audio-source/samv71-ucs/src/driver/dim2/dim2_lld.c /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */ /* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */ /* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */ /* */ /* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. 
*/ /*------------------------------------------------------------------------------------------------*/ #include <assert.h> #include <stddef.h> #include <stdlib.h> #include <string.h> #include "ringbuffer.h" #include "dim2_hal.h" #include "dim2_lld.h" #include "dim2_hardware.h" //USE CASE SPECIFIC: //Depending on this value, different buffer sizes must be used for synchronous streaming (ask for helper tool): #define FCNT_VAL (5) //How many RX and TX pairs are available for sync / isoc use case #define MAX_CHANNEL_INSTANCES (4) //Enable to debug /* #define LLD_TRACE */ #ifdef LLD_TRACE /* #define LLD_TRACE_IGNORE_RX */ /* #define LLD_TRACE_IGNORE_TX */ #define LLD_TRACE_IGNORE_CONTROL #define LLD_TRACE_IGNORE_SYNC /* #define LLD_TRACE_IGNORE_ASYNC */ #define LLD_TRACE_IGNORE_ISOC #include "Console.h" #endif //Fixed values: #define DMA_CHANNELS (32 - 1) /* channel 0 is a system channel */ typedef struct { bool hwEnqueued; int16_t payloadLen; int16_t maxPayloadLen; uint16_t offset; uint8_t *buffer; uint8_t packetCounter; } QueueEntry_t; typedef struct { bool channelUsed; RingBuffer_t *ringBuffer; struct dim_channel *dimChannel; uint16_t amountOfEntries; QueueEntry_t *workingStruct; DIM2LLD_ChannelType_t cType; DIM2LLD_ChannelDirection_t dir; uint8_t lastPacketCount; } ChannelContext_t; typedef struct { bool initialized; ChannelContext_t controlLookupTable[DIM2LLD_ChannelDirection_BOUNDARY]; ChannelContext_t asyncLookupTable[DIM2LLD_ChannelDirection_BOUNDARY]; ChannelContext_t syncLookupTable[DIM2LLD_ChannelDirection_BOUNDARY][MAX_CHANNEL_INSTANCES]; ChannelContext_t isocLookupTable[DIM2LLD_ChannelDirection_BOUNDARY][MAX_CHANNEL_INSTANCES]; ///Zero-terminated list of all dim channels. This is used by the ISR routine.
struct dim_channel *allChannels[DMA_CHANNELS]; } LocalVar_t; static LocalVar_t lc = { 0 }; static void ExecuteLLDTrace(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, const uint8_t *buffer, uint16_t payloadLen) { #ifdef LLD_TRACE do { const char *pTypeStr = "Unknown"; const char *pDirStr = "Unknown"; int16_t l; #ifdef LLD_TRACE_IGNORE_TX if (DIM2LLD_ChannelDirection_TX == dir) break; #endif #ifdef LLD_TRACE_IGNORE_RX if (DIM2LLD_ChannelDirection_RX == dir) break; #endif #ifdef LLD_TRACE_IGNORE_CONTROL if (DIM2LLD_ChannelType_Control == cType) break; #endif #ifdef LLD_TRACE_IGNORE_SYNC if (DIM2LLD_ChannelType_Sync == cType) break; #endif #ifdef LLD_TRACE_IGNORE_ASYNC if (DIM2LLD_ChannelType_Async == cType) break; #endif #ifdef LLD_TRACE_IGNORE_ISOC if (DIM2LLD_ChannelType_Isoc == cType) break; #endif switch(dir) { case DIM2LLD_ChannelDirection_TX: pDirStr = YELLOW "TX" RESETCOLOR; break; case DIM2LLD_ChannelDirection_RX: pDirStr = GREEN "RX" RESETCOLOR; break; case DIM2LLD_ChannelDirection_BOUNDARY: default: assert(false); } switch(cType) { case DIM2LLD_ChannelType_Control: pTypeStr = "Control"; break; case DIM2LLD_ChannelType_Async: pTypeStr = "Async"; break; case DIM2LLD_ChannelType_Sync: pTypeStr = "Sync"; break; case DIM2LLD_ChannelType_Isoc: pTypeStr = "Isoc"; break; case DIM2LLD_ChannelType_BOUNDARY: default: assert(false); } ConsolePrintf(PRIO_HIGH, "DIM2LLD_%s (%s): [", pDirStr, pTypeStr); for (l = 0; l < payloadLen; l++) { ConsolePrintf(PRIO_HIGH, " %02X", buffer[l]); } ConsolePrintf(PRIO_HIGH, " ]\r\n"); } while(0); #endif } //This must be adapted use case specific static ChannelContext_t *GetDimContext(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance) { assert(cType < DIM2LLD_ChannelType_BOUNDARY); assert(dir < DIM2LLD_ChannelDirection_BOUNDARY); assert(instance < MAX_CHANNEL_INSTANCES); if (cType >= DIM2LLD_ChannelType_BOUNDARY || dir >= DIM2LLD_ChannelDirection_BOUNDARY || instance >= MAX_CHANNEL_INSTANCES) 
return NULL; switch (cType) { case DIM2LLD_ChannelType_Control: if (0 != instance) return NULL; return &lc.controlLookupTable[dir]; case DIM2LLD_ChannelType_Async: if (0 != instance) return NULL; return &lc.asyncLookupTable[dir]; case DIM2LLD_ChannelType_Sync: return &lc.syncLookupTable[dir][instance]; case DIM2LLD_ChannelType_Isoc: return &lc.isocLookupTable[dir][instance]; default: break; } return NULL; } static void CleanUpContext(ChannelContext_t *context) { uint16_t i; assert(NULL != context); if (!context->channelUsed) return; context->channelUsed = false; context->cType = DIM2LLD_ChannelType_BOUNDARY; context->dir = DIM2LLD_ChannelDirection_BOUNDARY; if (NULL != context->ringBuffer) { RingBuffer_Deinit(context->ringBuffer); /* deinitialize before freeing to avoid use-after-free */ free(context->ringBuffer); context->ringBuffer = NULL; } if (NULL != context->dimChannel) { free(context->dimChannel); context->dimChannel = NULL; } if (NULL != context->workingStruct) { for (i = 0; i < context->amountOfEntries; i++) if (NULL != context->workingStruct[i].buffer) free(context->workingStruct[i].buffer); free(context->workingStruct); context->workingStruct = NULL; } } static bool AddDimChannelToIsrList(struct dim_channel *ch) { uint8_t i; bool added = false; assert(NULL != ch); for (i = 0; i < DMA_CHANNELS; i++) { if (lc.allChannels[i] != NULL) continue; lc.allChannels[i] = ch; added = true; break; } assert(added); return added; } bool DIM2LLD_Init(void) { assert(!lc.initialized); memset(&lc, 0, sizeof(lc)); lc.initialized = true; enable_mlb_clock(); initialize_mlb_pins(); disable_mlb_interrupt(); if (DIM_NO_ERROR != dim_startup(DIM2_BASE_ADDRESS, DIM2_MLB_SPEED, FCNT_VAL)) return false; enable_mlb_interrupt(); return true; } bool DIM2LLD_SetupChannel(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint16_t channelAddress, uint16_t bufferSize, uint16_t subSize, uint16_t numberOfBuffers, uint16_t bufferOffset) { uint8_t result; uint16_t i; ChannelContext_t *context; assert(lc.initialized); if (!lc.initialized) return false; if (DIM2LLD_ChannelDirection_TX
== dir) bufferOffset = 0; context = GetDimContext(cType, dir, instance); if (NULL == context) return false; CleanUpContext(context); context->channelUsed = true; context->amountOfEntries = numberOfBuffers; switch (cType) { case DIM2LLD_ChannelType_Control: case DIM2LLD_ChannelType_Async: bufferSize = dim_norm_ctrl_async_buffer_size(bufferSize); break; case DIM2LLD_ChannelType_Sync: bufferSize = dim_norm_sync_buffer_size(bufferSize, subSize); break; case DIM2LLD_ChannelType_Isoc: bufferSize = dim_norm_isoc_buffer_size(bufferSize, subSize); break; default: assert(false); return false; } if (0 == bufferSize) return false; context->cType = cType; context->dir = dir; context->workingStruct = (QueueEntry_t *)calloc(numberOfBuffers, sizeof(QueueEntry_t)); for (i = 0; i < numberOfBuffers; i++) { context->workingStruct[i].offset = bufferOffset; context->workingStruct[i].maxPayloadLen = bufferSize; context->workingStruct[i].buffer = (uint8_t *)calloc(bufferSize + bufferOffset, sizeof(uint8_t)); assert(NULL != context->workingStruct[i].buffer); } context->ringBuffer = (RingBuffer_t *)calloc(1, sizeof(RingBuffer_t)); RingBuffer_Init(context->ringBuffer, numberOfBuffers, sizeof(QueueEntry_t), context->workingStruct); context->dimChannel = calloc(1, sizeof(struct dim_channel)); assert(NULL != context->workingStruct && NULL != context->ringBuffer && NULL != context->dimChannel); AddDimChannelToIsrList(context->dimChannel); disable_mlb_interrupt(); switch (cType) { case DIM2LLD_ChannelType_Control: result = dim_init_control(context->dimChannel, (DIM2LLD_ChannelDirection_TX == dir), channelAddress, bufferSize); enable_mlb_interrupt(); return (DIM_NO_ERROR == result); case DIM2LLD_ChannelType_Async: result = dim_init_async(context->dimChannel, (DIM2LLD_ChannelDirection_TX == dir), channelAddress, bufferSize); enable_mlb_interrupt(); return (DIM_NO_ERROR == result); case DIM2LLD_ChannelType_Sync: result = dim_init_sync(context->dimChannel, (DIM2LLD_ChannelDirection_TX == dir), 
channelAddress, subSize); enable_mlb_interrupt(); return (DIM_NO_ERROR == result); case DIM2LLD_ChannelType_Isoc: result = dim_init_isoc(context->dimChannel, (DIM2LLD_ChannelDirection_TX == dir), channelAddress, subSize); enable_mlb_interrupt(); return (DIM_NO_ERROR == result); default: enable_mlb_interrupt(); assert(false); return false; } } void DIM2LLD_Deinit(void) { uint16_t i; assert(lc.initialized); if (!lc.initialized) return; for (i = 0; i < (sizeof(lc.controlLookupTable) / sizeof(ChannelContext_t)); i++) CleanUpContext(&lc.controlLookupTable[i]); for (i = 0; i < (sizeof(lc.asyncLookupTable) / sizeof(ChannelContext_t)); i++) CleanUpContext(&lc.asyncLookupTable[i]); for (i = 0; i < (sizeof(lc.syncLookupTable) / sizeof(ChannelContext_t)); i++) CleanUpContext((ChannelContext_t *)&lc.syncLookupTable[i]); for (i = 0; i < (sizeof(lc.isocLookupTable) / sizeof(ChannelContext_t)); i++) CleanUpContext((ChannelContext_t *)&lc.isocLookupTable[i]); disable_mlb_interrupt(); dim_shutdown(); lc.initialized = false; } static void ServiceTxChannel(ChannelContext_t *context) { uint32_t amountTx, i; uint16_t done_buffers; struct dim_ch_state_t st = { 0 }; QueueEntry_t *entry; assert(lc.initialized); if (!lc.initialized) return; if (NULL == context || !context->channelUsed) return; assert(NULL != context->ringBuffer); assert(NULL != context->dimChannel); assert(DIM2LLD_ChannelDirection_TX == context->dir); disable_mlb_interrupt(); dim_service_channel(context->dimChannel); enable_mlb_interrupt(); //Try to release elements from hardware buffer: done_buffers = dim_get_channel_state(context->dimChannel, &st)->done_buffers; if (0 != done_buffers) { disable_mlb_interrupt(); dim_detach_buffers(context->dimChannel, done_buffers); enable_mlb_interrupt(); } for (i = 0; i < done_buffers; i++) RingBuffer_PopReadPtr(context->ringBuffer); //Try to enqueue new elements into hardware buffer: amountTx = RingBuffer_GetReadElementCount(context->ringBuffer); for (i = 0; i < amountTx; i++) { if 
(!dim_get_channel_state(context->dimChannel, &st)->ready) break; entry = (QueueEntry_t *)RingBuffer_GetReadPtrPos(context->ringBuffer, i); assert(NULL != entry); if (NULL == entry || 0 == entry->payloadLen || entry->hwEnqueued) continue; disable_mlb_interrupt(); if (dim_dbr_space(context->dimChannel) < entry->payloadLen) { enable_mlb_interrupt(); break; } if (dim_enqueue_buffer(context->dimChannel, (uint32_t)entry->buffer, entry->payloadLen)) { enable_mlb_interrupt(); entry->hwEnqueued = true; ExecuteLLDTrace(context->cType, context->dir, entry->buffer, entry->payloadLen); } else { enable_mlb_interrupt(); break; } } } static void ServiceRxChannel(ChannelContext_t *context) { int32_t amountRx, i; uint16_t done_buffers; struct dim_ch_state_t st = { 0 }; QueueEntry_t *entry; assert(lc.initialized); if (!lc.initialized) return; if (NULL == context || !context->channelUsed) return; assert(DIM2LLD_ChannelDirection_RX == context->dir); disable_mlb_interrupt(); dim_service_channel(context->dimChannel); enable_mlb_interrupt(); //Enqueue empty buffers into hardware while (NULL != (entry = (QueueEntry_t *)RingBuffer_GetWritePtr( context->ringBuffer))) { if (!dim_get_channel_state(context->dimChannel, &st)->ready) break; disable_mlb_interrupt(); if (dim_enqueue_buffer(context->dimChannel, (uint32_t)&entry->buffer[entry->offset], entry->maxPayloadLen)) { enable_mlb_interrupt(); entry->hwEnqueued = true; entry->packetCounter = context->lastPacketCount++; RingBuffer_PopWritePtr(context->ringBuffer); } else { enable_mlb_interrupt(); break; } } //Handle filled RX buffers done_buffers = dim_get_channel_state(context->dimChannel, &st)->done_buffers; if (0 != done_buffers) { amountRx = RingBuffer_GetReadElementCount(context->ringBuffer); assert(done_buffers <= amountRx); for (i = 0; i < done_buffers && i < amountRx; i++) { entry = (QueueEntry_t *)RingBuffer_GetReadPtrPos(context->ringBuffer, i); assert(NULL != entry); if (NULL == entry || !entry->hwEnqueued) continue; if 
(DIM2LLD_ChannelType_Control == context->cType || DIM2LLD_ChannelType_Async == context->cType) entry->payloadLen = (uint16_t)entry->buffer[entry->offset] * 256 + entry->buffer[entry->offset + 1] + 2; else entry->payloadLen = entry->maxPayloadLen; assert(entry->payloadLen <= entry->maxPayloadLen); ExecuteLLDTrace(context->cType, context->dir, entry->buffer, entry->payloadLen); entry->hwEnqueued = false; disable_mlb_interrupt(); dim_detach_buffers(context->dimChannel, 1); enable_mlb_interrupt(); } } } void DIM2LLD_Service(void) { uint16_t i; assert(lc.initialized); if (!lc.initialized) return; //Handle TX channels ServiceTxChannel(&lc.controlLookupTable[DIM2LLD_ChannelDirection_TX]); ServiceTxChannel(&lc.asyncLookupTable[DIM2LLD_ChannelDirection_TX]); for (i = 0; i < MAX_CHANNEL_INSTANCES; i++) { ServiceTxChannel(&lc.syncLookupTable[DIM2LLD_ChannelDirection_TX][i]); ServiceTxChannel(&lc.isocLookupTable[DIM2LLD_ChannelDirection_TX][i]); } //Handle RX channels ServiceRxChannel(&lc.controlLookupTable[DIM2LLD_ChannelDirection_RX]); ServiceRxChannel(&lc.asyncLookupTable[DIM2LLD_ChannelDirection_RX]); for (i = 0; i < MAX_CHANNEL_INSTANCES; i++) { ServiceRxChannel(&lc.syncLookupTable[DIM2LLD_ChannelDirection_RX][i]); ServiceRxChannel(&lc.isocLookupTable[DIM2LLD_ChannelDirection_RX][i]); } } bool DIM2LLD_IsMlbLocked(void) { assert(lc.initialized); if (!lc.initialized) return false; return dim_get_lock_state(); } uint32_t DIM2LLD_GetQueueElementCount(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance) { ChannelContext_t *context; if (!lc.initialized) return 0; context = GetDimContext(cType, dir, instance); if (NULL == context || !context->channelUsed) return 0; return RingBuffer_GetReadElementCount(context->ringBuffer); } uint16_t DIM2LLD_GetRxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint32_t pos, const uint8_t **pBuffer, uint16_t *pOffset, uint8_t *pPacketCounter) { ChannelContext_t *context; QueueEntry_t 
*entry; if (!lc.initialized) return 0; if (NULL == pBuffer) return 0; *pBuffer = NULL; if (NULL != pOffset) *pOffset = 0; if (NULL != pPacketCounter) *pPacketCounter = 0; context = GetDimContext(cType, dir, instance); if (NULL == context || !context->channelUsed) return 0; assert(context->cType == cType && context->dir == dir && NULL != context->ringBuffer); entry = (QueueEntry_t *)RingBuffer_GetReadPtrPos(context->ringBuffer, pos); if (NULL == entry || entry->hwEnqueued) return 0; *pBuffer = entry->buffer; if (NULL != pOffset) *pOffset = entry->offset; if (NULL != pPacketCounter) *pPacketCounter = entry->packetCounter; return entry->payloadLen; } void DIM2LLD_ReleaseRxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance) { ChannelContext_t *context; assert(lc.initialized); if (!lc.initialized) return; context = GetDimContext(cType, dir, instance); if (NULL == context) { assert(false); return; } RingBuffer_PopReadPtr(context->ringBuffer); } uint16_t DIM2LLD_GetTxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint8_t **pBuffer) { ChannelContext_t *context; QueueEntry_t *entry; if (!lc.initialized) return 0; if (NULL == pBuffer) return 0; *pBuffer = NULL; context = GetDimContext(cType, dir, instance); if (NULL == context || !context->channelUsed) return 0; assert(context->cType == cType && context->dir == dir && NULL != context->ringBuffer); entry = (QueueEntry_t *)RingBuffer_GetWritePtr(context->ringBuffer); if (NULL == entry) return 0; *pBuffer = entry->buffer; return entry->maxPayloadLen; } void DIM2LLD_SendTxData(DIM2LLD_ChannelType_t cType, DIM2LLD_ChannelDirection_t dir, uint8_t instance, uint32_t payloadLength) { ChannelContext_t *context; QueueEntry_t *entry; assert(lc.initialized); assert(0 != payloadLength); if (!lc.initialized || 0 == payloadLength) return; context = GetDimContext(cType, dir, instance); if (NULL == context) return; assert(DIM2LLD_ChannelDirection_TX == context->dir) ; entry 
= (QueueEntry_t *)RingBuffer_GetWritePtr(context->ringBuffer); assert(NULL != entry); if (NULL == entry) return; entry->payloadLen = payloadLength; entry->hwEnqueued = false; RingBuffer_PopWritePtr(context->ringBuffer); } void on_mlb_int_isr(void) { assert(lc.initialized); if (!lc.initialized) return; dim_service_mlb_int_irq(); } void on_ahb0_int_isr(void) { assert(lc.initialized); if (!lc.initialized) return; dim_service_ahb_int_irq(lc.allChannels); } <file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_collision.h /*------------------------------------------------------------------------------------------------*/ /* UNICENS Node and MAC Address Collision Solver Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #ifndef UCSI_COLLISION_H_ #define UCSI_COLLISION_H_ #include <string.h> #include <stdint.h> #include <stdbool.h> #include "ucs_api.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public API */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Initializes the collision resolver module * \note Do not use any other function before calling this method. */ void UCSICollision_Init(void); /** * \brief Sets any pointer as parameter of the callback function UCSICollision_CB_OnProgramIdentString. * * \param userPtr - Any pointer allowed */ void UCSICollision_SetUserPtr(void *userPtr); /** * \brief Sets the expected amount of network nodes in the ring (MPR). * Programming will only take place, if the expected node count equals the found node count. * \note This value is given by the user * \note Values over 64 will be silently ignored * * \param nodeCount - the expected amount of network nodes in the ring */ void UCSICollision_SetExpectedNodeCount(uint8_t nodeCount); /** * \brief Sets the found amount of network nodes in the ring (MPR). * Programming will only take place, if the expected node count equals the found node count.
* \note This value is given by the network * \note Values over 64 will be silently ignored * * \param nodeCount - the found amount of network nodes in the ring */ void UCSICollision_SetFoundNodeCount(uint8_t nodeCount); /** * \brief Stores a found network node (by the UNICENS manager) into a lookup table. * * \param signature - The signature of the node (containing at least valid node address and MAC address) * \param collisionDetected - true, if the UNICENS manager detected a state which is not normal. false, node is valid */ void UCSICollision_StoreSignature(const Ucs_Signature_t *signature, bool collisionDetected); /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK SECTION */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Callback when the implementer needs to program the INIC * * \param signature - The signature of the node to be programmed * \param newIdentString - The data to be programmed * \param userPtr - The pointer given by UCSICollision_SetUserPtr function, otherwise NULL. */ extern void UCSICollision_CB_OnProgramIdentString(const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString, void *userPtr); /** * \brief Callback when the implementer needs to do final cleanup after flashing * * \param userPtr - The pointer given by UCSICollision_SetUserPtr function, otherwise NULL. */ extern void UCSICollision_CB_OnProgramDone(void *userPtr); /** * \brief Callback to inform the implementer that there was no flashing required. The collision resolving finished without any changes. * * \param userPtr - The pointer given by UCSICollision_SetUserPtr function, otherwise NULL.
*/ extern void UCSICollision_CB_FinishedWithoutChanges(void *userPtr); #endif /* UCSI_COLLISION_H_ */<file_sep>/audio-source/samv71-ucs/src/driver/dim2/board/dim2_hardware.c /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */ /* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */ /* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */ /* */ /* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. 
*/ /*------------------------------------------------------------------------------------------------*/ #include <stdint.h> #include <stddef.h> #include <assert.h> #include "samv71.h" #include "board.h" #include "dim2_hardware.h" #include "Console.h" void enable_mlb_clock(void) { volatile Pmc *pmc = PMC; pmc->PMC_PCR = PMC_PCR_EN | PMC_PCR_CMD | PMC_PCR_DIV_PERIPH_DIV_MCK | PMC_PCR_PID(ID_MLB); pmc->PMC_WPMR = PMC_WPMR_WPKEY_PASSWD | PMC_WPMR_WPEN; pmc->PMC_PCER1 = PMC_PCER1_PID53; pmc->PMC_WPMR = PMC_WPMR_WPKEY_PASSWD; } //Warning: Using the MLB pins will disable the virtual UART over USB :-( void initialize_mlb_pins(void) { #define PIN_MLBCLK {PIO_PB4, PIOB, ID_PIOB, PIO_PERIPH_C, PIO_DEFAULT} #define PIN_MLBDAT {PIO_PB5, PIOB, ID_PIOB, PIO_PERIPH_C, PIO_DEFAULT} #define PIN_MLBSIG {PIO_PD10, PIOD, ID_PIOD, PIO_PERIPH_D, PIO_DEFAULT} static const Pin pinsMlb[] = {PIN_MLBCLK, PIN_MLBDAT, PIN_MLBSIG}; PIO_Configure(pinsMlb, 3); MATRIX->CCFG_SYSIO |= CCFG_SYSIO_SYSIO4; MATRIX->CCFG_SYSIO |= CCFG_SYSIO_SYSIO5; } void enable_mlb_interrupt(void) { NVIC_EnableIRQ(AHB0_INT_IRQn); NVIC_EnableIRQ(MLB_INT_IRQn); } void disable_mlb_interrupt(void) { NVIC_DisableIRQ(MLB_INT_IRQn); NVIC_DisableIRQ(AHB0_INT_IRQn); } uint32_t dimcb_io_read(uint32_t *ptr32) { assert(NULL != ptr32); return *ptr32; //No MMU, so it's easy } void dimcb_io_write(uint32_t *ptr32, uint32_t value) { assert(NULL != ptr32); *ptr32 = value; //No MMU, so it's easy } void dimcb_on_error(uint8_t error_id, const char *error_message) { ConsolePrintf(PRIO_ERROR, RED "dim2-hal error:%d, '%s'" RESETCOLOR"\r\n", error_id, error_message); } void mlb_int_handler(void) { irqflags_t flags = cpu_irq_save(); on_mlb_int_isr(); cpu_irq_restore(flags); } void ahb0_int_handler(void) { irqflags_t flags = cpu_irq_save(); on_ahb0_int_isr(); cpu_irq_restore(flags); } <file_sep>/audio-source/samv71-ucs/src/driver/dim2/hal/dim2_hal.h /* * dim2_hal.h - DIM2 HAL interface * (MediaLB, Device Interface Macro IP, OS62420) * * Copyright 
(C) 2015, Microchip Technology Germany II GmbH & Co. KG * * This program is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * This file is licensed under GPLv2. */ #ifndef _DIM2_HAL_H #define _DIM2_HAL_H #include <stdint.h> #include <stdbool.h> #include "dim2_reg.h" #include "dim2_errors.h" #ifdef __cplusplus extern "C" { #endif /* * The values below are specified in the hardware specification. * So, they should not be changed until the hardware specification changes. */ enum mlb_clk_speed { CLK_256FS = 0, CLK_512FS = 1, CLK_1024FS = 2, CLK_2048FS = 3, CLK_3072FS = 4, CLK_4096FS = 5, CLK_6144FS = 6, CLK_8192FS = 7, }; struct dim_ch_state_t { bool ready; /* Shows readiness to enqueue next buffer */ uint16_t done_buffers; /* Number of completed buffers */ }; struct int_ch_state { /* changed only in interrupt context */ volatile int request_counter; /* changed only in task context */ volatile int service_counter; uint8_t idx1; uint8_t idx2; uint8_t level; /* [0..2], buffering level */ }; struct dim_channel { struct int_ch_state state; uint8_t addr; uint16_t dbr_addr; uint16_t dbr_size; uint16_t packet_length; /*< Isochronous packet length in bytes. */ uint16_t bytes_per_frame; /*< Synchronous bytes per frame. */ uint16_t done_sw_buffers_number; /*< Done software buffers number. 
*/ }; uint8_t dim_startup(struct dim2_regs *dim_base_address, uint32_t mlb_clock, uint32_t fcnt); void dim_shutdown(void); bool dim_get_lock_state(void); uint16_t dim_norm_ctrl_async_buffer_size(uint16_t buf_size); uint16_t dim_norm_isoc_buffer_size(uint16_t buf_size, uint16_t packet_length); uint16_t dim_norm_sync_buffer_size(uint16_t buf_size, uint16_t bytes_per_frame); uint8_t dim_init_control(struct dim_channel *ch, uint8_t is_tx, uint16_t ch_address, uint16_t max_buffer_size); uint8_t dim_init_async(struct dim_channel *ch, uint8_t is_tx, uint16_t ch_address, uint16_t max_buffer_size); uint8_t dim_init_isoc(struct dim_channel *ch, uint8_t is_tx, uint16_t ch_address, uint16_t packet_length); uint8_t dim_init_sync(struct dim_channel *ch, uint8_t is_tx, uint16_t ch_address, uint16_t bytes_per_frame); uint8_t dim_destroy_channel(struct dim_channel *ch); void dim_service_mlb_int_irq(void); void dim_service_ahb_int_irq(struct dim_channel *const *channels); uint8_t dim_service_channel(struct dim_channel *ch); struct dim_ch_state_t *dim_get_channel_state(struct dim_channel *ch, struct dim_ch_state_t *state_ptr); uint16_t dim_dbr_space(struct dim_channel *ch); bool dim_enqueue_buffer(struct dim_channel *ch, uint32_t buffer_addr, uint16_t buffer_size); bool dim_detach_buffers(struct dim_channel *ch, uint16_t buffers_number); uint32_t dimcb_io_read(uint32_t *ptr32); void dimcb_io_write(uint32_t *ptr32, uint32_t value); void dimcb_on_error(uint8_t error_id, const char *error_message); #ifdef __cplusplus } #endif #endif /* _DIM2_HAL_H */ <file_sep>/audio-source/samv71-ucs/src/driver/dim2/internal/ringbuffer.h /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". 
NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */ /* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */ /* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */ /* */ /* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. */ /*------------------------------------------------------------------------------------------------*/ #ifndef RINGBUFFER_H_ #define RINGBUFFER_H_ #include <stdint.h> #include <stdbool.h> typedef struct { volatile uint32_t dataQueue; volatile uint32_t pRx; volatile uint32_t pTx; volatile uint32_t amountOfEntries; volatile uint32_t sizeOfEntry; volatile uint32_t rxPos; volatile uint32_t txPos; } RingBuffer_t; /*----------------------------------------------------------*/ /*! \brief Initializes the given RingBuffer structure * \note This function must be called before any other functions of this component. * \param rb - Pointer to the RingBuffer_t structure, must not be NULL. * \param amountOfEntries - How many entries can be stored in the ring buffer. * \param sizeOfEntry - Size of a single entry in bytes. * \param workingBuffer - Memory area which is exactly (amountOfEntries * sizeOfEntry) bytes.
*/
/*----------------------------------------------------------*/
void RingBuffer_Init(RingBuffer_t *rb, uint16_t amountOfEntries, uint32_t sizeOfEntry, void *workingBuffer);

/*----------------------------------------------------------*/
/*! \brief Deinitializes the given RingBuffer structure
 * \note After this function, no function other than RingBuffer_Init may be called.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 */
/*----------------------------------------------------------*/
void RingBuffer_Deinit(RingBuffer_t *rb);

/*----------------------------------------------------------*/
/*! \brief Gets the number of entries stored.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 * \return The number of filled RX entries, which may be accessed via RingBuffer_GetReadPtrPos.
 */
/*----------------------------------------------------------*/
uint32_t RingBuffer_GetReadElementCount(RingBuffer_t *rb);

/*----------------------------------------------------------*/
/*! \brief Gets the head data from the ring buffer in order to read, if available.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 * \return Pointer to the oldest enqueued entry, if data is available, NULL otherwise.
 */
/*----------------------------------------------------------*/
void *RingBuffer_GetReadPtr(RingBuffer_t *rb);

/*----------------------------------------------------------*/
/*! \brief Gets the entry at the given position from the ring buffer in order to read, if available.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 * \param pos - The position to read, starting with 0 for the oldest entry. Valid positions range from 0 to RingBuffer_GetReadElementCount() - 1.
 * \return Pointer to the enqueued entry at the given position, if data is available, NULL otherwise.
*/
/*----------------------------------------------------------*/
void *RingBuffer_GetReadPtrPos(RingBuffer_t *rb, uint32_t pos);

/*----------------------------------------------------------*/
/*! \brief Marks the oldest available entry as invalid for reading, so it can be reused by TX functions.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 */
/*----------------------------------------------------------*/
void RingBuffer_PopReadPtr(RingBuffer_t *rb);

/*----------------------------------------------------------*/
/*! \brief Gets the head data from the ring buffer in order to write, if available.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 * \return Pointer to a free entry the user can fill with data, NULL if the buffer is full.
 */
/*----------------------------------------------------------*/
void *RingBuffer_GetWritePtr(RingBuffer_t *rb);

/*----------------------------------------------------------*/
/*! \brief Marks the packet filled by RingBuffer_GetWritePtr as ready to read.
 * \note After this call, the structure obtained from RingBuffer_GetWritePtr must not be written to anymore.
 * \param rb - Pointer to the RingBuffer_t structure, must not be NULL.
 */
/*----------------------------------------------------------*/
void RingBuffer_PopWritePtr(RingBuffer_t *rb);

#endif /* RINGBUFFER_H_ */
<file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_cfg.h
/*------------------------------------------------------------------------------------------------*/
/* UNICENS Integration Helper Component */
/* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */
/* */
/* Redistribution and use in source and binary forms, with or without */
/* modification, are permitted provided that the following conditions are met: */
/* */
/* 1. Redistributions of source code must retain the above copyright notice, this */
/* list of conditions and the following disclaimer. */
/* */
/* 2.
Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/ /*------------------------------------------------------------------------------------------------*/ #ifndef UNICENSINTEGRATION_H_ #define UNICENSINTEGRATION_H_ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* USER ADJUSTABLE VALUES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ #ifdef DEBUG #define ENABLE_INIC_WATCHDOG (false) #else #define ENABLE_INIC_WATCHDOG (true) #endif #define ENABLE_AMS_LIB (true) #define DEBUG_XRM #define ENABLE_RESOURCE_PRINT #define BOARD_PMS_TX_SIZE (72) #define CMD_QUEUE_LEN (8) #define I2C_WRITE_MAX_LEN (32) #define AMS_MSG_MAX_LEN (45) #define MAX_NODES (8) #include <string.h> #include <stdarg.h> #include "ucs_cfg.h" #include "ucs_api.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PRIVATE SECTION */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Internal enum for UNICENS Integration */ typedef enum { /**Result is OK and the processing is finished. Safe to dequeue this command.*/ UniCmdResult_OK_ProcessFinished, /**Result is OK but the processing is ongoing. Must wait for callback.*/ UniCmdResult_OK_NeedToWaitForCB, /**Result is error and the processing is finished. 
Safe to dequeue this command.*/ UniCmdResult_ERROR_ProcessFinished } UnicensCmdResult_t; /** * \brief Internal enum for UNICENS Integration */ typedef enum { UnicensCmd_Unknown, UnicensCmd_Init, UnicensCmd_Stop, UnicensCmd_RmSetRoute, UnicensCmd_NsRun, UnicensCmd_GpioCreatePort, UnicensCmd_GpioWritePort, UnicensCmd_I2CWrite, UnicensCmd_I2CRead, UnicensCmd_SendAmsMessage, UnicensCmd_NDStart, UnicensCmd_NDStop, UnicensCmd_NwStartup, UnicensCmd_NwShutdown, UnicensCmd_ProgIsRam, UnicensCmd_ProgIsRom, UnicensCmd_ProgInitAll, UnicensCmd_PacketFilterMode } UnicensCmd_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { const Ucs_InitData_t *init_ptr; } UnicensCmdInit_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { Ucs_Rm_Route_t *routePtr; bool isActive; } UnicensCmdRmSetRoute_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t nodeAddress; Ucs_Ns_Script_t *scriptPtr; uint8_t scriptSize; } UnicensCmdNsRun_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t destination; uint16_t debounceTime; } UnicensCmdGpioCreatePort_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t destination; uint16_t mask; uint16_t data; } UnicensCmdGpioWritePort_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t destination; bool isBurst; uint8_t blockCount; uint8_t slaveAddr; uint16_t timeout; uint8_t dataLen; uint8_t data[I2C_WRITE_MAX_LEN]; } UnicensCmdI2CWrite_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t destination; uint8_t slaveAddr; uint16_t timeout; uint8_t dataLen; } UnicensCmdI2CRead_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { uint16_t msgId; uint16_t targetAddress; uint8_t pPayload[AMS_MSG_MAX_LEN]; uint32_t payloadLen; } UnicensCmdSendAmsMessage_t; /** * \brief Internal struct for UNICENS Integration */ typedef struct { 
Ucs_Signature_t signature;
    Ucs_IdentString_t ident_string;
} UnicensCmdProgIsRam_t;

/**
 * \brief Internal struct for UNICENS Integration
 * \note Currently identical to UnicensCmdProgIsRam_t, but may differ in the future
 */
typedef struct
{
    Ucs_Signature_t signature;
    Ucs_IdentString_t ident_string;
} UnicensCmdProgIsRom_t;

/**
 * \brief Internal struct for UNICENS Integration
 */
typedef struct
{
    uint16_t destination_address;
    uint16_t mode;
} UnicensCmdPacketFilterMode_t;

/**
 * \brief Internal struct for UNICENS Integration
 */
typedef struct
{
    UnicensCmd_t cmd;
    union
    {
        UnicensCmdInit_t Init;
        UnicensCmdRmSetRoute_t RmSetRoute;
        UnicensCmdNsRun_t NsRun;
        UnicensCmdGpioCreatePort_t GpioCreatePort;
        UnicensCmdGpioWritePort_t GpioWritePort;
        UnicensCmdI2CWrite_t I2CWrite;
        UnicensCmdI2CRead_t I2CRead;
        UnicensCmdProgIsRam_t ProgIsRam;
        UnicensCmdProgIsRom_t ProgIsRom;
        UnicensCmdPacketFilterMode_t PacketFilterMode;
#if (ENABLE_AMS_LIB)
        UnicensCmdSendAmsMessage_t SendAms;
#endif
    } val;
} UnicensCmdEntry_t;

/**
 * \brief Internal variables for one instance of UNICENS Integration
 * \note Never touch any of these fields!
 */
typedef struct
{
    volatile uint8_t *dataQueue;
    volatile uint8_t *pRx;
    volatile uint8_t *pTx;
    volatile uint32_t amountOfEntries;
    volatile uint32_t sizeOfEntry;
    volatile uint32_t rxPos;
    volatile uint32_t txPos;
} RB_t;

/**
 * \brief Internal variables for one instance of UNICENS Integration
 * \note Allocate this structure for each instance (static or malloc)
 *       and pass it to UCSI_Init()
 * \note Never touch any of these fields!
*/ typedef struct { uint32_t magic; void *tag; bool initialized; bool ndRunning; bool programmingMode; bool programmingJobsTotal; bool programmingJobsFinished; Ucs_Rm_Route_t *pendingRoutePtr; RB_t rb; uint8_t rbBuf[(CMD_QUEUE_LEN * sizeof(UnicensCmdEntry_t))]; Ucs_Inst_t *unicens; Ucs_InitData_t uniInitData; bool triggerService; Ucs_Lld_Api_t *uniLld; void *uniLldHPtr; UnicensCmdEntry_t *currentCmd; bool printTrigger; } UCSI_Data_t; #endif /* UNICENSINTEGRATION_H_ */<file_sep>/audio-source/samv71-ucs/src/board_init.h /*------------------------------------------------------------------------------------------------*/ /* Board Init Component */ /* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ /*----------------------------------------------------------*/ /*! \file * \brief This file contains board initialization code. * It was introduced to reduce the complexity of the main.cpp file. */ /*----------------------------------------------------------*/ #ifndef _BOARDINIT_H_ #define _BOARDINIT_H_ #ifdef __cplusplus extern "C" { #endif #include "board.h" #include "gmacd.h" #include "gmac_init.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public available driver, initialized by this component */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** TWI driver instance */ extern Twid twid; /** The GMAC driver instance */ extern sGmacd gGmacd; /** The MACB driver instance */ extern GMacb gGmacb; /// /** The DMA driver instance */ extern sXdmad xdma; /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public typedefs */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ typedef enum { BoardButton_SW0, BoardButton_SW1 } Board_Button_t; /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* Public functions */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /** * \brief Initializes the Board Support Component * \note Must be called before any other function of this 
component */ void Board_Init(void); /** * \brief Checks if the given button is pressed * \param button - Enumeration specifying the button to check * \return true, if the button is currently pressed. false, otherwise */ bool Board_IsButtonPressed(Board_Button_t button); #ifdef __cplusplus } #endif #endif /* _BOARDINIT_H_ */<file_sep>/audio-source/samv71-ucs/src/gmac/gmii.h /* ---------------------------------------------------------------------------- */ /* Atmel Microcontroller Software Support */ /* SAM Software Package License */ /* ---------------------------------------------------------------------------- */ /* Copyright (c) 2015, Atmel Corporation */ /* */ /* All rights reserved. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following condition is met: */ /* */ /* - Redistributions of source code must retain the above copyright notice, */ /* this list of conditions and the disclaimer below. */ /* */ /* Atmel's name may not be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* DISCLAIMER: THIS SOFTWARE IS PROVIDED BY ATMEL "AS IS" AND ANY EXPRESS OR */ /* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF */ /* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT ARE */ /* DISCLAIMED. IN NO EVENT SHALL ATMEL BE LIABLE FOR ANY DIRECT, INDIRECT, */ /* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT */ /* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, */ /* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF */ /* LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING */ /* NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, */ /* EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. 
*/
/* ---------------------------------------------------------------------------- */
#ifndef _GMII_DEFINE_H
#define _GMII_DEFINE_H

/*---------------------------------------------------------------------------
 * Definitions
 *---------------------------------------------------------------------------*/
// IEEE defined Registers
#define GMII_BMCR     0x0  // Basic Mode Control Register
#define GMII_BMSR     0x1  // Basic Mode Status Register
#define GMII_PHYID1R  0x2  // PHY Identifier Register 1
#define GMII_PHYID2R  0x3  // PHY Identifier Register 2
#define GMII_ANAR     0x4  // Auto_Negotiation Advertisement Register
#define GMII_ANLPAR   0x5  // Auto_negotiation Link Partner Ability Register
#define GMII_ANER     0x6  // Auto-negotiation Expansion Register
#define GMII_ANNPR    0x7  // Auto-negotiation Next Page Register
#define GMII_ANLPNPAR 0x8  // Auto_negotiation Link Partner Next Page Ability Register
#define GMII_AFEC0R   0x11 // AFE Control 0 Register
#define GMII_AFEC3R   0x14 // AFE Control 3 Register
#define GMII_RXERCR   0x15 // RXER Counter Register
#define GMII_OMSSR    0x17 // Operation Mode Strap Status Register
#define GMII_ECR      0x18 // Expanded Control Register
#define GMII_ICSR     0x1B // Interrupt Control/Status Register
#define GMII_FC       0x1C // Function Control
#define GMII_LCSR     0x1D // LinkMD Control/Status Register
#define GMII_PC1R     0x1E // PHY Control 1 Register
#define GMII_PC2R     0x1F // PHY Control 2 Register

// PHY ID Identifier Register
#define GMII_LSB_MASK 0x0U   // definitions: MII_PHYID1
#define GMII_OUI_MSB  0x0022 // definitions: MII_PHYID2
#define GMII_OUI_LSB  0x1572 // KSZ8061 PHY Id2

// Basic Mode Control Register (BMCR)
// Bit definitions: MII_BMCR
#define GMII_RESET            (1u << 15) // 1=Software Reset; 0=Normal Operation
#define GMII_LOOPBACK         (1u << 14) // 1=Loopback Enabled; 0=Normal Operation
#define GMII_SPEED_SELECT_LSB (1u << 13) // 1,0=1000Mbps; 0,1=100Mbps; 0,0=10Mbps
#define GMII_AUTONEG          (1u << 12) // Auto-negotiation Enable
#define GMII_POWER_DOWN       (1u << 11) // 1=Power
down 0=Normal operation #define GMII_ISOLATE (1u << 10) // 1 = Isolates 0 = Normal operation #define GMII_RESTART_AUTONEG (1u << 9) // 1 = Restart auto-negotiation 0 = Normal operation #define GMII_DUPLEX_MODE (1u << 8) // 1 = Full duplex operation 0 = Normal operation // Reserved 7 // Read as 0, ignore on write #define GMII_SPEED_SELECT_MSB (1u << 6) // // Reserved 5 to 0 // Read as 0, ignore on write // Basic Mode Status Register (BMSR) // Bit definitions: MII_BMSR #define GMII_100BASE_T4 (1 << 15) // 100BASE-T4 Capable #define GMII_100BASE_TX_FD (1 << 14) // 100BASE-TX Full Duplex Capable #define GMII_100BASE_T4_HD (1 << 13) // 100BASE-TX Half Duplex Capable #define GMII_10BASE_T_FD (1 << 12) // 10BASE-T Full Duplex Capable #define GMII_10BASE_T_HD (1 << 11) // 10BASE-T Half Duplex Capable // Reserved 10 to 9 // Read as 0, ignore on write #define GMII_EXTEND_STATUS (1 << 8) // 1 = Extend Status Information In Reg 15 // Reserved 7 #define GMII_MF_PREAMB_SUPPR (1 << 6) // MII Frame Preamble Suppression #define GMII_AUTONEG_COMP (1 << 5) // Auto-negotiation Complete #define GMII_REMOTE_FAULT (1 << 4) // Remote Fault #define GMII_AUTONEG_ABILITY (1 << 3) // Auto Configuration Ability #define GMII_LINK_STATUS (1 << 2) // Link Status #define GMII_JABBER_DETECT (1 << 1) // Jabber Detect #define GMII_EXTEND_CAPAB (1 << 0) // Extended Capability // Auto-negotiation Advertisement Register (ANAR) // Auto-negotiation Link Partner Ability Register (ANLPAR) // Bit definitions: MII_ANAR, MII_ANLPAR #define GMII_NP (1 << 15) // Next page Indication // Reserved 7 #define GMII_RF (1 << 13) // Remote Fault // Reserved 12 // Write as 0, ignore on read #define GMII_PAUSE_MASK (3 << 11) // 0,0 = No Pause 1,0 = Asymmetric Pause(link partner) // 0,1 = Symmetric Pause 1,1 = Symmetric&Asymmetric Pause(local device) #define GMII_T4 (1 << 9) // 100BASE-T4 Support #define GMII_TX_FDX (1 << 8) // 100BASE-TX Full Duplex Support #define GMII_TX_HDX (1 << 7) // 100BASE-TX Support #define 
GMII_10_FDX (1 << 6) // 10BASE-T Full Duplex Support #define GMII_10_HDX (1 << 5) // 10BASE-T Support // Selector 4 to 0 // Protocol Selection Bits #define GMII_AN_IEEE_802_3 0x00001 #endif // #ifndef _MII_DEFINE_H <file_sep>/audio-source/samv71-ucs/src/driver/dim2/internal/ringbuffer.c /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */ /* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */ /* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */ /* */ /* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. 
*/
/*------------------------------------------------------------------------------------------------*/
#include <stdlib.h>
#include <assert.h>
#include "ringbuffer.h"

void RingBuffer_Init(RingBuffer_t *rb, uint16_t amountOfEntries, uint32_t sizeOfEntry, void *workingBuffer)
{
    assert(NULL != rb);
    assert(NULL != workingBuffer);
    rb->dataQueue = (uint32_t)workingBuffer;
    rb->pRx = (uint32_t)rb->dataQueue;
    rb->pTx = (uint32_t)rb->dataQueue;
    rb->amountOfEntries = amountOfEntries;
    rb->sizeOfEntry = sizeOfEntry;
    rb->rxPos = 0;
    rb->txPos = 0;
}

void RingBuffer_Deinit(RingBuffer_t *rb)
{
    assert(NULL != rb);
    rb->dataQueue = rb->amountOfEntries = rb->rxPos = rb->txPos = rb->pRx = rb->pTx = 0;
}

uint32_t RingBuffer_GetReadElementCount(RingBuffer_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    return (uint32_t)(rb->txPos - rb->rxPos);
}

void *RingBuffer_GetReadPtr(RingBuffer_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    if (rb->txPos - rb->rxPos > 0)
        return (void *)rb->pRx;
    return NULL;
}

void *RingBuffer_GetReadPtrPos(RingBuffer_t *rb, uint32_t pos)
{
    uint32_t i, t;
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    if (rb->txPos - rb->rxPos <= pos)
        return NULL;
    t = rb->pRx;
    for (i = 0; i < pos; i++)
    {
        t += rb->sizeOfEntry;
        if (t >= (uint32_t)rb->dataQueue + (rb->amountOfEntries * rb->sizeOfEntry))
            t = rb->dataQueue;
    }
    return (void *)t;
}

void RingBuffer_PopReadPtr(RingBuffer_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    rb->pRx += rb->sizeOfEntry;
    if (rb->pRx >= rb->dataQueue + (rb->amountOfEntries * rb->sizeOfEntry))
        rb->pRx = rb->dataQueue;
    ++rb->rxPos;
    assert(rb->txPos >= rb->rxPos);
}

void *RingBuffer_GetWritePtr(RingBuffer_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    if (rb->txPos - rb->rxPos < rb->amountOfEntries)
        return (void *)rb->pTx;
    return NULL;
}

void RingBuffer_PopWritePtr(RingBuffer_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    rb->pTx += rb->sizeOfEntry;
    if (rb->pTx >= rb->dataQueue + (rb->amountOfEntries *
rb->sizeOfEntry)) rb->pTx = rb->dataQueue; ++rb->txPos; assert(rb->txPos >= rb->rxPos); }<file_sep>/audio-source/samv71-ucs/libraries/libchip/include/qspi.h /* ---------------------------------------------------------------------------- */ /* Atmel Microcontroller Software Support */ /* SAM Software Package License */ /* ---------------------------------------------------------------------------- */ /* Copyright (c) 2015, Atmel Corporation */ /* */ /* All rights reserved. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following condition is met: */ /* */ /* - Redistributions of source code must retain the above copyright notice, */ /* this list of conditions and the disclaimer below. */ /* */ /* Atmel's name may not be used to endorse or promote products derived from */ /* this software without specific prior written permission. */ /* */ /* DISCLAIMER: THIS SOFTWARE IS PROVIDED BY ATMEL "AS IS" AND ANY EXPRESS OR */ /* IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF */ /* MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT ARE */ /* DISCLAIMED. IN NO EVENT SHALL ATMEL BE LIABLE FOR ANY DIRECT, INDIRECT, */ /* INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT */ /* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, */ /* OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF */ /* LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING */ /* NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, */ /* EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /* ---------------------------------------------------------------------------- */ /** * \file * * Interface for Serial Peripheral Interface (SPI) controller. 
 *
 */
#ifndef _QSPI_
#define _QSPI_

/*----------------------------------------------------------------------------
 * Macros
 *----------------------------------------------------------------------------*/
/**
 * Here are several macros which should be used when configuring a SPI
 * peripheral.
 *
 * \section qspi_configuration_macros SPI Configuration Macros
 * - \ref QSPI_PCS
 * - \ref QSPI_SCBR
 * - \ref QSPI_DLYBS
 * - \ref QSPI_DLYBCT
 */

/** Calculates the value of the CSR SCBR field given the baudrate and MCK. */
#define QSPI_SCBR(baudrate, masterClock) \
    ((uint32_t) (masterClock / baudrate) << 8)

/** Calculates the value of the CSR DLYBS field given the desired delay (in ns) */
#define QSPI_DLYBS(delay, masterClock) \
    ((uint32_t) (((masterClock / 1000000) * delay) / 1000) << 16)

/** Calculates the value of the CSR DLYBCT field given the desired delay (in ns) */
#define QSPI_DLYBCT(delay, masterClock) \
    ((uint32_t) (((masterClock / 1000000) * delay) / 32000) << 24)

/*--------------------------------------------------------------------------- */

#ifdef __cplusplus
extern "C" {
#endif

/*----------------------------------------------------------------------------
 * Exported functions
 *----------------------------------------------------------------------------*/
/** \brief qspi access modes */
typedef enum
{
    CmdAccess = 0,
    ReadAccess,
    WriteAccess
} Access_t;

/** \brief qspi modes SPI or QSPI */
typedef enum
{
    SpiMode = QSPI_MR_SMM_SPI,
    QspiMemMode = QSPI_MR_SMM_MEMORY
} QspiMode_t;

/** \brief qspi clock modes, regarding clock phase and clock polarity */
typedef enum
{
    ClockMode_00 = 0,
    ClockMode_10,
    ClockMode_01,
    ClockMode_11
} QspiClockMode_t;

/** \brief qspi status codes */
typedef enum
{
    QSPI_SUCCESS = 0,
    QSPI_BUSY,
    QSPI_BUSY_SENDING,
    QSPI_READ_ERROR,
    QSPI_WRITE_ERROR,
    QSPI_UNKNOWN_ERROR,
    QSPI_INIT_ERROR,
    QSPI_INPUT_ERROR,
    QSPI_TOTAL_ERROR
} QspidStatus_t;

/** \brief qspi status register bits */
typedef enum
{
    IsReceived = QSPI_SR_RDRF,
    IsTxSent = QSPI_SR_TDRE,
IsTxEmpty = QSPI_SR_TXEMPTY,
    IsOverrun = QSPI_SR_OVRES,
    IsCsRise = QSPI_SR_CSR,
    IsCsAsserted = QSPI_SR_CSS,
    IsEofInst = QSPI_SR_INSTRE,
    IsEnabled = QSPI_SR_QSPIENS
} QspiStatus_t;

/** \brief qspi command structure */
typedef struct
{
    uint8_t Instruction;
    uint8_t Option;
} QspiMemCmd_t;

/** \brief qspi buffer structure */
typedef struct
{
    uint32_t TxDataSize; /* Tx buffer size */
    uint32_t RxDataSize; /* Rx buffer size */
    uint32_t *pDataTx;   /* Tx buffer */
    uint32_t *pDataRx;   /* Rx buffer */
} QspiBuffer_t;

/** \brief qspi frame structure for QSPI mode */
typedef struct
{
    union _QspiInstFrame
    {
        uint32_t val;
        struct _QspiInstFrameBM
        {
            uint32_t bwidth: 3,         /** Width of QSPI Addr, inst data */
                     reserved0: 1,      /** Reserved */
                     bInstEn: 1,        /** Enable Inst */
                     bAddrEn: 1,        /** Enable Address */
                     bOptEn: 1,         /** Enable Option */
                     bDataEn: 1,        /** Enable Data */
                     bOptLen: 2,        /** Option Length */
                     bAddrLen: 1,       /** Address Length */
                     reserved1: 1,      /** Reserved */
                     bXfrType: 2,       /** Transfer type */
                     bContinuesRead: 1, /** Continuous read mode */
                     reserved2: 1,      /** Reserved */
                     bDummyCycles: 5,   /** Dummy cycles */
                     reserved3: 11;     /** Reserved */
        } bm;
    } InstFrame;
    uint32_t Addr;
} QspiInstFrame_t;

/** \brief qspi driver structure */
typedef struct
{
    uint8_t qspiId;              /* QSPI ID */
    Qspi *pQspiHw;               /* QSPI Hw instance */
    QspiMode_t qspiMode;         /* Qspi mode: SPI or QSPI */
    QspiMemCmd_t qspiCommand;    /* Qspi command structure */
    QspiBuffer_t qspiBuffer;     /* Qspi buffer */
    QspiInstFrame_t *pQspiFrame; /* Qspi QSPI mode frame register information */
} Qspid_t;

void QSPI_SwReset(Qspi *pQspi);
void QSPI_Disable(Qspi *pQspi);
void QSPI_Enable(Qspi *pQspi);
QspidStatus_t QSPI_EndTransfer(Qspi *pQspi);
uint32_t QSPI_GetStatus(Qspi *pQspi, const QspiStatus_t rStatus);
void QSPI_ConfigureClock(Qspi *pQspi, QspiClockMode_t ClockMode, uint32_t dwClockCfg);
QspidStatus_t QSPI_SingleReadSPI(Qspid_t *pQspid, uint16_t *const pData);
QspidStatus_t QSPI_MultiReadSPI(Qspid_t *pQspid, uint16_t *const pData, uint32_t
NumOfBytes); QspidStatus_t QSPI_SingleWriteSPI(Qspid_t *pQspid, uint16_t const *pData); QspidStatus_t QSPI_MultiWriteSPI(Qspid_t *pQspid, uint16_t const *pData , uint32_t NumOfBytes); QspidStatus_t QSPI_EnableIt(Qspi *pQspi, uint32_t dwSources); QspidStatus_t QSPI_DisableIt(Qspi *pQspi, uint32_t dwSources); uint32_t QSPI_GetItMask(Qspi *pQspi); uint32_t QSPI_GetEnabledItStatus(Qspi *pQspi); QspidStatus_t QSPI_ConfigureInterface(Qspid_t *pQspid, QspiMode_t Mode, uint32_t dwConfiguration); QspidStatus_t QSPI_SendCommand(Qspid_t *pQspi, uint8_t const KeepCfg); QspidStatus_t QSPI_SendCommandWithData(Qspid_t *pQspi, uint8_t const KeepCfg); QspidStatus_t QSPI_ReadCommand(Qspid_t *pQspi, uint8_t const KeepCfg); QspidStatus_t QSPI_EnableMemAccess(Qspid_t *pQspi, uint8_t const KeepCfg, uint8_t ScrambleFlag); QspidStatus_t QSPI_ReadWriteMem(Qspid_t *pQspid, Access_t const ReadWrite); #ifdef __cplusplus } #endif #endif /* #ifndef _QSPI_ */ <file_sep>/audio-source/samv71-ucs/src/driver/dim2/board/dim2_hardware.h /*------------------------------------------------------------------------------------------------*/ /* DIM2 LOW LEVEL DRIVER */ /* (c) 2017 Microchip Technology Inc. and its subsidiaries. */ /* */ /* You may use this software and any derivatives exclusively with Microchip products. */ /* */ /* THIS SOFTWARE IS SUPPLIED BY MICROCHIP "AS IS". NO WARRANTIES, WHETHER EXPRESS, IMPLIED OR */ /* STATUTORY, APPLY TO THIS SOFTWARE, INCLUDING ANY IMPLIED WARRANTIES OF NON-INFRINGEMENT, */ /* MERCHANTABILITY, AND FITNESS FOR A PARTICULAR PURPOSE, OR ITS INTERACTION WITH MICROCHIP */ /* PRODUCTS, COMBINATION WITH ANY OTHER PRODUCTS, OR USE IN ANY APPLICATION. */ /* */ /* IN NO EVENT WILL MICROCHIP BE LIABLE FOR ANY INDIRECT, SPECIAL, PUNITIVE, INCIDENTAL OR */ /* CONSEQUENTIAL LOSS, DAMAGE, COST OR EXPENSE OF ANY KIND WHATSOEVER RELATED TO THE SOFTWARE, */ /* HOWEVER CAUSED, EVEN IF MICROCHIP HAS BEEN ADVISED OF THE POSSIBILITY OR THE DAMAGES ARE */ /* FORESEEABLE. 
TO THE FULLEST EXTENT ALLOWED BY LAW, MICROCHIP'S TOTAL LIABILITY ON ALL CLAIMS */
/* IN ANY WAY RELATED TO THIS SOFTWARE WILL NOT EXCEED THE AMOUNT OF FEES, IF ANY, THAT YOU HAVE */
/* PAID DIRECTLY TO MICROCHIP FOR THIS SOFTWARE. */
/* */
/* MICROCHIP PROVIDES THIS SOFTWARE CONDITIONALLY UPON YOUR ACCEPTANCE OF THESE TERMS. */
/*------------------------------------------------------------------------------------------------*/
#ifndef DIM2_HARDWARE_H_
#define DIM2_HARDWARE_H_

#include "dim2_hal.h"

// HARDWARE DEPENDENT:
#define DIM2_BASE_ADDRESS ((struct dim2_regs *)0x40068000)
#define DIM2_MLB_SPEED    (CLK_512FS)

void enable_mlb_clock(void);
void initialize_mlb_pins(void);
void enable_mlb_interrupt(void);
void disable_mlb_interrupt(void);

// to be implemented in LLD
void on_mlb_int_isr(void);
void on_ahb0_int_isr(void);

#endif /* DIM2_HARDWARE_H_ */
<file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_impl.c
/*------------------------------------------------------------------------------------------------*/
/* UNICENS Integration Helper Component */
/* Copyright 2017, Microchip Technology Inc. and its subsidiaries. */
/* */
/* Redistribution and use in source and binary forms, with or without */
/* modification, are permitted provided that the following conditions are met: */
/* */
/* 1. Redistributions of source code must retain the above copyright notice, this */
/* list of conditions and the following disclaimer. */
/* */
/* 2. Redistributions in binary form must reproduce the above copyright notice, */
/* this list of conditions and the following disclaimer in the documentation */
/* and/or other materials provided with the distribution. */
/* */
/* 3. Neither the name of the copyright holder nor the names of its */
/* contributors may be used to endorse or promote products derived from */
/* this software without specific prior written permission.
*/
/*                                                                                                */
/* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"                    */
/* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE                      */
/* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE                 */
/* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE                   */
/* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL                     */
/* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR                     */
/* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER                     */
/* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,                  */
/* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE                  */
/* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.                           */
/*------------------------------------------------------------------------------------------------*/
#include <assert.h>
#include <stdio.h>
#include <string.h> /* for memset/memcpy/strcat used below */
#include "ucsi_api.h"
#include "ucsi_collision.h"
#include "ucsi_print.h"

/************************************************************************/
/* Private Definitions and variables                                    */
/************************************************************************/
#define TRACE_BUFFER_SZ     100
#define MAGIC               (0xA144BEAF)
#define LOCAL_NODE_ADDR     (0x1)
#define UNKNOWN_NODE_ADDR   (0xFFFF)
#define LIB_VERSION_MAJOR   (2)
#define LIB_VERSION_MINOR   (2)
#define LIB_VERSION_RELEASE (0)
#define LIB_VERSION_BUILD   (4073)

static char m_traceBuffer[TRACE_BUFFER_SZ];
static Ucs_Signature_t PrgSignature = { 0x200 };
static Ucs_Rm_Node_t PrgNodes[] = { { &PrgSignature, NULL, 0 } };

/************************************************************************/
/* Throw error if UNICENS Library is not existent or wrong version      */
/************************************************************************/
#if !defined(UCS_VERSION_MAJOR) || !defined(UCS_VERSION_MINOR) || !defined(UCS_VERSION_RELEASE) || !defined(UCS_VERSION_BUILD)
#error UNICENS library is missing. Perform command 'git submodule update --init --recursive'
#endif
#if (UCS_VERSION_MAJOR != LIB_VERSION_MAJOR) || (UCS_VERSION_MINOR != LIB_VERSION_MINOR) || (UCS_VERSION_RELEASE != LIB_VERSION_RELEASE) || (UCS_VERSION_BUILD != LIB_VERSION_BUILD)
#error UNICENS library is outdated. Perform command 'git submodule update --init --recursive'
#endif

/************************************************************************/
/* Private Function Prototypes                                          */
/************************************************************************/
static bool EnqueueCommand(UCSI_Data_t *my, UnicensCmdEntry_t *cmd);
static void OnCommandExecuted(UCSI_Data_t *my, UnicensCmd_t cmd, bool success);
static void RB_Init(RB_t *rb, uint16_t amountOfEntries, uint32_t sizeOfEntry, uint8_t *workingBuffer);
static void *RB_GetReadPtr(RB_t *rb);
static void RB_PopReadPtr(RB_t *rb);
static void *RB_GetWritePtr(RB_t *rb);
static void RB_PopWritePtr(RB_t *rb);
static uint16_t OnUnicensGetTime(void *user_ptr);
static void OnUnicensService(void *user_ptr);
static void OnUnicensError(Ucs_Error_t error_code, void *user_ptr);
static void OnUnicensAppTimer(uint16_t timeout, void *user_ptr);
static void OnUnicensDebugErrorMsg(Ucs_Message_t *m, void *user_ptr);
static void OnLldCtrlStart(Ucs_Lld_Api_t* api_ptr, void *inst_ptr, void *lld_user_ptr);
static void OnLldCtrlStop(void *lld_user_ptr);
static void OnLldResetInic(void *lld_user_ptr);
static void OnLldCtrlRxMsgAvailable(void *lld_user_ptr);
static void OnLldCtrlTxTransmitC(Ucs_Lld_TxMsg_t *msg_ptr, void *lld_user_ptr);
static void OnUnicensRoutingResult(Ucs_Rm_Route_t* route_ptr, Ucs_Rm_RouteInfos_t route_infos, void *user_ptr);
static void OnUnicensNetworkStatus(uint16_t change_mask, uint16_t events, Ucs_Network_Availability_t availability, Ucs_Network_AvailInfo_t avail_info, Ucs_Network_AvailTransCause_t avail_trans_cause, uint16_t node_address, uint8_t max_position, uint16_t packet_bw, void *user_ptr);
static void
OnUnicensDebugXrmResources(Ucs_Xrm_ResourceType_t resource_type, Ucs_Xrm_ResObject_t *resource_ptr, Ucs_Xrm_ResourceInfos_t resource_infos, Ucs_Rm_EndPoint_t *endpoint_inst_ptr, void *user_ptr); static void OnUcsInitResult(Ucs_InitResult_t result, void *user_ptr); static void OnUcsStopResult(Ucs_StdResult_t result, void *user_ptr); static void OnUcsGpioPortCreate(uint16_t node_address, uint16_t gpio_port_handle, Ucs_Gpio_Result_t result, void *user_ptr); static void OnUcsGpioPortWrite(uint16_t node_address, uint16_t gpio_port_handle, uint16_t current_state, uint16_t sticky_state, Ucs_Gpio_Result_t result, void *user_ptr); static void OnUcsMgrReport(Ucs_MgrReport_t code, Ucs_Signature_t *signature_ptr, Ucs_Rm_Node_t *node_ptr, void *user_ptr); static void OnUcsNsRun(uint16_t node_address, Ucs_Ns_ResultCode_t result, Ucs_Ns_ErrorInfo_t error_info, void *ucs_user_ptr); static void OnUcsAmsRxMsgReceived(void *user_ptr); static void OnUcsGpioTriggerEventStatus(uint16_t node_address, uint16_t gpio_port_handle, uint16_t rising_edges, uint16_t falling_edges, uint16_t levels, void * user_ptr); static void OnUcsI2CWrite(uint16_t node_address, uint16_t i2c_port_handle, uint8_t i2c_slave_address, uint8_t data_len, Ucs_I2c_Result_t result, void *user_ptr); static void OnUcsI2CRead(uint16_t node_address, uint16_t i2c_port_handle, uint8_t i2c_slave_address, uint8_t data_len, uint8_t data_ptr[], Ucs_I2c_Result_t result, void *user_ptr); static void OnUcsProgRam(Ucs_Prg_ResCode_t code, Ucs_Prg_Func_t function, uint8_t ret_len, uint8_t parm[], void *user_ptr); static void OnUcsProgRom(Ucs_Prg_ResCode_t code, Ucs_Prg_Func_t function, uint8_t ret_len, uint8_t parm[], void *user_ptr); static void OnUcsPacketFilterMode(uint16_t node_address, Ucs_StdResult_t result, void *user_ptr); static void OnUcsNetworkStartup(Ucs_StdResult_t result, void *user_ptr); static void OnUcsNetworkShutdown(Ucs_StdResult_t result, void *user_ptr); #if ENABLE_AMS_LIB static void OnUcsAmsWrite(Ucs_AmsTx_Msg_t* 
msg_ptr, Ucs_AmsTx_Result_t result, Ucs_AmsTx_Info_t info, void *user_ptr); #endif /************************************************************************/ /* Public Function Implementations */ /************************************************************************/ void UCSI_Init(UCSI_Data_t *my, void *pTag) { Ucs_Return_t result; assert(NULL != my); memset(my, 0, sizeof(UCSI_Data_t)); my->magic = MAGIC; my->tag = pTag; my->unicens = Ucs_CreateInstance(); if (NULL == my->unicens) { UCSI_CB_OnUserMessage(my->tag, true, "Can not instance a new version of UNICENS, "\ "increase UCS_NUM_INSTANCES define", 0); assert(false); return; } result = Ucs_SetDefaultConfig(&my->uniInitData); if(UCS_RET_SUCCESS != result) { UCSI_CB_OnUserMessage(my->tag, true, "Can not set default values to UNICENS config (result=0x%X)", 1, result); assert(false); return; } my->uniInitData.user_ptr = my; my->uniInitData.mgr.report_fptr = OnUcsMgrReport; my->uniInitData.general.inic_watchdog_enabled = ENABLE_INIC_WATCHDOG; my->uniInitData.general.get_tick_count_fptr = &OnUnicensGetTime; my->uniInitData.general.request_service_fptr = &OnUnicensService; my->uniInitData.general.error_fptr = &OnUnicensError; my->uniInitData.general.set_application_timer_fptr = &OnUnicensAppTimer; my->uniInitData.general.debug_error_msg_fptr = &OnUnicensDebugErrorMsg; my->uniInitData.ams.enabled = ENABLE_AMS_LIB; my->uniInitData.ams.rx.message_received_fptr = &OnUcsAmsRxMsgReceived; my->uniInitData.network.status.notification_mask = 0xC2; my->uniInitData.network.status.cb_fptr = &OnUnicensNetworkStatus; my->uniInitData.lld.lld_user_ptr = my; my->uniInitData.lld.start_fptr = &OnLldCtrlStart; my->uniInitData.lld.stop_fptr = &OnLldCtrlStop; my->uniInitData.lld.reset_fptr = &OnLldResetInic; my->uniInitData.lld.rx_available_fptr = &OnLldCtrlRxMsgAvailable; my->uniInitData.lld.tx_transmit_fptr = &OnLldCtrlTxTransmitC; my->uniInitData.rm.report_fptr = &OnUnicensRoutingResult; my->uniInitData.rm.debug_resource_status_fptr 
= &OnUnicensDebugXrmResources; my->uniInitData.gpio.trigger_event_status_fptr = &OnUcsGpioTriggerEventStatus; RB_Init(&my->rb, CMD_QUEUE_LEN, sizeof(UnicensCmdEntry_t), my->rbBuf); UCSICollision_Init(); UCSICollision_SetUserPtr(my); } bool UCSI_RunInProgrammingMode(UCSI_Data_t *my, uint8_t amountOfNodes) { UnicensCmdEntry_t *e; assert(MAGIC == my->magic); if (NULL == my) return false; my->programmingMode = true; UCSICollision_SetExpectedNodeCount(amountOfNodes); if (my->initialized) { e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_Stop; RB_PopWritePtr(&my->rb); } my->uniInitData.mgr.packet_bw = 0; my->uniInitData.mgr.routes_list_ptr = NULL; my->uniInitData.mgr.routes_list_size = 0; my->uniInitData.mgr.nodes_list_ptr = PrgNodes; my->uniInitData.mgr.nodes_list_size = 1; my->uniInitData.mgr.enabled = false; e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_Init; e->val.Init.init_ptr = &my->uniInitData; RB_PopWritePtr(&my->rb); e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_NwStartup; RB_PopWritePtr(&my->rb); e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_ProgInitAll; RB_PopWritePtr(&my->rb); e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_NDStart; RB_PopWritePtr(&my->rb); UCSI_CB_OnServiceRequired(my->tag); return true; } bool UCSI_NewConfig(UCSI_Data_t *my, uint16_t packetBw, Ucs_Rm_Route_t *pRoutesList, uint16_t routesListSize, Ucs_Rm_Node_t *pNodesList, uint16_t nodesListSize) { UnicensCmdEntry_t *e; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode) return false; if (my->initialized) { e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_Stop; RB_PopWritePtr(&my->rb); } my->uniInitData.mgr.packet_bw = packetBw; my->uniInitData.mgr.routes_list_ptr = 
pRoutesList; my->uniInitData.mgr.routes_list_size = routesListSize; my->uniInitData.mgr.nodes_list_ptr = pNodesList; my->uniInitData.mgr.nodes_list_size = nodesListSize; my->uniInitData.mgr.enabled = true; e = (UnicensCmdEntry_t *)RB_GetWritePtr(&my->rb); if (NULL == e) return false; e->cmd = UnicensCmd_Init; e->val.Init.init_ptr = &my->uniInitData; RB_PopWritePtr(&my->rb); UCSI_CB_OnServiceRequired(my->tag); UCSIPrint_Init(pRoutesList, routesListSize, my); return true; } bool UCSI_ExecuteScript(UCSI_Data_t *my, uint16_t targetAddress, Ucs_Ns_Script_t *pScriptList, uint8_t scriptListLength) { uint8_t i = 0; Ucs_Rm_Node_t *pNode = NULL; UnicensCmdEntry_t e; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode) return false; if (!my->initialized || !my->uniInitData.mgr.enabled || NULL == my->uniInitData.mgr.nodes_list_ptr || 0 == my->uniInitData.mgr.nodes_list_size || NULL == pScriptList || 0 == scriptListLength) { return false; } for (i = 0; i < my->uniInitData.mgr.nodes_list_size; i++) { Ucs_Rm_Node_t *pTempNode = &my->uniInitData.mgr.nodes_list_ptr[i]; if (NULL == pTempNode) break; if (pTempNode->signature_ptr && targetAddress == pTempNode->signature_ptr->node_address) { /* Found correct node in List */ pNode = pTempNode; break; } } if (NULL == pNode) return false; e.cmd = UnicensCmd_NsRun; e.val.NsRun.nodeAddress = targetAddress; e.val.NsRun.scriptPtr = pScriptList; e.val.NsRun.scriptSize = scriptListLength; return EnqueueCommand(my, &e); } bool UCSI_ProcessRxData(UCSI_Data_t *my, const uint8_t *pBuffer, uint32_t len) { Ucs_Lld_RxMsg_t *msg = NULL; assert(MAGIC == my->magic); if (NULL == my->uniLld || NULL == my->uniLldHPtr) return false; msg = my->uniLld->rx_allocate_fptr(my->uniLldHPtr, len); if (NULL == msg) { /*This may happen by definition, OnLldCtrlRxMsgAvailable() will be called, once buffers are available again*/ return false; } msg->data_size = len; memcpy(msg->data_ptr, pBuffer, len); my->uniLld->rx_receive_fptr(my->uniLldHPtr, msg); return 
true; } void UCSI_Service(UCSI_Data_t *my) { UnicensCmdEntry_t *e; bool popEntry = true; /*Set to false in specific case, where function will callback asynchrony.*/ assert(MAGIC == my->magic); if (NULL != my->unicens && my->triggerService) { my->triggerService = false; Ucs_Service(my->unicens); } if (my->printTrigger) { my->printTrigger = false; UCSIPrint_Service(UCSI_CB_OnGetTime(my->tag)); } if (NULL != my->currentCmd) return; my->currentCmd = e = (UnicensCmdEntry_t *)RB_GetReadPtr(&my->rb); if (NULL == e) return; switch (e->cmd) { case UnicensCmd_Init: if (UCS_RET_SUCCESS == Ucs_Init(my->unicens, e->val.Init.init_ptr, OnUcsInitResult)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Init failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_Init, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_Stop: if (UCS_RET_SUCCESS == Ucs_Stop(my->unicens, OnUcsStopResult)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Stop failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_Stop, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_RmSetRoute: if (UCS_RET_SUCCESS == Ucs_Rm_SetRouteActive(my->unicens, e->val.RmSetRoute.routePtr, e->val.RmSetRoute.isActive)) { my->pendingRoutePtr = e->val.RmSetRoute.routePtr; popEntry = false; } else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Rm_SetRouteActive failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_RmSetRoute, false, e->val.RmSetRoute.routePtr->sink_endpoint_ptr->node_obj_ptr->signature_ptr->node_address); } break; case UnicensCmd_NsRun: if (UCS_RET_SUCCESS != Ucs_Ns_Run(my->unicens, e->val.NsRun.nodeAddress, e->val.NsRun.scriptPtr, e->val.NsRun.scriptSize, OnUcsNsRun)) { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Ns_Run failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NsRun, false, e->val.NsRun.nodeAddress); } break; case UnicensCmd_GpioCreatePort: if (UCS_RET_SUCCESS == Ucs_Gpio_CreatePort(my->unicens, e->val.GpioCreatePort.destination, 0, 
e->val.GpioCreatePort.debounceTime, OnUcsGpioPortCreate)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Gpio_CreatePort failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_GpioCreatePort, false, e->val.GpioCreatePort.destination); } break; case UnicensCmd_GpioWritePort: if (UCS_RET_SUCCESS == Ucs_Gpio_WritePort(my->unicens, e->val.GpioWritePort.destination, 0x1D00, e->val.GpioWritePort.mask, e->val.GpioWritePort.data, OnUcsGpioPortWrite)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Gpio_WritePort failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_GpioWritePort, false, e->val.GpioWritePort.destination); } break; case UnicensCmd_I2CWrite: if (UCS_RET_SUCCESS == Ucs_I2c_WritePort(my->unicens, e->val.I2CWrite.destination, 0x0F00, (e->val.I2CWrite.isBurst ? UCS_I2C_BURST_MODE : UCS_I2C_DEFAULT_MODE), e->val.I2CWrite.blockCount, e->val.I2CWrite.slaveAddr, e->val.I2CWrite.timeout, e->val.I2CWrite.dataLen, e->val.I2CWrite.data, OnUcsI2CWrite)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_I2c_WritePort failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_I2CWrite, false, e->val.I2CWrite.destination); } break; case UnicensCmd_I2CRead: if (UCS_RET_SUCCESS == Ucs_I2c_ReadPort(my->unicens, e->val.I2CRead.destination, 0x0F00, e->val.I2CRead.slaveAddr, e->val.I2CRead.dataLen, e->val.I2CRead.timeout, OnUcsI2CRead)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_I2c_ReadPort failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_I2CRead, false, e->val.I2CRead.destination); } break; #if ENABLE_AMS_LIB case UnicensCmd_SendAmsMessage: { Ucs_AmsTx_Msg_t *msg; msg = Ucs_AmsTx_AllocMsg(my->unicens, e->val.SendAms.payloadLen); if (NULL == msg) { /* Try again later */ popEntry = false; break; } if (0 != e->val.SendAms.payloadLen) { assert(NULL != msg->data_ptr); memcpy(msg->data_ptr, e->val.SendAms.pPayload, e->val.SendAms.payloadLen); } msg->custom_info_ptr = NULL; msg->data_size = 
e->val.SendAms.payloadLen; msg->destination_address = e->val.SendAms.targetAddress; msg->llrbc = 10; msg->msg_id = e->val.SendAms.msgId; if (UCS_RET_SUCCESS == Ucs_AmsTx_SendMsg(my->unicens, msg, OnUcsAmsWrite)) { popEntry = false; } else { Ucs_AmsTx_FreeUnusedMsg(my->unicens, msg); UCSI_CB_OnUserMessage(my->tag, true, "Ucs_AmsTx_SendMsg failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_SendAmsMessage, false, e->val.SendAms.targetAddress); } break; } #endif case UnicensCmd_ProgIsRam: if (UCS_RET_SUCCESS == Ucs_Prog_IS_RAM(my->unicens, &e->val.ProgIsRam.signature, &e->val.ProgIsRam.ident_string, OnUcsProgRam)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Prog_IS_RAM failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_ProgIsRam, false, e->val.ProgIsRam.signature.node_address); } break; case UnicensCmd_ProgIsRom: if (UCS_RET_SUCCESS == Ucs_Prog_IS_ROM(my->unicens, &e->val.ProgIsRom.signature, &e->val.ProgIsRom.ident_string, OnUcsProgRom)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Prog_IS_ROM failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_ProgIsRom, false, e->val.ProgIsRom.signature.node_address); } break; case UnicensCmd_NDStart: if (UCS_RET_SUCCESS == Ucs_Nd_Start(my->unicens)) { my->ndRunning = true; } else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Nd_Start failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NDStart, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_NDStop: if (UCS_RET_SUCCESS == Ucs_Nd_Stop(my->unicens)) { my->ndRunning = false; } else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Nd_Stop failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NDStop, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_NwStartup: if (UCS_RET_SUCCESS == Ucs_Network_Startup(my->unicens, 0, 0xFFFFU, OnUcsNetworkStartup)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Network_Startup failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NwStartup, false, LOCAL_NODE_ADDR); } 
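/* Note (added commentary, not original driver code): every public UCSI_* entry
 * point follows the same asynchronous pattern -- it only builds a
 * UnicensCmdEntry_t and appends it via EnqueueCommand(); the matching Ucs_* API
 * call happens later inside this UCSI_Service() switch, and the outcome is
 * reported through UCSI_CB_OnCommandResult(). A hypothetical caller could look
 * like this (node address 0x200 and pin 3 are made-up example values):
 *
 *   UCSI_Data_t ucsi;
 *   UCSI_Init(&ucsi, NULL);                    // pTag may carry app context
 *   UCSI_SetGpioState(&ucsi, 0x200, 3, true);  // enqueues UnicensCmd_GpioWritePort
 *   UCSI_Service(&ucsi);                       // dispatches to Ucs_Gpio_WritePort()
 */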
break; case UnicensCmd_NwShutdown: if (UCS_RET_SUCCESS == Ucs_Network_Shutdown(my->unicens, OnUcsNetworkShutdown)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Network_Shutdown failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NwShutdown, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_ProgInitAll: if (UCS_RET_SUCCESS == Ucs_Nd_InitAll(my->unicens)) { UCSI_CB_OnCommandResult(my->tag, UnicensCmd_ProgInitAll, true, LOCAL_NODE_ADDR); } else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Nd_InitAll failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_ProgInitAll, false, LOCAL_NODE_ADDR); } break; case UnicensCmd_PacketFilterMode: if (UCS_RET_SUCCESS == Ucs_Network_SetPacketFilterMode(my->unicens, e->val.PacketFilterMode.destination_address, e->val.PacketFilterMode.mode, OnUcsPacketFilterMode)) popEntry = false; else { UCSI_CB_OnUserMessage(my->tag, true, "Ucs_Network_SetPacketFilterMode failed", 0); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_PacketFilterMode, false, e->val.PacketFilterMode.destination_address); } break; default: assert(false); break; } if (popEntry) { my->currentCmd = NULL; RB_PopReadPtr(&my->rb); } } void UCSI_Timeout(UCSI_Data_t *my) { assert(MAGIC == my->magic); if (NULL == my->unicens) return; Ucs_ReportTimeout(my->unicens); if (my->printTrigger) { my->printTrigger = false; UCSIPrint_Service(UCSI_CB_OnGetTime(my->tag)); } } bool UCSI_SendAmsMessage(UCSI_Data_t *my, uint16_t msgId, uint16_t targetAddress, uint8_t *pPayload, uint32_t payloadLen) { #if ENABLE_AMS_LIB UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode) return false; if (payloadLen > AMS_MSG_MAX_LEN) { UCSI_CB_OnUserMessage(my->tag, true, "SendAms was called with payload length=%d, allowed is=%d", 2, payloadLen, AMS_MSG_MAX_LEN); return false; } entry.cmd = UnicensCmd_SendAmsMessage; entry.val.SendAms.msgId = msgId; entry.val.SendAms.targetAddress = targetAddress; entry.val.SendAms.payloadLen = payloadLen; 
memcpy(entry.val.SendAms.pPayload, pPayload, payloadLen); return EnqueueCommand(my, &entry); #else return false; #endif } bool UCSI_GetAmsMessage(UCSI_Data_t *my, uint16_t *pMsgId, uint16_t *pSourceAddress, uint8_t **pPayload, uint32_t *pPayloadLen) { Ucs_AmsRx_Msg_t *msg; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode || NULL == my->unicens || NULL == pPayload || NULL == pPayloadLen) return false; msg = Ucs_AmsRx_PeekMsg(my->unicens); if (NULL == msg) return false; *pMsgId = msg->msg_id; *pSourceAddress = msg->source_address; *pPayload = msg->data_ptr; *pPayloadLen = msg->data_size; return true; } void UCSI_ReleaseAmsMessage(UCSI_Data_t *my) { assert(MAGIC == my->magic); if (NULL == my || my->programmingMode || NULL == my->unicens) return; Ucs_AmsRx_ReleaseMsg(my->unicens); } bool UCSI_SetRouteActive(UCSI_Data_t *my, uint16_t routeId, bool isActive) { uint16_t i; UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode || NULL == my->uniInitData.mgr.routes_list_ptr) return false; for (i = 0; i < my->uniInitData.mgr.routes_list_size; i++) { Ucs_Rm_Route_t *route = &my->uniInitData.mgr.routes_list_ptr[i]; if (route->route_id != routeId) continue; entry.cmd = UnicensCmd_RmSetRoute; entry.val.RmSetRoute.routePtr = route; entry.val.RmSetRoute.isActive = isActive; return EnqueueCommand(my, &entry); } return false; } bool UCSI_I2CWrite(UCSI_Data_t *my, uint16_t targetAddress, bool isBurst, uint8_t blockCount, uint8_t slaveAddr, uint16_t timeout, uint8_t dataLen, const uint8_t *pData) { UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode || NULL == pData || 0 == dataLen) return false; if (dataLen > I2C_WRITE_MAX_LEN) { UCSI_CB_OnUserMessage(my->tag, true, "I2CWrite was called with payload length=%d, allowed is=%d", 2, dataLen, I2C_WRITE_MAX_LEN); return false; } entry.cmd = UnicensCmd_I2CWrite; entry.val.I2CWrite.destination = targetAddress; entry.val.I2CWrite.isBurst = isBurst; 
entry.val.I2CWrite.blockCount = blockCount; entry.val.I2CWrite.slaveAddr = slaveAddr; entry.val.I2CWrite.timeout = timeout; entry.val.I2CWrite.dataLen = dataLen; memcpy(entry.val.I2CWrite.data, pData, dataLen); return EnqueueCommand(my, &entry); } bool UCSI_I2CRead(UCSI_Data_t *my, uint16_t targetAddress, uint8_t slaveAddr, uint16_t timeout, uint8_t dataLen) { UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode || 0 == dataLen) return false; entry.cmd = UnicensCmd_I2CRead; entry.val.I2CRead.destination = targetAddress; entry.val.I2CRead.slaveAddr = slaveAddr; entry.val.I2CRead.timeout = timeout; entry.val.I2CRead.dataLen = dataLen; return EnqueueCommand(my, &entry); } bool UCSI_SetGpioState(UCSI_Data_t *my, uint16_t targetAddress, uint8_t gpioPinId, bool isHighState) { uint16_t mask; UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode) return false; mask = 1 << gpioPinId; entry.cmd = UnicensCmd_GpioWritePort; entry.val.GpioWritePort.destination = targetAddress; entry.val.GpioWritePort.mask = mask; entry.val.GpioWritePort.data = isHighState ? 
mask : 0; return EnqueueCommand(my, &entry); } bool UCSI_ProgramIdentStringRam(UCSI_Data_t *my, const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString) { UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || !my->programmingMode || NULL == signature || NULL == newIdentString) return false; entry.cmd = UnicensCmd_ProgIsRam; memcpy(&entry.val.ProgIsRam.signature, signature, sizeof(Ucs_Signature_t)); memcpy(&entry.val.ProgIsRam.ident_string, newIdentString, sizeof(Ucs_IdentString_t)); return EnqueueCommand(my, &entry); } bool UCSI_ProgramIdentStringRom(UCSI_Data_t *my, const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString) { UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || !my->programmingMode || NULL == signature || NULL == newIdentString) return false; entry.cmd = UnicensCmd_ProgIsRom; memcpy(&entry.val.ProgIsRom.signature, signature, sizeof(Ucs_Signature_t)); memcpy(&entry.val.ProgIsRom.ident_string, newIdentString, sizeof(Ucs_IdentString_t)); return EnqueueCommand(my, &entry); } bool UCSI_EnablePromiscuousMode(UCSI_Data_t *my, uint16_t targetAddress, bool enablePromiscuous) { UnicensCmdEntry_t entry; assert(MAGIC == my->magic); if (NULL == my || my->programmingMode) return false; entry.cmd = UnicensCmd_PacketFilterMode; entry.val.PacketFilterMode.destination_address = targetAddress; entry.val.PacketFilterMode.mode = enablePromiscuous ? 0xA : 0x0; return EnqueueCommand(my, &entry); } /************************************************************************/ /* Private Functions */ /************************************************************************/ static bool EnqueueCommand(UCSI_Data_t *my, UnicensCmdEntry_t *cmd) { UnicensCmdEntry_t *e; if (NULL == my || NULL == cmd) { assert(false); return false; } e = RB_GetWritePtr(&my->rb); if (NULL == e) { UCSI_CB_OnUserMessage(my->tag, true, "Could not enqueue command. 
Increase CMD_QUEUE_LEN define", 0); return false; } memcpy(e, cmd, sizeof(UnicensCmdEntry_t)); RB_PopWritePtr(&my->rb); UCSI_CB_OnServiceRequired(my->tag); UCSIPrint_UnicensActivity(); return true; } static void OnCommandExecuted(UCSI_Data_t *my, UnicensCmd_t cmd, bool success) { UnicensCmdEntry_t *e; if (NULL == my) { assert(false); return; } e = my->currentCmd; if (NULL == e) { UCSI_CB_OnUserMessage(my->tag, true, "OnUniCommandExecuted was called, but no "\ "command is in queue", 0); assert(false); return; } if (e->cmd != cmd) { UCSI_CB_OnUserMessage(my->tag, true, "OnUniCommandExecuted was called with "\ "wrong command (Expected=0x%X, Got=0x%X", 2, e->cmd, cmd); assert(false); return; } UCSIPrint_UnicensActivity(); switch (e->cmd) { case UnicensCmd_Init: UCSI_CB_OnCommandResult(my->tag, cmd, success, LOCAL_NODE_ADDR); break; case UnicensCmd_Stop: UCSI_CB_OnCommandResult(my->tag, cmd, success, LOCAL_NODE_ADDR); break; case UnicensCmd_GpioCreatePort: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.GpioCreatePort.destination); break; case UnicensCmd_GpioWritePort: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.GpioWritePort.destination); break; case UnicensCmd_I2CWrite: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.I2CWrite.destination); break; case UnicensCmd_I2CRead: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.I2CRead.destination); break; #if ENABLE_AMS_LIB case UnicensCmd_SendAmsMessage: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.SendAms.targetAddress); break; #endif case UnicensCmd_ProgIsRam: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.ProgIsRam.signature.node_address); break; case UnicensCmd_ProgIsRom: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.ProgIsRom.signature.node_address); break; case UnicensCmd_PacketFilterMode: UCSI_CB_OnCommandResult(my->tag, cmd, success, e->val.PacketFilterMode.destination_address); break; case UnicensCmd_RmSetRoute: UCSI_CB_OnCommandResult(my->tag, cmd, success, 
e->val.RmSetRoute.routePtr->sink_endpoint_ptr->node_obj_ptr->signature_ptr->node_address);
        break;
    default:
        UCSI_CB_OnCommandResult(my->tag, cmd, success, UNKNOWN_NODE_ADDR);
        break;
    }
    my->currentCmd = NULL;
    RB_PopReadPtr(&my->rb);
}

static void RB_Init(RB_t *rb, uint16_t amountOfEntries, uint32_t sizeOfEntry, uint8_t *workingBuffer)
{
    assert(NULL != rb);
    assert(NULL != workingBuffer);
    rb->dataQueue = workingBuffer;
    rb->pRx = rb->dataQueue;
    rb->pTx = rb->dataQueue;
    rb->amountOfEntries = amountOfEntries;
    rb->sizeOfEntry = sizeOfEntry;
    rb->rxPos = 0;
    rb->txPos = 0;
}

static void *RB_GetReadPtr(RB_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    if (rb->txPos - rb->rxPos > 0)
        return (void *)rb->pRx;
    return NULL;
}

static void RB_PopReadPtr(RB_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    rb->pRx += rb->sizeOfEntry;
    if (rb->pRx >= rb->dataQueue + (rb->amountOfEntries * rb->sizeOfEntry))
        rb->pRx = rb->dataQueue;
    ++rb->rxPos;
    assert(rb->txPos >= rb->rxPos);
}

static void *RB_GetWritePtr(RB_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    if (rb->txPos - rb->rxPos < rb->amountOfEntries)
        return (void *)rb->pTx;
    return NULL;
}

static void RB_PopWritePtr(RB_t *rb)
{
    assert(NULL != rb);
    assert(0 != rb->dataQueue);
    rb->pTx += rb->sizeOfEntry;
    if (rb->pTx >= rb->dataQueue + (rb->amountOfEntries * rb->sizeOfEntry))
        rb->pTx = rb->dataQueue;
    ++rb->txPos;
    assert(rb->txPos >= rb->rxPos);
}

static uint16_t OnUnicensGetTime(void *user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)user_ptr;
    assert(MAGIC == my->magic);
    return UCSI_CB_OnGetTime(my->tag);
}

static void OnUnicensService(void *user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)user_ptr;
    assert(MAGIC == my->magic);
    my->triggerService = true;
    UCSI_CB_OnServiceRequired(my->tag);
}

static void OnUnicensError(Ucs_Error_t error_code, void *user_ptr)
{
    UnicensCmdEntry_t e;
    UCSI_Data_t *my = (UCSI_Data_t *)user_ptr;
    error_code = error_code;
    assert(MAGIC == my->magic);
    UCSI_CB_OnUserMessage(my->tag, true, "UNICENS general error, code=0x%X, restarting", 1, error_code);
    e.cmd = UnicensCmd_Init;
    e.val.Init.init_ptr = &my->uniInitData;
    EnqueueCommand(my, &e);
}

static void OnUnicensAppTimer(uint16_t timeout, void *user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)user_ptr;
    assert(MAGIC == my->magic);
    UCSI_CB_OnSetServiceTimer(my->tag, timeout);
}

static void OnUnicensDebugErrorMsg(Ucs_Message_t *m, void *user_ptr)
{
    char val[5];
    uint8_t i;
    UCSI_Data_t *my = (UCSI_Data_t *)user_ptr;
    assert(MAGIC == my->magic);
    m_traceBuffer[0] = '\0';
    for (i = 0; NULL != m->tel.tel_data_ptr && i < m->tel.tel_len; i++)
    {
        snprintf(val, sizeof(val), "%02X ", m->tel.tel_data_ptr[i]);
        strcat(m_traceBuffer, val);
    }
    UCSI_CB_OnUserMessage(my->tag, true, "Received error message, source=%x, %X.%X.%X.%X, [ %s ]",
        6, m->source_addr, m->id.fblock_id, m->id.instance_id, m->id.function_id, m->id.op_type, m_traceBuffer);
}

static void OnLldCtrlStart(Ucs_Lld_Api_t* api_ptr, void *inst_ptr, void *lld_user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)lld_user_ptr;
    assert(MAGIC == my->magic);
    my->uniLld = api_ptr;
    my->uniLldHPtr = inst_ptr;
    UCSI_CB_OnStart(my->tag);
}

static void OnLldCtrlStop(void *lld_user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)lld_user_ptr;
    assert(MAGIC == my->magic);
    my->uniLld = NULL;
    my->uniLldHPtr = NULL;
    UCSI_CB_OnStop(my->tag);
}

static void OnLldResetInic(void *lld_user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)lld_user_ptr;
    assert(MAGIC == my->magic);
    UCSI_CB_OnResetInic(my->tag);
}

static void OnLldCtrlRxMsgAvailable(void *lld_user_ptr)
{
    UCSI_Data_t *my = (UCSI_Data_t *)lld_user_ptr;
    assert(MAGIC == my->magic);
    UCSI_CB_OnServiceRequired(my->tag);
}

static void OnLldCtrlTxTransmitC(Ucs_Lld_TxMsg_t *msg_ptr, void *lld_user_ptr)
{
    UCSI_Data_t *my;
    Ucs_Mem_Buffer_t *buf_ptr;
    uint8_t buffer[BOARD_PMS_TX_SIZE];
    uint32_t bufferPos = 0;
    my = (UCSI_Data_t *)lld_user_ptr;
    assert(MAGIC == my->magic);
    if (NULL == msg_ptr || NULL == my || NULL == my->uniLld || NULL == my->uniLldHPtr)
    {
assert(false); return; } for (buf_ptr = msg_ptr->memory_ptr; buf_ptr != NULL; buf_ptr = buf_ptr->next_buffer_ptr) { if (buf_ptr->data_size + bufferPos > sizeof(buffer)) { UCSI_CB_OnUserMessage(my->tag, true, "TX buffer is too small, increase " \ "BOARD_PMS_TX_SIZE define (%lu > %lu)", 2, buf_ptr->data_size + bufferPos, sizeof(buffer)); my->uniLld->tx_release_fptr(my->uniLldHPtr, msg_ptr); return; } memcpy(&buffer[bufferPos], buf_ptr->data_ptr, buf_ptr->data_size); bufferPos += buf_ptr->data_size; } assert(bufferPos == msg_ptr->memory_ptr->total_size); my->uniLld->tx_release_fptr(my->uniLldHPtr, msg_ptr); UCSI_CB_OnTxRequest(my->tag, buffer, bufferPos); } static void OnUnicensRoutingResult(Ucs_Rm_Route_t* route_ptr, Ucs_Rm_RouteInfos_t route_infos, void *user_ptr) { bool available; uint16_t conLabel; UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); if (route_ptr == my->pendingRoutePtr) { OnCommandExecuted(my, UnicensCmd_RmSetRoute, (UCS_RM_ROUTE_INFOS_BUILT == route_infos)); my->pendingRoutePtr = NULL; } if (NULL == route_ptr || UCS_RM_ROUTE_INFOS_ATD_UPDATE == route_infos || UCS_RM_ROUTE_INFOS_ATD_ERROR == route_infos) return; available = UCS_RM_ROUTE_INFOS_BUILT == route_infos; conLabel = Ucs_Rm_GetConnectionLabel(my->unicens, route_ptr); UCSIPrint_SetRouteState(route_ptr->route_id, available, conLabel); UCSI_CB_OnRouteResult(my->tag, route_ptr->route_id, available, conLabel); if (UCS_RM_ROUTE_INFOS_SUSPENDED == route_infos) { /* Route has been permanently disabled due to a crucial error, enable it again */ UnicensCmdEntry_t entry; entry.cmd = UnicensCmd_RmSetRoute; entry.val.RmSetRoute.routePtr = route_ptr; entry.val.RmSetRoute.isActive = false; EnqueueCommand(my, &entry); entry.val.RmSetRoute.isActive = true; EnqueueCommand(my, &entry); UCSI_CB_OnServiceRequired(my->tag); } } static void OnUnicensNetworkStatus(uint16_t change_mask, uint16_t events, Ucs_Network_Availability_t availability, Ucs_Network_AvailInfo_t 
avail_info,Ucs_Network_AvailTransCause_t avail_trans_cause, uint16_t node_address, uint8_t max_position, uint16_t packet_bw, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); if (my->programmingMode) { if (UCS_NW_AVAILABLE == availability) UCSICollision_SetFoundNodeCount(max_position); else UCSICollision_Init(); } UCSIPrint_SetNetworkAvailable(UCS_NW_AVAILABLE == availability, max_position); UCSI_CB_OnNetworkState(my->tag, UCS_NW_AVAILABLE == availability, packet_bw, max_position); } static void OnUnicensDebugXrmResources(Ucs_Xrm_ResourceType_t resource_type, Ucs_Xrm_ResObject_t *resource_ptr, Ucs_Xrm_ResourceInfos_t resource_infos, Ucs_Rm_EndPoint_t *endpoint_inst_ptr, void *user_ptr) { char *msg = NULL; UCSI_Data_t *my; uint16_t adr = 0xFFFF; #ifndef DEBUG_XRM resource_type = resource_type; resource_ptr = resource_ptr; resource_infos = resource_infos; endpoint_inst_ptr = endpoint_inst_ptr; user_ptr = user_ptr; #else endpoint_inst_ptr = endpoint_inst_ptr; my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); if (NULL == resource_ptr) return; if (endpoint_inst_ptr && endpoint_inst_ptr->node_obj_ptr && endpoint_inst_ptr->node_obj_ptr->signature_ptr) adr = endpoint_inst_ptr->node_obj_ptr->signature_ptr->node_address; switch (resource_infos) { case UCS_XRM_INFOS_BUILT: msg = (char *)"has been built"; UCSIPrint_SetObjectState(resource_ptr, ObjState_Build); break; case UCS_XRM_INFOS_DESTROYED: msg = (char *)"has been destroyed"; UCSIPrint_SetObjectState(resource_ptr, ObjState_Unused); break; case UCS_XRM_INFOS_ERR_BUILT: msg = (char *)"cannot be built"; UCSIPrint_SetObjectState(resource_ptr, ObjState_Failed); break; case UCS_XRM_INFOS_ERR_DESTROYED: msg = (char *)"cannot be destroyed"; UCSIPrint_SetObjectState(resource_ptr, ObjState_Failed); break; default: msg = (char *)"has unknown state"; break; } switch(resource_type) { case UCS_XRM_RC_TYPE_NW_SOCKET: { Ucs_Xrm_NetworkSocket_t *ms = (Ucs_Xrm_NetworkSocket_t *)resource_ptr; 
assert(ms->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): NW socket %s, handle=%04X, "\ "direction=%d, type=%d, bandwidth=%d", 6, adr, msg, ms->nw_port_handle, ms->direction, ms->data_type, ms->bandwidth); break; } case UCS_XRM_RC_TYPE_MLB_PORT: { Ucs_Xrm_MlbPort_t *m = (Ucs_Xrm_MlbPort_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): MLB port %s, index=%d, clock=%d", 4, adr, msg, m->index, m->clock_config); break; } case UCS_XRM_RC_TYPE_MLB_SOCKET: { Ucs_Xrm_MlbSocket_t *m = (Ucs_Xrm_MlbSocket_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): MLB socket %s, direction=%d, type=%d,"\ " bandwidth=%d, channel=%d", 6, adr, msg, m->direction, m->data_type, m->bandwidth, m->channel_address); break; } case UCS_XRM_RC_TYPE_USB_PORT: { Ucs_Xrm_UsbPort_t *m = (Ucs_Xrm_UsbPort_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): USB port %s, in-cnt=%d, out-cnt=%d", 4, adr, msg, m->streaming_if_ep_in_count, m->streaming_if_ep_out_count); break; } case UCS_XRM_RC_TYPE_USB_SOCKET: { Ucs_Xrm_UsbSocket_t *m = (Ucs_Xrm_UsbSocket_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): USB socket %s, direction=%d, type=%d," \ " ep-addr=%02X, frames=%d", 6, adr, msg, m->direction, m->data_type, m->end_point_addr, m->frames_per_transfer); break; } case UCS_XRM_RC_TYPE_STRM_PORT: { Ucs_Xrm_StrmPort_t *m = (Ucs_Xrm_StrmPort_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): I2S port %s, index=%d, clock=%d, "\ "align=%d", 5, adr, msg, m->index, m->clock_config, m->data_alignment); break; } case UCS_XRM_RC_TYPE_STRM_SOCKET: { Ucs_Xrm_StrmSocket_t *m = (Ucs_Xrm_StrmSocket_t *)resource_ptr; 
assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): I2S socket %s, direction=%d, type=%d"\ ", bandwidth=%d, pin=%d", 6, adr, msg, m->direction, m->data_type, m->bandwidth, m->stream_pin_id); break; } case UCS_XRM_RC_TYPE_SYNC_CON: { Ucs_Xrm_SyncCon_t *m = (Ucs_Xrm_SyncCon_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): Sync connection %s, mute=%d, "\ "offset=%d", 4, adr, msg, m->mute_mode, m->offset); break; } case UCS_XRM_RC_TYPE_COMBINER: { Ucs_Xrm_Combiner_t *m = (Ucs_Xrm_Combiner_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): Combiner %s, bytes per frame=%d", 3, adr, msg, m->bytes_per_frame); break; } case UCS_XRM_RC_TYPE_SPLITTER: { Ucs_Xrm_Splitter_t *m = (Ucs_Xrm_Splitter_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): Splitter %s, bytes per frame=%d", 3, adr, msg, m->bytes_per_frame); break; } case UCS_XRM_RC_TYPE_AVP_CON: { Ucs_Xrm_AvpCon_t *m = (Ucs_Xrm_AvpCon_t *)resource_ptr; assert(m->resource_type == resource_type); UCSI_CB_OnUserMessage(my->tag, false, "Xrm-Debug (0x%03X): Isoc-AVP connection %s, packetSize=%d", 3, adr, msg, m->isoc_packet_size); break; } default: UCSI_CB_OnUserMessage(my->tag, true, "Xrm-Debug (0x%03X): Unknown type=%d %s", 3 , adr, resource_type, msg); } #endif } static void OnUcsInitResult(Ucs_InitResult_t result, void *user_ptr) { UnicensCmdEntry_t e; UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); my->initialized = (UCS_INIT_RES_SUCCESS == result); OnCommandExecuted(my, UnicensCmd_Init, (UCS_INIT_RES_SUCCESS == result)); if (!my->initialized) { UCSI_CB_OnUserMessage(my->tag, true, "UcsInitResult reported error (0x%X), restarting...", 1, result); e.cmd = UnicensCmd_Init; e.val.Init.init_ptr = &my->uniInitData; EnqueueCommand(my, &e); } } static 
void OnUcsStopResult(Ucs_StdResult_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; result = result; /*TODO: check error case*/ assert(MAGIC == my->magic); my->initialized = false; OnCommandExecuted(my, UnicensCmd_Stop, (UCS_RES_SUCCESS == result.code)); UCSI_CB_OnStop(my->tag); } static void OnUcsGpioPortCreate(uint16_t node_address, uint16_t gpio_port_handle, Ucs_Gpio_Result_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_GpioCreatePort, (UCS_GPIO_RES_SUCCESS == result.code)); } static void OnUcsGpioPortWrite(uint16_t node_address, uint16_t gpio_port_handle, uint16_t current_state, uint16_t sticky_state, Ucs_Gpio_Result_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_GpioWritePort, (UCS_GPIO_RES_SUCCESS == result.code)); } static void OnUcsMgrReport(Ucs_MgrReport_t code, Ucs_Signature_t *signature_ptr, Ucs_Rm_Node_t *node_ptr, void *user_ptr) { uint16_t node_address; uint16_t node_pos_addr; UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); assert(NULL != signature_ptr); node_address = signature_ptr->node_address; node_pos_addr = signature_ptr->node_pos_addr; UCSI_CB_OnMgrReport(my->tag, code, signature_ptr, node_ptr); switch (code) { case UCS_MGR_REP_IGNORED_UNKNOWN: UCSIPrint_SetNodeAvailable(node_address, node_pos_addr, NodeState_Ignored); UCSI_CB_OnUserMessage(my->tag, false, "Node=%X(%X): Ignored, because unknown", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_IGNORED_DUPLICATE: UCSIPrint_SetNodeAvailable(node_address, node_pos_addr, NodeState_Ignored); UCSI_CB_OnUserMessage(my->tag, true, "Node=%X(%X): Ignored, because duplicated", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_NOT_AVAILABLE: UCSIPrint_SetNodeAvailable(node_address, node_pos_addr, NodeState_NotAvailable); UCSI_CB_OnUserMessage(my->tag, false, "Node=%X(%X): Not 
available", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_WELCOMED: UCSI_CB_OnUserMessage(my->tag, false, "Node=%X(%X): Welcomed", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_SCRIPT_FAILURE: UCSI_CB_OnUserMessage(my->tag, true, "Node=%X(%X): Script failure", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_IRRECOVERABLE: UCSI_CB_OnUserMessage(my->tag, true, "Node=%X(%X): IRRECOVERABLE ERROR!!", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_SCRIPT_SUCCESS: UCSI_CB_OnUserMessage(my->tag, false, "Node=%X(%X): Script ok", 2, node_address, node_pos_addr); break; case UCS_MGR_REP_AVAILABLE: UCSIPrint_SetNodeAvailable(node_address, node_pos_addr, NodeState_Available); UCSI_CB_OnUserMessage(my->tag, false, "Node=%X(%X): Available", 2, node_address, node_pos_addr); break; default: UCSI_CB_OnUserMessage(my->tag, true, "Node=%X(%X): unknown code", 2, node_address, node_pos_addr); break; } } static void OnUcsNsRun(uint16_t node_address, Ucs_Ns_ResultCode_t result, Ucs_Ns_ErrorInfo_t error_info, void *ucs_user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)ucs_user_ptr; assert(MAGIC == my->magic); UCSI_CB_OnCommandResult(my->tag, UnicensCmd_NsRun, (UCS_NS_RES_SUCCESS == result), node_address); #ifdef DEBUG_XRM UCSI_CB_OnUserMessage(my->tag, (UCS_NS_RES_SUCCESS != result), "OnUcsNsRun (%03X): script executed %s", 2, node_address, (UCS_NS_RES_SUCCESS == result ? 
"succeeded" : "false")); #endif } static void OnUcsAmsRxMsgReceived(void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); UCSI_CB_OnAmsMessageReceived(my->tag); } static void OnUcsGpioTriggerEventStatus(uint16_t node_address, uint16_t gpio_port_handle, uint16_t rising_edges, uint16_t falling_edges, uint16_t levels, void * user_ptr) { uint8_t i; UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); for (i = 0; i < 16; i++) { if (0 != ((rising_edges >> i) & 0x1)) UCSI_CB_OnGpioStateChange(my->tag, node_address, i, true); if (0 != ((falling_edges >> i) & 0x1)) UCSI_CB_OnGpioStateChange(my->tag, node_address, i, false); } } static void OnUcsI2CWrite(uint16_t node_address, uint16_t i2c_port_handle, uint8_t i2c_slave_address, uint8_t data_len, Ucs_I2c_Result_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_I2CWrite, (UCS_I2C_RES_SUCCESS == result.code)); if (UCS_I2C_RES_SUCCESS != result.code) UCSI_CB_OnUserMessage(my->tag, true, "Remote I2C Write to node=0x%X failed", 1, node_address); } static void OnUcsI2CRead(uint16_t node_address, uint16_t i2c_port_handle, uint8_t i2c_slave_address, uint8_t data_len, uint8_t data_ptr[], Ucs_I2c_Result_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_I2CRead, (UCS_I2C_RES_SUCCESS == result.code)); UCSI_CB_OnI2CRead(my->tag, (UCS_I2C_RES_SUCCESS == result.code), node_address, i2c_slave_address, data_ptr, data_len); } #if ENABLE_AMS_LIB static void OnUcsAmsWrite(Ucs_AmsTx_Msg_t* msg_ptr, Ucs_AmsTx_Result_t result, Ucs_AmsTx_Info_t info, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_SendAmsMessage, (UCS_AMSTX_RES_SUCCESS == result)); if (UCS_AMSTX_RES_SUCCESS != result) UCSI_CB_OnUserMessage(my->tag, true, "SendAms failed with result=0x%x, 
info=0x%X", 2, result, info); } #endif static void OnUcsProgRam(Ucs_Prg_ResCode_t code, Ucs_Prg_Func_t function, uint8_t ret_len, uint8_t parm[], void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); if (my->currentCmd->cmd != UnicensCmd_ProgIsRam) { /* Workaround for issue found in UCS Lib V2.2.0-3942 */ return; } OnCommandExecuted(my, UnicensCmd_ProgIsRam, (UCS_PRG_RES_SUCCESS == code)); if (UCS_PRG_RES_SUCCESS == code) UCSI_CB_OnUserMessage(my->tag, false, "Write to RAM was successful", 0); else UCSI_CB_OnUserMessage(my->tag, true, "Write to RAM failed with error code %d", 1, code); if (my->programmingJobsTotal == ++my->programmingJobsFinished) { UnicensCmdEntry_t entry; entry.cmd = UnicensCmd_NwShutdown; EnqueueCommand(my, &entry); UCSI_CB_OnServiceRequired(my->tag); } } static void OnUcsProgRom(Ucs_Prg_ResCode_t code, Ucs_Prg_Func_t function, uint8_t ret_len, uint8_t parm[], void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); if (my->currentCmd->cmd != UnicensCmd_ProgIsRom) { /* Workaround for issue found in UCS Lib V2.2.0-3942 */ return; } OnCommandExecuted(my, UnicensCmd_ProgIsRom, (UCS_PRG_RES_SUCCESS == code)); if (UCS_PRG_RES_SUCCESS == code) UCSI_CB_OnUserMessage(my->tag, false, "Write to ROM was successful", 0); else UCSI_CB_OnUserMessage(my->tag, true, "Write to ROM failed with error code %d", 1, code); if (my->programmingJobsTotal == ++my->programmingJobsFinished) { UnicensCmdEntry_t entry; entry.cmd = UnicensCmd_NwShutdown; EnqueueCommand(my, &entry); UCSI_CB_OnServiceRequired(my->tag); } } static void OnUcsPacketFilterMode(uint16_t node_address, Ucs_StdResult_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_PacketFilterMode, (UCS_RES_SUCCESS == result.code)); if (UCS_RES_SUCCESS != result.code) UCSI_CB_OnUserMessage(my->tag, true, "Set promiscuous mode failed with error code %d", 1, result.code); 
} static void OnUcsNetworkStartup(Ucs_StdResult_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_NwStartup, (UCS_RES_SUCCESS == result.code)); if (UCS_RES_SUCCESS != result.code) UCSI_CB_OnUserMessage(my->tag, true, "NetworkStartup failed with error code %d", 1, result.code); } static void OnUcsNetworkShutdown(Ucs_StdResult_t result, void *user_ptr) { UCSI_Data_t *my = (UCSI_Data_t *)user_ptr; assert(MAGIC == my->magic); OnCommandExecuted(my, UnicensCmd_NwShutdown, (UCS_RES_SUCCESS == result.code)); if (0 != my->programmingJobsTotal) { UCSI_CB_OnProgrammingDone(my->tag, true); } } /************************************************************************/ /* Callback from Collision Resolver Component */ /************************************************************************/ void UCSICollision_CB_OnProgramIdentString(const Ucs_Signature_t *signature, const Ucs_IdentString_t *newIdentString, void *userPtr) { UCSI_Data_t *my = (UCSI_Data_t *)userPtr; assert(MAGIC == my->magic); assert(NULL != signature); assert(NULL != newIdentString); if (my->programmingMode) { UCSI_CB_OnUserMessage(my->tag, false, "Programming INIC on position=0x%X, old address=0x%X, new address=0x%X, old MAC=%04X%04X%04X new MAC=%04X%04X%04X", 9, signature->node_pos_addr, signature->node_address, newIdentString->node_address, signature->mac_47_32, signature->mac_31_16, signature->mac_15_0, newIdentString->mac_47_32, newIdentString->mac_31_16, newIdentString->mac_15_0); if (my->ndRunning) { UnicensCmdEntry_t e; my->ndRunning = false; e.cmd = UnicensCmd_NDStop; EnqueueCommand(my, &e); } switch(signature->chip_id) { case 0x18: case 0x19: UCSI_ProgramIdentStringRom(my, signature, newIdentString); my->programmingJobsTotal++; break; case 0x30: case 0x32: case 0x34: UCSI_ProgramIdentStringRam(my, signature, newIdentString); my->programmingJobsTotal++; break; default: UCSI_CB_OnUserMessage(my->tag, true, "Programming 
canceled because of unknown chip id=%d", 1, signature->chip_id); break; } } else { UCSI_CB_OnUserMessage(my->tag, false, "Need to program INIC on position=0x%X, old address=0x%X, new address=0x%X, old MAC=%04X%04X%04X new MAC=%04X%04X%04X, enable programming mode!", 9, signature->node_pos_addr, signature->node_address, newIdentString->node_address, signature->mac_47_32, signature->mac_31_16, signature->mac_15_0, newIdentString->mac_47_32, newIdentString->mac_31_16, newIdentString->mac_15_0); } } void UCSICollision_CB_OnProgramDone(void *userPtr) { UCSI_Data_t *my = (UCSI_Data_t *)userPtr; assert(MAGIC == my->magic); if (0 == my->programmingJobsTotal) UCSI_CB_OnProgrammingDone(my->tag, false); } void UCSICollision_CB_FinishedWithoutChanges(void *userPtr) { UCSI_Data_t *my = (UCSI_Data_t *)userPtr; assert(MAGIC == my->magic); UCSI_CB_OnProgrammingDone(my->tag, false); } /************************************************************************/ /* Callback from UCSI Print component:                                  */ /************************************************************************/ void UCSIPrint_CB_NeedService(void *tag) { UCSI_Data_t *my = (UCSI_Data_t *)tag; assert(MAGIC == my->magic); my->printTrigger = true; } void UCSIPrint_CB_OnUserMessage(void *usr, const char pMsg[]) { void *tag = NULL; UCSI_Data_t *my = (UCSI_Data_t *)usr; if (my) { assert(MAGIC == my->magic); tag = my->tag; } UCSI_CB_OnPrintRouteTable(tag, pMsg); } /************************************************************************/ /* Debug Message output from UNICENS stack:                             */ /************************************************************************/ #if defined(UCS_TR_ERROR) || defined(UCS_TR_INFO) #include <stdio.h> void App_TraceError(void *ucs_user_ptr, const char module_str[], const char entry_str[], uint16_t vargs_cnt, ...) 
{ va_list argptr; void *tag = NULL; UCSI_Data_t *my = (UCSI_Data_t *)ucs_user_ptr; if (my) { assert(MAGIC == my->magic); tag = my->tag; } va_start(argptr, vargs_cnt); vsnprintf(m_traceBuffer, sizeof(m_traceBuffer), entry_str, argptr); va_end(argptr); UCSIPrint_UnicensActivity(); UCSI_CB_OnUserMessage(tag, true, "Error | %s | %s", 2, module_str, m_traceBuffer); } void App_TraceInfo(void *ucs_user_ptr, const char module_str[], const char entry_str[], uint16_t vargs_cnt, ...) { va_list argptr; void *tag = NULL; UCSI_Data_t *my = (UCSI_Data_t *)ucs_user_ptr; if (my) { assert(MAGIC == my->magic); tag = my->tag; } va_start(argptr, vargs_cnt); vsnprintf(m_traceBuffer, sizeof(m_traceBuffer), entry_str, argptr); va_end(argptr); UCSI_CB_OnUserMessage(tag, false, "Info | %s | %s", 2, module_str, m_traceBuffer); } #endif <file_sep>/audio-source/samv71-ucs/libraries/ucsi/ucsi_print.c /*------------------------------------------------------------------------------------------------*/ /* UNICENS Stucture Printing module */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. 
*/ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #include <stdint.h> #include <stddef.h> #include <stdbool.h> #include <stdio.h> #include <string.h> #include <assert.h> #include "ucsi_cfg.h" #include "ucsi_print.h" #ifdef ENABLE_RESOURCE_PRINT #define SERVICE_TIME (1000) #define MAX_TIMEOUT (30000) #define MPR_RETRIES (5) #define INVALID_CON_LABEL (0xDEAD) #define RESETCOLOR "\033[0m" #define GREEN "\033[0;32m" #define RED "\033[0;31m" #define YELLOW "\033[1;33m" #define BLUE "\033[0;34m" #define STR_BUF_LEN (200) #define STR_RES_LEN (60) struct ResourceList { Ucs_Xrm_ResObject_t *element; UCSIPrint_ObjectState_t state; }; struct ConnectionList { bool isValid; bool isActive; uint16_t routeId; uint16_t connectionLabel; }; struct NodeList { bool isValid; UCSIPrint_NodeState_t nodeState; uint16_t node; uint16_t pos; }; struct LocalVar { bool initialized; bool triggerService; uint32_t nextService; uint32_t timeOut; void *tag; Ucs_Rm_Route_t *pRoutes; uint16_t routesSize; bool networkAvailable; uint8_t mpr; uint8_t waitForMprRetries; struct ResourceList rList[UCSI_PRINT_MAX_RESOURCES]; struct ConnectionList 
cList[UCSI_PRINT_MAX_RESOURCES]; struct NodeList nList[UCSI_PRINT_MAX_NODES]; }; static struct LocalVar m = { 0 }; static char strBuf[STR_BUF_LEN]; static void PrintTable(void); static void ParseResources(Ucs_Xrm_ResObject_t **ppJobList, char *pBuf, uint32_t bufLen); static bool GetIgnoredNodeString(char *pBuf, uint32_t bufLen); static UCSIPrint_NodeState_t GetNodeState(uint16_t nodeAddress); static uint8_t GetNodeCount(void); static bool GetRouteState(uint16_t routeId, bool *pIsActive, uint16_t *pConLabel); static void RequestTrigger(void); void UCSIPrint_Init(Ucs_Rm_Route_t *pRoutes, uint16_t routesSize, void *tag) { memset(&m, 0, sizeof(struct LocalVar)); if (NULL == pRoutes || 0 == routesSize) return; m.tag = tag; m.pRoutes = pRoutes; m.routesSize = routesSize; m.initialized = true; } void UCSIPrint_Service(uint32_t timestamp) { bool exec = false; if (!m.networkAvailable) return; if (m.triggerService) { m.triggerService = false; m.nextService = timestamp + SERVICE_TIME; if (0 == m.timeOut) m.timeOut = timestamp + MAX_TIMEOUT; UCSIPrint_CB_NeedService(m.tag); return; } if (0 == m.nextService || 0 == m.timeOut) return; if (timestamp >= m.timeOut) { UCSIPrint_CB_OnUserMessage(m.tag, RED "UCSI-Watchdog:Max timeout reached" RESETCOLOR); exec = true; } else if (timestamp >= m.nextService) { if (m.mpr != GetNodeCount() && ++m.waitForMprRetries <= MPR_RETRIES) { m.nextService = timestamp + SERVICE_TIME; return; } exec = true; } if (exec) { m.nextService = 0; m.timeOut = 0; PrintTable(); } else { UCSIPrint_CB_NeedService(m.tag); } } void UCSIPrint_SetNetworkAvailable(bool available, uint8_t maxPos) { if (!m.initialized) return; m.networkAvailable = available; m.mpr = maxPos; m.waitForMprRetries = 0; if (available) { RequestTrigger(); } else { m.triggerService = false; m.nextService = 0; } } void UCSIPrint_SetNodeAvailable(uint16_t nodeAddress, uint16_t nodePosAddr, UCSIPrint_NodeState_t nodeState) { uint16_t i; if (!m.initialized) return; /* Find existing entry */ for 
(i = 0; i < UCSI_PRINT_MAX_NODES; i++) { if (m.nList[i].isValid && nodePosAddr == m.nList[i].pos) { if (m.nList[i].nodeState != nodeState || m.nList[i].node != nodeAddress) { m.nList[i].node = nodeAddress; m.nList[i].nodeState = nodeState; RequestTrigger(); } return; } } /* Find empty entry and store it there */ for (i = 0; i < UCSI_PRINT_MAX_NODES; i++) { if (!m.nList[i].isValid) { m.nList[i].node = nodeAddress; m.nList[i].pos = nodePosAddr; m.nList[i].nodeState = nodeState; m.nList[i].isValid = true; RequestTrigger(); return; } } UCSIPrint_CB_OnUserMessage(m.tag, RED "UCSI-Watchdog:Could not store node availability, increase UCSI_PRINT_MAX_NODES" RESETCOLOR); } void UCSIPrint_SetRouteState(uint16_t routeId, bool isActive, uint16_t connectionLabel) { uint16_t i; if (!m.initialized) return; RequestTrigger(); /* Find existing entry */ for (i = 0; i < UCSI_PRINT_MAX_RESOURCES; i++) { if (m.cList[i].isValid && routeId == m.cList[i].routeId) { m.cList[i].connectionLabel = connectionLabel; m.cList[i].isActive = isActive; return; } } /* Find empty entry and store it there */ for (i = 0; i < UCSI_PRINT_MAX_RESOURCES; i++) { if (!m.cList[i].isValid) { m.cList[i].routeId = routeId; m.cList[i].isActive = isActive; m.cList[i].connectionLabel = connectionLabel; m.cList[i].isValid = true; return; } } UCSIPrint_CB_OnUserMessage(m.tag, RED "UCSI-Watchdog:Could not store connection label, increase UCSI_PRINT_MAX_RESOURCES" RESETCOLOR); } void UCSIPrint_SetObjectState(Ucs_Xrm_ResObject_t *element, UCSIPrint_ObjectState_t state) { uint16_t i; if (!m.initialized) return; /* Find existing entry */ for (i = 0; i < UCSI_PRINT_MAX_RESOURCES; i++) { if (element == m.rList[i].element) { if (m.rList[i].state != state) { m.rList[i].state = state; RequestTrigger(); } return; } } /* Find empty entry and store it there */ for (i = 0; i < UCSI_PRINT_MAX_RESOURCES; i++) { if (NULL == m.rList[i].element) { m.rList[i].element = element; m.rList[i].state = state; RequestTrigger(); return; } } 
UCSIPrint_CB_OnUserMessage(m.tag, RED "UCSI-Watchdog:Could not store object state, increase UCSI_PRINT_MAX_RESOURCES" RESETCOLOR); } void UCSIPrint_UnicensActivity(void) { if (!m.initialized) return; if (0 != m.nextService) RequestTrigger(); else UCSIPrint_CB_NeedService(m.tag); } static void PrintTable(void) { uint16_t i; static char inRes[STR_RES_LEN]; static char outRes[STR_RES_LEN]; if (!m.initialized) return; if (!m.networkAvailable) return; UCSIPrint_CB_OnUserMessage(m.tag, "---------------------------------------------------------------------------------------"); UCSIPrint_CB_OnUserMessage(m.tag, " Source | Sink | Active | ID | Label | Resources"); for (i = 0; i < m.routesSize; i++) { const char *sourceAvail = " "; const char *sourceReset = ""; const char *sinkAvail = " "; const char *sinkReset = ""; const char *routeAvail = " "; const char *routeReset = ""; uint16_t srcAddr = m.pRoutes[i].source_endpoint_ptr->node_obj_ptr->signature_ptr->node_address; uint16_t snkAddr = m.pRoutes[i].sink_endpoint_ptr->node_obj_ptr->signature_ptr->node_address; uint8_t shallActive = m.pRoutes[i].active; uint16_t id = m.pRoutes[i].route_id; bool isActive = false; uint16_t label = INVALID_CON_LABEL; UCSIPrint_NodeState_t srcState = GetNodeState(srcAddr); UCSIPrint_NodeState_t snkState = GetNodeState(snkAddr); GetRouteState(id, &isActive, &label); ParseResources(m.pRoutes[i].source_endpoint_ptr->jobs_list_ptr, inRes, sizeof(inRes)); ParseResources(m.pRoutes[i].sink_endpoint_ptr->jobs_list_ptr, outRes, sizeof(outRes)); if (NodeState_Available == srcState) { sourceAvail = GREEN "^"; sourceReset = RESETCOLOR; } else if (NodeState_Ignored == srcState) { sourceAvail = RED "!"; sourceReset = RESETCOLOR; } if (NodeState_Available == snkState) { sinkAvail = GREEN "^"; sinkReset = RESETCOLOR; } else if (NodeState_Ignored == snkState) { sinkAvail = RED "!"; sinkReset = RESETCOLOR; } if (NodeState_Available == srcState && NodeState_Available == snkState) { if (shallActive == isActive) 
routeAvail = GREEN "^"; else routeAvail = RED "!"; routeReset = RESETCOLOR; } snprintf(strBuf, STR_BUF_LEN, "%s0x%03X%s | %s0x%03X%s | S:%d I:%s%d%s | 0x%04X | 0x%04X | Src:%s Snk:%s", sourceAvail, srcAddr, sourceReset, sinkAvail, snkAddr, sinkReset, shallActive, routeAvail, isActive, routeReset, id, label, inRes, outRes); UCSIPrint_CB_OnUserMessage(m.tag, strBuf); } UCSIPrint_CB_OnUserMessage(m.tag, "---------------------------------------------------------------------------------------"); if (GetIgnoredNodeString(inRes, sizeof(inRes))) { snprintf(strBuf, STR_BUF_LEN, RED "Ignored nodes = { %s }" RESETCOLOR, inRes); UCSIPrint_CB_OnUserMessage(m.tag, strBuf); UCSIPrint_CB_OnUserMessage(m.tag, "---------------------------------------------------------------------------------------"); } } static void ParseResources(Ucs_Xrm_ResObject_t **ppJobList, char *pBuf, uint32_t bufLen) { uint16_t i, j; Ucs_Xrm_ResObject_t *job; UCSIPrint_ObjectState_t oldState = ObjState_Unused; UCSIPrint_ObjectState_t newState = ObjState_Unused; assert(NULL != pBuf && 0 != bufLen); pBuf[0] = '\0'; if (NULL == ppJobList) return; for (i = 0; NULL != (job = ppJobList[i]); i++) { Ucs_Xrm_ResourceType_t typ = *((Ucs_Xrm_ResourceType_t *)job); assert(UCS_XRM_RC_TYPE_QOS_CON >= typ); /* Silently ignore default created port */ if (UCS_XRM_RC_TYPE_DC_PORT == typ) continue; for (j = 0; j < UCSI_PRINT_MAX_RESOURCES; j++) { if (NULL == m.rList[j].element) break; newState = ObjState_Unused; if (job == m.rList[j].element) { newState = m.rList[j].state; break; } } if (oldState != newState) { oldState = newState; if (ObjState_Build == newState) strcat(pBuf, GREEN); else if (ObjState_Failed == newState) strcat(pBuf, RED); else strcat(pBuf, RESETCOLOR); } if (ObjState_Build == newState) strcat(pBuf, "^"); else if (ObjState_Failed == newState) strcat(pBuf, "!"); else strcat(pBuf, " "); switch(typ) { case UCS_XRM_RC_TYPE_NW_SOCKET: strcat(pBuf, "NS"); break; case UCS_XRM_RC_TYPE_MLB_PORT: strcat(pBuf, "MP"); 
break; case UCS_XRM_RC_TYPE_MLB_SOCKET: strcat(pBuf, "MS"); break; case UCS_XRM_RC_TYPE_USB_PORT: strcat(pBuf, "UP"); break; case UCS_XRM_RC_TYPE_USB_SOCKET: strcat(pBuf, "US"); break; case UCS_XRM_RC_TYPE_STRM_PORT: strcat(pBuf, "SP"); break; case UCS_XRM_RC_TYPE_STRM_SOCKET: strcat(pBuf, "SS"); break; case UCS_XRM_RC_TYPE_SYNC_CON: strcat(pBuf, "SC"); break; case UCS_XRM_RC_TYPE_COMBINER: strcat(pBuf, "C"); break; case UCS_XRM_RC_TYPE_SPLITTER: strcat(pBuf, "S"); break; case UCS_XRM_RC_TYPE_AVP_CON: strcat(pBuf, "AC"); break; default: strcat(pBuf, "E"); break; } } if (ObjState_Unused != newState) strcat(pBuf, RESETCOLOR); assert(strlen(pBuf) < bufLen); } static bool GetIgnoredNodeString(char *pBuf, uint32_t bufLen) { uint16_t i; char pTmp[8]; bool foundNodes = false; assert(NULL != pBuf && 0 != bufLen); pBuf[0] = '\0'; /* Find existing entry */ for (i = 0; i < UCSI_PRINT_MAX_NODES; i++) { if (m.nList[i].isValid && NodeState_Ignored == m.nList[i].nodeState) { foundNodes = true; snprintf(pTmp, sizeof(pTmp), "0x%X ", m.nList[i].node); strcat(pBuf, pTmp); } } assert(strlen(pBuf) < bufLen); return foundNodes; } static UCSIPrint_NodeState_t GetNodeState(uint16_t nodeAddress) { uint16_t i; /* Find existing entry */ for (i = 0; i < UCSI_PRINT_MAX_NODES; i++) { if (m.nList[i].isValid && nodeAddress == m.nList[i].node) { return m.nList[i].nodeState; } } return NodeState_NotAvailable; } static uint8_t GetNodeCount(void) { uint16_t i; uint8_t cnt = 0; for (i = 0; i < UCSI_PRINT_MAX_NODES; i++) { if (m.nList[i].isValid && NodeState_NotAvailable != m.nList[i].nodeState) ++cnt; } return cnt; } static bool GetRouteState(uint16_t routeId, bool *pIsActive, uint16_t *pConLabel) { uint16_t i; assert(NULL != pIsActive); assert(NULL != pConLabel); /* Find existing entry (cList is sized UCSI_PRINT_MAX_RESOURCES, so iterate with that bound) */ for (i = 0; i < UCSI_PRINT_MAX_RESOURCES; i++) { if (m.cList[i].isValid && routeId == m.cList[i].routeId) { *pIsActive = m.cList[i].isActive; *pConLabel = m.cList[i].connectionLabel; return true; } } return false; } 
static void RequestTrigger(void) { m.triggerService = true; UCSIPrint_CB_NeedService(m.tag); } #else /* ENABLE_RESOURCE_PRINT */ void UCSIPrint_Init(Ucs_Rm_Route_t *pRoutes, uint16_t routesSize, void *tag) {} void UCSIPrint_Service(uint32_t timestamp) {} void UCSIPrint_SetNetworkAvailable(bool available, uint8_t maxPos) {} void UCSIPrint_SetNodeAvailable(uint16_t nodeAddress, uint16_t nodePosAddr, UCSIPrint_NodeState_t nodeState) {} void UCSIPrint_SetRouteState(uint16_t routeId, bool isActive, uint16_t connectionLabel) {} void UCSIPrint_SetObjectState(Ucs_Xrm_ResObject_t *element, UCSIPrint_ObjectState_t state) {} void UCSIPrint_UnicensActivity(void) {} #endif <file_sep>/audio-source/samv71-ucs/src/task-audio.c /*------------------------------------------------------------------------------------------------*/ /* Audio Processing Task Implementation                                                           */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries.                                */ /*                                                                                                */ /* Redistribution and use in source and binary forms, with or without                             */ /* modification, are permitted provided that the following conditions are met:                    */ /*                                                                                                */ /* 1. Redistributions of source code must retain the above copyright notice, this                 */ /*    list of conditions and the following disclaimer.                                            */ /*                                                                                                */ /* 2. Redistributions in binary form must reproduce the above copyright notice,                   */ /*    this list of conditions and the following disclaimer in the documentation                   */ /*    and/or other materials provided with the distribution.                                      */ /*                                                                                                */ /* 3. Neither the name of the copyright holder nor the names of its                               */ /*    contributors may be used to endorse or promote products derived from                        */ /*    this software without specific prior written permission.                                     */ /*                                                                                                */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"                    */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE                      */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE                 */ /* DISCLAIMED. 
IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #include <string.h> #include <assert.h> #include "Console.h" #include "dim2_lld.h" #include "task-audio.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* DEFINES AND LOCAL VARIABLES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* #define ENABLE_AUDIO_RX */ struct TaskAudioVars { bool initialized; uint32_t audioPos; }; static struct TaskAudioVars m = { 0 }; static const uint8_t audioData[] = { #include "beat_be.h" }; /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PRIVATE FUNCTION PROTOTYPES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ static bool ProcessStreamingData(const uint8_t *pRxBuf, uint32_t rxLen, uint8_t *pTxBuf, uint32_t txLen); /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PUBLIC FUNCTIONS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ bool TaskAudio_Init(void) { memset(&m, 0, sizeof(m)); assert(0 == sizeof(audioData) % 4); m.initialized = true; return true; } void TaskAudio_Service(void) { while(true) { uint8_t *pTxBuf = NULL; const uint8_t *pRxBuf = NULL; uint16_t rxLen = 0; uint16_t txLen = 0; #if ENABLE_AUDIO_RX rxLen = DIM2LLD_GetRxData(DIM2LLD_ChannelType_Sync, DIM2LLD_ChannelDirection_RX, 0, 0, &pRxBuf, NULL, NULL); if 
(0 == rxLen) break; #endif txLen = DIM2LLD_GetTxData(DIM2LLD_ChannelType_Sync, DIM2LLD_ChannelDirection_TX, 0, &pTxBuf); if (0 == txLen) break; if (ProcessStreamingData(pRxBuf, rxLen, pTxBuf, txLen)) { #if ENABLE_AUDIO_RX DIM2LLD_ReleaseRxData(DIM2LLD_ChannelType_Sync, DIM2LLD_ChannelDirection_RX, 0); #endif DIM2LLD_SendTxData(DIM2LLD_ChannelType_Sync, DIM2LLD_ChannelDirection_TX, 0, txLen); } else break; } } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PRIVATE FUNCTION IMPLEMENTATIONS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ static bool ProcessStreamingData(const uint8_t *pRxBuf, uint32_t rxLen, uint8_t *pTxBuf, uint32_t txLen) { uint32_t i; if (!m.initialized) return false; for (i = 0; i < txLen; i++) { pTxBuf[i] = audioData[m.audioPos++]; if (sizeof(audioData) <= m.audioPos) m.audioPos = 0; } return true; }<file_sep>/audio-source/samv71-ucs/src/driver/dim2/hal/dim2_reg.h /* * dim2_reg.h - Definitions for registers of DIM2 * (MediaLB, Device Interface Macro IP, OS62420) * * Copyright (C) 2015, Microchip Technology Germany II GmbH & Co. KG * * This program is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * This file is licensed under GPLv2. 
*/ #ifndef DIM2_OS62420_H #define DIM2_OS62420_H #include <stdint.h> #ifdef __cplusplus extern "C" { #endif struct dim2_regs { /* 0x00 */ uint32_t MLBC0; /* 0x01 */ uint32_t rsvd0[1]; /* 0x02 */ uint32_t MLBPC0; /* 0x03 */ uint32_t MS0; /* 0x04 */ uint32_t rsvd1[1]; /* 0x05 */ uint32_t MS1; /* 0x06 */ uint32_t rsvd2[2]; /* 0x08 */ uint32_t MSS; /* 0x09 */ uint32_t MSD; /* 0x0A */ uint32_t rsvd3[1]; /* 0x0B */ uint32_t MIEN; /* 0x0C */ uint32_t rsvd4[1]; /* 0x0D */ uint32_t MLBPC2; /* 0x0E */ uint32_t MLBPC1; /* 0x0F */ uint32_t MLBC1; /* 0x10 */ uint32_t rsvd5[0x10]; /* 0x20 */ uint32_t HCTL; /* 0x21 */ uint32_t rsvd6[1]; /* 0x22 */ uint32_t HCMR0; /* 0x23 */ uint32_t HCMR1; /* 0x24 */ uint32_t HCER0; /* 0x25 */ uint32_t HCER1; /* 0x26 */ uint32_t HCBR0; /* 0x27 */ uint32_t HCBR1; /* 0x28 */ uint32_t rsvd7[8]; /* 0x30 */ uint32_t MDAT0; /* 0x31 */ uint32_t MDAT1; /* 0x32 */ uint32_t MDAT2; /* 0x33 */ uint32_t MDAT3; /* 0x34 */ uint32_t MDWE0; /* 0x35 */ uint32_t MDWE1; /* 0x36 */ uint32_t MDWE2; /* 0x37 */ uint32_t MDWE3; /* 0x38 */ uint32_t MCTL; /* 0x39 */ uint32_t MADR; /* 0x3A */ uint32_t rsvd8[0xB6]; /* 0xF0 */ uint32_t ACTL; /* 0xF1 */ uint32_t rsvd9[3]; /* 0xF4 */ uint32_t ACSR0; /* 0xF5 */ uint32_t ACSR1; /* 0xF6 */ uint32_t ACMR0; /* 0xF7 */ uint32_t ACMR1; }; #define DIM2_MASK(n) (~((~(uint32_t)0) << (n))) enum { MLBC0_MLBLK_BIT = 7, MLBC0_MLBPEN_BIT = 5, MLBC0_MLBCLK_SHIFT = 2, MLBC0_MLBCLK_VAL_256FS = 0, MLBC0_MLBCLK_VAL_512FS = 1, MLBC0_MLBCLK_VAL_1024FS = 2, MLBC0_MLBCLK_VAL_2048FS = 3, MLBC0_FCNT_SHIFT = 15, MLBC0_FCNT_MASK = 7, MLBC0_FCNT_MAX_VAL = 6, MLBC0_MLBEN_BIT = 0, MIEN_CTX_BREAK_BIT = 29, MIEN_CTX_PE_BIT = 28, MIEN_CTX_DONE_BIT = 27, MIEN_CRX_BREAK_BIT = 26, MIEN_CRX_PE_BIT = 25, MIEN_CRX_DONE_BIT = 24, MIEN_ATX_BREAK_BIT = 22, MIEN_ATX_PE_BIT = 21, MIEN_ATX_DONE_BIT = 20, MIEN_ARX_BREAK_BIT = 19, MIEN_ARX_PE_BIT = 18, MIEN_ARX_DONE_BIT = 17, MIEN_SYNC_PE_BIT = 16, MIEN_ISOC_BUFO_BIT = 1, MIEN_ISOC_PE_BIT = 0, MLBC1_NDA_SHIFT = 8, 
MLBC1_NDA_MASK = 0xFF, MLBC1_CLKMERR_BIT = 7, MLBC1_LOCKERR_BIT = 6, ACTL_DMA_MODE_BIT = 2, ACTL_DMA_MODE_VAL_DMA_MODE_0 = 0, ACTL_DMA_MODE_VAL_DMA_MODE_1 = 1, ACTL_SCE_BIT = 0, HCTL_EN_BIT = 15 }; enum { CDT0_RPC_SHIFT = 16 + 11, CDT0_RPC_MASK = DIM2_MASK(5), CDT1_BS_ISOC_SHIFT = 0, CDT1_BS_ISOC_MASK = DIM2_MASK(9), CDT3_BD_SHIFT = 0, CDT3_BD_MASK = DIM2_MASK(12), CDT3_BD_ISOC_MASK = DIM2_MASK(13), CDT3_BA_SHIFT = 16, ADT0_CE_BIT = 15, ADT0_LE_BIT = 14, ADT0_PG_BIT = 13, ADT1_RDY_BIT = 15, ADT1_DNE_BIT = 14, ADT1_ERR_BIT = 13, ADT1_PS_BIT = 12, ADT1_MEP_BIT = 11, ADT1_BD_SHIFT = 0, ADT1_CTRL_ASYNC_BD_MASK = DIM2_MASK(11), ADT1_ISOC_SYNC_BD_MASK = DIM2_MASK(13), CAT_MFE_BIT = 14, CAT_MT_BIT = 13, CAT_RNW_BIT = 12, CAT_CE_BIT = 11, CAT_CT_SHIFT = 8, CAT_CT_VAL_SYNC = 0, CAT_CT_VAL_CONTROL = 1, CAT_CT_VAL_ASYNC = 2, CAT_CT_VAL_ISOC = 3, CAT_CL_SHIFT = 0, CAT_CL_MASK = DIM2_MASK(6) }; #ifdef __cplusplus } #endif #endif /* DIM2_OS62420_H */ <file_sep>/audio-source/samv71-ucs/src/main.cpp /*------------------------------------------------------------------------------------------------*/ /* UNICENS Daemon (unicensd) main-loop */ /* Copyright 2018, Microchip Technology Inc. and its subsidiaries. */ /* */ /* Redistribution and use in source and binary forms, with or without */ /* modification, are permitted provided that the following conditions are met: */ /* */ /* 1. Redistributions of source code must retain the above copyright notice, this */ /* list of conditions and the following disclaimer. */ /* */ /* 2. Redistributions in binary form must reproduce the above copyright notice, */ /* this list of conditions and the following disclaimer in the documentation */ /* and/or other materials provided with the distribution. */ /* */ /* 3. Neither the name of the copyright holder nor the names of its */ /* contributors may be used to endorse or promote products derived from */ /* this software without specific prior written permission. 
*/ /* */ /* THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" */ /* AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE */ /* IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE */ /* DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE */ /* FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL */ /* DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR */ /* SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER */ /* CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, */ /* OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE */ /* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE. */ /*------------------------------------------------------------------------------------------------*/ #include <stdio.h> #include <stdint.h> #include <stdbool.h> #include <stdlib.h> #include <string.h> #include "board_init.h" #include "Console.h" #include "task-unicens.h" #include "task-audio.h" /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* USER ADJUSTABLE */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* UNICENS daemon version number */ #define UNICENSD_VERSION ("V4.3.0") /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* DEFINES AND LOCAL VARIABLES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ typedef struct { uint32_t lastToggle; bool consoleTrigger; bool gmacSendInProgress; } LocalVar_t; static LocalVar_t m; /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PRIVATE FUNCTION PROTOTYPES */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ static void GmacTransferCallback(uint32_t status, void *pTag); /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* PUBLIC 
FUNCTIONS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ int main() { Board_Init(); memset(&m, 0, sizeof(LocalVar_t)); ConsoleInit(); ConsoleSetPrio(PRIO_HIGH); ConsolePrintf(PRIO_HIGH, BLUE "------|V71 UNICENS sample start %s (BUILD %s %s)|------" RESETCOLOR "\r\n", UNICENSD_VERSION, __DATE__, __TIME__); if (!TaskUnicens_Init()) ConsolePrintf(PRIO_ERROR, RED "Init of Task UNICENS Init Failed" RESETCOLOR "\r\n"); if (!TaskAudio_Init()) ConsolePrintf(PRIO_ERROR, RED "Init of Task Audio Failed" RESETCOLOR "\r\n"); while (1) { uint32_t now = GetTicks(); TaskUnicens_Service(); TaskAudio_Service(); if (m.consoleTrigger) { m.consoleTrigger = false; ConsoleService(); } if (now - m.lastToggle >= 333) { m.lastToggle = now; LED_Toggle(0); } } return 0; } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK FUNCTION FROM TASK UNICENS */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ void TaskUnicens_CB_OnRouteResult(uint16_t routeId, bool isActive, uint16_t connectionLabel) { } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK FUNCTIONS FROM CONSOLE */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ void ConsoleCB_OnServiceNeeded(void) { m.consoleTrigger = true; } bool ConsoleCB_SendDatagram( uint8_t *pEthHeader, uint32_t ethLen, uint8_t *pPayload, uint32_t payloadLen ) { sGmacSGList sgl; sGmacSG sg[2]; if (m.gmacSendInProgress) return false; sg[0].pBuffer = pEthHeader; sg[0].size = ethLen; sg[1].pBuffer = pPayload; sg[1].size = payloadLen; sgl.sg = sg; sgl.len = 2; if (GMACD_OK != GMACD_SendSG(&gGmacd, &sgl, GmacTransferCallback, NULL, GMAC_QUE_0)) return false; m.gmacSendInProgress = true; return true; } /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ /* CALLBACK FUNCTIONS FROM GMAC */ /*>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>*/ static void 
GmacTransferCallback(uint32_t status, void *pTag) { m.gmacSendInProgress = false; }
a8600a88163056dad0e42bfee974bd6f58f77d69
[ "Markdown", "C", "C++" ]
29
C
MicrochipTech/unicens-bare-metal-sam-v71
92719394f89bc86a1fe2b2f56dfe090938dbc4a3
f7b3019fa48f4be3726292b058b77708724bac7b
refs/heads/master
<file_sep>import React, { useState,useEffect} from "react"; import axios from "axios"; import { Panel, Button, Notification } from "rsuite"; const Login = (props) => { const [email, setEmail] = useState("<EMAIL>"); const [pass, setPass] = useState("<PASSWORD>"); const [getToken, setGetToken] = useState(false); useEffect(()=>{ const loginHandled = async () => { const login = await axios.post("https://andrey1997acer.codes/api/login", { email: email, password: pass, }); if (login.data.status) { props.setToken(login.data.token) props.setRouter('transaction') } else { props.setRouter('login') Notification["error"]({ title: "Aviso", description: `Email or password incorrect`, duration: 3000, }); } }; if(getToken){ loginHandled() } },[getToken]) return ( <> <h1 className="text-center">Login</h1> <div className="container m-4"> <Panel bordered style={{ background: "#F1F3F4" }}> <div className="form-row "> <div className="form-group col-md-12"> <h5 className="text-center">Email</h5> <input type="text" className="form-control" onChange={(valor) => setEmail(valor.target.value)} value={email} /> </div> <div className="form-group col-md-12"> <h5 className="text-center">Password</h5> <input type="password" className="form-control" onChange={(valor) => setPass(valor.target.value)} value={pass} /> </div> <div className="form-group col-md-12"> <Button onClick={()=>{ setGetToken(true) }} alt="Iniciar Sesión" appearance="primary" style={{ float: "right" }} > {" "} Iniciar Sesión </Button> </div> </div> </Panel> </div> </> ); }; export default Login; <file_sep>import React, { useState } from "react"; import logo from "./logo.svg"; import "./App.css"; import Button from "./components/Buttons/Button"; import "rsuite/dist/styles/rsuite-default.css"; import { Notification } from "rsuite"; import Login from "./components/Login"; import Transaction from "./components/Transactions"; import CreditCard from "./components/CreditCard"; function App() { const [router, setRouter] = 
useState('login'); const [token, setToken] = useState(""); const [fingerprint, setFingerprint] = useState(""); const [transaction, setTransaction] = useState({}); console.log(`Valor del token desde app ${token}`) return ( <div className="App container"> {router === 'login' ? <Login setRouter={setRouter} setToken={setToken} /> : router ==='transaction'?<Transaction setTransaction={setTransaction} token={token} setRouter={setRouter} setFingerprint={setFingerprint}/> : <CreditCard transaction={transaction} setRouter={setRouter}/>} </div> ); } export default App; <file_sep>import React, { useState } from "react"; import {Button} from 'rsuite' const ButtonSesion = (props) => { const [isMen, setIsMen] = useState(props.ismen); const onClickHandled = () => { props.funcion(); } return ( <Button onClick={onClickHandled} color={props.color}>{props.name}</Button> ); }; export default ButtonSesion; <file_sep>import React, { useState } from "react"; import { Panel, Button, Notification } from "rsuite"; import axios from "axios"; import crypto from 'crypto-js' const Transaction = (props) => { const [x_amount, setx_amount] = useState(""); const [x_invoice_num, setx_invoice_num] = useState(""); const [x_fp_sequence, setx_fp_sequence] = useState(""); const [x_fp_timestamp, setx_fp_timestamp] = useState(""); const [x_fp_hash, setx_fp_hash] = useState(""); const [x_test_request, setx_test_request] = useState(false); const [x_show_form, setx_show_form] = useState("PAYMENT_FORM"); const [x_currency_code, setX_currency_code] = useState('CRC'); const [articulo, setArticulo] = useState(''); const [isGenerado, setIsGenerado] = useState(false); console.log(`Los datos desde transaccion ${props.token}`) const pagar = async () => { if (x_amount && isGenerado) { const cadena = props.token + "^" + x_fp_sequence + "^" + x_fp_timestamp + "^" + x_amount + "^"; const hmac = crypto.HmacMD5(cadena, 'transactionKey'); const transaccion = await axios.post("https://andrey1997acer.codes/api/trasaction",
{ token:props.token, x_fp_sequence:x_fp_sequence, x_fp_timestamp:x_fp_timestamp, x_amount:x_amount, fingerprint:hmac.toString() }); if(transaccion){ if(transaccion.data){ props.setTransaction(transaccion.data); // props.setFingerprint(transaccion.data.fingerprint); props.setRouter('creditcard'); } } } }; const getX_fp_timestamp = async () => { const fecha = await axios.get("https://andrey1997acer.codes/api/date"); if(fecha){ setx_fp_timestamp(fecha.data.fecha) } console.log(`La fecha del backend ${fecha.data.fecha}`) }; const validarPago = ()=>{ if(!articulo || !x_amount || !x_currency_code || !x_fp_timestamp){ return true; }else{ return false; } } function generar() { getX_fp_sequence() getX_invoice_num() getX_fp_timestamp() setIsGenerado(true) } function getX_fp_sequence() { var caracteres = "123456789"; var contrasenna = ""; var i = 0; for (i = 0; i < 20; i++) contrasenna += caracteres.charAt(Math.floor(Math.random() * caracteres.length)); setx_fp_sequence(contrasenna) } function getX_invoice_num() { var caracteres = "abcdefghijkmnpqrtuvwxyzABCDEFGHJKMNPQRTUVWXYZ2346789"; var contraseña = ""; var i = 0; for (i = 0; i < 20; i++) contraseña += caracteres.charAt(Math.floor(Math.random() * caracteres.length)); console.log(contraseña) setx_invoice_num(contraseña) } return ( <> <h1 className="text-center">Transacción</h1> <Panel bordered style={{ background: "#F1F3F4" }}> <div className="form-row"> <div className="form-group col-md-6"> <h5 className="text-center">Articulo</h5> <input type="text" className="form-control" onChange={(e)=>setArticulo(e.target.value)} value={articulo} /> </div> <div className="form-group col-md-6"> <h5 className="text-center">Monto De La Transacción</h5> <input type="number" className="form-control" onChange={(valor) => setx_amount(valor.target.value)} value={x_amount} /> </div> <div className="form-group col-md-6"> <h5 className="text-center">Moneda</h5> <select className="form-control" value={x_currency_code} 
onChange={(e)=>setX_currency_code(e.target.value)}> <option key="100" value="100">CRC</option> <option key="101" value="101">USD</option> <option key="102" value="102">MXN</option> <option key="103" value="103" >EUR</option> </select> </div> <div className="form-group col-md-6"> <h5 className="text-center">Número De Factura</h5> <input type="text" className="form-control" disabled={true} //onChange={(valor) => setMonto(valor.target.value)} defaultValue={x_invoice_num} /> </div> <div className="form-group col-md-6"> <h5 className="text-center">Número Secuencial</h5> <input type="text" className="form-control" disabled={true} // onChange={(valor) => setMonto(valor.target.value)} defaultValue={x_fp_sequence} /> </div> <div className="form-group col-md-6"> <h5 className="text-center">Fecha</h5> <input type="text" className="form-control" disabled={true} //onChange={(valor) => setMonto(valor.target.value)} defaultValue={x_fp_timestamp} /> </div> <Button onClick={() => { generar() }} className="mr-3" alt="Generar Codigo" appearance="primary" style={{ float: "right" }} > {" "} Generar Código </Button> <Button onClick={() => { pagar() }} disabled={validarPago()} alt="realizar pago" appearance="primary" color="green" style={{ float:'right' }} > {" "} Siguiente </Button> </div> </Panel> </> ); } export default Transaction;
0e9cf4e6a368efca12dad7a04ff61c36b97ea378
[ "JavaScript" ]
4
JavaScript
andrey1997acer/cliente_app_bancaria
493a8f6ecb3b59be2b249257a8498bef0d025fac
ed73175aa52e5981fce768873f8d8b6c141dae9e
refs/heads/master
<file_sep># Hadoop_Session10Project1 MapReduce programs for crime data analysis <file_sep>package com.bigdata.main; import java.io.IOException; import java.text.ParseException; import java.text.SimpleDateFormat; import java.util.Calendar; import java.util.Date; import java.util.GregorianCalendar; import org.apache.hadoop.conf.Configuration; import org.apache.hadoop.fs.Path; import org.apache.hadoop.io.IntWritable; import org.apache.hadoop.io.LongWritable; import org.apache.hadoop.io.Text; import org.apache.hadoop.mapreduce.Job; import org.apache.hadoop.mapreduce.Mapper; import org.apache.hadoop.mapreduce.Reducer; import org.apache.hadoop.mapreduce.lib.input.FileInputFormat; import org.apache.hadoop.mapreduce.lib.input.TextInputFormat; import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat; import org.apache.hadoop.mapreduce.lib.output.TextOutputFormat; import org.apache.log4j.Logger; public class CasesByDateRange { private static Logger logger = Logger.getLogger(CasesByDateRange.class); private static Long START_DATE = new GregorianCalendar(2014,Calendar.OCTOBER,1).getTimeInMillis(); private static Long END_DATE = new GregorianCalendar(2015,Calendar.OCTOBER,31).getTimeInMillis(); public static class CasesMapper extends Mapper<LongWritable, Text, Text, IntWritable> { private final static IntWritable one = new IntWritable(1); private Text dateText = new Text(); SimpleDateFormat sdf = new SimpleDateFormat("MM/dd/yyyy hh:mm:ss a"); public void map(LongWritable key, Text value, Context context) throws IOException, InterruptedException{ String record = value.toString(); String[] columnValues = record.split(",(?=([^\"]*\"[^\"]*\")*[^\"]*$)",-1); if(columnValues.length>15){ try { String dateStr = columnValues[2]; Date date = sdf.parse(dateStr); Long dateInMisslis = date.getTime(); if(dateInMisslis>START_DATE && dateInMisslis<END_DATE){ dateText.set("inRange"); context.write(dateText,one); } } catch (ParseException e) { // TODO Auto-generated catch block
e.printStackTrace(); } } } } public static class CaseReducer extends Reducer<Text,IntWritable,Text,IntWritable>{ public void reduce(Text fbiCode,Iterable<IntWritable> values,Context context) throws IOException, InterruptedException{ int totalCases = 0; for (IntWritable val : values) { totalCases = totalCases + val.get(); } context.write(fbiCode, new IntWritable(totalCases)); } } public static void main(String[] args) throws IOException, ClassNotFoundException, InterruptedException { Configuration config = new Configuration(); Job job = new Job(config,"CasesByDateRange"); job.setJarByClass(CasesByDateRange.class); job.setOutputKeyClass(Text.class); job.setOutputValueClass(IntWritable.class); job.setMapperClass(CasesMapper.class); job.setCombinerClass(CaseReducer.class); job.setReducerClass(CaseReducer.class); job.setInputFormatClass(TextInputFormat.class); job.setOutputFormatClass(TextOutputFormat.class); FileInputFormat.addInputPath(job, new Path(args[0])); FileOutputFormat.setOutputPath(job,new Path(args[1])); job.waitForCompletion(true); } }
e71c52092f04037dc47c6a527f9c2883135b6184
[ "Markdown", "Java" ]
2
Markdown
bharanikumar-ab/Hadoop_Session10Project1
79604a819374c3ed677fd39088232c3f56e0c504
62ff17406b4752f2449c4458858f401082e7f695
refs/heads/master
<file_sep>import React from 'react' import { createStore, combineReducers, applyMiddleware } from 'redux' import { Provider } from 'react-redux' import { composeWithDevTools } from 'redux-devtools-extension' import thunk from 'redux-thunk' // reducers import countReducer from './Reducers/countReducer' import modalReducer from './Reducers/modalReducer' import productReducer from './Reducers/productReducer' // Components import Counter from './Component/Counter' import Modal from './Component/Modal' import Products from './Component/Products' const Middlewares = [thunk] export default function App() { const Store = createStore( combineReducers({ countState: countReducer, modalState: modalReducer, productState: productReducer }), composeWithDevTools(applyMiddleware(...Middlewares)) ) return ( <Provider store={Store}> <Counter /> <Modal /> <Products /> </Provider> ) } <file_sep> import { DECREACE, INCREASE, RESET} from '../Actions/ActionsTypes' const defaultState= { count :0, name: 'Youssef' } const reducer = (state = defaultState, action)=>{ switch(action.type){ case INCREASE: return {...state, count: ++state.count } case DECREACE: return {...state, count: --state.count } case RESET: return {...state, count: 0 } default: return state } } export default reducer;<file_sep> import React from 'react' import PropTypes from 'prop-types' import { connect } from 'react-redux' import { modelOpen, modelClose } from '../Actions/ActionsCreators' function Modal({isOpen, name, text, modelClose}) { return ( <div className={`modal-overlay ${isOpen && 'isModalOpen'}`}> <div className="modal-container"> <h4>{name}</h4> <p>{text}</p> <button onClick={modelClose} className="btn" > Close </button> </div> </div> ) } Modal.propTypes = { isOpen: PropTypes.bool.isRequired, name: PropTypes.string.isRequired, text: PropTypes.string.isRequired } export default connect(( {modalState:{isOpen, name, text}} )=> ( {isOpen, name, text} ), { modelOpen, modelClose })(Modal)<file_sep>import React from 
'react'; import PropTypes from 'prop-types'; import {connect} from 'react-redux'; import {decrease, increase, reset, modelOpen} from '../Actions/ActionsCreators'; const Counter = ({ count, name, decrease, increase, reset}) => { return ( <div className="container"> <div> <h1 className="section-title">Counter</h1> <h2>{name}</h2> <p className="counter">{count}</p> <div className="buttons"> <button className="btn" onClick={decrease}> decrease </button> <button className="btn" onClick={reset}> reset </button> <button className="btn" onClick={increase}> increase </button> </div> </div> </div> ); }; Counter.propTypes = { name: PropTypes.string.isRequired, count: PropTypes.number.isRequired, }; const mapDsipatchToProps = (dispatch) => { return { increase: () => dispatch (increase ()), decrease: () => dispatch (decrease ()), reset: () => { dispatch (reset()) dispatch ( modelOpen ( 'youssef', 'Lorem ipsum dolor sit amet, consectetur adipisicing elit. Aperiam sed quis totam saepe mollitia tempore, modi eveniet repellat! Odio, non!' ) ); } }; }; export default connect ( ({countState:{name, count}})=>({ name, count }), mapDsipatchToProps) (Counter); <file_sep> import { LOADING, GET_PRODUCT } from '../Actions/ActionsTypes' const defaultState = { loading: false, products: [] } const reducer = (state=defaultState, action)=>{ if(action.type === LOADING){ return { ...state, loading: true, products:[] } } if(action.type === GET_PRODUCT){ return { ...state, loading: false, products: action.payload } } return state } export default reducer
f5f1eb3754c1b3afcfa976ec7fdaad4aec0cd3bb
[ "JavaScript" ]
5
JavaScript
YoussefRashad/Redux-Count-and-Get-Products-App
9718ab3b9b0ee7f6dc7c87aba454f41b83bf3116
95757ae4a39b4f07a2020b4b7ccf687d4d410a5a
refs/heads/master
<file_sep>[/Script/EngineSettings.GeneralProjectSettings] ProjectID=A28971BA4B36F87E36D7969AEC640673 <file_sep>// Fill out your copyright notice in the Description page of Project Settings. #include "FPSProject.h" #include "FPSProjectGameModeBase.h" void AFPSProjectGameModeBase::StartPlay() { Super::StartPlay(); if (GEngine) { // Display the debug message for 5 seconds // A key (first argument) value of -1 indicates that this message never needs to be updated or refreshed. GEngine->AddOnScreenDebugMessage( -1, 5.0f, FColor::Yellow, TEXT("Hello World. this is FPSGameMode!") ); } } <file_sep>[/Script/EngineSettings.GeneralProjectSettings] ProjectID=4000EFDE46AE4560BDA2338EC3D9DB8B <file_sep>// Fill out your copyright notice in the Description page of Project Settings. #include "CStart.h" #include "CStartGameModeBase.h" <file_sep>[/Script/EngineSettings.GeneralProjectSettings] ProjectID=5DA1DEA1445B423BFCD39E95D5608AD3 ProjectName=ferson BP Game Template <file_sep>[/Script/EngineSettings.GeneralProjectSettings] ProjectID=7B347D3440EE7E934D98B1B1F1EBB39B <file_sep>// Fill out your copyright notice in the Description page of Project Settings. #include "CStart.h" IMPLEMENT_PRIMARY_GAME_MODULE( FDefaultGameModuleImpl, CStart, "CStart" );
3fedff72b287c0a52f22e1ecbbc3e5da88cce556
[ "C++", "INI" ]
7
INI
darker826/VR-Unreal-Study
1cc92fbae93e762e44c9e127399950a72b56eac0
5969cffa93166771ca061b9fefb5ac9b097ec861
refs/heads/master
<repo_name>traings4U/SharedPreference<file_sep>/app/src/main/java/com/mani/mysharedpreferencedemo/MainActivity.kt package com.mani.mysharedpreferencedemo import android.content.Context import androidx.appcompat.app.AppCompatActivity import android.os.Bundle import android.widget.EditText import android.widget.TextView import org.w3c.dom.Text class MainActivity : AppCompatActivity() { var etFirstName : EditText?=null var etLastName : EditText?=null override fun onCreate(savedInstanceState: Bundle?) { super.onCreate(savedInstanceState) setContentView(R.layout.activity_main) etFirstName = findViewById(R.id.etFirstName) etLastName = findViewById(R.id.etLastName) getReterieveDataFromSharedPreference() findViewById<TextView>(R.id.tvSave).setOnClickListener { val firstName = etFirstName!!.text.toString() val lastName = etLastName!!.text.toString() val sharedPreference = getSharedPreferences("MyPrefName", Context.MODE_PRIVATE) val editor = sharedPreference.edit() editor.putString("FNAME",firstName) editor.putString("LNAME",lastName) editor.apply() } findViewById<TextView>(R.id.tvClear).setOnClickListener { val sharedPreference = getSharedPreferences("MyPrefName", Context.MODE_PRIVATE) val editor = sharedPreference.edit() editor.clear() editor.apply() getReterieveDataFromSharedPreference() } } private fun getReterieveDataFromSharedPreference() { val sharedPreference = getSharedPreferences("MyPrefName", Context.MODE_PRIVATE) val fName = sharedPreference.getString("FNAME","") val lName = sharedPreference.getString("LNAME","") if(!fName!!.isEmpty() && !lName!!.isEmpty()) { etFirstName!!.setText(""+fName) etLastName!!.setText(""+lName) } else { etFirstName!!.setText("") etLastName!!.setText("") } } }
46876ce3d1933a229cdf69f4bffad2dc86fe0da5
[ "Kotlin" ]
1
Kotlin
traings4U/SharedPreference
799cc6d72aaac584db70f133e68059679f6dd6af
94f0a173bee251da2b89aca65bb5690bd5c11a36
refs/heads/master
<repo_name>kaustubhtoraskar/CodeIT<file_sep>/app/src/main/java/svkt/wallet/activities/WalletStatement.java package svkt.wallet.activities; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.net.ConnectivityManager; import android.net.NetworkInfo; import android.support.v7.app.AlertDialog; import android.support.v7.app.AppCompatActivity; import android.os.Bundle; import android.view.Menu; import android.view.MenuItem; import android.widget.TextView; import android.widget.Toast; import com.google.firebase.auth.FirebaseAuth; import com.google.firebase.auth.FirebaseUser; import com.google.firebase.database.DataSnapshot; import com.google.firebase.database.DatabaseError; import com.google.firebase.database.DatabaseReference; import com.google.firebase.database.FirebaseDatabase; import com.google.firebase.database.ValueEventListener; import svkt.wallet.R; import svkt.wallet.models.Transaction; import svkt.wallet.models.User; public class WalletStatement extends AppCompatActivity { TextView totalAmount,name,cardno; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_wallet_statement); totalAmount = findViewById(R.id.totalAmount); name = findViewById(R.id.name); cardno = findViewById(R.id.cardno); passParams(); } public void passParams(){ FirebaseUser currentUser = FirebaseAuth.getInstance().getCurrentUser(); DatabaseReference dbRef = FirebaseDatabase.getInstance().getReference().child("users").child(currentUser.getUid()); dbRef.addValueEventListener(new ValueEventListener() { @Override public void onDataChange(DataSnapshot dataSnapshot) { User current = dataSnapshot.getValue(User.class); totalAmount.setText(getString(R.string.Rs)+ current.balance); name.setText(current.name); cardno.setText(current.cardNo); } @Override public void onCancelled(DatabaseError databaseError) { } }); } @Override public boolean 
onCreateOptionsMenu(Menu menu) { // Inflate the menu; this adds items to the action bar if it is present. getMenuInflater().inflate(R.menu.menu_passbook, menu); return true; } @Override public boolean onOptionsItemSelected(MenuItem item) { // Handle action bar item clicks here. The action bar will // automatically handle clicks on the Home/Up button, so long // as you specify a parent activity in AndroidManifest.xml. int id = item.getItemId(); switch (id) { case R.id.action_chat: startActivity(new Intent(WalletStatement.this,ChatActivity.class)); break; case R.id.action_passbook : startActivity(new Intent(WalletStatement.this,PassbookActivity.class)); break; case R.id.action_statement : startActivity(new Intent(WalletStatement.this,WalletStatement.class)); break; case R.id.action_logout : signOutDialog(); break; case R.id.action_transfer : startActivity(new Intent(WalletStatement.this,TransactionActivity.class)); break; } return super.onOptionsItemSelected(item); } private void signOutDialog() { final AlertDialog.Builder builder=new AlertDialog.Builder(WalletStatement.this); builder.setMessage("Do you want to Sign Out").setTitle("Sign Out"); builder.setPositiveButton("Yes", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { try { if(isInternetConnected()) { FirebaseAuth.getInstance().signOut(); finish(); startActivity(new Intent(WalletStatement.this,LoginActivity.class)); } } catch (Exception e) { Toast.makeText(WalletStatement.this,R.string.error,Toast.LENGTH_SHORT).show(); } } }); builder.setNegativeButton("No", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { } }); AlertDialog dialog=builder.create(); dialog.show(); } private boolean isInternetConnected() { ConnectivityManager connectivityManager=(ConnectivityManager)this.getSystemService(Context.CONNECTIVITY_SERVICE); NetworkInfo networkInfo=connectivityManager.getActiveNetworkInfo(); 
if(networkInfo==null || !networkInfo.isConnected() || !networkInfo.isAvailable()) { Toast.makeText(WalletStatement.this,"No Internet Connectivity",Toast.LENGTH_LONG).show(); return false; } return true; } } <file_sep>/app/src/main/java/svkt/wallet/activities/LoginActivity.java package svkt.wallet.activities; import android.app.ProgressDialog; import android.content.Intent; import android.os.Bundle; import android.support.annotation.NonNull; import android.support.design.widget.TextInputEditText; import android.support.v7.app.AppCompatActivity; import android.text.TextUtils; import android.util.Log; import android.view.View; import android.widget.Button; import android.widget.Toast; import com.google.android.gms.tasks.OnCompleteListener; import com.google.android.gms.tasks.Task; import com.google.firebase.auth.AuthResult; import com.google.firebase.auth.FirebaseAuth; import com.google.firebase.auth.FirebaseUser; import com.google.firebase.database.DataSnapshot; import com.google.firebase.database.DatabaseError; import com.google.firebase.database.DatabaseReference; import com.google.firebase.database.FirebaseDatabase; import com.google.firebase.database.ValueEventListener; import svkt.wallet.R; import svkt.wallet.models.User; public class LoginActivity extends AppCompatActivity { private static final String TAG = "LoginActivity"; private TextInputEditText emailEdit,passwordEdit; private Button loginButton,registerButton; private String email,password; private FirebaseAuth firebaseAuth; private FirebaseAuth.AuthStateListener authStateListener; private FirebaseUser firebaseUser; private ProgressDialog progressDialog; private DatabaseReference databaseReference; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_login); emailEdit = findViewById(R.id.email); passwordEdit = findViewById(R.id.password); loginButton = findViewById(R.id.loginButton); registerButton = 
findViewById(R.id.registerBtn); firebaseAuth = FirebaseAuth.getInstance(); authStateListener=new FirebaseAuth.AuthStateListener() { @Override public void onAuthStateChanged(@NonNull FirebaseAuth firebaseAuth) { firebaseUser = firebaseAuth.getCurrentUser(); if (firebaseUser != null) // User is signed in { Log.e(TAG, "User Sign in:" + firebaseUser.getUid()); databaseReference= FirebaseDatabase.getInstance().getReference().child("users").child(firebaseUser.getUid()); startActivity(new Intent(LoginActivity.this,ChatActivity.class)); } else Log.d(TAG, "No user"); } }; loginButton.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View view) { email = emailEdit.getText().toString(); password = passwordEdit.getText().toString(); if(isValid()){ showProgressDialog(); signInAccount(); } } }); registerButton.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View view) { startActivity(new Intent(LoginActivity.this,RegisterActivity.class)); } }); } private boolean isValid(){ if(email.isEmpty() || !isEmailValid(email)){ emailEdit.setError(getString(R.string.email_error)); emailEdit.setFocusable(true); return false; } else if(password.isEmpty() || password.length() < 8){ passwordEdit.setError(getString(R.string.password_error)); passwordEdit.setFocusable(true); return false; } return true; } private boolean isEmailValid(String email){ return !TextUtils.isEmpty(email) && android.util.Patterns.EMAIL_ADDRESS.matcher(email).matches(); } public void signInAccount() { firebaseAuth.signInWithEmailAndPassword(email, password) .addOnCompleteListener(LoginActivity.this, new OnCompleteListener<AuthResult>() { @Override public void onComplete(@NonNull Task<AuthResult> task) { if(task.isSuccessful()) { Log.e(TAG, "Sign In complete:" + task.isSuccessful()); firebaseUser = firebaseAuth.getCurrentUser(); databaseReference= FirebaseDatabase.getInstance().getReference().child("users").child(firebaseUser.getUid()); retrieveData(); 
hideProgressDialog(); } else { Log.e(TAG, "signInWithEmail", task.getException()); hideProgressDialog(); Toast.makeText(LoginActivity.this, "Invalid User",Toast.LENGTH_SHORT).show(); } } }); } public void showProgressDialog() { progressDialog=new ProgressDialog(LoginActivity.this); progressDialog.setCancelable(false); progressDialog.setMessage("Loading Wallet..."); progressDialog.setProgressStyle(ProgressDialog.STYLE_SPINNER); progressDialog.show(); } public void hideProgressDialog() { progressDialog.dismiss(); } public void retrieveData() { databaseReference.addValueEventListener(new ValueEventListener() { @Override public void onDataChange(DataSnapshot dataSnapshot) { User user = dataSnapshot.getValue(User.class); Toast.makeText(LoginActivity.this,"Welcome " + user.name,Toast.LENGTH_SHORT).show(); } @Override public void onCancelled(DatabaseError databaseError) { } }); } @Override public void onStart() { super.onStart(); firebaseAuth.addAuthStateListener(authStateListener); } @Override public void onStop() { super.onStop(); firebaseAuth.removeAuthStateListener(authStateListener); } } <file_sep>/app/src/main/java/svkt/wallet/activities/TransactionActivity.java package svkt.wallet.activities; import android.app.ProgressDialog; import android.content.Context; import android.content.DialogInterface; import android.content.Intent; import android.net.ConnectivityManager; import android.net.NetworkInfo; import android.os.Bundle; import android.support.design.widget.TextInputEditText; import android.support.v7.app.AlertDialog; import android.support.v7.app.AppCompatActivity; import android.util.Log; import android.view.Menu; import android.view.MenuItem; import android.view.View; import android.widget.Button; import android.widget.TextView; import android.widget.Toast; import com.google.firebase.auth.FirebaseAuth; import com.google.firebase.auth.FirebaseUser; import com.google.firebase.database.DataSnapshot; import com.google.firebase.database.DatabaseError; import 
com.google.firebase.database.DatabaseReference; import com.google.firebase.database.FirebaseDatabase; import com.google.firebase.database.Query; import com.google.firebase.database.ValueEventListener; import java.text.DateFormat; import java.text.SimpleDateFormat; import java.util.Calendar; import java.util.HashMap; import svkt.wallet.R; import svkt.wallet.models.Transaction; import svkt.wallet.models.User; public class TransactionActivity extends AppCompatActivity { private static final String TAG = "TransactionActivity"; private TextView totalBalanceText; private TextInputEditText contactNoEdit,amountEdit; private Button transferBtn; private String phoneNo, amount; private ProgressDialog progressDialog; private FirebaseUser firebaseUser; private DatabaseReference databaseReference; private User currentUser,destUser; private String hashKey; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_transaction); getSupportActionBar().setTitle("Transfer Funds"); totalBalanceText = findViewById(R.id.totalBalance); contactNoEdit = findViewById(R.id.contactNo); amountEdit = findViewById(R.id.amount); transferBtn = findViewById(R.id.transferBtn); Bundle bundle = getIntent().getExtras(); if(bundle != null){ String phoneNo = bundle.getString("PHONE_NO"); String amount = bundle.getString("AMOUNT"); contactNoEdit.setText(phoneNo.substring(1,phoneNo.length()-1)); amountEdit.setText(amount.substring(1,amount.length()-1)); } getSelfUser(); transferBtn.setOnClickListener(new View.OnClickListener() { @Override public void onClick(View view) { phoneNo = contactNoEdit.getText().toString(); amount = amountEdit.getText().toString(); if(isValid()){ showProgressDialog("Checking if user exists..."); userExists(phoneNo); } } }); } private boolean isValid(){ if(phoneNo.isEmpty()){ contactNoEdit.setError("Enter an existing phone no. on Wallet"); contactNoEdit.setFocusable(true); return false; } else if(amount.isEmpty()){
amountEdit.setError("Please enter amount"); amountEdit.setFocusable(true); return false; } else if(Float.parseFloat(amount) > currentUser.balance){ Toast.makeText(TransactionActivity.this,"You have insufficient balance in your account",Toast.LENGTH_SHORT).show(); amountEdit.setFocusable(true); return false; } return true; } private void doTransaction(String hashKey,long destAmount,String toName){ destAmount += Long.parseLong(amount); FirebaseUser fUser = FirebaseAuth.getInstance().getCurrentUser(); currentUser.balance -= Long.parseLong(amount); DateFormat df = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss"); Calendar calobj = Calendar.getInstance(); String date = df.format(calobj.getTime()); Transaction destTransaction = new Transaction(hashKey,fUser.getUid(),toName,currentUser.name,amount,date,"received"); Transaction srcTransaction = new Transaction(hashKey,fUser.getUid(),toName,currentUser.name,amount,date,"paid"); DatabaseReference reference = FirebaseDatabase.getInstance().getReference().child("users"); reference.child(hashKey).child("balance").setValue(destAmount); reference.child(fUser.getUid()).child("balance").setValue(currentUser.balance); DatabaseReference transRefer = FirebaseDatabase.getInstance().getReference().child("transaction"); transRefer.child(fUser.getUid()).push().setValue(srcTransaction); transRefer.child(hashKey).push().setValue(destTransaction); Toast.makeText(TransactionActivity.this,"Transaction completed",Toast.LENGTH_SHORT).show(); finish(); } private void getSelfUser(){ firebaseUser = FirebaseAuth.getInstance().getCurrentUser(); databaseReference = FirebaseDatabase.getInstance().getReference().child("users").child(firebaseUser.getUid()); databaseReference.addValueEventListener(new ValueEventListener() { @Override public void onDataChange(DataSnapshot dataSnapshot) { currentUser = dataSnapshot.getValue(User.class); totalBalanceText.setText(getString(R.string.Rs) + currentUser.balance); } @Override public void onCancelled(DatabaseError
databaseError) { } }); } private void userExists(String phoneNo){ Query query = FirebaseDatabase.getInstance().getReference().child("users") .orderByChild("contactNo").equalTo(phoneNo); query.addListenerForSingleValueEvent(new ValueEventListener() { @Override public void onDataChange(DataSnapshot dataSnapshot) { Log.e(TAG,"Value = " + dataSnapshot.getValue()); if(dataSnapshot.getValue() == null){ hideProgressDialog(); Toast.makeText(TransactionActivity.this,R.string.no_user,Toast.LENGTH_SHORT).show(); } else{ hideProgressDialog(); HashMap map = (HashMap) dataSnapshot.getValue(); for ( Object key : map.keySet() ) { hashKey = (String) key; } Log.e(TAG,"key" + map.keySet()); HashMap map2 = (HashMap) map.get(hashKey); String toName = (String) map2.get("name"); long destAmount = (long) map2.get("balance"); Log.e(TAG,"Dest user balance = " + destAmount); doTransaction(hashKey,destAmount,toName); } } @Override public void onCancelled(DatabaseError databaseError) { Log.e(TAG,"Database error = " + databaseError); } }); } public void showProgressDialog(String message) { progressDialog=new ProgressDialog(TransactionActivity.this); progressDialog.setCancelable(false); progressDialog.setMessage(message); progressDialog.setProgressStyle(ProgressDialog.STYLE_SPINNER); progressDialog.show(); } public void hideProgressDialog() { progressDialog.dismiss(); } @Override public boolean onCreateOptionsMenu(Menu menu) { // Inflate the menu; this adds items to the action bar if it is present. getMenuInflater().inflate(R.menu.menu_passbook, menu); return true; } @Override public boolean onOptionsItemSelected(MenuItem item) { // Handle action bar item clicks here. The action bar will // automatically handle clicks on the Home/Up button, so long // as you specify a parent activity in AndroidManifest.xml. 
int id = item.getItemId(); switch (id) { case R.id.action_chat: startActivity(new Intent(TransactionActivity.this,ChatActivity.class)); break; case R.id.action_passbook : startActivity(new Intent(TransactionActivity.this,PassbookActivity.class)); break; case R.id.action_statement : startActivity(new Intent(TransactionActivity.this,WalletStatement.class)); break; case R.id.action_logout : signOutDialog(); break; case R.id.action_transfer : startActivity(new Intent(TransactionActivity.this,TransactionActivity.class)); break; } return super.onOptionsItemSelected(item); } private void signOutDialog() { final AlertDialog.Builder builder=new AlertDialog.Builder(TransactionActivity.this); builder.setMessage("Do you want to Sign Out").setTitle("Sign Out"); builder.setPositiveButton("Yes", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { try { if(isInternetConnected()) { FirebaseAuth.getInstance().signOut(); finish(); startActivity(new Intent(TransactionActivity.this,LoginActivity.class)); } } catch (Exception e) { Toast.makeText(TransactionActivity.this,R.string.error,Toast.LENGTH_SHORT).show(); } } }); builder.setNegativeButton("No", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { } }); AlertDialog dialog=builder.create(); dialog.show(); } private boolean isInternetConnected() { ConnectivityManager connectivityManager=(ConnectivityManager)this.getSystemService(Context.CONNECTIVITY_SERVICE); NetworkInfo networkInfo=connectivityManager.getActiveNetworkInfo(); if(networkInfo==null || !networkInfo.isConnected() || !networkInfo.isAvailable()) { Toast.makeText(TransactionActivity.this,"No Internet Connectivity",Toast.LENGTH_LONG).show(); return false; } return true; } } <file_sep>/app/src/main/java/svkt/wallet/activities/PassbookActivity.java package svkt.wallet.activities; import android.content.Context; import android.content.DialogInterface; import 
android.content.Intent; import android.net.ConnectivityManager; import android.net.NetworkInfo; import android.os.Bundle; import android.support.design.widget.TabLayout; import android.support.v4.app.Fragment; import android.support.v4.app.FragmentManager; import android.support.v4.app.FragmentPagerAdapter; import android.support.v4.view.ViewPager; import android.support.v7.app.AlertDialog; import android.support.v7.app.AppCompatActivity; import android.support.v7.widget.Toolbar; import android.view.Menu; import android.view.MenuItem; import android.widget.Toast; import com.google.firebase.auth.FirebaseAuth; import svkt.wallet.R; import svkt.wallet.fragments.AllFragment; import svkt.wallet.fragments.PaidFragment; import svkt.wallet.fragments.ReceivedFragment; import svkt.wallet.models.Transaction; public class PassbookActivity extends AppCompatActivity { /** * The {@link android.support.v4.view.PagerAdapter} that will provide * fragments for each of the sections. We use a * {@link FragmentPagerAdapter} derivative, which will keep every * loaded fragment in memory. If this becomes too memory intensive, it * may be best to switch to a * {@link android.support.v4.app.FragmentStatePagerAdapter}. */ private SectionsPagerAdapter mSectionsPagerAdapter; /** * The {@link ViewPager} that will host the section contents. */ private ViewPager mViewPager; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_passbook); Toolbar toolbar = findViewById(R.id.toolbar); setSupportActionBar(toolbar); // Create the adapter that will return a fragment for each of the three // primary sections of the activity. mSectionsPagerAdapter = new SectionsPagerAdapter(getSupportFragmentManager()); // Set up the ViewPager with the sections adapter. 
mViewPager = findViewById(R.id.container); mViewPager.setAdapter(mSectionsPagerAdapter); TabLayout tabLayout = findViewById(R.id.tabs); tabLayout.setupWithViewPager(mViewPager); } @Override public boolean onCreateOptionsMenu(Menu menu) { // Inflate the menu; this adds items to the action bar if it is present. getMenuInflater().inflate(R.menu.menu_passbook, menu); return true; } @Override public boolean onOptionsItemSelected(MenuItem item) { // Handle action bar item clicks here. The action bar will // automatically handle clicks on the Home/Up button, so long // as you specify a parent activity in AndroidManifest.xml. int id = item.getItemId(); switch (id) { case R.id.action_chat: startActivity(new Intent(PassbookActivity.this,ChatActivity.class)); break; case R.id.action_passbook : startActivity(new Intent(PassbookActivity.this,PassbookActivity.class)); break; case R.id.action_statement : startActivity(new Intent(PassbookActivity.this,WalletStatement.class)); break; case R.id.action_logout : signOutDialog(); break; case R.id.action_transfer : startActivity(new Intent(PassbookActivity.this,TransactionActivity.class)); break; } return super.onOptionsItemSelected(item); } private void signOutDialog() { final AlertDialog.Builder builder=new AlertDialog.Builder(PassbookActivity.this); builder.setMessage("Do you want to Sign Out").setTitle("Sign Out"); builder.setPositiveButton("Yes", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { try { if(isInternetConnected()) { FirebaseAuth.getInstance().signOut(); finish(); startActivity(new Intent(PassbookActivity.this,LoginActivity.class)); } } catch (Exception e) { Toast.makeText(PassbookActivity.this,R.string.error,Toast.LENGTH_SHORT).show(); } } }); builder.setNegativeButton("No", new DialogInterface.OnClickListener() { @Override public void onClick(DialogInterface dialogInterface, int i) { } }); AlertDialog dialog=builder.create(); dialog.show(); } private 
boolean isInternetConnected() { ConnectivityManager connectivityManager=(ConnectivityManager)this.getSystemService(Context.CONNECTIVITY_SERVICE); NetworkInfo networkInfo=connectivityManager.getActiveNetworkInfo(); if(networkInfo==null || !networkInfo.isConnected() || !networkInfo.isAvailable()) { Toast.makeText(PassbookActivity.this,"No Internet Connectivity",Toast.LENGTH_LONG).show(); return false; } return true; } public class SectionsPagerAdapter extends FragmentPagerAdapter { public SectionsPagerAdapter(FragmentManager fm) { super(fm); } @Override public Fragment getItem(int position) { switch(position) { case 0 : AllFragment allFragment = new AllFragment(); return allFragment; case 1 : PaidFragment paidFragment = new PaidFragment(); return paidFragment; case 2 : ReceivedFragment receivedFragment = new ReceivedFragment(); return receivedFragment; } return null; } @Override public int getCount() { // Show 3 total pages. return 3; } @Override public CharSequence getPageTitle(int position) { switch (position) { case 0: return "All"; case 1: return "Paid"; case 2: return "Received"; } return null; } } } <file_sep>/app/src/main/java/svkt/wallet/adapter/TransactionListAdapter.java package svkt.wallet.adapter; import android.content.Context; import android.support.v7.widget.RecyclerView; import android.util.Log; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import android.widget.ImageView; import android.widget.TextView; import java.util.ArrayList; import svkt.wallet.R; import svkt.wallet.models.Transaction; /** * Created by kaustubh on 04-11-2017. 
*/ public class TransactionListAdapter extends RecyclerView.Adapter<TransactionListAdapter.ViewHolder>{ Context context; ArrayList<Transaction> transactions; public TransactionListAdapter(Context context, ArrayList<Transaction> transactions){ this.context = context; this.transactions = transactions; } @Override public ViewHolder onCreateViewHolder(ViewGroup parent, int viewType) { return new ViewHolder(LayoutInflater.from(context).inflate(R.layout.list_item_transaction,parent,false)); } @Override public void onBindViewHolder(ViewHolder holder, int position) { Transaction transaction = transactions.get(position); Log.e("TransactionListAdapter",transaction.from); holder.amount.setText(context.getString(R.string.Rs) +transaction.amount); holder.transactionDate.setText(transaction.date); if(transaction.type.equals("paid")){ holder.arrowView.setImageDrawable(context.getResources().getDrawable(R.drawable.ic_arrow_upward_orange_24dp)); holder.transactionName.setText(transaction.toName); } else if(transaction.type.equals("received")){ holder.arrowView.setImageDrawable(context.getResources().getDrawable(R.drawable.ic_arrow_downward_orange_24dp)); holder.transactionName.setText(transaction.fromName); } } @Override public int getItemCount() { return transactions.size(); } class ViewHolder extends RecyclerView.ViewHolder{ TextView transactionName , transactionDate ,amount ; ImageView arrowView; ViewHolder(View itemView) { super(itemView); transactionName = itemView.findViewById(R.id.transactionName); transactionDate = itemView.findViewById(R.id.transactionDate); amount = itemView.findViewById(R.id.amount); arrowView = itemView.findViewById(R.id.arrow); } } }
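The balance arithmetic inside `TransactionActivity.doTransaction()` (debit the sender, credit the receiver by the same amount) can be isolated from Firebase so it is testable on its own. A minimal plain-Java sketch — class and method names here are hypothetical, not part of the app:

```java
// Hypothetical sketch of the transfer arithmetic in
// TransactionActivity.doTransaction(): the source balance is decreased
// and the destination balance increased by the same amount.
public class TransferSketch {

    /** Returns {newSourceBalance, newDestBalance}; rejects overdrafts. */
    static long[] transfer(long srcBalance, long destBalance, long amount) {
        if (amount <= 0 || amount > srcBalance) {
            throw new IllegalArgumentException("invalid transfer amount");
        }
        return new long[] { srcBalance - amount, destBalance + amount };
    }

    public static void main(String[] args) {
        long[] result = transfer(1000L, 200L, 300L);
        // Source drops to 700, destination rises to 500.
        System.out.println(result[0] + "," + result[1]); // prints 700,500
    }
}
```

Keeping the arithmetic in one pure method like this would also let the overdraft check in `isValid()` and the mutation in `doTransaction()` share a single code path instead of parsing the amount twice.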
0f30f36df1366d314665d93a7d9f21494979d93c
[ "Java" ]
5
Java
kaustubhtoraskar/CodeIT
ae8cce6824c5914519507d7a2bddc8f70ad2e278
1a8e178bb092d4f8982b522a1c9651ce38fa1bd1
refs/heads/master
<file_sep>package com.example.demo;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
public class DemoApplication {
    public static void main(String[] args) {
        // Bootstrap the Spring context; without this call the application never starts.
        SpringApplication.run(DemoApplication.class, args);
    }
}
<file_sep>package com.example.demo;

@FunctionalInterface
public interface pruebaFuncionEspe {
    String method(String string);
}
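Because `pruebaFuncionEspe` declares a single abstract method, it is a functional interface and can be implemented with a lambda. A self-contained sketch — the interface is redeclared locally here only so the example compiles on its own:

```java
// Demo of implementing a single-abstract-method interface with a lambda.
// The interface is mirrored locally to keep the sketch self-contained.
public class PruebaFuncionDemo {

    interface pruebaFuncionEspe {
        String method(String string);
    }

    public static void main(String[] args) {
        // The lambda body supplies the implementation of the single method.
        pruebaFuncionEspe upper = s -> s.toUpperCase();
        System.out.println(upper.method("demo")); // prints DEMO
    }
}
```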
7315dbfa90585128b7c10a238f7dfb45669ae5e7
[ "Java" ]
2
Java
lbotia/TestS4N
e5f114b6043d0d1c35e3f8d43615053cac25dc3d
5051e8e354bb15b93f55320cb883dafd632a4341
refs/heads/master
<repo_name>Crimsonfantasy/Org_Note<file_sep>/linux/shell/wget_example.sh
#!/bin/sh
# -q  quiet mode
# -O- redirect the download to stdout; the URL follows the '-'
wget -qO- 'http://localhost:8580/emm-war/emm-app/userLogin?UserId=hwacom&Password=<PASSWORD>'
<file_sep>/java/JavaSurvey/src/org/fenrir/survey/SymbolLabel.java
package org.fenrir.survey;

import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;

class SymbolLabel{
    // log4j 2 loggers are obtained via LogManager, not a static Logger.getLogger()
    static Logger cc = LogManager.getLogger(SymbolLabel.class);
    public static void main(String [] arg){
        System.out.println("lalalala");
    }
}
<file_sep>/os/note/memory_management.html <?xml version="1.0" encoding="iso-8859-1"?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd"> <html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en"> <head> <title>memory<sub>management</sub></title> <meta http-equiv="Content-Type" content="text/html;charset=iso-8859-1"/> <meta name="generator" content="Org-mode"/> <meta name="generated" content="2014-09-24 08:06:24 CST"/> <meta name="author" content="CrimsonFantasy"/> <meta name="description" content=""/> <meta name="keywords" content=""/> <style type="text/css"> <!--/*--><![CDATA[/*><!--*/ html { font-family: Times, serif; font-size: 12pt; } .title { text-align: center; } .todo { color: red; } .done { color: green; } .tag { background-color: #add8e6; font-weight:normal } .target { } .timestamp { color: #bebebe; } .timestamp-kwd { color: #5f9ea0; } p.verse { margin-left: 3% } pre { border: 1pt solid #AEBDCC; background-color: #F3F5F7; padding: 5pt; font-family: courier, monospace; font-size: 90%; overflow:auto; } table { border-collapse: collapse; } td, th { vertical-align: top; } dt { font-weight: bold; } div.figure { padding: 0.5em; } div.figure p { text-align: center; } .linenr { font-size:smaller } .code-highlighted {background-color:#ffff00;} .org-info-js_info-navigation { border-style:none; } #org-info-js_console-label { font-size:10px; font-weight:bold; white-space:nowrap; } .org-info-js_search-highlight
{background-color:#ffff00; color:#000000; font-weight:bold; } /*]]>*/--> </style> <script type="text/javascript"> <!--/*--><![CDATA[/*><!--*/ function CodeHighlightOn(elem, id) { var target = document.getElementById(id); if(null != target) { elem.cacheClassElem = elem.className; elem.cacheClassTarget = target.className; target.className = "code-highlighted"; elem.className = "code-highlighted"; } } function CodeHighlightOff(elem, id) { var target = document.getElementById(id); if(elem.cacheClassElem) elem.className = elem.cacheClassElem; if(elem.cacheClassTarget) target.className = elem.cacheClassTarget; } /*]]>*///--> </script> </head> <body> <div id="content"> <h1 class="title">memory<sub>management</sub></h1> <div id="table-of-contents"> <h2>Table of Contents</h2> <div id="text-table-of-contents"> <ul> <li><a href="#sec-1">1 Binding time </a> <ul> <li><a href="#sec-1.1">1.1 Compile time </a></li> <li><a href="#sec-1.2">1.2 memory allocation method </a></li> <li><a href="#sec-1.3">1.3 Execution Time(Dynamic Binding) </a></li> </ul> </li> <li><a href="#sec-2">2 Memory allocation </a> <ul> <li><a href="#sec-2.1">2.1 Mthod: </a></li> <li><a href="#sec-2.2">2.2 Compare four allocation method </a></li> </ul> </li> <li><a href="#sec-3">3 external fragment </a></li> <li><a href="#sec-4">4 Page Memory Management </a> <ul> <li><a href="#sec-4.1">4.1 allocation </a></li> <li><a href="#sec-4.2">4.2 adv </a></li> <li><a href="#sec-4.3">4.3 defect </a></li> </ul> </li> <li><a href="#sec-5">5 Internal fragment </a></li> <li><a href="#sec-6">6 Page Table </a></li> <li><a href="#sec-7">7 Segment Memory management </a> <ul> <li><a href="#sec-7.1">7.1 what is Segment? 
</a></li> <li><a href="#sec-7.2">7.2 ADV </a></li> <li><a href="#sec-7.3">7.3 Defect </a></li> </ul> </li> </ul> </div> </div> <div id="outline-container-1" class="outline-2"> <h2 id="sec-1"><span class="section-number-2">1</span> Binding time </h2> <div class="outline-text-2" id="text-1"> </div> <div id="outline-container-1.1" class="outline-3"> <h3 id="sec-1.1"><span class="section-number-3">1.1</span> Compile time </h3> <div class="outline-text-3" id="text-1.1"> <ul> <li> define <ul> <li> The compiler decides where in memory the program will reside. </li> </ul> </li> <li> defect <ol> <li> The initial location of the program is fixed. </li> <li> Does not support relocation. </li> </ol> </li> </ul> </div> </div> <div id="outline-container-1.2" class="outline-3"> <h3 id="sec-1.2"><span class="section-number-3">1.2</span> memory allocation method </h3> <div class="outline-text-3" id="text-1.2"> <ul> <li> The linking loader decides the location. </li> <li> adv: <ul> <li> supports relocation </li> </ul> </li> <li> defect <ul> <li> Loading modules that are not required wastes time and memory. </li> <li> Modules with a great number of functions need more time and memory when loading. </li> </ul> </li> </ul> </div> </div> <div id="outline-container-1.3" class="outline-3"> <h3 id="sec-1.3"><span class="section-number-3">1.3</span> Execution Time(Dynamic Binding) </h3> <div class="outline-text-3" id="text-1.3"> <ul> <li> The location is decided at execution time. </li> <li> Needs extra hardware support <ul> <li> A Base Register stores the base address. </li> <li> Physical address = base address + Local Address.
</li> </ul> </li> <li> adv <ul> <li> high flexibility </li> </ul> </li> <li> defect <ol> <li> bad performance </li> </ol> </li> </ul> </div> </div> </div> <div id="outline-container-2" class="outline-2"> <h2 id="sec-2"><span class="section-number-2">2</span> Memory allocation </h2> <div class="outline-text-2" id="text-2"> </div> <div id="outline-container-2.1" class="outline-3"> <h3 id="sec-2.1"><span class="section-number-3">2.1</span> Method: </h3> <div class="outline-text-3" id="text-2.1"> <ul> <li> First-Fit </li> <li> Best-Fit </li> <li> Worst-Fit </li> <li> Next-Fit </li> </ul> </div> </div> <div id="outline-container-2.2" class="outline-3"> <h3 id="sec-2.2"><span class="section-number-3">2.2</span> Compare four allocation methods </h3> <div class="outline-text-3" id="text-2.2"> <table border="2" cellspacing="0" cellpadding="6" rules="groups" frame="hsides"> <caption></caption> <colgroup><col align="left" /><col align="left" /><col align="left" /> </colgroup> <thead> <tr><th scope="col">Method</th><th scope="col">Time</th><th scope="col">Space Utility</th></tr> </thead> <tbody> <tr><td>First-Fit</td><td>fine</td><td>fine</td></tr> </tbody> <tbody> <tr><td>Best-Fit</td><td>bad</td><td>fine</td></tr> </tbody> <tbody> <tr><td>Worst-Fit</td><td>bad</td><td>bad</td></tr> </tbody> <tbody> <tr><td>Next-Fit</td><td>fine</td><td>fine</td></tr> </tbody> </table> </div> </div> </div> <div id="outline-container-3" class="outline-2"> <h2 id="sec-3"><span class="section-number-2">3</span> external fragment </h2> <div class="outline-text-2" id="text-3"> <ul> <li> In the contiguous method, even when the sum of free block sizes &gt;= the process's required size, if the free blocks are not contiguous the free space cannot be allocated to the process. </li> <li> First/best/worst/next fit all cause external fragmentation. </li> <li> solution <ul> <li> Compaction: <ul> <li> gathers free spaces into one big contiguous free space. </li> <li> defect <ul> <li> It is hard to find a strategy to gather free space in a short time.
</li> <li> needs Dynamic Binding support. </li> </ul> </li> </ul> </li> </ul> </li> </ul> </div> </div> <div id="outline-container-4" class="outline-2"> <h2 id="sec-4"><span class="section-number-2">4</span> Page Memory Management </h2> <div class="outline-text-2" id="text-4"> <ul> <li> Physical memory is divided into n equal-size blocks called Frames. </li> <li> Logical memory is divided into as many Pages as needed. </li> <li> Page size = Frame size </li> </ul> </div> <div id="outline-container-4.1" class="outline-3"> <h3 id="sec-4.1"><span class="section-number-3">4.1</span> allocation </h3> <div class="outline-text-3" id="text-4.1"> <ul> <li> The OS gives frames such that n*Frame &gt;= n*Page </li> <li> non-contiguous allocation </li> </ul> </div> </div> <div id="outline-container-4.2" class="outline-3"> <h3 id="sec-4.2"><span class="section-number-3">4.2</span> adv </h3> <div class="outline-text-3" id="text-4.2"> <ul> <li> solves external fragmentation </li> <li> supports shared memory </li> <li> supports dynamic loading and virtual memory.
</li> </ul> </div> </div> <div id="outline-container-4.3" class="outline-3"> <h3 id="sec-4.3"><span class="section-number-3">4.3</span> defect </h3> <div class="outline-text-3" id="text-4.3"> <ul> <li> internal fragmentation </li> <li> longer memory access time (extra time spent looking up the Page Table) </li> <li> requires extra hardware support (Page Table) </li> </ul> </div> </div> </div> <div id="outline-container-5" class="outline-2"> <h2 id="sec-5"><span class="section-number-2">5</span> Internal fragment </h2> <div class="outline-text-2" id="text-5"> <ul> <li> Occurs because a process's size is rarely an exact multiple of the page size, so part of the last frame is left unused. </li> </ul> </div> </div> <div id="outline-container-6" class="outline-2"> <h2 id="sec-6"><span class="section-number-2">6</span> Page Table </h2> <div class="outline-text-2" id="text-6"> </div> </div> <div id="outline-container-7" class="outline-2"> <h2 id="sec-7"><span class="section-number-2">7</span> Segment Memory management </h2> <div class="outline-text-2" id="text-7"> <ul> <li> Physical memory is NOT divided into small fixed-size blocks. </li> <li> Logical memory is divided into Segments that vary in size. </li> </ul> </div> <div id="outline-container-7.1" class="outline-3"> <h3 id="sec-7.1"><span class="section-number-3">7.1</span> what is Segment? </h3> <div class="outline-text-3" id="text-7.1"> <ul> <li> main </li> <li> subroutine </li> <li> data section </li> </ul> </div> </div> <div id="outline-container-7.2" class="outline-3"> <h3 id="sec-7.2"><span class="section-number-3">7.2</span> ADV </h3> <div class="outline-text-3" id="text-7.2"> <ol> <li> No internal fragmentation. </li> <li> supports memory sharing and protection <ul> <li> It is easier to do than in page management. </li> </ul> </li> </ol> </div> </div> <div id="outline-container-7.3" class="outline-3"> <h3 id="sec-7.3"><span class="section-number-3">7.3</span> Defect </h3> <div class="outline-text-3" id="text-7.3"> <ol> <li> External fragmentation </li> <li> Longer memory access time. </li> <li> Needs extra hardware support.
</li> </ol> </div> </div> </div> <div id="postamble"> <p class="author"> Author: CrimsonFantasy <a href="mailto:<EMAIL>">&lt;<EMAIL>&gt;</a> </p> <p class="date"> Date: 2014-09-24 08:06:24 CST</p> <p class="creator">HTML generated by org-mode 6.33x in emacs 23</p> </div> </div> </body> </html> <file_sep>/emacs/xxx.java public class ccc { int cc() { return 0; } } <file_sep>/shll/masterPiece/auto_deployee_2_remote_server_example.sh #!/bin/sh export SSHPASS=<PASSWORD> SERVER_HOST=[email protected] sshpass -e sftp -oBatchMode=no -b - $SERVER_HOST << ! cd server/ocms put $(pwd)/build/libs/ocms-rest-service-0.0.1-SNAPSHOT.jar ! sshpass -p jumbo.net ssh $SERVER_HOST << ! ## remote #### cd ./server/ocms echo "current folder :" $(pwd) # kill does not read stdin, so use xargs pgrep -f ocms | xargs kill -15 sleep 1 echo "check ocms is shutdown ?" ps -ef | grep "[o]cms" nohup java -jar ocms-rest-service-0.0.1-SNAPSHOT.jar >> ocms.log & # tail the log for 40 seconds timeout 40 tail -f ocms.log bye #remote ##### ! <file_sep>/java/jvm_monitor/threaddump_linux_jstack-continuous.sh #!/bin/sh # # Takes the JBoss PID as an argument. # # Make sure you set JAVA_HOME # # Create thread dumps a specified number of times (i.e. LOOP) and INTERVAL. # # Thread dumps will be collected in the file "jstack_threaddump.out", in the same directory from where this script is executed. # # Usage: sh ./threaddump_linux_jstack-continuous.sh <JBOSS_PID> # # Number of times to collect data. LOOP=6 # Interval in seconds between data points. INTERVAL=1 # Setting the Java Home, by giving the path where your JDK is kept JAVA_HOME=/usr/lib/jvm/java-8-oracle i=0 while [ $i -lt $LOOP ] do i=$(expr $i + 1) $JAVA_HOME/bin/jstack -l $1 >> jstack_threaddump.out echo "thread dump #" $i if [ $i -lt $LOOP ]; then echo "sleeping..."
sleep $INTERVAL fi done <file_sep>/db/oracle/ddl_in_store_procedure.sql create PROCEDURE game_record_truncate AS BEGIN -- All DDL statements in Oracle PL/SQL must go through EXECUTE IMMEDIATE, and each call can run only ONE statement (no trailing semicolon inside the string): execute immediate 'TRUNCATE TABLE BULL2_BET_CONFIRM PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE BULL2_BET_CONFIRM_DETAIL PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE BULL2_BET_RESULT PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE BULL2_BET_RESULT_DETAIL PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE BULL2_GAME PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE ROULETTE_BET_CONFIRM PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE ROULETTE_BET_CONFIRM_DETAIL PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE ROULETTE_BET_RESULT PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE ROULETTE_BET_RESULT_DETAIL PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; execute immediate 'TRUNCATE TABLE ROULETTE_GAME PRESERVE MATERIALIZED VIEW LOG REUSE STORAGE'; END; / <file_sep>/Server/SQL/export_csv.sh DBNAME=ngtms TABLE=VendorCfg FNAME=/home/crimsonfantasy/workspace2/ngtms/$(date +%Y.%m.%d)-$DBNAME.csv TEMP=/var/tmp/$(date +%Y.%m.%d)-$DBNAME.temp #(1)creates an empty file and sets up column names using the information_schema #sed 's/^/"/g;s/$/"/g' mysql -u root -phwe03 $DBNAME -B -e "SELECT COLUMN_NAME FROM information_schema.COLUMNS C WHERE table_name = '$TABLE';" | awk '{print $1}' | grep -iv ^COLUMN_NAME$ | tr '\n' ',' > $FNAME #(2)appends newline to mark beginning of data vs.
column titles echo "" >> $FNAME #(3)dumps data from DB into /var/mysql/tempfile.csv # OPTIONALLY ENCLOSED BY '\"' mysql -u root -phwe03 $DBNAME -B -e "SELECT * INTO OUTFILE '$TEMP' FIELDS TERMINATED BY ',' FROM $TABLE;" #(4)merges data file and file w/ column names cat $TEMP >> $FNAME #(5)deletes tempfile rm -rf $TEMP <file_sep>/Server/ansible/README.md # Deploy to the test server with an Ansible playbook ``` ansible-playbook game_server.yaml --limit niu1 ``` # Deploy the report server with an Ansible playbook ``` ansible-playbook --extra-vars '@testbed_passwd.yml' dpy_report.yaml --limit niu1 --vault-password-file ./testbed_secret.txt ```
dfad75ec7c1f5f5c742d6ec6d31a0c348015d403
[ "SQL", "HTML", "Markdown", "Java", "Shell" ]
9
Shell
Crimsonfantasy/Org_Note
42346d270da47bc26dc2a7e21617139106e95465
a59816950b45cbd53c62a3af82ce03096a53c81d
refs/heads/main
<file_sep>"""Get historical rates from git.""" import datetime import gspread import requests import time import xmltodict sheetkey = "<KEY>" gc = gspread.service_account() sh = gc.open_by_key(sheetkey) ws = sh.worksheet('kurzy') r = requests.get('https://ban.tipsport.cz/c/feed.php?feed=1101&pid=10661&sid=12302&tid=2161&bid=3284') data = {} main = xmltodict.parse(r.content) date_data = datetime.datetime.strptime(main['odds']['date'], "%d.%m.%Y %H:%M").isoformat() date_retrieved = datetime.datetime.fromtimestamp(int(main['odds']['timestamp'])/1000).isoformat() for competition in main['odds']['competition']: for event in competition['match']['event']: try: for odd in event['odd']: data[event['@name'] + ": " + odd['@fullName']] = float(odd['@rate']) except: nothing = True headers = ws.get('A1:ZZ1')[0] column_pointer = len(headers) row_pointer = ws.row_count for k in data: if k not in headers: headers.append(k) h2c = {} i = 0 for h in headers: h2c[h] = i i += 1 row = [None] * i for k in data: row[h2c[k]] = data[k] row[0] = date_data row[1] = date_retrieved ws.append_row(row) ws.update('A1', [headers]) time.sleep(1.5) # subprocess.run(["git", "checkout", 'master']) # len(main['odds']['competition']) # len(competition['match']) # competition['match'].keys() # event.keys() # event['@name'] # odd['@fullName']<file_sep>"""Test connection""" import datetime import os import pandas as pd import requests import sys url_root = "https://m.tipsport.cz/" path = "2022/" # authentization # the first part is local, the other takes the values from Github secrets try: sys.path.append('2022') import v3.secret as secret os.environ['TIPSPORT_USER'] = secret.TIPSPORT_USER os.environ['TIPSPORT_PASSWORD'] = secret.TIPSPORT_PASSWORD os.environ['TIPSPORT_PRODUCTID'] = secret.TIPSPORT_PRODUCTID except: pass headers = {'Content-Type': 'application/x-www-form-urlencoded'} credentials = { 'username': os.environ['TIPSPORT_USER'], 'password': <PASSWORD>['<PASSWORD>'], 'productId': 
os.environ['TIPSPORT_PRODUCTID'] } r = requests.post(url_root + 'rest/common/v1/session', data=credentials, headers=headers) auth = r.json() cookies = r.cookies token = auth['sessionToken'] headers = {'Authorization': "Bearer {}".format(token)} # 'společenské sázky' - get matches r1 = requests.get(url_root + 'rest/external/offer/v1/sports', headers=headers, cookies=cookies) data = r1.json() matches = [] for category in data['data']['children']: for supersport in category['children']: if supersport['title'] == 'Společenské sázky': for sport in supersport['children']: for match in sport['children']: item = { 'sport_title': sport['title'], 'match_title': match['title'], 'match_id': match['id'] } matches.append(item) # get races races = [] for match in matches: r2 = requests.get(url_root + 'rest/external/offer/v2/competitions/{}/matches'.format(match['match_id']), headers=headers, cookies=cookies) data2 = r2.json() for race in data2['matches']: item = match.copy() item['race_name'] = race['name'] item['race_id'] = race['id'] races.append(item) # get details events = [] now = datetime.datetime.now().isoformat() for race in races: r3 = requests.get(url_root + 'rest/offer/v2/matches/{}?withOdds=true'.format(race['race_id']), headers=headers, cookies=cookies) # print(race['race_name']) data3 = r3.json() datetimeClosed = data3['match']['datetimeClosed'] for table in data3['match']['eventTables']: for box in table['boxes']: item = race.copy() item['match_close'] = datetimeClosed item['event_name'] = table['name'] item['box_id'] = box['id'] item['box_name'] = box['name'] item['cells'] = [] for cell in box['cells']: it = { 'id': cell['id'], 'name': cell['name'], 'odd': cell['odd'], 'date': now } item['cells'].append(it) events.append(item) # prepare tables try: meta = pd.read_csv(path + 'meta.csv') except: meta = pd.DataFrame() for event in events: e = event.copy() del e['cells'] if (len(meta.index) == 0) or (meta.loc[meta['box_id'] == e['box_id']].empty): meta = 
meta.append(e, ignore_index=True) try: table = pd.read_csv(path + 'data/' + str(event['box_id']) + '.csv') except: table = pd.DataFrame() for cell in event['cells']: table = table.append(cell, ignore_index=True) table['id'] = table['id'].astype(int) table.to_csv(path + 'data/' + str(event['box_id']) + '.csv', index=False) ids = ['match_id', 'race_id'] for i in ids: meta[i] = meta[i].astype(int) meta.to_csv(path + 'meta.csv', index=False) meta.to_csv('meta.csv') <file_sep># tipsport.cz Scrapers for tipsport.cz Use: - https://mandaty.cz - https://www.seznamzpravy.cz/clanek/fakta-sazky-na-prezidenta-ku-pocitame-sance-kdo-vyhraje-volby-2023-201469 <file_sep>"""Get historical rates from git.""" import datetime import gspread import subprocess import time import xmltodict sheetkey = "<KEY>" gc = gspread.service_account() sh = gc.open_by_key(sheetkey) ws = sh.worksheet('kurzy') proc = subprocess.run(["git", "log", "--oneline"], stdout=subprocess.PIPE) output = subprocess.check_output(["git", "log", "--oneline"], text=True) raw_rows = output.split('\n') hashes = [] for row in raw_rows: it = row.split(' ') if (len(it) > 1) and (it[1] == 'Latest'): hashes.append(it[0]) hashes.reverse() kk = 1 for hash in hashes: subprocess.run(["git", "checkout", hash]) with open("../social.xml") as fin: xmlb = fin.read() xml = (bytes(xmlb, encoding='utf8')) data = {} main = xmltodict.parse(xml) date_data = datetime.datetime.strptime(main['odds']['date'], "%d.%m.%Y %H:%M").isoformat() date_retrieved = datetime.datetime.fromtimestamp(int(main['odds']['timestamp'])/1000).isoformat() for competition in main['odds']['competition']: for event in competition['match']['event']: try: for odd in event['odd']: data[event['@name'] + ": " + odd['@fullName']] = float(odd['@rate']) except: nothing = True headers = ws.get('A1:ZZ1')[0] column_pointer = len(headers) row_pointer = ws.row_count for k in data: if k not in headers: headers.append(k) h2c = {} i = 0 for h in headers: h2c[h] = i i += 1 row = [None] 
* i for k in data: row[h2c[k]] = data[k] row[0] = date_data row[1] = date_retrieved ws.append_row(row) ws.update('A1', [headers]) time.sleep(1.5) kk += 1 # if kk == 3: # break subprocess.run(["git", "checkout", 'master']) # len(main['odds']['competition']) # len(competition['match']) # competition['match'].keys() # event.keys() # event['@name'] # odd['@fullName']<file_sep># scrapes odds from tipsport.cz and updates github datapackages import csv import datapackage #v0.8.3 import datetime import json import git import os import requests import tipsport_cz_scraper_utils as utils import settings data_path = "data/" # from this script to data # repo settings repo = git.Repo(settings.git_dir) git_ssh_identity_file = settings.ssh_file o = repo.remotes.origin git_ssh_cmd = 'ssh -i %s' % git_ssh_identity_file total_groups = 0 for fdir in settings.tipsport_dirs: dat = utils.scrape_races(fdir) date = datetime.datetime.utcnow().isoformat() for row in dat: data = utils.scrape_race(row) # load or create datapackage try: # load datapackage datapackage_url = settings.project_url + data_path + row['matchId'] +"/datapackage.json" dp = datapackage.DataPackage(datapackage_url) except: # create datapackage dp = datapackage.DataPackage() urldp = settings.project_url + "datapackage_prepared.json" rdp = requests.get(urldp) prepared = rdp.json() dp.descriptor['identifier'] = row['matchId'] dp.descriptor['name'] = "tisport_cz_" + row['matchId'] dp.descriptor['title'] = row['title'] + " - odds from tipsport.cz" dp.descriptor['description'] = "Scraped odds from tipsport.cz for: " + row['title'] for k in prepared: dp.descriptor[k] = prepared[k] if not os.path.exists(settings.git_dir + data_path + row['matchId']): os.makedirs(settings.git_dir + data_path + row['matchId']) with open(settings.git_dir + data_path + row['matchId'] +'/datapackage.json', 'w') as fout: fout.write(dp.to_json()) repo.git.add(settings.git_dir + data_path + row['matchId'] +'/datapackage.json') with open(settings.git_dir 
+ data_path + row['matchId'] +'/odds.csv',"w") as fout: header = [] for resource in dp.resources: if resource.descriptor['name'] == 'odds': for field in resource.descriptor['schema']['fields']: header.append(field['name']) dw = csv.DictWriter(fout,header) dw.writeheader() repo.git.add(settings.git_dir + data_path + row['matchId'] +'/odds.csv') with open(settings.git_dir + data_path + row['matchId'] +'/odds.csv',"a") as fout: header = [] attributes = ['date','title','odds','date_bet','identifier'] for resource in dp.resources: if resource.descriptor['name'] == 'odds': for field in resource.descriptor['schema']['fields']: header.append(field['name']) dw = csv.DictWriter(fout,header) for ro in data: if ro['identifier'] == row['matchId']: for bet in ro['rows']: item = { 'date': date, 'title': bet['title'], 'odds': bet['odds'], 'date_bet': bet['date_bet'], 'identifier': bet['identifier'], } dw.writerow(item) repo.git.add(settings.git_dir + data_path + row['matchId'] +'/odds.csv') total_groups += len(dat) with repo.git.custom_environment(GIT_COMMITTER_NAME=settings.bot_name, GIT_COMMITTER_EMAIL=settings.bot_email): repo.git.commit(message="happily updating data %s groups of bets" % (str(total_groups)), author="%s <%s>" % (settings.bot_name, settings.bot_email)) with repo.git.custom_environment(GIT_SSH_COMMAND=git_ssh_cmd): o.push() <file_sep>"""Create daily averages.""" import numpy as np import pandas as pd path = "2022/" meta = pd.read_csv(path + 'meta.csv') for i, row in meta.iterrows(): df = pd.read_csv(path + 'data/' + row['box_id'] + '.csv') df['date'] = pd.to_datetime(df['date']).dt.date pt = pd.pivot_table(df, index='date', values='odd', columns=['name'], aggfunc=np.average).reset_index() pt.to_csv(path + 'daily/' + row['box_id'] + '.csv', index=False)<file_sep>from lxml import html, etree import re import requests import settings def scrape_race(match): url = settings.tipsport_url + settings.tipsport_endpoint r = 
requests.post(url,data={"matchId":match['matchId']}) data = [] if r.status_code == 200: group = {'identifier': match['matchId']} domtree = html.fromstring(r.text) trs = domtree.xpath('//tbody/tr') date_bet = trs[1].xpath('td/span')[0].text.strip('()') rows = [] for trn in range(2,len(trs)): tr = trs[trn] item = {} try: item['title'] = tr.xpath('td')[1].text.strip() item['identifier'] = tr.xpath('td/span')[0].text.strip() item['odds'] = tr.xpath('td/a')[0].text.strip() item['date_bet'] = date_bet rows.append(item) except: nothing = None group['rows'] = rows data.append(group) return data def scrape_races(fdir): url = settings.tipsport_url2 + fdir r = requests.get(url) data = [] if r.status_code == 200: domtree = html.fromstring(r.text) table = domtree.xpath('//table[@class="tblOdds"]')[0] trs = table.xpath('tr[@id="oddViewMain0"]') for tr in trs: item = {} item['title'] = tr.xpath('td/a/span')[0].text.strip().strip(':') item['matchId'] = tr.xpath('td/a/span/@data-m')[0] data.append(item) return data if __name__ == "__main__": # test: dat = scrape_races('spolecenske-sazky-25') for row in dat: data = scrape_race(row) print(data) <file_sep># settings for scrapers tipsport_url = "https://www.tipsport.cz/" tipsport_url2 = "https://www.tipsport.cz/kurzy/" tipsport_endpoint = "EventsMatchExtendedOddsAjaxAction.do" tipsport_dirs = ['spolecenske-sazky-25'] # settings for github project_url = "https://raw.githubusercontent.com/michalskop/tipsport.cz/master/" # settings for MakaBot git_dir = "/home/user/project/tipsport.cz/" ssh_file = "/home/user/.ssh/ExampleBot" bot_name = "ExampleBot" bot_email = "<EMAIL>" <file_sep>"""Download current odds, v3.""" import datetime import json import os import pandas as pd import requests url_root_p = "https://partners.tipsport.cz/" path = "v3/" # authentization # the first part is local, the other takes the values from Github secrets try: # sys.path.append('2022') import v3.secret as secret os.environ['TIPSPORT_USER'] = secret.TIPSPORT_USER 
os.environ['TIPSPORT_PASSWORD'] = secret.TIPSPORT_PASSWORD os.environ['TIPSPORT_PRODUCTID'] = secret.TIPSPORT_PRODUCTID os.environ['PROXY_SERVERS'] = secret.PROXY_SERVERS except: pass # proxy proxy_servers = { 'https': os.environ.get('PROXY_SERVERS') } # authentization headers = {'Content-Type': 'application/x-www-form-urlencoded'} credentials = { 'username': os.getenv('TIPSPORT_USER'), 'password': os.getenv('<PASSWORD>'), 'productId': os.getenv('TIPSPORT_PRODUCTID') } headers = {'Content-Type': 'application/json'} for i in range(10): r = requests.post(url_root_p + 'rest/external/common/v2/session', data=json.dumps(credentials), headers=headers, proxies=proxy_servers) cookies = r.cookies if r.status_code == 200: break else: r2 = requests.get(url_root_p + 'rest/external/offer/v1/matches', headers=headers, cookies=cookies, proxies=proxy_servers) print(r.status_code, r2.status_code) if r.status_code != 200: raise Exception('Could not authenticate, status code {}.'.format(r.status_code)) auth = r.json() cookies = r.cookies token = auth['sessionToken'] headers = {'Authorization': "Bearer {}".format(token)} # 'společenské sázky' - get matches matches = [] r2 = requests.get(url_root_p + 'rest/external/offer/v1/matches', headers=headers, cookies=cookies, proxies=proxy_servers) data2 = r2.json() # json.dump(data2, open(path + 'data2test.json', 'w')) for match in data2['matches']: if match['nameSuperSport'] == 'Společenské sázky': matches.append(match) now = datetime.datetime.now() # json.dump(matches, open(path + 'matchestest.json', 'w')) "105903" matches3 = [] for match in matches: params = { 'idCompetition': match['idCompetition'], 'allEvents': 'True', } r3 = requests.get(url_root_p + 'rest/external/offer/v1/matches/' + str(match['id']), params=params, headers=headers, cookies=cookies, proxies=proxy_servers) data3 = r3.json() try: matches3.append(data3['match']) except: pass # json.dump(matches3, open(path + 'matches3test.json', 'w')) # 'společenské sázky' - get odds, 
read / write try: meta = pd.read_csv(path + 'meta.csv') except: meta = pd.DataFrame() for match in matches3: # break match_id = match['id'] try: table = pd.read_csv(path + 'data/' + str(match_id) + '.csv') except: table = pd.DataFrame() for et in match['eventTables']: # odds for box in et['boxes']: for cell in box['cells']: row = { 'date': now, 'id': cell['id'], 'name': cell['name'], 'odd': cell['odd'], 'supername': box['name'], 'hypername': et['name'], } table = pd.concat([table, pd.DataFrame([row])]) table = table.drop_duplicates(subset=['id', 'date', 'name']) if len(table) > 0: table.to_csv(path + 'data/' + str(match_id) + '.csv', index=False) # meta meta_row = { 'date': now, 'match_id': match_id, 'match_name': match['name'], 'match_url': match['matchUrl'], 'competition_id': match['idCompetition'], 'competition_name': match['nameCompetition'], 'sport_id': match['idSport'], 'sport_name': match['nameSport'], 'date_closed': datetime.datetime.fromtimestamp(match['dateClosed'] / 1000).isoformat(), 'event_id': et['id'], 'event_name': et['name'], } meta = pd.concat([meta, pd.DataFrame([meta_row])]) meta = meta.drop_duplicates(subset=['match_id', 'event_id', 'date']) meta.to_csv(path + 'meta.csv', index=False) <file_sep># Betting odds from Tipsport.cz Scraper and data of betting odds from tipsport.cz with direct update to Github. Restriction: Works for single bets only (i.e. - one line, one bet) now. Data are structured in tabular datapackage format (http://frictionlessdata.io/guides/tabular-data-package/). Odds for bets on **politics** are updated daily and accessible in this project. 
### Example - the data of odds for Czech presidential candidates 2017 http://data.okfn.org/tools/view?url=https%3A%2F%2Fraw.githubusercontent.com%2Fmichalskop%2Ftipsport.cz%2Fmaster%2Fdata%2F2518628%2Fdatapackage.json ### Custom installation Requirements: - Python 3 - Python packages: csv, datetime, datapackage (>=v0.8), git, lxml, re, requests, os Copy the example settings into settings.py and adjust it for your Github account (e.g., your bot's account) cp settings-example.py settings.py Note: The origin for the local git project must be an 'ssh' address (not an 'https' one) for the bot to work. ### Automation You can automate the data retrieval using cron. Example: 14 3 * * * /usr/bin/python3 /home/project/tipsport.cz/scraper.py > /dev/null 2>&1 <file_sep>"""Create daily averages.""" import numpy as np import pandas as pd path = "2022/" meta = pd.read_csv(path + 'meta.csv') for i, row in meta.iterrows(): # box_id is read back as an integer, so cast it before building the path df = pd.read_csv(path + 'data/' + str(row['box_id']) + '.csv') df['date'] = pd.to_datetime(df['date']).dt.date pt = pd.pivot_table(df, index='date', values='odd', columns=['name'], aggfunc=np.average).reset_index() pt.to_csv(path + 'daily/' + str(row['box_id']) + '.csv', index=False)
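The daily-averaging step above collapses many intra-day odds snapshots into one row per date. A minimal self-contained sketch of that pivot on toy data (the `date`/`name`/`odd` column names mirror the scraper's output; the values are made up):

```python
import numpy as np
import pandas as pd

# Toy odds snapshots: two readings on the same day for one outcome.
df = pd.DataFrame({
    "date": ["2022-01-01 09:00", "2022-01-01 21:00", "2022-01-02 10:00"],
    "name": ["Candidate A", "Candidate A", "Candidate A"],
    "odd": [2.0, 3.0, 2.5],
})
# Truncate timestamps to calendar dates, as daily.py does.
df["date"] = pd.to_datetime(df["date"]).dt.date

# One row per date, one column per outcome; duplicate readings are averaged.
pt = pd.pivot_table(df, index="date", values="odd", columns=["name"],
                    aggfunc=np.average).reset_index()
print(pt)  # 2022-01-01 averages 2.0 and 3.0 into 2.5
```

The same call works unchanged with many outcomes: each distinct `name` simply becomes its own column.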
a1f18db1f82af852eb4f75f2108275541d744b1d
[ "Markdown", "Python" ]
11
Python
michalskop/tipsport.cz
6a0f7b92085573e07a127a876ed41eaac055793b
b40d5962f92565e8cf32d835ab23d463ded0fc7a
refs/heads/master
<file_sep># Player finder This is a shadowplay-friendly player finder for SAMP 0.3.7 that you can use to find other players while playing & recording without it showing up on your recordings. This only works on players that are in the same interior as you. On the SAMP side, a Lua script is used to read information such as real-time player coordinates, name, skin, current vehicle, gun, etc. Sockets are used to forward this information to a Java application. The Java application listens for updates at localhost:5230 and displays the received data on a map. To display the map over SAMP, an AutoHotkey script is used, which must be run with admin privileges. This solution is pretty hacky but I'm not rewriting this. You, however, are more than welcome to. ![img](https://i.imgur.com/AOXesbX.jpg) # Requirements * [JDK 11](https://jdk.java.net/archive/) * [AutoHotkey](https://www.autohotkey.com/) - if you wish to compile the AHK script yourself. * [MoonLoader](https://gtaforums.com/topic/890987-moonloader/) - MoonLoader requires an ASI loader; if you don't already have CLEO, install it. * [SAMPFUNCS](https://blast.hk/threads/17/page-138#post-279414) * [LuaSocket](https://blast.hk/threads/16031/) * [SAMP.Lua](https://github.com/THE-FYP/SAMP.Lua) # Installation 1) Install [CLEO](https://cleo.li/) and [MoonLoader](https://gtaforums.com/topic/890987-moonloader/) if you haven't already. 2) Download and unzip [THIS ARCHIVE](https://drive.google.com/file/d/1zBZlq9cK74Osc22R1kFN_iBukQSnncWU/view?usp=sharing). It contains SAMPFUNCS, SAMP.Lua, LuaSocket, playerfinder.lua, the Java app to display a second map over your game, and a compiled AHK file to give the Java app priority over SAMP. 3) Drag the contents of the archive's GTA San Andreas folder into your GTA San Andreas folder. 4) Run playerfinder.jar. An empty minimap should appear. 5) Run PrioritizeWindow.exe. It requires admin privileges to display the map over SAMP. 6) Run SAMP and you should see the minimap appear over your game.
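Before launching the game, you can sanity-check the wire format by hand. The sketch below is not part of the mod: it stands up a throwaway listener in place of the Java app and pushes two example lines in the newline-terminated, semicolon-separated format that playerfinder.lua sends (one field for a status message, seven fields `x;y;name;distance;skin;health;extra` for a full update). Port 5230 matches the default above; the names and coordinates are made up:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 5230  # port the Java app listens on

received = []

def fake_map_listener(server):
    # Stand-in for the Java app's socket server: accept one client,
    # read newline-terminated lines, and split on ';' like parseLine does.
    conn, _ = server.accept()
    with conn:
        for line in conn.makefile("r"):
            received.append(line.rstrip("\n").split(";"))
            if len(received) == 2:
                break

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
server.bind((HOST, PORT))
server.listen(1)
t = threading.Thread(target=fake_map_listener, args=(server,))
t.start()

with socket.create_connection((HOST, PORT)) as s:
    # One field -> the Java side only updates the name/status label.
    s.sendall(b"Hold NUMPAD+ to begin\n")
    # Seven fields -> full update: x;y;name;distance;skin;health;extra.
    s.sendall(b"1544.0;-1353.0;Test_Player;120m NORTH;74;HP: 100, Armor: 0;Deagle\n")

t.join()
server.close()
print(received)
```

With the real playerfinder.jar running instead of the fake listener, the same `sendall` calls against localhost:5230 should update the marker and labels on the map window.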
If you would like to build the Java application manually, you will need to download the [skin & map images](https://www.upload.ee/files/11121136/gui.rar.html) and place all 313 of them into `src/main/resources` as follows: ![img](https://i.imgur.com/TAF8HRe.png "Structure") Running `gradlew jar` in the Java project root will then build the jar. Sources for everything else, including the Lua script and the AHK script used to display the map over SAMP, are present in this repo. # Usage * While in game with the Java application running, hold NUMPAD+ and type the ID of a player you would like to find. If they are connected and in the same interior as you, you should now begin to receive real-time location updates. * When they are out of range, you will only receive basic information (coordinates, name). * When they are close to you, you will also receive additional information such as their health/armor, current car/weapon, and skin, plus more frequent location updates. * To stop finding people, just press NUMPAD+ again. * To close the Java application, make it active by clicking on it and then hit ESC. # Credits * [FYP](https://github.com/THE-FYP). All of this would have been way harder without the SAMP hacking tools he's graciously published over the past decade.
* [this jlink project i used as a starting point.](https://bitbucket.org/FlPe/javafx_jlink_example/src/master/) <file_sep>package server; import model.GameInformation; import service.GuiUpdaterService; import java.io.IOException; import java.net.Socket; import java.util.Scanner; public class SocketConnection implements Runnable { private Socket socket; private GuiUpdaterService updaterService; public SocketConnection(Socket socket, GuiUpdaterService updaterService) { this.socket = socket; this.updaterService = updaterService; } @Override public void run() { System.out.println("Connected: " + socket); try { System.out.println("scanning"); var in = new Scanner(socket.getInputStream()); while (in.hasNextLine()) { String line = in.nextLine(); System.out.println("received line: " + line); parseLine(line); } } catch (Exception e) { System.out.println("Error:" + socket); } finally { closeConnection(); } } public void closeConnection() { try { socket.close(); } catch (IOException e) { System.out.println("Unable to close connection"); } System.out.println("Closed: " + socket); } private void parseLine(String line) { String[] splitLine = line.split(";"); if (splitLine.length == 1) { updaterService.updateName(new GameInformation(splitLine[0])); } else if (splitLine.length == 7) { try { updaterService.updateValues(createGameInformation(splitLine)); } catch (Exception e) { System.out.println("Error parsing line"); } } } private GameInformation createGameInformation(String[] line) { return new GameInformation(Double.parseDouble(line[0]), Double.parseDouble(line[1]), line[2], line[3], line[4], line[5], line[6]); } } <file_sep>rootProject.name = 'playerfinder'<file_sep>package gui; import javafx.geometry.Insets; import javafx.scene.Group; import javafx.scene.Scene; import javafx.scene.control.Label; import javafx.scene.image.Image; import javafx.scene.image.ImageView; import javafx.scene.layout.*; import javafx.scene.paint.Color; import javafx.scene.shape.Rectangle; import 
javafx.scene.text.Font; import javafx.stage.Stage; import javafx.stage.StageStyle; import java.util.Objects; public class GuiElementHolder { private static final String DEFAULT_SKIN = "74"; private static final String MAP = "map"; private static final String RESOURCE_ROOT = "gui/"; private static final String PNG = ".png"; private static final String SKIN_PREFIX = "Skin_"; private static final String TITLE = "Playerfinder"; private Label name = new Label(); private Label distance = new Label(); private Label hp = new Label(); private Label weapon = new Label(); private ImageView skin = new ImageView(); private ImageView map = new ImageView(); private BorderPane borderPane = new BorderPane(); private VBox text = new VBox(); private StackPane sp = new StackPane(); private Rectangle marker; private String skinModel = ""; public GuiElementHolder(Stage primaryStage, Group root) { setUpLabels(); initMap(); setUpPanes(); setUpScene(primaryStage, root); } private Image loadImageByName(String name) { String path = RESOURCE_ROOT + name + PNG; return new Image(Objects.requireNonNull(getClass().getClassLoader().getResourceAsStream(path))); } private void setUpScene(Stage primaryStage, Group root) { Scene scene = new Scene(root); root.getChildren().add(sp); primaryStage.setResizable(false); primaryStage.setHeight(325.0); primaryStage.setWidth(300.0); primaryStage.setScene(scene); primaryStage.setTitle(TITLE); primaryStage.initStyle(StageStyle.UNDECORATED); primaryStage.setAlwaysOnTop(true); } private void initMap() { updateSkin(DEFAULT_SKIN); setImage(map, MAP); initMarker(); sp.setMaxSize(300, 300); map.setScaleX(3); map.setScaleY(3); map.setTranslateX(-350); map.setTranslateY(-350); } private void initMarker() { marker = new Rectangle(145, 145, 10, 10); marker.setStroke(Color.BLACK); marker.setFill(Color.RED); } private void setUpLabels() { setUpLabel(name); setUpLabel(distance); setUpLabel(hp); setUpLabel(weapon); } private void setUpLabel(Label label) { 
label.setFont(Font.font(20)); label.setStyle("-fx-font-weight: bold"); label.setBackground(new Background(new BackgroundFill(Color.BLACK, CornerRadii.EMPTY, Insets.EMPTY))); label.setTextFill(Color.WHITE); } private void setUpPanes() { HBox hBox = new HBox(); text.getChildren().addAll(name, distance, hp, weapon, skin); hBox.getChildren().addAll(text); borderPane.setTop(hBox); borderPane.getChildren().add(marker); sp.getChildren().addAll(map, borderPane); } public String getDistanceText() { return this.distance.getText(); } public void setDistanceText(String distance) { this.distance.setText(distance); } public void setHpText(String hp) { this.hp.setText(hp); } public void setWeaponText(String weapon) { this.weapon.setText(weapon); } public void setNameText(String text) { this.name.setText(text); } public void updateMarkerPosition(Double x, Double y) { map.setTranslateX(x); map.setTranslateY(y); } public void updateSkin(String id) { if (id != null && !id.equals(skinModel) && id.chars().allMatch(Character::isDigit)) { skinModel = id; setImage(skin, SKIN_PREFIX + id); } } private void setImage(ImageView imageView, String img) { imageView.setImage(loadImageByName(img)); } public void resetValues() { setHpText(""); setDistanceText(""); setWeaponText(""); updateSkin(DEFAULT_SKIN); updateMarkerPosition(-350.0, -350.0); } } <file_sep>require "lib.moonloader" require "lib.sampfuncs" local host, port = "127.0.0.1", 5230 local socket = require("socket") local weapons = require 'game.weapons' local HOLD_NUMPAD = "Hold NUMPAD+ to begin" function main() while not isSampAvailable() do wait(0) end local connected = false local id = nil local previous = "" while true do wait(100) if not connected then tcp = connect() connected = true else if isKeyDown(107) then id, connected = wait_for_id(tcp, previous) end end previous, connected = handle_id(id, tcp, previous) end end function handle_id(id, tcp, previous) if id ~= nil then if sampIsPlayerConnected(id) then return find_player(id, 
tcp, previous) else return send_message("Player " .. id .. " not connected", tcp, previous) end else return send_message(HOLD_NUMPAD, tcp, previous) end end function find_player(id, tcp, previous) local x1, y1, z1 = getCharCoordinates(PLAYER_PED) local streamed, ped = sampGetCharHandleBySampPlayerId(id) local name = sampGetPlayerNickname(id) if not streamed then return send_streamed_out_info(id, x1, y1, z1, name, tcp, previous) end return send_streamed_in_info(ped, id, x1, y1, z1, name, tcp, previous) end function send_streamed_out_info(id, x1, y1, z1, name, tcp, previous) local x, y, success, distance = get_streamed_out_info(id, x1, y1, z1) if not success then return send_message("Unable to locate player", tcp, previous) end return send_message(x .. ";" .. y .. ";" .. name .. ";" .. distance .. "; ; ; ", tcp, previous) end function send_streamed_in_info(ped, id, x1, y1, z1, name, tcp, previous) local x, y, distance, skin, health, additional_info = get_streamed_info(ped, id, x1, y1, z1) return send_message(x .. ";" .. y .. "; " .. name .. ";" .. distance .. ";" .. skin .. ";" .. health .. ";" .. additional_info, tcp, previous) end function get_streamed_out_info(i, x1, y1, z1) local success, x, y, z = sampGetStreamedOutPlayerPos(i) if success then distance = math.floor(getDistanceBetweenCoords3d(x1, y1, z1, x, y, z)) direction = get_direction(x, y, z, x1, y1, z1) return x, y, success, distance .. "m " .. " " .. direction end return nil, nil, success, nil end function get_streamed_info(ped, id, x1, y1, z1) local x, y, z = getCharCoordinates(ped) local distance = math.floor(getDistanceBetweenCoords3d(x1, y1, z1, x, y, z)) local direction = get_direction(x, y, z, x1, y1 ,z1) local skin = getCharModel(ped) local health = sampGetPlayerHealth(id) local armor = sampGetPlayerArmor(id) local additional_info = get_additional_info(ped) return x, y, distance .. "m " .. direction, skin, "HP: " .. health .. ", Armor: " .. 
armor, additional_info end function get_additional_info(ped) if isCharInAnyCar(ped) then local model = getCarModel(getCarCharIsUsing(ped)) return getNameOfVehicleModel(model) else local weapon, _, _ = getCurrentCharWeapon(ped) return weapons.get_name(weapon) end end function connect() local connected = false local tcp = assert(socket.tcp()) while not connected do tcp:connect(host, port) tcp:settimeout(0) wait(1000) _, connected = send_message(HOLD_NUMPAD, tcp) end return tcp end function get_direction(x, y, z, x1, y1, z1) local long = "" local lat = "" if exceeds_long_threshold(x, y, x1, y1) then if y > y1 then long = "NORTH " else long = "SOUTH " end end if exceeds_lat_threshold(x, y, x1, y1) then if x > x1 then lat = "EAST" else lat = "WEST" end end return long .. lat end function exceeds_long_threshold(x, y, x1, y1) return 3 * math.sqrt(math.pow((y - y1), 2)) >= math.sqrt(math.pow((x - x1), 2)) end function exceeds_lat_threshold(x, y, x1, y1) return math.sqrt(math.pow((y - y1), 2)) <= 3 * math.sqrt(math.pow((x - x1), 2)) end function send_message(message, tcp, previous) if message ~= previous then return send_raw(message, tcp) end return message, true end function send_raw(message, tcp) local index, _ = tcp:send(message .. "\n") return message, index == string.len(message) + 1 end function wait_for_id(tcp, previous) local input, connected = send_message("ID: ", tcp, previous) while isKeyDown(107) and connected do wait(10) if wasKeyPressed(96) or wasKeyPressed(48) then input, connected = send_message(input .. "0", tcp) elseif wasKeyPressed(97) or wasKeyPressed(49) then input, connected = send_message(input .. "1", tcp) elseif wasKeyPressed(98) or wasKeyPressed(50) then input, connected = send_message(input .. "2", tcp) elseif wasKeyPressed(99) or wasKeyPressed(51) then input, connected = send_message(input .. "3", tcp) elseif wasKeyPressed(100) or wasKeyPressed(52) then input, connected = send_message(input ..
"4", tcp) elseif wasKeyPressed(101) or wasKeyPressed(53) then input, connected = send_message(input .. "5", tcp) elseif wasKeyPressed(102) or wasKeyPressed(54) then input, connected = send_message(input .. "6", tcp) elseif wasKeyPressed(103) or wasKeyPressed(55) then input, connected = send_message(input .. "7", tcp) elseif wasKeyPressed(104) or wasKeyPressed(56) then input, connected = send_message(input .. "8", tcp) elseif wasKeyPressed(105) or wasKeyPressed(57) then input, connected = send_message(input .. "9", tcp) elseif wasKeyPressed(8) then if string.len(input) > 4 then input, connected = send_message(input:sub(1, -2), tcp, previous) end end end if string.len(input) > 4 then return tonumber(input:sub(5)), connected end return nil, connected end <file_sep>plugins { id 'application' id 'org.openjfx.javafxplugin' version '0.0.5' } repositories { mavenCentral() } dependencies { } javafx { modules = ['javafx.controls'] } mainClassName = 'gui.Main' jar { manifest { attributes 'Main-Class': 'gui.Launcher' } from { configurations.runtime.collect { it.isDirectory() ? it : zipTree(it) } + configurations.compile.collect { it.isDirectory() ? it : zipTree(it) } } } sourceSets { main { java { srcDirs 'src/main/java' } resources { srcDirs 'src/main/resources' } } }
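The `get_direction` routine in the Lua script above uses a 3:1 threshold rule: the north-south word is emitted only while `3*|dy| >= |dx|`, and the east-west word only while `|dy| <= 3*|dx|`, so near-diagonal targets get both words and strongly axis-aligned targets get one. A standalone Java sketch of that same rule, outside the MoonLoader API (the class name `Direction` is mine, not part of the repo):

```java
public class Direction {
    // Mirrors exceeds_long_threshold / exceeds_lat_threshold from the Lua script:
    // a compass component is reported only if it is at least one third
    // of the perpendicular component.
    static String direction(double x, double y, double x1, double y1) {
        double dx = Math.abs(x - x1), dy = Math.abs(y - y1);
        String lng = "", lat = "";
        if (3 * dy >= dx) lng = (y > y1) ? "NORTH " : "SOUTH ";
        if (dy <= 3 * dx) lat = (x > x1) ? "EAST" : "WEST";
        return lng + lat;
    }

    public static void main(String[] args) {
        System.out.println(direction(10, 10, 0, 0)); // diagonal: both components
        System.out.println(direction(0, 10, 0, 0));  // straight north: no EAST/WEST part
    }
}
```

The threshold keeps the direction string stable while chasing a moving target: small sideways jitter does not flip the reported heading.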
b75b52e9aaf582683068347eb9ebd26cc170312a
[ "Markdown", "Java", "Lua", "Gradle" ]
6
Markdown
martin457/playerfinder
8eab8016650f105bc69cd86b2459db48b470153f
f9fa26ee08a980f84b5c9f3edd406297aa0b2087
refs/heads/master
<repo_name>manifoldfinance/metamask-links<file_sep>/src/helpers/openrpcDocumentToJSONRPCSchema.ts import { MethodObject, ContentDescriptorObject, OpenrpcDocument, ExampleObject, } from '@open-rpc/meta-schema'; const schema: any = { type: 'object', properties: { jsonrpc: { type: 'string', enum: ['2.0'], description: 'JSON-RPC version string', }, id: { description: 'unique identifier for the JSON-RPC request', oneOf: [ { type: 'string', }, { type: 'number', }, ], }, method: { type: 'string', }, }, }; const openrpcDocumentToJSONRPCSchema = (openrpcDocument: OpenrpcDocument) => { return { type: 'object', properties: { id: { ...schema.properties.id, }, jsonrpc: { ...schema.properties.jsonrpc, }, method: { type: 'string', oneOf: openrpcDocument?.methods?.map(_method => { const method = _method as MethodObject; return { const: method.name, description: method.description || method.summary, markdownDescription: method.description || method.summary, }; }), }, }, allOf: openrpcDocument?.methods?.map(_method => { const method = _method as MethodObject; return { if: { properties: { method: { const: method.name, }, }, }, then: { properties: { params: { oneOf: [ { type: 'array', minItems: method?.params?.filter( (param: any) => param.required, ).length, maxItems: method?.params?.length, defaultSnippets: method.examples ? 
method.examples.map((example: any) => { return { label: example.name, description: example.description || example.summary, body: example.params?.map( (ex: ExampleObject) => ex.value, ), }; }) : [], items: method.params?.map((param: any) => { return { ...param.schema, markdownDescription: param.description || param.summary, description: param.description || param.summary, additionalProperties: false, }; }), }, { type: 'object', properties: method.params && (method.params as ContentDescriptorObject[]).reduce( (memo: any, param: ContentDescriptorObject) => { if (typeof param.schema === 'object') { memo[param.name] = { ...param.schema, markdownDescription: param.description || param.summary, description: param.description || param.summary, additionalProperties: false, }; } else { memo[param.name] = param.schema; } return memo; }, {}, ), }, ], }, }, }, }; }), }; }; export default openrpcDocumentToJSONRPCSchema; <file_sep>/gatsby-config.js module.exports = { pathPrefix: '/metamask-link', siteMetadata: { title: 'MetaMask Link', description: 'This tool lets you link to metamask actions.', logoUrl: 'https://raw.githubusercontent.com/MetaMask/brand-resources/master/SVG/metamask-fox.svg', author: '', }, plugins: [ 'gatsby-theme-material-ui', { resolve: 'gatsby-plugin-manifest', options: { name: 'metamask-link', short_name: 'metamask-link', start_url: '/', background_color: 'transparent', theme_color: '#3f51b5', display: 'minimal-ui', icon: 'src/images/metamask-fox.svg', // This path is relative to the root of the site. 
}, }, ], }; <file_sep>/setup.sh #!/bin/sh yarn install --pure-lockfile --ignore-scripts cd node_modules/sharp/ yarn yarn run install cd - yarn run build <file_sep>/src/hooks/useQueryString.ts /* eslint-disable consistent-return */ import { useState } from 'react'; import * as qs from 'qs'; const useQueryParams = (search: string, depth?: number) => { const parse = (): any => { return qs.parse(search, { ignoreQueryPrefix: true, depth: depth || 100, decoder(str: string) { if (/^([+-]?[0-9]\d*|0)$/u.test(str)) { return parseInt(str, 10); } if (str === 'false') { return false; } if (str === 'true') { return true; } return decodeURIComponent(str); }, }); }; const [query] = useState(parse()); return [query]; }; export default useQueryParams; <file_sep>/README.md # MetaMask Link Create deep links for MetaMask confirmations - including adding custom networks, tokens, payment requests, and more. It uses the `window.ethereum` provider under the hood.
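The `useQueryString` hook above coerces query-string values back into their original types with a custom `qs` decoder, which is what lets numeric and boolean link parameters round-trip through a URL. The same coercion rule can be sketched as a standalone function (the helper name `decodeParam` is my own, not part of the repo):

```typescript
// Mirrors the decoder passed to qs.parse in src/hooks/useQueryString.ts:
// integer-looking strings become numbers, "true"/"false" become booleans,
// anything else is URI-decoded and returned as a string.
function decodeParam(str: string): string | number | boolean {
  if (/^([+-]?[0-9]\d*|0)$/u.test(str)) {
    return parseInt(str, 10);
  }
  if (str === 'false') {
    return false;
  }
  if (str === 'true') {
    return true;
  }
  return decodeURIComponent(str);
}

console.log(decodeParam('42'));        // 42 (number)
console.log(decodeParam('true'));      // true (boolean)
console.log(decodeParam('hello%20x')); // "hello x"
```

Without a decoder like this, every value parsed out of the deep link would arrive as a string and the confirmation payload handed to `window.ethereum` would be mistyped.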
fc730e9f534fe71fc8b8fa0feb31d0258cab9103
[ "JavaScript", "TypeScript", "Markdown", "Shell" ]
5
TypeScript
manifoldfinance/metamask-links
84bb0026f09fe1115007a584ea1ab128cc84d7a0
99b0f9e626045e76476bb21f352fea4ebf28820b
refs/heads/master
<file_sep>#include "shared.h" template <class Type> void myclass<Type>::setx(Type y) { x = y; } template <class Type> Type myclass<Type>::getx() { return x; } // Instantiate myclass for the supported template type parameters template class myclass<int>; template class myclass<long>; <file_sep>template <class Type> class myclass { Type x; public: myclass() { x=0; } void setx(Type y); Type getx(); }; <file_sep>CXX = g++ CXXFLAGS = -g -Wall -fPIC -Wno-deprecated -O3 CXXFLAGS += -std=c++14 all: libshared.so main shared.o: shared.cpp $(CXX) $(CXXFLAGS) -c $^ -o $@ libshared.so: shared.o $(CXX) -o $@ -shared $^ main: main.cpp $(CXX) -c $< -o main.o $(CXX) -o $@ main.o -L. -lshared .PHONY : printmakehelp_and_reminder printmakehelp_and_reminder: Makefile README $(info /**********************************************************************/) $(info * task --> printmakehelp_and_reminder: Makefile README *) $(info * $$@ ----> $@ *) $(info * $$< --------------------------------> $< *) $(info * $$^ --------------------------------> $^ *) $(info /**********************************************************************/) export_LD_LIBRARY_PATH: @echo export LD_LIBRARY_PATH=/home/burmist/home2/training/034_Cpp_Shared_Library_with_Templates/s02 print_LD_LIBRATY_PATH: @echo LD_LIBRARY_PATH = $(value LD_LIBRARY_PATH) .PHONY : clean clean: rm -f *~ rm -f .*~ rm -f *.o rm -f *.so rm -f main <file_sep>#include "shared.h" //template <class Type> void myclass<Type>::setx(Type y) { x = y; } //template <class Type> Type myclass<Type>::getx() { return x; } // Instantiate myclass for the supported template type parameters //template class myclass<int>; //template class myclass<long>; <file_sep>template <class Type> class myclass { Type x; public: myclass() { x=0; } void setx(Type y); Type getx(); }; template <class Type> void myclass<Type>::setx(Type y) { x = y; } template <class Type> Type myclass<Type>::getx() { return x; }
b4fb5caf6212704de81ee4f9fe14b4a2d8390e57
[ "Makefile", "C++" ]
5
C++
burmist-git/034_Cpp_Shared_Library_with_Templates
648b6234ee77c743cd9309ab60315a675422a5a1
e4313be87f2829af68c7d8420a90cf7eefe68ffa
refs/heads/master
<file_sep>#### 1. Array#transpose helps when handling arrays multi-dimensionally. - [spiral array problem](http://codingdojang.com/scode/266) A multi-dimensional array can be rotated. ```ruby cube = ->x,y { (0...y).map {|e| [*0...x].product [e] } } peel = ->cub { cub[0]? cub.shift + peel[cub.transpose.reverse] : [] } sprl = ->x,y { spr=peel[cube[x,y]]; cube[x,y].map {|_|_.map {|e|"%3d" % spr.index(e)}*''} } expect(sprl[6,6]).to eq [" 0 1 2 3 4 5", " 19 20 21 22 23 6", " 18 31 32 33 24 7", " 17 30 35 34 25 8", " 16 29 28 27 26 9", " 15 14 13 12 11 10"] ``` - [box area problem](http://codingdojang.com/scode/506) The rows and columns of a multi-dimensional array can be restructured into a single array. ```ruby gap = ->a { a.flat_map {|e| e.each_cons(2).map {|a,b|(b-a).abs} } } area = ->a { gap[(a+a.transpose).map{|_|[0,*_,0]}].sum + (a.size**2)*2 } expect( area[ [[1,4,3,4], [2,3,4,1], [3,4,2,1], [9,3,2,1]]] ).to eq 120 ``` - [Poker Hands problem](http://codingdojang.com/scode/423) Cards split easily into an array of ranks and an array of suits. ```ruby to_n = "23456789TJQKA".chars.zip(2..14).to_h hand = ->cards do ranks, suits = cards.map(&:chars).transpose pairs, nums = ranks.map(&to_n).group_by {|_|_}. map {|k,g| [g.size,k] }.sort.reverse.transpose kind = %w(11111 2111 221 311 32 41).zip([0,1,2,3,6,7]).to_h[pairs.join] strt = 4 if nums.each_cons(2).all? {|i,j| i == j.next } flsh = 5 if suits.uniq.one?; stfl = 8 if strt && flsh [[kind, strt, flsh, stfl].compact.max, nums] end play_poker = ->file,games=File.readlines(file).map(&:split) do prt_result = ->idx { puts %w(Tie. Black\ wins.
White\ wins.)[idx] } games.map {|cards| hand[cards[0,5]] <=> hand[cards[5,5]] }.each &prt_result end test_data = ["2H 3D 5S 9C KD 2C 3H 4S 8C AH\n", "2H 4S 4C 2D 4H 2S 8S AS QS 3S\n", "2H 3D 5S 9C KD 2C 3H 4S 8C KH\n", "2H 3D 5S 9C KD 2D 3H 5C 9S KH\n"] allow(File).to receive(:readlines).and_return(test_data) results = "White wins.\nBlack wins.\nBlack wins.\nTie.\n" expect{ play_poker["game_data.txt"] }.to output(results).to_stdout ``` <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#51 - Steps" do step = ->a,b,n=b-a { s=(n**0.5).to_i; 2*s+(n**0.5==s ? -1 : n>s*s+s ? 1:0) } steps = ->n=gets.to_i { puts (1..n).map {gets.chop.split.map &:to_i}.map &step } expect(step[45,48]).to eq 3 expect(step[45,49]).to eq 3 expect(step[45,50]).to eq 4 expect(step[0,2**31]).to eq 92681 #=> stdin/stdout test $stdin = StringIO.new("3\n45 48\n45 49\n45 50\n") expect { steps[] }.to output("3\n3\n4\n").to_stdout end it "#52 - Sliding Window. ver.simple" do def s_window(n, k, str) a = str.split(' ').map(&:to_i) x = (0..n-k).map {|i| a[i,k].minmax } [x.map(&:first).join(' '), x.map(&:last).join(' ')] end expect(s_window(8, 3, "1 3 -1 -3 5 3 6 7").first).to eq "-1 -3 -3 -3 3 3" expect(s_window(8, 3, "1 3 -1 -3 5 3 6 7").last).to eq "3 3 5 5 6 7" end it "#52 - Sliding Window" do # sort, find, index(obj) and delete(obj) are slow; replaced with sort_by and bsearch_index # likewise [0]/[-1] is faster than first/last, and &: is faster than a closure.
def s_window(n, k, str) arr,fr = str.split(' ').map.with_index{|v,i|[i,v.to_i]}, [] minmax = (0..n-k).map do |i| if i == 0 fr = arr[i,k].sort_by(&:last) else fr.insert((fr.bsearch_index{|_|_[1]>=arr[i+k-1][1]} || k), arr[i+k-1]) fr.delete_at((fr.bsearch_index{|_|_[0]>arr[i-1][0]} || k)-1) end [fr[0][1], fr[-1][1]] end [minmax.map(&:first).join(' '), minmax.map(&:last).join(' ')] end expect( s_window(8, 3, "1 3 -1 -3 5 3 6 7")[0]).to eq "-1 -3 -3 -3 3 3" expect( s_window(8, 3, "1 3 -1 -3 5 3 6 7")[1]).to eq "3 3 5 5 6 7" big_arr = (0..10**3).map {|i|rand(9)}.join(' ') # check_runtime = Benchmark.realtime { s_window(10**6, 30000, big_arr) } check_runtime = Benchmark.realtime { s_window(10**3, 300, big_arr) } expect( check_runtime ).to be_between(0.0, 15000) end it "#52 - Sliding Window. O(n)" do l_min = ->s { s.reduce([]) {|a,e| a << [a[-1]||e, e].min } } r_min = ->s { s.reverse_each.reduce([]) {|a,e| a.unshift([a[0]||e, e].min) } } l_max = ->s { s.reduce([]) {|a,e| a << [a[-1]||e, e].max } } r_max = ->s { s.reverse_each.reduce([]) {|a,e| a.unshift([a[0]||e, e].max) } } s_window = lambda do n, k, *arr = 2.times.flat_map { gets.split.map(&:to_i) } # for stdin case sl, sr, bl, br, mins, maxs = [[],[],[],[],[],[]] arr.each_slice(k) {|s| sl+=l_min[s]; sr+=r_min[s]; bl+=l_max[s]; br+=r_max[s] } (0..n-k).each {|i| mins << [sr[i],sl[i+k-1]].min; maxs << [br[i],bl[i+k-1]].max } puts mins*' ', maxs*' ' end $stdin = StringIO.new("8 3\n1 3 -1 -3 5 3 6 7\n") expect{ s_window[] }.to output("-1 -3 -3 -3 3 3\n" + "3 3 5 5 6 7\n").to_stdout # performance # arr = (0...1_000_000).map {|i| rand(1_000_000) } # Benchmark.bm(25) do |x| # x.report("case 3. n=10**6, k=5*10**5") { s_window[10*6, 5*10*5, arr] } # end # 축약버전 # 중복을 없애기 위한 .send(fn)동적 호출, concat대신 += (복사발생)을 사용해서 축약코드는 느림. 
l = ->s,fn { s.reduce([]) {|a,e| a << [a[-1]||e, e].send(fn) } } r = ->s,fn { s.reverse_each.reduce([]) {|a,e| a.unshift([a[0]||e, e].send(fn)) } } s_window = ->n,k,arr do sl, sr, bl, br, mins, maxs = [[],[],[],[],[],[]] arr.each_slice(k) {|s| sl+=l[s,:min]; sr+=r[s,:min]; bl+=l[s,:max]; br+=r[s,:max] } (0..n-k).each {|i| mins << [sr[i],sl[i+k-1]].min; maxs << [br[i],bl[i+k-1]].max } puts mins*' ', maxs*' ' end end it "#53 - Intervals" do pairs = proc { (1..gets.to_i).map { gets.split(/ /).map(&:to_i) }.sort } ni_pw = proc { pairs[].chunk_while{|(_,a),(b,_)|a>=b}.map {|e| e.flatten.minmax*' '} } allow(pairs).to receive(:[]) { [[1,4], [5,6], [6,9], [8,10], [10,10]] } expect( ni_pw.() ).to eq ["1 4", "5 10"] $stdin = StringIO.new("5\n1 4\n5 6\n6 9\n8 10\n10 10\n") # test data expect(ni_pw[]).to eq ["1 4", "5 10"] end it "#54 - 487-3279" do matches = "ABCDEFGHIJKLMNOPRSTUVWXY".gsub(/(...)/).zip("23456789".chars) p_num = ->str { str.delete("-").insert(3,"-").tap {|e| matches.each{|f,n| e.tr!(f,n)} } } dups = ->texts { texts.map(&p_num).group_by{|e|e}.reject {|_,v| v.size<2 } } check = ->texts { sum=dups[texts]; sum.size>0? sum.map {|k,v| [k,v.size]*' ' } : "No dups" } dups_set = %w[4873279 ITS-EASY 888-4567 3-10-10-10 888-GLOP TUT-GLOP 967-11-11 310-GINO F101010 888-1200 -4-8-7-3-2-7-9- 487-3279] no_dups_set = %w[123-4567 487-3279] expect(check[dups_set]).to eq ["487-3279 4", "888-4567 3", "310-1010 2"] expect(check[no_dups_set]).to eq "No dups" end it "#55 - The Knights Of The Round Table" do r = ->a,b,c,s=(a+b+c)/2 { ((s-a)*(s-b)*(s-c)/s)**0.5 } r_t = proc {puts "The radius of the round table is: %.3f" % r[*gets.split.map(&:to_i)] } expect(r[12,12,8]).to eq 2.8284271247461903 #=> for stdin/out test $stdin = StringIO.new("12 12 8\n") expect { r_t[] }.to output("The radius of the round table is: 2.828\n").to_stdout end it "#56 - Insertion Sort" do swap = ->a,s { a[s],a[s-1]=a[s-1],a[s]; a } trav = ->a,s { s==0? a : (a[s]<a[s-1]? 
trav[swap[a,s],s-1] : a) } sort = ->a,s=1 { a.size>s ? sort[trav[a,s],s+1] : a } expect(sort[[5,2,4,6,1,3]]).to eq [1,2,3,4,5,6] # depdent function expect(swap[[5,2,4,6,1,3],1]).to eq [2,5,4,6,1,3] expect(trav[[5,2,4,6,1,3],1]).to eq [2,5,4,6,1,3] expect(trav[[1,2,4,5,6,3],5]).to eq [1,2,3,4,5,6] end it "#57 - Sort Large File" do require 'bitset' sort_by_bit = ->rf,wf,size,bs=Bitset.new(size) do IO.foreach(rf) {|l| bs[l.to_i] = true } File.open(wf,"w") {|f| (0...size).each {|i| f.puts(i) if bs[i] } } end #=> usage : sort_by_bit["unordered.txt", "sorted.txt", size] unordered_nums = [*0...100].shuffle # shuffle randomly File.open("r.txt","w") {|f| unordered_nums.each {|n| f.puts(n) } } sort_by_bit["r.txt", "w.txt",100] generated_file_nums = File.read("w.txt").split.map(&:to_i) # shuffled arr.sort == generated file's arr expect( unordered_nums.sort ).to eq generated_file_nums =begin crystal require "bit_array" def sort_by_bit(rf, wf, size) ba = BitArray.new(size) File.each_line(rf) {|l| ba[l.to_i] = true } File.open(wf, "w") {|f| 0.upto(size-1) {|i| f.puts(i) if ba[i]} } end sort_by_bit("unordered.txt", "ordered.txt", 5) unordered_nums = (0...100).to_a.shuffle # shuffle randomly File.open("r.txt","w") {|f| unordered_nums.each {|n| f.puts(n) } } sort_by_bit("r.txt", "w.txt", 100) generated_file_nums = File.read("w.txt").split.map(&.to_i) # shuffled arr.sort == generated file's arr unordered_nums.sort.should eq generated_file_nums =end end it "#58 - One Edit Apart" do case_diff = ->a,b { b.chars.combination(b.size-1).include? a.chars } case_same = ->a,b { (0...a.size).count {|i| a[i]!=b[i] } < 2 } one_apart = ->a,b { a.size==b.size ? 
case_same[a,b] : case_diff[*[a,b].minmax_by(&:size)] } expect( one_apart["cat","dog"] ).to eq false expect( one_apart["cat","cats"] ).to eq true expect( one_apart["cat","cut"] ).to eq true expect( one_apart["cat","cast"] ).to eq true expect( one_apart["cat","at"] ).to eq true expect( one_apart["cat","acts"] ).to eq false end it "#59 - Largest Subset" do add = ->seq,n { h,t = seq[n-1]||n,seq[n+1]||n; seq[h],seq[t],size = t,h,t-h; (seq[:max],seq[:size]=(h..t),size) if size>seq[:size]; seq } largest = ->nums { nums.reduce({max:0..0, size:0}) {|seq,n| add[seq,n] if !seq[n]}[:max].to_a } expect( largest.([1,3,4,5]) ).to eq [*3..5] expect( largest.([1,5,4,3,6]) ).to eq [*3..6] expect( largest.([1,6,10,4,7,9,5]) ).to eq [*4..7] expect( largest.([1,6,10,4,7,9,8,5,11,3,12]) ).to eq [*3..12] end it "#60 - Josephus Problem" do safe_pos = ->n,k,ring=[*1..n] do ring.rotate!(k-1).shift until ring.size==1; ring[0] end #n,k=gets.chomp.split.map &:to_i; expect( safe_pos.(10, 3) ).to eq 4 expect( safe_pos.(100, 3) ).to eq 91 end end <file_sep>#### 7. Simple algorithms. - [Eight Queens](http://codingdojang.com/scode/392) A well-known problem, implemented recursively. ```ruby require 'benchmark' valid = ->square,x,y do square.reject {|cx,cy| cx==x || cy==y || cx+cy==x+y || cx-cy==x-y } end # counts as 1 when every queen sits on a mutually exclusive square. cnt = ->n,squares,row=1 do squares.select {|x,y| x == row}. reduce(0) do |sum,square| row < n ? sum + cnt[n, valid[squares, *square], row+1] : 1 end end queens = ->size,m=[*1..size] { cnt[size, board=m.product(m)] } expect(queens[2]).to eq 0 expect(queens[4]).to eq 2 expect(queens[6]).to eq 4 expect(queens[7]).to eq 40 expect(queens[8]).to eq 92 expect(queens[10]).to eq 724 expect(Benchmark.realtime { queens[8] }).to be_between(0.001, 0.01) ``` - [Binary tree layout](http://codingdojang.com/scode/534) Binary tree construction and in-order traversal. ```ruby add = ->par,node do par ? node.y = par.y + 1 : (return node) chd = node.val < par.val ? par.l : par.r chd ? add[chd, node] : (node.val < par.val ?
par.l = node : par.r = node) par end layout = ->btree_str=gets.chop,cnt=0 do Node = Struct.new(:val, :x, :y, :l, :r) nodes = btree_str.chars.map {|c| Node.new(c, 0, 1)} btree = nodes.reduce(&add) trav = ->node do trav[node.l] if node.l node.x = cnt+=1 puts node.values[0..2].join(" ") trav[node.r] if node.r end trav[btree] end in_ordered = "a 1 4\n" + "c 2 3\n" + "e 3 6\n" + "g 4 5\n" + "h 5 4\n" + "k 6 2\n" + "m 7 3\n" + "n 8 1\n" + "p 9 3\n" + "q 10 5\n" + "s 11 4\n" + "u 12 2\n" expect{ layout["nkcmahgeupsq"] }.to output(in_ordered).to_stdout ``` - [Counting primes](http://codingdojang.com/scode/503) [Sieve of Eratosthenes](https://en.wikipedia.org/wiki/Sieve_of_Eratosthenes). ```ruby primes =->n,s=[1]*n do (2..n).select {|e| (e*2..n).step(e){|i|s[i-1]=0} if s[e-1] > 0 }.size end expect( primes[1000] ).to eq 168 expect( primes[10000000] ).to eq 664579 ``` - [Paint program](http://codingdojang.com/scode/483) [BFS: Breadth-first search](https://en.wikipedia.org/wiki/Breadth-first_search). ```ruby require 'matrix' paint_image = proc do xy, start = (1..2).map { gets.split.map(&:to_i) } image = Matrix[*(1..xy[0]).map { gets.chars[0, xy[1]] }] q, find_color, paint = [start[0,2]], image[*start[0,2]].to_s, start[2].to_s cell = ->x,y { x>=0 && y>=0 && image[x,y] == find_color } next_of = ->x,y { [[1,0],[-1,0],[0,-1],[0,1]].map {|r,c| [x+r,y+c] }.select &cell } nexts = ->x,y { image.send(:[]=, x, y, paint); next_of[x,y] } q += nexts[*q.shift] until !q[0] puts image.to_a.map(&:join) end $stdin = StringIO.new("10 10\n" + "5 5 3\n" + "0000000000\n" + "0000001000\n" + "0000110100\n" + "0011000010\n" + "0100000010\n" + "0100000010\n" + "0100000100\n" + "0010001000\n" + "0001011000\n" + "0000100000\n") expect { paint_image[] }.to output("0000000000\n" + "0000001000\n" + "0000113100\n" + "0011333310\n" + "0133333310\n" + "0133333310\n" + "0133333100\n" + "0013331000\n" + "0001311000\n" + "0000100000\n").to_stdout ``` - [Euclid Problem](http://codingdojang.com/scode/437) [Extended
Euclidean](https://en.wikipedia.org/wiki/Extended_Euclidean_algorithm), [Bézout's identity](https://en.wikipedia.org/wiki/B%C3%A9zout%27s_identity), [Modular inverse](https://rosettacode.org/wiki/Modular_inverse). ```ruby # given a and b, find x, y and d (their GCD) satisfying ax + by = d find_xy_gcd = ->a,b do x,s,y,t = 1,0,0,1 (a,q,b = b,*a.divmod(b); x,s,y,t = s,x-q*s,t,y-q*t) until b.zero? [x,y,a] end expect(find_xy_gcd[4,6]).to eq [-1,1,2] expect(find_xy_gcd[17,17]).to eq [0,1,17] expect(find_xy_gcd[1071,1029]).to eq [-24,25,21] expect(find_xy_gcd[78696,19332]).to eq [212,-863,36] ``` <file_sep>#### 4. Enumerator#lazy helps when handling huge or infinite data. - [Fibonacci sequence](http://codingdojang.com/scode/461) Collect every number in an infinite sequence that satisfies a condition. ```ruby fib = ->x { x < 2 ? x : fib[x-2] + fib[x-1] } fib_seq = ->n { (0..1/0.0).lazy.map(&fib).take_while {|f| f < n}.force } expect( fib_seq[10] ).to eq [0, 1, 1, 2, 3, 5, 8] expect( fib_seq[100000].size ).to eq 26 expect( fib_seq[100000].last ).to eq 75025 ``` - [N'th Palindrome](http://codingdojang.com/scode/401) Take the n-th number in an infinite sequence that satisfies a condition. ```ruby pal = ->nth { (0..1/0.0).lazy. select {|n| n == n.to_s.reverse.to_i}.take(nth).force.last } expect( [1, 4, 30, 100].map(&pal) ).to eq [0, 3, 202, 909] ``` - [Amazon interview question](http://codingdojang.com/scode/418) Process a huge file line by line without loading the whole file into memory. Light and fast. ```ruby require 'rspec' require 'rspec/mocks/standalone' extend RSpec::Matchers # https://github.com/pry/pry/issues/1277 def cnt_officers_at(time, file) exist = ->chk_in,chk_out { (time >= chk_in && time <= chk_out) ?
1 : 0 } File.open(file,'r').each_line.lazy .map {|person| exist[*person.strip.split] }.force.reduce :+ end # mock file data = "09:12:23 11:14:35\n10:34:01 13:23:40\n10:34:31 11:20:10\n" log_file = allow(File).to receive(:open).and_return(data) expect( cnt_officers_at("09:13:20", log_file) ).to eq 1 expect( cnt_officers_at("11:05:20", log_file) ).to eq 3 expect( cnt_officers_at("09:13:20", log_file) ).to eq 1 expect( cnt_officers_at("11:05:20", log_file) ).to eq 3 ``` <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#11 - Primary Arithmetic" do # count the carry operations made when adding two numbers. # tip: a bare split behaves like split(' '); by convention input is space-separated justify = ->nums { nums.map {|s| s.rjust(nums.map(&:size).max) } } carry = ->nums { nums.map {|n| n.chars.map(&:to_i) }.transpose.reverse. reduce([0]) {|c,(a,b)| c[-1]+a+b > 9? c<<1 : c }.count(1) } stdin = proc { (gets('0 0').split(/\n/)-["","0 0"]).map(&:split) } prt = ->n { puts "%s carry operation%s." % [n>0?n:"No", n>1?"s":""] } prt_c = proc {|strs| (strs||stdin[]).map(&justify).map(&carry).each(&prt) } expect( justify.(["123","45"]) ).to eq ["123"," 45"] expect( justify.(["12","345"]) ).to eq [" 12","345"] expect( carry.([" 99","123"]) ).to eq 2 expect( carry.(["555","555"]) ).to eq 3 stdin_sample = [%w(123 456), %w(555 555), %w(123 594)] result = "No carry operation.\n3 carry operations.\n1 carry operation.\n" expect { prt_c.(stdin_sample) }.to output(result).to_stdout end it "#12 - Convert a number to money format" do int = ->num { num.reverse.scan(/.{1,3}/).join(',').reverse } money_format = ->num { n,q = num.to_s.split('.'); int[n] + (q ?
"."+q:"") } expect( money_format.(1000) ).to eq "1,000" expect( money_format.(20000000) ).to eq "20,000,000" expect( money_format.(-3245.24) ).to eq "-3,245.24" end it "#13 - Counting Code Lines" do comment = /\s*\/(\*.+?\*\/|\/[^\n]*$)/m cnt = ->code { code.gsub(comment, "").split("\n").map(&:empty?).count(false) } code1 = "// This file contains 3 lines of code\n" + "public interface Dave {\n" + " /**\n" + " * count the number of lines in a file\n" + " */\n" + " int countLines(File inFile); // not the real signature!\n" + "}\n" code2 = "/*****\n" + " * This is a test program with 5 lines of code\n" + " * \/* no nesting allowed!\n" + "\n" + "//*****//***/// Slightly pathological comment ending...\n" + "public class Hello {\n" + " public static final void main(String [] args) { // gotta love Java\n" + " // Say hello\n" + " System./*wait*/out./*for*/println/*it*/(\"Hello/*\");\n" + " }\n" + "}\n" expect( cnt[code1] ).to eq 3 expect( cnt[code2] ).to eq 5 end it "#14 - Rotation 3 ways" do # rotate(+n), rotate(0), rotate(-n) rotate = ->s,list=s.split { list.rotate(-list.shift.to_i)*' ' } cases = { stay: "0 똘기 떵이 호치 새초미", foward: ["1 10 20 30 40 50", "4 가 나 다 라 마 바 사"], reverse: "-2 A B C D E F G" } expect( rotate.(cases[:stay]) ).to eq "똘기 떵이 호치 새초미" expect( rotate.(cases[:foward][0]) ).to eq "50 10 20 30 40" expect( rotate.(cases[:foward][1]) ).to eq "라 마 바 사 가 나 다" expect( rotate.(cases[:reverse]) ).to eq "C D E F G A B" end it "#15 - Nth Palindrome" do # simple way # simple = ->nth { (0..1.0./0.0).lazy. # select{|n|n==n.to_s.reverse.to_i}.take(nth).force.last } # expect([1,4,30,100].map(&simple)).to eq [0,3,202,909] # fast way : 자릿수별 거울수 갯수를 산술적으로 계산 idx = ->n { (2..n).reduce([10]) {|a,d| a[-1]<n ? [d, n-a[-1]-1, a[-1]+9*10**(d/2-(d%2^1))] : (break a)} } gen = proc {|d,pos| h = 10**(d/2-(d%2^1))+pos; [h, (h/10**(d%2)).to_s.reverse] } pal = ->n { n>10 ? 
gen[*idx[n]].join.to_i : n-1 } nths, pals = [1,4,30,100,30000,1000000], [0,3,202,909,200000002,90000000009] expect( nths.map(&pal) ).to eq pals expect( Benchmark.realtime { pal[10**1000]} ).to be_between(0.001, 0.1) end it "#16 - 미로통과 검사" do require 'matrix' maze_checker = ->maze_str=gets("\n\n") do maze = Matrix[*maze_str.split("\n").map(&:chars)] start, dest = %w(< >).map {|ch| maze.index(ch) }; q = [start] road = ->x,y { x>=0 && y>=0 && " ><".chars.include?(maze[x,y]) } next_of = ->x,y { [[1,0],[-1,0],[0,-1],[0,1]].map {|a,b| [x+a,y+b] }.select &road } nexts = ->x,y { maze.send(:[]=, x, y, "#"); next_of[x,y] } (x,y = q.shift; maze[x,y]==">"? (return true) : q += nexts[x,y]) until !q[0] false end maze1 = "< >\n" maze2 = "########\n#< #\n# ## #\n# ## #\n# >#\n########\n" maze3 = "#######\n#< #\n##### #\n# #\n# #####\n" + "# # #\n# # # #\n# #>#\n#######\n" maze4 = "< # >\n\n" maze5 = "########\n#< #\n# ##\n# #>#\n########\n" maze6 = "#< # #\n# # #\n# # >#\n" expect( [maze1, maze2, maze3].map &maze_checker ).to eq [true, true, true] expect( [maze4, maze5, maze6].map &maze_checker ).to eq [false, false, false] end it "#17 - Quine" do quine = proc { quine = "\nputs \"quine = \" + quine.inspect + quine" puts "quine = " + quine.inspect + quine } expected = %q(quine = "\nputs \"quine = \" + quine.inspect + quine" puts "quine = " + quine.inspect + quine)+"\n" expect { quine.() }.to output(expected).to_stdout # 1. eval way #eval quine="print 'eval quine=';p quine" # 2. inspect way #quine = "\nputs \"quine = \" + quine.inspect + quine" #puts "quine = " + quine.inspect + quine # 3. cheat ways #$><<IO.read($0) #puts IO.read($0) #puts open($0).gets #$stdout << IO.read($0) end it "#18 - Find Files" do ls = ->ext,str { Dir.glob("**/*.#{ext}"). 
select {|f| File.read(f) =~ /#{str}/ } } expect( ls.("txt", "LIFE IS TOO SHORT") ).to eq [] expect( ls.("rb", "rspec") ).to include("spec/spec_helper.rb") end it "#19 - Convert tab to spaces" do convert = ->code { code.gsub("\t", " ") } cleanse = ->src { File.open(src,'r+') {|f| f.write(convert[File.read(src)]) } } #=> cleanse["source file"] expect( convert.("\t") ).to eq " " end it "#20 - Paging" do # m : total item count, n : items per page, output : total page count pages = ->m,n { ( m.to_f / n ).ceil } data = [[0,1],[1,1],[2,1],[1,10],[10,10],[11,10]] expect( data.map(&pages) ).to eq [0,1,2,1,1,2] end end <file_sep>#### 9. irb and rspec RSpec can be used from irb or pry to test an idea on the spot. - Using RSpec in irb ```ruby require 'rspec' include RSpec::Matchers expect(1).to eq 1 ``` - Testing STDIN ```ruby test_stdin = "12345\n" $stdin = StringIO.new(test_stdin) def str_to_i gets.to_i end expect(str_to_i).to eq 12345 ``` - Testing STDOUT ```ruby def echo(str) puts str end expect { echo("abcde") }.to output("abcde\n").to_stdout ``` - Mock a file ```ruby gem install rspec-mocks # install ``` ```ruby require 'rspec/mocks/standalone' extend RSpec::Matchers def read(file_name) # lazy way File.read(file_name) end log_file = allow(File).to receive(:read).and_return("file contents\n") expect( read(log_file) ).to eq "file contents\n" ``` - Benchmark ```ruby require 'benchmark' expect( Benchmark.realtime { puts "do_something" } ).to be_between(0.0, 0.1) ``` - RSpec-Benchmark ```ruby gem install 'rspec-benchmark' # install ``` ```ruby require 'rspec-benchmark' include RSpec::Benchmark::Matchers expect { print "do " }.to perform_under(10).ms # average runtime under 10ms # must be able to run at least 99 times per second.
ips = iteration per sec expect { sleep 0.01; print "do " }.to perform_at_least(99).ips ``` <file_sep>class RDate attr_reader :year, :month, :day attr_reader :year_to_days, :month_to_days def initialize(date) ymd = ->str { [[0,4],[4,2],[6,2]].map {|s,e| str[s,e].to_i } } @year, @month, @day = ymd[date] raise "서기 0년은 존재하지 않습니다" if @year == 0 @month_to_days = sum_days_with_month @year_to_days = sum_days_with_year end def leaf_year? @year%4==0 && @year%100!=0 || @year%400==0 end def sum_date @year_to_days + @month_to_days + @day end def self.subdate(from, to) from_date, to_date = RDate.new(from), RDate.new(to) (to_date.sum_date - from_date.sum_date).abs end private def sum_days_with_month day_per_month = [0,31,28 + (leaf_year?? 1:0),31,30,31,30,31,31,30,31,30,31] days = day_per_month[0,@month].reduce(0, :+) end def sum_days_with_year year = @year - 1 days = year == 0 ? 0 : 365*year leaf_days = year/4 - year/100 + year/400 days + leaf_days end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#101 - Word ladder" do start, dest = %w(hit cog); dict = %w(hot dot dog lot log) nexts = ->word,dict { dict.select {|w| (word.chars-w.chars).size==1 } } laddr = ->dict,word,path { path[-1] == dest ? 
{path*' ' => path.size} : nexts[word,dict].map {|e| laddr[dict-[e],e,path+[e]] } } w_laddr = ->s,d,dict { v = laddr[dict<<d,s,[s]].flatten.reduce(&:merge) min = v.values.min; [min, v.select{|_,e|e==min}.keys] } expect( w_laddr[start, dest, dict] ).to eq [5, ["hit hot dot dog cog", "hit hot lot log cog"]] end it "#102 - Largest number" do m_perm = ->nums { nums.permutation.to_a.map(&:join).max.to_i } m_sort = ->nums { nums.sort{|x,y|[y,x]*''<=>[x,y]*''}.join.to_i } nums = [3,30,34,5,9] expect(m_perm[nums]).to eq 9534330 # permutation way expect(m_sort[nums]).to eq 9534330 # sorting way end it "#104 - Count primes" do # loop way # primes =->n{ (2..n).select{|e|(2..Math.sqrt(e)).none?{|f|(e%f)==0}}.size } # Sieve of Eratosthenes primes =->n,s=[1]*n do (2..n).select {|e| (e*2..n).step(e){|i|s[i-1]=0} if s[e-1]>0}.size end expect( primes[1000] ).to eq 168 expect( primes[10000000] ).to eq 664579 # crystal & ruby def count_primes(max) # Sieve of Eratosthenes sieve = Array.new(max + 1, true) sieve[0] = sieve[1] = false 2.step(max**0.5) {|i| (i*i).step(max,i) {|j| sieve[j]=false } if sieve[i] } sieve.count {|e|e} end end it "#105 - Count number character" do cnter = ->n { ([*1..n]*'').chars.reduce(Hash.new(0)) {|a,c| a[c]+=1; a} } # cnter = ->n { ('1'..n.to_s).each_with_object(Hash.new(0)) {|s,h| s.each_char{|c| h[c]+=1 } } } expect(cnter[5]).to eq({"1"=>1, "2"=>1, "3"=>1, "4"=>1, "5"=>1}) expect(cnter[1000]).to eq( {"1"=>301, "2"=>300, "3"=>300, "4"=>300, "5"=>300, "6"=>300, "7"=>300, "8"=>300, "9"=>300, "0"=>192} ) end it "#106 - Sum of numbers" do msum = ->l,h do (l..h).reduce(0) {|sum,n| n.to_s.chars.map(&:to_i).reduce(:*)+sum } end expect( msum.(1, 10) ).to eq 45 expect( msum.(10, 15) ).to eq 15 expect( msum.(10, 1000) ).to eq 93150 end it "#107 - Boxes area" do gap = ->a { a.flat_map {|e| e.each_cons(2).map {|c|c.reduce(:-).abs} } } area = ->a,n=a.size { gap[(a+a.transpose).map{|_|[0,*_,0]}].reduce(:+) + n*n*2 } expect( area[ [[1,2], [2,4]] ]).to eq 32 expect( area[ 
[[1,4,3,4], [2,3,4,1], [3,4,2,1], [9,3,2,1]]] ).to eq 120 end it "#108 - Montecarlo method" do m_pi = ->n do (1..n).reduce {|_,e|_+(rand(0..1.0)**2+rand(0..1.0)**2>1? 0:1) }*4.0/n end expect( m_pi[10000] ).to be_within(0.1).of(3.1415) expect( m_pi[20000] ).to be_within(0.1).of(3.1415) expect( m_pi[30000] ).to be_within(0.1).of(3.1415) end it "#109 - Scale group" do # { 대표수 => 비교 가능한 수 }로 표현. # { 1=>[1, 2, 3, 4], 6=>[6, 5, 4] }. 2는 1234와 비교가능, 4는 1234, 654와 비교가능 def make_group scales = proc { (1..gets.to_i).map { gets.split.map(&:to_i) } } connect = ->set,pair do left, right = pair pushed = set.select {|key,vals| set[key] << right if vals.index(left) } set[left] = set[right]? [left]+set.delete(right) : pair if pushed.size.zero? set end [gets.to_i, scales[].reduce({}, &connect)] end def unknowns known = ->i,group { group.reduce([]) {|a,(_,mem)| mem.index(i)? a|mem : a } } n, group = make_group (1..n).map {|i| n - known[i, group].size } end case_all_chained = ["5", "4", "3 4", "2 3", "1 2", "4 5"].join("\n") case_part_merged = ["5", "4", "2 3", "1 3", "5 4", "2 4"].join("\n") case_part_chained = ["6", "5", "1 2", "2 3", "3 4", "5 4", "6 5"].join("\n") $stdin = StringIO.new(case_all_chained) #=> for stdin test expect( unknowns() ).to eq [0, 0, 0, 0, 0] #=>{1=>[1,2,3,4,5]} $stdin = StringIO.new(case_part_merged) expect( unknowns() ).to eq [3, 2, 1, 1, 3] #=> {2=>[2,3,4] 1=>[1,3], 5=>[5,4]} $stdin = StringIO.new(case_part_chained) expect( unknowns() ).to eq [2, 2, 2, 0, 3, 3] #=> {1=>[1,2,3,4], 6=>[6,5,4]} =begin # shorthands adjs = proc { (1..gets.to_i).map { gets.split.map(&:to_i) } } connect = ->set,pair { l,r = pair add_grp = set.select {|tag,g| set[tag] << r if g.member?(l) } (set[l] = (set[r]? [l]+set.delete(r) : pair)) if add_grp.empty?; set } tagging = proc { [gets.to_i, adjs[].reduce({}, &connect)] } knowns = ->i,set { set.reduce([]) {|a,(_,g)| g.member?(i)? 
a|g : a } } unknowns = proc { n,set = tagging[]; (1..n).map {|i| n - knowns[i,set].size } } case_all_chained = ["5", "4", "3 4", "2 3", "1 2", "4 5"].join("\n") case_part_merged = ["5", "4", "2 3", "1 3", "5 4", "2 4"].join("\n") case_part_chained = ["6", "5", "1 2", "2 3", "3 4", "5 4", "6 5"].join("\n") $stdin = StringIO.new(case_all_chained) #=> for stdin test expect( unknowns.() ).to eq [0, 0, 0, 0, 0] #=>{1=>[1,2,3,4,5]} $stdin = StringIO.new(case_part_merged) expect( unknowns.() ).to eq [3, 2, 1, 1, 3] #=> {2=>[2,3,4] 1=>[1,3], 5=>[5,4]} $stdin = StringIO.new(case_part_chained) expect( unknowns.() ).to eq [2, 2, 2, 0, 3, 3] #=> {1=>[1,2,3,4], 6=>[6,5,4]} =end end it "#110 - winning probability" do odds = ->a,b { Rational((b**0.5).to_i-(a**0.5).to_i, b-a) } expect(odds[1, 4]).to eq "1/3".to_r expect(odds[1,16]).to eq "1/5".to_r expect(odds[1,2**60]).to eq "1/1073741825".to_r end end <file_sep>#### 3. 다양한 형태의 Stack. (구문평가식, reduce, 문자열) - [LISP계산기 문제. 풀이1](http://codingdojang.com/scode/533) 구문평가식은 스택을 사용한다. 이를 이용해 스택을 구성할 수 있다. ex) 1+(2+(3+4)) ```ruby calc = proc {|op,*args| args.reduce(op) || 0 } lisp_eval = ->exp { eval exp.gsub(/[(\s)]/, '('=>"calc[:", ' '=>',', ')'=>']') } # tests : arity 1, 3, 4, nested s-exp, complex s-exp cases = ["(+)"] + ["(- 10 3)", "(* 2 3)"] + ["(- 10 3 5)", "(* 2 3 4)"] + ["(* (+ 2 3) (- 5 3))", "(/ (+ 9 1) (+ 2 3))"] + ["(* 1 (- 2 3) 4 (+ 2 -1) 3)" ] expect( cases.map(&lisp_eval) ).to eq [0, 7, 6, 2, 24, 10, 2, -12] ``` - [LISP계산기 문제. 풀이2](http://codingdojang.com/scode/533) reduce로 토큰을 쌓은 뒤 조건에 도달하면 top을 계산하는 방식으로 스택을 구현했다. ```ruby calc = proc {|op,*args| args.reduce(op) || 0 } eval_top = ->s { t=[]; t.unshift(s.pop) until t[0].is_a? Symbol; s << calc[*t] } lisp_eval = ->str do tokens = str.gsub(/[()]/, "("=>"",")"=>" )").split. map {|e| e == ")" ? e : e =~ /[[:digit:]]/ ? e.to_i : e.to_sym } tokens.reduce([]) {|stack,e| e == ")" ? 
eval_top[stack] : stack << e }.pop end # tests : arity 1, 3, 4, nested s-exp, complex s-exp cases = ["(+)"] + ["(- 10 3)", "(* 2 3)"] + ["(- 10 3 5)", "(* 2 3 4)"] + ["(* (+ 2 3) (- 5 3))", "(/ (+ 9 1) (+ 2 3))"] + ["(* 1 (- 2 3) 4 (+ 2 -1) 3)" ] expect( cases.map(&lisp_eval) ).to eq [0, 7, 6, 2, 24, 10, 2, -12] ``` - [Simple Balanced Parentheses 문제](http://codingdojang.com/scode/457) 문자열을 스택으로 사용했다. ```ruby is_balanced = ->str { str.scan(/[()]/).reduce("v") {|a,e| e=='('? a+e : a.chop} == "v" } # test data test_str = "(5+6)∗(7+8)/(4+3)" balanced_strs = %w[ (()()()()) (((()))) (()((())())) ] not_balanced_strs = %w[ ((((((()) ())) (()()(() (()))( ())(() ] expect(is_balanced[test_str]).to eq true expect(balanced_strs.map &is_balanced).to eq [true]*3 expect(not_balanced_strs.map &is_balanced).to eq [false]*5 ``` <file_sep>require 'byebug' describe "Coding Dojang - http://codingdojang.com" do it "#31 - Persons in the office - Amazon" do # simple way def cnt_login(time, file) # simple way exist = ->log { (log.first<=time && log.last >=time) ? 1:0 } File.open(file,'r').each_line .map {|line| exist[line.strip.split] }.reduce :+ end # lazy way def cnt_login_lazy(time, file) # lazy way exist = ->log { (log.first <= time && log.last >= time) ? 1:0 } lazy_cnt_map = File.open(file,'r').each_line.lazy .map {|line| exist[line.strip.split] } lazy_cnt_map.force.reduce :+ end # mocking file data = "09:12:23 11:14:35\n10:34:01 13:23:40\n10:34:31 11:20:10\n" log_file = allow(File).to receive(:open).and_return(data) expect( cnt_login("09:13:20", log_file) ).to eq 1 expect( cnt_login("11:05:20", log_file) ).to eq 3 expect( cnt_login_lazy("09:13:20", log_file) ).to eq 1 expect( cnt_login_lazy("11:05:20", log_file) ).to eq 3 end it "#32 - Korea currency" do num = "영일이삼사오육칠팔구".chars.zip([*0..9]).to_h digit = "십백천만억조경".chars.zip([*1..4,8,12,16].map {|e|10**e}).to_h chunk = ->str { str.chars.map {|n| num[n]||digit[n]}.chunk_while {|i,j|i<j} } pairs = ->str { chunk[str].map {|n,*d| d.size>0? 
[n,[*d].reduce(:*)] : [1,n]} } revised = ->str { pairs[str].tap {|e| e[-1][-1]=10**7 if e[-2,2]&.map(&:last)==[10**8,10**3] }} valid = ->str { chunk[str].none? {|e| e[0]>=10**8} } kr_cur = ->str { valid[str]? revised[str].reduce(0) {|sum,(n,d)| d>sum ? (sum+n)*d : sum+(n*d)} : false} case1 = %w(영 일 칠 이천오 구천 일십만 십만) case2 = %w(일억오천 일억오천만 억오천만 사천구십칠조이천만삼백십육) expect( case1.map(&kr_cur) ).to eq [0, 1, 7, 2005, 9000, 10**5, 10**5] expect( case2.map(&kr_cur) ).to eq [15*10**7, 15*10**7, false, 4097000020000316] end it "#33 - Friend or Enemy. Path Compression" do root = ->n,nodes { n = nodes[n][:root] until nodes[n][:root]==n; n } set_node = ->n,r,rel,nodes { nodes[n][:root]=r; nodes[n][:rel]=rel; nodes } chk_cons = ->n, rel_info, nodes, incons do a, b, rel = rel_info[n] a_rel, b_rel = nodes[a][:rel], nodes[b][:rel] if root[a,nodes] == root[b,nodes] incons << n if (a_rel||0)^(b_rel||0) != rel else nodes = case [a_rel.nil?, b_rel.nil?] when [true, true] then set_node[b, a, rel, nodes] when [false, true] then set_node[b, root[a,nodes], a_rel^rel, nodes] when [true, false] then set_node[a, root[b,nodes], b_rel^rel, nodes] else set_node[root[b,nodes], root[a,nodes], a_rel^b_rel^rel, nodes] end end [nodes, incons] end find_mosoon = ->arrs do rels = ->s { s.map(&:split).map {|a,b,rel| [a.to_i, b.to_i, (rel=="f"? 0:1)] } } init = ->n { (1..n).reduce({}) {|a,e| a[e]={root:e}; a } } rel_info = rels[arrs]; size,line = rel_info[0]; result = (1..line).reduce([init[size],[]]) {|a,nth| chk_cons[nth, rel_info, *a] } result[1].size > 0 ? result[1] : "THE SPY DID NOT BETRAY" end rpt1 = ["3 3", "1 2 f", "2 3 e", "1 3 f"] rpt2 = ["5 7", "1 2 f", "3 4 e", "4 5 f", "2 4 e", "1 3 e", "1 4 e", "5 3 e"] rpt3 = ["5 7", "1 2 f", "3 4 e", "4 5 f", "2 4 e", "1 3 f", "1 4 e", "5 3 e"] rpt4 = ["5 7", "2 1 f", "3 4 e", "4 5 f", "2 4 e", "1 3 f", "1 4 e", "5 3 e"] expect( find_mosoon[rpt1] ).to eq [3] #모순 expect( find_mosoon[rpt2] ).to eq [5] #모순, 다중 병합. 
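# The consistency check above rests on XOR parity: each node stores its
# relation to its set root as 0 (friend) or 1 (enemy), so a reported pair is
# consistent iff the XOR of the two stored relations equals the report.
# A minimal standalone sketch of that rule (parity_consistent is illustrative):
parity_consistent = ->a_rel,b_rel,reported { (a_rel ^ b_rel) == reported }
parity_consistent[0, 1, 1] #=> true  (friend-of-root vs enemy-of-root => enemy)
parity_consistent[1, 1, 0] #=> true  (two enemies of the root are friends)
parity_consistent[0, 0, 1] #=> false (two friends of the root cannot be enemies)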
expect( find_mosoon[rpt3] ).to eq "THE SPY DID NOT BETRAY" #정상, 다중 병합. expect( find_mosoon[rpt4] ).to eq "THE SPY DID NOT BETRAY" #정상, 노드순서 뒤집힌 경우. end it "#34 - Mine map" do adjs = ->x,y { [*x-1..x+1].product([*y-1..y+1]) } cnt = ->adjs,mines { adjs.map {|x,y| mines.flatten(1).to_h[[x,y]]}.count('*') } stdin = ->io do (0...io.gets.split[1].to_i).map {|r| io.gets.chop.chars .map.with_index {|v,c|[[r,c],v]} } end mark = ->io,mines=stdin[io] do mines.map {|col| col.map {|xy,v| v=='*'? '*' : cnt[adjs[*xy],mines]}*'' } end #stdin = proc { (0...gets.split[1].to_i).map {|r| gets.chop.chars.map.with_index {|v,c|[[r,c],v]} } } #mark = ->mines=stdin[] { mines.map {|col| col.map {|xy,v| v=='*'? '*' : cnt[adjs[*xy],mines]}*'' } } #expect(mark[$stdin]).to eq %w(*100 2210 1*10 1110) $stdin = StringIO.new("4 4\n*...\n....\n.*..\n....\n") expect( mark[$stdin] ).to eq %w(*100 2210 1*10 1110) end it "#35 - Australian Voting" do def elect(names,votes) counter = votes.reduce(Hash.new(0)) {|cnt,prefer| cnt[prefer.first] +=1; cnt } min_votes, max_votes = counter.values.minmax if max_votes >= votes.size / 2 names[counter.key(max_votes)-1] else last_places = counter.select {|_,v| v == min_votes}.keys votes.each {|v| v.delete_if {|prefer| last_places.include? prefer} } elect(names, votes) end end names = ["<NAME>", "<NAME>", "<NAME>"] votes = [[1, 2, 3], [2, 1, 3], [2, 3, 1], [1, 2, 3], [3, 1, 2]] expect( elect(names,votes) ).to eq "<NAME>" # 출력코드 # n, inputs = gets.to_i, STDIN.read.split(/[\r\n]/) # names, votes = inputs.shift(n), inputs.map {|vote| vote.split.map &:to_i } # p elect(names, votes) #=> "<NAME> end it "#36 - Poker Hands" do to_n = "23456789TJQKA".chars.zip(2..14).to_h hand = ->cards do ranks, suits = cards.map(&:chars).transpose pairs, nums = ranks.map(&to_n).group_by {|_|_}. map {|k,g| [g.size,k] }.sort.reverse.transpose kind = %w(11111 2111 221 311 32 41).zip([0,1,2,3,6,7]).to_h[pairs.join] strt = 4 if nums.each_cons(2).all? 
{|i,j| i == j.next } flsh = 5 if suits.uniq.one?; stfl = 8 if strt && flsh [[kind, strt, flsh, stfl].compact.max, nums] end play_poker = ->file,games=File.readlines(file).map(&:split) do prt_result = ->idx { puts %w(Tie. Black\ wins. White\ wins.)[idx] } games.map {|cards| hand[cards[0,5]] <=> hand[cards[5,5]] }.each &prt_result end test_data = ["2H 3D 5S 9C KD 2C 3H 4S 8C AH\n", "2H 4S 4C 2D 4H 2S 8S AS QS 3S\n", "2H 3D 5S 9C KD 2C 3H 4S 8C KH\n", "2H 3D 5S 9C KD 2D 3H 5C 9S KH\n"] allow(File).to receive(:readlines).and_return(test_data) results = "White wins.\nBlack wins.\nBlack wins.\nTie.\n" expect{ play_poker["game_data.txt"] }.to output(results).to_stdout end it "#37 - Jolly Jumpers " do jolly = ->n,seq { seq.each_cons(2).map{|a,b|(a-b).abs}.reduce(:+)==(n-1)*2 } seqs = proc { (readline('0').split(/\n/)-["","0"]).map {|line| line.split.map(&:to_i) } } chk = proc { seqs[].map {|e| n,*seq=e; jolly[n,seq]? "Jolly":"Not Jolly" } } expect( jolly.(4, [1,4,2,3]) ).to eq true expect( jolly.(5, [1,4,2,-1,6]) ).to eq false # rspec에서는 동작하지만 guard-rspec 에서는 아래와 같은 에러가 발생한다. # Errno::ENOENT: No such file or directory @ rb_sysopen - -f # byebug $stdin = StringIO.new("4 1 4 2 3\n5 1 4 2 -1 6\n0") expect { puts chk.() }.to output("Jolly\nNot Jolly\n").to_stdout end it "#38 - Random Walk" do require 'set' X, Y = 100, 100; matrix = [*1..Y].product([*1..X]).to_set Tile = Struct.new :x, :y, :stay, :visit tiles = matrix.map {|e| [e, Tile.new(e[0],e[1],0,0)] }.to_h at = ->x,y { tiles[[x,y]] || false } crawl = ->bug { n=at[bug[0]+rand(-1..1),bug[1]+rand(-1..1)]; n ? [n.x,n.y] : bug } trail = ->prev,now { prev==now ? at[*prev].stay+=1 : at[*now].visit+=1 } move = ->bug { crawl[bug].tap {|e| trail[bug,e]} } bug=[rand(X)+1,rand(Y)+1]; (bug=move[bug]; matrix.delete(bug)) until matrix.empty? 
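# A deterministic, down-scaled sketch of the walk above: the same
# crawl-until-covered loop on a tiny board with a seeded RNG so the run is
# reproducible. (tiny_walk, its seed, and the bounds check are illustrative
# additions, not part of the original quiz.)
tiny_walk = ->w,h,rng=Random.new(42) do
  unvisited = [*1..h].product([*1..w])          # tiles still to be covered
  bug = [rng.rand(1..h), rng.rand(1..w)]        # random start tile
  moves = 0
  until unvisited.empty?
    unvisited.delete(bug)                       # current tile is now visited
    nxt = [bug[0]+rng.rand(-1..1), bug[1]+rng.rand(-1..1)]
    bug = nxt if nxt[0].between?(1,h) && nxt[1].between?(1,w) # stay if off-board
    moves += 1
  end
  moves                                         # steps needed to cover the board
end
# Covering all w*h tiles takes at least w*h steps, since each step
# visits at most one new tile.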
# puts "total movements : #{tiles.values.map(&:visit).reduce :+}" # tiles.each_slice(X).each {|cols| cols.each { |e| # print "[%3d (%3d)] " % [e[1].stay, e[1].visit]}; print "\n" } end it "#39 - Max rope length - labyrinth" do # BFS. Max length max_rope_length = ->xy=gets.split.map(&:to_i) do labyrinth = Matrix[*(1..xy[1]).map { gets.chars[0, xy[0]] }] q, max_len = [labyrinth.index(".") << -1], 0 road = ->x,y,_ { x>=0 && y>=0 && labyrinth[x,y] == "." } adjs = ->x,y,len { [[1,0],[-1,0],[0,-1],[0,1]].map {|r,c| [x+r,y+c,len] }.select &road } nexts = ->x,y,len { labyrinth.send(:[]=, x, y, "#") max_len = [max_len, len+1].max; adjs[x,y,len+1] } q += nexts[*q.shift] until !q[0] "Maximum rope length is #{max_len}." end max_ropes = proc { puts (1..gets.to_i).map { max_rope_length[] } } test_labyrinths = "2\n" + "3 3\n" + "###\n" + "#.#\n" + "###\n" + "7 6\n" + "#######\n" + "#.#.###\n" + "#.#.###\n" + "#.#.#.#\n" + "#.....#\n" + "#######\n" expected_stdout = "Maximum rope length is 0.\n" + "Maximum rope length is 8.\n" $stdin = StringIO.new(test_labyrinths) expect { max_ropes.call }.to output( expected_stdout ).to_stdout end it "#40 - Directory Diff " do require 'fileutils' ls = ->d { Dir.glob(d+"**/*").reduce([]) {|a,e| File.file?(e)? a<<e.sub(d,""):a }} trx = ->d1,d2 { lst,lsr = ls[d1],ls[d2] %w(T R X).zip([lst-lsr, lsr-lst, (lsr&lst).select {|f| !FileUtils.cmp(d1+f, d2+f) }]).to_h } #=> when test/ has no /bin, real/ has no /spec, and the "same" file differs only in content.
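# Filesystem-free sketch of the same T/R/X classification: compare
# path=>content hashes instead of real directories, which makes the split
# easy to unit-test. (trx_sets and the sample hashes are illustrative only.)
trx_sets = ->test_h,real_h do
  only_t  = test_h.keys - real_h.keys                              # T: only in test
  only_r  = real_h.keys - test_h.keys                              # R: only in real
  changed = (test_h.keys & real_h.keys).select {|f| test_h[f] != real_h[f] } # X
  { "T" => only_t, "R" => only_r, "X" => changed }
end
trx_sets[{"same"=>"a", "t_only"=>"1"}, {"same"=>"b", "r_only"=>"2"}]
#=> {"T"=>["t_only"], "R"=>["r_only"], "X"=>["same"]}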
# p trx["test/", "real/"] # {"T"=>["spec/redis_spec.rb", "spec/spec_helper.rb"], # "R"=>["bin/bundler", "bin/htmldiff", "bin/ldiff", "bin/respec", "bin/rspec"], # "X"=>["same"] } end end <file_sep>require 'rdate' describe "RDate - 그레고리력 기준" do it ".new - 길이 8, Date문자열로 연월일 숫자 획득" do r_date = RDate.new("20160705") expect( r_date.year ).to eq 2016 expect( r_date.month ).to eq 7 expect( r_date.day ).to eq 5 end it ".month_to_days - 해당년 1월 1일 부터 해당년 전월까지 경과일" do expect( RDate.new("20160105").month_to_days ).to eq 0 expect( RDate.new("20160205").month_to_days ).to eq 31 expect( RDate.new("20160305").month_to_days ).to eq 31+29 expect( RDate.new("20160405").month_to_days ).to eq 31+29+31 end it ".month_to_days - 윤달이 고려된 경과일 계산" do r_date = RDate.new("00040301") expect( r_date.leaf_year? ).to eq true expect( r_date.month_to_days).to eq 31+29 end it ".year_to_days - 서기 1년 1월 1일로 부터 해당년 전년까지 경과일" do expect( RDate.new("00010101").year_to_days ).to eq 0 expect( RDate.new("00020101").year_to_days ).to eq 365 expect( RDate.new("00030101").year_to_days ).to eq 365*2 end it ".year_to_days - 윤년은 366일로 계산" do # 4년이면 윤년 expect( RDate.new("00040101").year_to_days ).to eq 1095 expect( RDate.new("00050101").year_to_days ).to eq 1095 + 366 # 100년이면 평년 expect( RDate.new("01000001").year_to_days ).to eq 36159 expect( RDate.new("01010001").year_to_days ).to eq 36159 + 365 # 400년이면 윤년 expect( RDate.new("04000001").year_to_days ).to eq 145731 expect( RDate.new("04010001").year_to_days ).to eq 145731 + 366 end it ".sum_date - 서기 1년 1월 1일로 부터 경과일" do expect( RDate.new("00010101").sum_date ).to eq 1 expect( RDate.new("00010301").sum_date ).to eq 31+28+1 end it ".subdate - 서기 0년 입력시 에러처리" do not_exist_zero = "서기 0년은 존재하지 않습니다" expect{ RDate.subdate("00000101", "20160706" ) }. to raise_error.with_message not_exist_zero expect{ RDate.subdate("20160706", "00000101" ) }. 
to raise_error.with_message not_exist_zero end it ".subdate - Date간 차이 일자 계산" do from, to = RDate.new("00010101"), RDate.new("00010201") expect( to.sum_date - from.sum_date ).to eq 31 expect( RDate.subdate("00010101", "00010201") ).to eq 31 expect( RDate.subdate("20160201", "20160301") ).to eq 29 # params reverse-case expect( RDate.subdate("20160301", "20160201") ).to eq 29 # dojang test datas expect( RDate.subdate("20070515", "20070501") ).to eq 14 expect( RDate.subdate("20070501", "20070515") ).to eq 14 expect( RDate.subdate("20070301", "20070515") ).to eq 75 end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#21 - Convert unit" do # 1 inch = 2.54 cm, 1 cm = 10 mm, 1 inch = 72 pt # 1 inch = 96 px, 1 pt = 20 dxa, 1 dxa = 635 emu # %g - float formatting 1.0 -> 1 , 1.375 -> 1.375 units = %w(inch cm mm pt px dxa emu) tr_rate_inch = [1, 2.54, 25.4, 72, 96, 20*72, 635*20*72] unit = units.zip(tr_rate_inch).to_h conv = ->from,to { num,u = from.split; ["%g" % ((unit[to].to_f / unit[u]) * num.to_i) , to]*' ' } expect( conv["10 cm", "cm"] ).to eq "10 cm" expect( conv["10 inch", "mm"] ).to eq "254 mm" expect( conv["1024 px", "pt"] ).to eq "768 pt" expect( conv["768 px", "inch"] ).to eq "8 inch" expect( conv["9144000 emu", "inch"] ).to eq "10 inch" expect( conv["12000 dxa", "px"] ).to eq "800 px" end it "#22 - Minimum Pair" do min_pair = ->sorted_list { sorted_list.each_cons(2).min_by{|a,b|b-a} } expect( min_pair.([1,3,4,8,13,17,20]) ).to eq [3,4] end it "#23 - 3n + 1" do # n이 홀수면 3*n+1, 짝수면 n/2. n이 1이 될 때 까지 수의 길이(갯수)를 구한다. # n의 범위를 입력 받아서 3n+1연산을 했을 때 가장 길이가 긴 n의 길이를 구한다. seq = ->num { num == 1 ? num : seq.(num.odd?? 
3*num+1 : num/2) + 1 } max_len = ->lines=$< { lines.map {|line| limits = line.split [limits, eval(limits*'..').map(&seq).max]*' ' } } expect( seq.(22) ).to eq 16 #=> for stdin/stdout lines = ["1 10", "100 200", "201 210", "900 1000"] max_seq_lens = ["1 10 20", "100 200 125", "201 210 89", "900 1000 174"] expect( max_len.(lines) ).to eq max_seq_lens end it "#24 - Manipulate names" do str = "이유덕,이재영,권종표,이재영,박민호,강상희,이재영,김지완,최승혁,이성연," + "박영서,박민호,전경헌,송정환,김재성,이유덕,전경헌" names = str.split(/,/) # Outputs for questions 1, 2, 3, 4 in order. # puts names.select {|n| n[0]=~/[김|이]/ }.group_by {|c|c[0]}. # map {|f,v| "#{f}: #{v.size}" } # puts names.count {|n| n=="이재영" } # puts names.uniq.join # puts names.uniq.sort end it "#25 - Pots of gold game" do # Two players take turns picking either the frontmost or the rearmost pot; # find the best strategy for A to win. # Strategy: assume both A and B always make the optimal pick, taking the larger value in A - B - A order. # So sum = A's pick + the best value left after B's optimal pick; # take the maximum of these recursively. sum = ->p { [*p][0]? [ p[0] + [sum[p[1..-2]], sum[p[2..-1]]].min, p[-1] + [sum[p[1..-2]], sum[p[0..-3]]].min ].max : 0 } expect( sum.([9,8,7,5,4,6,3,1,2]) ).to eq 25 expect( sum.([3,6,2,8,1,9,4,5,7]) ).to eq 17 #=> a position where B is bound to win. end it "#26 - Smallest Range with Cartesian Product" do # min_range = ->l { l.shift.product(*l).map(&:minmax).sort_by{|a,b|b-a}[0] } min_range = ->lists { lists.shift.product(*lists). map(&:minmax).min_by {|a,b| b-a } } lists = [[4,10,15,24,26], [0,9,12,20], [5,18,22,30]] expect( min_range.(lists) ).to eq [20,24] end it "#27 - K Palindrome" do def pal(s, k) return "NO" if k < 0 return "YES" if s.size < 2 s[0]==s[-1]? pal(s[1..-2],k) : pal(s[1..-1],k-1) || pal(s[0..-2],k-1) end expect( pal("ab", 1) ).to eq "YES" expect( pal("abc", 1) ).to eq "NO" expect( pal("abxa", 1) ).to eq "YES" expect( pal("abdxa", 1) ).to eq "NO" expect( pal("asfsdfasdfsdfsadfasfadfasdfadfadfadfdf", 15) ).to eq "NO" expect( pal("asfsdfasdfsdfsadfasfadfasdfadfadfadfdf", 30) ).to eq "YES" # one-liner # pal = ->s,k { k<0? "NO" : s[1]? s[0]==s[-1]? 
pal[s[1..-2],k] : # pal[s[1..-1],k-1] || pal[s[0..-2],k-1] : "YES" } end it "#28 - Special sort" do s_sort = ->list { list.group_by {|e| e < 0 }.map(&:last).flatten } expect( s_sort.([-1,1,3,-2,2]) ).to eq [-1,-2, 1, 3, 2] end it "#29 - Array Swapping in-place - Amazon" do # ver.simple transform = ->a { a.shift(a.size/2).zip(a).flatten } expect(transform[%w(a b c 1 2 3)]).to eq %w(a 1 b 2 c 3) # ver.swap swap = ->arr { arr.each_slice(2).map {|a,b|a,b=b,a}.flatten } transform = ->a,n=a.size/2 { (1...n).each{|i|a[n-i,i*2]=swap[a[n-i,i*2]]}; a} # ver.nested-loop to catesian swap = ->a { n=a.size/2; [*1...n].product([*0...2]).each {|i,j| idx=n-i+2*j a[idx],a[idx+1] = a[idx+1],a[idx] }; a } # ver.in-place transform = ->a,n=a.size/2 { (1...n).each {|i| (0...i).each{|j| x=n-i+2*j; a[x],a[x+1]=a[x+1],a[x]}}; a } expect( transform.(%w(a b c 1 2 3)) ).to eq %w(a 1 b 2 c 3) end it "#30 - Alphabet with numbers - Facebook" do ch = ('1'..'26').zip('a'..'z').to_h head = ->n { [1,2].map {|i| [n[i..-1]||"", ch[n[0,i]]] }.uniq.select {|_,c|c} } gens = ->num,s="" { num[0] ? head[num].flat_map {|n,c| gens[n, s+c] } : s } expect( gens['1123'] ).to eq ["aabc", "aaw", "alc", "kbc", "kw"] expect( gens['12321'] ).to eq ["abcba", "abcu", "awba", "awu", "lcba", "lcu"] expect( gens['12345'] ).to eq ["abcde", "awde", "lcde"] expect( gens['129981'] ).to eq ["abiiha", "liiha"] end end <file_sep>#### 6. Matrix는 반복되는 거대 연산과 2차원 배열 탐색에 유용하다. - [피보나치 수열 구하기](http://redutan.github.io/2016/03/31/anti-oop-if) 피보나치 수 연산에 [행렬 지수함수](https://ko.wikipedia.org/wiki/%ED%96%89%EB%A0%AC_%EC%A7%80%EC%88%98_%ED%95%A8%EC%88%98) 연산자 Matrix#** 를 사용해서 빠르고 간단하게 답을 구할 수 있다. ```ruby require 'matrix' require 'benchmark' # base_matrix [1, 1] fibo nth = fibo ** nth. [F nth+1, F nth] # [1, 0] [F nth, F nth-1]. F nth-1이 답. def fibo_rest(nth, divisor=4294967291) base_matrix = Matrix[[1,1],[1,0]] m_exp = ->m,n { n==1 ? 
m : (m_exp[m, n>>1]**2).map{|_|_%divisor} * m**(n&1) } m_exp[base_matrix, nth][1,1] end expect( fibo_rest(1) ).to eq 0 expect( fibo_rest(5) ).to eq 3 expect( fibo_rest(10** 10) ).to eq 2839099801 expect( fibo_rest(10**100) ).to eq 2537702751 expect( Benchmark.realtime { fibo_rest(10**100) } ).to be_between(0.0, 1.0) ``` - [미로 통과 검사](http://codingdojang.com/scode/402) BFS 알고리즘으로 2차원 배열의 상하좌우로 이동할 수 있는 길을 찾은 뒤, 이동할 수 있는 지점을 큐에 채워넣는 방식을 반복해서 종착지를 찾으면 true, 그렇지 않으면 false가 된다. 2차원 배열을 직접 다루는 것 보다 매트릭스가 직관적이다. 단, matrix에 값을 할당하는 일은 조금 불편하다. ```ruby require 'matrix' maze_checker = ->maze_str=gets("\n\n") do maze = Matrix[*maze_str.split("\n").map(&:chars)] start, dest = %w(< >).map {|ch| maze.index(ch) } q = [start] road = ->x,y { x>=0 && y>=0 && " ><".chars.include?(maze[x,y]) } next_of = ->x,y { [[1,0],[-1,0],[0,-1],[0,1]].map {|a,b| [x+a,y+b] }.select &road } nexts = ->x,y { maze.send(:[]=, x, y, "#"); next_of[x,y] } (x,y = q.shift; maze[x,y]==">"? (return true) : q += nexts[x,y]) until !q[0] false end maze1 = "< >\n" maze2 = "########\n" + "#< #\n" + "# ## #\n" + "# ## #\n" + "# >#\n" + "########\n" maze3 = "#######\n" + "#< #\n" + "##### #\n" + "# #\n" + "# #####\n" + "# # #\n" + "# # # #\n" + "# #>#\n" + "#######\n" maze4 = "< # >\n\n" maze5 = "########\n" + "#< #\n" + "# ##\n" + "# #>#\n" + "########\n" maze6 = "#< # #\n" + "# # #\n" + "# # >#\n" expect( [maze1, maze2, maze3].map &maze_checker ).to eq [true, true, true] expect( [maze4, maze5, maze6].map &maze_checker ).to eq [false, false, false] ``` <file_sep>source 'http://rubygems.org/' group :development, :test do gem 'rspec', '3.4.0' gem 'rspec-benchmark', '0.1.0' gem 'byebug', '9.0.5' gem 'guard-rspec', '4.7.1' gem 'bitset', '0.1.0' end <file_sep>#### 8. 약기, Syntatic Sugar, 관행 약기와 syntatic sugar는 응축된 표현이 가능케 하고, 관행은 반복과 boiler-plate 코드를 줄일 수 있게 해준다. - Range range클래스는 주어진 값의 #succ로 다음값을 획득하게 되어있다. String#succ는 오른쪽 끝문자의 숫자/영문 여부를 확인한 뒤 숫자면 숫자범위에서만 값 증가, 영문이면 영문내에서 값 증가, 혼합형이면 ascii로 증가시킨다. 
값을 증가시키는 중에 최대값을 넘어서는 carry가 발생하면 올림자리 값을 증가시킨다. ```ruby # 관행 ("1a".."zz").to_a #=> ["1a", "1b",.., "9z"] ("11a".."zz").to_a.size #=> 234. 9(1~9까지) * 26(a~z까지) ("11a".."zzz").to_a #=> ["11a", "11b",.., "99z"] # loop를 Stream.forEach()로 바꾸지 말아야 할 3가지 이유](http://bit.ly/28UZ60o) # : 본문의 엑셀 컬럼명 생성을 구현해보자. ex) A, B, ..., Z, AA, AB, ..., ZZ, AAA # : 명시적으로 아래와 같이 구현할 수 있다. xls_cols = ->n do (1..n).flat_map {|e| [*"A".."Z"].repeated_permutation(e).map(&:join) } end xls_cols[3] # 관행 # : 매번 개인이 range에서 요구하는 동작을 구현할 필요 없이 range에 관행을 반영한다. # : 그와 같은 이유로 위의 코드가 필요없이 range만으로 간단히 구현할 수 있다. ("A".."ZZZ").to_a # 약기로 [*"A".."ZZZ"] ``` - String#split 문자열.split("구분자") 처럼 구분자를 명기하면 구분자에 의해서 문자열 여러개로 분리된다. 일반적으로 '공백'과 '줄 바꿈'에 의해서 문자열을 분할 하는 경우가 많다. 따라서 구분자 없이 기입하는 경우 이 둘에 의해서 문자열이 분리된다. ```ruby # 관행 "abc def".split #=> ["abc", "def"] "1\n2\n3\n".split #=> ["1", "2", "3"] "1 2\n3\n".split #=> ["1", "2", "3"] # 구분자 명기 "1234567".split("4") #=> ["123", "567"] ``` - String#to_i 문자열 안에 공백과 줄바꿈 문자가 포함되어 있으면 관행에 의해서 자동으로 탈락된다. ```ruby # 관행 "123".to_i #=> 123 "123 ".to_i #=> 123 "123 \n".to_i #=> 123 "1 23 \n".to_i #=> 1 # 명시적 삭제 "123 ".chop.to_i #=> 123 "123 \n".chomp.to_i #=> 123 ``` - Proc, Lambda, & 루비의 list comprehension은 좌에서 우로 흐른다.(몇몇 언어는 우에서 좌로 흐른다.) 이와 같은 특성으로 루비의 리스트 조작식은 평문처럼 읽히는 장점이 있다. 복잡한 표현식은 함수(Proc, Lambda)로 분리한 뒤, &를 사용하여 함수를 조립, 자연스럽게 읽히는 표현을 만들 수 있다. ```ruby # 객체 메소드 약기 : Symbol#to_proc. 객체의 메소드는 &심볼로 표현할 수 있다. [1, 2, 3].map {|e| e.to_s } #=> [1, 2, 3].map(&:to_s) # proc, lambda 약기 : &함수명 으로 표현할 수 있다. plus_one = ->n { n + 1 } [1, 2, 3].map {|e| plus_one.call(e) } #=> [1, 2, 3].map(&plus_one) # 응용 : 수정 전 DICT = %w(.- -... -.-. -.. . ..-. --. .... .. .--- -.- .-.. -- -. --- .--. --.- .-. ... - ..- ...- .-- -..- -.-- --..).zip('a'..'z').to_h translate = ->morse do morse.split(' ').map {|w| w.split.map {|e| DICT[e] }.join }.join(" ") end test_morse_str = ".... . ... .-.. . . .--. ... . .- .-. .-.. 
-.--" expect( translate[test_morse_str] ).to eq "he sleeps early" # 응용 : 수정 후 # 왼쪽에서 우측으로 한글처럼 읽어도 무리가 없다. # 약기 단어의 모스부호 변환을 word란 함수로 분리, DICT해쉬를 &DICT로 표현 DICT = %w(.- -... -.-. -.. . ..-. --. .... .. .--- -.- .-.. -- -. --- .--. --.- .-. ... - ..- ...- .-- -..- -.-- --..).zip('a'..'z').to_h word = ->morse_word { morse_word.split.map(&DICT).join } translate = ->morse_str { morse_str.split(' ').map(&word).join(" ") } test_morse_str = ".... . ... .-.. . . .--. ... . .- .-. .-.. -.--" expect( translate[test_morse_str] ).to eq "he sleeps early" ``` - %w 문자열 배열은 따옴표가 반복되므로 %w로 짧게 표현할 수 있다. ```ruby alphabet = %w(a b c d e) #=> ["a", "b", "c", "d", "e"] ``` - `*` splat 연산자는 여러 개의 요소들을 표현하는데 도움이 된다. ```ruby head, *tail = [1, 2, 3, 4] #=> head = 1, tail = [2, 3, 4] *init, last = [1, 2, 3, 4] #=> init = [1, 2, 3], last = 4 param = [1, 2] do_something = ->a,b { [a, b] } do_something.call(*param) arr = %w(a b c d) Hash[*arr] #=> Hash["a", "b", "c", "d"] = {"a"=>"b", "c"=>"d"} ``` - Proc.call .call, .(), [] 모두 함수 호출을 의미한다. 함수 호출을 간결하게 표현할 수 있다. ```ruby plus = ->a,b { a + b} plus[1, 2] plus.(1, 2) plus.call(1, 2) ``` <file_sep>#### 5. Array#dig, Hash#dig 는 n차원의 배열, 해쉬를 단순하게 다루는데 도움이 된다. - [if를 피하고 싶어서](http://redutan.github.io/2016/03/31/anti-oop-if) 다양한 할인정책(쿠폰 - 금액할인, 사이트 - 할인율)과 할인 계산식을 상수, 함수, 파라미터 arity와 관계없이 해쉬와 한 줄의 코드로 응축할 수 있다. ```ruby DICT = { "discnt" => { "coup" => { "fcafe" => 1000, "pack" => [700, 800, 900] }, "rate" => { "normal" => 0.0, "naver" => 0.1, "danawa" => 0.15 }, "form" => { "normal" => "rate_form", "naver" => "rate_form", "danawa" => "rate_form", "fcafe" => "coup_form", "amazon" => "->amt { (amt ** 0.5).round }"} } } # load rule rule = ->path { eval( DICT.dig(*path.split("/")).to_s ) } # local objects to load - 사용자가 정의한 함수들도 로딩 가능 rate_form = ->code,amt,rate=rule["discnt/rate/"+code] { rate * amt } coup_form = ->code,amt,coup=rule["discnt/coup/"+code] { amt < coup ? 
amt : coup } # usage : code in application rule["discnt/form/naver"].call("naver", 1000) #=> 1_00 (Naver discount: amt*0.1) rule["discnt/form/amazon"].call(1000) #=> 32 (Amazon discount: amt**0.5.round ) # test - constant cases expect( rule["discnt/rate/normal"] ).to eq 0.0 expect( rule["discnt/coup/pack"] ).to eq [700, 800, 900] # test - function cases expect( rule["discnt/form/normal"].call("normal",1000) ).to eq 0.0 expect( rule["discnt/form/naver"].call("naver",1000) ).to eq 100.0 expect( rule["discnt/form/fcafe"].call("fcafe",1000) ).to eq 1000.0 expect( rule["discnt/form/amazon"].call(1000) ).to eq 32 ``` <file_sep># [coding dojang](http://codingdojang.com/) algorithm & idiom practices. #### About * Written to learn Ruby, explore the language's freedom, and get a feel for FP. * For that reason, solutions favor short-hands, procs, lambdas, and higher-order functions. * Tests use [RSpec](http://rspec.info/). * Solutions are also listed at [this profile](http://codingdojang.com/profile/answer/3058). #### Contents * [quiz 01 ~ 10](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_01_10_spec.rb) : 8-queens uses involved recursion, Tug of War uses dynamic programming, Spiral Array rotates a 2-D array, and Four Box merges arrays. * [quiz 11 ~ 20](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_11_20_spec.rb) : Nth-palindrome uses memoization; the maze-traversal check uses the BFS algorithm. * [quiz 21 ~ 30](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_21_30_spec.rb) : K-palindrome uses memoization; the maze-traversal problem uses the BFS algorithm. Amazon's Alphabet with numbers problem is interesting. * [quiz 31 ~ 40](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_31_40_spec.rb) : Friend or Enemy uses a disjoint-set data structure; labyrinth uses BFS. The jolly numbers and Australian Voting problems are interesting. * [quiz 41 ~ 50](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_41_50_spec.rb) : Optimum polynomial solves a system of equations with an inverse matrix; the Euclid Problem borrows the extended Euclid algorithm, Bézout's identity, and the modular inverse. The Ugly Number problem is interesting. * [quiz 51 ~ 60](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_51_60_spec.rb) : The Sliding window and One Edit Apart problems are interesting. * [quiz 61 ~ 70](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_61_70_spec.rb) : XOR decryption, Simple Balanced Parentheses with a string-based stack, fibonacci with matrix pow operations, and lazy evaluation. * [quiz 71 ~ 80](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_71_80_spec.rb) : Conway's Life Game uses associativity, distributivity, and rotation operations. The tic-tac-toe game and morse translation problems are interesting. * [quiz 81 ~ 90](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_81_90_spec.rb) : Print image uses the BFS algorithm. * [quiz 91 ~ 100](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_91_100_spec.rb) : OLHC uses structured input/output. * [quiz 101 ~ 110](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_101_110_spec.rb) : Count primes uses the sieve of Eratosthenes algorithm, and Boxes area structures the board as a matrix. The Word Ladder and Scale group problems are very interesting. * [quiz 111 ~ 120](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_111_120_spec.rb) : The Panda's survival and Safe passage problems are interesting. * [quiz 121 ~ 130](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/spec/quiz_121_130_spec.rb) : Uses conditional combination. #### TO-DO * Make the STDIN-via-[StringIO](http://ruby-doc.org/stdlib-2.3.1/libdoc/stringio/rdoc/StringIO.html) test setup explicit * Bundle [Enumerable](http://ruby-doc.org/core-2.3.1/Enumerable.html), [Lazy](http://ruby-doc.org/core-2.3.1/Enumerator/Lazy.html), [Concurrent](https://github.com/ruby-concurrency/concurrent-ruby), and code-golf tips into a cheat sheet. 
## 참고자료 * [Extended Euclidean algorithm](https://en.wikipedia.org/wiki/Extended_Euclidean_algorithm) * [Bézout's identity](https://en.wikipedia.org/wiki/B%C3%A9zout%27s_identity) * [Modular inverse](https://rosettacode.org/wiki/Modular_inverse) * [BFS: Breadth-first search](https://en.wikipedia.org/wiki/Breadth-first_search) * [Dynamic Programming](https://ko.wikipedia.org/wiki/%EB%8F%99%EC%A0%81_%EA%B3%84%ED%9A%8D%EB%B2%95) * [Memoization](https://ko.wikipedia.org/wiki/%EB%A9%94%EB%AA%A8%EC%9D%B4%EC%A0%9C%EC%9D%B4%EC%85%98) <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#111 - Panda's survival" do def max_trail(bamboos, n=[*0...bamboos.size]) trees = n.product(n).map {|x,y| [y,x,bamboos[x][y]] } nexts = ->x,y,v do trees.select {|ex,ey,ev| (x-ex).abs+(y-ey).abs <2 && ev>v } end trail = ->xy,cnt=1,t=nexts[*xy] { t[0]? t.map {|e| trail[e,cnt+1] } : cnt } trees.map(&trail).flatten.max end bamboos_1 = [ [0, 0], [0, 0] ] expect(max_trail(bamboos_1)).to eq 1 bamboos_2 = [ [ 1, 6, 7], [ 2, 5, 8], [ 3, 4, 9] ] expect(max_trail(bamboos_2)).to eq 9 bamboos_3 = [ [14, 9,12,10], [ 1,11, 5, 4], [ 7,15, 2,13], [ 6, 3,16, 8] ] expect(max_trail(bamboos_3)).to eq 4 end it "#112 - Safe Passage" do # 빠른 둘로 가장 느린 둘을 이동시키는 시간을 한 사이클로 두면 수식으로 일반화 가능(홀/짝 두가지). # a,b를 가장 빠른 둘, rest를 나머지라 할 때, # 전체이동시간 = (a x 이동횟수) + (b x 이동횟수) + (느린 멤버 둘씩 이동하는 시간의 총합) sum = ->a,b,rest,size { a*(size+1)/2 + b*(size+1 - size%2) + rest.reverse.each_slice(2).map(&:first).reduce(:+) } safe_time = ->times { a,b,*rest = times.split.map(&:to_i).drop(1).sort rest[0] ? sum[a,b,rest,rest.size] : b || a } expect( safe_time["2 15 5"] ).to eq 15 expect( safe_time["4 1 2 7 10"] ).to eq 17 expect( safe_time["5 12 1 3 8 6"] ).to eq 29 end it "#114 - Pingpong" do # 제약사항 : For Loop 또는 Array를 쓰지 말 것 # Assignment를 쓰지 말 것, 즉, 변수 할당을 하지 말 것. # String을 쓰지 말 것 # 1. 
string, list comprehension - 쉽게 풀기 sevens = ->nth { (0..nth).select {|n| n%7==0 || n.to_s.include?("7") || n==nth} } pingpong = ->x { sevens.(x).each_cons(2).map {|a,b| b-a }.map.with_index. reduce(0) {|sum,(gap,i)| sum + gap*(i.odd?? -1:1)} } # test expect( [8, 22, 68, 100].map &pingpong ).to eq [6, 0, 2, 2] expect( Benchmark.realtime { pingpong.(10000) } ).to be_between(0.001, 0.1) # 2. list comprehension - string 쓰지 않기 is_7 = ->n { n.abs<10 ? (n.abs==7) : is_7.(n/10) || is_7.(n%10) } sevens = ->nth { (0..nth).select {|n| is_7.(n) || n%7==0 || n==nth } } pingpong = ->x { sevens.(x).each_cons(2).map {|a,b| b-a }. map.with_index {|gap,i| gap * (i.odd?? -1:1) }.reduce:+ } # test expect( [7, 17, -7].all? &is_7 ).to be_truthy # 7s expect( [8, 0, -2].all? &is_7 ).to be_falsy # not 7s expect( sevens.(20) ).to eq [0, 7, 14, 17, 20] expect( [8, 22, 68, 100].map(&pingpong) ).to eq [6, 0, 2, 2] expect( Benchmark.realtime { pingpong.(10000) } ).to be_between(0.001, 0.2) # 3. recursion is_7 = ->n { n.abs<10 ? n.abs==7 : is_7[n/10] || is_7[n%10] || n%7==0 } pingpong = proc do |nth, idx, val, inc| idx.nil? ? pingpong.(nth, 1, 1, 1) : idx == nth ? val : pingpong.(nth, idx+1, val+inc, is_7.(idx+1) ? -inc : inc ) end # test expect( [8, 22, 68, 100].map(&pingpong) ).to eq [6, 0, 2, 2] end it "#115 - Reverse Spiral Array" do cube = ->x,y { (1..y).map {|row| [*1..x].product [row]} } peel = ->cub { cub[0]? cub.pop + peel[cub.reverse.transpose] : [] } rev_sp = proc { x,y = gets.split.map(&:to_i); r_sp = peel[cube[x,y]].reverse puts cube[x,y].map {|r| r.map {|e| "%3d" % r_sp.index(e)}*''} } $stdin = StringIO.new("5 6") #=> for stdin test expect{ rev_sp.() }.to output(" 16 17 18 19 20\n" + " 15 4 5 6 21\n" + " 14 3 0 7 22\n" + " 13 2 1 8 23\n" + " 12 11 10 9 24\n" + " 29 28 27 26 25\n" ).to_stdout end it "#116 - Look and say sequence" do # import Data.List # let ant = iterate (concatMap(\g-> [length g, head g]) . group)[1] # ant !! 999 !! 
999 ant = ->seq=[1] { seq.chunk{|_|_}.flat_map {|h,g|[g.size,h]} } iterate = ->n { n==1? [1].lazy : ant[iterate[n-1]] } ant_seq = ->nth,n { iterate[100].take(100).force.last } expect( ant_seq[100,100] ).to eq 1 end it "#117 - 120th Prisoner" do free = ->n,f=1,cnt=0 { n>=f ? free[n,f+1,cnt+(n%f>0?0:1)] : cnt.odd? } cnt_frees = ->n { (1..n).count(&free) } expect( [1,4,9,16,25,36,49,64,81,100].all?(&free) ).to eq true expect( [2,3,8,18,120].all?(&free) ).to eq false expect( cnt_frees[120] ).to eq 10 end it "#118 - Longest common partial string" do def longest_common_str(io=$stdin) a,b = (1..2).map {io.gets.chop}.sort_by(&:size) cmn = ->s { (s...a.size).reduce("") {|_,e| b.include?(a[s..e])? a[s..e] : break } } lcs = (0...a.size).reduce("") {|gcs,s| [gcs, cmn[s]||""].max_by(&:size) } puts lcs.size, lcs end test_stdin = StringIO.new("photography\nautograph\n") # for stdin expect{ longest_common_str(test_stdin) }.to output("7\ntograph\n").to_stdout end it "#120 - Dash Insert" do def dash_insert(str=gets.chop) head, *tail = str.chars.map(&:to_i).chunk_while {|a,b| a%2+b%2 == 1}.to_a puts (head + tail.map {|e| ["*-"[e[0]%2] ,e] }).join end $stdin = StringIO.new("4546793\n") expect{ dash_insert() }.to output("454*67-9-3\n").to_stdout end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do # perfomance test, regular expression, matrix rotation it "#1 - Sum of self number" do # self_number란 generator가 없는 수를 말한다. # 1~n까지의 self number란 n까지의 generator를 배재한 결과를 말한다. 
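The self-number definition in the comment above can be checked with a tiny standalone sketch before reading the solution; the name `self_numbers` is illustrative, not part of the spec code that follows:

```ruby
# Self numbers: n is a self number when no generator k exists
# with k + digit_sum(k) == n.  Integer#digits needs Ruby >= 2.4.
self_numbers = ->(n) {
  generated = (1..n).map { |k| k + k.digits.sum }  # every reachable value
  [*1..n] - generated                              # numbers no generator reaches
}

self_numbers[20]        # => [1, 3, 5, 7, 9, 20]
self_numbers[5000].sum  # => 1227365, the sum asserted in the spec below
```

This is the same set-difference idea the `sum_selfnum` lambda below expresses in one line.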
generator = ->n { n + n.to_s.chars.map(&:to_i).reduce(:+) } sum_selfnum = ->n { ( [*1..n] - (1..n).map(&generator) ).reduce:+ } # one-liner # ([*1..5000] - (1..5000).map {|n| n.digits.sum + n }).sum #([*1..5000]-(1..5000).map {|n| n+n.to_s.chars.map(&:to_i).reduce(:+)}).reduce:+ expect( generator.(1) ).to eq 2 expect( generator.(91) ).to eq 101 expect( (1..5).map(&generator) ).to eq [2, 4, 6, 8, 10] expect( (1..5).map(&generator) ).not_to include(1, 3, 5, 7, 9) # sum of self number expect( sum_selfnum.(5) ).to eq 9 expect( sum_selfnum.(5000) ).to eq 1227365 end it "#2 - Spiral array" do # cube를 외곽부터 90도씩 돌려가며 깎아내서 수열을 만든 뒤 # cube 인덱스로 깎아낸 수열의 값을 획득한다 cube = ->x,y { (0...y).map {|e| [*0...x].product [e] } } peel = ->cub { cub[0]? cub.shift + peel[cub.transpose.reverse] : [] } spiral = ->x,y { spr = peel[cube[x,y]] cube[x,y].map {|_|_.map {|e| "%3d" % spr.index(e)}*'' } } expect( spiral[4,3] ).to eq [" 0 1 2 3", " 9 10 11 4", " 8 7 6 5"] expect( spiral[6,6] ).to eq [" 0 1 2 3 4 5", " 19 20 21 22 23 6", " 18 31 32 33 24 7", " 17 30 35 34 25 8", " 16 29 28 27 26 9", " 15 14 13 12 11 10"] end it "#3 - LCD Display" do DGTS = %w(DSDDSDDDDD CRRRCLLRCC SSDDDDDSDD CRLRRRCRCR DSDDSDDSDD) dic = ->n { s,d,l = " -|".chars [ s*(n+2), s+d*n+s, l+s*(n+1), l+s*n+l, s*(n+1)+l ] } to_code = ->size,num do num.chars.map {|n| DGTS.map {|d| d[n.to_i] } }. transpose.map.with_index {|a,i| [a] * (i.odd?? 
size:1) } end lcd = ->s,num { s = s.to_i; dh = "SDLCR".chars.zip(dic[s]).to_h to_code[s,num].flatten(1).map {|line| line.map(&dh)*'' } } print_lcd = proc { puts $<.map(&:split).map(&lcd) } expect( to_code.(1, "0") ).to eq [[["D"]], [["C"]], [["S"]], [["C"]], [["D"]]] expect( to_code.(2, "0") ).to eq [[["D"]], [["C"], ["C"]], [["S"]], [["C"], ["C"]], [["D"]]] expect( to_code.(2, "40")).to eq [[["S", "D"]], [["C", "C"], ["C", "C"]], [["D", "S"]], [["R", "C"], ["R", "C"]], [["S", "D"]]] expect( lcd.("2", "40") ).to eq [" -- ", # S D "| || |", # C C "| || |", # C C " -- ", # D S " || |", # R C " || |", # R C " -- "] # S D end it "#4 - Slurpy. regular expression" do slump = /([DE]F+)+G/ slimp = /(?<slimp>AH|AB\g<slimp>C|A#{slump}C)/ slurpy = /#{slimp}#{slump}/ is_slurpy = proc { gets.chop =~ /^#{slurpy}$/ ? "YES" : "NO" } check = proc { (1..(gets.to_i)).map(&is_slurpy) } slurpy_out = proc { puts "SLURPY OUTPUT", check.(), "END OF OUTPUT" } slumps = %w(DFG EFG DFFFFFG DFDFDFDFG DFEFFFFFG) not_slumps = %w(DFEFF EFAHG DEFG DG EFFFFDG) slimps = %w(AH ABAHC ABABAHCC ADFGC ADFFFFGC ABAEFGCC ADFDFGC) not_slimps = %w(ABC ABAH DFGC ABABAHC SLIMP ADGC) slurpys = %w(AHDFG ADFGCDFFFFFG ABAEFGCCDFEFFFFFG) not_slurpys= %w(AHDFGA DFGAH ABABCC) #=> regex test expect( slumps.all? {|s| s=~/^#{slump}$/ } ).to eq true expect( not_slumps.all? {|s| s=~/^#{slump}$/ } ).to eq false expect( slimps.all? {|s| s=~/^#{slimp}$/} ).to eq true expect( not_slimps.all? {|s| s=~/^#{slimp}$/ } ).to eq false expect( slurpys.all? {|s| s=~/^#{slurpy}$/ } ).to eq true expect( not_slurpys.all? {|s| s=~/^#{slurpy}$/ } ).to eq false end it "#5 - Multiples of 3 and 5" do # 3 또는 5로 나뉘어 지는 것의 합. # n%3==0||n%5==0 을 스마트하게 n%3 * n%5 < 1 sum_multiples = ->max { (1...max).select {|n| n%3 * n%5 < 1 }.reduce:+ } expect( sum_multiples.(10) ).to eq 3+5+6+9 expect( sum_multiples.(1000) ).to eq 233168 end it "#6 - Eight Queens Problem" do # 8x8 체스판. 8개의 퀸이 행, 열, 대각선으로 겹치지 않게 놓일 수 있는 가짓수 # 체스판의 1칸을 sqaure라 하자. 
# row 1~8까지 row조건에 맞는 것을 걸러내서. row가 max에 도달하면 cnt 1증가 invalid = ->sx,sy,x,y { sx==x || sy==y || sx+sy==x+y || sx-sy==x-y } valids = ->squares,x,y { squares.reject {|sx,sy| invalid.(sx,sy,x,y) } } cnt_exs = lambda do |max,squares,row=1| squares.select {|sx,sy| sx == row }.reduce(0) do |sum,square| row == max ? 1 : sum + cnt_exs.(max, valids[squares,*square], row+1) end end queens = ->size,axis=[*1..size] { cnt_exs[size, axis.product(axis)] } expect( queens.(2) ).to eq 0 expect( queens.(4) ).to eq 2 expect( queens.(6) ).to eq 4 expect( queens.(7) ).to eq 40 expect( queens.(8) ).to eq 92 # expect( queens.(10) ).to eq 724 #=> 6~8초 # for performance expect( Benchmark.realtime { queens[8] } ).to be_between(0.0, 0.5) expect { queens[8] }.to perform_under(50).ms expect { queens[8] }.to perform_under(50).ms.and_sample(10) # expect { queens[4] }.to perform_at_least(100).ips end it "#7 - Count Eight" do # 1부터 10000까지 8이란 숫자는 총 몇번 등장하는가? counter = ->num,s { ([*1..num]*'').count s } # counter_2 = ->num,s { ([*1..num]*'').reduce(0) {|sum,e| sum+e.count(s) } } expect( counter.(10, '1') ).to eq 2 expect( counter.(10000, '8') ).to eq 4000 end it "#8 - SubDate" do # 그레고리력 기준 require 'rdate' from, to = RDate.new("00010101"), RDate.new("00010201") expect( to.sum_date - from.sum_date ).to eq 31 expect( RDate.subdate("00010101", "00010201") ).to eq 31 expect( RDate.subdate("20160201", "20160301") ).to eq 29 # params reverse-case expect( RDate.subdate("20160301", "20160201") ).to eq 29 # dojang test datas expect( RDate.subdate("20070515", "20070501") ).to eq 14 expect( RDate.subdate("20070501", "20070515") ).to eq 14 expect( RDate.subdate("20070301", "20070515") ).to eq 75 end it "#9 - Tug of War" do # Dynamic programming에 Set을 활용. 퍼포먼스 문제 # Set을 값으로 갖는 배열을 n개 만들고 더해진 값을 Set에 저장. 
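The Set-based table described above can be sketched on its own; `reachable` and the variable names here are illustrative and not taken from the spec below. `sums[k]` holds every weight total formed by exactly k members, and filling k from high to low guarantees each weight is used at most once per pass:

```ruby
require 'set'

# sums[k] = the set of weight totals formed by exactly k members.
reachable = ->(weights) {
  sums = Array.new(weights.size + 1) { Set.new }
  sums[0] << 0
  weights.each do |w|
    (sums.size - 2).downto(0) do |k|       # high k first: w counted once
      sums[k].each { |s| sums[k + 1] << s + w }
    end
  end
  sums
}

weights = [100, 90, 200]                   # first sample case below
total   = weights.sum
half    = reachable[weights][weights.size / 2]     # totals for one team
best    = half.min_by { |s| (total - 2 * s).abs }  # closest to an even split
[best, total - best].minmax                # => [190, 200], as in the spec
```

The backwards loop over k is the same trick `generate_sums_with` uses below.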
require 'set' def generate_sums_with(sums, weight, size=sums.size-2) (size).downto(0).each do |cnt| sums[cnt].each {|s| sums[cnt+1].add(s+weight) } end end def balanced_tug weights = (1..gets.to_i).map { gets.to_i } sums = (0..weights.size).map { Set.new }.tap {|e| e[0].add(0) } weights.each {|weight| generate_sums_with(sums, weight) } sum = weights.reduce(:+) balanced_one = sums[weights.size/2].min_by {|e| (sum-e*2).abs } [balanced_one, sum-balanced_one].minmax.join(" ") end # stdin test data test_stdin = %w( 3 100 90 200 6 45 55 70 60 50 75 4 92 56 47 82 5 2 3 4 7 8 4 50 50 100 200).join("\n") $stdin = StringIO.new(test_stdin) expect( balanced_tug ).to eq "190 200" expect( balanced_tug ).to eq "175 180" expect( balanced_tug ).to eq "138 139" expect( balanced_tug ).to eq "12 12" expect( balanced_tug ).to eq "150 250" # for perfomance # require 'benchmark' # case_100 = ([100] + 100.times.map { Random.rand(1..450) }).join("\n") # $stdin = StringIO.new(case_100) # expect( Benchmark.realtime { balanced_tug } ).to be_between(0.0, 5.0) end it "#10 - Four Boxes " do # 직사각형 4개의 면적 구하기, 사각형 좌표를 product로 풀어낸 뒤 uniq좌표 갯수 구하는 방식 tiles = ->x1,y1,x2,y2 { [*x1...x2].product([*y1...y2]) } area = ->rects { rects.reduce([]) {|sum,rect| sum + tiles[*rect]}.uniq.size } # one-liner # rects.map {|e|[*e[0]...e[2]].product([*e[1]...e[3]])}.flatten(1).uniq.size rects = [[1,2,4,4],[2,3,5,7],[3,1,6,5],[7,3,8,6]] # special case : x1=x2 or y1=y2 expect(tiles[1,4,1,5]).to eq [] expect(tiles[1,2,4,2]).to eq [] # common case expect(tiles[*rects[0]]).to eq [[1,2],[1,3],[2,2],[2,3],[3,2],[3,3]] expect(area[rects]).to eq 26 end end <file_sep>class MazeChecker attr_reader :maze, :start, :dest def set_maze(maze_str) @maze = maze_str.split("\n").map(&:chars) @maze.each_with_index do |row, idx| @start = [idx, row.index("<")] if row.index("<") @dest = [idx, row.index(">")] if row.index(">") end end def is_road?(x, y) (x >= 0 && y >=0) && [" ", ">", "<"].include?(@maze.dig(x,y)) end def next_of(x, y) 
[[1,0],[-1,0],[0,-1],[0,1]].map {|ax,ay| [x+ax, y+ay] } .select {|ex,ey| is_road?(ex, ey) } end def move_next(x, y) @maze[x][y] = "#" next_of(x, y) end def traverse queue = [@start] until queue.empty? cursor = queue.shift return true if @maze[cursor.first][cursor.last] == ">" queue += move_next(*cursor) end false end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#91 - Reverse Cards" do stdin = proc{ [].tap{|_|10.times{_<<gets.chomp.split.map(&:to_i)}} } flip = ->a,b,cards{ cards.insert(a-1,*cards.slice!(a-1..b-1).reverse) } flipped_str = ->revs=stdin.(){ revs.reduce([*1..20]){|a,e|flip[*e,a]}*' ' } expect( flip.(5, 10, [*1..20]) ).to eq [1,2,3,4,10,9,8,7,6,5,*(11..20)] expect( flipped_str.([[1,1],[2,2]]) ).to eq [*1..20]*' ' expect( flipped_str.((1..10).map {|e|[e,e]}) ).to eq [*1..20]*' ' end it "#92 - 90 degree times" do rtime = ->n { h=(6*n-3)/11.0; m=(h-h.to_i)*60; "%2d. %02d:%02d" % [n,h,m] } times = ->n=0,r=[] { r << rtime[n+=1] until rtime[n+1][4,2]> '23'; r } prt_rt = ->r=times[] { puts r, "Total: #{r.size}번!"} expect( rtime.(1 ) ).to eq " 1. 00:16" expect( rtime.(44) ).to eq "44. 23:43" #=> stdout test rt_outs = " 1. 00:16\n 2. 00:49\n 3. 01:21\n 4. 01:54\n 5. 02:27\n 6. 03:00\n"+ " 7. 03:32\n 8. 04:05\n 9. 04:38\n10. 05:10\n11. 05:43\n12. 06:16\n"+ "13. 06:49\n14. 07:21\n15. 07:54\n16. 08:27\n17. 09:00\n18. 09:32\n"+ "19. 10:05\n20. 10:38\n21. 11:10\n22. 11:43\n23. 12:16\n24. 12:49\n"+ "25. 13:21\n26. 13:54\n27. 14:27\n28. 15:00\n29. 15:32\n30. 16:05\n"+ "31. 16:38\n32. 17:10\n33. 17:43\n34. 18:16\n35. 18:49\n36. 19:21\n"+ "37. 19:54\n38. 20:27\n39. 21:00\n40. 21:32\n41. 22:05\n42. 22:38\n"+ "43. 23:10\n44. 
23:43\nTotal: 44번!\n" expect { prt_rt.() }.to output(rt_outs).to_stdout end it "#93 - Duplicate numbers " do check_dup = ->nums{ nums.map(&:chars).map {|n| n==n.uniq&&n.size==10} } nums = %w(0123456789 01234 01234567890 6789012345 012322456789) expect( check_dup.(nums) ).to eq [true, false, false, true, false] end it "#94 - Max Perfomance" do max_pf_with_options = ->part_pows, p_pows=part_pows.sort.reverse do base_pow, base_price, p_price = 150, 10, 3 pf = ->size { (base_pow + p_pows[0,size].reduce(:+)) / (base_price + p_price*size) } (1..p_pows.size).reduce([base_pow/base_price]) {|pfs,size| pfs << pf[size]}.max end expect( max_pf_with_options[ [30, 70, 15, 40, 65] ] ).to eq 17 end it "#95 - H-index & G-index" do paper_eval_by_citations = ->str, cs=str.split.map(&:to_i) do puts "hidx : #{ cs.select {|h| cs.count {|e|e >= h} == h }.max }" puts "gidx : #{ (0..cs.size).select {|g| cs.max(g).reduce(0,:+) >= g**2 }.max }" end expect { paper_eval_by_citations["0 15 4 0 7 10 0"] }. to output( "hidx : 4\ngidx : 6\n" ).to_stdout end it "#96 - Printing OXs" do ox = ->n=read.to_i { (1..n).map {|i| 'O'*(n-i)+'X'*i } } expect( ox.(6) ).to eq ["OOOOOX", "OOOOXX", "OOOXXX", "OOXXXX", "OXXXXX", "XXXXXX"] end it "#97 - Compare version" do ver = ->str { str.split('.').map(&:to_i) } compare = ->a,b { [a, "=><"[ver.(a)<=>ver.(b)], b]*' ' } expect( compare['0.0.2', '0.0.1'] ).to eq "0.0.2 > 0.0.1" expect( compare['1.0.10', '1.0.3'] ).to eq "1.0.10 > 1.0.3" expect( compare['1.2.0', '1.1.99'] ).to eq "1.2.0 > 1.1.99" expect( compare['1.1', '1.0.1'] ).to eq "1.1 > 1.0.1" expect( compare['1.1.0','1.1.0'] ).to eq "1.1.0 = 1.1.0" expect( compare['1.1.99','1.2.0'] ).to eq "1.1.99 < 1.2.0" end it "#98 - Happy Number" do _next = ->n { n.to_s.chars.reduce(0){|a,e| a+e.to_i**2} } is_happy = ->n,seqs=[] { s=_next[n]; s==1 ? "a Happy" : (seqs.include?(s)? 
"an UnHappy" : is_happy[s,seqs<<s]) } prt_happy = proc { gets.to_i.times {|nth| n=gets.to_i; puts "Case ##{nth}: #{n} is #{is_happy[n]} Number" } } expect( _next[15] ).to eq 26 expect( [4,7,13].map(&is_happy) ).to eq ["an UnHappy", "a Happy", "a Happy"] $stdin = StringIO.new("3\n4\n7\n13\n") expect { prt_happy.() }.to output("Case #0: 4 is an UnHappy Number\n"+ "Case #1: 7 is a Happy Number\n"+ "Case #2: 13 is a Happy Number\n").to_stdout end it "#100 - OLHC, Scan word" do enq = ->q,m,v { q[m]? q[m][1,3]=[*[*q[m][1,2],v].minmax,v] : q[m]=[*[v]*4]; q } olhc = ->s { s.reduce({}) {|q,e| min,_,price=e.scan(/\w+/); enq[q,min,price.to_i] } } agg_olhc = ->stocks { %w(open low high close).zip(olhc[stocks].values.transpose) } prt_olhc = ->str { agg_olhc[str].each {|tag,prices| puts "#{tag} = #{prices}" } } stocks_str = ["1:02, 1100", "1:20, 1170", "1:59, 1090", "2:30, 1030", "2:31, 1110", "2:42, 1150", "2:55, 1210", "2:56, 1190", "3:02, 1120", "3:09, 1100", "4:15, 1090", "4:20, 1080", "4:55, 1050", "4:56, 1020", "4:57, 1000"] tagged_olhc = [["open", [1100, 1030, 1120, 1090]], ["low", [1090, 1030, 1100, 1000]], ["high", [1170, 1210, 1120, 1090]], ["close", [1090, 1190, 1100, 1000]]] expect( agg_olhc[stocks_str] ).to eq tagged_olhc # prt_ohlc.() end end <file_sep># 코딩도장 수련기 - 수련의 명암 이 글은 [루비 입문자의 수련 경험](http://codingdojang.com/profile/answer/3058) 공유 목적으로 작성되었습니다. ## 1. 수련 > #### a. 시작 > - 책과 라이브 코딩 영상을 보니 간지난다. 알고싶다 루비. > - 행사코드 없이 쉽고, 빠르게, 간결하게 코드를 작성하고 싶다. > #### b. 시행착오 > - 루비를 잘 알고 싶은데 흥미가 생기지 않는다. 책을 읽었는데 돌아서면 잊는다. > - '까먹지 말자'는 차원에서 문제풀이 사이트에서 한두문제씩 풀기 시작. > - [프로젝트 오일러](http://euler.synap.co.kr/)에서 시작. > - 무엇을 모르는지, 무엇에 취약한지, 어떤 해결법이 있는지 피드백이 없어 흥미 급감. 그리고 중단. > - 3 ~ 4개월 이후, [코딩도장](http://codingdojang.co.m/)에서 다시 시작. 단순히, 다른 사람의 풀이를 볼 수 있어서 선택.  > #### c. 경과 > - 더듬더듬 쉬운 문제부터 풀기 시작. > - 조금씩 익숙해지며 짧은 코드, 빠른 코드를 원하게 됨. > - 각종 모듈과 메소드, 이디엄, [컬렉션 파이프라인](http://martinfowler.com/articles/collection-pipeline), 성능(메소드, 루비 자료구조) 인지. > - RSpec으로 테스트 작성 시작. 
> - Broadened my perspective by reading other people's solutions, in other languages too; recognized what I lacked.
> - Interest held. Confidence grew little by little, my interests widened, and I became curious about several other languages.
> #### d. Plateau and resolution
> - Once the easy problems were solved, only the unsolved ones were left and I hit a plateau.
> - Pulling out what the unsolved problems had in common showed me a few weak spots.
> - Worked through the weak-spot problems one by one, by studying good solutions until I understood them.
> - Each time a weak spot was resolved, I picked up an algorithm or solution technique I hadn't known.
> - Things improved a lot, but even so, plenty of plateaus remain.

## 2. Reflections
> #### a. The bright side of the training
> - Problem solving was a good way to learn that kept interest alive.
> - Repeated practice settled habits quickly, like it or not.
> - The common traits of the unsolved problems made what I didn't know, and where I was weak, explicit.
> #### b. The dark side of the training
> - Bad practice settled bad habits just as quickly (overuse of terse shorthand).
> - Complex problems, complex composition, and structuring (things problem-solving sites don't cover) went untrained.

## 3. Conclusion
- Problem solving is a good training method for a beginner.
- It is important to work at settling good habits during everyday practice, too.
- Keep the interest alive, reach the plateau, recognize what you don't know, and endure the discomfort of "not knowing"; it gets better.
- Training on a small problem-solving site alone is not enough; a means of supplementing it is also needed.

## Tips
Try running the tips yourself:
```ruby
# install the gems
cmd > gem install rspec
cmd > gem install rspec-mocks
cmd > gem install rspec-benchmark

# prepare irb for running the Tips
cmd > irb
irb > require 'rspec'
irb > require 'rspec/mocks/standalone'
irb > require 'benchmark'
irb > require 'rspec-benchmark'
irb > extend RSpec::Matchers
irb > include RSpec::Benchmark::Matchers
irb > # paste the tip code you want into irb and check the result
```
- [Array#transpose](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/1_transpose.md)
- [Enumerable#chunk](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/2_chunk.md)
- [Stack](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/3_stack.md)
- [Enumerator::lazy](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/4_lazy.md)
- [Array#dig](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/5_dig.md)
- [Matrix](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/6_matrix.md)
- [Some Algorithms](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/7_some_algo_ds.md)
- [Shorthands & Syntatic Sugars](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/8_code_golf.md)
- [Irb with RSpec](https://github.com/cavinkwon/ruby_coding_dojang/blob/master/tips/9_test.md)

## Etc
- Ruby never felt slow to me.
- It would have been even better to train together with others.
- Recommended practice sites: [HackerRank](https://www.hackerrank.com/), [Coding Dojang](http://codingdojang.com/).
<file_sep>require 'byebug'

describe "Coding Dojang - http://codingdojang.com" do
  it "#41 - Every Other Digit" do
    convert = lambda do |str|
      str.chars.map.with_index {|val,i| i.odd? && val.to_i > 0 ? "*" : val }*''
    end

    expect( convert.("a1b2cde3~g45hi6") ).to eq "a*b*cde*~g4*hi6"
  end

  it "#42 - Light More Light" do
    # a perfect square has an odd number of divisors (lamp ends on)
    # every non-square, primes included, has an even number of divisors (lamp ends off)
    is_on = ->num { num**0.5 % 1 > 0 ? "no" : "yes" }
    chk_lights = proc { puts gets("0").chop.split.map(&:to_i).map &is_on }

    $stdin = StringIO.new("3\n6241\n8191\n0\n")
    expect{ chk_lights.() }.to output( "no\nyes\nno\n" ).to_stdout
  end

  it "#43 - Optimum polynomial" do
    # finding the optimum polynomial:
    # build the difference sequences from the sequence op[n] produced for each n,
    # then infer op[n+1] = op.last + inc(increment).last to derive the BOP and FIT.
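The difference-sequence idea in the comment can be seen on a small standalone example first (the names here are illustrative only): for a polynomial of degree d, the d-th difference sequence is constant, so the next term follows from the last value at each level.

```ruby
# Illustrated on f(n) = n**2, a degree-2 polynomial.
diff = ->(a) { a.each_cons(2).map { |x, y| y - x } }

seq = (1..5).map { |n| n * n }  # => [1, 4, 9, 16, 25]
d1  = diff[seq]                 # => [3, 5, 7, 9]
d2  = diff[d1]                  # => [2, 2, 2], constant at degree 2

# The next first-difference is d1.last + d2.last, so:
seq.last + d1.last + d2.last    # => 36, which is 6**2
```

The `get_fit` lambda below generalizes this to every difference level at once.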
df = ->arr { arr.each_cons(2).map {|a,b| b-a } } get_fit = ->op do return op[0] if op.size == 1 dfs = (2...op.size).reduce([df[op]]) {|a,_| a.unshift df[a[0]] } inc = (1..dfs.size).reduce([0,*dfs]) {|a,k| a[k] = a[k][-1] + a[k-1]; a } op.last + inc.last end fit_sum = ->fn { op=[]; (1..10).reduce(0) {|sum,n| sum + get_fit[op << fn[n]] } } fn_u = ->n { (0..10).reduce(0) {|value,i| value += (-1)**i * (n**i) } } expect(fit_sum[fn_u]).to eq 37076114526 end it "#44 - Ones" do ones= ->f,num='1' { num.to_i%f==0 ? num.size : ones[f,num+'1'] } expect( [3,7,9901].map(&ones) ).to eq [3,6,12] end it "#45 - Reverse And Add" do rev_add = ->n { [n,n.reverse].map(&:to_i).reduce :+ } pal = ->n,cnt=0 { n==n.reverse ? [cnt,n]*' ' : pal[rev_add[n].to_s, cnt+1] } prt_pal = proc { (1..gets.to_i).map {gets.to_i}.each {|e| p pal[e.to_s] } } expect( rev_add.("195") ).to eq 786 expect( pal.("195") ).to eq "4 9339" expect( pal.("1") ).to eq "0 1" expect( %w(1 195 265 750).map(&pal) ).to eq ["0 1", "4 9339", "5 45254", "3 6666"] end it "#46 - Diamond" do # daum devon code golf # a = %w(5*5 5*5 4*1*4 3*3*3 2*5*2) # puts (a+%w(**7**)+a.reverse).map {|e|e.chars.map {|c|c>"*" ? " "*c.to_i : c}*''} diamond = " * \n" + " * \n" + " * * \n" + " * * \n" + " * * \n" + "** **\n" + " * * \n" + " * * \n" + " * * \n" + " * \n" + " * \n" prt_diamond = proc do a = %w(5*5 5*5 4*1*4 3*3*3 2*5*2) puts (a+%w(**7**)+a.reverse).map {|e|e.chars.map {|c|c>"*" ? " "*c.to_i : c}*''} end expect( prt_diamond ).to output(diamond).to_stdout end it "#47 - Check the Check" do # k에서 뻗어나가는 직선,사선에 있는 말들을 문자열로 만든뒤 in_check여부 판단 check_incheck = ->chess_board,x,y,kv do v = ->ex,ey { ex<0 || ey<0 ? 
nil : chess_board.dig(ex,ey) } s_rq = [*0..7].map {|d| [v[x-d,y], v[x+d,y], v[x,y-d], v[x,y+d]] }.transpose.map(&:join) s_bq = [*0..7].map {|d| [v[x-d,y-d], v[x-d,y+d], v[x+d,y+d], v[x+d,y-d]] }.transpose.map(&:join) s_p = s_bq[(kv=="K"?0:2),2].map {|s| s[1,2] } s_n = (t=[1,-1].product([2,-2]); (t+t.map(&:reverse)).map {|dx,dy| v[x+dx,y+dy]}.compact) rq, bq, rp, rn = [%w(kR kQ), %w(kB kQ), "P", "N"] ([rq,bq].map {|e|e.map(&:swapcase!)}; [rp,rn].map(&:swapcase!)) if kv == "K" is_incheck = rq.any? {|s| s_rq.include?(s)} | s_p.any? {|s| rp == s } | bq.any? {|s| s_bq.include?(s)} | s_n.any? {|s| rn == s } end result = ->nth,game do is_k = ->x,y { game[x][y].upcase == "K" ? [x,y,game[x][y]] : nil } kings = [*0..7].product([*0..7]).map(&is_k).compact checkmate = kings.map {|x,y,v| check_incheck[game,x,y,v] } winner = checkmate[0]? "black" : checkmate[1]? "white" : "no" "Game ##{nth} : #{winner} king is in check." end check_chess_games = proc do games, tmp = [], ["default"] until tmp.join.empty? tmp = (0..7).map { gets.chop.chars.map {|c| c == "." ? "" : c } } tmp.join.empty? ? break : games.tap {|boards| boards << tmp; puts "" } end puts games.map.with_index(1) {|game,nth| result[nth, game] } end stdin_games = "..k.....\n" + "ppp.pppp\n" + "........\n" + ".R...B..\n" + "........\n" + "........\n" + "PPPPPPPP\n" + "K.......\n" + "rnbqk.nr\n" + "ppp..ppp\n" + "....p...\n" + "...p....\n" + ".bPP....\n" + ".....N..\n" + "PP..PPPP\n" + "RNBQKB.R\n" + "........\n" + "........\n" + "........\n" + "........\n" + "........\n" + "........\n" + "........\n" + "........\n" game_result = "\n\n" + "Game #1 : black king is in check.\n" + "Game #2 : white king is in check.\n" $stdin = StringIO.new(stdin_games) expect{ check_chess_games.() }.to output( game_result ).to_stdout end it "#48 - An Easy Problem " do ones = ->n { n.to_s(2).chars.count('1') } find_j = ->n,ones_i=ones[n] { n+=1; ones[n]==ones_i ? 
n:find_j[n,ones_i] } stdin = proc { (readline('0').split(/\n/) - ["","0"]).map(&:to_i) } prt = proc { stdin[].map(&find_j) } expect( [1,2,3,4,78].map(&find_j) ).to eq [2,4,5,8,83] $stdin = StringIO.new("1\n2\n3\n4\n78\n0") expect { puts prt.() }.to output("2\n4\n5\n8\n83\n").to_stdout end it "#49 - Ugly Number" do primes, steps = [2,3,5], [[1],[1],[1]] make_steps = lambda do |ugly| steps.each_with_index {|val,i| val.shift if val[0]==ugly; val<<ugly*primes[i]} end nth_ugly = ->n=1500 { (0..n).reduce{steps.map(&:first).min.tap(&make_steps)} } # expect( nth_ugly.() ).to eq 859963392 # expect( nth_ugly.(1550) ).to eq 1093500000 expect( nth_ugly.(100_000) ).to eq 290142196707511001929482240000000000000 end it "#50 - Euclid Problem" do # 알고리즘 : extended euclid, bezout, modular invert # ax + by = gcd(최대공약수), s,t는 i-1번째 a,b추적을 위한 변수 # gcd,q,r = r,*gcd.divmod(r) r이 0일때 gcd는 최대공약수. gcd,r초기값은 x,y. # 따라서 별도 변수 필요없이 x를 gcd, y를 나머지값으로 활용 # a[i],s[i] = s[i-1], a[i-1]-q[i]*s[i-1] 초기값 a,s = 0,1 # b[i],t[i] = t[i-1], b[i-1]-q[i]*t[i-1] 초기값 b,t = 1,0 bz = lambda do |x,y| a,s,b,t = 1,0,0,1 (x,q,y=y,*x.divmod(y); a,s,b,t=s,a-q*s,t,b-q*t) until y.zero?; [a,b,x] end expect( bz.(4, 6) ).to eq [-1, 1, 2] expect( bz.(17, 17) ).to eq [0, 1, 17] expect( bz.(1071,1029) ).to eq [-24, 25, 21] expect( bz.(78696,19332) ).to eq [212, -863, 36] # gcd = ->x,y { x,y = y, x%y until y.zero?; x.abs } # gcd = ->x,y { y.zero? ? x.abs : gcd[y,x%y] } =begin def bezout(x, y) gcd,r=x,y;s=b=0;t=a=1; print "[before] : #{gcd}, 0, #{r} / #{a},#{s} / #{b},#{t} \n" until r.zero? 
do gcd, q, r = r, *gcd.divmod(r) # gcd와 r은 최대공약수를 위해서, q는 a,b계산을 위해 a, s = s, a-q*s # 전단계 q(나눔수) b, t = t, b-q*t print "[after] : #{gcd}, #{q}, #{r} / #{a},#{s} / #{b},#{t} \n" end [a, b, gcd] end =end end end <file_sep>require 'maze_checker' describe "maze_checker" do let (:checker) { MazeChecker.new } subject (:maze1) { maze1 = "< >\n" } subject (:maze2) { maze2 = "########\n" + "#< #\n" + "# ## #\n" + "# ## #\n" + "# >#\n" + "########\n" } subject (:maze3) { maze3 = "#######\n" + "#< #\n" + "##### #\n" + "# #\n" + "# #####\n" + "# # #\n" + "# # # #\n" + "# #>#\n" + "#######\n" } subject (:maze4) { maze4 = "< # >\n" } subject (:maze5) { maze5 = "########\n" + "#< #\n" + "# ##\n" + "# #>#\n" + "########\n" } subject (:maze6) { maze6 = "#< # #\n" + "# # #\n" + "# # >#\n" } it ".set_maze - recieve maze" do checker.set_maze(maze1) expect( checker.maze.size ).to eq 1 checker.set_maze(maze2) expect( checker.maze.size ).to eq 6 end it ".is_road? - sense start, dest, wall, road" do checker.set_maze(maze1) expect( [checker.start, checker.dest] ).to eq [[0,0],[0,6]] expect( checker.is_road?(1, 0) ).to eq false expect( checker.is_road?(0,-1) ).to eq false expect( checker.is_road?(*checker.start) ).to eq true expect( checker.is_road?(*checker.dest) ).to eq true checker.set_maze(maze2) expect( [checker.start, checker.dest] ).to eq [[1,1],[4,6]] expect( checker.is_road?(0, 0) ).to eq false expect( checker.is_road?(5, 7) ).to eq false expect( checker.is_road?(*checker.dest) ).to eq true end it ".move_next - sence nexts, move nexts" do checker.set_maze(maze1) expect( checker.next_of(*checker.start) ).to eq [[0,1]] expect( checker.next_of(*checker.dest) ).to eq [[0,5]] expect( checker.next_of(0, 1) ).to eq [[0,0],[0,2]] expect( checker.move_next(*checker.start) ).to eq [[0,1]] expect( checker.next_of(0, 1) ).to eq [[0,2]] end context "traversable case" do it ".traverse - traverse all, is_traversable_maze?" 
do checker.set_maze(maze1) expect( checker.traverse ).to eq true checker.set_maze(maze2) expect( checker.traverse ).to eq true checker.set_maze(maze3) expect( checker.traverse ).to eq true end end context "not traversable case" do it ".traverse - traverse all, is_traversable_maze?" do checker.set_maze(maze4) expect( checker.traverse ).to eq false checker.set_maze(maze5) expect( checker.traverse ).to eq false checker.set_maze(maze6) expect( checker.traverse ).to eq false end end end # bfs algorithm # 깊이 0인 1을 넣는다. # q에 들어있는 1(깊이 0 )로 인접한 깊이 1(2 3)을 구해서 넣는다. 1을 빼낸다" # q에 들어있는 2(깊이 1 )로 인접한 깊이 2(4,5)를 구해서 넣는다. 2를 빼낸다. # q에 들어있는 3(깊이 1 )로 인접한 깊이 2(4,6)을 구해서 넣는다. 3을 빼낸다. # 4는 이미 들어있으므로 넣지 않는다. # q에 들어있는 4(깊이 2 )로 인접한 깊이 3(없음)을 구해서 넣는다. 4를 빼낸다. # q에 들어있는 5(깊이 2 )로 인접한 깊이 3(없음)을 구해서 넣는다. 5를 빼낸다. # q에 들어있는 6(깊이 2 )로 인접한 깊이 3(없음)을 구해서 넣는다. 6을 빼낸다. # q가 비워지거나(false) 목표지에 도달(true)하면 종료한다. <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#61 - Bubble Sort" do swap = ->a,i,cnt=0 do ( a[i],a[i+1]=a[i+1],a[i]; cnt+=1 ) if a[i]>a[i+1]; [cnt, a] end stage = ->cnt,a { (0..a.size-2).each {|i| cnt,a=swap[a,i,cnt]}; [cnt,a] } b_sort = ->arr do loops,swaps,cnt = 0,0,-1; (cnt,arr=stage[0,arr]; loops+=1; swaps+=cnt) until cnt==0; [loops,swaps] end expect( swap.( [2,1,3],0) ).to eq [1,[1,2,3]] # case : swap expect( swap.( [1,2,3],1) ).to eq [0,[1,2,3]] # case : sorted expect( stage.( 0,[2,1,5,4] ) ).to eq [2,[1,2,4,5]] # case : swap expect( stage.( 0,[1,2,4,5] ) ).to eq [0,[1,2,4,5]] # case : sorted expect( b_sort.( [3,1,4,1,5,9,2,6] ) ).to eq [5,8] end it "#62 - Two printers" do time = ->x,y,n { t=(n*x*y*1.0/(x+y)).ceil; t%x+t%y > 0? 
t+1 : t } rcv = proc { puts "input data:"; (1..gets.to_i).map { gets.split.map(&:to_i) } } prt_times = ->input=rcv[] { puts "\nanswer:", input.map(&time)*' ' } $stdin = StringIO.new("2\n1 1 5\n3 5 4\n") expected = "input data:\n" + "\n" + "answer:\n" + "3 9\n" expect { prt_times.call }.to output( expected ).to_stdout end it "#63 - Largest prime factor" do largest_prime = ->num,f=2 { num%f==0 ? num=num/f : f=f+1 until num==f; f } expect( largest_prime.(600851475143) ).to eq 6857 end it "#64 - Right triangle maximum circumference" do # a+b < c 일 때. a * (2a-p) < 0. 따라서 0 < a < p/2, 0 < b <= a, c = sqrt(a^2 + b^2) find_circum = ->p_len,circums=Hash.new(0) do cnt = ->a,b,c { circums[a+b+c] += 1 if (a+b+c <= p_len) && (c%1 == 0) } (1...p_len/2).each {|a| (1..a).each {|b| cnt[a,b,(a**2+b**2)**0.5] } } circums.max_by(&:last) end expect( find_circum[120] ).to eq [120, 3] expect( find_circum[1000] ).to eq [840, 8] end it "#65 - XOR descryption" do cnter, key = Hash.new(0), [] enc_s = IO.read("spec/data/cipher.txt").split(/,/).map &:to_i enc_s.each_slice(3) {|_| _.each_with_index {|c,i| cnter[[i,c]] += 1 key[i] = c if cnter[[i,c]] > cnter[[i,key[i]]] } } key = key.map {|k| k^" ".ord } ascii_sum = enc_s.map.with_index {|v,i| v^key[i%3] }.reduce :+ expect( ascii_sum ).to eq 107359 end it "#66 - Icecream factory" do min_box = ->tpts { tpts.sort.reduce([]) {|a,t| (a.flatten&t).empty?? a<<t : a.map {|box| (box&t).empty?? 
box : box&t }} } case1 = [[*-20..-15],[*-14..-5],[*-18..-13],[*-5..-3]] case2 = [[*-10..0], [*-10..0],[*-1..0],[*-15..-2],[*-15..-2],[*-15..-14]] case3 = [[*-20..-18],[*-20..-5],[*-15..-2],[*-19..-4],[*-4..-1],[*-10..-3]] case4 = [[*-20..-15],[*-14..-5],[*-18..-13],[*-5..-3],[-2,-1],[*-7..-2]] case5 = [[*-10..-5],[*-4..0],[*-11..-11]] expect( min_box[case1] ).to eq [[-18, -17, -16, -15], [-5]] expect( min_box[case2] ).to eq [[-15, -14], [-1, 0]] expect( min_box[case3] ).to eq [[-19, -18], [-4, -3]] expect( min_box[case4] ).to eq [[-18, -17, -16, -15],[-5], [-2, -1]] expect( min_box[case5] ).to eq [[-11], [-10, -9, -8, -7, -6, -5], [-4, -3, -2, -1, 0]] end it "#67 - Simple Balanced Parentheses" do is_balanced = ->str { str.scan(/[()]/).reduce("v") {|a,e| e=='('? a+e : a.chop} == "v" } # test data test_str = "(5+6)∗(7+8)/(4+3)" balanced = %w[ (()()()()) (((()))) (()((())())) ] not_balanced = %w[ ((((((()) ())) (()()(() (()))( ())(() ] # test case expect( is_balanced[test_str] ).to eq true expect( balanced.map(&is_balanced) ).to eq [true]*3 expect( not_balanced.map(&is_balanced) ).to eq [false]*5 end it "#68 - Decimal to to_s(n)" do rest = "0123456789ABCDEF" to_n = ->num,f { num<f ? rest[num] : to_n[num/f,f]+rest[num%f] } expect( to_n[233, 2] ).to eq "11101001" expect( to_n[233, 8] ).to eq "351" expect( to_n[233,16] ).to eq "E9" end it "#69 - Fibonacci. smaller than n " do fib = ->x{ x<2 ? 
x:fib[x-2]+fib[x-1] } fibs = ->n { (0..Float::INFINITY).lazy.map(&fib).take_while{|e|e<n}.force } expect( fibs[100000].size ).to eq 26 expect( fibs[100000].last ).to eq 75025 end it "#70 - fibonacci2" do require 'matrix' fibs = ->n do n-=1; lim=4294967291; v=m=Matrix[[1,1],[1,0]] (m,v=(m**2).map{|_|_%lim},n&1==1?(v*m).map{|_|_%lim}:v; n>>=1) until n<1 v[1,1] end expect( fibs[5] ).to eq 3 expect( fibs[1000000000000000] ).to eq 3010145777 expect( Benchmark.realtime { fibs[10**4] } ).to be_between(0.0, 1.0) end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#81 - Mow grass" do row = ->y { gets.split.map.with_index {|h,x| [y,x,h] } } lawns = proc { (1..gets.to_i).map { (1..gets.to_i).flat_map(&row) } } clean = ->m { m.any? {|ex,ey,ev| m.any? {|x,_,v|x==ex && v>ev} && m.any? {|_,y,v|y==ey && v>ev} } ? "NO" : "YES" } mown = ->lawns { puts lawns.map.with_index {|m,n| "Case ##{n+1}: #{clean[m]}" } } case_yes = [[0, 0, "1"], [0, 1, "2"], [0, 2, "1"]] case_no = [[0, 0, "1"], [0, 1, "2"], [1, 0, "2"], [1, 1, "1"]] expect( clean[case_yes] ).to eq "YES" expect( clean[case_no ] ).to eq "NO" stdin_lawns = "3\n" + "3 3\n" + "2 1 2\n" + "1 1 1\n" + "2 1 2\n" + "5 5\n" + "2 2 2 2 2\n" + "2 1 1 1 2\n" + "2 1 2 1 2\n" + "2 1 1 1 2\n" + "2 2 2 2 2\n" + "1 3\n" + "1 2 1\n" expected = "Case #1: YES\n" + "Case #2: NO\n" + "Case #3: YES\n" $stdin = StringIO.new(stdin_lawns) expect{ mown[lawns[]] }.to output( expected ).to_stdout end it "#82 - Number Baseball" do cnt = ->sb,a,b { sb[0]+=1 if a==b; sb[1]+=1 if a[0]==b[0] && a[1]!=b[1]; sb } valid = ->n1,n2,st,ba do ai, bi = [n1, n2.chars].map {|e| e.map.with_index.to_a } ai.product(bi).reduce([0,0]) {|sb,(a,b)| cnt[sb,a,b]} == [st.to_i, ba.to_i] end guesses = proc { _,*tail = gets.split; tail.each_slice(3).to_a } matched = proc {|all,guess| all.select {|e| valid[e, *guess] } } answers = proc { p guesses[].reduce([*"1".."9"].permutation(3), &matched).size } $stdin = StringIO.new("4 123 1 1 356 1 0 327 2 0 489 0 1\n") 
expect{ answers.() }.to output("2\n").to_stdout end it "#83 - Investment in stocks" do income = proc { prn, _, *per = $stdin.read.split.map(&:to_i) (per.reduce(prn) {|sum,p| sum*(p*0.01+1) } - prn).round } stat = ->incom { s=(-1..1).zip(%w(bad same good)).to_h; [incom, s[incom<=>0]] } #=> stdin/out test $stdin = StringIO.new("10000\n4\n10 -10 5 -5\n") expect { puts stat[income.call] }.to output("-125\nbad\n").to_stdout end it "#84 - JSON extraction" do # json 배열 획득 > set에 입력 > 버퍼별로 set에 새로 추가된 요소를 map. require 'set' rcved = ->buffers,sets=Set.new do buffers.map {|s| s.gsub(/{(.|^})+?}/).to_a.delete_if {|e| !sets.add?(e) } } end buffers = [ '[ { "seq" : 0, "content" : "abcd"', '[ { "seq" : 0, "content" : "abcd" }, { "seq" : 1, "content" : "d"', '[ { "seq" : 0, "content" : "abcd" }, { "seq" : 1, "content" : "dsfswde" }, { "seq" : 2, "content"' ] expected = [ [], ['{ "seq" : 0, "content" : "abcd" }'], ['{ "seq" : 1, "content" : "dsfswde" }'] ] expect( rcved[buffers] ).to eq expected end it "#86 - Ascii art N" do art = ->n=gets.to_i { (1..n).map {|i| a="N".rjust(i).ljust(n); a[0]=a[-1]="N"; a} } expect( art.(1) ).to eq ["N"] expect( art.(3) ).to eq ["N N", "NNN", "N N"] expect( art.(5) ).to eq ["N N", "NN N", "N N N", "N NN", "N N"] end it "#89 - Paint image" do # BFS algorithms & Matrix paint_image = proc do xy, start = (1..2).map { gets.split.map(&:to_i) } image = Matrix[*(1..xy[0]).map { gets.chars[0, xy[1]] }] q, find_color, paint = [start[0,2]], image[*start[0,2]].to_s, start[2].to_s cell = ->x,y { x>=0 && y>=0 && image[x,y] == find_color } next_of = ->x,y { [[1,0],[-1,0],[0,-1],[0,1]].map {|r,c| [x+r,y+c] }.select &cell } nexts = ->x,y { image.send(:[]=, x, y, paint); next_of[x,y] } q += nexts[*q.shift] until !q[0] puts image.to_a.map(&:join) end $stdin = StringIO.new("10 10\n" + "5 5 3\n" + "0000000000\n" + "0000001000\n" + "0000110100\n" + "0011000010\n" + "0100000010\n" + "0100000010\n" + "0100000100\n" + "0010001000\n" + "0001011000\n" + "0000100000\n") 
expect { paint_image[] }.to output("0000000000\n" + "0000001000\n" + "0000113100\n" + "0011333310\n" + "0133333310\n" + "0133333310\n" + "0133333100\n" + "0013331000\n" + "0001311000\n" + "0000100000\n").to_stdout end it "#90 - CamelCase to PotholeCase" do to_pathole = ->camel { camel.gsub(/([A-Z0-9])/,'_\1').downcase } # pc = ->s {s.chars.reduce([]) {|a,e| a<<(e==e.upcase ? "_"+e.downcase : e)}*''} expect( to_pathole.("codingDojang") ).to eq "coding_dojang" expect( to_pathole.("numGoat30") ).to eq "num_goat_3_0" end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#121 - Trucks : ACM 2016" do def min_times (_,w,l),trucks = (1..2).map {gets.split.map &:to_i} add = ->b,c,t { b.shift; b.reduce(:+) <= l-t ? [b<<t, c+1] : add[b<<0, c+1, t] } puts trucks.reduce([[0]*w,0]) {|a,truck| add[*a,truck] }[1] + w end # stdin datas case1 = "4 2 10\n7 4 5 6\n" case2 = "1 100 100\n10\n" case3 = "10 100 100\n10 10 10 10 10 10 10 10 10 10\n" $stdin = StringIO.new(case1 + case2 + case3) expect{ min_times }.to output("8\n").to_stdout # case 1 expect{ min_times }.to output("101\n").to_stdout # case 2 expect{ min_times }.to output("110\n").to_stdout # case 3 end it "#122 - Lottery Game" do def lottery_game (n, sum), nums = 2.times.map { gets.split.map(&:to_i) } (1..n).flat_map {|i| nums.combination(i).select {|e| e.reduce(:+) == sum } } .map {|e| puts e.sort.join(' ') } end $stdin = StringIO.new("10 50\n25 27 3 12 6 15 9 30 21 19\n") expect { lottery_game }.to output("6 19 25\n").to_stdout # conditional combination def lottery_game_fast (n, sum), nums = 2.times.map { gets.split.map(&:to_i) } comb = ->head,rest,out do return out if (head + rest).empty? if rest.empty? head.sum == sum ? out << head : out else comb[head +[rest[0]], rest[1..-1], out] if head.sum(0) <= sum - rest[0] (head + rest[1..-1]).sum(0) >= sum ? comb[head, rest[1..-1], out] : out end end puts comb[[],nums,[]].map {|e| e.sort.join(' ') } end end it "#124 - Lisp Computer(s-exp +-*/)" do # 1. 
lambda eval way calc = proc {|op,*args| args.reduce(op) || 0 } lisp_eval = ->exp { eval exp.gsub(/[(\s)]/, '('=>"calc[:", ' '=>',', ')'=>']') } # 2. stack way calc = proc {|op,*args| args.reduce(op) || 0 } eval_top = ->s { t=[]; t.unshift(s.pop) until t[0].is_a? Symbol; s << calc[*t] } lisp_eval = ->str do tokens = str.gsub(/[()]/, "("=>"",")"=>" )").split. map {|e| e == ")" ? e : e =~ /[[:digit:]]/ ? e.to_i : e.to_sym } tokens.reduce([]) {|stack,e| e == ")" ? eval_top[stack] : stack << e }.pop end # 3. stack way shorthands calc = proc {|op,*args| args.reduce(op) || 0 } eval = ->s { t=[]; t.unshift(s.pop) until t[0].is_a? Symbol; s << calc[*t] } push = ->s,e { e == ")" ? eval[s] : s << (e =~ /[[:digit:]]/ ? e.to_i : e.to_sym) } lisp_eval = ->s { s.gsub(/[()]/, "("=>"",")"=>" )").split.reduce([], &push).pop } cases = ["(+)"] + ["(- 10 3)", "(* 2 3)"] + ["(- 10 3 5)", "(* 2 3 4)"] + ["(* (+ 2 3) (- 5 3))", "(/ (+ 9 1) (+ 2 3))"] + ["(* 1 (- 2 3) 4 (+ 2 -1) 3)" ] expect( cases.map(&lisp_eval) ).to eq [0, 7, 6, 2, 24, 10, 2, -12] end it "#125 - Binary Tree - Layout 1" do add = ->par,node do par ? node.y = par.y + 1 : (return node) chd = node.val < par.val ? par.l : par.r chd ? add[chd, node] : (node.val < par.val ? par.l = node : par.r = node) par end layout = ->btree_str=gets.chop,cnt=0 do Node = Struct.new(:val, :x, :y, :l, :r) nodes = btree_str.chars.map {|c| Node.new(c, 0, 1)} btree = nodes.reduce(&add) trav = ->node do trav[node.l] if node.l node.x = cnt+=1 puts node.values[0..2].join(" ") trav[node.r] if node.r end trav[btree] end in_ordered = "a 1 4\n" + "c 2 3\n" + "e 3 6\n" + "g 4 5\n" + "h 5 4\n" + "k 6 2\n" + "m 7 3\n" + "n 8 1\n" + "p 9 3\n" + "q 10 5\n" + "s 11 4\n" + "u 12 2\n" expect{ layout["nkcmahgeupsq"] }.to output(in_ordered).to_stdout end it "#126 - Binary Tree - Layout 2" do add = ->par,node do par ? node.y = par.y + 1 : (return node) chd = node.val < par.val ? par.l : par.r chd ? add[chd, node] : (node.val < par.val ? 
par.l = node : par.r = node) par end calc_x = ->node,par,max_y do left_x = par.x + 2**(max_y-node.y) * (node == par.l ? -1 : 1) if par node.x = (node.l ? node.l.x + 2**(max_y-node.y-1) : par&.x > 0 ? left_x : 1) node.r.x = node.x + 2**(max_y-node.y-1) if node.r end layout = ->tree_str=gets.chop do Node = Struct.new(:val, :x, :y, :l, :r) nodes = tree_str.chars.map {|c| Node.new(c, 0, 1) } btree, max_y = nodes.reduce(&add), nodes.max_by(&:y).y trav = ->node,par=nil do trav[node.l, node] if node.l calc_x[node, par, max_y] puts node.values[0..2].join(" ") trav[node.r, node] if node.r end trav[btree] end in_ordered = "a 1 4\n" + "c 3 3\n" + "d 4 5\n" + "e 5 4\n" + "g 6 5\n" + "k 7 2\n" + "m 11 3\n" + "n 15 1\n" + "p 19 3\n" + "q 21 4\n" + "u 23 2\n" expect{ layout["nkcmaedgupq"] }.to output(in_ordered).to_stdout end end <file_sep>describe "Coding Dojang - http://codingdojang.com" do it "#71 - Find numbers" do # Build the cartesian product recursively with reduce. # e.g. product_n = product(product_(n-1), n) #=> for instance, the combinations up to 3 are cases[3] = cases[cases[2],3] cases = ->a,b,op=['+','-',''] { [*a].product(op,[b]).map(&:join) } case_cnt = ->n { (2..9).reduce(1, &cases).select {|e| eval(e)==n } } # check two-number combinations, recursion via reduce, and the count expect(cases[1,2]).to eq %w(1+2 1-2 12) cases_123_result = %w(1+2+3 1+2-3 1+23 1-2+3 1-2-3 1-23 12+3 12-3 123) expect(cases[cases[1,2],3]).to eq cases_123_result expect(case_cnt[100].size).to eq 11 # one-liner # (2..9).reduce(1){|a,e|[*a].product(['+','-',''],[e]).map &:join}.count{|e|eval(e)==100} end it "#72 - tic-tac-toe game" do made = ->u { %w(123 456 789 147 258 369 159 357).any? {|e| (e.chars&u)[2] } } prt = ->w,m { puts "[Winner: #{w}]", m.each_slice(3).map {|row| row*" " } } play = ->p1=[],p2=[],map=[*1..9],n=gets.chop do map[n.to_i-1] = (p1|p2).size.odd? ? (p2<<n; "X") : (p1<<n; "O") winner = made[p1]? "P1" : made[p2]? "P2" : ("None" if (p1+p2)[8]) winner ?
prt[winner,map] : play[p1,p2,map] end expect_p1_win = "[Winner: P1]\n" + "O 2 3\n" + "X O 6\n" + "7 X O\n" expect_p2_win = "[Winner: P2]\n" + "O O X\n" + "4 X X\n" + "X O O\n" expect_draw = "[Winner: None]\n" + "X O O\n" + "O O X\n" + "X X O\n" # for stdin test_data = ->p1,p2 { StringIO.new(p1.zip(p2).flatten*"\n") } # case : P1 Win $stdin = test_data[[9,5,1], [4,8]] expect{ play[] }.to output( expect_p1_win ).to_stdout # case : P2 Win $stdin = test_data[[1,2,9,8,4], [3,6,7,5]] expect{ play[] }.to output( expect_p2_win ).to_stdout # case : draw $stdin = test_data[[4,5,9,2,3], [8,7,6,1]] expect{ play[] }.to output( expect_draw ).to_stdout end it "#73 - Compress String" do compress = ->str { str.gsub(/((\w)\2*)/) { $2+$1.size.to_s } } expect( compress.("aaabbcccccca") ).to eq "a3b2c6a1" end it "#74 - Count zeros" do cnt_zeros_factorial = ->n { n<5? 0 : n/5 + cnt_zeros_factorial[n/5] } expect( cnt_zeros_factorial.(12) ).to eq 2 expect( cnt_zeros_factorial.(25) ).to eq 6 end it "#75 - <NAME>" do discnt = ->pack { [0, 1, 0.95, 0.9, 0.8, 0.75][pack.size] * 8 * pack.size } price = ->cart { cart.empty? ? 0 : cart.map(&discnt).reduce(:+) } min_cart = ->cart,book do make_cases = ->i,t=cart.dup { t[i]+=[book]; t if t.all? {|_|_.size==_.uniq.size} } ((0...cart.size).map(&make_cases).compact << cart+[[book]]).min_by &price end min_price = ->books,cart=[],book=books.shift do book ? 
min_price[books, min_cart[cart,book]] : price[cart] end no_discnt_cases = [[], [0], [0,0]] simple_cases = [[0,1], [0,2,4]] several_cases = [[0,0,1], [0,0,1,1], [0,0,1,2,2,3]] edge_cases = [ [0,0,1,1,2,2,3,4], [0, 0, 0, 0, 0, 1, 1, 1, 1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 3, 4, 4, 4, 4] ] expect( no_discnt_cases.map(&min_price) ).to eq [8*0, 8*1, 8*2] expect( simple_cases.map(&min_price) ).to eq [(8 * 2 * 0.95), (8 * 3 * 0.9)] expect( several_cases.map(&min_price) ).to eq [(8 * 2 * 0.95) + (8 * 1 * 1.0), (8 * 2 * 0.95) * 2, (8 * 2 * 0.95) + (8 * 4 * 0.8)] expect( edge_cases.map(&min_price) ).to eq [(8 * 4 * 0.8) * 2, (8 * 5 * 0.75) * 3 + (8 * 4 * 0.8 ) * 2 ] end it "#76 - Morse translation" do DIC = %w(.- -... -.-. -.. . ..-. --. .... .. .--- -.- .-.. -- -. --- .--. --.- .-. ... - ..- ...- .-- -..- -.-- --..).zip('a'..'z').to_h translate = ->morse { morse.split(' ').map {|w| w.split.map(&DIC)*'' }*' ' } test_morse_str = ".... . ... .-.. . . .--. ... . .- .-. .-.. -.--" expect( translate.(test_morse_str) ).to eq "he sleeps early" end it "#77 - Life Game" do prev_state = ->s,current=s.chars.map(&:to_i) do seeds = [[0,0],[0,1],[1,0],[1,1]] gen = ->seed { current.reduce(seed) {|a,c| a << (a[-2] ^ a[-1 ] ^ c) }*'' } prevs = seeds.map(&gen).select {|e| e[0,2] == e[-2,2] }.map {|e| e[1..-2] } prevs[1] ? 
"Multiple" : prevs[0] || "No" end expect( prev_state["00011101"] ).to eq "01100101" expect( prev_state["000"] ).to eq "Multiple" expect( prev_state["000001"] ).to eq "No" expect( prev_state["11110"] ).to eq "10010" end it "#78 - Print Number" do expect("gwnae".to_i("!".ord)).to eq 20150111 bit_str = ->n { [20150111.to_s(n), n, n.chr] } (11..36).map(&bit_str).reject {|e| e[0]=~/\d/ } #"ߟo".codepoints.map {|e| '%04d' % e }.join.to_i # [[31, "lpbqi"], [33, "gwnae"], [36, "bzvxb"]] # ["2015".hex].pack("U") ["0111".hex].pack("U") # "―" "đ" # "\u{2015}" [2015].pack('U*') "ߟ" # p "国".unpack("U*").first # "07DF".hex = 2015 = unicode "ߟ".codepoints 2015.to_s(16) # "6F".hex = 111 = unicode "o".codepoints end it "#79 - JSON Validator" do # Anything that evals to a Hash (with Symbol keys) or an Array counts as true; everything else is a parse error. def is_valid_json(str) eval_exp = eval(str) (eval_exp.class == Hash && eval_exp.keys.all? {|e| e.class == Symbol} ) || (eval_exp.class == Array) ? "true" : Exception rescue Exception; "Parse error!" end # cross-check with the json library require 'json' def chk_with_jsonlib(str) JSON.parse(str) ? "true" : Exception rescue Exception; "Parse error!" end # test data valids = ['{"level":3, "attr":{"power":120, "name":"hero"}, "friendIds":[12, 23, 34, 23]}', '[12, 23, 34, "23"]'] not_valids = ['{"level":3, "attr":{"power":120, "name":"hero"}, "friendIds":[12, 23, 34, 23}', '["level":3, 23, 34, "23"]'] hash_but_json_err = '{"key" => value}' # a valid Hash literal, but not JSON. all_cases = valids + not_valids + [hash_but_json_err] # test expect( valids.map {|s| is_valid_json(s) } ).to eq ["true"]*2 expect( not_valids.map {|s| is_valid_json(s) } ).to eq ["Parse error!"]*2 expect( is_valid_json(hash_but_json_err) ).to eq "Parse error!" # cross-check against the json library's parse results expect( all_cases.map {|s| is_valid_json(s) } ). to eq all_cases.map {|s| chk_with_jsonlib(s) } end it "#80 - Staying seconds" do # Crystal p (0..23).map {|h| h.to_s=~/3/?
60:15 }.sum*60 stay = ->n { [*0..23].product([*0..59]).count {|t| t*''=~/3/}*60 } expect( stay.('3') ).to eq 29700 end end <file_sep>#### 2. Enumerable#chunk or #chunk_while can group array elements sequentially.

- [Intervals problem](http://codingdojang.com/scode/440) Consecutively increasing ranges can be merged into one group.

```ruby
pairs = -> { (1..gets.to_i).map { gets.split.map &:to_i }.sort }
intervals = -> { pairs[].chunk_while{|(_,a),(b,_)|a>=b}.map {|e| e.flatten.minmax*' '} }

$stdin = StringIO.new("5\n1 4\n5 6\n6 9\n8 10\n10 10\n") # test data
expect(intervals.call).to eq ["1 4", "5 10"]
```

- [Dash Insert problem](http://codingdojang.com/scode/529) Runs of alternating odd and even digits can be grouped into one.

```ruby
def dash_insert(str=gets.chop)
  head, *tail = str.chars.map(&:to_i).chunk_while {|a,b| a%2+b%2 == 1}.to_a
  puts (head + tail.map {|e| ["*-"[e[0]%2] ,e] }).join
end

$stdin = StringIO.new("4546793\n")
expect{ dash_insert() }.to output("454*67-9-3\n").to_stdout
```

- [Look-and-say sequence problem](http://codingdojang.com/scode/516) Runs of identical digits can be grouped into one.

```ruby
ant = ->seq=[1] { seq.chunk{|_|_}.flat_map {|h,g|[g.size,h]} }
iterate = ->n { n==1? [1].lazy : ant[iterate[n-1]] }
ant_seq = ->nth,n { puts iterate[nth].take(n).force.last }
expect{ ant_seq[100,100] }.to output("1\n").to_stdout
```
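For reference, the two methods differ in how they decide group boundaries: `chunk` groups consecutive elements by the value its block returns (yielding `[value, run]` pairs), while `chunk_while` extends a group as long as a pairwise predicate between neighbours holds. A minimal side-by-side sketch (the sample array is illustrative, not from any of the problems above):

```ruby
arr = [1, 1, 2, 6, 7, 3]

# chunk: one group per run of elements whose block value is the same
runs = arr.chunk { |e| e.even? }.to_a
# => [[false, [1, 1]], [true, [2, 6]], [false, [7, 3]]]

# chunk_while: a new group starts wherever the pairwise predicate is false
increasing = arr.chunk_while { |a, b| a <= b }.to_a
# => [[1, 1, 2, 6, 7], [3]]
```

The look-and-say example above relies on exactly the `chunk` shape: each `[digit, run]` pair becomes `[run.size, digit]`.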
directory_id: 7717b1a848b2cac74392659106e4b49170a2897f
languages: ["Markdown", "Ruby"]
num_files: 29
repo_language: Markdown
repo_name: cavinkwon/coding-dojang
revision_id: 61640b3f9a1764f34f499ea6bf136dca2f3eca9c
snapshot_id: 27f43c0cc6b725c79e9d993788dbaff5d69c0e1e
branch_name: refs/heads/master
<file_sep>// assign variables var x; var a = 3; var b = 71; // note: a leading zero (071) would make this a legacy octal literal (decimal 57) // reassign variables (delete cannot remove a var-declared binding, and re-declaring with var is redundant) x = undefined; a = 5; if (b === a) { // doSomething() }
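The file above touches several classic `var`-era pitfalls: legacy octal literals (`071`), `delete` applied to a variable, and silent re-declaration. A minimal sketch of how strict mode with `let`/`const` surfaces each one (the names here are illustrative, not from the original file):

```javascript
"use strict"; // strict mode rejects legacy octal literals such as 071 at parse time

// const: must be initialized once and can never be reassigned
const base = 71; // write 0o71 explicitly if octal (decimal 57) was intended

// let: block-scoped; re-declaring the same name in the same scope is a SyntaxError
let counter = 3;
counter = 5; // plain reassignment is fine
// let counter = 5; // would be a SyntaxError here

// delete only removes object properties, never declared variables
const config = { debug: true };
delete config.debug; // OK: removes the property
// delete counter;   // SyntaxError in strict mode

// strict equality avoids type-coercion surprises
const equal = (base === counter); // false

console.log(base, counter, equal);
```

Under `"use strict"`, the commented-out lines fail loudly at parse time instead of misbehaving silently.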
directory_id: 1ddf586b04d98a43d4f317ec82b4a286f22d9332
languages: ["JavaScript"]
num_files: 1
repo_language: JavaScript
repo_name: kaskas2/web-commerce
revision_id: 75843b7feb6e6f3650b43cfec61cc53199339e89
snapshot_id: 2c7f75f4d71ba7bb868dbca641f3887894f9bfb5
branch_name: refs/heads/master
<repo_name>maurohn/catawba-mobile<file_sep>/www/js/controllers.js angular.module('RedHat-Forum.controllers', []) .config(function($ionicConfigProvider) { $ionicConfigProvider.scrolling.jsScrolling(false); $ionicConfigProvider.backButton.text('').icon('ion-android-arrow-back'); $ionicConfigProvider.platform.ios.navBar.alignTitle('center'); $ionicConfigProvider.platform.android.navBar.alignTitle('center'); $ionicConfigProvider.platform.ios.backButton.previousTitleText('').icon('ion-ios-arrow-thin-left'); $ionicConfigProvider.platform.android.backButton.previousTitleText('').icon('ion-android-arrow-back'); $ionicConfigProvider.views.transition('android'); // $ionicConfigProvider.platform.ios.views.transition('ios'); // $ionicConfigProvider.platform.android.views.transition('android'); $ionicConfigProvider.views.swipeBackEnabled(false); }) .run(function($rootScope, $ionicPlatform, $ionicPopup, $http, $ionicLoading, $localStorage) { //Localization.locate(true); $ionicPlatform.registerBackButtonAction(function (event) { event.preventDefault(); $ionicPopup.confirm({ title: 'Ops!', template: 'Are you sure you want to quit?', cancelText: 'Cancel', okText: 'Close' }).then(function(res){ if( res ){ navigator.app.exitApp(); } }) }, 100); }) .controller('AuthCtrl', function($scope, $ionicConfig) { }) .controller('WalkthroughCtrl', function($rootScope, $state, $templateCache, $q, $rootScope, $localStorage, $http, $ionicLoading, $ionicPopup) { if($localStorage.user_session) { $state.go('app.feeds-categories'); } else { $rootScope.forums = []; $rootScope.data = []; $rootScope.data.selectedForum = 1; // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + FORUMS, params:{}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator $ionicLoading.hide(); //alert(JSON.stringify(response)); $rootScope.forums = response;
}).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!' }); }); $ionicPopup.alert({ title: 'Select a Forum', template: 'Please select a forum: <select ng-options="opt.id as opt.city for opt in forums" ng-model="data.selectedForum" style="background-color: azure;"></select>', scope: $rootScope, buttons: [ { text: 'Save', type: 'button-positive', onTap: function(e) { $localStorage.forumId = $rootScope.data.selectedForum; var forum = _.find($rootScope.forums, {id: $rootScope.data.selectedForum}); $localStorage.forum_survey_url = forum.survey_url; } } ] }); } }) // APP .controller('AppCtrl', function($scope, $ionicConfig, $localStorage) { $scope.user = JSON.parse($localStorage.user_session); }) //LOGIN .controller('LoginCtrl', function($scope, $state, $templateCache, $q, $rootScope, $localStorage, $http, $ionicLoading, $ionicPopup) { $scope.user = {}; // $scope.user.email = "<EMAIL>"; // $scope.user.password = "<PASSWORD>"; $scope.doLogIn = function(){ // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + OAUTH, params:{'email': $scope.user.email, 'pass': $scope.user.password}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator $ionicLoading.hide(); //alert(JSON.stringify(response)); $scope.user = response; //$scope.user.name = response.first_name + ' ' + response.last_name; $localStorage.user_session = JSON.stringify(response); $state.go('app.feeds-categories'); }).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); if(status == 705){ $ionicPopup.alert({ title: 'Ups!', template: data.error + ', please Sign Up first!', okText: 'OK!'
}); $state.go('auth.signup'); } else if (status == 706){ $ionicPopup.alert({ title: 'Ups!', template: data.error + ', please try again!', okText: 'OK!' }); } else { $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!' }); } }); }; // We need this for the form validation $scope.selected_tab = ""; $scope.$on('my-tabs-changed', function (event, data) { $scope.selected_tab = data.title; }); }) .controller('SignupCtrl', function($scope, $state, $templateCache, $q, $rootScope, $localStorage, $http, $ionicLoading, $ionicPopup) { $scope.user = {}; // $scope.user.email = "<EMAIL>"; // $scope.user_inexistent = false; $scope.countries = [ {name: 'Afghanistan', code: 'AF'}, {name: 'Åland Islands', code: 'AX'}, {name: 'Albania', code: 'AL'}, {name: 'Algeria', code: 'DZ'}, {name: 'American Samoa', code: 'AS'}, {name: 'Andorra', code: 'AD'}, {name: 'Angola', code: 'AO'}, {name: 'Anguilla', code: 'AI'}, {name: 'Antarctica', code: 'AQ'}, {name: 'Antigua and Barbuda', code: 'AG'}, {name: 'Argentina', code: 'AR'}, {name: 'Armenia', code: 'AM'}, {name: 'Aruba', code: 'AW'}, {name: 'Australia', code: 'AU'}, {name: 'Austria', code: 'AT'}, {name: 'Azerbaijan', code: 'AZ'}, {name: 'Bahamas', code: 'BS'}, {name: 'Bahrain', code: 'BH'}, {name: 'Bangladesh', code: 'BD'}, {name: 'Barbados', code: 'BB'}, {name: 'Belarus', code: 'BY'}, {name: 'Belgium', code: 'BE'}, {name: 'Belize', code: 'BZ'}, {name: 'Benin', code: 'BJ'}, {name: 'Bermuda', code: 'BM'}, {name: 'Bhutan', code: 'BT'}, {name: 'Bolivia', code: 'BO'}, {name: 'Bosnia and Herzegovina', code: 'BA'}, {name: 'Botswana', code: 'BW'}, {name: 'Bouvet Island', code: 'BV'}, {name: 'Brazil', code: 'BR'}, {name: 'British Indian Ocean Territory', code: 'IO'}, {name: 'Brunei Darussalam', code: 'BN'}, {name: 'Bulgaria', code: 'BG'}, {name: 'Burkina Faso', code: 'BF'}, {name: 'Burundi', code: 'BI'}, {name: 'Cambodia', code: 'KH'}, {name: 'Cameroon', code: 'CM'}, {name: 'Canada', code: 'CA'}, {name: 'Cape 
Verde', code: 'CV'}, {name: 'Cayman Islands', code: 'KY'}, {name: 'Central African Republic', code: 'CF'}, {name: 'Chad', code: 'TD'}, {name: 'Chile', code: 'CL'}, {name: 'China', code: 'CN'}, {name: 'Christmas Island', code: 'CX'}, {name: 'Cocos (Keeling) Islands', code: 'CC'}, {name: 'Colombia', code: 'CO'}, {name: 'Comoros', code: 'KM'}, {name: 'Congo', code: 'CG'}, {name: 'Congo, The Democratic Republic of the', code: 'CD'}, {name: 'Cook Islands', code: 'CK'}, {name: 'Costa Rica', code: 'CR'}, {name: '<NAME>', code: 'CI'}, {name: 'Croatia', code: 'HR'}, {name: 'Cuba', code: 'CU'}, {name: 'Cyprus', code: 'CY'}, {name: 'Czech Republic', code: 'CZ'}, {name: 'Denmark', code: 'DK'}, {name: 'Djibouti', code: 'DJ'}, {name: 'Dominica', code: 'DM'}, {name: 'Dominican Republic', code: 'DO'}, {name: 'Ecuador', code: 'EC'}, {name: 'Egypt', code: 'EG'}, {name: 'El Salvador', code: 'SV'}, {name: 'Equatorial Guinea', code: 'GQ'}, {name: 'Eritrea', code: 'ER'}, {name: 'Estonia', code: 'EE'}, {name: 'Ethiopia', code: 'ET'}, {name: 'Falkland Islands (Malvinas)', code: 'FK'}, {name: 'Faroe Islands', code: 'FO'}, {name: 'Fiji', code: 'FJ'}, {name: 'Finland', code: 'FI'}, {name: 'France', code: 'FR'}, {name: 'French Guiana', code: 'GF'}, {name: 'French Polynesia', code: 'PF'}, {name: 'French Southern Territories', code: 'TF'}, {name: 'Gabon', code: 'GA'}, {name: 'Gambia', code: 'GM'}, {name: 'Georgia', code: 'GE'}, {name: 'Germany', code: 'DE'}, {name: 'Ghana', code: 'GH'}, {name: 'Gibraltar', code: 'GI'}, {name: 'Greece', code: 'GR'}, {name: 'Greenland', code: 'GL'}, {name: 'Grenada', code: 'GD'}, {name: 'Guadeloupe', code: 'GP'}, {name: 'Guam', code: 'GU'}, {name: 'Guatemala', code: 'GT'}, {name: 'Guernsey', code: 'GG'}, {name: 'Guinea', code: 'GN'}, {name: 'Guinea-Bissau', code: 'GW'}, {name: 'Guyana', code: 'GY'}, {name: 'Haiti', code: 'HT'}, {name: 'Heard Island and Mcdonald Islands', code: 'HM'}, {name: 'Holy See (Vatican City State)', code: 'VA'}, {name: 'Honduras', code: 
'HN'}, {name: 'Hong Kong', code: 'HK'}, {name: 'Hungary', code: 'HU'}, {name: 'Iceland', code: 'IS'}, {name: 'India', code: 'IN'}, {name: 'Indonesia', code: 'ID'}, {name: 'Iran, Islamic Republic Of', code: 'IR'}, {name: 'Iraq', code: 'IQ'}, {name: 'Ireland', code: 'IE'}, {name: 'Isle of Man', code: 'IM'}, {name: 'Israel', code: 'IL'}, {name: 'Italy', code: 'IT'}, {name: 'Jamaica', code: 'JM'}, {name: 'Japan', code: 'JP'}, {name: 'Jersey', code: 'JE'}, {name: 'Jordan', code: 'JO'}, {name: 'Kazakhstan', code: 'KZ'}, {name: 'Kenya', code: 'KE'}, {name: 'Kiribati', code: 'KI'}, {name: 'Korea, Democratic People\'s Republic of', code: 'KP'}, {name: 'Korea, Republic of', code: 'KR'}, {name: 'Kuwait', code: 'KW'}, {name: 'Kyrgyzstan', code: 'KG'}, {name: 'Lao People\'s Democratic Republic', code: 'LA'}, {name: 'Latvia', code: 'LV'}, {name: 'Lebanon', code: 'LB'}, {name: 'Lesotho', code: 'LS'}, {name: 'Liberia', code: 'LR'}, {name: '<NAME>', code: 'LY'}, {name: 'Liechtenstein', code: 'LI'}, {name: 'Lithuania', code: 'LT'}, {name: 'Luxembourg', code: 'LU'}, {name: 'Macao', code: 'MO'}, {name: 'Macedonia, The Former Yugoslav Republic of', code: 'MK'}, {name: 'Madagascar', code: 'MG'}, {name: 'Malawi', code: 'MW'}, {name: 'Malaysia', code: 'MY'}, {name: 'Maldives', code: 'MV'}, {name: 'Mali', code: 'ML'}, {name: 'Malta', code: 'MT'}, {name: '<NAME>', code: 'MH'}, {name: 'Martinique', code: 'MQ'}, {name: 'Mauritania', code: 'MR'}, {name: 'Mauritius', code: 'MU'}, {name: 'Mayotte', code: 'YT'}, {name: 'Mexico', code: 'MX'}, {name: 'Micronesia, Federated States of', code: 'FM'}, {name: 'Moldova, Republic of', code: 'MD'}, {name: 'Monaco', code: 'MC'}, {name: 'Mongolia', code: 'MN'}, {name: 'Montserrat', code: 'MS'}, {name: 'Morocco', code: 'MA'}, {name: 'Mozambique', code: 'MZ'}, {name: 'Myanmar', code: 'MM'}, {name: 'Namibia', code: 'NA'}, {name: 'Nauru', code: 'NR'}, {name: 'Nepal', code: 'NP'}, {name: 'Netherlands', code: 'NL'}, {name: 'Netherlands Antilles', code: 'AN'}, 
{name: 'New Caledonia', code: 'NC'}, {name: 'New Zealand', code: 'NZ'}, {name: 'Nicaragua', code: 'NI'}, {name: 'Niger', code: 'NE'}, {name: 'Nigeria', code: 'NG'}, {name: 'Niue', code: 'NU'}, {name: 'Norfolk Island', code: 'NF'}, {name: 'Northern Mariana Islands', code: 'MP'}, {name: 'Norway', code: 'NO'}, {name: 'Oman', code: 'OM'}, {name: 'Pakistan', code: 'PK'}, {name: 'Palau', code: 'PW'}, {name: 'Palestinian Territory, Occupied', code: 'PS'}, {name: 'Panama', code: 'PA'}, {name: 'Papua New Guinea', code: 'PG'}, {name: 'Paraguay', code: 'PY'}, {name: 'Peru', code: 'PE'}, {name: 'Philippines', code: 'PH'}, {name: 'Pitcairn', code: 'PN'}, {name: 'Poland', code: 'PL'}, {name: 'Portugal', code: 'PT'}, {name: 'Puerto Rico', code: 'PR'}, {name: 'Qatar', code: 'QA'}, {name: 'Reunion', code: 'RE'}, {name: 'Romania', code: 'RO'}, {name: 'Russian Federation', code: 'RU'}, {name: 'Rwanda', code: 'RW'}, {name: '<NAME>', code: 'SH'}, {name: '<NAME> and Nevis', code: 'KN'}, {name: '<NAME>', code: 'LC'}, {name: '<NAME> and Miquelon', code: 'PM'}, {name: '<NAME> and the Grenadines', code: 'VC'}, {name: 'Samoa', code: 'WS'}, {name: 'San Marino', code: 'SM'}, {name: '<NAME>', code: 'ST'}, {name: 'Saudi Arabia', code: 'SA'}, {name: 'Senegal', code: 'SN'}, {name: 'Serbia and Montenegro', code: 'CS'}, {name: 'Seychelles', code: 'SC'}, {name: 'Sierra Leone', code: 'SL'}, {name: 'Singapore', code: 'SG'}, {name: 'Slovakia', code: 'SK'}, {name: 'Slovenia', code: 'SI'}, {name: 'Solomon Islands', code: 'SB'}, {name: 'Somalia', code: 'SO'}, {name: 'South Africa', code: 'ZA'}, {name: 'South Georgia and the South Sandwich Islands', code: 'GS'}, {name: 'Spain', code: 'ES'}, {name: 'Sri Lanka', code: 'LK'}, {name: 'Sudan', code: 'SD'}, {name: 'Suriname', code: 'SR'}, {name: 'Svalbard and <NAME>', code: 'SJ'}, {name: 'Swaziland', code: 'SZ'}, {name: 'Sweden', code: 'SE'}, {name: 'Switzerland', code: 'CH'}, {name: 'Syrian Arab Republic', code: 'SY'}, {name: 'Taiwan, Province of China', code: 
'TW'}, {name: 'Tajikistan', code: 'TJ'}, {name: 'Tanzania, United Republic of', code: 'TZ'}, {name: 'Thailand', code: 'TH'}, {name: 'Timor-Leste', code: 'TL'}, {name: 'Togo', code: 'TG'}, {name: 'Tokelau', code: 'TK'}, {name: 'Tonga', code: 'TO'}, {name: 'Trinidad and Tobago', code: 'TT'}, {name: 'Tunisia', code: 'TN'}, {name: 'Turkey', code: 'TR'}, {name: 'Turkmenistan', code: 'TM'}, {name: 'Turks and Caicos Islands', code: 'TC'}, {name: 'Tuvalu', code: 'TV'}, {name: 'Uganda', code: 'UG'}, {name: 'Ukraine', code: 'UA'}, {name: 'United Arab Emirates', code: 'AE'}, {name: 'United Kingdom', code: 'GB'}, {name: 'United States', code: 'US'}, {name: 'United States Minor Outlying Islands', code: 'UM'}, {name: 'Uruguay', code: 'UY'}, {name: 'Uzbekistan', code: 'UZ'}, {name: 'Vanuatu', code: 'VU'}, {name: 'Venezuela', code: 'VE'}, {name: 'Vietnam', code: 'VN'}, {name: 'Virgin Islands, British', code: 'VG'}, {name: 'Virgin Islands, U.S.', code: 'VI'}, {name: 'Wallis and Futuna', code: 'WF'}, {name: 'Western Sahara', code: 'EH'}, {name: 'Yemen', code: 'YE'}, {name: 'Zambia', code: 'ZM'}, {name: 'Zimbabwe', code: 'ZW'} ]; $scope.doSignUp = function(){ // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + SING_UP, params:{'email': $scope.user.email, 'pass': $<PASSWORD>}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator $ionicLoading.hide(); // alert(JSON.stringify(response)); $scope.user = response; //$scope.user.name = response.first_name + ' ' + response.last_name; $scope.user_inexistent = true; //$state.go('app.feeds-categories'); }).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); if(status == 705){ $scope.user_inexistent = true; } else if (status == 707){ $ionicPopup.alert({ title: 'Ups!', template: data.error + ', please login!', okText: 'OK!'
}); $state.go('auth.login'); } else { $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!' }); } }); }; $scope.doRegister = function(){ // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + REGISTER, params:{'email': $scope.user.email, 'pass': <PASSWORD>, 'country': $scope.user.country.code, 'name': $scope.user.name, 'job_title': $scope.user.job_title, 'company': $scope.user.company, 'age': $scope.user.age, 'bio': $scope.user.bio}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator $ionicLoading.hide(); //alert(JSON.stringify(response)); $scope.user = response; //$scope.user.name = response.first_name + ' ' + response.last_name; $localStorage.user_session = JSON.stringify(response); $state.go('app.feeds-categories'); }).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!'
}); }); }; }) .controller('ForgotPasswordCtrl', function($scope, $state) { $scope.recoverPassword = function(){ $state.go('app.feeds-categories'); }; $scope.user = {}; }) .controller('ProfileCtrl', function($scope, $state, $http, $localStorage, $ionicLoading, $ionicPopup) { $scope.user = JSON.parse($localStorage.user_session); $scope.doUpdate = function(){ // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + UPDATE_USER, params:{'shared_user': $scope.user.shared_user, 'persistence_token': $scope.user.persistence_token}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator $ionicLoading.hide(); //alert(JSON.stringify(response)); }).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!' }); }); }; }) .controller('SponsorsCtrl', function($scope, $state, $http, $localStorage) { $scope.sponsors = {}; $http.get('sponsors.json').success(function(response) { $scope.sponsors = response.sponsors; }); }) .controller('LogoffCtrl', function($scope, $state, $http, $localStorage) { delete $localStorage.user_session; $scope.user = {}; $state.go("auth.walkthrough"); }) .controller('ProfileUserCtrl', function($scope, $state, $http, $stateParams, $localStorage, $ionicLoading, $ionicPopup) { $scope.user = JSON.parse($localStorage.user_session); $scope.userId = $stateParams.user_Id; $scope.user_det = {}; $scope.getUserby = function(){ // start the loading indicator and block the screen $ionicLoading.show({ template: '<ion-spinner icon="android"></ion-spinner>' }); $http({ method: 'GET', url: url_backend + USER_BY, params:{'user_id': $scope.userId, 'persistence_token': $scope.user.persistence_token}, headers: {'Accept': 'application/json'} }).success(function(response){ // hide the loading indicator
$ionicLoading.hide(); //alert(JSON.stringify(response)); $scope.user_det = response; }).error(function(data, status){ // hide the loading indicator $ionicLoading.hide(); $ionicPopup.alert({ title: 'Ups!', template: status + ' Call service error!', okText: 'OK!' }); }); }; }) .controller('RateApp', function($scope) { $scope.rateApp = function(){ if(ionic.Platform.isIOS()){ //you need to set your own ios app id AppRate.preferences.storeAppURL.ios = '1234555553>'; AppRate.promptForRating(true); }else if(ionic.Platform.isAndroid()){ //you need to set your own android app id AppRate.preferences.storeAppURL.android = 'market://details?id=ionFB'; AppRate.promptForRating(true); } }; }) .controller('SendMailCtrl', function($scope) { $scope.sendMail = function(){ cordova.plugins.email.isAvailable( function (isAvailable) { // alert('Service is not available') unless isAvailable; cordova.plugins.email.open({ to: '<EMAIL>', cc: '<EMAIL>', // bcc: ['<EMAIL>', '<EMAIL>'], subject: 'Greetings', body: 'How are you? Nice greetings from IonFullApp' }); } ); }; }) .controller('MapsCtrl', function($scope, $ionicLoading) { $scope.info_position = { lat: 43.07493, lng: -89.381388 }; $scope.center_position = { lat: 43.07493, lng: -89.381388 }; $scope.my_location = ""; $scope.$on('mapInitialized', function(event, map) { $scope.map = map; }); $scope.centerOnMe= function(){ $scope.positions = []; $ionicLoading.show({ template: 'Loading...'
}); // with this function you can get the user’s current position // we use this plugin: https://github.com/apache/cordova-plugin-geolocation/ navigator.geolocation.getCurrentPosition(function(position) { var pos = new google.maps.LatLng(position.coords.latitude, position.coords.longitude); $scope.current_position = {lat: pos.G,lng: pos.K}; $scope.my_location = pos.G+", "+pos.K; $scope.map.setCenter(pos); $ionicLoading.hide(); }); }; }) .controller('AdsCtrl', function($scope, $ionicActionSheet, AdMob, iAd) { $scope.manageAdMob = function() { // Show the action sheet var hideSheet = $ionicActionSheet.show({ //Here you can add some more buttons buttons: [ { text: 'Show Banner' }, { text: 'Show Interstitial' } ], destructiveText: 'Remove Ads', titleText: 'Choose the ad to show', cancelText: 'Cancel', cancel: function() { // add cancel code.. }, destructiveButtonClicked: function() { console.log("removing ads"); AdMob.removeAds(); return true; }, buttonClicked: function(index, button) { if(button.text == 'Show Banner') { console.log("show banner"); AdMob.showBanner(); } if(button.text == 'Show Interstitial') { console.log("show interstitial"); AdMob.showInterstitial(); } return true; } }); }; $scope.manageiAd = function() { // Show the action sheet var hideSheet = $ionicActionSheet.show({ //Here you can add some more buttons buttons: [ { text: 'Show iAd Banner' }, { text: 'Show iAd Interstitial' } ], destructiveText: 'Remove Ads', titleText: 'Choose the ad to show - Interstitial only works in iPad', cancelText: 'Cancel', cancel: function() { // add cancel code.. 
      },
      destructiveButtonClicked: function() {
        console.log("removing ads");
        iAd.removeAds();
        return true;
      },
      buttonClicked: function(index, button) {
        if(button.text == 'Show iAd Banner') {
          console.log("show iAd banner");
          iAd.showBanner();
        }
        if(button.text == 'Show iAd Interstitial') {
          console.log("show iAd interstitial");
          iAd.showInterstitial();
        }
        return true;
      }
    });
  };
})

// FEED
//brings all feed categories
.controller('FeedsCategoriesCtrl', function($scope, $state, $localStorage, $http, $ionicLoading, $ionicPopup) {
  $scope.events = [];
  // Start the loading indicator and block the screen
  $ionicLoading.show({
    template: '<ion-spinner icon="android"></ion-spinner>'
  });
  $http({
    method: 'GET',
    url: url_backend + EVENTS,
    params: {'id': $localStorage.forumId, 'persistence_token': $scope.user.persistence_token},
    headers: {'Accept': 'application/json'}
  }).success(function(response){
    // Hide the loading indicator
    $ionicLoading.hide();
    //console.log(JSON.stringify(response));
    $scope.events = response;
  }).error(function(data, status){
    // Hide the loading indicator
    $ionicLoading.hide();
    $ionicPopup.alert({
      title: 'Ups!',
      template: status + ' Call service error!',
      okText: 'OK!'
    });
  });
})

//brings all contacts
.controller('MembersCtrl', function($scope, $state, $templateCache, $q, $rootScope, $localStorage, $http, $ionicLoading, $ionicPopup) {
  $scope.members = [];
  $scope.total_members = "";
  // Start the loading indicator and block the screen
  $ionicLoading.show({
    template: '<ion-spinner icon="android"></ion-spinner>'
  });
  $http({
    method: 'GET',
    url: url_backend + FORUM_MEMBERS,
    params: {'id': $localStorage.forumId, 'persistence_token': $scope.user.persistence_token},
    headers: {'Accept': 'application/json'}
  }).success(function(response){
    // Hide the loading indicator
    $ionicLoading.hide();
    //console.log(JSON.stringify(response));
    $scope.members = response.members;
    $scope.total_members = response.total_members;
  }).error(function(data, status){
    // Hide the loading indicator
    $ionicLoading.hide();
    $ionicPopup.alert({
      title: 'Ups!',
      template: status + ' Call service error!',
      okText: 'OK!'
    });
  });
})

.controller('ForumSurveyCtrl', function($scope, $http, $localStorage) {
  $scope.survey_url = $localStorage.forum_survey_url;
})

// FORUM
//bring specific forum providers
.controller('EventDetailCtrl', function($scope, $http, $stateParams, $cordovaSocialSharing, $sce) {
  $scope.event_sources = [];
  $scope.showfooter = false;
  $scope.eventId = $stateParams.eventId;
  $scope.rate = 3;
  $scope.max = 5;

  $http.get('eventdetail.json').success(function(response) {
    var event = _.find(response, {id: $scope.eventId});
    $scope.eventTitle = event.title;
    $scope.event_sources = event;
    $scope.event_survey_url = $sce.trustAsResourceUrl($scope.event_sources.survey_url);
  });

  $scope.sendComment = function() {
    alert($scope.msgcomment);
  };

  $scope.shareit = function() {
    var subject = "Hey, I'm RedhatForum!";
    var message = $scope.event_sources.message_to_share;
    var link = $scope.event_sources.link_to_share;
    $cordovaSocialSharing.share(message, subject, null, link) // Share via native share sheet
      .then(function(result) {
        // Success!
      }, function(err) {
        // An error occurred.
        // Show a message to the user
        console.error(err);
      });
  };
})

//bring specific category providers
.controller('CategoryFeedsCtrl', function($scope, $http, $stateParams) {
  $scope.category_sources = [];
  $scope.categoryId = $stateParams.categoryId;
  $http.get('feeds-categories.json').success(function(response) {
    var category = _.find(response, {id: $scope.categoryId});
    $scope.categoryTitle = category.title;
    $scope.category_sources = category.feed_sources;
  });
})

//this method brings posts for a source provider
.controller('FeedEntriesCtrl', function($scope, $stateParams, $http, FeedList, $q, $ionicLoading, BookMarkService) {
  $scope.feed = [];
  var categoryId = $stateParams.categoryId,
      sourceId = $stateParams.sourceId;

  $scope.doRefresh = function() {
    $http.get('feeds-categories.json').success(function(response) {
      $ionicLoading.show({
        template: 'Loading entries...'
      });
      var category = _.find(response, {id: categoryId }),
          source = _.find(category.feed_sources, {id: sourceId });
      $scope.sourceTitle = source.title;
      FeedList.get(source.url)
        .then(function (result) {
          $scope.feed = result.feed;
          $ionicLoading.hide();
          $scope.$broadcast('scroll.refreshComplete');
        }, function (reason) {
          $ionicLoading.hide();
          $scope.$broadcast('scroll.refreshComplete');
        });
    });
  };

  $scope.doRefresh();

  $scope.bookmarkPost = function(post){
    $ionicLoading.show({
      template: 'Post Saved!',
      noBackdrop: true,
      duration: 1000
    });
    BookMarkService.bookmarkFeedPost(post);
  };
})

// SETTINGS
.controller('SettingsCtrl', function($scope, $ionicActionSheet, $state) {
  $scope.airplaneMode = true;
  $scope.wifi = false;
  $scope.bluetooth = true;
  $scope.personalHotspot = true;

  $scope.checkOpt1 = true;
  $scope.checkOpt2 = true;
  $scope.checkOpt3 = false;

  $scope.radioChoice = 'B';

  // Triggered on the logOut button click
  $scope.showLogOutMenu = function() {
    // Show the action sheet
    var hideSheet = $ionicActionSheet.show({
      //Here you can add some more buttons
      // buttons: [
      //   { text: '<b>Share</b> This' },
      //   { text: 'Move' }
      // ],
      destructiveText: 'Logout',
      titleText: 'Are you sure you want to logout? This app is awesome so I recommend you to stay.',
      cancelText: 'Cancel',
      cancel: function() {
        // add cancel code..
      },
      buttonClicked: function(index) {
        //Called when one of the non-destructive buttons is clicked,
        //with the index of the button that was clicked and the button object.
        //Return true to close the action sheet, or false to keep it opened.
        return true;
      },
      destructiveButtonClicked: function(){
        //Called when the destructive button is clicked.
        //Return true to close the action sheet, or false to keep it opened.
        $state.go('auth.walkthrough');
      }
    });
  };
})

// TINDER CARDS
//.controller('TinderCardsCtrl', function($scope, $http) {
//
//  $scope.cards = [];
//
//
//  $scope.addCard = function(img, name) {
//    var newCard = {image: img, name: name};
//    newCard.id = Math.random();
//    $scope.cards.unshift(angular.extend({}, newCard));
//  };
//
//  $scope.addCards = function(count) {
//    $http.get('http://api.randomuser.me/?results=' + count).then(function(value) {
//      angular.forEach(value.data.results, function (v) {
//        $scope.addCard(v.user.picture.large, v.user.name.first + " " + v.user.name.last);
//      });
//    });
//  };
//
//  $scope.addFirstCards = function() {
//    $scope.addCard("https://dl.dropboxusercontent.com/u/30675090/envato/tinder-cards/left.png", "Nope");
//    $scope.addCard("https://dl.dropboxusercontent.com/u/30675090/envato/tinder-cards/right.png", "Yes");
//  };
//
//  $scope.addFirstCards();
//  $scope.addCards(5);
//
//  $scope.cardDestroyed = function(index) {
//    $scope.cards.splice(index, 1);
//    $scope.addCards(1);
//  };
//
//  $scope.transitionOut = function(card) {
//    console.log('card transition out');
//  };
//
//  $scope.transitionRight = function(card) {
//    console.log('card removed to the right');
//    console.log(card);
//  };
//
//  $scope.transitionLeft = function(card) {
//    console.log('card removed to the left');
//    console.log(card);
//  };
//})

// BOOKMARKS
.controller('BookMarksCtrl',
function($scope, $rootScope, BookMarkService, $state) {
  $scope.bookmarks = BookMarkService.getBookmarks();

  // When a new post is bookmarked, we should update bookmarks list
  $rootScope.$on("new-bookmark", function(event){
    $scope.bookmarks = BookMarkService.getBookmarks();
  });

  $scope.goToFeedPost = function(link){
    window.open(link, '_blank', 'location=yes');
  };

  $scope.goToWordpressPost = function(postId){
    $state.go('app.post', {postId: postId});
  };
})

// WORDPRESS
.controller('WordpressCtrl', function($scope, $http, $ionicLoading, PostService, BookMarkService) {
  $scope.posts = [];
  $scope.page = 1;
  $scope.totalPages = 1;

  $scope.doRefresh = function() {
    $ionicLoading.show({
      template: 'Loading posts...'
    });
    //Always bring me the latest posts => page=1
    PostService.getRecentPosts(1)
      .then(function(data){
        $scope.totalPages = data.pages;
        $scope.posts = PostService.shortenPosts(data.posts);
        $ionicLoading.hide();
        $scope.$broadcast('scroll.refreshComplete');
      });
  };

  $scope.loadMoreData = function(){
    $scope.page += 1;
    PostService.getRecentPosts($scope.page)
      .then(function(data){
        //We will update this value in every request because new posts can be created
        $scope.totalPages = data.pages;
        var new_posts = PostService.shortenPosts(data.posts);
        $scope.posts = $scope.posts.concat(new_posts);
        $scope.$broadcast('scroll.infiniteScrollComplete');
      });
  };

  $scope.moreDataCanBeLoaded = function(){
    return $scope.totalPages > $scope.page;
  };

  $scope.bookmarkPost = function(post){
    $ionicLoading.show({
      template: 'Post Saved!',
      noBackdrop: true,
      duration: 1000
    });
    BookMarkService.bookmarkWordpressPost(post);
  };

  $scope.doRefresh();
})

// WORDPRESS POST
.controller('WordpressPostCtrl', function($scope, post_data, $ionicLoading) {
  $scope.post = post_data.post;
  $ionicLoading.hide();

  $scope.sharePost = function(link){
    window.plugins.socialsharing.share('Check this post here: ', null, null, link);
  };
})

.controller('ImagePickerCtrl', function($scope, $rootScope, $cordovaCamera) {
  $scope.images = [];
  $scope.selImages = function() {
    window.imagePicker.getPictures(
      function(results) {
        for (var i = 0; i < results.length; i++) {
          console.log('Image URI: ' + results[i]);
          $scope.images.push(results[i]);
        }
        if(!$scope.$$phase) {
          $scope.$apply();
        }
      },
      function (error) {
        console.log('Error: ' + error);
      }
    );
  };

  $scope.removeImage = function(image) {
    $scope.images = _.without($scope.images, image);
  };

  $scope.shareImage = function(image) {
    window.plugins.socialsharing.share(null, null, image);
  };

  $scope.shareAll = function() {
    window.plugins.socialsharing.share(null, null, $scope.images);
  };
})
;
<file_sep>#!/bin/bash
rm -rf out/target_files/SYSTEM/priv-app/Phone.apk
rm -rf out/target_files/SYSTEM/media/bootaudio.mp3
#rm -rf out/target_files/SYSTEM/priv-app/MediaProvider.apk
<file_sep>local-phone-apps = $(private-phone-apps)
local-phone-priv-apps = $(private-phone-priv-apps)

private-phone-apps := BasicDreams \
	Bluetooth \
	ApplicationsProvider \
	CellConnService \
	CertInstaller \
	DocumentsUI \
	Galaxy4 \
	EngineerModeSim \
	EngineerMode \
	MtkBt \
	HoloSpiralWallpaper \
	HTMLViewer \
	KeyChain \
	Nfc \
	NoiseField \
	LatinIME \
	LiveWallpapers \
	MagicSmokeWallpapers \
	FMRadio \
	Stk1 \
	StkSelection \
	SmartcardService \
	PacProcessor \
	SchedulePowerOnOff \
	PhaseBeam \
	PhotoTable \
	PrintSpooler \
	YGPS \
	UserDictionaryProvider \
	VisualizationWallpapers

private-phone-priv-apps := BackupRestoreConfirmation \
	InputDevices \
	CDS_INFO \
	DefaultContainerService \
	ExternalStorageProvider \
	ProxyHandler \
	SharedStorageBackup \
	Shell \
	Tag \
	Dialer \
	MediaProvider \
	TeleService
<repo_name>jakarta99/spring-boot-training<file_sep>/src/main/java/tw/com/softleader/training/integration/entity/BackupPolicy.java
package tw.com.softleader.training.integration.entity;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

import lombok.Getter;
import lombok.Setter;
import lombok.ToString;

@Entity
@Table(name = "BACKUP_POLICY")
@Getter
@Setter
@ToString
public class BackupPolicy {

	@Id
	@GeneratedValue(strategy = GenerationType.IDENTITY)
	@Column(name = "ID")
	private Long id;

	@Column(name = "POLICY_NO")
	private String policyNo;

	@Column(name = "ENDST_NO")
	private int endstNo;

	@Column(name = "APPLICANT_LOCAL_NAME")
	private String applicantLocalName;

	@Column(name = "APPLICANT_IDNO")
	private String applicantIdno;
}
<file_sep>/src/main/java/tw/com/softleader/training/policy/entity/Insured.java
package tw.com.softleader.training.policy.entity;

import java.util.Set;

import javax.persistence.CascadeType;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.FetchType;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.JoinColumn;
import javax.persistence.OneToMany;
import javax.persistence.Table;

import lombok.Getter;
import lombok.Setter;
import lombok.ToString;

@Entity
@Table(name = "INSURED")
@Getter
@Setter
@ToString
//@NamedEntityGraph(name = "Insured.items", attributeNodes = @NamedAttributeNode("items"))
public class Insured {

	@Id
	@GeneratedValue(strategy = GenerationType.IDENTITY)
	@Column(name = "ID")
	private Long id;

	@Column(name = "POLICY_ID")
	private Long policyId;

	@Column(name = "INSURED_IDNO")
	private String insuredIndo;

	@Column(name = "INSURED_LOCAL_NAME")
	private String insuredLocalName;

	@OneToMany(cascade = CascadeType.ALL, fetch = FetchType.LAZY, orphanRemoval = true)
	@JoinColumn(name = "INSURED_ID")
	private Set<Item> items;
}
<file_sep>/README.md
# spring-boot-training
spring boot training book (https://app.gitbook.com/@jakarta99/s/spring-boot/)
<file_sep>/src/main/java/tw/com/softleader/training/policy/entity/Item.java
package tw.com.softleader.training.policy.entity;

import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.GenerationType;
import javax.persistence.Id;
import javax.persistence.Table;

import lombok.Getter;
import lombok.Setter;

@Entity
@Table(name = "ITEM")
@Getter
@Setter
public class Item {

	@Id
	@GeneratedValue(strategy = GenerationType.IDENTITY)
	@Column(name = "ID")
	private Long id;

	@Column(name = "INSURED_ID")
	private Long insuredId;

	@Column(name = "CODE")
	private String code;

	@Column(name = "ITEM_LOCAL_NAME")
	private String itemLocalName;

	@Column(name = "AMOUNT")
	private Integer amount;

	@Column(name = "PREMIUM")
	private Integer premium;
}
<file_sep>/src/main/java/tw/com/softleader/training/policy/repository/PolicyRepository.java
package tw.com.softleader.training.policy.repository;

import java.util.List;

import org.springframework.data.jpa.repository.EntityGraph;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.data.jpa.repository.JpaSpecificationExecutor;

import tw.com.softleader.training.policy.entity.Policy;

public interface PolicyRepository extends JpaRepository<Policy, Long>, JpaSpecificationExecutor<Policy> {

	@EntityGraph(value = "policy.insureds")
	Policy findByPolicyNoAndEndstNo(String policyNo, int endstNo);

	//@EntityGraph(value = "policy.insureds", type = EntityGraphType.LOAD)
	//@EntityGraph(value = "policy.insureds", attributePaths = { "insureds","insureds.items" })
	List<Policy> findByApplicantIdno(String applicantIdno);

	List<Policy> findByApplicantLocalNameLike(String name);

	List<Policy> findByPolicyNo(String policyNo);

	List<Policy> findByPolicyNoAndApplicantLocalNameLike(String policyNo,
			String applicantLocalName);
}
<file_sep>/src/main/resources/application.properties
spring.banner.location=classpath:/banner.txt

logging.level.tw.com.softleader=DEBUG

spring.datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.datasource.url=jdbc:mysql://localhost:3306/trainingdb
spring.datasource.username=root
spring.datasource.password=<PASSWORD>
spring.datasource.hikari.auto-commit=false

spring.integration-datasource.driver-class-name=org.mariadb.jdbc.Driver
spring.integration-datasource.url=jdbc:mysql://localhost:3306/integrationdb
spring.integration-datasource.username=root
spring.integration-datasource.password=<PASSWORD>
spring.integration-datasource.hikari.auto-commit=false

spring.jpa.show-sql=true
spring.jpa.generate-ddl=true
spring.jpa.hibernate.ddl-auto=create<file_sep>/src/test/java/tw/com/softleader/training/policy/repository/PolicyRepositoryTest.java
package tw.com.softleader.training.policy.repository;

import java.util.List;

import org.junit.jupiter.api.BeforeAll;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.PageRequest;
import org.springframework.data.domain.Pageable;
import org.springframework.data.domain.Sort;
import org.springframework.transaction.annotation.Transactional;

import lombok.extern.slf4j.Slf4j;
import tw.com.softleader.training.policy.entity.Policy;

@Slf4j
@SpringBootTest
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
class PolicyRepositoryTest {

	@Autowired
	private PolicyRepository policyRepository;

	@BeforeAll
	void initAll() {
		Policy policy;
		for(int i = 1; i<=88; i++) {
			policy = new Policy();
			String policyNo = "00000"+i;
			policyNo = "9921ABC"+policyNo.substring(policyNo.length()-5);
			policy.setPolicyNo(policyNo);
			policy.setEndstNo(0);
			policy.setApplicantIdno("A123456789");
policy.setApplicantLocalName("王先生"); policyRepository.save(policy); } } @Test @Transactional public void testFindInPage() { Pageable pageRequest = PageRequest.of(0, 10, Sort.by("policyNo").descending()); Page<Policy> page = policyRepository.findAll(pageRequest); log.debug("{}", page); List<Policy> policies = page.getContent(); for(Policy policy:policies) { log.debug("{}", policy); } } // @BeforeAll // void initAll() { // Policy policy0 = new Policy(); // policy0.setPolicyNo("9921ABC00001"); // policy0.setEndstNo(0); // policy0.setApplicantIdno("A123456789"); // policy0.setApplicantLocalName("王先生"); // // // Insured insured0 = new Insured(); // insured0.setInsuredIndo("A176280531"); // insured0.setInsuredLocalName("王哥哥"); // // // Set<Item> items = new LinkedHashSet<Item>(); // Item item = new Item(); // item.setCode("AD"); // item.setItemLocalName("死殘"); // item.setAmount(1000000); // item.setPremium(120); // // items.add(item); // // item = new Item(); // item.setCode("MR"); // item.setItemLocalName("意外醫療"); // item.setAmount(100000); // item.setPremium(50); // // items.add(item); // // insured0.setItems(items); // // Insured insured1 = new Insured(); // insured1.setInsuredIndo("A176280577"); // insured1.setInsuredLocalName("王弟弟"); // // items = new LinkedHashSet<Item>(); // item = new Item(); // item.setCode("AD"); // item.setItemLocalName("死殘"); // item.setAmount(1000000); // item.setPremium(120); // // items.add(item); // // item = new Item(); // item.setCode("MR"); // item.setItemLocalName("意外醫療"); // item.setAmount(100000); // item.setPremium(50); // // items.add(item); // // insured1.setItems(items); // // // // Set<Insured> insureds = new LinkedHashSet<Insured>(); // insureds.add(insured0); // insureds.add(insured1); // policy0.setInsureds(insureds); // // policyRepository.save(policy0); // // // Policy policy1 = new Policy(); // policy1.setPolicyNo("9921ABC00002"); // policy1.setEndstNo(0); // policy1.setApplicantIdno("A111222333"); // 
policy1.setApplicantLocalName("王叔叔"); // // // insured0 = new Insured(); // insured0.setInsuredIndo("A111222333"); // insured0.setInsuredLocalName("王叔叔"); // // items = new LinkedHashSet<Item>(); // item = new Item(); // item.setCode("AD"); // item.setItemLocalName("死殘"); // item.setAmount(1000000); // item.setPremium(120); // // items.add(item); // // item = new Item(); // item.setCode("MR"); // item.setItemLocalName("意外醫療"); // item.setAmount(100000); // item.setPremium(50); // // items.add(item); // // insured0.setItems(items); // // insured1 = new Insured(); // insured1.setInsuredIndo("A222333555"); // insured1.setInsuredLocalName("王姐姐"); // // items = new LinkedHashSet<Item>(); // item = new Item(); // item.setCode("AD"); // item.setItemLocalName("死殘"); // item.setAmount(1000000); // item.setPremium(120); // // items.add(item); // // item = new Item(); // item.setCode("MR"); // item.setItemLocalName("意外醫療"); // item.setAmount(100000); // item.setPremium(50); // // items.add(item); // // insured1.setItems(items); // // insureds = new LinkedHashSet<Insured>(); // insureds.add(insured0); // insureds.add(insured1); // policy1.setInsureds(insureds); // // policyRepository.save(policy1); // // // } // // @Test // @Transactional // void testFindByPolicyNoAndEndstNo() { // Policy policy = policyRepository.findByPolicyNoAndEndstNo("9921ABC00001", 0); // assertEquals("A123456789", policy.getApplicantIdno()); // // log.info("Now we try to get insureds"); // Set<Insured> insureds = policy.getInsureds(); // for(Insured insured:insureds) { // log.info("{}", insured); // } // } // @Test // @Transactional // void testFindByApplicantIdno() { // List<Policy> policies = policyRepository.findByApplicantIdno("A123456789"); // for(Policy policy:policies) { // assertEquals("A123456789", policy.getApplicantIdno()); // // log.info("Now we try to get insureds"); // List<Insured> insureds = policy.getInsureds(); // for(Insured insured:insureds) { // log.info("{}", insured); // } // } // 
// // } // @Test // @Transactional // void testFindByGraph() { // Policy policy = policyRepository.findWithGraph(1L, "policy.insureds"); // // assertEquals("A123456789", policy.getApplicantIdno()); // // log.info("Now we try to get insureds"); // List<Insured> insureds = policy.getInsureds(); // for(Insured insured:insureds) { // log.info("{}", insured); // } // // // } // // @Test // void testFindByApplicantLocalNameLike() { // List<Policy> policies = policyRepository.findByApplicantLocalNameLike("王%"); // for(Policy policy:policies) { // assertEquals("王", policy.getApplicantLocalName().substring(0,1)); // } // } } <file_sep>/src/test/java/tw/com/softleader/training/policy/repository/TransPolicyFromPrimaryToIntegrationTest.java package tw.com.softleader.training.policy.repository; import java.util.ArrayList; import java.util.List; import org.junit.jupiter.api.BeforeAll; import org.junit.jupiter.api.Test; import org.junit.jupiter.api.TestInstance; import org.springframework.beans.BeanUtils; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.boot.test.context.SpringBootTest; import lombok.extern.slf4j.Slf4j; import tw.com.softleader.training.integration.entity.BackupPolicy; import tw.com.softleader.training.integration.repository.BackupPolicyRepository; import tw.com.softleader.training.policy.entity.Policy; @Slf4j @SpringBootTest @TestInstance(TestInstance.Lifecycle.PER_CLASS) public class TransPolicyFromPrimaryToIntegrationTest { @Autowired private PolicyRepository policyRepository; @Autowired private BackupPolicyRepository backupPolicyRepository; @BeforeAll void initAll() { Policy policy; for(int i = 1; i<=88; i++) { policy = new Policy(); String policyNo = "00000"+i; policyNo = "9921ABC"+policyNo.substring(policyNo.length()-5); policy.setPolicyNo(policyNo); policy.setEndstNo(0); policy.setApplicantIdno("A123456789"); policy.setApplicantLocalName("王先生"); policyRepository.save(policy); } } @Test public void trans() { 
List<Policy> policies = policyRepository.findAll(); List<BackupPolicy> backupPosPolicies = new ArrayList<>(); BackupPolicy backupPolicy ; for(Policy policy:policies) { backupPolicy = new BackupPolicy(); BeanUtils.copyProperties(policy, backupPolicy, "id"); backupPosPolicies.add(backupPolicy); } backupPolicyRepository.saveAll(backupPosPolicies); } }
<repo_name>wizzardmr42/LinnworksNetSDK<file_sep>/Linnworks/src/net/LinnworksAPI/Class_Update_PurchaseOrderItemParameter.cs
using System;

namespace LinnworksAPI
{
	public class Update_PurchaseOrderItemParameter
	{
		public Guid pkPurchaseItemId;

		public Guid pkPurchaseId;

		public Int32? Quantity;

		public Int32? PackQuantity;

		public Int32? PackSize;

		public Decimal? Cost;

		public Decimal? TaxRate;
	}
}<file_sep>/Linnworks/src/net/LinnworksAPI/Class_Get_PurchaseOrderResponse.cs
using System;
using System.Collections.Generic;

namespace LinnworksAPI
{
	public class Get_PurchaseOrderResponse
	{
		public PurchaseOrderHeader PurchaseOrderHeader;

		public List<PurchaseOrderItem> PurchaseOrderItem;

		public Int32 NoteCount;
	}
}<file_sep>/Linnworks/src/javascript/ImportExport.js
var ImportExport = {

	// http://apidoc.linnworks.net/#/ImportExport-RenameFTPFolder
	// NOTE: the generated signature repeated "server" (FTP host and API server), so the
	// FTP host was shadowed by the API server; the first parameter is renamed to ftpServer here.
	RenameFTPFolder: function(ftpServer, port, ssl, userName, password, passiveMode, path, newfolderName, token, server) {
		return Factory.GetResponse("ImportExport/RenameFTPFolder", token, server, "server=" + ftpServer + "&port=" + port + "&ssl=" + ssl + "&userName=" + userName + "&password=" + password + "&passiveMode=" + passiveMode + "&path=" + path + "&newfolderName=" + newfolderName + "");
	},

	// http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromFile
	GetColumnsFromFile: function(URL, delimiter, hasHeaders, escape, token, server) {
		return Factory.GetResponse("ImportExport/GetColumnsFromFile", token, server, "URL=" + URL + "&delimiter=" + JSON.stringify(delimiter) + "&hasHeaders=" + hasHeaders + "&escape=" + JSON.stringify(escape) + "");
	},

	// http://apidoc.linnworks.net/#/ImportExport-CheckFile
	CheckFile: function(URL, token, server) {
		return Factory.GetResponse("ImportExport/CheckFile", token, server, "URL=" + URL + "");
	},

	// http://apidoc.linnworks.net/#/ImportExport-GetImportTypes
	GetImportTypes: function(token, server) {
		return Factory.GetResponse("ImportExport/GetImportTypes", token, server, "");
	},

	//
http://apidoc.linnworks.net/#/ImportExport-SaveImport SaveImport: function(importConfig,feedType,feedJSON,token, server) { return Factory.GetResponse("ImportExport/SaveImport", token, server, "importConfig=" + JSON.stringify(importConfig) + "&feedType=" + feedType + "&feedJSON=" + feedJSON +""); }, // http://apidoc.linnworks.net/#/ImportExport-IsImportEnabled IsImportEnabled: function(id,token, server) { return Factory.GetResponse("ImportExport/IsImportEnabled", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-EnableImport EnableImport: function(importId,enable,token, server) { return Factory.GetResponse("ImportExport/EnableImport", token, server, "importId=" + importId + "&enable=" + enable +""); }, // http://apidoc.linnworks.net/#/ImportExport-RunNowImport RunNowImport: function(importId,token, server) { return Factory.GetResponse("ImportExport/RunNowImport", token, server, "importId=" + importId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImport GetImport: function(id,token, server) { return Factory.GetResponse("ImportExport/GetImport", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImportIdByName GetImportIdByName: function(friendlyName,token, server) { return Factory.GetResponse("ImportExport/GetImportIdByName", token, server, "friendlyName=" + friendlyName +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteImport DeleteImport: function(id,token, server) { return Factory.GetResponse("ImportExport/DeleteImport", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExecutingOrQueuedImports GetExecutingOrQueuedImports: function(token, server) { return Factory.GetResponse("ImportExport/GetExecutingOrQueuedImports", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImportSessionErrors GetImportSessionErrors: function(sessionId,pageNumber,entriesPerPage,token, server) { return 
Factory.GetResponse("ImportExport/GetImportSessionErrors", token, server, "sessionId=" + sessionId + "&pageNumber=" + pageNumber + "&entriesPerPage=" + entriesPerPage +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImportSessions GetImportSessions: function(importId,pageNumber,entriesPerPage,token, server) { return Factory.GetResponse("ImportExport/GetImportSessions", token, server, "importId=" + importId + "&pageNumber=" + pageNumber + "&entriesPerPage=" + entriesPerPage +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImports GetImports: function(token, server) { return Factory.GetResponse("ImportExport/GetImports", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImportRegister GetImportRegister: function(id,token, server) { return Factory.GetResponse("ImportExport/GetImportRegister", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-DownloadImportedFile DownloadImportedFile: function(fileId,token, server) { return Factory.GetResponse("ImportExport/DownloadImportedFile", token, server, "fileId=" + fileId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetImportListOfValues GetImportListOfValues: function(importType,columnName,additionalFieldName,token, server) { return Factory.GetResponse("ImportExport/GetImportListOfValues", token, server, "importType=" + JSON.stringify(importType) + "&columnName=" + columnName + "&additionalFieldName=" + additionalFieldName +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetFullfilmentCenterSettings GetFullfilmentCenterSettings: function(fkStockLocationId,token, server) { return Factory.GetResponse("ImportExport/GetFullfilmentCenterSettings", token, server, "fkStockLocationId=" + fkStockLocationId +""); }, // http://apidoc.linnworks.net/#/ImportExport-SaveOrdersExportId SaveOrdersExportId: function(fkStockLocationId,fkOrdersExportId,token, server) { return Factory.GetResponse("ImportExport/SaveOrdersExportId", token, server, 
"fkStockLocationId=" + fkStockLocationId + "&fkOrdersExportId=" + fkOrdersExportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-SaveOrdersImportId SaveOrdersImportId: function(fkStockLocationId,fkOrdersImportId,token, server) { return Factory.GetResponse("ImportExport/SaveOrdersImportId", token, server, "fkStockLocationId=" + fkStockLocationId + "&fkOrdersImportId=" + fkOrdersImportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-SaveInventoryImportId SaveInventoryImportId: function(fkStockLocationId,fkInventoryImportId,token, server) { return Factory.GetResponse("ImportExport/SaveInventoryImportId", token, server, "fkStockLocationId=" + fkStockLocationId + "&fkInventoryImportId=" + fkInventoryImportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteOrdersExportId DeleteOrdersExportId: function(fkStockLocationId,token, server) { return Factory.GetResponse("ImportExport/DeleteOrdersExportId", token, server, "fkStockLocationId=" + fkStockLocationId +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteOrdersImportId DeleteOrdersImportId: function(fkStockLocationId,token, server) { return Factory.GetResponse("ImportExport/DeleteOrdersImportId", token, server, "fkStockLocationId=" + fkStockLocationId +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteInventoryImportId DeleteInventoryImportId: function(fkStockLocationId,token, server) { return Factory.GetResponse("ImportExport/DeleteInventoryImportId", token, server, "fkStockLocationId=" + fkStockLocationId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetFulfilmentCenterNameByOrdersExportId GetFulfilmentCenterNameByOrdersExportId: function(fkOrdersExportId,token, server) { return Factory.GetResponse("ImportExport/GetFulfilmentCenterNameByOrdersExportId", token, server, "fkOrdersExportId=" + fkOrdersExportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetFulfilmentCenterNameByOrdersImportId GetFulfilmentCenterNameByOrdersImportId: 
function(fkOrdersImportId,token, server) { return Factory.GetResponse("ImportExport/GetFulfilmentCenterNameByOrdersImportId", token, server, "fkOrdersImportId=" + fkOrdersImportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetFulfilmentCenterNameByInventoryImportId GetFulfilmentCenterNameByInventoryImportId: function(fkInventoryImportId,token, server) { return Factory.GetResponse("ImportExport/GetFulfilmentCenterNameByInventoryImportId", token, server, "fkInventoryImportId=" + fkInventoryImportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportIdByName GetExportIdByName: function(friendlyName,token, server) { return Factory.GetResponse("ImportExport/GetExportIdByName", token, server, "friendlyName=" + friendlyName +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportTypes GetExportTypes: function(token, server) { return Factory.GetResponse("ImportExport/GetExportTypes", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportSubQuerySelectionFieldValues GetExportSubQuerySelectionFieldValues: function(exportType,subQueryName,selectionFieldName,token, server) { return Factory.GetResponse("ImportExport/GetExportSubQuerySelectionFieldValues", token, server, "exportType=" + JSON.stringify(exportType) + "&subQueryName=" + subQueryName + "&selectionFieldName=" + selectionFieldName +""); }, // http://apidoc.linnworks.net/#/ImportExport-SaveExport SaveExport: function(exportConfig,feedType,feedJSON,token, server) { return Factory.GetResponse("ImportExport/SaveExport", token, server, "exportConfig=" + JSON.stringify(exportConfig) + "&feedType=" + feedType + "&feedJSON=" + feedJSON +""); }, // http://apidoc.linnworks.net/#/ImportExport-IsExportEnabled IsExportEnabled: function(id,token, server) { return Factory.GetResponse("ImportExport/IsExportEnabled", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-EnableExport EnableExport: function(exportId,enable,token, server) { return 
Factory.GetResponse("ImportExport/EnableExport", token, server, "exportId=" + exportId + "&enable=" + enable +""); }, // http://apidoc.linnworks.net/#/ImportExport-RunNowExport RunNowExport: function(exportId,token, server) { return Factory.GetResponse("ImportExport/RunNowExport", token, server, "exportId=" + exportId +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExport GetExport: function(id,token, server) { return Factory.GetResponse("ImportExport/GetExport", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteExport DeleteExport: function(id,token, server) { return Factory.GetResponse("ImportExport/DeleteExport", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExecutingOrQueuedExports GetExecutingOrQueuedExports: function(token, server) { return Factory.GetResponse("ImportExport/GetExecutingOrQueuedExports", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportSessionErrors GetExportSessionErrors: function(sessionId,pageNumber,entriesPerPage,token, server) { return Factory.GetResponse("ImportExport/GetExportSessionErrors", token, server, "sessionId=" + sessionId + "&pageNumber=" + pageNumber + "&entriesPerPage=" + entriesPerPage +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportSessions GetExportSessions: function(exportId,pageNumber,entriesPerPage,token, server) { return Factory.GetResponse("ImportExport/GetExportSessions", token, server, "exportId=" + exportId + "&pageNumber=" + pageNumber + "&entriesPerPage=" + entriesPerPage +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExports GetExports: function(token, server) { return Factory.GetResponse("ImportExport/GetExports", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-GetExportRegister GetExportRegister: function(id,token, server) { return Factory.GetResponse("ImportExport/GetExportRegister", token, server, "id=" + id +""); }, // 
http://apidoc.linnworks.net/#/ImportExport-GetSQLColumns GetSQLColumns: function(sqlQuery,token, server) { return Factory.GetResponse("ImportExport/GetSQLColumns", token, server, "sqlQuery=" + sqlQuery +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetDropboxAccounts GetDropboxAccounts: function(token, server) { return Factory.GetResponse("ImportExport/GetDropboxAccounts", token, server, ""); }, // http://apidoc.linnworks.net/#/ImportExport-SetDropboxAccounts SetDropboxAccounts: function(accounts,token, server) { return Factory.GetResponse("ImportExport/SetDropboxAccounts", token, server, "accounts=" + JSON.stringify(accounts) +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromUploadedFile GetColumnsFromUploadedFile: function(fileId,delimiter,comment,hasHeaders,escape,quote,token, server) { return Factory.GetResponse("ImportExport/GetColumnsFromUploadedFile", token, server, "fileId=" + fileId + "&delimiter=" + delimiter + "&comment=" + comment + "&hasHeaders=" + hasHeaders + "&escape=" + escape + "&quote=" + quote +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckHTTPFile CheckHTTPFile: function(URL,token, server) { return Factory.GetResponse("ImportExport/CheckHTTPFile", token, server, "URL=" + URL +""); }, // http://apidoc.linnworks.net/#/ImportExport-EvalExpression EvalExpression: function(expression,token, server) { return Factory.GetResponse("ImportExport/EvalExpression", token, server, "expression=" + expression +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromDropboxFile GetColumnsFromDropboxFile: function(dropboxToken,filePath,fileName,delimiter,comment,hasHeaders,escape,quote,token, server) { return Factory.GetResponse("ImportExport/GetColumnsFromDropboxFile", token, server, "token=" + dropboxToken + "&filePath=" + filePath + "&fileName=" + fileName + "&delimiter=" + delimiter + "&comment=" + comment + "&hasHeaders=" + hasHeaders + "&escape=" + escape + "&quote=" + quote +""); }, // 
http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromHTTPFile GetColumnsFromHTTPFile: function(URL,delimiter,comment,hasHeaders,escape,quote,token, server) { return Factory.GetResponse("ImportExport/GetColumnsFromHTTPFile", token, server, "URL=" + URL + "&delimiter=" + delimiter + "&comment=" + comment + "&hasHeaders=" + hasHeaders + "&escape=" + escape + "&quote=" + quote +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckFTPFile CheckFTPFile: function(ftpServer,port,filePath,fileName,SSL,passiveMode,protocol,userName,password,postDownload,ftpMoveToFolder,token, server) { return Factory.GetResponse("ImportExport/CheckFTPFile", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&fileName=" + fileName + "&SSL=" + SSL + "&passiveMode=" + passiveMode + "&protocol=" + JSON.stringify(protocol) + "&userName=" + userName + "&password=" + password + "&postDownload=" + JSON.stringify(postDownload) + "&ftpMoveToFolder=" + ftpMoveToFolder +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckSFTPFile CheckSFTPFile: function(ftpServer,port,filePath,fileName,compression,userName,password,postDownload,ftpMoveToFolder,token, server) { return Factory.GetResponse("ImportExport/CheckSFTPFile", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&fileName=" + fileName + "&compression=" + compression + "&userName=" + userName + "&password=" + password + "&postDownload=" + JSON.stringify(postDownload) + "&ftpMoveToFolder=" + ftpMoveToFolder +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckDropboxFile CheckDropboxFile: function(dropboxToken,filePath,fileName,token, server) { return Factory.GetResponse("ImportExport/CheckDropboxFile", token, server, "token=" + dropboxToken + "&filePath=" + filePath + "&fileName=" + fileName +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromFTPFile GetColumnsFromFTPFile: 
function(ftpServer,port,filePath,fileName,SSL,passiveMode,protocol,userName,password,delimiter,hasHeaders,escape,token, server) { return Factory.GetResponse("ImportExport/GetColumnsFromFTPFile", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&fileName=" + fileName + "&SSL=" + SSL + "&passiveMode=" + passiveMode + "&protocol=" + JSON.stringify(protocol) + "&userName=" + userName + "&password=" + password + "&delimiter=" + delimiter + "&hasHeaders=" + hasHeaders + "&escape=" + escape +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetColumnsFromSFTPFile GetColumnsFromSFTPFile: function(ftpServer,port,filePath,fileName,compression,userName,password,delimiter,hasHeaders,escape,token, server) { return Factory.GetResponse("ImportExport/GetColumnsFromSFTPFile", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&fileName=" + fileName + "&compression=" + compression + "&userName=" + userName + "&password=" + password + "&delimiter=" + delimiter + "&hasHeaders=" + hasHeaders + "&escape=" + escape +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckFTPConnection CheckFTPConnection: function(ftpServer,port,filePath,SSL,passiveMode,protocol,userName,password,token, server) { return Factory.GetResponse("ImportExport/CheckFTPConnection", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&SSL=" + SSL + "&passiveMode=" + passiveMode + "&protocol=" + JSON.stringify(protocol) + "&userName=" + userName + "&password=" + password +""); }, // http://apidoc.linnworks.net/#/ImportExport-CheckSFTPConnection CheckSFTPConnection: function(ftpServer,port,filePath,compression,userName,password,token, server) { return Factory.GetResponse("ImportExport/CheckSFTPConnection", token, server, "server=" + ftpServer + "&port=" + port + "&filePath=" + filePath + "&compression=" + compression + "&userName=" + userName + "&password=" + password +""); }, // 
http://apidoc.linnworks.net/#/ImportExport-GetFTPFolderContent GetFTPFolderContent: function(ftpServer,port,ssl,userName,password,passiveMode,path,token, server) { return Factory.GetResponse("ImportExport/GetFTPFolderContent", token, server, "server=" + ftpServer + "&port=" + port + "&ssl=" + ssl + "&userName=" + userName + "&password=" + password + "&passiveMode=" + passiveMode + "&path=" + path +""); }, // http://apidoc.linnworks.net/#/ImportExport-GetSFTPFolderContent GetSFTPFolderContent: function(ftpServer,port,userName,password,compression,path,token, server) { return Factory.GetResponse("ImportExport/GetSFTPFolderContent", token, server, "server=" + ftpServer + "&port=" + port + "&userName=" + userName + "&password=" + password + "&compression=" + compression + "&path=" + path +""); }, // http://apidoc.linnworks.net/#/ImportExport-CreateFTPFolder CreateFTPFolder: function(ftpServer,port,ssl,userName,password,passiveMode,path,folderName,token, server) { return Factory.GetResponse("ImportExport/CreateFTPFolder", token, server, "server=" + ftpServer + "&port=" + port + "&ssl=" + ssl + "&userName=" + userName + "&password=" + password + "&passiveMode=" + passiveMode + "&path=" + path + "&folderName=" + folderName +""); }, // http://apidoc.linnworks.net/#/ImportExport-DeleteFTPFile DeleteFTPFile: function(ftpServer,port,ssl,userName,password,passiveMode,path,token, server) { return Factory.GetResponse("ImportExport/DeleteFTPFile", token, server, "server=" + ftpServer + "&port=" + port + "&ssl=" + ssl + "&userName=" + userName + "&password=" + password + "&passiveMode=" + passiveMode + "&path=" + path +""); }, }; <file_sep>/Linnworks/src/javascript/Inventory.js var Inventory = { // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemPrices CreateInventoryItemPrices: function(inventoryItemPrices,token, server) { return Factory.GetResponse("Inventory/CreateInventoryItemPrices", token, server, "inventoryItemPrices=" + JSON.stringify(inventoryItemPrices) +""); }, // 
http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemPrices UpdateInventoryItemPrices: function(inventoryItemPrices,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemPrices", token, server, "inventoryItemPrices=" + JSON.stringify(inventoryItemPrices) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemPrices DeleteInventoryItemPrices: function(inventoryItemPriceIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemPrices", token, server, "inventoryItemPriceIds=" + JSON.stringify(inventoryItemPriceIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemDescriptions GetInventoryItemDescriptions: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemDescriptions", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemDescriptions CreateInventoryItemDescriptions: function(inventoryItemDescriptions,token, server) { return Factory.GetResponse("Inventory/CreateInventoryItemDescriptions", token, server, "inventoryItemDescriptions=" + JSON.stringify(inventoryItemDescriptions) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemDescriptions UpdateInventoryItemDescriptions: function(inventoryItemDescriptions,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemDescriptions", token, server, "inventoryItemDescriptions=" + JSON.stringify(inventoryItemDescriptions) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemDescriptions DeleteInventoryItemDescriptions: function(inventoryItemDescriptionIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemDescriptions", token, server, "inventoryItemDescriptionIds=" + JSON.stringify(inventoryItemDescriptionIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetChannels GetChannels: function(token, server) { return 
Factory.GetResponse("Inventory/GetChannels", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetExtendedPropertyNames GetExtendedPropertyNames: function(token, server) { return Factory.GetResponse("Inventory/GetExtendedPropertyNames", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetExtendedPropertyTypes GetExtendedPropertyTypes: function(token, server) { return Factory.GetResponse("Inventory/GetExtendedPropertyTypes", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetSystemPropertyNames GetSystemPropertyNames: function(token, server) { return Factory.GetResponse("Inventory/GetSystemPropertyNames", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetPackageGroups GetPackageGroups: function(token, server) { return Factory.GetResponse("Inventory/GetPackageGroups", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetPostalServices GetPostalServices: function(token, server) { return Factory.GetResponse("Inventory/GetPostalServices", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItems GetInventoryItems: function(view,stockLocationIds,startIndex,itemsCount,token, server) { return Factory.GetResponse("Inventory/GetInventoryItems", token, server, "view=" + JSON.stringify(view) + "&stockLocationIds=" + JSON.stringify(stockLocationIds) + "&startIndex=" + startIndex + "&itemsCount=" + itemsCount +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemField UpdateInventoryItemField: function(inventoryItemId,fieldName,fieldValue,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemField", token, server, "inventoryItemId=" + inventoryItemId + "&fieldName=" + fieldName + "&fieldValue=" + fieldValue +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemStockField UpdateInventoryItemStockField: function(inventoryItemId,fieldName,fieldValue,locationId,token, server) { return 
Factory.GetResponse("Inventory/UpdateInventoryItemStockField", token, server, "inventoryItemId=" + inventoryItemId + "&fieldName=" + fieldName + "&fieldValue=" + fieldValue + "&locationId=" + locationId +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemLocationField UpdateInventoryItemLocationField: function(inventoryItemId,fieldName,fieldValue,locationId,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemLocationField", token, server, "inventoryItemId=" + inventoryItemId + "&fieldName=" + fieldName + "&fieldValue=" + fieldValue + "&locationId=" + locationId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemById GetInventoryItemById: function(id,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemById", token, server, "id=" + id +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryViews GetInventoryViews: function(token, server) { return Factory.GetResponse("Inventory/GetInventoryViews", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryFilterTypes GetInventoryFilterTypes: function(token, server) { return Factory.GetResponse("Inventory/GetInventoryFilterTypes", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryColumnTypes GetInventoryColumnTypes: function(token, server) { return Factory.GetResponse("Inventory/GetInventoryColumnTypes", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetNewInventoryView GetNewInventoryView: function(token, server) { return Factory.GetResponse("Inventory/GetNewInventoryView", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryViews UpdateInventoryViews: function(views,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryViews", token, server, "views=" + JSON.stringify(views) +""); }, // http://apidoc.linnworks.net/#/Inventory-AddInventoryItem AddInventoryItem: function(inventoryItem,token, server) { return 
Factory.GetResponse("Inventory/AddInventoryItem", token, server, "inventoryItem=" + JSON.stringify(inventoryItem) +""); }, // http://apidoc.linnworks.net/#/Inventory-DuplicateInventoryItem DuplicateInventoryItem: function(inventoryItem,sourceItemId,token, server) { return Factory.GetResponse("Inventory/DuplicateInventoryItem", token, server, "inventoryItem=" + JSON.stringify(inventoryItem) + "&sourceItemId=" + sourceItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetNewItemNumber GetNewItemNumber: function(token, server) { return Factory.GetResponse("Inventory/GetNewItemNumber", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItem UpdateInventoryItem: function(inventoryItem,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItem", token, server, "inventoryItem=" + JSON.stringify(inventoryItem) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItems DeleteInventoryItems: function(inventoryItemIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItems", token, server, "inventoryItemIds=" + JSON.stringify(inventoryItemIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-AdjustTemplatesInstant AdjustTemplatesInstant: function(inventoryItemIds,source,subSource,adjustmentOptions,token, server) { return Factory.GetResponse("Inventory/AdjustTemplatesInstant", token, server, "inventoryItemIds=" + JSON.stringify(inventoryItemIds) + "&source=" + source + "&subSource=" + subSource + "&adjustmentOptions=" + JSON.stringify(adjustmentOptions) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemChannelSKUs GetInventoryItemChannelSKUs: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemChannelSKUs", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemChannelSKUs CreateInventoryItemChannelSKUs: function(inventoryItemChannelSKUs,token, server) { 
return Factory.GetResponse("Inventory/CreateInventoryItemChannelSKUs", token, server, "inventoryItemChannelSKUs=" + JSON.stringify(inventoryItemChannelSKUs) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemChannelSKUs DeleteInventoryItemChannelSKUs: function(inventoryItemChannelSKUIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemChannelSKUs", token, server, "inventoryItemChannelSKUIds=" + JSON.stringify(inventoryItemChannelSKUIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-UnlinkChannelListing UnlinkChannelListing: function(channelRefId,source,subSource,token, server) { return Factory.GetResponse("Inventory/UnlinkChannelListing", token, server, "channelRefId=" + channelRefId + "&source=" + source + "&subSource=" + subSource +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemAuditTrail GetInventoryItemAuditTrail: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemAuditTrail", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemImages GetInventoryItemImages: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemImages", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemImagesForTemplates GetInventoryItemImagesForTemplates: function(inventoryItemIds,source,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemImagesForTemplates", token, server, "inventoryItemIds=" + JSON.stringify(inventoryItemIds) + "&source=" + source +""); }, // http://apidoc.linnworks.net/#/Inventory-SetInventoryItemImageAsMain SetInventoryItemImageAsMain: function(inventoryItemId,mainImageId,token, server) { return Factory.GetResponse("Inventory/SetInventoryItemImageAsMain", token, server, "inventoryItemId=" + inventoryItemId + "&mainImageId=" + mainImageId +""); }, // 
http://apidoc.linnworks.net/#/Inventory-UploadImagesToInventoryItem UploadImagesToInventoryItem: function(inventoryItemId,imageIds,token, server) { return Factory.GetResponse("Inventory/UploadImagesToInventoryItem", token, server, "inventoryItemId=" + inventoryItemId + "&imageIds=" + JSON.stringify(imageIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteImagesFromInventoryItem DeleteImagesFromInventoryItem: function(imageURL,inventoryItemId,token, server) { return Factory.GetResponse("Inventory/DeleteImagesFromInventoryItem", token, server, "imageURL=" + imageURL + "&inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetCategories GetCategories: function(token, server) { return Factory.GetResponse("Inventory/GetCategories", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-CreateCategory CreateCategory: function(categoryName,token, server) { return Factory.GetResponse("Inventory/CreateCategory", token, server, "categoryName=" + categoryName +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateCategory UpdateCategory: function(category,token, server) { return Factory.GetResponse("Inventory/UpdateCategory", token, server, "category=" + JSON.stringify(category) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteCategoryById DeleteCategoryById: function(categoryId,token, server) { return Factory.GetResponse("Inventory/DeleteCategoryById", token, server, "categoryId=" + categoryId +""); }, // http://apidoc.linnworks.net/#/Inventory-GetCountryCodes GetCountryCodes: function(token, server) { return Factory.GetResponse("Inventory/GetCountryCodes", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetCountries GetCountries: function(token, server) { return Factory.GetResponse("Inventory/GetCountries", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-CreateCountries CreateCountries: function(countries,token, server) { return 
Factory.GetResponse("Inventory/CreateCountries", token, server, "countries=" + JSON.stringify(countries) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateCountries UpdateCountries: function(countries,token, server) { return Factory.GetResponse("Inventory/UpdateCountries", token, server, "countries=" + JSON.stringify(countries) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteCountries DeleteCountries: function(countries,token, server) { return Factory.GetResponse("Inventory/DeleteCountries", token, server, "countries=" + JSON.stringify(countries) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetStockLocations GetStockLocations: function(token, server) { return Factory.GetResponse("Inventory/GetStockLocations", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemLocations GetInventoryItemLocations: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemLocations", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-AddItemLocations AddItemLocations: function(itemLocations,token, server) { return Factory.GetResponse("Inventory/AddItemLocations", token, server, "itemLocations=" + JSON.stringify(itemLocations) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateItemLocations UpdateItemLocations: function(itemLocations,token, server) { return Factory.GetResponse("Inventory/UpdateItemLocations", token, server, "itemLocations=" + JSON.stringify(itemLocations) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteItemLocations DeleteItemLocations: function(inventoryItemId,itemLocations,token, server) { return Factory.GetResponse("Inventory/DeleteItemLocations", token, server, "inventoryItemId=" + inventoryItemId + "&itemLocations=" + JSON.stringify(itemLocations) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemCompositions GetInventoryItemCompositions: function(inventoryItemId,token, server) { 
return Factory.GetResponse("Inventory/GetInventoryItemCompositions", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemCompositions CreateInventoryItemCompositions: function(inventoryItemCompositions,token, server) { return Factory.GetResponse("Inventory/CreateInventoryItemCompositions", token, server, "inventoryItemCompositions=" + JSON.stringify(inventoryItemCompositions) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemCompositions UpdateInventoryItemCompositions: function(inventoryItemCompositions,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemCompositions", token, server, "inventoryItemCompositions=" + JSON.stringify(inventoryItemCompositions) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemCompositions DeleteInventoryItemCompositions: function(stockItemId,inventoryItemCompositionIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemCompositions", token, server, "stockItemId=" + stockItemId + "&inventoryItemCompositionIds=" + JSON.stringify(inventoryItemCompositionIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemExtendedProperties GetInventoryItemExtendedProperties: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemExtendedProperties", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemExtendedProperties CreateInventoryItemExtendedProperties: function(inventoryItemExtendedProperties,token, server) { return Factory.GetResponse("Inventory/CreateInventoryItemExtendedProperties", token, server, "inventoryItemExtendedProperties=" + JSON.stringify(inventoryItemExtendedProperties) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemExtendedProperties UpdateInventoryItemExtendedProperties: function(inventoryItemExtendedProperties,token, server) 
{ return Factory.GetResponse("Inventory/UpdateInventoryItemExtendedProperties", token, server, "inventoryItemExtendedProperties=" + JSON.stringify(inventoryItemExtendedProperties) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemExtendedProperties DeleteInventoryItemExtendedProperties: function(inventoryItemId,inventoryItemExtendedPropertyIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemExtendedProperties", token, server, "inventoryItemId=" + inventoryItemId + "&inventoryItemExtendedPropertyIds=" + JSON.stringify(inventoryItemExtendedPropertyIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemTitles GetInventoryItemTitles: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemTitles", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateInventoryItemTitles CreateInventoryItemTitles: function(inventoryItemTitles,token, server) { return Factory.GetResponse("Inventory/CreateInventoryItemTitles", token, server, "inventoryItemTitles=" + JSON.stringify(inventoryItemTitles) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateInventoryItemTitles UpdateInventoryItemTitles: function(inventoryItemTitles,token, server) { return Factory.GetResponse("Inventory/UpdateInventoryItemTitles", token, server, "inventoryItemTitles=" + JSON.stringify(inventoryItemTitles) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteInventoryItemTitles DeleteInventoryItemTitles: function(inventoryItemTitleIds,token, server) { return Factory.GetResponse("Inventory/DeleteInventoryItemTitles", token, server, "inventoryItemTitleIds=" + JSON.stringify(inventoryItemTitleIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetSupplierList GetSupplierList: function(token, server) { return Factory.GetResponse("Inventory/GetSupplierList", token, server, ""); }, // http://apidoc.linnworks.net/#/Inventory-GetStockSupplierStat 
GetStockSupplierStat: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetStockSupplierStat", token, server, "inventoryItemId=" + inventoryItemId +""); }, // http://apidoc.linnworks.net/#/Inventory-CreateStockSupplierStat CreateStockSupplierStat: function(itemSuppliers,token, server) { return Factory.GetResponse("Inventory/CreateStockSupplierStat", token, server, "itemSuppliers=" + JSON.stringify(itemSuppliers) +""); }, // http://apidoc.linnworks.net/#/Inventory-UpdateStockSupplierStat UpdateStockSupplierStat: function(itemSuppliers,token, server) { return Factory.GetResponse("Inventory/UpdateStockSupplierStat", token, server, "itemSuppliers=" + JSON.stringify(itemSuppliers) +""); }, // http://apidoc.linnworks.net/#/Inventory-DeleteStockSupplierStat DeleteStockSupplierStat: function(stockItemId,itemSupplierIds,token, server) { return Factory.GetResponse("Inventory/DeleteStockSupplierStat", token, server, "stockItemId=" + stockItemId + "&itemSupplierIds=" + JSON.stringify(itemSupplierIds) +""); }, // http://apidoc.linnworks.net/#/Inventory-GetInventoryItemPrices GetInventoryItemPrices: function(inventoryItemId,token, server) { return Factory.GetResponse("Inventory/GetInventoryItemPrices", token, server, "inventoryItemId=" + inventoryItemId +""); }, }; <file_sep>/Linnworks/src/net/LinnworksAPI/LinnworksException.cs using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace LinnworksAPI { public class LinnworksAPIException : System.Exception { public LinnworksAPIException(string message, string code, Exception innerException):base(message,innerException) { _Code = code; } private string _Code; public string Code { get { return _Code; } } } } <file_sep>/Linnworks/src/net/LinnworksAPI/Class_Error.cs using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace LinnworksAPI { public class Error { public string Code; public string Message; } } 
<file_sep>/Linnworks/src/net/LinnworksAPI/Class_GetOrderPackagingCalculationRequest.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class GetOrderPackagingCalculationRequest { public List<Guid> pkOrderIds; public Boolean Recalculate; public Boolean SaveRecalculation; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_OrderItem.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class OrderItem { public Guid OrderId; public Guid ItemId; public Guid StockItemId; public String ItemNumber; public String SKU; public String ItemSource; public String Title; public Int32 Quantity; public String CategoryName; public Int32? CompositeAvailablity; public Guid RowId; public Boolean StockLevelsSpecified; public Int32 OnOrder; public Int32? InOrderBook; public Int32 Level; public Int32? MinimumLevel; public Int32 AvailableStock; public Double PricePerUnit; public Double UnitCost; public Double Discount; public Double Tax; public Double TaxRate; public Double Cost; public Double CostIncTax; public List<OrderItem> CompositeSubItems; public Boolean IsService; public Double SalesTax; public Boolean TaxCostInclusive; public Boolean PartShipped; public Double Weight; public String BarcodeNumber; public Int32 Market; public String ChannelSKU; public String ChannelTitle; public Boolean HasImage; public Guid? ImageId; public List<OrderItemOption> AdditionalInfo; public Int32 StockLevelIndicator; public String BinRack; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_InventoryItem.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class InventoryItem { public Guid Id; public String SKU; public String Title; public Double? RetailPrice; public Double? 
PurchasePrice; public String Barcode; public Int32 Available; public Int32 MinimumLevel; public Int32 InOrder; public Int32 StockLevel; public Double StockValue; public Int32 Due; public Boolean Tracked; public String BinRack; public Guid Category; public Boolean IsComposite; public Boolean IsArchived; public String Image; public DateTime? CreatedDate; public DateTime? ModifiedDate; public String VariationGroupName; public List<InventoryItem> Products; public Dictionary<String, ChannelDetails> Channels; public Int32 TotalChangedProducts; public Boolean ContainsChanges; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_SearchField.cs using System; namespace LinnworksAPI { public class SearchField { public String Field; public String Name; public Boolean AllowForAllDates; public Boolean ExactSearchOptional; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_Search_PurchaseOrdersResult.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class Search_PurchaseOrdersResult { public List<PurchaseOrderHeader> Result; public Int32 TotalPages; public Int32 CurrentPageNumber; public Int32 EntriesPerPage; public Int32 TotalNumberOfRecords; } }<file_sep>/Linnworks/src/javascript/Auth.js var Auth = { // http://apidoc.linnworks.net/#/Auth-GetDebugInformation GetDebugInformation: function(key,password) { return Factory.GetResponse("Auth/GetDebugInformation", "", "", "key=" + key + "&password=" + password +""); }, // http://apidoc.linnworks.net/#/Auth-MultiLogin MultiLogin: function(userName,password) { return Factory.GetResponse("Auth/MultiLogin", "", "", "userName=" + userName + "&password=" + password +""); }, // http://apidoc.linnworks.net/#/Auth-Authorize Authorize: function(userName,password,userId) { return Factory.GetResponse("Auth/Authorize", "", "", "userName=" + userName + "&password=" + password + "&userId=" + userId +""); }, // http://apidoc.linnworks.net/#/Auth-ResetPassword ResetPassword: 
function(userName,resetToken,newPassword,confirmNewPassword) { return Factory.GetResponse("Auth/ResetPassword", "", "", "userName=" + userName + "&resetToken=" + resetToken + "&newPassword=" + <PASSWORD> + "&confirmNewPassword=" + <PASSWORD> +""); }, // http://apidoc.linnworks.net/#/Auth-ResetPasswordRequest ResetPasswordRequest: function(userName) { return Factory.GetResponse("Auth/ResetPasswordRequest", "", "", "userName=" + userName +""); }, // http://apidoc.linnworks.net/#/Auth-AuthorizeByApplication AuthorizeByApplication: function(applicationId,applicationSecret,token) { return Factory.GetResponse("Auth/AuthorizeByApplication", "", "", "applicationId=" + applicationId + "&applicationSecret=" + applicationSecret + "&token=" + token +""); }, // http://apidoc.linnworks.net/#/Auth-GetServerUTCTime GetServerUTCTime: function() { return Factory.GetResponse("Auth/GetServerUTCTime", "", "", ""); }, }; <file_sep>/Linnworks/src/net/LinnworksAPI/Class_ConfigPropertySelectionList.cs using System; namespace LinnworksAPI { public class ConfigPropertySelectionList<SelectStringValueOption, Guid> { public GetSelectionList<SelectStringValueOption> OnGetSelectionList; public Boolean Loaded; public Int32 pkPropertyId; public Boolean IsChanged; public Guid PropertyValue; public String PropertyType; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_LinePricingRequest.cs using System; namespace LinnworksAPI { public class LinePricingRequest { public Double PricePerUnit; public Double DiscountPercentage; public Double TaxRatePercentage; public Boolean TaxInclusive; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_RefundInfo.cs using System; namespace LinnworksAPI { public class RefundInfo { public Guid? pkRefundRowId; public String SKU; public String ItemTitle; public Boolean IsItem; public Boolean IsService; public Double Amount; public String Reason; public Boolean Actioned; public DateTime? ActionDate; public String ReturnReference; public Double? 
Cost; public RefundStatus RefundStatus; public Boolean IgnoredValidation; public Guid? fkOrderItemRowId; public Boolean ShouldSerializeChannelReason; public String ChannelReason; public Boolean ShouldSerializeChannelReasonSec; public String ChannelReasonSec; public Boolean IsNew; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_Add_PurchaseOrderItemParameter.cs using System; namespace LinnworksAPI { public class Add_PurchaseOrderItemParameter { public Guid pkPurchaseId; public Guid fkStockItemId; public Int32 Qty; public Int32 PackQuantity; public Int32 PackSize; public Decimal Cost; public Decimal TaxRate; } }<file_sep>/Linnworks/src/net/LinnworksAPI/ClientConfig.cs using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace LinnworksAPI { public static class ClientConfig { private static bool _ThrowExceptions; public static bool ThrowExceptions { get { return _ThrowExceptions; } set { _ThrowExceptions = value; } } } } <file_sep>/Linnworks/src/net/LinnworksAPI/Class_BooleanFilter.cs using System; namespace LinnworksAPI { public class BooleanFilter { public Boolean? Value; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_StockItemImageSimple.cs using System; namespace LinnworksAPI { public class StockItemImageSimple { public Guid pkRowId; public Boolean IsMain; public Int32 SortOrder; public Guid StockItemId; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_BookedReturnsExchangeItem.cs using System; namespace LinnworksAPI { public class BookedReturnsExchangeItem { public Guid fkOrderItemRowId; public String RowType; public String SKU; public String ItemTitle; public Int32 ReturnQty; public Int32 MaxReturnQty; public Int32? NewQty; public String NewSKU; public String NewTitle; public Guid? fkNewStockItemId; public String Category; public String Reason; public Guid fkReturnLocationId; public String ReturnLocation; public Double? PendingRefundAmount; public Boolean Scrapped; public Int32? 
ScrapQty; public Guid ParentOrderItemRowId; public Double? AdditionalCost; public String cCurrency; public Int32 pkReturnId; public String ChannelReason; public String ChannelReasonSec; public DateTime ReturnDate; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_Deliver_PurchaseItemParameter.cs using System; namespace LinnworksAPI { public class Deliver_PurchaseItemParameter { public Guid pkPurchaseId; public Guid pkPurchaseItemId; public Int32? Delivered; public Int32? AddToDelivered; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_OrderNote.cs using System; namespace LinnworksAPI { public class OrderNote { public Boolean Internal; public String Note; public DateTime NoteEntryDate; public String NoteUserName; public Guid pkOrderNoteId; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Factory.cs using System; using System.IO; using System.Net; using Newtonsoft.Json; public static class Factory { public static string SerializeAndUrlEscape(object o) { //doesn't work! return System.Web.HttpUtility.UrlEncode(Newtonsoft.Json.JsonConvert.SerializeObject(o)); JsonSerializerSettings jss = new JsonSerializerSettings(); jss.DateFormatString = "yyyy-MM-ddTHH:mm:ss.ffZ"; return Newtonsoft.Json.JsonConvert.SerializeObject(o,jss).Trim('\"'); } public static string GetResponse(string Extension, string Body, string Token, string Server) { if (string.IsNullOrEmpty(Server)) { Server = "https://api.linnworks.net/api/"; } else { Server += "/api/"; } HttpWebRequest Request = (HttpWebRequest)WebRequest.Create(Server + Extension); Request.ContentType = "application/x-www-form-urlencoded"; Request.Method = "POST"; if (!string.IsNullOrEmpty(Token)) { Request.Headers.Add("Authorization", Token); } using (StreamWriter writer = new StreamWriter(Request.GetRequestStream())) { writer.Write(Body); } try { using (StreamReader reader = new StreamReader(Request.GetResponse().GetResponseStream())) { return reader.ReadToEnd(); } } catch (Exception ex) { Console.WriteLine(ex.Message); if 
(LinnworksAPI.ClientConfig.ThrowExceptions) { bool docatch = true; try { WebException wex = (WebException)ex; string response; using (StreamReader reader = new StreamReader(wex.Response.GetResponseStream())) { response=reader.ReadToEnd(); } LinnworksAPI.Error err = null; try { err = Newtonsoft.Json.JsonConvert.DeserializeObject<LinnworksAPI.Error>(response); } catch { docatch = false; throw new Exception("Error in response from Linnworks API: " + response, ex); } docatch = false; if (err!=null) throw new LinnworksAPI.LinnworksAPIException(err.Message, err.Code, ex); else throw new Exception("Error in response from Linnworks API: " + response, ex); } catch when (docatch) { } throw; } return ""; } } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_StockItemLevel.cs using System; namespace LinnworksAPI { public class StockItemLevel { public StockLocation Location; public Int32 StockLevel; public Double StockValue; public Int32 MinimumLevel; public Int32 InOrderBook; public Int32 Due; public Int32 InOrders; public Int32 Available; public Double UnitCost; public String SKU; public Boolean AutoAdjust; public DateTime LastUpdateDate; public String LastUpdateOperation; public Guid rowid; public Boolean PendingUpdate; public Guid StockItemId; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_CalcOrderHeader.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class CalcOrderHeader { public Guid pkOrderID; public Int32 nOrderId; public Guid fkPostalServiceId; public Guid fkCountryId; public String cCountry; public Double ItemWeight; public Guid fkPackagingGroupId; public Guid fkPackagingTypeId; public Boolean IsSplitPackaging; public Double PackagingWeight; public Double TotalWeight; public Decimal? TotalWidth; public Decimal? TotalHeight; public Decimal? 
TotalDepth; public Boolean ManualAdjust; public Int32 SplitPackageCount; public Boolean LabelPrinted; public List<String> CalculationHints; public List<CalcOrderItem> Items; public List<CalcBin> Bins; public CalcMethod DimMethod; public SqlDataRecord DataRecordMetaData; } }<file_sep>/Linnworks/src/net/LinnworksAPI/ImportExport.cs using Newtonsoft.Json; using System; using System.Collections.Generic; namespace LinnworksAPI { public static class ImportExportMethods { private static JsonSerializerSettings serializerSettings = new JsonSerializerSettings() { DateFormatString = "yyyy-MM-ddTHH:mm:ss.ffZ" }; public static ImportRegister EnableImport(Int32 importId, Boolean enable, String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<ImportRegister>(Factory.GetResponse("ImportExport/EnableImport", "importId=" + importId + "&enable=" + enable + "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static void RunNowImport(Int32 importId, String ApiToken, String ApiServer) { Factory.GetResponse("ImportExport/RunNowImport", "importId=" + importId + "", ApiToken, ApiServer); } public static Import GetImport(Int32 id, String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<Import>(Factory.GetResponse("ImportExport/GetImport", "id=" + id + "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static void DeleteImport(Int32 id, String ApiToken, String ApiServer) { Factory.GetResponse("ImportExport/DeleteImport", "id=" + id + "", ApiToken, ApiServer); } public static List<ImportRegister> GetImports(String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<List<ImportRegister>>(Factory.GetResponse("ImportExport/GetImports", "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static 
FulfilmentCenterImportExportSettings GetFullfilmentCenterSettings(Guid fkStockLocationId, String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<FulfilmentCenterImportExportSettings>(Factory.GetResponse("ImportExport/GetFullfilmentCenterSettings", "fkStockLocationId=" + fkStockLocationId + "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static ExportRegister EnableExport(Int32 exportId, Boolean enable, String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<ExportRegister>(Factory.GetResponse("ImportExport/EnableExport", "exportId=" + exportId + "&enable=" + enable + "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static void RunNowExport(Int32 exportId, String ApiToken, String ApiServer) { Factory.GetResponse("ImportExport/RunNowExport", "exportId=" + exportId + "", ApiToken, ApiServer); } public static Export GetExport(Int32 id, String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<Export>(Factory.GetResponse("ImportExport/GetExport", "id=" + id + "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } public static void DeleteExport(Int32 id, String ApiToken, String ApiServer) { Factory.GetResponse("ImportExport/DeleteExport", "id=" + id + "", ApiToken, ApiServer); } public static List<ExportRegister> GetExports(String ApiToken, String ApiServer) { return Newtonsoft.Json.JsonConvert.DeserializeObject<List<ExportRegister>>(Factory.GetResponse("ImportExport/GetExports", "", ApiToken, ApiServer), new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore }); } } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_ChannelDetails.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class ChannelDetails { public Int32 LinksCount; public 
List<InventoryListingTemplate> Templates; public List<FieldTypes> Changes; public Boolean ContainsChanges; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_OrderRelation.cs using System; namespace LinnworksAPI { public class OrderRelation { public Int32 ChildOrderId; public Guid ChildOrderPkOrderId; public Int32 ParentOrderId; public Guid ParentOrderPkOrderId; public String RelationType; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_ScheduleConfiguration.cs using System; namespace LinnworksAPI { public class ScheduleConfiguration { public RepetitionType RepetitionType; public DateTime? OneTimeDate; public DailyFrequencyType? DailyFrequency; public DateTime? OccursFrequencyStartingDate; public Int32? OccursFrequencyEveryX; public String WeeklyDays; public RepetitionType? OccursFrequency; public String OccursOnceAtTime; public Int32? OccursEveryHours; public String StartingTime; public String EndingTime; public Boolean Enabled; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_SqlDataRecord.cs using System; namespace LinnworksAPI { public class SqlDataRecord { public Int32 FieldCount; public Object Item; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_StockItemPricingRule.cs using System; namespace LinnworksAPI { public class StockItemPricingRule { public Int32? 
pkRowId; public Guid fkStockPricingId; public String Type; public Int32 LowerBound; public Double Value; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_GetConversionRatesRequest.cs using System; namespace LinnworksAPI { public class GetConversionRatesRequest { public Boolean GetCurrenciesFromOrders; public String Currency; } }<file_sep>/Linnworks/src/javascript/Settings.js var Settings = { // http://apidoc.linnworks.net/#/Settings-AddSetting AddSetting: function(category,setting,value,userSpecific,token, server) { return Factory.GetResponse("Settings/AddSetting", token, server, "category=" + category + "&setting=" + setting + "&value=" + value + "&userSpecific=" + userSpecific +""); }, // http://apidoc.linnworks.net/#/Settings-DeleteSetting DeleteSetting: function(category,setting,userSpecific,token, server) { return Factory.GetResponse("Settings/DeleteSetting", token, server, "category=" + category + "&setting=" + setting + "&userSpecific=" + userSpecific +""); }, // http://apidoc.linnworks.net/#/Settings-GetSettings GetSettings: function(categories,token, server) { return Factory.GetResponse("Settings/GetSettings", token, server, "categories=" + JSON.stringify(categories) +""); }, // http://apidoc.linnworks.net/#/Settings-IsCustomerAuthorized IsCustomerAuthorized: function(module,token, server) { return Factory.GetResponse("Settings/IsCustomerAuthorized", token, server, "module=" + module +""); }, // http://apidoc.linnworks.net/#/Settings-IsBetaApplied IsBetaApplied: function(module,token, server) { return Factory.GetResponse("Settings/IsBetaApplied", token, server, "module=" + module +""); }, // http://apidoc.linnworks.net/#/Settings-RequestCustomerAccess RequestCustomerAccess: function(module,name,telephone,time,token, server) { return Factory.GetResponse("Settings/RequestCustomerAccess", token, server, "module=" + module + "&name=" + name + "&telephone=" + telephone + "&time=" + JSON.stringify(time) +""); }, // 
http://apidoc.linnworks.net/#/Settings-GetMeasures GetMeasures: function(token, server) { return Factory.GetResponse("Settings/GetMeasures", token, server, ""); }, // http://apidoc.linnworks.net/#/Settings-GetXmlSetting GetXmlSetting: function(name,category,token, server) { return Factory.GetResponse("Settings/GetXmlSetting", token, server, "name=" + name + "&category=" + category +""); }, }; <file_sep>/Linnworks/src/net/LinnworksAPI/Class_RowQty.cs using System; namespace LinnworksAPI { public class RowQty { public Guid OrderItemRowId; public Double Refund; public Int32 Qty; public Int32? ScrapQty; public Double AdditionalCost; public Guid? NewStockItemId; public Int32 NewQty; } }<file_sep>/Linnworks/src/javascript/PrintService.js var PrintService = { // http://apidoc.linnworks.net/#/PrintService-CreatePDFfromJobForceTemplate CreatePDFfromJobForceTemplate: function(templateType,IDs,templateID,parameters,printerName,token, server) { return Factory.GetResponse("PrintService/CreatePDFfromJobForceTemplate", token, server, "templateType=" + templateType + "&IDs=" + JSON.stringify(IDs) + "&templateID=" + templateID + "&parameters=" + JSON.stringify(parameters) + "&printerName=" + printerName +""); }, // http://apidoc.linnworks.net/#/PrintService-PrintTemplatePreview PrintTemplatePreview: function(templateId,token, server) { return Factory.GetResponse("PrintService/PrintTemplatePreview", token, server, "templateId=" + templateId +""); }, // http://apidoc.linnworks.net/#/PrintService-GetTemplateList GetTemplateList: function(templateType,token, server) { return Factory.GetResponse("PrintService/GetTemplateList", token, server, "templateType=" + templateType +""); }, // http://apidoc.linnworks.net/#/PrintService-VP_GetPrinters VP_GetPrinters: function(token, server) { return Factory.GetResponse("PrintService/VP_GetPrinters", token, server, ""); }, // http://apidoc.linnworks.net/#/PrintService-DownloadVirtualPrinterClient DownloadVirtualPrinterClient: function(token, 
server) { return Factory.GetResponse("PrintService/DownloadVirtualPrinterClient", token, server, ""); }, }; <file_sep>/Linnworks/src/net/LinnworksAPI/Class_ReturnInfo.cs using System; namespace LinnworksAPI { public class ReturnInfo { public Int32 pkReturnId; public ReturnType RowType; public String ReturnReference; public Guid fkOrderId; public Guid fkOrderItemRowId; public Int32 nOrderId; public String SKU; public String ItemTitle; public String Reason; public String ChannelReason; public String ChannelReasonSec; public String Category; public Int32 ReturnQty; public Guid? fkReturnLocationId; public Boolean Scrapped; public Int32? ScrapQty; public String LastState; public DateTime LastDate; public Boolean Completed; public Guid? fkNewOrderId; public Guid? fkNewOrderItemRowId; public Guid? fkNewStockItemId; public Int32? NewQty; public Int32? NewOrderId; public Boolean NewOrderCancelled; public String NewSKU; public String NewItemTitle; public DateTime? NewOrderProcessedOn; public Double? AdditionalCost; public Guid? fkRefundRowId; public Double? RefundedAmount; public Double? 
PendingRefund; public DateTime ReturnDate; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_InventoryParameters.cs using System; using System.Collections.Generic; namespace LinnworksAPI { public class InventoryParameters { public List<Guid> InventoryItemIds; public List<Tuple<Int32, Int32>> SelectedRegions; public Guid Token; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_PurchaseOrderItem.cs using System; namespace LinnworksAPI { public class PurchaseOrderItem { public Guid pkPurchaseItemId; public Guid fkStockItemId; public Int32 Quantity; public Decimal Cost; public Int32 Delivered; public Decimal TaxRate; public Decimal Tax; public Int32 PackQuantity; public Int32 PackSize; public String SKU; public String ItemTitle; } }<file_sep>/Linnworks/src/net/LinnworksAPI/Class_StockLevelUpdate.cs using System; namespace LinnworksAPI { public class StockLevelUpdate { public String SKU; public Guid LocationId; public Int32 Level; } }
111be8c5586774339ff522db24ee298fb449aa34
[ "JavaScript", "C#" ]
39
C#
wizzardmr42/LinnworksNetSDK
d1ebc91d13af021744e096a29f5525ccede2070e
448c772aecb47b59575ee5981044f42b86673c35
refs/heads/main
<file_sep>from PIL import Image
import torch
import timm
import torchvision
import torchvision.transforms as T
from timm.data.constants import IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD
import gradio as gr

torch.set_grad_enabled(False)

with open("imagenet_classes.txt", "r") as f:
    imagenet_categories = [s.strip() for s in f.readlines()]

transform = T.Compose([
    T.Resize(256, interpolation=3),
    T.CenterCrop(224),
    T.ToTensor(),
    T.Normalize(IMAGENET_DEFAULT_MEAN, IMAGENET_DEFAULT_STD),
])

model = torch.hub.load('facebookresearch/deit:main', 'deit_base_patch16_224', pretrained=True)

def classify(im):
    img = transform(im).unsqueeze(0)
    # compute the predictions
    out = model(img)
    # and convert them into probabilities
    scores = torch.nn.functional.softmax(out, dim=-1)[0]
    # finally get the indices of the predictions with the highest scores
    topk_scores, topk_label = torch.topk(scores, k=5, dim=-1)
    # the Label output expects a {class name: confidence} dict with numeric values
    d = {}
    for i in range(5):
        pred_name = imagenet_categories[topk_label[i]]
        d[pred_name] = float(topk_scores[i])
    return d

inputs = gr.inputs.Image(type='pil', label="Original Image")
outputs = gr.outputs.Label(type="confidences", num_top_classes=5)

title = "DeiT"
description = "Demo for Facebook DeiT: Data-efficient Image Transformers. To use it, simply upload your image, or click one of the examples to load them. Read more at the links below."
article = "<p style='text-align: center'><a href='https://arxiv.org/abs/2012.12877'>Training data-efficient image transformers & distillation through attention</a> | <a href='https://github.com/facebookresearch/deit'>Github Repo</a></p>"

examples = [
    ['deer.jpg'],
    ['cat.jpg'],
]

gr.Interface(classify, inputs, outputs, title=title, description=description, article=article, examples=examples).launch()
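The prediction step in the demo above is just a softmax over the model's logits followed by a top-k selection. As a self-contained sketch of that step in plain Python (no torch; the function name and the toy labels are invented for illustration):

```python
import math

def top_k(logits, labels, k=5):
    """Softmax over raw scores, then the k (label, probability) pairs with the highest probability."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    ranked = sorted(zip(labels, (e / total for e in exps)), key=lambda lp: lp[1], reverse=True)
    return ranked[:k]
```

`torch.topk` over `softmax(out)` in the demo produces the same ranking, just on tensors instead of lists.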
1039b69fad2fb4c2d3dec23caa7096ab5af1c979
[ "Python" ]
1
Python
AK391/deit
5ad3b43ab6a36166ffe7043b3017d74f5d43c13b
c1b9ae0b3dafddd51caaa3aed0065be1f6d7cfb6
refs/heads/master
<repo_name>everfortunetrading/best4u<file_sep>/routes/admin.js
var express = require('express');
var router = express.Router();
var db = require('../database');
var user = require('../admin_credential');

var con = db();
var loggedIn = false;

router.get('/', function(req, res, next) {
  if (loggedIn) {
    con.query("SELECT * FROM products", function (err, result, fields) {
      if (err) throw err;
      res.render('admin', { products: result });
    });
  } else {
    res.redirect('/admin/login');
  }
});

router.post('/create', (req, res) => {
  // Use placeholders so the driver escapes user input, instead of
  // concatenating it into the SQL string (which allows SQL injection).
  var sql = "INSERT INTO products (name, description, image_url, amazon_url) VALUES (?, ?, ?, ?)";
  var values = [req.body.name, req.body.description, req.body.image_url, req.body.amazon_url];
  con.query(sql, values, function(err, result) {
    if (err) throw err;
    console.log("1 record inserted");
  });
  res.redirect("/admin");
});

router.post('/edit', (req, res) => {
  var fields = {
    name: req.body.name,
    image_url: req.body.image_url,
    amazon_url: req.body.amazon_url,
    description: req.body.description
  };
  for (var param in fields) {
    var sql = "UPDATE products SET " + param + " = ? WHERE id = ?";
    con.query(sql, [fields[param], req.query.id], function(err, result) {
      if (err) throw err;
      console.log("1 record edited");
    });
  }
  res.redirect('/admin');
});

router.post("/delete", (req, res) => {
  con.query("DELETE FROM products WHERE id = ?", [req.query.id], (err) => {
    if (err) throw err;
    console.log("1 record deleted");
  });
  res.redirect("/admin");
});

router.get('/login', function(req, res) {
  res.render('login', { alert: false });
});

router.post('/login', function(req, res) {
  if (authenticate(req.body.user, req.body.pass)) {
    loggedIn = true;
    res.redirect('/admin');
  } else {
    res.render('login', { alert: true });
  }
});

function authenticate(u, p) {
  return user.username === u && user.password === p;
}

module.exports = router;
<file_sep>/public/assets/js/admin-page.js
function show(element) {
  document.getElementById(element).style.visibility = "visible";
  document.getElementById(element).style.height = "auto";
  document.getElementById(element).style.width = "auto";
  document.getElementById(element).style.padding = "20px";
}

function hide(element) {
  document.getElementById(element).style.visibility = "hidden";
  document.getElementById(element).style.height = "0";
  document.getElementById(element).style.width = "0";
  document.getElementById(element).style.padding = "0";
}

function showHide(element) {
  document.getElementById(element).style.visibility == "visible" ? hide(element) : show(element);
}

// Validating empty fields
function check_empty() {
  if (document.getElementById('name').value == "" || document.getElementById('email').value == "" || document.getElementById('msg').value == "") {
    alert("Fill All Fields !");
  } else {
    document.getElementById('form').submit();
    alert("Form Submitted Successfully...");
  }
}

// Function to display popup
function div_show(id) {
  document.getElementById(id).style.display = "block";
}

// Function to hide popup
function div_hide(id) {
  document.getElementById(id).style.display = "none";
}
7d54d38f016e4d844b39c37f7f852975f598693c
[ "JavaScript" ]
2
JavaScript
everfortunetrading/best4u
fc732c1ef104542bd2e113c33222526fb8cef9ab
372c394228fd56d0f5e1febe1664e22c9d6edbe2
refs/heads/master
<file_sep>import pytest from tests.base import test_app from cmm_app.models import User # def test_user_password_correct(): # user = User(email='<EMAIL>') # user.set_password('<PASSWORD>') # assert user.check_password('<PASSWORD>') # def test_user_password_incorrect(): # user = User(email='<EMAIL>') # user.set_password('<PASSWORD>') # assert not user.check_password('<PASSWORD>') # def test_registration_page_get(test_app): # client = test_app.test_client() # resp = client.get('/auth/register') # assert resp.status_code == 200 # assert b'Register' in resp.data # def test_registration_page_post(test_app): # client = test_app.test_client() # resp = client.post( # '/auth/register', # data=dict( # email='<EMAIL>', # password='<PASSWORD>', # password_confirm='<PASSWORD>' # ), follow_redirects=True) # assert resp.status_code == 200 # assert b'Login' in resp.data # def test_login_page_get(test_app): # client = test_app.test_client() # resp = client.get('/auth/login') # assert resp.status_code == 200 # assert b'Login' in resp.data # def test_login_page_post(test_app): # """ # This may not work since there is no user registered. 
# """ <file_sep>import pytest from cmm_app import app, db from cmm_app.config import TestingConfig @pytest.fixture def test_app(): app.config.from_object(TestingConfig) with app.app_context(): db.create_all() yield app db.session.remove() db.drop_all() <file_sep>alembic==1.0.11 astroid==2.2.5 atomicwrites==1.3.0 attrs==19.1.0 autopep8==1.4.4 Click==7.0 Flask==1.1.1 Flask-Login==0.4.1 Flask-Migrate==2.5.2 Flask-SQLAlchemy==2.4.0 Flask-WTF==0.14.2 importlib-metadata==0.19 isort==4.3.21 itsdangerous==1.1.0 Jinja2==2.10.1 lazy-object-proxy==1.4.1 Mako==1.0.14 MarkupSafe==1.1.1 mccabe==0.6.1 more-itertools==7.2.0 packaging==19.1 pathlib2==2.3.4 pep8==1.7.1 pkg-resources==0.0.0 pluggy==0.12.0 psycopg2==2.8.3 py==1.8.0 pycodestyle==2.5.0 pylint==2.3.1 pyparsing==2.4.2 pytest==5.0.1 python-dateutil==2.8.0 python-editor==1.0.4 six==1.12.0 SQLAlchemy==1.3.6 typed-ast==1.4.0 wcwidth==0.1.7 Werkzeug==0.15.5 wrapt==1.11.2 WTForms==2.2.1 zipp==0.5.2 <file_sep>from flask import Blueprint, render_template, redirect, url_for, request, flash from flask_login import login_user, logout_user, login_required, current_user from werkzeug.urls import url_parse from cmm_app import db from cmm_app.models import User from cmm_app.auth.forms import RegistrationForm, LoginForm auth = Blueprint('auth', __name__, template_folder='templates', static_folder='static') @auth.route('/register', methods=['GET', 'POST']) def register(): form = RegistrationForm(request.form) if form.validate_on_submit(): user = User.query.filter_by(email=form.email.data).first() if not user: new_user = User(email=form.email.data) new_user.set_password(form.password.data) db.session.add(new_user) db.session.commit() url = url_for('auth.login') flash('Registration successful!', 'success') return redirect(url) flash('Email already exists!', 'danger') return render_template('auth/register.html', form=form) @auth.route('/login', methods=['GET', 'POST']) def login(): if current_user.is_authenticated: flash('You are already 
logged in!', 'warning') return redirect(url_for('dashboard.dashboard')) form = LoginForm(request.form) if form.validate_on_submit(): user = User.query.filter_by(email=form.email.data).first() if user and user.check_password(form.password.data): login_user(user, remember=form.remember_me.data) next_page = request.args.get('next') if not next_page or url_parse(next_page).netloc != '': next_page = url_for('dashboard.dashboard') return redirect(next_page) flash('Invalid credentials!', 'danger') return render_template('auth/login.html', form=form) @auth.route('/logout') @login_required def logout(): logout_user() flash('You have been logged out!', 'success') return redirect(url_for('auth.login')) <file_sep># CoverMyMeds Code Day application The goal of this code day application was to create a per customer project time tracker application. Aside from basic authentication, a user would be able to view customers they are doing work for as well as the projects being worked on. Within these projects, a user can track the time of a task that is being completed for a project. 
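The per-task time tracking described in the README reduces to recording start/stop timestamps and rendering the difference. A minimal sketch of that duration computation (the helper name is illustrative, not part of the app):

```python
from datetime import datetime, timedelta

def format_duration(start: datetime, stop: datetime) -> str:
    """Render the elapsed time between two timestamps as hours/minutes/seconds."""
    total = int((stop - start).total_seconds())
    return '{} hours {} minutes {} seconds'.format(total // 3600, (total % 3600) // 60, total % 60)
```

In the app the two timestamps would come from a stored start/stop pair on a task's time entry.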
<file_sep>from datetime import datetime from cmm_app import db, login_manager from flask_login import UserMixin from werkzeug.security import generate_password_hash, check_password_hash @login_manager.user_loader def load_user(user_id): return User.query.get(int(user_id)) class User(UserMixin, db.Model): __tablename__ = 'user' id = db.Column(db.Integer, primary_key=True) email = db.Column(db.String, index=True, unique=True, nullable=False) password = db.Column(db.String) created_at = db.Column(db.DateTime, default=datetime.now) profile = db.relationship('Profile', cascade='save-update, merge, delete', uselist=False, back_populates='user') def set_password(self, password): self.password = generate_password_hash(password) def check_password(self, password): return check_password_hash(self.password, password) class Profile(db.Model): __tablename__ = 'profile' id = db.Column(db.Integer, primary_key=True) user_id = db.Column(db.Integer, db.ForeignKey('user.id')) first_name = db.Column(db.String) last_name = db.Column(db.String) birth_date = db.Column(db.DateTime) city = db.Column(db.String(50)) state = db.Column(db.String(50)) about = db.Column(db.String(300)) user = db.relationship('User', back_populates='profile') class Customer(db.Model): __tablename__ = 'customer' id = db.Column(db.Integer, primary_key=True) name = db.Column(db.String, nullable=False) created_at = db.Column(db.DateTime, default=datetime.now) class Project(db.Model): __tablename__ = 'project' id = db.Column(db.Integer, primary_key=True) customer_id = db.Column(db.Integer, db.ForeignKey('customer.id')) name = db.Column(db.String, nullable=False) created_at = db.Column(db.DateTime, default=datetime.now) class Task(db.Model): __tablename__ = 'task' id = db.Column(db.Integer, primary_key=True) project_id = db.Column(db.Integer, db.ForeignKey('project.id')) description = db.Column(db.String, nullable=False) created_at = db.Column(db.DateTime, default=datetime.now) class TaskTime(db.Model): __tablename__ = 
'tasktime' id = db.Column(db.Integer, primary_key=True) task_id = db.Column(db.Integer, db.ForeignKey('task.id')) start_time = db.Column(db.DateTime) stop_time = db.Column(db.DateTime) user_id = db.Column(db.Integer, db.ForeignKey('user.id')) created_at = db.Column(db.DateTime, default=datetime.now) def start_timer(self): self.start_time = datetime.now() db.session.add(self) db.session.commit() def stop_timer(self): self.stop_time = datetime.now() db.session.commit() def total_time_worked(self): total_seconds = (self.stop_time - self.start_time).total_seconds() total_time_string = '{} hours {} minutes {} seconds'\ .format(int(total_seconds // 3600), int((total_seconds % 3600) // 60), int(total_seconds % 60)) return total_time_string @classmethod def is_task_being_worked_on(cls): task = cls.query.filter_by(stop_time=None).first() return task def edit_time_duration(self, start_time, end_time): if end_time < start_time: return 'Invalid time duration' conflicting_times = TaskTime.query.filter( TaskTime.id != self.id, TaskTime.start_time < end_time, TaskTime.stop_time > start_time).all() if len(conflicting_times) > 0: return 'End time duration overlaps with other time history' self.start_time = start_time self.stop_time = end_time db.session.commit() <file_sep>from flask import Blueprint, render_template, redirect, url_for, request, flash from flask_login import login_required, current_user from werkzeug.urls import url_parse from cmm_app import db from cmm_app.user.forms import ProfileForm from cmm_app.models import Profile user = Blueprint('user', __name__, template_folder='templates', static_folder='static') @user.route('/profile') @login_required def profile(): return render_template('user/profile.html') @user.route('/profile/edit', methods=['GET', 'POST']) @login_required def profile_edit(): profile = current_user.profile form = ProfileForm(obj=profile) form.email.data = current_user.email if form.validate_on_submit(): url = url_for('user.profile') current_user.email = form.email.data # Need to check if the updated email conflicts with any other user if profile: form.populate_obj(profile)
db.session.commit() flash('Profile saved!', 'success') return redirect(url) new_profile = Profile(user_id=current_user.id, first_name=form.first_name.data, last_name=form.last_name.data, birth_date=form.birth_date.data, city=form.city.data, state=form.state.data, about=form.about.data) db.session.add(new_profile) db.session.commit() flash('Profile saved!', 'success') return redirect(url) return render_template('user/profile_edit.html', form=form) @user.route('/profile/delete') @login_required def profile_delete(): email = current_user.email if request.referrer is None or \ url_parse(request.referrer).path != url_for('user.profile_edit'): flash('This action can only be performed from this page!', 'danger') return redirect(url_for('user.profile_edit')) db.session.delete(current_user) db.session.commit() flash('Account {email} has been deleted!'.format(email=email), 'info') return redirect(url_for('auth.login')) <file_sep>from flask_wtf import FlaskForm from wtforms import TextField, TextAreaField, SubmitField,\ validators from wtforms.fields.html5 import EmailField, DateField class ProfileForm(FlaskForm): email = EmailField('Email', [validators.Email(), validators.DataRequired()]) first_name = TextField('<NAME>') last_name = TextField('<NAME>') birth_date = DateField('Birth Date', format='%Y-%m-%d') city = TextField('City', [validators.Length(max=50)]) state = TextField('State', [validators.Length(max=50)]) about = TextAreaField('About', [validators.Length(max=300)]) submit = SubmitField('Save Profile') <file_sep>"""empty message Revision ID: 5ab1e53b6c05 Revises: <PASSWORD> Create Date: 2019-08-05 22:20:29.604574 """ from alembic import op import sqlalchemy as sa # revision identifiers, used by Alembic. revision = '5ab1e53b6c05' down_revision = '<PASSWORD>' branch_labels = None depends_on = None def upgrade(): # ### commands auto generated by Alembic - please adjust! 
### op.add_column('profile', sa.Column('city', sa.String(length=50), nullable=True))
    op.add_column('profile', sa.Column('state', sa.String(length=50), nullable=True))
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('profile', 'state')
    op.drop_column('profile', 'city')
    # ### end Alembic commands ###
<file_sep>from flask_wtf import FlaskForm
from wtforms import PasswordField, SubmitField, BooleanField, validators
from wtforms.fields.html5 import EmailField

data_required_message = 'This field must be filled!'


class RegistrationForm(FlaskForm):
    email = EmailField('Email:', [validators.Email('Invalid email!'),
                                  validators.DataRequired(data_required_message)])
    password = PasswordField('Password:',
                             [validators.Length(min=8, max=18),
                              validators.DataRequired(data_required_message)])
    password_confirm = PasswordField('Repeat Password:',
                                     [validators.Length(min=8, max=18),
                                      validators.EqualTo('password',
                                                         'Passwords must match!'),
                                      validators.DataRequired(data_required_message)])
    submit = SubmitField('Register')


class LoginForm(FlaskForm):
    email = EmailField('Email:', [validators.Email('Invalid Email'),
                                  validators.DataRequired(data_required_message)])
    password = PasswordField('Password:',
                             [validators.DataRequired(data_required_message)])
    remember_me = BooleanField('Remember Me')
    submit = SubmitField('Login')
<file_sep>import os

basedir = os.path.abspath(os.path.dirname(__file__))


class DevelopmentConfig(object):
    SECRET_KEY = 'thisisasecret'

    # SQLAlchemy
    SQLALCHEMY_DATABASE_URI = 'postgres://cwtebzjx:<EMAIL>:5432/cwtebzjx'
    SQLALCHEMY_TRACK_MODIFICATIONS = False


class TestingConfig(object):
    SECRET_KEY = 'testkey'
    DEBUG = True
    TESTING = True

    # SQLAlchemy
    SQLALCHEMY_DATABASE_URI = 'sqlite:///' + os.path.join(basedir, 'test.db')
    SQLALCHEMY_TRACK_MODIFICATIONS = False
<file_sep>{% extends 'base.html' %}

{% block head %}
<link rel="stylesheet" href="{{ url_for('user.static', filename='css/profile.css') }}">
{% endblock %} {% block body %} <div class="container"> <div class="row"> <div class="col-md-6 mx-auto"> <h1 class="mb-3">Profile</h1> {% with messages = get_flashed_messages(with_categories=true) %} {% if messages %} {% for category, message in messages %} <div class="alert alert-{{ category }} text-center" role="alert"> {{ message }} </div> {% endfor %} {% endif %} {% endwith %} <a href="{{ url_for('user.profile_edit') }}">Edit Profile</a> <div class="card"> <div class="card-body"> <h5 class="card-title"><strong>Email</strong></h5> <p class="card-text">{{ current_user.email }}</p> </div> <hr> <div class="card-body"> <h5 class="card-title"><strong>Full Name</strong></h5> <p class="card-text">{{ current_user.profile.first_name }} {{ current_user.profile.last_name }}</p> </div> <hr> <div class="card-body"> <h5 class="card-title"><strong>Birth Date </strong></h5> {% if current_user.profile.birth_date %} <p class="card-text">{{ current_user.profile.birth_date.strftime('%m/%d/%Y') }}</p> {% endif %} </div> <hr> <div class="card-body"> <h5 class="card-title"><strong>City/State</strong></h5> <p class="card-text">{{ current_user.profile.city }}{{ ',' if current_user.profile.state }} {{ current_user.profile.state }}</p> </div> <hr> <div class="card-body"> <h5 class="card-title"><strong>About</strong></h5> <p class="card-text">{{ current_user.profile.about }}</p> </div> </div> </div> </div> </div> {% endblock %}<file_sep>from cmm_app import db from cmm_app.models import User, Customer, Project, Task, TaskTime from tests.base import test_app import time from datetime import datetime, timedelta import pytest @pytest.fixture def set_up_db(test_app): user = User(email='<EMAIL>') user.set_password('<PASSWORD>') db.session.add(user) db.session.commit() customer = Customer(name='K&G Creative') db.session.add(customer) db.session.commit() project = Project(customer_id=customer.id, name='TimeTracker') db.session.add(project) db.session.commit() task = Task(project_id=project.id, 
description='Add UI/UX') db.session.add(task) db.session.commit() def test_task_timer(set_up_db): """ Test the task timer model and it's start/stop timer functionality. """ user = User.query.filter_by(email='<EMAIL>').first() task = Task.query.filter_by(id=1).first() work = TaskTime(task_id=task.id, user_id=user.id) work.start_timer() db.session.add(work) db.session.commit() time.sleep(5) work.stop_timer() assert work.start_time is not None assert work.stop_time is not None assert work.total_time_worked().seconds == 5 def test_task_timer_check_if_task_started_already(set_up_db): user = User.query.filter_by(email='<EMAIL>').first() task = Task.query.filter_by(id=1).first() work = TaskTime(task_id=task.id, user_id=user.id) work.start_timer() db.session.add(work) db.session.commit() assert TaskTime.is_task_being_worked_on() # def test_task_timer_edit_duration(set_up_db): # user = User.query.filter_by(email='<EMAIL>').first() # task = Task.query.filter_by(id=1).first() # task_time_1 = TaskTime(task_id=task.id, user_id=user.id) # task_time_1.start_time = datetime.now() # task_time_1.stop_time = task_time_1.start_time + timedelta(hours=2) # # Task time 2 should be after the first task time # task_time_2 = TaskTime(task_id=task.id, user_id=user.id) # task_time_2.start_time = task_time_1.stop_time + timedelta(days=1) # task_time_2.stop_time = task_time_2.start_time + timedelta(hours=3) # # Edit task time 1 # new_start_time = datetime(2019, 8, 5, 11) # new_end_time = datetime(2019, 8, 5, 9) # assert task_time_1.edit_time_duration( # new_start_time, new_start_time) is not None <file_sep>from flask import Blueprint, render_template from flask_login import login_required, current_user from sqlalchemy.sql import func import json from datetime import timedelta from cmm_app import db from cmm_app.models import Task, TaskTime dash = Blueprint('dashboard', __name__, template_folder='templates') @dash.route('/') @login_required def dashboard(): all_tasks = Task.query.all() return 
render_template('dashboard/dashboard.html', tasks=all_tasks)


@dash.route('/customers')
def customers():
    return render_template('dashboard/customers.html')


@dash.route('/customers/projects')
def customer_projects():
    return render_template('dashboard/customer_projects.html')


@dash.route('/projects')
def projects():
    return render_template('dashboard/projects.html')


@dash.route('/projects/<project_id>/tasks')
def project_tasks(project_id):
    # The view for a route with a <project_id> placeholder must accept it
    return render_template('dashboard/project_tasks.html')


@dash.route('/projects/tasks/<task_id>')
def task(task_id):
    task = Task.query.filter_by(id=task_id).first()
    task_history = TaskTime.query.filter_by(task_id=task_id).all()

    task_total_duration = timedelta(0)
    for task_time in task_history:
        task_total_duration += (task_time.stop_time - task_time.start_time)

    return render_template('dashboard/task.html',
                           task=task,
                           task_total_duration=task_total_duration,
                           task_history=task_history)


@dash.route('/projects/tasks/<task_id>/start_task')
def start_task(task_id):
    task_time = TaskTime(task_id=task_id, user_id=current_user.id)
    task_time.start_timer()

    message = {
        'message': 'Timer started'
    }

    return json.dumps(message)


@dash.route('/projects/tasks/<task_id>/stop_task')
def stop_task(task_id):
    task_time = TaskTime.query.filter(TaskTime.task_id == task_id).filter(
        TaskTime.stop_time.is_(None)).first()
    task_time.stop_timer()

    message = {
        'message': 'Timer stopped',
        'task_time': task_time.total_time_worked()
    }

    return json.dumps(message)
<file_sep>from flask import Flask, url_for
from flask_sqlalchemy import SQLAlchemy
from flask_migrate import Migrate
from flask_login import LoginManager

from cmm_app.config import DevelopmentConfig

app = Flask(__name__)
app.config.from_object(DevelopmentConfig)

db = SQLAlchemy(app)
migrate = Migrate(app, db)

login_manager = LoginManager()
login_manager.login_view = 'auth.login'
login_manager.init_app(app)

# Must import at this stage or else the modules cannot read 'db'
from cmm_app.dashboard.routes import dash
from
cmm_app.auth.routes import auth from cmm_app.user.routes import user app.register_blueprint(auth, url_prefix='/auth') app.register_blueprint(dash, url_prefix='/dashboard') app.register_blueprint(user, url_prefix='/user') <file_sep>"""empty message Revision ID: <KEY> Revises: <PASSWORD> Create Date: 2019-08-06 10:39:10.466399 """ from alembic import op import sqlalchemy as sa # revision identifiers, used by Alembic. revision = '<KEY>' down_revision = '<PASSWORD>' branch_labels = None depends_on = None def upgrade(): # ### commands auto generated by Alembic - please adjust! ### op.add_column('customer', sa.Column('created_at', sa.DateTime(), nullable=True)) op.add_column('project', sa.Column('created_at', sa.DateTime(), nullable=True)) op.add_column('task', sa.Column('created_at', sa.DateTime(), nullable=True)) op.add_column('tasktime', sa.Column('created_at', sa.DateTime(), nullable=True)) # ### end Alembic commands ### def downgrade(): # ### commands auto generated by Alembic - please adjust! ### op.drop_column('tasktime', 'created_at') op.drop_column('task', 'created_at') op.drop_column('project', 'created_at') op.drop_column('customer', 'created_at') # ### end Alembic commands ###
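The `total_time_worked` method in the `TaskTime` model above boils down to splitting a `timedelta` between two datetimes into hours, minutes, and seconds. A standalone sketch of that arithmetic (no Flask or SQLAlchemy required; `divmod` replaces the repeated `//` and `%` operations, and casting to `int` avoids `2.0 hours`-style output):

```python
from datetime import datetime, timedelta


def format_duration(start, stop):
    """Format the span between two datetimes as 'H hours M minutes S seconds'."""
    total_seconds = int((stop - start).total_seconds())
    hours, remainder = divmod(total_seconds, 3600)   # whole hours, leftover seconds
    minutes, seconds = divmod(remainder, 60)         # whole minutes, leftover seconds
    return '{} hours {} minutes {} seconds'.format(hours, minutes, seconds)


start = datetime(2019, 8, 5, 9, 0, 0)
stop = start + timedelta(hours=2, minutes=15, seconds=30)
print(format_duration(start, stop))  # 2 hours 15 minutes 30 seconds
```

The same helper could back both `total_time_worked` and the duration shown on the task page, keeping the formatting logic in one place.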
<file_sep># 파이썬 모듈은 변수, 함수, 클래스를 포함할 수 있으며, import 예약어를 사용해 다른 모듈에 정의된 변수, 함수, 클래스 등을 자유롭게 불러와서 사용할 수 있다. # import 모듈명 # import 패키지명.모듈명 # 다음은 keyword 모듈을 import하여 파이썬 예약어를 확인하는 예제다. 물론 kwlist의 자료형은 리스타다. import keyword print(keyword.kwlist) print() # import한 모듈이나 패키지의 실제 파일 위치를 알아보려며, __file__문자열 속성을 확인하면 된다. print(keyword.__file__) # from 예약어를 사용하면, 실제 과정에서 from 다음에 지정한 패키지명이나 모듈명을 생략할 수 있다는 장점이 있다. # from 모듈명 import 클래스명, 함수명 등 # from 패키지명 import 모듈명 # 'import 모듈명' 형식으로 calendar 모듈을 임포트 한 경우, calendar 모듈 내의 month() 메서드를 호출하려면 다음 예제의 calendar.month()처럼 모듈명을 먼저 적어주어야 한다. import calendar print(calendar.month(2020, 1)) # 모듈명(calendar) 생략 불가 # 하지만 'from 모듈명 import 메서드명' 형식으로 임포트한 경우에는 다음처럼 모듈명 없이 메서드명을 바로 사용할 수 있다. from calendar import month print(month(2020, 2)) # 모듈명(calendar) 생략 가능 # import ~ as ~ # as 예약어를 사용하면 이름이 긴 모듈명을 프로그래머가 원하는 별칭(alias)으로 줄여서 사용할 수 있다. # import 이름이_긴_모듈명 as 별칭 # from ~ import ~ as 별칭 import datetime print(datetime.datetime.now()) # 별칭을 사용하지 않은경우 # datetime 모듈의 datetime 타입을 dt라는 별칭으로 지정하면, datetime.datetime.now()대신 dt.now()처럼 코드 길이를 짧게 줄일 수 있다. from datetime import datetime as dt print(dt.now()) # 별칭(dt)를 사용한 경우<file_sep># sort()함수는 리스트를 직접 정렬하고 None을 반환한다. 리스트에서만 사용할 수 있다. li = [2, 5, 3, 1, 4] li.sort() print(li) print() # sorted() 함수는 리스트뿐 아니라 문자열, 튜플, 딕셔너리 등 반복 가능한 자료형에 모두 사용할 수 있다. sorted() 함수는 기존 리스트를 복사해서 새로 만들어 반환하기 때문에 sort() 함수보다는 다소 느리며, 기존의 리스트에는 영향을 주지 않는다. li = [4, 3, 1, 2, 5] print(sorted(li)) print(li)<file_sep>import numpy as np import pandas as pd import matplotlib.pyplot as plt from Investar import Analyzer mk = Analyzer.MarketDB() stocks = ['삼성전자', 'SK하이닉스', '현대자동차', 'NAVER'] df = pd.DataFrame() for s in stocks: df[s] = mk.get_daily_price(s, '2016-01-04', '2018-04-27')['close'] # 시총 상위 4 종목의 수익률을 비교하려면 종가 대신 일간 변동률로 비교를 해야 하기 때문에 데이터프레임에서 제공하는 pct_change() 함수를 사용해 4 종목의 일간 변동률을 구한다. daily_ret = df.pct_change() # 일간 변동률의 평균값에 252를 곱해서 연간 수익률을 구한다. 252는 미국의 1년 평균 개장일로, 우리나라 실정에 맞게 다른 숫자로 바꾸어도 무방하다. 
annual_ret = daily_ret.mean() * 252

# Daily risk is the covariance of the daily percent changes, computed with cov().
daily_cov = daily_ret.cov()

# The annual covariance is the daily covariance multiplied by 252.
annual_cov = daily_cov * 252

# Generate 20,000 portfolios with different weights for the four large-cap
# stocks. Create lists to hold each portfolio's return, risk, and weights.
port_ret = []
port_risk = []
port_weights = []

# Display results
print(daily_ret)
print(annual_ret)
print(daily_cov)
print(annual_cov)

# Monte Carlo simulation
# range() and a for-in loop generate the 20,000 portfolios. When the loop
# counter itself is not needed, it is conventionally assigned to _.
for _ in range(20000):
    # Create an array of 4 random numbers.
    weights = np.random.random(len(stocks))
    # Divide by the sum of the 4 random numbers so the weights add up to 1.
    weights /= np.sum(weights)

    # Multiply the random weights by the per-stock annual returns to get the
    # portfolio's overall expected return.
    returns = np.dot(weights, annual_ret)
    # Multiply the annual covariance by the weights, then again by the
    # transpose of the weights, and take the square root to get the risk.
    risk = np.sqrt(np.dot(weights.T, np.dot(annual_cov, weights)))

    # Append the return, risk, and weights of each of the 20,000 portfolios.
    port_ret.append(returns)
    port_risk.append(risk)
    port_weights.append(weights)

portfolio = {'Returns': port_ret, 'Risk': port_risk}
print(portfolio)

# i takes the values 0, 1, 2, 3 in order; s takes Samsung Electronics,
# SK hynix, Hyundai Motor, and NAVER in order.
for i, s in enumerate(stocks):
    # Add each stock's weight values to the portfolio dictionary under its key.
    # Note: index the loop variable `weight`, not the leftover `weights` array
    # from the simulation loop above.
    portfolio[s] = [weight[i] for weight in port_weights]

df = pd.DataFrame(portfolio)
# The resulting df shows that the 20,000 portfolios each have a different risk
# and expected return depending on the weights of the four large-cap stocks.
df = df[['Returns', 'Risk'] + [s for s in stocks]]
print(df)

df.plot.scatter(x='Risk', y='Returns', figsize=(10, 7), grid=True)
plt.title('Efficient Frontier')
plt.xlabel('Risk')
plt.ylabel('Expected Returns')
plt.show()<file_sep># Changing a delimiter
print('-'.join('2012/01/04'.split('/')))  # replace '/' inside the string with '-'
print('2020/08/17'.replace('/', '-'))

# Stripping thousands-separator commas from a number string
print(''.join('1,234,567,890'.split(',')))
print('1,122,334,455'.replace(',', ''))

# Conversely, to display a number as a string with a comma at every thousands
# place, use the format() function.
print(format(1234567890, ','))<file_sep># RSI(Relative Strength Index) rsi = 88 if rsi > 70: print("RSI", rsi, "means overbought.") elif rsi < 30: print("RSI", rsi, "means oversold.") else: print("...") # RSI 88 means overbought.<file_sep># 최대 손실 낙폭 - MDD(Maximum Drawdown)은 특정 기간에 발생한 최고점에서 최저점까지 가장 큰 손실을 의미한다. 퀀트 투자에서는 수익률을 높이는 것 보다 MDD를 낮추는 것이 더 낫다고 할 만큼 중요한 지표로서, 특정기간 동안 최대한 얼마늬 손실이 날 수 있는지를 나타낸다. # MDD = (최저점 - 최고점) / 최저점 # KOSPI(Korea Composite Stock Price Index, 한국종합주가지수)는 1983년 부터 발표되었으며, 1980년 1월 4일에 상장된 모든 종목의 시가 총액을 기준 지수 100포인트로 집계한다. 따라서 KOSPI 지수 2500은 한국 증시가 1980년 당시보다 25배 올랐음을 나타낸다. # KOSPI는 1994년 1145.66 포인트에서 1998년 277.37 포인트까지 4년동안 무려 75.8%가 하락했는데, 이 기간 MDD는 -75.8%이다. 전체 주식 시장이 1/4 토막 난 것이 KOSPI 역사상 최대 손실 낙폭이라고 할 수 있다. # 서브프라임 당시의 MDD - 야후 파이넨스로부터 2004년부터 현재까지의 KOSPI 지수 데이터를 다운로드 받아서 KOSPI의 MDD를 구해보자. MDD를 구하려면 기본적으로 rolling() 함수에 대해 알아야 한다. # 시리즈.rolling(윈도우 크기 [, min_periods=1]) [.집계 함수()] # rolling() 함수는 시리즈에서 윈도우 크기에 해당하는 개수만큼 데이터를 추출하여 집계 함수에 해당하는 연산을 실시한다. 집계 함수로는 최댓값 max(), 평균값 mean(), 최솟값 min()을 사용할 수 있다. min_periods를 지정하면 데이터 개수가 윈도우 크기에 못미치더라도 min_periods로 지정한 개수만 만족하면 연산을 수행한다. # 다음은 야후 파이넨스에서 KOSPI 지수 데이터를 다운로드 한 뒤 rolling() 함수를 이용하여 1년 동안 최댓값과 최소값을 구하여 MDD를 계산하는 예이다. from pandas_datareader import data as pdr import yfinance as yf yf.pdr_override() import matplotlib.pyplot as plt kospi = pdr.get_data_yahoo('^ks11', '2004-01-04') # KOSPI 지수 데이터를 다운로드 한다. KOSPI 지수의 심볼은 ^KS11이다. window = 252 # 산정 기간에 해당하는 window 값은 1년 동안 개장일을 252일로 어림잡아 설정했다. peak = kospi['Adj Close'].rolling(window, min_periods=1).max() # KOSPI 종가 갈럼에서 1년(거래일 기준) 기간 단위로 최고치 peak를 구한다. drawdown = kospi['Adj Close']/peak - 1.0 # drawdown은 최고치(peak) 대비 현재 KOSPI 종가가 얼마나 하락했는지를 구한다. max_dd = drawdown.rolling(window, min_periods=1).min() # drawdown에서 1년 기간 단위로 최저치 max__dd를 구한다. 마이너스값이기 때문에 최저치가 바로 최대 손실 낙폭이 된다. plt.figure(figsize=(9, 7)) plt.subplot(211) # 2행 1열 중 1행에 그린다. 
kospi['Close'].plot(label='KOSPI', title='KOSPI MDD', grid=True, legend=True)
plt.subplot(212)  # second row of a 2x1 grid
drawdown.plot(c='blue', label='KOSPI DD', grid=True, legend=True)
max_dd.plot(c='red', label='KOSPI MDD', grid=True, legend=True)
plt.show()

# The exact MDD is obtained with min().
print(max_dd.min())

# To find the period in which the MDD occurred, apply an indexing condition.
print(max_dd[max_dd == -0.5453665130144085])<file_sep>"""
Inheritance is a technique by which a class passes all of its attributes and
methods on to another class. A child class can inherit from several parent
classes, which is called multiple inheritance.

class ChildClass(ParentClass1, ParentClass2, ...):
    pass
"""


class A:
    def methodA(self):
        print("Calling A's methodA")

    def method(self):
        print("Calling A's method")


class B:
    def methodB(self):
        print("Calling B's methodB")


class C(A, B):
    def methodC(self):
        print("Calling C's methodC")

    def method(self):
        print("Calling C's overridden method")
        # Call the super() built-in to use a parent's variables or methods.
        super().method()


c = C()
c.methodA()
c.methodB()
c.methodC()
c.method()

# Redefining a parent class's method in a child class with the same name and
# argument signature, as with method() above, is called overriding.
<file_sep># A package is a collection of modules (.py files) gathered in a directory.
# A module name can follow the package name after a '.'; for example, A.B
# denotes submodule B of package A. This avoids clashes between module names
# and global variables when many modules are in use. The following checks the
# type of the request module in the urllib package with the type() function.
import urllib.request
print(type(urllib.request))  # the type of urllib.request is module

import urllib
print(type(urllib))          # urllib is also reported as module
print(urllib.__path__)       # urllib has a __path__ attribute, so it is a package
print(urllib.__package__)    # the package urllib belongs to is urllib
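A quick way to see how Python resolves calls under multiple inheritance like `class C(A, B)` above is to inspect the method resolution order (MRO). A minimal sketch (the class names here are illustrative, mirroring the A/B/C shape):

```python
class Base1:
    def greet(self):
        return "Base1"


class Base2:
    def greet(self):
        return "Base2"


class Child(Base1, Base2):
    pass


# Lookup proceeds left to right through the MRO:
# Child -> Base1 -> Base2 -> object
print([cls.__name__ for cls in Child.__mro__])  # ['Child', 'Base1', 'Base2', 'object']
print(Child().greet())  # Base1
```

Because `Base1` appears first in the base-class list, `Child().greet()` finds `Base1.greet` before `Base2.greet`; `super()` follows this same MRO order.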
<file_sep>i = 1 while i < 7: print(i) i += 2 print() j = 0 while j >= 0: j += 1 if (j % 2) == 0: continue if j > 5: break print(j) else: print('Condition is False.')<file_sep>word = "python" print(len(word)) print(word[0] + word[1] + word[2] + word[3] + word[4] + word[5]) print(word[-6] + word[-5] + word[-4] + word[-3] + word[-2] + word[-1]) <file_sep>class NasdaqStock: """Class for NASDAQ stocks""" # 독스트링 count = 0 # 클래스 변수 def __init__(self, symbol, price): """Constructor for NasdaqStock""" # 독스트링 self.symbol = symbol # 인스턴스 변수 self.price = price # 인스턴스 변수 NasdaqStock.count += 1 print('Calling __init__({}, {:2f}) > count: {}'.format(self.symbol, self.price, NasdaqStock.count)) def __del__(self): """Destructor for NasdaqStock""" # 독스트링 print("Calling __del__({})".format(self)) gg = NasdaqStock("GOOG", 1154.05) del(gg) ms = NasdaqStock("MSFT", 102.44) del(ms) amz = NasdaqStock("AMZN", 1764.00) del(amz)<file_sep># 연평균 성장률(CAGR)을 계산하는 함수를 작성해 보자. CAGR은 Compound Annual Growth Rate의 약자로 우리말로는 '복합 연평균 성장률' 또는 '연복리 수익률'이라고도 부른다. # CAGR은 1년 동안 얼마 만큼씩 증가하는지를 나타내는 값으로, 주로 투작 수익률을 표시하는데 사용되지만, 판매수량이나 사용자 증가율 등을 나타날 때도 쓴다. # CAGR = (L/F)^(1/Y) - 1 # F = 처음 값, L = 마지막 값, Y = 처음 값과 마지막 값 사이의 연(year) 수 # 위의 수학 공식을 파이썬으로 옮겨보면 다음과 같다. def getCAGR(first, last, years): return (last/first)**(1/years) - 1 # 삼성전자는 1998년 4월 27일 65,300원이던 주가가 액면 분할 직전인 2018년 4월 27일 2,669,000원이 되기까지 정확히 20년 동안이나 무려 4,087%로 상승했다. 이 기간 연평균 성장률을 구하면 20.38%가 나온다. cagr = getCAGR(65300, 2669000, 20) print(" SEC CAGR : {:.2%}".format(cagr))<file_sep>from pywinauto import application import os, time # 프로세스 종료 명령 taskkill로 실행 중인 크레온 관련 프로세스(coStarter.exe, CpStart.exe, DibServer.exe)를 종료했다. 인수는 '이미지명이 coStarter로 시작하는 프로세스(/IM coStarter*)를 강제로(/F) 종료하라(/T)'는 뜻이다. 만일 실행 중인 크레온 관련 프로세스가 없으면 해당 프로세스를 찾을 수 없다고 오류 메시지가 발생할 수 있으나, 실행중인 프로세스를 찾아서 종료하는 것이 목적이므로 프로세스를 못 찾는다는 오류 메시지는 무시해도 된다. 
os.system('taskkill /IM coStarter* /F /T') os.system('taskkill /IM CpStart* /F /T') os.system('taskkill /IM DibServer* /F /T') # WMIC(Windows Management Instrumentation Command-line)는 윈도우 시스템 정보를 조회하거나 변경할 때 사용하는 명령이다. 크레온 프로그램은 가제 종료 신호를 받으면 확인 창을 띄우기 때문에 강제로 한 번 더 프로세스를 종료해야 한다. os.system('wmic process where "name like \'%coStarter%\'" call terminate') os.system('wmic process where "name like \'%CpStart%\'" call terminate') os.system('wmic process where "name like \'%DibServer%\'" call terminate') time.sleep(5) app = application.Application() # 파이윈오토를 이용하여 크레온 프로그램(coStarter.exe)을 크레온 플러스 모드(/prj:cp)로 자동으로 시작한다. 사용자 ID, 암호, 공인인증서 암호를 실행 인수로 지정해 놓으면 로그인 창에 자동으로 입력된다. (* 표시를 자신의 정보로 대체하기 바란다. ) app.start('C:\CREON\STARTER\coStarter.exe /prj:cp/id:**** /pwd:**** /pwdcert:**** /autostart') time.sleep(60)<file_sep># append() 함수는 넘겨받은 인수의 자료형에 상관 없이 리스트 뒤에 그대로 추가한다. L = [1, 2] L.append([3, 4]) print(L) # extend() 함수는 넘겨받은 인수가 반복 자료형일 경우, 반복 자료형 내부의 각 원소를 추가한다. L2 = [1, 2] L2.extend([3, 4]) print(L2)<file_sep># 시리즈 생성 import pandas as pd s = pd.Series([0.0, 3.6, 2.0, 5.8, 4.2, 8.0]) # 리스트로 시리즈 생성 print(s) # 시리즈의 인덱스 변경 s.index = pd.Index([0.0, 1.2, 1.8, 3.0, 3.6, 4.8]) # 인덱스 변경 s.index.name = 'MY_IDX' # 인덱스명 설정 print(s) s.name = 'MY_SERIES' # 시리즈명 설정 print(s) # 데이터 추가 s[5.9] = 5.5 print(s) ser = pd.Series([6.7, 4.2], index=[6.8, 8.0]) # ser 시리즈 생성 s = s.append(ser) # 기존 s 시리즈에 신규 ser 시리즈를 추가 print(s) # 데이터 인덱싱 print(s.index[-1]) # -1은 제일 마지막을 의미하므로 제일 마지막 인덱스 값을 출력 print(s.values[-1]) # 인덱스 순서에 해당하는 데이터를 구하려면 values 속성을 사용한다. print(s.loc[8.0]) # 인덱스를 이용해서 실제로 가리키는 작업을 수행하는 인덱서를 사용해서 데이터를 표시할 수도 있다. 인덱스값을 사용하는 loc 인덱서와 정수 순서를 사용하는 iloc 인덱서가 있다. print(s.loc[8.0]) # 로케이션 인덱서 print(s.iloc[-1]) # 인티저 로케이션 인덱서 # iloc와 values는 인덱스 순서에 해당하는 데이터를 술력한다는 점에서 동일하지만, values는 결과값이 복수개 일 때 배열로 반환하고, iloc는 시리즈로 반환하는 차이점이 있다. print(s.values[:]) print(s.iloc[:]) # 데이터 삭제 s = s.drop(8.0) # s.drop(s.index[-1])과 같다. 
s 시리즈에 변화를 주지 않으려면 대입하지 않으면 됨 print(s) # 시리즈 정보 보기 print(s.describe()) # 시리즈 출력하기 import pandas as pd s = pd.Series([0.0, 3.6, 2.0, 5.8, 4.2, 8.0, 5.5, 6.7, 4.2]) # 시리즈 생성 s.index = pd.Index([0.0, 1.2, 1.8, 3.0, 3.6, 4.8, 5.9, 6.8, 8.0]) # 시리즈 인덱스 변경 s.index.name = "MY_IDX" # 시리즈 인덱스명 설정 s.name = 'MY_SERIES' # 시리즈 이름 설정 import matplotlib.pyplot as plt plt.title("ELLIOTT_WAVE") plt.plot(s, 'bs--') # 시리즈를 bs--(푸른 사각형과 점선) 형태로 출력 plt.xticks(s.index) # x축의 눈금값을 s 시리즈의 인덱스값으로 설정 plt.yticks(s.values) # y축의 눈금값을 s 시리즈의 데이터값으로 설정 plt.grid(True) plt.show()<file_sep>try: 1/0 except Exception as e: print('Exception occured :', str(e)) <file_sep># 데이터프레임 생성 import pandas as pd df = pd.DataFrame({'KOSPI': [1915, 1961, 2026, 24467, 2041], 'KOSDAQ': [542, 682, 631, 798, 675]}) print(df) # 인덱스 추가 import pandas as pd df = pd.DataFrame({'KOSPI': [1915, 1961, 2026, 24467, 2041], 'KOSDAQ': [542, 682, 631, 798, 675]}, index=[2014, 2015, 2016, 2017, 2018]) print(df) # 데이터프레임 객체에 포함된 데이터의 전체적 모습 확인 print(df.describe()) # 데이터프레임의 인덱스 정보, 칼럼 정보, 메모리 사용량 등을 확인 print(df.info()) # 시리즈를 이요한 데이터프레임 생성 kospi = pd.Series([1915, 1961, 2026, 2467, 2041], index=[2014, 2015, 2016, 2017, 2018], name='KOSPI') print(kospi) kosdaq = pd.Series([542, 682, 631, 798, 675], index=[2014, 2015, 2016, 2017, 2018], name='KOSDAQ') print(kosdaq) df = pd.DataFrame({kospi.name: kospi, kosdaq.name: kosdaq}) print(df) # 리스트를 이용한 데이터프레임 생성 columns = ['KOSPI', 'KOSDAQ'] index = [2014, 2015, 2016, 2017, 2018] rows = [] rows.append([1915, 542]) rows.append([1961, 682]) rows.append([2026, 631]) rows.append([2467, 798]) rows.append([2041, 675]) df = pd.DataFrame(rows, columns=columns, index=index) print(df) # 데이터프레임 순회 처리 for i in df.index: print([i, df['KOSPI'][i], df['KOSDAQ'][i]]) # itertuples() 메서드는 데이터프레임의 각 행을 이름있는 튜플 형태로 반환한다. 
for row in df.itertuples(name='KRX'): print(row) for row in df.itertuples(): print(row[0], row[1], row[2]) for idx, row in df.iterrows(): print(idx, row[0], row[1])<file_sep>import matplotlib.pyplot as plt from Investar import Analyzer mk = Analyzer.MarketDB() df = mk.get_daily_price('005930', '2017-07-10', '2018-06-30') plt.figure(figsize=(9,6)) plt.subplot(2, 1, 1) plt.title('Samsung Electronics (Investar Data)') plt.plot(df.index, df['Close'], 'c', label='Close') plt.legend(loc='best') plt.subplot(2, 1, 2) plt.bar(df.index, df['Volume'], color='g', label='Volume') plt.legend(loc='best') plt.show()<file_sep># 그래프 출력 # plot(x, y, 마커 형태 [, label='Label']) from pandas_datareader import data as pdr import yfinance as yf yf.pdr_override() sec = pdr.get_data_yahoo('005930.KS', start='2018-05-04') msft = pdr.get_data_yahoo('MSFT', start='2018-05-04') import matplotlib.pyplot as plt plt.plot(sec.index, sec.Close, 'b', label='Samsung Electronics') plt.plot(msft.index, msft.Close, 'r--', label="Microsoft") plt.legend(loc='best') # 범례를 best로 지정하면, 그래프가 표시되지 않는 부분을 찾아서 적절한 위치에 범례를 표시해준다. plt.show() # 일간 변동률(daily percent change)로 주가 비교하기 # R(오늘 변동률) = ((R(오늘 종가) - R(어제 종가) / R(어제 종가)) * 100 # 위의 수학식을 파이썬 코드로 옮기는 데 시리즈 모듈에서 제공하는 shift() 함수를 사용한다. print(type(sec['Close'])) # 삼성전자 종가 칼럼의 데이터 확인 print(sec['Close']) # shift() 함수는 데이터를 이동시킬 때 사용하는 함수로, 인수로 n을 줄 경우 전체 데이터가 n행씩 뒤로 이동한다. print(sec['Close'].shift(1)) # 위의 수학식을 파이썬 코드로 표현 sec_dpc = (sec['Close'] / sec['Close'].shift(1) - 1) * 100 print(sec_dpc.head()) # 첫 번 째 일간 변동률의 값이 NaN인데, 향후 계산을 위해 NaN을 0으로 변경할 필요가 있다. sec_dpc.iloc[0] = 0 # 인티저 로케이션 인덱서를 사용해서 시리즈의 첫 번째 데이터를 0으로 변경한다. print(sec_dpc.head()) # 마이크로 소프트 데이터 프레임도 동일하게 처리해준다. msft_dpc = (msft['Close'] / msft['Close'].shift(1) - 1) * 100 msft_dpc.iloc[0] = 0 print(msft_dpc.head()) # Histogram은 frequency distribution을 나타내는 그래프로서, 데이터 값 들에 대한 구간별 빈도수를 막대 형태로 나타낸다. 이 때 구간 수를 bins라고 하는데 hist() 함수에서 사용되는 bins의 기본값은 10이다. 빈스에 따라 그래프 모양이 달라지므로 관측한 데이터 특성을 잘 보여주도록 빈스값을 정해야 한다. 
삼성전자 주식 종가의 일간 변동률을 히스토그램으로 출력해보자. 맷플롯립에서 히스토그램은 hist() 함수를 사용한다. 삼성전자의 일간 변동률을 18개 구간으로 나누어 빈도수를 표시한다. import matplotlib.pyplot as plt sec_dpc = (sec['Close'] / sec['Close'].shift(1) - 1) * 100 sec_dpc.iloc[0] = 0 plt.hist(sec_dpc, bins=18) plt.grid(True) plt.show() # 출력된 결과를 보면 삼성전자 일간 변동률 분포가 0 bin을 기준으로 좌우 대칭적이다. 정규분포 형태와 비슷하다. 엄밀히 얘기하자면, 주가 수익률은 정규분포보다 중앙 부분이 더 뾰족하고, 양쪽 꼬리는 더 두터운 것으로 알려져 있다. 이를 각각 급첨분포(leptokurtic distribution)와 팻 테일(fat tail)이라 부른다. # 주가 수익률이 급첨분포를 나타낸다는 것은 정규분포와 비교했을 때 주가의 움직임이 대부분 매우 작은 범위 안에서 발생한다는 것을 의미한다. 그리고 두꺼운 꼬리를 가리키는 팻 테일은 그래프가 좌우 극단 부분에 해당하는 아주 큰 가격 변동이 정규분포보다 더 많이 발생한다는 의미다. 시리즈의 describe() 매서드를 이용하면 평균과 표준편차를 확인할 수 있다. print(sec_dpc.describe()) # 일간 변동률 누적합 구하기 # sec_dpc는 일간 변동률이기 때문에 종목별로 전체적인 변동률을 비교해보려면, 일간 변동률 누적합(Cumulative Sum)을 계산해야 한다. 누적합은 시리즈에서 제공하는 cumsum() 함수를 이용하여 구할 수 있다. sec_dpc_cs = sec_dpc.cumsum() # 일간 변동률의 누적합을 구한다. print(sec_dpc_cs) <file_sep># __name__ 속성 # 명령창에서 파이썬 셸을 이용하여 moduleA 모듈을 실행할 수 있다. 함수 정의를 제외한 나머지 부분이 실행되므로 print('MODULE_A :', __name__) 코드에 의해 'MODULE_A : __main__'이 출력된다. # moduleA 모듈을 직접 실행하지 않고, 파이썬 셸에서 임포트만 해도 print("MODULE_A :", __name__) 코드가 실행되면서 'MODULE_A : myPackage.moduleA'이 출력된다. 이처럼 __name__속성은 단독으로 실행될 때는 '__main__'문자열이 되고, 임포트할 때는 실제 모듈명('myPackage.moduleA')이 된다. <file_sep># 튜플은 리스트처럼 다양한 자료형의 원소를 가지지만, 대괄호 대신 소괄호를 표시하며 원소를 변경할 수 없다. # 튜플은 다른 리스트나 내장함수도 원소로 가질 수 있음 myTuple = ('a', 'b', 'c', [10, 20, 30], abs, max) # 인덱싱을 사용하여 4번째 원소인 리스트를 출력 print(myTuple[3]) # 5번째 원소인 내장함수 abs()에 -100을 파라미터로 전달 print(myTuple[4](-100)) # 6번째 원소인 내장함수 max()에 리스트를 파라미터로 전달 print(myTuple[5](myTuple[3])) # 원소에 대한 변경이 정말 안 되는지 확인해보자. 첫 번째 원소에 'A'를 대입해 보자. '튜플 객체는 원소할당(item assignment)을 지원하지 않는다.'는 메세지와 함께 타입 에러가 발생할 것이다. myTuple[0] = 'A' <file_sep>from pandas_datareader import data as pdr import yfinance as yf yf.pdr_override() # 2000년 이후의 다우존스 지수 데이터를 야후 파이넨스로 부터 다운로드 한다. dow = pdr.get_data_yahoo('^DJI', '2000-01-04') # 2000년 이후의 KOSPI 데이터를 야후 파이넨스로 부터 다운로드 한다. 
kospi = pdr.get_data_yahoo('^KS11', '2000-01-04') import matplotlib.pyplot as plt plt.figure(figsize=(9, 5)) # 다우존스 지수를 붉은 점선으로 출력한다. plt.plot(dow.index, dow.Close, 'r--', label='Dow Jones Industrial') # KOSPI를 푸른 실선으로 출력한다. plt.plot(kospi.index, kospi.Close, 'b', label='KOSPI') plt.grid(True) plt.legend(loc='best') plt.show()<file_sep>import pandas as pd import matplotlib.pyplot as plt import datetime from mplfinance.original_flavor import candlestick_ohlc import matplotlib.dates as mdates from Investar import Analyzer mk = Analyzer.MarketDB() df = mk.get_daily_price('엔씨소프트', '2017-01-01') # 종가의 12주 지수 이동평균에 해당하는 60일 지수 이동평균을 구한다. ema60 = df.close.ewm(span=60).mean() # 종가의 26주 지수 이동평균에 해당하는 130일 지수 이동평균을 구한다. ema130 = df.close.ewm(span=130).mean() # 12주(60일) 지수 이동평균에서 26주(130일) 지수 이동평균을 빼서 MACD(Moving Average Convergence Divergence)선을 구한다. macd = ema60 - ema130 # MACD의 9주(45일) 지수 이동평균을 구해서 신호선으로 저장한다. signal = macd.ewm(span=45).mean() # MACD선에서 신호선을 빼서 MACD 히스토그램을 구한다. macdhist = macd - signal df = df.assign(ema130=ema130, ema60=ema60, macd=macd, signal=signal, macdhist=macdhist).dropna() # 캔들 차트에 사용할 수 있게 날짜(date)형 인덱스를 숫자형을 변환한다. df['number'] = df.index.map(mdates.date2num) ohlc = df[['number', 'open', 'high', 'low', 'close']] # 14일 동안의 최댓값을 구한다. min_periods=1을 지정할 경우, 14일 기간에 해당하는 데이터가 모두 누적되지 않았더라도 최소 기간인 1일 이상의 데이터만 존재하면 최댓값을 구하라는 의미다. ndays_high = df.high.rolling(window=14, min_periods=1).max() # 14일 동안의 최솟값을 구한다. min_periods=1로 지정하면, 14일 치 데이터 모두 누적되지 않았더라도 최소 기간인 1일 이상의 데이터만 존재하면 최솟값을 구하라는 의미다. ndays_low = df.low.rolling(window=14, min_periods=1).min() # 빠른 선 %K를 구한다. fast_k = (df.close - ndays_low) / (ndays_high - ndays_low) * 100 # 3일 동안 %K의 평균을 구해서 느린 선 %D에 저장한다. slow_d = fast_k.rolling(window=3).mean() # %K와 %D로 데이터프레임을 생성한 뒤 결측치는 제거한다. df = df.assign(fast_k=fast_k, slow_d=slow_d).dropna() plt.figure(figsize=(9, 9)) p1 = plt.subplot(3, 1, 1) plt.title('Triple Screen Trading (NCSOFT)') plt.grid(True) # ohlc의 숫자형 일자, 시가, 고가, 저가, 종가 값을 이용해서 캔들 차트를 그린다. 
candlestick_ohlc(p1, ohlc.values, width=6, colorup='red', colordown='blue') p1.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.plot(df.number, df['ema130'], color='c', label='EMA130') for i in range(1, len(df.close)): # 130일 이동 지수평균이 상승하고 %D가 20 아래로 떨어지면 if df.ema130.values[i-1] < df.ema130.values[i] and df.slow_d.values[i-1] >= 20 and df.slow_d.values[i] < 20: # 빨간색 삼각형으로 매수 신호를 표시한다. plt.plot(df.number.values[i], 250000, 'r^') # 130일 이동 지수평균이 하락하고 %D가 80 위로 상승하면 elif df.ema130.values[i-1] > df.ema130.values[i] and df.slow_d.values[i-1] <= 80 and df.slow_d.values[i] > 80: # 파란색 삼각형으로 매수 신호를 표시한다. plt.plot(df.number.values[i], 250000, 'bv') plt.legend(loc='best') p2 = plt.subplot(3, 1, 2) plt.grid(True) p2.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.bar(df.number, df['macdhist'], color='m', label='MACD-Hist') plt.plot(df.number, df['macd'], color='b', label='MACD') plt.plot(df.number, df['signal'], 'g--', label='MACD-Signal') plt.legend(loc='best') p3 = plt.subplot(3, 1, 3) plt.grid(True) p1.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.plot(df.number, df['fast_k'], color='c', label='%K') plt.plot(df.number, df['slow_d'], color='k', label='%D') # Y축 눈금을 0, 20, 80, 100으로 설정하여 스토캐스틱의 기준선을 나타낸다. plt.yticks([0, 20, 80, 100]) plt.legend(loc='best') plt.show()<file_sep>def functionA(): print('FUNCTION_A') def main(): print('MAIN_A :', __name__) if __name__ == '__main__': main()<file_sep>import pandas as pd import matplotlib.pyplot as plt import datetime from mplfinance.original_flavor import candlestick_ohlc import matplotlib.dates as mdates from Investar import Analyzer mk = Analyzer.MarketDB() df = mk.get_daily_price('엔씨소프트', '2017-01-01', '2020-12-01') # 종가의 12주 지수 이동평균에 해당하는 60일 지수 이동평균을 구한다. ema60 = df.close.ewm(span=60).mean() # 종가의 26주 지수 이동평균에 해당하는 130일 지수 이동평균을 구한다. ema130 = df.close.ewm(span=130).mean() # 12주(60일) 지수 이동평균에서 26주(130일) 지수 이동평균을 빼서 MACD(Moving Average Convergence Divergence)선을 구한다. 
macd = ema60 - ema130 # MACD의 9주(45일) 지수 이동평균을 구해서 신호선으로 저장한다. signal = macd.ewm(span=45).mean() # MACD선에서 신호선을 빼서 MACD 히스토그램을 구한다. macdhist = macd - signal df = df.assign(ema130=ema130, ema60=ema60, macd=macd, signal=signal, macdhist=macdhist).dropna() # 캔들 차트에 사용할 수 있게 날짜(date)형 인덱스를 숫자형을 변환한다. df['number'] = df.index.map(mdates.date2num) ohlc = df[['number', 'open', 'high', 'low', 'close']] # 14일 동안의 최댓값을 구한다. min_periods=1을 지정할 경우, 14일 기간에 해당하는 데이터가 모두 누적되지 않았더라도 최소 기간인 1일 이상의 데이터만 존재하면 최댓값을 구하라는 의미다. ndays_high = df.high.rolling(window=14, min_periods=1).max() # 14일 동안의 최솟값을 구한다. min_periods=1로 지정하면, 14일 치 데이터 모두 누적되지 않았더라도 최소 기간인 1일 이상의 데이터만 존재하면 최솟값을 구하라는 의미다. ndays_low = df.low.rolling(window=14, min_periods=1).min() # 빠른 선 %K를 구한다. fast_k = (df.close - ndays_low) / (ndays_high - ndays_low) * 100 # 3일 동안 %K의 평균을 구해서 느린 선 %D에 저장한다. slow_d = fast_k.rolling(window=3).mean() # %K와 %D로 데이터프레임을 생성한 뒤 결측치는 제거한다. df = df.assign(fast_k=fast_k, slow_d=slow_d).dropna() plt.figure(figsize=(9, 7)) p1 = plt.subplot(2, 1, 1) plt.title('Triple Screen Trading - Second Screen (NCSOFT)') plt.grid(True) # ohlc의 숫자형 일자, 시가, 고가, 저가, 종가 값을 이용해서 캔들 차트를 그린다. candlestick_ohlc(p1, ohlc.values, width=6, colorup='red', colordown='blue') p1.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.plot(df.number, df['ema130'], color='c', label='EMA130') plt.legend(loc='best') p1 = plt.subplot(2, 1, 2) plt.grid(True) p1.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.plot(df.number, df['fast_k'], color='c', label='%K') plt.plot(df.number, df['slow_d'], color='k', label='%D') # Y축 눈금을 0, 20, 80, 100으로 설정하여 스토캐스틱의 기준선을 나타낸다. 
plt.yticks([0, 20, 80, 100]) plt.legend(loc='best') plt.show()<file_sep>import numpy as np A = np.array([[1, 2], [3, 4]]) print(A) print(type(A)) print(A.ndim) # 배열의 차원 print(A.shape) # 배열 크기 print(A.dtype) # 원소 자료형 print(A.max(), A.mean(), A.min(), A.sum()) # 배열의 접근 print(A[0]) # A 배열의 첫 번째 행 print(A[1]) # A 배열의 두 번째 행 # A 배열의 원소에 접근하는 데 A[행 인덱스][열 인덱스] 형식을 사용해도 되고, A[행 인덱스, 열 인덱스] 형식을 사용해도 된다. print(A[0][0], A[0][1]) print(A[1][0], A[1][1]) print(A[0, 0], A[0, 1]) print(A[1, 0], A[1, 1]) # 조건에 맞는 원소들만 인덱싱할 수도 있다. print(A[A>1]) # Transpose - 배열의 요소를 주대각선을 기준으로 뒤바꾸는 것 print(A) print(A.T) # A.transpose()와 같다. # Flatten - 다차원 배열을 1차원 배열 형태로 바꾸는 것; 평탄화 print(A) print(A.flatten()) # 배열의 연산 (Array Operations) # 같은 크기의 행렬끼리는 사칙 연산을 할 수 있다. 두 행렬에서 같은 위치에 있는 원소끼리 연산을 하면 된다. print(A + A) # np.add(A, A)와 같다. print(A - A) # np.subtract(A, A)와 같다. print(A * A) # np.multiply(A, A)와 같다. print(A / A) # np.divide(A, A)와 같다. # Broadcasting ; 수학에서는 크기가 같은 행렬끼리만 연산을 할 수 있다. 하지만 넘파이에서는 행렬의 크기가 달라도 연산할 수 있게 크기가 작은 행렬을 확장해 주는데, 이를 브로드캐스팅이라고 한다. 다음은 B 행렬을 A 행렬의 크기에 맞게 브로드캐스팅하여 연산한 예이다. B = np.array([10, 100]) print(A * B) # 내적(inner product)구하기 # 넘파이에서 배열끼리 곱셈연산(*)을 할 때 기본적으로 원소별 연산을 수행하기 때문에 두 배열의 내적 곱을 구하려면 dot() 함수를 사용해야 한다. print(B.dot(B)) # np.dot(B * B) print(A.dot(B))<file_sep># set은 중복이 없는 원소 집합을 나타낸다. s = {'A', 'P', 'P', 'L', 'E'} print(s) # {'A', 'P', 'E', 'L'} # 한 가지 기억할 점은 반드시 우리가 생성한 순서대로 원소가 저장되지는 않는다는 점이다. mySet = {'B', 6, 1, 2} print(mySet) # Set 내부에 특정 원소가 존재하는지 검사하려면 다음과 같이 if ~ in ~ 비교 구문으로 확인하면 된다. Set은 다른 반복 자료형 보다 훨씬 빨리 원소의 존재 여부를 검사 할 수 있다. if 'B' in mySet: print("'B' exists in", mySet) # Set의 원소들은 인덱싱이 불가능한 대신, 원소들의 교집합, 합집합, 차집합을 구할 수 있다. setA = {1, 2, 3, 4, 5} setB = {3, 4, 5, 6, 7} print(setA & setB) # setA.intersection(setB) print(setA | setB) # setA.union(setB) print(setA - setB) # setA.difference(setB) print(setB - setA) # setB.difference(setA) # 아쉽게도 셋은 반복 자료형처럼 리터럴로 원소가 없는 생태에서 생성할 수 없다. s = {}으로 생성하면 실제로는 딕셔너리가 생성되니 주의하자. 빈 셋은 s = set()으로 생성해야 한다. 
ls2 = []  # ls2 = list()와 같은 결과
d = {}    # d = dict()와 같은 결과
t = ()    # t = tuple()과 같은 결과

# 중복 없는 셋의 특징을 이용하면 다음처럼 리스트에서 중복 원소를 간단히 제거할 수 있다.
ls2 = [1, 3, 5, 2, 2, 3, 4, 2, 1, 1, 1, 1, 5]
print(list(set(ls2)))<file_sep># Get an image file by using requests
import requests
url = 'http://bit.ly/2JnsHnT'
r = requests.get(url, stream=True).raw

# Show an image by using pillow
from PIL import Image
img = Image.open(r)
img.show()
img.save('src.png')

# Print file info
print(img.get_format_mimetype())

# copy an image file with 'with ~ as'
BUF_SIZE = 1024
# 1. 원본 이미지 파일(src.png)을 바이너리 읽기 모드로 열어서 sf 파일 객체를 생성하고, 대상 이미지 파일(dst.png)을 바이너리 쓰기 모드로 열어서 df 파일 객체를 생성한다.
with open('src.png', 'rb') as sf, open('dst.png', 'wb') as df:
    while True:
        # 2. sf 파일 객체로부터 1024 바이트씩 읽는다.
        data = sf.read(BUF_SIZE)
        if not data:
            # 3. 읽을 data가 없다면 while 반복문을 빠져나온다.
            break
        # 4. 읽어 온 data를 df 파일 객체에 쓰고 2부터 다시 반복한다.
        df.write(data)

# SHA-256으로 파일 복사 검증하기
# 해시는 암호화폐 지갑의 주소처럼 긴 데이터 값을 입력받아서 고정 길이의 고유한 값으로 변환하는 것이 핵심 기능이다.
import hashlib

# 원본 이미지 파일과 사본 이미지 파일에 대한 해시 객체를 각각 생성한다.
sha_src = hashlib.sha256()
sha_dst = hashlib.sha256()

# 원본 이미지 파일(src.png)을 바이너리 읽기 모드(rb)로 열어서 sf 파일 객체를 생성하고, 사본 이미지 파일(dst.png)을 바이너리 읽기 모드(rb)로 열어서 df 파일 객체를 생성한다.
with open('src.png', 'rb') as sf, open('dst.png', 'rb') as df:
    # sf 파일 객체로부터 전체 내용을 읽어서 원본 이미지에 대한 해시 객체(sha_src)를 업데이트한다. df 파일 객체로부터 전체 내용을 읽어서 사본 이미지에 대한 해시 객체(sha_dst)를 업데이트한다.
    sha_src.update(sf.read())
    sha_dst.update(df.read())

# 원본 이미지 파일과 사본 이미지 파일의 해시값을 16진수로 각각 출력한다.
print("src.png's hash : {}".format(sha_src.hexdigest()))
print("dst.png's hash : {}".format(sha_dst.hexdigest()))<file_sep>"""Investar URL Configuration

The `urlpatterns` list routes URLs to views. For more information please see:
    https://docs.djangoproject.com/en/3.1/topics/http/urls/
Examples:
Function views
    1. Add an import:  from my_app import views
    2. Add a URL to urlpatterns:  path('', views.home, name='home')
Class-based views
    1. Add an import:  from other_app.views import Home
    2.
Add a URL to urlpatterns: path('', Home.as_view(), name='home') Including another URLconf 1. Import the include() function: from django.urls import include, path 2. Add a URL to urlpatterns: path('blog/', include('blog.urls')) """ from django.contrib import admin from hello import views # hello 앱의 views를 임포트 한 뒤 from django.urls import path, re_path # django.urls로 부터 re_path() 함수를 추가적으로 임포트 한다. from index import views as index_views # index 모듈 내의 views를 index_views로 임포트한 후 from balance import views as balance_views # balance 모듈 내의 views를 balance_views로 임포트한 후 urlpatterns = [ path('admin/', admin.site.urls), re_path(r'^(?P<name>[A-Z][a-z]*)$', views.sayHello), # urlpatterns 리스트의 마지막에 hello 앱의 URL에 대한 뷰 처리를 추가한다. path('index/', index_views.main_view), # 제일 마지막 라인에 path() 함수를 추가해서 URLConf를 수정한다. URL이 'index/'이면 index 앱 뷰의 main_view() 함수로 매핑하라는 의미다. path('balance/', balance_views.main_view) # URL이 balance인 경우 balance 앱 뷰의 main_view() 함수로 매핑 될 것이다. ] <file_sep># Arithmetic Operators print(1 + 2) # 3 print(3 - 4) # -1 print(5 * 6) # 30 print(2 ** 8) # 2의 8승. pow(2, 8)와 동일함 print(5 / 3) # 1.6666666666666667 print(5 // 3) # 나눗셈 결과의 몫. 1 print(5 % 3) # 나눗셈 결과의 나머지. 2 <file_sep># BlockRequest() 함수를 사용해서 삼성전자(종목코드 005930)의 현재가와 전일대비 가격을 구하는 예는 다음과 같다. import win32com.client # 주식마스터(StockMst) COM 객체를 생성한다. obj = win32com.client.Dispatch("DsCbo1.stockMst") # SetInputValue() 함수로 조회할 데이터를 삼성전자로 지정한다. obj.SetInputValue(0, 'A005930') # BlockRequest() 함수로 삼성전자에 대한 블록 데이터를 요청한다. obj.BlockRequest() sec = {} # GetHeaderValue() 함수로 현재가 정보(11)를 가져와서 sec 딕셔너리에 넣는다. sec['현재가'] = obj.GetHeaderValue(11) # GetHeaderValue() 함수로 전일대비 가격변동 정보(12)를 가져와서 sec 딕셔너리에 넣는다. 
sec['전일대비'] = obj.GetHeaderValue(12) <file_sep>def function(): print('FUNCTION()') print("MODULE_B:", __name__)<file_sep>import ctypes import win32com.client from slacker import Slacker from datetime import datetime import slack_config # CREON Plus 공동 Object cpStatus = win32com.client.Dispatch('CpUtil.CpCybos') # 시스템 상태 정보 cpTradeUtil = win32com.client.Dispatch('CpTrade.CpTdUtil') # 주문 관련 도구 # CREON Plus 시스템 점검 함수 def check_creon_system(): # 관리자 권한으로 프로세스 실행 여부 if not ctypes.windll.shell32.IsUserAnAdmin(): print('check_creon_system() : admin user -> FAILED') return False # 연결 여부 체크 if (cpStatus.IsConnect == 0): print('check_creon_system() : connect to server -> FAILED') return False # 주문 관련 초기화 if (cpTradeUtil.TradeInit(0) != 0): print('check_creon_system() : init trade -> FAILED') return False return True slack_token = slack_config.token # 7장 장고 웹 서버 구축 및 자동화에서 발급한 토큰을 입력한다. slack = Slacker(slack_token) def dbgout(message): # datetime.now() 함수로 현재 시간을 구한 후 [월/일 시:분:초] 형식으로 출력한 후 한 칸 띄우고 함수 호출 시 인수로 받은 message 문자열을 출력한다. print(datetime.now().strftime('[%m/%d %H:%M:%S]'), message) strbuf = datetime.now().strftime('[%m/%d %H:%M:%S]') + message # etf-algo-trading 채널로 메세지를 보내려면 워크스페이스에 etf-algo-trading 채널을 미리 만들어 둬야 한다. 별도의 채널을 만들기 싫다면 #etf-algo-trading 대신 #general을 인수로 주어 일반 채널로 메시지를 보내도 된다. slack.chat.post_message('#etf-algo-trading', strbuf) cpStock = win32com.client.Dispatch("DsCbo1.StockMst") # 주식 종목별 정보 def get_current_price(code): cpStock.SetInputValue(0, code) # 종목코드에 대한 가격 정보 cpStock.BlockRequest() item = {} item['cur_price'] = cpStock.GetHeaderValue(11) # 현재가 item['ask'] = cpStock.GetHeaderValue(16) # 매수호가 item['bid'] = cpStock.GetHeaderValue(17) # 매도호가 return item['cur_price'], item['ask'], item['bid']<file_sep># 앞에서 셋이 다른 반복자료형 보다 검색 시간이 훨씬 빠르다고 했는데, 이 말이 사실인지 실제로 확인해 보자. 일반적으로 파이썬 프로그램 성능 측정에 표준 라이브러리 timeit을 사용한다. 
#timeit(테스트 구분, setup=테스트 준비 구문, number=테스트 반복 횟수) # 순회 속도 비교하기 # setup 구문에서 0부터 9999까지 정수 1만개를 원소로 갖는 리스트, 튜플, 셋을 생성한 후, 각 반복 자료형 별로 모든 원소를 처음부터 끝까지 순회(for ~ in ~)하는 동작을 1000번 반복하는 데 걸릴 총 시간을 비교 해 보자. import timeit iteration_test = """ for i in itr : pass """ # list print(timeit.timeit(iteration_test, setup='itr = list(range(10000))', number=10000)) # tuple print(timeit.timeit(iteration_test, setup='itr = tuple(range(10000))', number=10000)) # set print(timeit.timeit(iteration_test, setup='itr = set(range(10000))', number=10000)) # 검색속도 비교하기 # 다음은 표준 라이브러리인 random 모듈의 randint() 함수를 이용하여 0 이상 9999 이하의 임의의 난수를 생성한 후, 0부터 9999까지 정수 1만 개로 구성된 반복 자료형에 존재하는지 검색(if ~ in ~)하는 코드다. search_test = """ import random x = random.randint(0, len(itr)-1) if x in itr: pass """ #임의의 난수를 검색하는 작업을 1000 번씩 반복해서 수행한 결과, 셋의 검색 속도가 리스트나 튜플보다 월등히 빠른 것으로 나타났다. # set print(timeit.timeit(search_test, setup='itr = set(range(10000))', number=10000)) # list print(timeit.timeit(search_test, setup='itr = list(range(10000))', number=10000)) #tuple print(timeit.timeit(search_test, setup='itr = tuple(range(10000))', number=10000))<file_sep># 고가, 저가, 종가의 합을 3으로 나눠서 중심 가격 TP(Typical Price)를 구한다. df['TP'] = (df['high'] + df['low'] + df['close']) / 3 df['PMF'] = 0 df['NMF'] = 0 # range 함수는 마지막 값을 포함하지 않으므로 0부터 종가 개수 -2까지 반복한다. for i in range(len(df.close)-1): # i번째 중심 가격보다 i+1번째 중심 가격이 높으면 if df.TP.values[i] < df.TP.values[i+1]: # i+1번째 중심 가격과 i+1번째 거래량의 곱을 i+1번째 긍정적 현금 흐름 PMF(Positive Money Flow)에 저장한다. df.PMF.values[i+1] = df.TP.values[i+1] * df.volume.values[i+1] # i+1번째 부정적 현금 흐름 NMF(Negative Money Flow)값은 0으로 저장한다. df.NMF.values[i+1] = 0 else: df.NMF.values[i+1] = df.TP.values[i+1] * df.volume.values[i+1] df.PMF.values[i+1] =0 # 10일 동안의 긍정적 현금 흐름의 합을 10일 동안의 부정적 현금 흐름의 합으로 나눈 결과를 현금 흐름 비율 MFR(Money Flow Ratio) 칼럼에 저장한다. df['MFR'] = df.PMF.rolling(window=10).sum() / df.NMF.rolling(window=10).sum() # 10일 기준으로 현금흐름지수를 계산한 결과를 MFI10(Money Flow Index 10)칼럼에 저장한다. 
df['MFI10'] = 100 - 100 / (1 + df['MFR'])<file_sep># 최근 유전자 가위 기술(CRISPR)로 각광을 받는 미국 대표 기업들을 딕셔너리로 나타내면 다음과 같다. crispr = {'EDIT': 'Editas Medicine', 'NTLA': 'Intellia Therapeutics'} # 순서가 없으므로 시퀀스 자료형들처럼 인덱스로 값에 접근하는 것은 불가능하다. 만일 다음처럼 인덱스 숫자로 원소에 접근하려고 하면, 인터프리터는 이를 키로 처리하기 때문에 키 에러가 발생한다. # print(crispr[1]) # 순서가 없으므로 인덱스 숫자로 접근할 수 없다. print(crispr['NTLA']) # 원소를 추가하고 싶다면, 다음과 같이 키와 값을 함께 지정하여야 한다. 다음은 CRSP 키에 'CRISPR Therapeutics'를 값으로 넣는 예제다. crispr['CRSP'] = 'CRISPR Therapeutics' print(crispr) # CRSP 종목을 추가했으므로 총 원소 개수는 3개다. print(len(crispr))<file_sep># 리스트 복사 # 리스트는 문자열과 마찬가지로 인텍싱과 슬라이싱이 가능하며, len() 함수를 비롯한 여러 내장 함수를 사용할 수 있다. [:]를 사용하면 리스트를 복사할 수 있다. myList = ['Thoughts', 'become', 'things.'] newList = myList[:] # 전체 영역 슬라이싱 [:]은 새로운 리스트를 반환한다. print(newList) # [:]를 사용하여 리스트를 복사한 경우, 새로운 리스트를 변경하더라도 기존 리스트는 변경되지 않는다. newList[-1] = 'actions.' print(newList) print(myList) # 리스트 내포(comprehension) # 내포 기능은 파이썬이 다른 언어들에 비해 얼마나 더 고수준 언어 인지를 보여준다. 내포 기능을 사용하면 리스트, 딕셔너리, 셋 같은 열거형 객체의 전체 또는 일부 원소를 변경하여 새로운 열거형 객체를 생성할 수 있다. 보통 다른 언어에서는 이런 작업을 for 반복문으로 처리하는데, 파이썬의 내포 기능을 사용하면 훨씬 간결하게 처리할 수 있다. # 다음은 for 반복문을 사용해서 리스트의 모든 원소에 대하여 제곱을 구하는 예제다. nums = [1, 2, 3, 4, 5] squares = [] for x in nums: squares.append(x ** 2) print(squares) # 위의 코드는 리스트 내포를 사용하여 더 간단히 바꿀 수 있다. nums2 = [1, 2, 3, 4, 5] squares2 = [x ** 2 for x in nums2] print(squares2) # 특히, 리스트에서 조건에 맞는 원소만 골라서 가공한 뒤 새로운 리스트로 생성할 때 편리하게 사용할 수 있으므로 익혀두자. 다음은 결과가 짝수일 때만 원소로 저장하는 코드다. 
nums3 = [1, 2, 3, 4, 5]
squares3 = [x ** 2 for x in nums3 if x % 2 == 0]
print(squares3)<file_sep>import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from Investar import Analyzer

mk = Analyzer.MarketDB()
stocks = ['삼성전자', 'SK하이닉스', '현대자동차', 'NAVER']
df = pd.DataFrame()
for s in stocks:
    df[s] = mk.get_daily_price(s, '2016-01-04', '2018-04-27')['close']

daily_ret = df.pct_change()
annual_ret = daily_ret.mean() * 252
daily_cov = daily_ret.cov()
annual_cov = daily_cov * 252

port_ret = []
port_risk = []
port_weights = []
sharpe_ratio = []

for _ in range(20000):
    weights = np.random.random(len(stocks))
    weights /= np.sum(weights)
    returns = np.dot(weights, annual_ret)
    risk = np.sqrt(np.dot(weights.T, np.dot(annual_cov, weights)))
    port_ret.append(returns)
    port_risk.append(risk)
    port_weights.append(weights)
    # 포트폴리오의 수익률을 리스크로 나눈 값을 샤프 지수 리스트에 추가한다.
    sharpe_ratio.append(returns/risk)

portfolio = {'Returns': port_ret, 'Risk': port_risk, 'Sharpe': sharpe_ratio}
for i, s in enumerate(stocks):
    # 시뮬레이션된 각 포트폴리오(weight)의 i번째 종목 비중을 칼럼으로 추가한다.
    portfolio[s] = [weight[i] for weight in port_weights]
df = pd.DataFrame(portfolio)
# 샤프 지수 칼럼을 데이터프레임에 추가한다. 생성된 데이터프레임은 다음과 같다.
df = df[['Returns', 'Risk', 'Sharpe'] + [s for s in stocks]]

# 샤프 지수 칼럼에서 샤프 지숫값이 제일 큰 행을 max_sharpe로 정한다.
max_sharpe = df.loc[df['Sharpe'] == df['Sharpe'].max()]
# 리스크 칼럼에서 리스크 값이 제일 작은 행을 min_risk로 정한다.
min_risk = df.loc[df['Risk'] == df['Risk'].min()]
print(max_sharpe)
print(min_risk)

# 포트폴리오의 샤프 지수에 따라 컬러맵을 'viridis'로 표시하고 테두리는 검정(k)으로 표시한다.
df.plot.scatter(x='Risk', y='Returns', c='Sharpe', cmap='viridis', edgecolors='k', figsize=(11, 7), grid=True)
# 샤프 지수가 가장 큰 포트폴리오를 300 크기의 붉은 별표(*)로 표시한다.
plt.scatter(x=max_sharpe['Risk'], y=max_sharpe['Returns'], c='r', marker='*', s=300)
# 리스크가 제일 작은 포트폴리오를 200 크기의 붉은 엑스표(X)로 표시한다.
plt.scatter(x=min_risk['Risk'], y=min_risk['Returns'], c='r', marker='X', s=200) plt.title('Portfolio Optimization') plt.xlabel('Risk') plt.ylabel('Expected Returns') plt.show() <file_sep>import matplotlib.pyplot as plt from Investar import Analyzer mk = Analyzer.MarketDB() df = mk.get_daily_price('NAVER', '2019-01-02') # 20개 종가를 이용해서 평균을 구한다. df['MA20'] = df['close'].rolling(window=20).mean() # 20개 종가를 이용해서 표준편차를 구한 뒤 stdde 칼럼으로 df에 추가한다. df['stddev'] = df['close'].rolling(window=20).std() # 중간 볼린저 밴드 + (2 x 표준편차)를 상단 볼린저 밴드로 계산한다. df['upper'] = df['MA20'] + (df['stddev'] * 2) # 중간 볼린저 밴드 - (2 x 표준편차)를 하단 볼린저 밴드로 계산한다. df['lower'] = df['MA20'] - (df['stddev'] * 2) # (종가 - 하단 밴드) / (상단 밴드 - 하단 밴드)를 구해 %B 칼럼을 생성한다. df['PB'] = (df['close'] - df['lower']) / (df['upper'] - df['lower']) # (상단 밴드 - 하단 밴드) / 중간 밴드 X 100을 구해 bandwidth(밴드폭) 칼럼을 생성한다. # df['bandwidth'] = (df['upper'] - df['lower']) / df['MA20'] * 100 # 고가, 저가, 종가의 합을 3으로 나눠서 중심 가격 TP(Typical Price)를 구한다. df['TP'] = (df['high'] + df['low'] + df['close']) / 3 df['PMF'] = 0 df['NMF'] = 0 # range 함수는 마지막 값을 포함하지 않으므로 0부터 종가 개수 -2까지 반복한다. for i in range(len(df.close)-1): # i번째 중심 가격보다 i+1번째 중심 가격이 높으면 if df.TP.values[i] < df.TP.values[i+1]: # i+1번째 중심 가격과 i+1번째 거래량의 곱을 i+1번째 긍정적 현금 흐름 PMF(Positive Money Flow)에 저장한다. df.PMF.values[i+1] = df.TP.values[i+1] * df.volume.values[i+1] # i+1번째 부정적 현금 흐름 NMF(Negative Money Flow)값은 0으로 저장한다. df.NMF.values[i+1] = 0 else: df.NMF.values[i+1] = df.TP.values[i+1] * df.volume.values[i+1] df.PMF.values[i+1] =0 # 10일 동안의 긍정적 현금 흐름의 합을 10일 동안의 부정적 현금 흐름의 합으로 나눈 결과를 현금 흐름 비율 MFR(Money Flow Ratio) 칼럼에 저장한다. df['MFR'] = df.PMF.rolling(window=10).sum() / df.NMF.rolling(window=10).sum() # 10일 기준으로 현금흐름지수를 계산한 결과를 MFI10(Money Flow Index 10)칼럼에 저장한다. df['MFI10'] = 100 - 100 / (1 + df['MFR']) # 위는 19번째 행까지 NaN 이므로 값이 있는 20번째 행부터 사용한다. df = df[19:] plt.figure(figsize=(9, 8)) # 기존의 볼린저 밴드 차트를 2행 1열의 그리드에서 1열에 배치한다. 
plt.subplot(2, 1, 1)
# x좌표 df.index에 해당하는 종가를 y좌표로 설정해 파란색(#0000ff) 실선으로 표시한다.
plt.plot(df.index, df['close'], color='#0000ff', label='Close')
# x좌표 df.index에 해당하는 상단 볼린저 밴드값을 y좌표로 설정해 붉은 점선(r--)으로 표시한다.
plt.plot(df.index, df['upper'], 'r--', label='Upper band')
plt.plot(df.index, df['MA20'], 'k--', label='Moving average 20')
plt.plot(df.index, df['lower'], 'c--', label='Lower band')
# 상단 볼린저 밴드와 하단 볼린저 밴드 사이를 회색으로 칠한다.
plt.fill_between(df.index, df['upper'], df['lower'], color='0.9')
# plt.title('NAVER Bollinger Band (20 day, 2 std)')
for i in range(len(df.close)):
    # %b가 0.8보다 크고 10일 기준 MFI가 80보다 크면
    if df.PB.values[i] > 0.8 and df.MFI10.values[i] > 80:
        # 매수 시점을 나타내기 위해 첫 번째 그래프의 종가 위치에 빨간색 삼각형을 표시한다.
        plt.plot(df.index.values[i], df.close.values[i], 'r^')
    # %b가 0.2보다 작고 10일 기준 MFI가 20보다 작으면
    elif df.PB.values[i] < 0.2 and df.MFI10.values[i] < 20:
        # 매도 시점을 나타내기 위해 첫 번째 그래프의 종가 위치에 파란색 삼각형을 표시한다.
        plt.plot(df.index.values[i], df.close.values[i], 'bv')
plt.legend(loc='best')

# %B 차트를 2행 1열의 그리드에서 2행에 배치한다.
plt.subplot(2, 1, 2)
# x좌표 df.index에 해당하는 %b 값을 y좌표로 설정해 파란(b) 실선으로 표시한다.
# plt.plot(df.index, df['PB'], color='b', label='%B')
# x좌표 df.index에 해당하는 bandwidth값을 y좌표로 설정해 자홍(magenta) 실선으로 표시한다.
# plt.plot(df.index, df['bandwidth'], color='m', label='Bandwidth')
# MFI와 비교할 수 있게 %b를 그대로 표시하지 않고 100을 곱해서 푸른색 실선으로 표시한다.
plt.plot(df.index, df['PB'] * 100, 'b', label='%B X 100')
# 10일 기준 MFI를 녹색의 점선으로 표시한다.
plt.plot(df.index, df['MFI10'], 'g--', label='MFI(10 day)')
# y축 눈금을 -20부터 120까지 20단위로 표시한다.
plt.yticks([-20, 0, 20, 40, 60, 80, 100, 120])
for i in range(len(df.close)):
    if df.PB.values[i] > 0.8 and df.MFI10.values[i] > 80:
        plt.plot(df.index.values[i], 0, 'r^')
    elif df.PB.values[i] < 0.2 and df.MFI10.values[i] < 20:
        plt.plot(df.index.values[i], 0, 'bv')
plt.grid(True)
plt.legend(loc='best')
plt.show()<file_sep>from datetime import datetime
import backtrader as bt

# bt.Strategy 클래스를 상속받아서 MyStrategy 클래스를 작성한다.
class MyStrategy(bt.Strategy):
    def __init__(self):
        # RSI 지표를 사용하려면 MyStrategy 클래스 생성자에서 RSI 지표로 사용할 변수를 저장한다.
        self.rsi = bt.indicators.RSI(self.data.close)

    # next() 메서드는 주어진 데이터와 지표(indicator)를 만족시키는 최소 주기마다 자동으로 호출된다. 시장에 참여하고 있지 않을 때 RSI가 30 미만이면 매수하고, 시장에 참여하고 있을 때 RSI가 70을 초과하면 매도하도록 구현한다.
    def next(self):
        if not self.position:
            if self.rsi < 30:
                self.order = self.buy()
        else:
            if self.rsi > 70:
                self.order = self.sell()

# Cerebro 클래스는 백트레이더의 핵심 클래스로서, 데이터를 취합하고 백테스트 또는 라이브 트레이딩을 실행한 뒤 그 결과를 출력하는 기능을 담당한다.
cerebro = bt.Cerebro()
cerebro.addstrategy(MyStrategy)
# 엔씨소프트(036570.KS)의 종가 데이터는 야후 파이낸스 데이터를 이용해서 취합한다.
data = bt.feeds.YahooFinanceData(dataname='036570.KS',
    fromdate=datetime(2017, 1, 1), todate=datetime(2019, 12, 1))
cerebro.adddata(data)
# 초기 투자 자금을 천만 원으로 설정한다.
cerebro.broker.setcash(10000000)
# 엔씨소프트 주식의 매매 단위는 30주로 설정한다. 보유한 현금에 비해 매수하려는 주식의 총 매수 금액(주가 * 매매 단위)이 크면 매수가 이루어지지 않음에 유의하자.
cerebro.addsizer(bt.sizers.SizerFix, stake=30)
print(f'Initial Portfolio Value : {cerebro.broker.getvalue():,.0f} KRW')
# Cerebro 클래스로 백테스트를 실행한다.
cerebro.run()
print(f'Final Portfolio Value : {cerebro.broker.getvalue():,.0f} KRW')
# 백테스트 결과를 차트로 출력한다.
cerebro.plot()<file_sep># lambda는 이름 없는 간단한 함수를 만들 때 사용한다. 'lambda 인수 : 표현식' 형태로 사용하며, 아래에 선언된 함수 객체와 비슷하게 동작한다.
"""
def lambda ( 인수 ):
    return 표현식
"""
# 람다로 천 단위 숫자에 쉼표를 삽입해 보자.
insertComma = lambda x : format(x, ',')
print(insertComma(1234567890))
<file_sep>import matplotlib.pyplot as plt
import matplotlib.image as mpimg

dst_img = mpimg.imread('dst.png')
print(dst_img)

pseudo_img = dst_img[:, :, 0]
# RGB 채널 중 첫 번째 채널(0)만 슬라이싱해서 저장한 관계로 부동소수점 데이터가 2차원으로 변경되었다.
print(pseudo_img)

# subplot() 함수로 두 이미지를 나란히 표시하자.
plt.suptitle('Image Processing', fontsize=18)
plt.subplot(1, 2, 1)  # 인수를 1, 2, 1처럼 쉼표로 구분해 넘겨주면 1행 2열의 행렬에서 첫 번째(1) 그림을 설정하는 것이다.
plt.title('Original Image')
plt.imshow(mpimg.imread('src.png'))

plt.subplot(122)  # 인수를 쉼표 구분 없이 모두 붙여서 전달할 수도 있다. 예를 들어 1행 2열의 행렬에서 두 번째 그림을 설정할 때는 인수를 122처럼 숫자를 모두 붙여 넘겨줄 수도 있다.
plt.title('Pseudocolor Image') dst_img = mpimg.imread('dst.png') pseudo_img = dst_img [:, :, 0] plt.imshow(pseudo_img) plt.show()<file_sep># 파이썬은 함수 호출이 끝나고 결과값을 반환할 때, 여러 결과값을 한꺼번에 반환할 수 있다. 여거 결과값은 기본적으로 튜플 객체로 변환되어 반환된다. def myFunc(): var1 = 'a' var2 = [1, 2, 3] var3 = max return var1, var2, var3 # 여러 개의 결과값은 기본적으로 튜플 타입으로 반환된다. print(myFunc()) # 함수 결과값을 튜플 객체 하나로 받지 않고, 함수에서 반환한 순서대로 여러 객체로 나누어 받으려면 변수를 쉼표로 구분하여 받으면 된다. s, l, f = myFunc() print(s) print(l) print(f)<file_sep># 함수를 정의할 때 반환값을 지정하지 않으면 None을 반환한다. 따라서 아래 세 함수는 모두 None을 반환한다. 참고로 다음처럼 세미콜론;을 사용하면 여러 줄 명령을 한 줄에 작성해 실행할 수 있다. def func1(): pass def func2(): return def func3(): return None print(func1()); print(func2()); print(func3()) # None은 NoneType 클래스의 객체이며, None이 반환되었는지 확인하려면 == 또는 is 연산자를 사용한다. print(type(None)) print(func1() == None) print(func1() is None)<file_sep>from pandas_datareader import data as pdr import yfinance as yf yf.pdr_override() sec = pdr.get_data_yahoo('005930.KS', start='2018-05-04') sec_dpc = (sec['Close']-sec['Close'].shift(1)) / sec['Close'].shift(1) * 100 sec_dpc.iloc[0] = 0 # 일간 변동률의 첫 번째 값인 Nan을 0으로 변경한다. sec_dpc_cs = sec_dpc.cumsum() # 일간 변동률의 누적합을 구한다. print(sec_dpc_cs) msft = pdr.get_data_yahoo('MSFT', start='2018-05-04') msft_dpc= (msft['Close']-msft['Close'].shift(1)) / msft['Close'].shift(1) * 100 msft_dpc.iloc[0] = 0 msft_dpc_cs = msft_dpc.cumsum() print(msft_dpc_cs) import matplotlib.pyplot as plt plt.plot(sec.index, sec_dpc_cs, 'b', label='Samsung Electronics') plt.plot(msft.index, msft_dpc_cs, 'r--', label='Microsoft') plt.ylabel('Change %') plt.grid(True) plt.legend(loc='best') plt.show()<file_sep># Python 문자열의 인덱스 # P y t h o n # 0 1 2 3 4 5 # -6 -5 -4 -3 -2 -1 word = "python" print(word[0:6]) print(word[0:]) print(word[:6]) # 5번째 인덱스에 해당하는 문자(n)까지 표시된다. print(word[:-1]) # -2번째 인덱스에 해당하는 문자(o)까지 표시된다. 
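The slicing rules in the index table above can be checked directly. A minimal sketch (the variable name is from the file above):

```python
# Demonstrates that omitted slice bounds and negative indices
# are equivalent ways to address the same substring.
word = "python"

# Omitting a bound defaults to the start (0) or the end (len(word)).
assert word[0:6] == word[:] == "python"
assert word[2:] == word[2:len(word)] == "thon"

# A negative index counts from the right: -1 is the last character.
assert word[:-1] == word[:5] == "pytho"
assert word[-6:-3] == word[0:3] == "pyt"

print(word[:-1])  # pytho
```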
<file_sep>import pandas as pd import matplotlib.pyplot as plt import datetime from mplfinance.original_flavor import candlestick_ohlc import matplotlib.dates as mdates from Investar import Analyzer mk = Analyzer.MarketDB() df = mk.get_daily_price('엔씨소프트', '2017-01-01', '2020-12-01') # 종가의 12주 지수 이동평균에 해당하는 60일 지수 이동평균을 구한다. ema60 = df.close.ewm(span=60).mean() # 종가의 26주 지수 이동평균에 해당하는 130일 지수 이동평균을 구한다. ema130 = df.close.ewm(span=130).mean() # 12주(60일) 지수 이동평균에서 26주(130일) 지수 이동평균을 빼서 MACD(Moving Average Convergence Divergence)선을 구한다. macd = ema60 - ema130 # MACD의 9주(45일) 지수 이동평균을 구해서 신호선으로 저장한다. signal = macd.ewm(span=45).mean() # MACD선에서 신호선을 빼서 MACD 히스토그램을 구한다. macdhist = macd - signal df = df.assign(ema130=ema130, ema60=ema60, macd=macd, signal=signal, macdhist=macdhist).dropna() # 캔들 차트에 사용할 수 있게 날짜(date)형 인덱스를 숫자형을 변환한다. df['number'] = df.index.map(mdates.date2num) ohlc = df[['number', 'open', 'high', 'low', 'close']] plt.figure(figsize=(9, 7)) p1 = plt.subplot(2, 1, 1) plt.title('Triple Screen Trading - First Screen (NCSOFT)') plt.grid(True) # ohlc의 숫자형 일자, 시가, 고가, 저가, 종가 값을 이용해서 캔들 차트를 그린다. 
candlestick_ohlc(p1, ohlc.values, width=6, colorup='red', colordown='blue') p1.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.plot(df.number, df['ema130'], color='c', label='EMA130') plt.legend(loc='best') p2 = plt.subplot(2, 1, 2) plt.grid(True) p2.xaxis.set_major_formatter(mdates.DateFormatter('%Y-%m')) plt.bar(df.number, df['macdhist'], color='m', label='MACD-Hist') plt.plot(df.number, df['macd'], color='b', label='MACD') plt.plot(df.number, df['signal'], 'g--', label='MACD-Signal') plt.legend(loc='best') plt.show()<file_sep>crispr = {'EDIT': 'Editas Medicine', 'NTLA': 'Intellia Therapeutics', 'CRSP': 'CRISPR Therapeutics'} # % 기호 방식 for x in crispr: print('%s : %s' % (x, crispr[x])) # old school format # {} 기호 방식 for x in crispr: print('{} : {}'.format(x, crispr[x])) # new school format # f-string 방식 for x in crispr: print(f'{x} : {crispr[x]}') # brand new school format <file_sep>from slacker import Slacker import slack_config slack_token = slack_config.token slack = Slacker(slack_token) markdown_text = ''' This message is plain. *This message is bold.* 'This message is code.' _This message is italic._ ~This message is strike.~ ''' attach_dict = { 'color' : '#ff0000', 'author_name' : 'INVESTAR', 'author_link' : 'github.com/yeonhodev', 'title' : '오늘의 증시 KOSPI', 'title_link' : 'http://finance.naver.com/sise/sise_index.nhn?code=KOSPI', 'text' : '2,326.13 △11.89 (+0.51%)', 'image_url' : 'https://ssl.pstatic.net/imgstock/chart3/day/KOSPI.png' } attach_list = [attach_dict] slack.chat.post_message(channel="#general", text=markdown_text, attachments=attach_list)<file_sep>for i in [1, 2, 3]: print(i) # range(시작값, 멈춤값, 증가값) : 멈춤값으로 주어진 수는 반복할 범위에 포함되지 않는다. for i in range(1, 7, 2): print(i) # enumerate([반복자료형], 인덱스의_시작값) : 각각의 반복과정에서 아이템 인덱스를 구할수 있어서 편리하다. 시작값을 생략하면 첫번째 인덱스는 0부터 시작한다. 
FAANG = ['FB', 'AMZN', 'AAPL', 'NFLX', 'GOOGL'] for idx, symbol in enumerate(FAANG, 1): print(idx, symbol)<file_sep>import backtrader as bt from datetime import datetime class MyStrategy(bt.Strategy): def __init__(self): self.dataclose = self.datas[0].close self.order = None self.buyprice = None self.buycomm = None self.rsi = bt.indicators.RSI_SMA(self.data.close, period=21) # 기존 코드에 비해서 가장 큰 변화는 MyStrategy 클래스에 notify_order() 메서드가 추가되었다는 점이다. 이 메서드는 주문(order) 상태에 변화가 있을 때마다 자동으로 실행된다. 인수로 주문(order)를 객체를 넘겨 받는다. 주문 상태는 완료(Completed). 취소(Cancelled) def notify_order(self, order): if order.status in [order.Submitted, order.Accepted]: return # 주문 상태가 완료(Completed)이면 매수인지 매도인지 확인하여 상세 주문 정보를 출력한다. 주문 처리 관련 코드는 기존과 같다. 단지 주문 상태를 출력해주는 기능을 추가했다. if order.status in[order.Completed]: if order.isbuy(): self.log(f'BUY : 주가 {order.executed.price:,.0f}, 수량 {order.executed.size:,.0f}, 수수료 {order.executed.comm:,.0f}, 자산 {cerebro.broker.getvalue():,.0f}') self.buyprice = order.executed.price self.buycomm = order.executed.comm else: self.log(f'SELL : 주가 {order.executed.price:,.0f}, 수량 {order.executed.size:,.0f}, 수수료 {order.executed.comm:,.0f}, 자산 {cerebro.broker.getvalue():,.0f}') self.bar_executed = len(self) elif order.status in [order.Cancelled]: self.log('ORDER CANCELLED') elif order.status in [order.Margin]: self.log('ORDER MARGIN') elif order.status in [order.Rejected]: self.log('ORDER REJECTED') self.order = None def next(self): if not self.position: if self.rsi < 30: self.order = self.buy() else: if self.rsi > 70: self.order = self.sell() # log() 메서드는 텍스트 메세지를 인수로 받아서 셀 화면에 주문 일자와 함께 출력하는 역할을 한다. def log(self, txt, dt=None): dt = self.datas[0].datetime.date(0) print(f'[{dt.isoformat()}] {txt}') cerebro = bt.Cerebro() cerebro.addstrategy(MyStrategy) data = bt.feeds.YahooFinanceData(dataname='036570.KS', fromdate=datetime(2017, 1, 1), todate=datetime(2019, 12, 1)) cerebro.adddata(data) cerebro.broker.setcash(10000000) # 수수료(commission)는 매수나 매도가 발생할 때마다 차감된다. 
우리나라는 주식을 매도할 때 0.25%를 증권거래세로 내야 하고, 증권회사별로 다르긴 하지만 주식을 매수하거나 매도할 때 일반적으로 0.015%를 증권거래수수료로 내야 한다. 즉, 주식을 한 번 거래(매수/매도)할 때 대략 0.28% 비용이 소요된다. 백트레이더에서는 매수와 매도 시점마다 수수료가 동일 비율로 두 번 차감되므로, 0.28%를 2로 나누어 수수료를 0.14%로 설정했다. cerebro.broker.setcommission(commission=0.0014) # 사이즈(size)는 매매 주문을 적용할 주식수를 나타내며, 특별히 지정하지 않으면 1이다. PercentSizer를 사용하면 포트폴리오 자산에 대한 퍼센트로 지정할 수 있는데, 100으로 지정하면 수수료를 낼 수 없어서 ORDER MARGIN이 발생하므로, 수수료를 차감한 퍼센트로 저장해야 한다. cerebro.addsizer(bt.sizers.PercentSizer, percents=90) print(f'Initial Portfolio Value : {cerebro.broker.getvalue():,.0f} KRW') cerebro.run() print(f'Final Portfolio Value : {cerebro.broker.getvalue():,.0f} KRW') # 주가를 표시할 때 캔들스틱 차트로 표시한다. cerebro.plot(style='candlestick')
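The RSI_SMA indicator used above can be approximated in plain pandas. This is a sketch under assumptions: it uses a simple moving average of gains and losses over the same 21-day period, which mirrors the spirit of `bt.indicators.RSI_SMA` but is not guaranteed to match backtrader's values exactly (backtrader's default RSI uses Wilder's smoothing).

```python
import pandas as pd

def rsi_sma(close: pd.Series, period: int = 21) -> pd.Series:
    """SMA-based RSI: 100 - 100 / (1 + avg_gain / avg_loss)."""
    delta = close.diff()
    gain = delta.clip(lower=0)        # positive moves only
    loss = -delta.clip(upper=0)       # negative moves, as positive numbers
    avg_gain = gain.rolling(period).mean()
    avg_loss = loss.rolling(period).mean()
    rs = avg_gain / avg_loss
    return 100 - 100 / (1 + rs)

# Synthetic prices: a steady down leg followed by a steady up leg,
# so RSI should sit near 0 in the first leg and near 100 in the second.
prices = pd.Series([100 - i for i in range(30)] + [70 + i for i in range(30)])
rsi = rsi_sma(prices, period=21)
print(rsi.iloc[25], rsi.iloc[-1])
```

With the buy-below-30 / sell-above-70 rule in the strategy above, such a series would trigger a buy during the down leg and a sell during the up leg.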
6654daff4ec58de24bbd4f0d59c7a960ae6b80ec
[ "Python" ]
51
Python
yeonhodev/python_stock_trading
70983a2a4bdd6eb7aacabd02c7ef828a848de3f6
144f137a71b5264b13f10670e43cb492226ad4c5
refs/heads/master
<file_sep>package org.androidtown.recycler import android.support.v7.widget.RecyclerView import android.view.LayoutInflater import android.view.View import android.view.ViewGroup import java.util.ArrayList import kotlinx.android.synthetic.main.recyclerview_item.view.* //RecyclerView.adapter를 상속받은 Class// Main에서 mainRecyclerView.adapter로 mapping시킨다 class MyRecyclerViewAdapter : RecyclerView.Adapter<RecyclerView.ViewHolder>() { //아이템들의 배열 var scentMembers = ArrayList<ScentMember>() init { scentMembers.add(ScentMember(R.drawable.img_01, "Poppy", "#poppy", "#depression")) scentMembers.add(ScentMember(R.drawable.img2, "Freesia", "#Freesia", "#depression")) scentMembers.add(ScentMember(R.drawable.img_03, "Jabong", "#Jabong", "#depression")) scentMembers.add(ScentMember(R.drawable.img_04, "Lemon", "#Lemon", "#depression")) scentMembers.add(ScentMember(R.drawable.img_05, "Lilac", "#Lilac", "#depression")) scentMembers.add(ScentMember(R.drawable.img_06, "Pine", "#Pine", "#depression")) scentMembers.add(ScentMember(R.drawable.img_07, "Jasmine", "#Jasmine", "#depression")) scentMembers.add(ScentMember(R.drawable.img_08, "Lavandula", "#Lavandula", "#depression")) } override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): RecyclerView.ViewHolder { //XML 디자인 한 부분 적용 var view = LayoutInflater.from(parent.context).inflate(R.layout.recyclerview_item, parent, false) return ItemCell(view) } override fun onBindViewHolder(holder: RecyclerView.ViewHolder, position: Int) { //XML 디자인한 부분에 안에 내용 변경 (image, name, info1, info2) if ( position == 0 ) { var Size: Int = scentMembers.size (holder as ItemCell).title.text = "My Scent " holder.scentSize.text = "$Size" holder.scentName.maxHeight = 0 holder.scentInfo1.maxHeight = 0 holder.scentInfo2.maxHeight = 0 } else { (holder as ItemCell).scentImage.setImageResource(scentMembers[position-1].m_Image) holder.scentName.text = scentMembers[position-1].m_Name holder.scentInfo1.text = scentMembers[position-1].m_Info1 holder.scentInfo2.text = 
scentMembers[position-1].m_Info2
        }
    }

    override fun getItemCount(): Int {
        // 아이템의 개수 측정
        return scentMembers.size + 1
    }

    private class ItemCell(view: View) : RecyclerView.ViewHolder(view) {
        var title = view.itemTitle
        var scentSize = view.itemScentSize
        var scentImage = view.itemScentImage
        var scentName = view.itemScentName
        var scentInfo1 = view.itemScentInfo1
        var scentInfo2 = view.itemScentInfo2
    }
}
<file_sep>package org.androidtown.recycler

import android.os.Bundle
import android.support.v7.app.AppCompatActivity
import android.support.v7.widget.GridLayoutManager
import kotlinx.android.synthetic.main.activity_main.*

class MainActivity : AppCompatActivity() {

    override fun onCreate(savedInstanceState: Bundle?) {
        super.onCreate(savedInstanceState)
        setContentView(R.layout.activity_main)

        // RecyclerView의 layout을 그리드뷰로 만들 것으로 정의하는 부분
        // 첫 번째 라인은 spanCount를 3으로 지정해준다.
        var layoutManager = GridLayoutManager(this, 3)
        layoutManager.spanSizeLookup = object : GridLayoutManager.SpanSizeLookup() {
            override fun getSpanSize(position: Int): Int {
                return if (position == 0) 3 else 1
            }
        }
        mainRecyclerView.layoutManager = layoutManager

        // 어댑터로 연결
        val myRecyclerViewAdapter = MyRecyclerViewAdapter()
        mainRecyclerView.adapter = myRecyclerViewAdapter
    }
}
<file_sep>package org.androidtown.recycler

class ScentMember(var m_Image: Int, var m_Name: String, var m_Info1: String, var m_Info2: String)
a706e096b49321ea14df63e6ff355dd5a0383995
[ "Kotlin" ]
3
Kotlin
junho10000/Pium
fd29ed8d92cf0872e47e12269c334ed454ea0cb8
736945053cf30c8ecba1dbbb0d4dd81bc4d093dd
refs/heads/master
<repo_name>springbootbuch/messaging_jms<file_sep>/payment_service/src/main/java/de/springbootbuch/messaging_jms/payment_service/Application.java package de.springbootbuch.messaging_jms.payment_service; import com.fasterxml.jackson.databind.ObjectMapper; import java.util.HashMap; import java.util.Map; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.jackson.JsonComponentModule; import org.springframework.context.annotation.Bean; import org.springframework.jms.annotation.JmsListenerConfigurer; import org.springframework.jms.config.SimpleJmsListenerEndpoint; import org.springframework.jms.listener.adapter.MessageListenerAdapter; import org.springframework.jms.support.converter.MappingJackson2MessageConverter; import org.springframework.jms.support.converter.MessageConverter; import org.springframework.jms.support.converter.MessageType; /** * Part of springbootbuch.de. * * @author <NAME> * @author @rotnroll666 */ @SpringBootApplication public class Application { public static void main(String[] args) throws InterruptedException { SpringApplication.run(Application.class, args); } @Bean ObjectMapper objectMapper( final JsonComponentModule jsonComponentModule ) { final ObjectMapper objectMapper = new ObjectMapper(); objectMapper.registerModule(jsonComponentModule); return objectMapper; } @Bean FilmReturnedEventReceiver filmReturnedEventReceiver() { return new FilmReturnedEventReceiver(); } @Bean MessageConverter filmReturnedEventConverter( final ObjectMapper objectMapper ) { MappingJackson2MessageConverter converter = new MappingJackson2MessageConverter(); converter.setObjectMapper(objectMapper); converter.setTargetType(MessageType.TEXT); converter.setTypeIdPropertyName("eventType"); final Map<String, Class<?>> t = new HashMap<>(); t.put("FilmReturnedEvent", FilmReturnedEvent.class); converter.setTypeIdMappings(t); return converter; } @Bean MessageListenerAdapter 
filmReturnedEventListener( final MessageConverter filmReturnedEventConverter, final FilmReturnedEventReceiver filmReturnedEventReceiver ) { final MessageListenerAdapter adapter = new MessageListenerAdapter( filmReturnedEventReceiver); adapter .setDefaultListenerMethod("filmReturned"); adapter .setMessageConverter(filmReturnedEventConverter); return adapter; } @Bean JmsListenerConfigurer jmsListenerConfigurer( MessageListenerAdapter filmReturnedEventListener ) { return registrar -> { final SimpleJmsListenerEndpoint rv = new SimpleJmsListenerEndpoint(); rv.setId("returned-film-events-receiver"); rv.setMessageListener(filmReturnedEventListener); rv.setDestination("returned-film-events"); registrar.registerEndpoint(rv); }; } } <file_sep>/simple/src/main/resources/application-unused.properties spring.activemq.broker-url = tcp://192.168.1.210:9876 spring.activemq.user = user spring.activemq.password = <PASSWORD> <file_sep>/simple/src/main/java/de/springbootbuch/messaging_jms/simple/GreetingService.java package de.springbootbuch.messaging_jms.simple; import org.springframework.jms.core.JmsTemplate; /** * Part of springbootbuch.de. 
* * @author <NAME> * @author @rotnroll666 */ public class GreetingService { private final JmsTemplate jmsTemplate; public GreetingService(JmsTemplate jmsTemplate) { this.jmsTemplate = jmsTemplate; } public void sendGreeting(final String greeting) { this.jmsTemplate.send("greetings-topic", session -> session.createTextMessage(greeting) ); } } <file_sep>/simple/src/main/resources/application-pubsub.properties spring.jms.pub-sub-domain = true <file_sep>/pom.xml <?xml version="1.0" encoding="UTF-8"?> <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> <modelVersion>4.0.0</modelVersion> <parent> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-parent</artifactId> <version>2.4.0</version> <relativePath/> </parent> <groupId>de.springbootbuch</groupId> <artifactId>messaging_jms</artifactId> <version>0.0.1-SNAPSHOT</version> <packaging>pom</packaging> <name>messaging_jms</name> <modules> <module>film_rental</module> <module>payment_service</module> <module>simple</module> </modules> <dependencies> <dependency> <groupId>org.springframework.boot</groupId> <artifactId>spring-boot-starter-activemq</artifactId> </dependency> </dependencies> <build> <pluginManagement> <plugins> <plugin> <groupId>org.apache.maven.plugins</groupId> <artifactId>maven-compiler-plugin</artifactId> <configuration> <compilerArgs> <arg>-parameters</arg> <arg>-Xlint:unchecked</arg> </compilerArgs> </configuration> </plugin> </plugins> </pluginManagement> <plugins> <plugin> <groupId>io.fabric8</groupId> <artifactId>docker-maven-plugin</artifactId> <version>0.21.0</version> <inherited>false</inherited> <configuration> <images> <image> <name>rmohr/activemq:5.14.3-alpine</name> <alias>activemq</alias> <run> <ports> <port>61616:61616</port> <port>8161:8161</port> </ports> <wait> <log>Apache ActiveMQ .* started</log> </wait> </run> 
</image> </images> </configuration> </plugin> </plugins> </build> <profiles> <profile> <id>use-docker-for-it</id> <activation> <activeByDefault>true</activeByDefault> </activation> <build> <plugins> <plugin> <groupId>io.fabric8</groupId> <artifactId>docker-maven-plugin</artifactId> <executions> <!-- Runs before and after integration test --> <execution> <id>prepare-activemq</id> <phase>pre-integration-test</phase> <goals> <goal>start</goal> </goals> <configuration> <images> <image> <alias>activemq</alias> <run> <ports> <port>activemq-it.port:61616</port> </ports> </run> </image> </images> </configuration> </execution> <execution> <id>remove-prepare</id> <phase>post-integration-test</phase> <goals> <goal>stop</goal> </goals> </execution> </executions> </plugin> </plugins> </build> </profile> </profiles> </project> <file_sep>/simple/src/main/java/de/springbootbuch/messaging_jms/simple/DestinationResolverConfig.java package de.springbootbuch.messaging_jms.simple; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Profile; import org.springframework.jms.support.destination.DestinationResolver; /** * Part of springbootbuch.de. * * @author <NAME> * @author @rotnroll666 */ @Profile("destination-router") @Configuration public class DestinationResolverConfig { @Bean public DestinationResolver destinationResolver() { return (session, destinationName, pubSubDomain) -> { if (destinationName.endsWith("queue")) { return session.createQueue(destinationName); } else if (destinationName.endsWith("topic")) { return session.createTopic(destinationName); } throw new RuntimeException( "Invalid destination: " + destinationName); }; } }<file_sep>/film_rental/src/main/resources/application.properties spring.activemq.in-memory = false <file_sep>/README.md # messaging_jms This project needs a local ActiveMQ instance. 
If you don't have one, you can start one via Docker and Maven: ``` ./mvnw docker:run ``` ## Simple examples Go to module `simple`. Use the default Spring Boot configuration and talk to a queue domain: ``` ./mvnw spring-boot:run ``` See the effects of setting `pub-sub-domain` to `true`: ``` ./mvnw spring-boot:run -Dspring-boot.run.profiles=pubsub ``` Alternative destination-name-based routing in effect: ``` ./mvnw spring-boot:run -Dspring-boot.run.profiles=destination-router ``` Alternative destination management based on separate templates: ``` ./mvnw spring-boot:run -Dspring-boot.run.profiles=distinct-infrastructure ``` ## Complex scenario Start the modules `film_rental` and `payment_service` via ``` cd film_rental ./mvnw spring-boot:run ``` And in another shell ``` cd payment_service ./mvnw spring-boot:run ``` You can then post to the rental service like ``` curl -X "POST" "http://localhost:8080/returnedFilms" \ -H "Content-Type: application/json; charset=utf-8" \ -d $'{ "title": "One Flew Over the Cuckoo\'s Nest" }' ``` and see the film being billed in the payment service.<file_sep>/simple/src/main/java/de/springbootbuch/messaging_jms/simple/Application.java package de.springbootbuch.messaging_jms.simple; import org.springframework.boot.SpringApplication; import org.springframework.boot.autoconfigure.SpringBootApplication; import org.springframework.boot.autoconfigure.condition.ConditionalOnMissingBean; import org.springframework.context.annotation.Bean; import org.springframework.jms.core.JmsTemplate; /** * Part of springbootbuch.de. 
* * @author <NAME> * @author @rotnroll666 */ @SpringBootApplication public class Application { public static void main(String[] args) { SpringApplication.run(Application.class, args); } @Bean @ConditionalOnMissingBean(GreetingService.class) public GreetingService greetingService(final JmsTemplate jmsTemplate) { return new GreetingService(jmsTemplate); } }<file_sep>/simple/src/main/java/de/springbootbuch/messaging_jms/simple/DistinctInfrastructureConfig.java package de.springbootbuch.messaging_jms.simple; import javax.jms.ConnectionFactory; import org.slf4j.Logger; import org.slf4j.LoggerFactory; import org.springframework.boot.autoconfigure.jms.DefaultJmsListenerContainerFactoryConfigurer; import org.springframework.context.annotation.Bean; import org.springframework.context.annotation.Configuration; import org.springframework.context.annotation.Profile; import org.springframework.jms.annotation.JmsListener; import org.springframework.jms.config.DefaultJmsListenerContainerFactory; import org.springframework.jms.config.JmsListenerContainerFactory; import org.springframework.jms.core.JmsTemplate; /** * Part of springbootbuch.de. 
* * @author <NAME> * @author @rotnroll666 */ @Profile("distinct-infrastructure") @Configuration public class DistinctInfrastructureConfig { @Bean public JmsListenerContainerFactory<?> topicContainerFactory( DefaultJmsListenerContainerFactoryConfigurer configurer, ConnectionFactory connectionFactory ) { DefaultJmsListenerContainerFactory containerFactory = new DefaultJmsListenerContainerFactory(); configurer .configure(containerFactory, connectionFactory); containerFactory.setPubSubDomain(true); return containerFactory; } @Bean public JmsListenerContainerFactory<?> queueContainerFactory( DefaultJmsListenerContainerFactoryConfigurer configurer, ConnectionFactory connectionFactory ) { DefaultJmsListenerContainerFactory containerFactory = new DefaultJmsListenerContainerFactory(); configurer.configure(containerFactory, connectionFactory); return containerFactory; } @Bean public JmsTemplate topicJmsTemplate( ConnectionFactory connectionFactory ) { JmsTemplate jmsTemplate = new JmsTemplate(connectionFactory); jmsTemplate.setPubSubDomain(true); return jmsTemplate; } @Bean public JmsTemplate queueJmsTemplate( ConnectionFactory connectionFactory ) { return new JmsTemplate(connectionFactory); } @Bean public GreetingService greetingService(final JmsTemplate topicJmsTemplate) { return new GreetingService(topicJmsTemplate); } @Bean AnnotedGreetingListenerAlt annotedGreetingListenerAlt() { return new AnnotedGreetingListenerAlt(); } static class AnnotedGreetingListenerAlt { private static final Logger LOG = LoggerFactory .getLogger(AnnotedGreetingListenerAlt.class); @JmsListener( destination = "greetings-topic", containerFactory = "topicContainerFactory" ) public void onGreeting(final String greeting) { LOG.info("Received greeting {}", greeting); } } }
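The `destination-router` profile above decides queue vs. topic purely from the suffix of the destination name. Below is a broker-free sketch of that routing rule using only the JDK — no Spring or JMS on the classpath. The class name `DestinationRoutingSketch` and the `QUEUE:`/`TOPIC:` string tags are illustrative stand-ins for `session.createQueue(...)` and `session.createTopic(...)`, not part of the repository:

```java
public class DestinationRoutingSketch {

    // Same decision as the DestinationResolver in DestinationResolverConfig:
    // names ending in "queue" become queues, names ending in "topic" become
    // topics, and anything else is rejected.
    public static String resolve(String destinationName) {
        if (destinationName.endsWith("queue")) {
            return "QUEUE:" + destinationName;  // stand-in for session.createQueue(...)
        } else if (destinationName.endsWith("topic")) {
            return "TOPIC:" + destinationName;  // stand-in for session.createTopic(...)
        }
        throw new IllegalArgumentException(
                "Invalid destination: " + destinationName);
    }

    public static void main(String[] args) {
        System.out.println(resolve("greetings-topic"));
        System.out.println(resolve("billing-queue"));
    }
}
```

With this rule active, `GreetingService` sending to `greetings-topic` reaches a topic even though `spring.jms.pub-sub-domain` is left at its default.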
e4f47936ec3f70ece2f4d464c6a2a12961066f4d
[ "Markdown", "Java", "Maven POM", "INI" ]
10
Java
springbootbuch/messaging_jms
001b595999fd3945c9f28643b54d90a99b144531
14d7aadeb35f7f5ef6d7145c6b1473adaf39e267
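The payment service in this repository resolves an incoming message's `eventType` property to a payload class through the map passed to `setTypeIdMappings`. A minimal, Spring-free sketch of that lookup follows; the `FilmReturnedEvent` stub and the `targetClassFor` helper are hypothetical names introduced for illustration only:

```java
import java.util.HashMap;
import java.util.Map;

public class TypeIdMappingSketch {

    // Stub standing in for the real payload class.
    static class FilmReturnedEvent { }

    // Mirrors the map handed to MappingJackson2MessageConverter.setTypeIdMappings.
    static final Map<String, Class<?>> TYPE_ID_MAPPINGS = new HashMap<>();
    static {
        TYPE_ID_MAPPINGS.put("FilmReturnedEvent", FilmReturnedEvent.class);
    }

    // The converter reads the "eventType" message property (the name set via
    // setTypeIdPropertyName) and picks the deserialization target from the map.
    public static Class<?> targetClassFor(String eventType) {
        Class<?> target = TYPE_ID_MAPPINGS.get(eventType);
        if (target == null) {
            throw new IllegalArgumentException("Unknown eventType: " + eventType);
        }
        return target;
    }

    public static void main(String[] args) {
        System.out.println(targetClassFor("FilmReturnedEvent").getSimpleName());
    }
}
```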
refs/heads/master
<repo_name>OOIT2020-2021-IIS-prva-godina/IIS_Grupa2<file_sep>/StartingProjectOOIT/src/geometry/Drawing.java package geometry; import java.awt.Color; import java.awt.Graphics; import java.util.ArrayList; import java.util.HashMap; import java.util.Iterator; import javax.swing.JFrame; import javax.swing.JPanel; public class Drawing extends JPanel{ public static void main (String[] args){ JFrame frame= new JFrame("Drawings"); frame.setSize(800, 600); frame.setVisible(true); Drawing draw=new Drawing(); frame.getContentPane().add(draw); } @Override public void paint(Graphics g) { Point p = new Point(50,50); //p.draw(g); g.setColor(Color.RED); Line l1 = new Line(new Point (100,100), new Point (200,200)); //l1.draw(g); Rectangle r1 = new Rectangle(l1.getEndPoint(), 100, 50); //r1.draw(g); Circle c1 = new Circle(new Point(500, 100), 80); //c1.draw(g); g.setColor(Color.GREEN); Donut d1 = new Donut(new Point(800, 100), 50, 25); //d1.draw(g); Rectangle k1 = new Rectangle(new Point(500, 500), 50, 50); //k1.draw(g); int innerR= (int)(k1.getHeight()*Math.sqrt(2)/2); Donut d2=new Donut(new Point(k1.getUpperLeftPoint().getX()+k1.getWidth()/2,k1.getUpperLeftPoint().getY()+k1.getWidth()/2), 80, innerR); //d2.draw(g); //Exercises 8. //Task 1. ArrayList<Shape> shapes=new ArrayList<Shape>(); shapes.add(p); shapes.add(l1); shapes.add(c1); shapes.add(d1); shapes.add(k1); Iterator<Shape> it=shapes.iterator(); while(it.hasNext()) { it.next().moveBy(10, 0); } //Task 2. 
shapes.get(3).draw(g); //first approach /*int arrayListLength=0; while(it.hasNext()) { arrayListLength++; it.next(); } shapes.get(arrayListLength-1).draw(g);*/ //second approach shapes.get(shapes.size()-1).draw(g); /*one approach, if we want to draw the removed element * Shape tempShape = shapes.get(1); shapes.remove(1); tempShape.draw(g);*/ //second approach shapes.remove(1); //the list shifts shapes.get(1).draw(g); shapes.get(3).draw(g); shapes.add(3, l1); it=shapes.iterator(); while(it.hasNext()) { Shape pomocniS=it.next(); if(pomocniS instanceof Circle || pomocniS instanceof Rectangle) { pomocniS.draw(g); } } //Task 3. try { c1.setRadius(-10); System.out.println("Checking whether execution returns to the try block"); }catch (Exception e) { e.printStackTrace(); } System.out.println("Checking whether program execution continues"); //Task 4. it=shapes.iterator(); while(it.hasNext()) { Shape pomocniS=it.next(); pomocniS.setSelected(true); pomocniS.draw(g); } //Task 5. HashMap<String, Shape> hmShapes = new HashMap<String, Shape>(); hmShapes.put("point", p); hmShapes.put("line", l1); System.out.println(hmShapes.get("line")); } } <file_sep>/StartingProjectOOIT/src/geometry/WBTest.java package geometry; import java.awt.BorderLayout; import java.awt.EventQueue; import javax.swing.JFrame; import javax.swing.JPanel; import javax.swing.border.EmptyBorder; import java.awt.GridBagLayout; import javax.swing.JLabel; import java.awt.GridBagConstraints; public class WBTest extends JFrame { private JPanel contentPane; /** * Launch the application. */ public static void main(String[] args) { EventQueue.invokeLater(new Runnable() { public void run() { try { WBTest frame = new WBTest(); frame.setVisible(true); } catch (Exception e) { e.printStackTrace(); } } }); } /** * Create the frame. 
*/ public WBTest() { setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); setBounds(100, 100, 450, 300); contentPane = new JPanel(); contentPane.setBorder(new EmptyBorder(5, 5, 5, 5)); contentPane.setLayout(new BorderLayout(0, 0)); setContentPane(contentPane); JPanel panel = new JPanel(); contentPane.add(panel, BorderLayout.CENTER); GridBagLayout gbl_panel = new GridBagLayout(); gbl_panel.columnWidths = new int[]{0, 0, 0, 0, 0, 0, 0, 0}; gbl_panel.rowHeights = new int[]{0, 0, 0, 0, 0}; gbl_panel.columnWeights = new double[]{0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, Double.MIN_VALUE}; gbl_panel.rowWeights = new double[]{0.0, 0.0, 0.0, 0.0, Double.MIN_VALUE}; panel.setLayout(gbl_panel); JLabel lblNewLabel = new JLabel("New label"); GridBagConstraints gbc_lblNewLabel = new GridBagConstraints(); gbc_lblNewLabel.gridx = 6; gbc_lblNewLabel.gridy = 3; panel.add(lblNewLabel, gbc_lblNewLabel); } } <file_sep>/StartingProjectOOIT/src/geometry/Rectangle.java package geometry; import java.awt.Color; import java.awt.Graphics; public class Rectangle extends Shape { private Point upperLeftPoint; private int width; private int height; public Rectangle() { } public Rectangle(Point upperLeftPoint, int width, int height) { this.upperLeftPoint = upperLeftPoint; this.width = width; this.height = height; } public Rectangle(Point upperLeftPoint, int width, int height, boolean selected) { this(upperLeftPoint, width, height); setSelected(selected); // changed when the Shape base class was added // this.selected = selected; } public int area() { return this.width * this.height; } public int circumference() { return 2 * this.width + 2 * this.height; } public boolean equals(Object obj) { if (obj instanceof Rectangle) { Rectangle pomocna = (Rectangle) obj; if (this.upperLeftPoint.equals(pomocna.upperLeftPoint) && this.width == pomocna.width && this.height == pomocna.height) return true; else return false; } else return false; } public boolean contains(int x, int y) { if (x >= upperLeftPoint.getX() && x <= 
upperLeftPoint.getX() + width && y >= upperLeftPoint.getY() && y <= upperLeftPoint.getY() + height) return true; return false; } public boolean contains(Point p) { if (p.getX() >= upperLeftPoint.getX() && p.getX() <= upperLeftPoint.getX() + width && p.getY() >= upperLeftPoint.getY() && p.getY() <= upperLeftPoint.getY() + height) return true; return false; } @Override public void draw(Graphics g) { g.drawRect(upperLeftPoint.getX(), upperLeftPoint.getY(), width, height); if(selected) { g.setColor(Color.blue); g.drawRect(upperLeftPoint.getX() - 2, upperLeftPoint.getY() - 2, 4, 4); g.drawRect(upperLeftPoint.getX() + width - 2, upperLeftPoint.getY() - 2, 4, 4); g.drawRect(upperLeftPoint.getX() - 2, upperLeftPoint.getY() + height - 2, 4, 4); g.drawRect(upperLeftPoint.getX() + width - 2, upperLeftPoint.getY() + height - 2, 4, 4); } } @Override public void moveTo(int x, int y) { upperLeftPoint.moveTo(x, y); } @Override public void moveBy(int x, int y) { upperLeftPoint.moveBy(x, y); } @Override public int compareTo(Object o) { if(o instanceof Rectangle) { return this.area()-((Rectangle)o).area(); } return 0; } public Point getUpperLeftPoint() { return upperLeftPoint; } public void setUpperLeftPoint(Point upperLeftPoint) { this.upperLeftPoint = upperLeftPoint; } public int getWidth() { return width; } public void setWidth(int width) { this.width = width; } public int getHeight() { return height; } public void setHeight(int height) { this.height = height; } public String toString() { return "Upper left point: " + upperLeftPoint + ", width =" + width + ", height =" + height; } } <file_sep>/StartingProjectOOIT/src/vezbe11/FrmIgrac.java package vezbe11; import java.awt.BorderLayout; import java.awt.Color; import java.awt.Dialog; import java.awt.Dimension; import java.awt.EventQueue; import javax.swing.JFrame; import javax.swing.JPanel; import javax.swing.border.EmptyBorder; import gui.DlgTest; import javax.swing.JLabel; import java.awt.GridBagLayout; import 
java.awt.GridBagConstraints; import java.awt.Insets; import javax.swing.ButtonGroup; import javax.swing.DefaultListModel; import javax.swing.JToggleButton; import java.awt.event.ActionListener; import java.awt.event.ActionEvent; import javax.swing.JScrollPane; import javax.swing.JList; import javax.swing.JOptionPane; import javax.swing.JCheckBox; import javax.swing.JTextArea; import javax.swing.JTextField; import java.awt.event.KeyAdapter; import java.awt.event.KeyEvent; import javax.swing.JComboBox; import javax.swing.DefaultComboBoxModel; import javax.swing.JButton; public class FrmIgrac extends JFrame { private JPanel contentPane; private final ButtonGroup buttonGroup = new ButtonGroup(); private DefaultListModel<String> dlmIgraci=new DefaultListModel<String>(); private JLabel lblMatic; private JLabel lblIvanovic; private JLabel lblKolarov; private JTextField txtUnetiIgrac; /** * Launch the application. */ public static void main(String[] args) { EventQueue.invokeLater(new Runnable() { public void run() { try { FrmIgrac frame = new FrmIgrac(); frame.setVisible(true); } catch (Exception e) { e.printStackTrace(); } } }); } /** * Create the frame. 
*/ public FrmIgrac() { setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); setBounds(100, 100, 550, 400); contentPane = new JPanel(); contentPane.setBorder(new EmptyBorder(5, 5, 5, 5)); contentPane.setLayout(new BorderLayout(0, 0)); setContentPane(contentPane); contentPane.setBackground(new Color(220, 20, 60)); setTitle("Igraci"); JPanel pnlCenter = new JPanel(); pnlCenter.setBackground(Color.YELLOW); pnlCenter.setSize(150,100); contentPane.add(pnlCenter, BorderLayout.CENTER); GridBagLayout gbl_pnlCenter = new GridBagLayout(); gbl_pnlCenter.columnWidths = new int[]{116, 0, 0, 0}; gbl_pnlCenter.rowHeights = new int[]{0, 0, 32, 0, 0, 0, 0}; gbl_pnlCenter.columnWeights = new double[]{0.0, 1.0, 1.0, Double.MIN_VALUE}; gbl_pnlCenter.rowWeights = new double[]{0.0, 0.0, 0.0, 0.0, 0.0, 1.0, Double.MIN_VALUE}; pnlCenter.setLayout(gbl_pnlCenter); lblKolarov = new JLabel("<NAME>"); GridBagConstraints gbc_lblKolarov = new GridBagConstraints(); gbc_lblKolarov.anchor = GridBagConstraints.EAST; gbc_lblKolarov.insets = new Insets(0, 0, 5, 5); gbc_lblKolarov.gridx = 0; gbc_lblKolarov.gridy = 0; pnlCenter.add(lblKolarov, gbc_lblKolarov); JToggleButton tglbtnKolarov = new JToggleButton("Kolarov"); tglbtnKolarov.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmIgraci.addElement(lblKolarov.getText()); if (tglbtnKolarov.isSelected()) { lblIvanovic.setForeground(Color.black); lblKolarov.setForeground(Color.blue); lblMatic.setForeground(Color.black); } } }); buttonGroup.add(tglbtnKolarov); GridBagConstraints gbc_tglbtnKolarov = new GridBagConstraints(); gbc_tglbtnKolarov.insets = new Insets(0, 0, 5, 5); gbc_tglbtnKolarov.gridx = 1; gbc_tglbtnKolarov.gridy = 0; pnlCenter.add(tglbtnKolarov, gbc_tglbtnKolarov); tglbtnKolarov.setPreferredSize(new Dimension(100,25)); lblMatic = new JLabel("Nemanja Matic"); GridBagConstraints gbc_lblMatic = new GridBagConstraints(); gbc_lblMatic.anchor = GridBagConstraints.EAST; gbc_lblMatic.insets = new Insets(0, 0, 5, 5); 
gbc_lblMatic.gridx = 0; gbc_lblMatic.gridy = 1; pnlCenter.add(lblMatic, gbc_lblMatic); JToggleButton tglbtnMatic = new JToggleButton("Matic"); tglbtnMatic.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmIgraci.addElement(lblMatic.getText()); if (tglbtnMatic.isSelected()) { lblIvanovic.setForeground(Color.black); lblKolarov.setForeground(Color.black); lblMatic.setForeground(Color.blue); } } }); buttonGroup.add(tglbtnMatic); GridBagConstraints gbc_tglbtnMatic = new GridBagConstraints(); gbc_tglbtnMatic.insets = new Insets(0, 0, 5, 5); gbc_tglbtnMatic.gridx = 1; gbc_tglbtnMatic.gridy = 1; pnlCenter.add(tglbtnMatic, gbc_tglbtnMatic); tglbtnMatic.setPreferredSize(new Dimension(100,25)); lblIvanovic = new JLabel("<NAME>"); GridBagConstraints gbc_lblIvanovic = new GridBagConstraints(); gbc_lblIvanovic.anchor = GridBagConstraints.EAST; gbc_lblIvanovic.insets = new Insets(0, 0, 5, 5); gbc_lblIvanovic.gridx = 0; gbc_lblIvanovic.gridy = 2; pnlCenter.add(lblIvanovic, gbc_lblIvanovic); JToggleButton tglbtnIvanovic = new JToggleButton("Ivanovic"); tglbtnIvanovic.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmIgraci.addElement(lblIvanovic.getText()); if (tglbtnIvanovic.isSelected()) { lblIvanovic.setForeground(Color.blue); lblKolarov.setForeground(Color.black); lblMatic.setForeground(Color.black); } } }); buttonGroup.add(tglbtnIvanovic); GridBagConstraints gbc_tglbtnIvanovic = new GridBagConstraints(); gbc_tglbtnIvanovic.insets = new Insets(0, 0, 5, 5); gbc_tglbtnIvanovic.gridx = 1; gbc_tglbtnIvanovic.gridy = 2; pnlCenter.add(tglbtnIvanovic, gbc_tglbtnIvanovic); tglbtnIvanovic.setPreferredSize(new Dimension(100,25)); JCheckBox chckbxUnesiIgraca = new JCheckBox("Unesi igraca"); chckbxUnesiIgraca.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { if (chckbxUnesiIgraca.isSelected()) { txtUnetiIgrac.setEnabled(true); }else { txtUnetiIgrac.setText(""); 
txtUnetiIgrac.setEnabled(false); } } }); chckbxUnesiIgraca.setBackground(new Color(255, 255, 0)); GridBagConstraints gbc_chckbxUnesiIgraca = new GridBagConstraints(); gbc_chckbxUnesiIgraca.anchor = GridBagConstraints.EAST; gbc_chckbxUnesiIgraca.insets = new Insets(0, 0, 5, 5); gbc_chckbxUnesiIgraca.gridx = 0; gbc_chckbxUnesiIgraca.gridy = 3; pnlCenter.add(chckbxUnesiIgraca, gbc_chckbxUnesiIgraca); txtUnetiIgrac = new JTextField(); txtUnetiIgrac.addKeyListener(new KeyAdapter() { @Override public void keyPressed(KeyEvent e) { if(e.getKeyCode() == KeyEvent.VK_ENTER) { dlmIgraci.addElement(txtUnetiIgrac.getText()); txtUnetiIgrac.setText(""); } } }); txtUnetiIgrac.setEnabled(false); GridBagConstraints gbc_txtUnetiIgrac = new GridBagConstraints(); gbc_txtUnetiIgrac.insets = new Insets(0, 0, 5, 5); gbc_txtUnetiIgrac.fill = GridBagConstraints.HORIZONTAL; gbc_txtUnetiIgrac.gridx = 1; gbc_txtUnetiIgrac.gridy = 3; pnlCenter.add(txtUnetiIgrac, gbc_txtUnetiIgrac); txtUnetiIgrac.setColumns(10); JLabel lblIgraci = new JLabel("Igraci:"); GridBagConstraints gbc_lblIgraci = new GridBagConstraints(); gbc_lblIgraci.anchor = GridBagConstraints.EAST; gbc_lblIgraci.insets = new Insets(0, 0, 5, 5); gbc_lblIgraci.gridx = 0; gbc_lblIgraci.gridy = 4; pnlCenter.add(lblIgraci, gbc_lblIgraci); JComboBox<String> cbxIgraci = new JComboBox<String> (); cbxIgraci.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmIgraci.addElement(cbxIgraci.getSelectedItem().toString()); } }); cbxIgraci.setModel(new DefaultComboBoxModel<String> (new String[] {"Kolarov", "Matic", "Ivanovic", "Nikolic", "Peric", "Markovic"})); GridBagConstraints gbc_cbxIgraci = new GridBagConstraints(); gbc_cbxIgraci.insets = new Insets(0, 0, 5, 5); gbc_cbxIgraci.fill = GridBagConstraints.HORIZONTAL; gbc_cbxIgraci.gridx = 1; gbc_cbxIgraci.gridy = 4; pnlCenter.add(cbxIgraci, gbc_cbxIgraci); JScrollPane scrollPane = new JScrollPane(); GridBagConstraints gbc_scrollPane = new GridBagConstraints(); 
gbc_scrollPane.fill = GridBagConstraints.BOTH; gbc_scrollPane.gridx = 2; gbc_scrollPane.gridy = 5; pnlCenter.add(scrollPane, gbc_scrollPane); JList<String> lstIgraci = new JList<String>(); scrollPane.setViewportView(lstIgraci); lstIgraci.setModel(dlmIgraci); JPanel pnlNorth = new JPanel(); pnlNorth.setBackground(new Color(220, 20, 60)); contentPane.add(pnlNorth, BorderLayout.NORTH); JLabel lblNaslov = new JLabel("Forma za unos igraca"); pnlNorth.add(lblNaslov); lblNaslov.setForeground(Color.white); JPanel pnlSouth = new JPanel(); pnlSouth.setBackground(new Color(220, 20, 60)); contentPane.add(pnlSouth, BorderLayout.SOUTH); JButton btnDodajIgraca = new JButton("Dodaj igraca"); btnDodajIgraca.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { DlgIgrac dlgIgrac=new DlgIgrac(); dlgIgrac.setVisible(true); if(dlgIgrac.isOk) { dlmIgraci.addElement(dlgIgrac.txtIme.getText()+" "+dlgIgrac.txtPrezime.getText()); } } }); pnlSouth.add(btnDodajIgraca); JButton btnModifikujIgraca = new JButton("Modifikuj igraca"); btnModifikujIgraca.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { DlgIgrac dlgIzmena = new DlgIgrac(); try { String[] split = dlmIgraci.getElementAt(lstIgraci.getSelectedIndex()).toString().split(" "); dlgIzmena.txtIme.setText(split[0]); dlgIzmena.txtPrezime.setText(split[1]); dlgIzmena.setVisible(true); if (dlgIzmena.isOk) { int index = lstIgraci.getSelectedIndex(); dlmIgraci.removeElementAt(index); dlmIgraci.add(index,dlgIzmena.txtIme.getText() + " " + dlgIzmena.txtPrezime.getText()); } } catch (Exception ex) { JOptionPane.showMessageDialog(null, "Morate selektovati igraca kojem je uneto i ime i prezime"); } } }); pnlSouth.add(btnModifikujIgraca); } } <file_sep>/StartingProjectOOIT/src/gui/FrmTest.java package gui; import java.awt.BorderLayout; import java.awt.Color; import java.awt.Dimension; import java.awt.EventQueue; import javax.swing.JFrame; import javax.swing.JPanel; import 
javax.swing.border.EmptyBorder; import java.awt.GridBagLayout; import javax.swing.JToggleButton; import java.awt.GridBagConstraints; import java.awt.Insets; import javax.swing.ButtonGroup; import javax.swing.DefaultListModel; import javax.swing.JLabel; import javax.swing.JScrollPane; import javax.swing.JList; import javax.swing.JOptionPane; import java.awt.event.ActionListener; import java.awt.event.ActionEvent; import javax.swing.JButton; import javax.swing.SwingConstants; import java.awt.FlowLayout; import javax.swing.JComboBox; import javax.swing.DefaultComboBoxModel; import javax.swing.JTextField; import java.awt.event.KeyAdapter; import java.awt.event.KeyEvent; public class FrmTest extends JFrame { private JPanel contentPane; private final ButtonGroup prvaButtonGrupa = new ButtonGroup(); private DefaultListModel<String> dlmBoje = new DefaultListModel<String>(); private JLabel lblCrvena; private JLabel lblPlava; private JLabel lblZuta; private JTextField txtUnesiBoju; /** * Launch the application. */ public static void main(String[] args) { EventQueue.invokeLater(new Runnable() { public void run() { try { FrmTest frame = new FrmTest(); frame.setVisible(true); } catch (Exception e) { e.printStackTrace(); } } }); } /** * Create the frame. 
*/ public FrmTest() { setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE); setBounds(100, 100, 639, 417); contentPane = new JPanel(); contentPane.setBorder(new EmptyBorder(5, 5, 5, 5)); contentPane.setLayout(new BorderLayout(0, 0)); setContentPane(contentPane); // centralni panel JPanel pnlCenter = new JPanel(); contentPane.add(pnlCenter, BorderLayout.CENTER); GridBagLayout gbl_pnlCenter = new GridBagLayout(); gbl_pnlCenter.columnWidths = new int[] { 0, 0, 0, 0, 0, 0 }; gbl_pnlCenter.rowHeights = new int[] { 33, 0, 0, 0, 0, 0, 0 }; gbl_pnlCenter.columnWeights = new double[] { 0.0, 0.0, 0.0, 0.0, 1.0, Double.MIN_VALUE }; gbl_pnlCenter.rowWeights = new double[] { 0.0, 0.0, 0.0, 0.0, 0.0, 1.0, Double.MIN_VALUE }; pnlCenter.setLayout(gbl_pnlCenter); JToggleButton tglbtnCrvena = new JToggleButton("Crvena boja"); tglbtnCrvena.setPreferredSize(new Dimension(120, 25)); tglbtnCrvena.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmBoje.addElement(lblCrvena.getText()); if (tglbtnCrvena.isSelected()) { lblZuta.setForeground(Color.black); lblPlava.setForeground(Color.black); lblCrvena.setForeground(Color.red); } } }); JLabel lblIzaberiBoju = new JLabel("Izaberi boju:"); GridBagConstraints gbc_lblIzaberiBoju = new GridBagConstraints(); gbc_lblIzaberiBoju.anchor = GridBagConstraints.EAST; gbc_lblIzaberiBoju.insets = new Insets(0, 0, 5, 5); gbc_lblIzaberiBoju.gridx = 3; gbc_lblIzaberiBoju.gridy = 0; pnlCenter.add(lblIzaberiBoju, gbc_lblIzaberiBoju); JComboBox<String> cbxIzaberiBoju = new JComboBox<String>(); cbxIzaberiBoju.addActionListener(new ActionListener() { public void actionPerformed(ActionEvent e) { dlmBoje.addElement(cbxIzaberiBoju.getSelectedItem().toString()); switch (cbxIzaberiBoju.getSelectedItem().toString()) { case "Zelena": lblIzaberiBoju.setForeground(Color.green); break; case "Narandzasta": lblIzaberiBoju.setForeground(Color.orange); break; case "Ljubicasta": lblIzaberiBoju.setForeground(Color.magenta); break; } } }); 
cbxIzaberiBoju .setModel(new DefaultComboBoxModel<String>(new String[] { "Zelena", "Narandzasta", "Ljubicasta" })); GridBagConstraints gbc_cbxIzaberiBoju = new GridBagConstraints(); gbc_cbxIzaberiBoju.insets = new Insets(0, 0, 5, 0); gbc_cbxIzaberiBoju.fill = GridBagConstraints.HORIZONTAL; gbc_cbxIzaberiBoju.gridx = 4; gbc_cbxIzaberiBoju.gridy = 0; pnlCenter.add(cbxIzaberiBoju, gbc_cbxIzaberiBoju); JLabel lblUnesiBoju = new JLabel("Unesi boju:"); GridBagConstraints gbc_lblUnesiBoju = new GridBagConstraints(); gbc_lblUnesiBoju.anchor = GridBagConstraints.EAST; gbc_lblUnesiBoju.insets = new Insets(0, 0, 5, 5); gbc_lblUnesiBoju.gridx = 3; gbc_lblUnesiBoju.gridy = 1; pnlCenter.add(lblUnesiBoju, gbc_lblUnesiBoju); txtUnesiBoju = new JTextField(); txtUnesiBoju.addKeyListener(new KeyAdapter() { @Override public void keyPressed(KeyEvent e) { if(e.getKeyCode()==KeyEvent.VK_ENTER) { dlmBoje.addElement(txtUnesiBoju.getText()); txtUnesiBoju.setText(""); } } }); GridBagConstraints gbc_txtUnesiBoju = new GridBagConstraints(); gbc_txtUnesiBoju.insets = new Insets(0, 0, 5, 0); gbc_txtUnesiBoju.fill = GridBagConstraints.HORIZONTAL; gbc_txtUnesiBoju.gridx = 4; gbc_txtUnesiBoju.gridy = 1; pnlCenter.add(txtUnesiBoju, gbc_txtUnesiBoju); txtUnesiBoju.setColumns(10); prvaButtonGrupa.add(tglbtnCrvena); GridBagConstraints gbc_tglbtnCrvena = new GridBagConstraints(); gbc_tglbtnCrvena.insets = new Insets(0, 0, 5, 5); gbc_tglbtnCrvena.gridx = 0; gbc_tglbtnCrvena.gridy = 2; pnlCenter.add(tglbtnCrvena, gbc_tglbtnCrvena); lblCrvena = new JLabel("Crvena"); GridBagConstraints gbc_lblCrvena = new GridBagConstraints(); gbc_lblCrvena.anchor = GridBagConstraints.WEST; gbc_lblCrvena.insets = new Insets(0, 0, 5, 5); gbc_lblCrvena.gridx = 1; gbc_lblCrvena.gridy = 2; pnlCenter.add(lblCrvena, gbc_lblCrvena); JToggleButton tglbtnPlava = new JToggleButton("Plava boja"); tglbtnPlava.setPreferredSize(new Dimension(120, 25)); tglbtnPlava.addActionListener(new ActionListener() { public void 
actionPerformed(ActionEvent e) {
				dlmBoje.addElement(lblPlava.getText());
				if (tglbtnPlava.isSelected()) {
					lblZuta.setForeground(Color.black);
					lblPlava.setForeground(Color.red);
					lblCrvena.setForeground(Color.black);
				}
			}
		});
		prvaButtonGrupa.add(tglbtnPlava);
		GridBagConstraints gbc_tglbtnPlava = new GridBagConstraints();
		gbc_tglbtnPlava.insets = new Insets(0, 0, 5, 5);
		gbc_tglbtnPlava.gridx = 0;
		gbc_tglbtnPlava.gridy = 3;
		pnlCenter.add(tglbtnPlava, gbc_tglbtnPlava);

		lblPlava = new JLabel("Plava");
		GridBagConstraints gbc_lblPlava = new GridBagConstraints();
		gbc_lblPlava.anchor = GridBagConstraints.WEST;
		gbc_lblPlava.insets = new Insets(0, 0, 5, 5);
		gbc_lblPlava.gridx = 1;
		gbc_lblPlava.gridy = 3;
		pnlCenter.add(lblPlava, gbc_lblPlava);

		JToggleButton tglbtnZuta = new JToggleButton("Zuta boja");
		tglbtnZuta.setPreferredSize(new Dimension(120, 25));
		tglbtnZuta.addActionListener(new ActionListener() {
			public void actionPerformed(ActionEvent e) {
				dlmBoje.addElement(lblZuta.getText());
				if (tglbtnZuta.isSelected()) {
					lblZuta.setForeground(Color.red);
					lblPlava.setForeground(Color.black);
					lblCrvena.setForeground(Color.black);
				}
			}
		});
		prvaButtonGrupa.add(tglbtnZuta);
		GridBagConstraints gbc_tglbtnZuta = new GridBagConstraints();
		gbc_tglbtnZuta.insets = new Insets(0, 0, 5, 5);
		gbc_tglbtnZuta.gridx = 0;
		gbc_tglbtnZuta.gridy = 4;
		pnlCenter.add(tglbtnZuta, gbc_tglbtnZuta);

		lblZuta = new JLabel("Zuta");
		GridBagConstraints gbc_lblZuta = new GridBagConstraints();
		gbc_lblZuta.insets = new Insets(0, 0, 5, 5);
		gbc_lblZuta.anchor = GridBagConstraints.WEST;
		gbc_lblZuta.gridx = 1;
		gbc_lblZuta.gridy = 4;
		pnlCenter.add(lblZuta, gbc_lblZuta);

		// scroll pane and list
		JScrollPane scrollPaneBoje = new JScrollPane();
		GridBagConstraints gbc_scrollPaneBoje = new GridBagConstraints();
		gbc_scrollPaneBoje.fill = GridBagConstraints.BOTH;
		gbc_scrollPaneBoje.gridx = 4;
		gbc_scrollPaneBoje.gridy = 5;
		pnlCenter.add(scrollPaneBoje, gbc_scrollPaneBoje);

		JList lstBoje = new JList();
		scrollPaneBoje.setViewportView(lstBoje);
		lstBoje.setModel(dlmBoje);

		// bottom (south) panel
		JPanel pnlSouth = new JPanel();
		contentPane.add(pnlSouth, BorderLayout.SOUTH);

		JButton btnIspis = new JButton("Klikni me");
		btnIspis.addActionListener(new ActionListener() {
			public void actionPerformed(ActionEvent e) {
				JOptionPane.showMessageDialog(null, "Ovo je antistres dugme :)", "Poruka",
						JOptionPane.INFORMATION_MESSAGE);
			}
		});
		pnlSouth.setLayout(new FlowLayout(FlowLayout.CENTER, 5, 5));
		pnlSouth.add(btnIspis);

		JButton btnDodajBoju = new JButton("RGB");
		btnDodajBoju.addActionListener(new ActionListener() {
			public void actionPerformed(ActionEvent e) {
				DlgTest dlgUnos = new DlgTest();
				dlgUnos.setVisible(true);
				if (dlgUnos.isOk) {
					dlmBoje.addElement(dlgUnos.txtCrvena.getText() + " "
							+ dlgUnos.txtZelena.getText() + " " + dlgUnos.txtPlava.getText());
					pnlCenter.setBackground(new Color(
							Integer.parseInt(dlgUnos.txtCrvena.getText()),
							Integer.parseInt(dlgUnos.txtZelena.getText()),
							Integer.parseInt(dlgUnos.txtPlava.getText())));
				}
			}
		});
		pnlSouth.add(btnDodajBoju);

		JButton btnPromeniRGB = new JButton("Promeni RGB");
		btnPromeniRGB.addActionListener(new ActionListener() {
			public void actionPerformed(ActionEvent e) {
				DlgTest dlgPromena = new DlgTest();
				try {
					String[] split = dlmBoje.getElementAt(lstBoje.getSelectedIndex()).split(" ");
					dlgPromena.txtCrvena.setText(split[0]);
					dlgPromena.txtZelena.setText(split[1]);
					dlgPromena.txtPlava.setText(split[2]);
					dlgPromena.setVisible(true);
					if (dlgPromena.isOk) {
						dlmBoje.removeElementAt(lstBoje.getSelectedIndex());
						dlmBoje.addElement(dlgPromena.txtCrvena.getText() + " "
								+ dlgPromena.txtZelena.getText() + " " + dlgPromena.txtPlava.getText());
						pnlCenter.setBackground(new Color(
								Integer.parseInt(dlgPromena.txtCrvena.getText()),
								Integer.parseInt(dlgPromena.txtZelena.getText()),
								Integer.parseInt(dlgPromena.txtPlava.getText())));
					}
				} catch (Exception ex) {
					JOptionPane.showMessageDialog(null, "Morate selektovati adekvatnu boju");
				}
			}
		});
		pnlSouth.add(btnPromeniRGB);

		// top (north) panel
		JPanel pnlNorth = new JPanel();
		contentPane.add(pnlNorth, BorderLayout.NORTH);
		pnlNorth.setLayout(new FlowLayout(FlowLayout.CENTER, 5, 5));

		JLabel lblNaziv = new JLabel("Zadatak 1");
		lblNaziv.setHorizontalAlignment(SwingConstants.CENTER);
		pnlNorth.add(lblNaziv);
		pnlNorth.setBackground(Color.green);
	}
}
<file_sep>/StartingProjectOOIT/src/geometry/Circle.java
package geometry;

import java.awt.Color;
import java.awt.Graphics;

public class Circle extends Shape {

	protected Point center;
	private int radius;

	public Circle() {
	}

	public Circle(Point center, int radius) {
		this.center = center;
		this.radius = radius;
	}

	public Circle(Point center, int radius, boolean selected) {
		this(center, radius);
		setSelected(selected);
		// changed when the Shape is added
		// this.selected = selected;
	}

	public double area() {
		return this.radius * this.radius * Math.PI;
	}

	public double circumference() {
		return 2 * this.radius * Math.PI;
	}

	public boolean equals(Object obj) {
		if (obj instanceof Circle) {
			Circle pomocni = (Circle) obj;
			return this.center.equals(pomocni.center) && this.radius == pomocni.radius;
		}
		return false;
	}

	public boolean contains(int x, int y) {
		return center.distance(x, y) <= radius;
	}

	public boolean contains(Point p) {
		return center.distance(p.getX(), p.getY()) <= radius;
	}

	@Override
	public void draw(Graphics g) {
		g.drawOval(center.getX() - radius, center.getY() - radius, radius * 2, radius * 2);
		if (selected) {
			g.setColor(Color.BLUE);
			g.drawRect(center.getX() - 2, center.getY() - 2, 4, 4);
			g.drawRect(center.getX() - radius - 2, center.getY() - 2, 4, 4);
			g.drawRect(center.getX() + radius - 2, center.getY() - 2, 4, 4);
			g.drawRect(center.getX() - 2, center.getY() - radius - 2, 4, 4);
			g.drawRect(center.getX() - 2, center.getY() + radius - 2, 4, 4);
		}
	}

	@Override
	public void moveTo(int x, int y) {
		center.moveTo(x, y);
	}

	@Override
	public void moveBy(int x, int y) {
		center.moveBy(x, y);
	}

	@Override
	public int compareTo(Object o) {
		if (o instanceof Circle) {
			return (int) (this.area() - ((Circle) o).area());
		}
		return 0;
	}

	public Point getCenter() {
		return center;
	}

	public void setCenter(Point center) {
		this.center = center;
	}

	public int getRadius() {
		return radius;
	}

	public void setRadius(int radius) throws Exception {
		if (radius < 0) {
			throw new Exception("Vrednost poluprecnika mora biti veci od 0");
		}
		System.out.println("Provera da li se ova naredba izvrsava ukoliko dodje do izuzetka");
		this.radius = radius;
	}

	public String toString() {
		// Center=(x,y), radius=radius
		return "Center=" + center + ", radius=" + radius;
	}
}
f8bef83ea8c80d5cd4fadb92a6dca651cdae73f4
[ "Java" ]
6
Java
OOIT2020-2021-IIS-prva-godina/IIS_Grupa2
c951e0409536159c6b53b63b05a25a09245a29e0
3e261ae5f4e3cf997f4e04368f56765904d1558b
refs/heads/master
<file_sep>import { BrowserModule } from '@angular/platform-browser';
import { NgModule, Component } from '@angular/core';
import { FormsModule } from '@angular/forms';
import { RouterModule, Routes } from '@angular/router';

import { AppComponent } from './app.component';
import { HeaderComponent } from './header/header.component';
import { FooterComponent } from './footer/footer.component';
import { HomeComponent } from './home/home.component';
import { AboutComponent } from './about/about.component';
import { BlogComponent } from './blog/blog.component';
import { ContactsComponent } from './contacts/contacts.component';
import { GalleryComponent } from './gallery/gallery.component';

const menuRouts: Routes = [
  { path: '', component: HomeComponent },
  { path: 'about', component: AboutComponent },
  { path: 'blog', component: BlogComponent },
  { path: 'gallery', component: GalleryComponent },
  { path: 'contacts', component: ContactsComponent }
];

@NgModule({
  declarations: [
    AppComponent,
    HeaderComponent,
    FooterComponent,
    HomeComponent,
    AboutComponent,
    BlogComponent,
    ContactsComponent,
    GalleryComponent
  ],
  imports: [
    BrowserModule,
    FormsModule,
    RouterModule.forRoot(menuRouts, { enableTracing: true })
  ],
  providers: [],
  bootstrap: [AppComponent]
})
export class AppModule { }
eeee098735a832d311f4e7acc043d01a62d50abb
[ "TypeScript" ]
1
TypeScript
mohan-developer/Angular_education
01e53848b3e08fa9730be29204cb8bd5a6818f5e
103da9db41a02d3c45c29f51ffc5bc084715e1be
refs/heads/master
<repo_name>HuZongHan/flask<file_sep>/main.py
from flask import Flask
from flask import request
from flask import render_template

USERS = {
    1: {'name': '郭德纲', 'gender': '男', 'city': '北京', 'desc': '班长'},
    2: {'name': '陈乔恩', 'gender': '女', 'city': '上海', 'desc': None},
    3: {'name': '赵丽颖', 'gender': '女', 'city': '北京', 'desc': '班花'},
    4: {'name': '王宝强', 'gender': '男', 'city': '重庆', 'desc': '超爱吃火锅'},
    5: {'name': '赵雅芝', 'gender': '女', 'city': '重庆', 'desc': '全宇宙三好'},
    6: {'name': '张学友', 'gender': '男', 'city': '上海', 'desc': '奥林匹克总'},
    7: {'name': '陈意涵', 'gender': '女', 'city': '上海', 'desc': None},
    8: {'name': '赵本山', 'gender': '男', 'city': '南京', 'desc': '副班长'},
    9: {'name': '张柏芝', 'gender': '女', 'city': '上海', 'desc': None},
    10: {'name': '吴亦凡', 'gender': '男', 'city': '南京', 'desc': '大碗宽面'},
    11: {'name': '鹿晗', 'gender': '保密', 'city': '北京', 'desc': None},
    12: {'name': '关晓彤', 'gender': '女', 'city': '北京', 'desc': None},
    13: {'name': '周杰伦', 'gender': '男', 'city': '台北', 'desc': '小伙人才啊'},
    14: {'name': '马云', 'gender': '男', 'city': '南京', 'desc': '一个字:贼'},
    15: {'name': '马化腾', 'gender': '男', 'city': '上海', 'desc': '马云死对头'},
}

app = Flask(__name__)


@app.route('/')
def home():
    user_list = []
    for uid, info in sorted(USERS.items()):
        item = [uid, info['name']]
        user_list.append(item)
    return render_template('home.html', user_list=user_list)


@app.route('/user/info')
def user_info():
    uid = int(request.args.get('id'))
    user_data = USERS[uid]
    lst = [222222, 'bbb', 444444, 'ddd']
    return render_template('info.html', user=user_data, lst=lst)


@app.route('/menu')
def menu():
    # menu_items = ['炒鸡蛋', '酸菜鱼', '麻辣烫', '葱油拌面', '鱼香肉丝']
    menu_items = []
    return render_template('menu.html', menu_items=menu_items)


if __name__ == "__main__":
    app.run(debug=True)
726c2c6f7c9d5ad30308901e78ad3407934b6720
[ "Python" ]
1
Python
HuZongHan/flask
65ef6065758d45c3b2302cc4e695dafa995ac2a1
884c28289286e06d3e2cc1842679b11fa8226e00
refs/heads/master
<file_sep>#ifndef PROCESS_H
#define PROCESS_H

#include <QProcess>
#include <QVariant>

class Process : public QProcess
{
    Q_OBJECT
    Q_DISABLE_COPY(Process)

public:
    Process(QObject *parent = nullptr);
    ~Process();

    Q_INVOKABLE void start(const QString &program, const QVariantList &arguments);
    Q_INVOKABLE QByteArray readAll();
    Q_INVOKABLE QByteArray readAllStandardOutput();
    Q_INVOKABLE QByteArray readAllStandardError();
};

#endif // PROCESS_H
<file_sep>#include "process_plugin.h"
#include "process.h"

#include <qqml.h>

void ProcessPlugin::registerTypes(const char *uri)
{
    // @uri com.example.Process
    qmlRegisterType<Process>(uri, 1, 0, "Process");
}
<file_sep>#include "process.h"

Process::Process(QObject *parent) : QProcess(parent)
{
}

Process::~Process()
{
}

void Process::start(const QString &program, const QVariantList &arguments)
{
    QStringList args;

    // convert QVariantList from QML to QStringList for QProcess
    for (int i = 0; i < arguments.length(); i++)
        args << arguments[i].toString();

    QProcess::start(program, args);
}

QByteArray Process::readAll()
{
    return QProcess::readAll();
}

QByteArray Process::readAllStandardOutput()
{
    return QProcess::readAllStandardOutput();
}

QByteArray Process::readAllStandardError()
{
    return QProcess::readAllStandardError();
}
313b3c3b45f68049c1d167ddb9e4f9f5480c35c4
[ "C++" ]
3
C++
cooldo/qprocesswrapper
884cc18d0d631628b289c4377f4802c4ad1d6534
723b98c46849930084b80974565c59f28a05fb9c
refs/heads/master
<repo_name>minollisantiago/Getting-and-Cleaning-Data-Peer-Assessment<file_sep>/CodeBook.md
# Code Book - Peer Assessment

### General Description
This dataset is based on the "Human Activity Recognition Using Smartphones Dataset - Version 1.0" [1], which can be found here: http://archive.ics.uci.edu/ml/datasets/Human+Activity+Recognition+Using+Smartphones

For all the features included in the original dataset, only the mean and standard deviation measurements were considered. Both the train set and the test set were merged into one dataset, and Activity and Subject labels were included.

The dataset is composed of 180 observations (rows) of 68 variables (columns), where each row is a vector that contains the average value of the mean and standard deviation measurements from the original dataset for each activity and subject. The original dataset included 30 subjects and 6 activity labels, so the number of observations in this dataset is 30 * 6 = 180. The original dataset also included a total of 66 mean and standard deviation measurements; the 2 additional columns included in this dataset are the Subject and Activity labels.

### Variable Description
More information about the original dataset, including the experiment's description and a more detailed description of the features, can be found at the link provided in the general description of this dataset.

The features selected for the original dataset come from the accelerometer and gyroscope 3-axial raw signals tAcc-XYZ and tGyro-XYZ. These time domain signals (prefix 't' to denote time) were captured at a constant rate of 50 Hz. They were then filtered using a median filter and a 3rd order low pass Butterworth filter with a corner frequency of 20 Hz to remove noise. Similarly, the acceleration signal was separated into body and gravity acceleration signals (tBodyAcc-XYZ and tGravityAcc-XYZ) using another low pass Butterworth filter with a corner frequency of 0.3 Hz.

Subsequently, the body linear acceleration and angular velocity were derived in time to obtain Jerk signals (tBodyAccJerk-XYZ and tBodyGyroJerk-XYZ). The magnitude of these three-dimensional signals was also calculated using the Euclidean norm (tBodyAccMag, tGravityAccMag, tBodyAccJerkMag, tBodyGyroMag, tBodyGyroJerkMag).

Finally, a Fast Fourier Transform (FFT) was applied to some of these signals, producing fBodyAcc-XYZ, fBodyAccJerk-XYZ, fBodyGyro-XYZ, fBodyAccJerkMag, fBodyGyroMag, fBodyGyroJerkMag. (Note the 'f' to indicate frequency domain signals.)

These signals were used to estimate variables of the feature vector for each pattern: '-XYZ' is used to denote 3-axial signals in the X, Y and Z directions.

* tBodyAcc-XYZ
* tGravityAcc-XYZ
* tBodyAccJerk-XYZ
* tBodyGyro-XYZ
* tBodyGyroJerk-XYZ
* tBodyAccMag
* tGravityAccMag
* tBodyAccJerkMag
* tBodyGyroMag
* tBodyGyroJerkMag
* fBodyAcc-XYZ
* fBodyAccJerk-XYZ
* fBodyGyro-XYZ
* fBodyAccMag
* fBodyAccJerkMag
* fBodyGyroMag
* fBodyGyroJerkMag

Several variables were estimated from these signals, but the only ones included in this dataset are the following measurements:

* mean(): Mean value
* std(): Standard deviation

The original experiments were carried out with a group of 30 volunteers within an age bracket of 19-48 years. Each person performed six activities (WALKING, WALKING_UPSTAIRS, WALKING_DOWNSTAIRS, SITTING, STANDING, LAYING) wearing a smartphone (Samsung Galaxy S II) on the waist.

The measurements described above were averaged by each subject and activity. Two additional variables were included in the dataset: the first one (Subject) identifies the subject and the second one (Activity) identifies the activity. An ordered list of all the dataset's variables is included below:

1. Subject
2. Activity
3. tBodyAcc-mean()-X
4. tBodyAcc-mean()-Y
5. tBodyAcc-mean()-Z
6. tBodyAcc-std()-X
7. tBodyAcc-std()-Y
8. tBodyAcc-std()-Z
9. tGravityAcc-mean()-X
10. tGravityAcc-mean()-Y
11. tGravityAcc-mean()-Z
12. tGravityAcc-std()-X
13. tGravityAcc-std()-Y
14. tGravityAcc-std()-Z
15. tBodyAccJerk-mean()-X
16. tBodyAccJerk-mean()-Y
17. tBodyAccJerk-mean()-Z
18. tBodyAccJerk-std()-X
19. tBodyAccJerk-std()-Y
20. tBodyAccJerk-std()-Z
21. tBodyGyro-mean()-X
22. tBodyGyro-mean()-Y
23. tBodyGyro-mean()-Z
24. tBodyGyro-std()-X
25. tBodyGyro-std()-Y
26. tBodyGyro-std()-Z
27. tBodyGyroJerk-mean()-X
28. tBodyGyroJerk-mean()-Y
29. tBodyGyroJerk-mean()-Z
30. tBodyGyroJerk-std()-X
31. tBodyGyroJerk-std()-Y
32. tBodyGyroJerk-std()-Z
33. tBodyAccMag-mean()
34. tBodyAccMag-std()
35. tGravityAccMag-mean()
36. tGravityAccMag-std()
37. tBodyAccJerkMag-mean()
38. tBodyAccJerkMag-std()
39. tBodyGyroMag-mean()
40. tBodyGyroMag-std()
41. tBodyGyroJerkMag-mean()
42. tBodyGyroJerkMag-std()
43. fBodyAcc-mean()-X
44. fBodyAcc-mean()-Y
45. fBodyAcc-mean()-Z
46. fBodyAcc-std()-X
47. fBodyAcc-std()-Y
48. fBodyAcc-std()-Z
49. fBodyAccJerk-mean()-X
50. fBodyAccJerk-mean()-Y
51. fBodyAccJerk-mean()-Z
52. fBodyAccJerk-std()-X
53. fBodyAccJerk-std()-Y
54. fBodyAccJerk-std()-Z
55. fBodyGyro-mean()-X
56. fBodyGyro-mean()-Y
57. fBodyGyro-mean()-Z
58. fBodyGyro-std()-X
59. fBodyGyro-std()-Y
60. fBodyGyro-std()-Z
61. fBodyAccMag-mean()
62. fBodyAccMag-std()
63. fBodyBodyAccJerkMag-mean()
64. fBodyBodyAccJerkMag-std()
65. fBodyBodyGyroMag-mean()
66. fBodyBodyGyroMag-std()
67. fBodyBodyGyroJerkMag-mean()
68. fBodyBodyGyroJerkMag-std()

### Notes:
* Original features are normalized and bounded within [-1,1].
* Each row in the dataset is a vector that contains the average mean and standard deviation measurements from the original data set for each subject and activity.

### References
[1] <NAME>, <NAME>, <NAME>, <NAME> and <NAME>. Human Activity Recognition on Smartphones using a Multiclass Hardware-Friendly Support Vector Machine. International Workshop of Ambient Assisted Living (IWAAL 2012). Vitoria-Gasteiz, Spain. Dec 2012
<file_sep>/README.md
# Peer Assessment

### Important note
I have included two .R files, one called run_analysis.R and the other run_analysis2.R. The difference between the two is the way they obtain the Samsung data. The first one doesn't require you to have the data files in your working directory: it downloads the archive to a temp file and deletes it after unzipping and reading the data sets into memory. The second one works as required by the assignment: it reads the data sets from the working directory (after you decompress the data, that is).

### How the script works
Both scripts are split into steps that mimic the ones given in the assignment description, and they include annotations that describe the whole process step by step. Here is a general overview of the steps:

* Step 0: Loads the necessary data sets into memory.
* Steps 1, 3 and 4: These steps are handled together; the script merges the data sets and adds the labels for Subjects and Activities.
* Step 2: Subsets the data, keeping only the mean and std measurements.
* Step 5: Creates the final tidy data set and writes it into a .txt file in the working directory. This step includes the computation of the average of the selected measurements for each activity and each subject.

### Tidy Data Set Notes
The tidy dataset provided for the assignment is also included in this repo. It is a .txt file with the following particularities:

* Row names were not included in the dataset; this makes it easier to write/read the file.
* Column names are included in the file, to identify the variables.
* Values were separated using the comma (",").

### A final note on reading the tidy dataset
If you load TidyDataSet.txt into memory, remember to set the argument header to TRUE and sep to "," when you call the read.table() function. Here is an example:

```{r}
read.table("TidyDataSet.txt", sep = ",", header = TRUE)
```
<file_sep>/run_analysis2.R
## STEP 0:
# Load the needed files into globalenv()

# Train set files
TrainSet <- read.table("UCI HAR Dataset/train/X_train.txt")
TrainSetLab <- read.table("UCI HAR Dataset/train/y_train.txt")
TrainSetSubj <- read.table("UCI HAR Dataset/train/subject_train.txt")

# Test set files
TestSet <- read.table("UCI HAR Dataset/test/X_test.txt")
TestSetLab <- read.table("UCI HAR Dataset/test/y_test.txt")
TestSetSubj <- read.table("UCI HAR Dataset/test/subject_test.txt")

# Labels
ActivityLabels <- read.table("UCI HAR Dataset/activity_labels.txt")
Features <- read.table("UCI HAR Dataset/features.txt")

## STEPS 1, 3 AND 4:
# Merge data sets (TrainSet and TestSet)
# Add the Subject and Activity labels (descriptive names) to the data

# 1) Adding Subject and Activity labels to data
# TrainSet
TrainSet <- data.frame(Subject = TrainSetSubj$V1,
                       Activity = factor(TrainSetLab$V1, levels = 1:6,
                                         labels = as.character(ActivityLabels$V2)),
                       TrainSet)

# TestSet
TestSet <- data.frame(Subject = TestSetSubj$V1,
                      Activity = factor(TestSetLab$V1, levels = 1:6,
                                        labels = as.character(ActivityLabels$V2)),
                      TestSet)

# 2) Merge both sets
Data <- rbind(TrainSet, TestSet)

## STEP 2:
# Subset columns (only mean and std measurements)

# 1) Original variable names
names(Data)[-(1:2)] <- as.character(Features$V2)

# 2) Find the needed measurements (labels), eliminate the rest
# Feature labels with the words "mean" and "std" in them
x <- grep(paste(c("mean", "std"), collapse = "|"), names(Data)[-(1:2)],
          ignore.case = TRUE, value = TRUE)

# Feature labels with the words "meanFreq" and "angle" in them
y <- grep(paste(c("meanFreq", "angle"), collapse = "|"), names(Data)[-(1:2)],
          ignore.case = TRUE, value = TRUE)

# Final filtered labels, only mean and std measurements
FilteredFeatureLabels <- x[!x %in% y]

# 3) Subsetting the dataset (only mean and std measurements)
Data <- Data[, c(names(Data)[1:2], FilteredFeatureLabels)]

## STEP 5:
# New tidy data set

# 1) Data as data.table, for easier subsetting
# Install and load the package data.table (checks to see if it's installed first)
if (!require(data.table, quietly = TRUE)) {
  install.packages("data.table")
}
require(data.table)

# New set as data.table()
TidyData <- data.table(Data)

# 2) Set the keys, and calculate the means by Subject and Activity
# Set the keys for the data table (Subject and Activity)
setkey(TidyData, Subject, Activity)

# Subset and compute the mean
TidyData <- TidyData[, lapply(.SD, mean), by = "Subject,Activity"]

# 3) Write the tidy data set to a file (in the working directory)
write.table(TidyData, file = "TidyDataSet.txt", row.names = FALSE,
            col.names = TRUE, sep = ",", quote = FALSE)
[ "Markdown", "R" ]
3
Markdown
minollisantiago/Getting-and-Cleaning-Data-Peer-Assessment
6253899fb372ebb30b23d8d05e2f9984b5878110
5f8a7115be54bae3adf6dede81a7c8b0daa03381
refs/heads/master
<repo_name>robertwestman/CPU-Mem<file_sep>/README.md
# CPU-Mem
A Java program that simulates the interactions between CPU and memory
<file_sep>/CPU.java
import java.io.*;
import java.util.*;

public class CPU {

    // registers
    private int PC = 0;
    private int SP = 999;
    private int IR = 0;
    private int AC = 0;
    private int X = 0;
    private int Y = 0;

    // input + output readers
    private BufferedWriter out;
    private BufferedReader in;

    // list of instructions with an operand
    private int instructionsWithOperand[] = {1, 2, 3, 4, 5, 7, 9, 20, 21, 22, 23};

    // Designated user memory
    final int USER_MEMORY = 1000;

    // mode
    private boolean kernalMode = false;

    // exit fetching instructions
    private boolean exit = false;

    // runtime + processes
    private static Runtime runtime;
    private static Process memory;
    private static CPU cpu;

    public static void main(String[] args) throws IOException, InterruptedException {
        try {
            runtime = Runtime.getRuntime();
            memory = runtime.exec("java Memory " + args[0]);
            cpu = new CPU(memory.getInputStream(), memory.getOutputStream());
            cpu.run();
            memory.waitFor();
            int exitVal = memory.exitValue();
            System.out.println("\nProcess exited: " + exitVal);
            System.exit(0);
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }

    // constructor that sets up streams
    public CPU(InputStream input, OutputStream output) {
        this.in = new BufferedReader(new InputStreamReader(input));
        this.out = new BufferedWriter(new OutputStreamWriter(output));
    }

    public void run() throws IOException {
        while (!exit) {
            fetch();
            this.PC++;
        }
        // close streams
        out.close();
        in.close();
    }

    // fetch instruction
    public void fetch() throws IOException {
        boolean operand = false;
        this.IR = read(PC);
        for (int i = 0; i < instructionsWithOperand.length; i++) {
            if (this.IR == instructionsWithOperand[i]) {
                operand = true;
                break;
            }
        }
        if (operand) {
            this.PC++;
            instruction(this.IR, read(this.PC));
        } else {
            instruction(this.IR, 0);
        }
    }

    // execute instruction
    private void instruction(int instruction, int operand) throws IOException {
        switch (instruction) {
            case 1: this.AC = operand; break;
            case 2: this.AC = this.read(operand); break;
            case 3: this.AC = this.read(this.read(operand)); break;
            case 4: this.AC = this.read(operand + this.X); break;
            case 5: this.AC = this.read(operand + this.Y); break;
            case 6: this.AC = this.read(this.SP + this.X); break;
            case 7: this.write(operand, this.AC); break;
            case 8: this.AC = 1 + (int) (Math.random() * 100); break;
            case 9:
                if (operand == 1) {
                    System.out.print((int) this.AC);
                } else if (operand == 2) {
                    System.out.print((char) this.AC);
                }
                break;
            case 10: this.AC += this.X; break;
            case 11: this.AC += this.Y; break;
            case 12: this.AC -= this.X; break;
            case 13: this.AC -= this.Y; break;
            case 14: this.X = this.AC; break;
            case 15: this.AC = this.X; break;
            case 16: this.Y = this.AC; break;
            case 17: this.AC = this.Y; break;
            case 18: this.SP = this.AC; break;
            case 19: this.AC = this.SP; break;
            case 20: this.PC = operand - 1; break;
            case 21:
                if (this.AC == 0) {
                    this.PC = operand - 1;
                }
                break;
            case 22:
                if (this.AC != 0) {
                    this.PC = operand - 1;
                }
                break;
            case 23:
                this.write(this.SP, this.PC);
                this.SP--;
                this.PC = operand - 1;
                break;
            case 24:
                this.SP++;
                this.PC = this.read(SP) - 1;
                break;
            case 25: this.X++; break;
            case 26: this.X--; break;
            case 27:
                this.write(this.SP, this.AC);
                this.SP--;
                break;
            case 28:
                this.SP++;
                this.AC = this.read(SP);
                break;
            case 50: exit = true; break;
            default:
                System.out.println("\nError: instruction " + instruction + " does not exist");
                break;
        }
    }

    protected int read(int address) throws IOException {
        out.write(String.format("%d\n", address));
        out.flush();
        String value = in.readLine();
        return Integer.parseInt(value);
    }

    protected void write(int address, int value) throws IOException {
        out.write(String.format("%d %d\n", address, value));
        out.flush();
    }
}
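CPU.java above is a fetch-decode-execute loop that talks to a separate memory process over pipes. As a hedged aside (the two-opcode toy ISA below is invented for illustration and is far smaller than the repo's instruction set), the core loop pattern can be sketched in a few lines:

```python
def run(memory):
    """Minimal fetch-decode-execute loop over a flat memory list.

    Toy ISA (hypothetical, not the repo's full opcode set):
      1 operand -> AC = operand  (load immediate; operand is the next word)
      50        -> halt          (mirrors `case 50` in CPU.java)
    Returns the final accumulator value.
    """
    pc, ac = 0, 0
    while True:
        ir = memory[pc]          # fetch the instruction register
        pc += 1
        if ir == 1:              # decode + execute: load immediate
            ac = memory[pc]
            pc += 1
        elif ir == 50:           # halt
            return ac
        else:
            raise ValueError(f"unknown instruction {ir}")

print(run([1, 42, 50]))  # 42
```

The real CPU.java follows the same shape, but reads each word from the memory process via `read()` instead of indexing a local list.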
d0b61638184eb27e60d3bfe0f04c498e75c72373
[ "Markdown", "Java" ]
2
Markdown
robertwestman/CPU-Mem
b240f2690c227ba8c81e0a44e30acf7072cf41bc
1452bb836c97f1cbc17b9d70a9be62b802ef765b
refs/heads/master
<repo_name>MartinCarniello/Re-conference<file_sep>/db/migrate/003_create_propuesta.rb
migration 3, :create_propuesta do
  up do
    create_table :propuesta do
      column :id, Integer, :serial => true
      column :titulo, DataMapper::Property::String, :length => 255
      column :descripcion, DataMapper::Property::String, :length => 255
    end
  end

  down do
    drop_table :propuesta
  end
end
<file_sep>/spec/app/models/conferencia_spec.rb
require 'spec_helper'

describe Conferencia do
  describe 'modelo' do
    before(:each) do
      @conferencia = Conferencia.new
    end

    it { expect(@conferencia).to respond_to(:id) }
    it { expect(@conferencia).to respond_to(:titulo) }
    it { expect(@conferencia).to respond_to(:descripcion) }
    it { expect(@conferencia).to respond_to(:fecha) }
  end

  describe 'inicializacion' do
    it 'se puede crear pasandole un titulo, descripcion y fecha' do
      conferencia = Conferencia.new
      conferencia.titulo = "RubyConf"
      conferencia.descripcion = "Conferencia internacional de Ruby situada esta vez en Argentina"
      conferencia.fecha = Date::strptime("30-12-2014", "%d-%m-%Y")
      expect(conferencia.titulo).to eq("RubyConf")
      expect(conferencia.descripcion).to eq("Conferencia internacional de Ruby situada esta vez en Argentina")
      expect(conferencia.fecha).to eq(Date::strptime("30-12-2014", "%d-%m-%Y"))
    end
  end
end
<file_sep>/spec/app/models/account_spec.rb
require 'spec_helper'
require_relative '../../../app/models/account.rb'

describe 'account' do
  describe 'pass_segura?' do
    it 'debe dar True siempre' do
      usuario = Account.new
      expect(usuario.pass_segura?).to eq(true)
    end
  end

  describe 'validate_password' do
    it 'debe dar True solo si su password es mas larga que 8 caracteres, tiene al menos una mayuscula y una minuscula' do
      expect(Account.validate_password('<PASSWORD>')).to eq(true)
    end

    it 'debe dar False si su password es mas corta que 8 caracteres' do
      expect(Account.validate_password('<PASSWORD>')).to eq(false)
    end

    it 'debe dar False si su password no contiene mayusculas' do
      expect(Account.validate_password('<PASSWORD>')).not_to eq(true)
    end

    it 'debe dar False si su password no contiene minusculas' do
      expect(Account.validate_password('<PASSWORD>')).not_to eq(true)
    end
  end
end
<file_sep>/app/models/comentario.rb
class Comentario
  include DataMapper::Resource

  # property <name>, <type>
  property :id, Serial, :unique => true
  property :contenido, String, required: true

  belongs_to :account
end
<file_sep>/features/step_definitions/evaluador_sube_comentarios_step.rb
Given(/^me dirijo al detalle de la propuesta con titulo (.+)$/) do |titulo|
  @browser.table(id: 'tabla_propuestas').trs[1..-1].each do |tr|
    tr.td(index: 1).link.click if tr.td(index: 0).text == titulo
  end
end

When(/^seteo el comentario (.+)$/) do |comentario|
  @browser.text_field(id: 'comentario_contenido').set comentario
end

And(/^clickeo en comentar$/) do
  @browser.button(id: 'comentario_crear').click
end

Then(/^me deberia redirigir al detalle de la propuesta donde deberia estar el comentario (.+) creado$/) do |comentario|
  expect(@browser.table(id: 'tabla_comentarios').trs[1..-1].detect { |tr| tr.td(index: 1).text == comentario }).not_to eq(nil)
end
<file_sep>/features/step_definitions/asignar_conferencia_a_evaluador_step.rb
Given(/^me dirijo al detalle de la conferencia con titulo (.+)$/) do |titulo_conferencia|
  @browser.goto("http://localhost:3000/ver_conferencias")
  @browser.table.trs[1..-1].detect { |tr| tr.td(index: 0).text == titulo_conferencia }.td(index: 3).link.click
end

When(/^asigno la conferencia al evaluador (.+)$/) do |evaluador|
  @browser.select_list(id: "account_evaluador").select_value(evaluador)
  @browser.button(id: "account_evaluador_boton").click
end

Then(/^deberia tener asignada la conferencia (.+) al evaluador (.+)$/) do |conferencia, evaluador|
  expect(@browser.table(id: "tabla_evaluadores").trs[1].td(index: 0).text).to eq(evaluador)
end
<file_sep>/features/step_definitions/crear_conferencia_step.rb
Given(/^que me dirijo a la pagina de creacion de conferencia$/) do
  @browser.goto("localhost:3000/crear_conferencia")
end

Given(/^seteo el titulo de la conferencia "(.*?)"$/) do |titulo|
  @browser.text_field(id: "conferencia_titulo").set titulo
end

Given(/^seteo la descripcion de la conferencia "(.*?)"$/) do |descripcion|
  @browser.text_field(id: "conferencia_descripcion").set descripcion
end

Given(/^seteo el dia (\d+), mes (\d+) y anio (\d+) de la conferencia$/) do |dia, mes, anio|
  if ENV['PADRINO_ENV'] == "travis"
    @browser.text_field(id: "conferencia_fecha").set "#{dia}-#{mes}-#{anio}"
  else
    @browser.input(id: "conferencia_fecha").click
    @browser.send_keys(dia.to_i)
    @browser.send_keys(mes.to_i)
    @browser.send_keys(anio.to_i)
  end
end

When(/^clickeo el boton de crear conferencia$/) do
  @browser.button(id: "conferencia_crear").click
end

Then(/^me redirecciona a la pagina donde me dice que la conferencia ha sido creada$/) do
  expect(@browser.div(class: "exito").h3.div.text).to match /^Conferencia creada$/
end

Then(/^sigo en la pagina de creacion de conferencia$/) do
  expect(@browser.url).to match /localhost:3000\/crear_conferencia$/
end
<file_sep>/features/step_definitions/home_step.rb
Given(/^que me dirijo a la pagina home de la aplicacion$/) do
  @browser.goto("localhost:3000")
end

When(/^clickeo el boton que me lleva a crear una conferencia$/) do
  @browser.link(id: "crear_conferencia").click
end

Then(/^me redirecciona a la pagina de creacion de conferencia$/) do
  expect(@browser.url).to match "localhost:3000/crear_conferencia"
  expect(@browser.text_field(id: "conferencia_titulo").present?).to eq true
  expect(@browser.text_field(id: "conferencia_descripcion").present?).to eq true
end

When(/^clickeo el boton que me lleva a ver las conferencias$/) do
  @browser.link(id: "ver_conferencias").click
end

Then(/^me redirecciona a la pagina de ver conferencias$/) do
  expect(@browser.url).to match "localhost:3000/ver_conferencias"
  expect(@browser.div(class: "jumbotron").h2.text).to eq "Tabla de conferencias"
end
<file_sep>/features/step_definitions/cross_steps.rb
# Given(/^se borran los usuarios de la base de datos$/) do
#   Account.destroy
# end

# And(/^se borran las conferencias de la base de datos$/) do
#   Conferencia.destroy
# end
<file_sep>/app/controllers/crear_propuesta.rb
LaReConference::App.controllers :crear_propuesta do

  get :crear_propuesta, :map => '/crear_propuesta/:id_conferencia' do
    @crear_propuesta_active = "active"
    @conferencia = Conferencia.first(id: params[:id_conferencia])
    @propuesta = Propuesta.new
    @propuesta.conferencia = @conferencia
    render 'crear_propuesta/index'
  end

  post :create, with: :id_conferencia do
    id_conferencia = params[:id_conferencia]
    titulo = params[:propuesta][:titulo]
    descripcion = params[:propuesta][:descripcion]
    conferencia = Conferencia.first(id: id_conferencia)
    @propuesta = Propuesta.create(titulo: titulo, descripcion: descripcion, conferencia: conferencia)
    conferencia.propuestas.push(@propuesta)
    if @propuesta.save
      flash[:success] = 'Propuesta creada'
      redirect "/crear_propuesta/#{id_conferencia}"
    else
      flash.now[:danger] = 'Error al intentar crear la propuesta'
      redirect "/crear_propuesta/#{id_conferencia}"
    end
  end
end
<file_sep>/features/support/env.rb
# require File.expand_path(File.dirname(__FILE__) + "/../../config/boot")
# #require 'simplecov'
# require 'simplecov'
# SimpleCov.start do
#   root(File.join(File.dirname(__FILE__), '..','..'))
#   coverage_dir 'reports/coverage'
#   add_filter '/spec/'
#   add_filter '/features/'
#   add_filter '/admin/'
#   add_filter '/db/'
#   add_filter '/config/'
#   add_group "Models", "app/models"
#   add_group "Controllers", "app/controllers"
#   add_group "Helpers", "app/helpers"
# end

# DataMapper::Logger.new($stdout, :all)
# DataMapper.auto_migrate!

# organizador = Account.create(nombre: 'UsuarioOrganizador',
#                              email: '<EMAIL>',
#                              password: "<PASSWORD>",
#                              rol: "Organizador")
# evaluador = Account.create(nombre: 'UsuarioEvaluador',
#                            email: '<EMAIL>',
#                            password: "<PASSWORD>",
#                            rol: "Evaluador")
# organizador.save
# evaluador.save

require_relative 'dependencies'

headless = Headless.new
headless.start

Before do |scenario|
  if ENV['PADRINO_ENV'] == "travis"
    @browser = Watir::Browser.new
  else
    @browser = Watir::Browser.new :chrome
  end
end

After do |scenario|
  @browser.close
end

at_exit do
  headless.destroy
end
<file_sep>/features/step_definitions/ver_propuestas_de_conferencia_step.rb
Then(/^deberia ver el listado de propuestas que incluye una propuesta con titulo (.+)$/) do |titulo|
  expect(@browser.table(id: "tabla_propuestas").trs[1].td(index: 0).text).to eq(titulo)
end
<file_sep>/features/step_definitions/subir_propuesta_steps.rb
Given(/^seteo el titulo de la propuesta "(.*?)"$/) do |titulo|
  @browser.text_field(id: "propuesta_titulo").set titulo
end

Given(/^seteo el resumen de la propuesta "(.*?)"$/) do |resumen|
  @browser.text_field(id: "propuesta_descripcion").set resumen
end

When(/^clickeo el boton de agregar propuesta$/) do
  @browser.link(id: "create_propuesta").click
end

When(/^clickeo crear propuesta$/) do
  @browser.button(id: "propuesta_crear").click
end

Then(/^deberia estar en la pagina de creacion de una propuesta$/) do
  expect(@browser.url).to match /localhost:3000\/crear_propuesta/
end

Then(/^me deberia redirijir al detalle de la conferencia donde deberia estar la propuesta creada$/) do
  expect(@browser.url).to match /localhost:3000\/crear_propuesta/
end
<file_sep>/app/controllers/registrar_usuario.rb
LaReConference::App.controllers :registrar_usuario do

  get :registrar_usuario, :map => '/registrar_usuario' do
    @usuario = Account.new
    render 'registrar_usuario/index'
  end

  post :create do
    nombre = params[:account][:nombre]
    email = params[:account][:email]
    password = params[:account][:password]
    confirme_password = params[:account][:confirme_password]
    rol = params[:account][:rol_de_usuario]
    if !Account.first(nombre: nombre)
      if (password == <PASSWORD>)
        if Account.validate_password(password)
          @usuario = Account.new(nombre: nombre, password: <PASSWORD>, email: email, role: rol)
          (tipo_flash, mensaje) = (@usuario.save ? [:success, 'Usuario creado'] : [:danger, 'Todos los campos son obligatorios, asegurese de que su password incluya al menos 1 mayuscula, 1 minuscula y sea de al menos 8 caracteres'])
        else
          (tipo_flash, mensaje) = [:danger, 'La clave no cumple los requisitos']
        end
      else
        (tipo_flash, mensaje) = [:danger, 'Las claves ingresadas no coinciden']
      end
    else
      (tipo_flash, mensaje) = [:danger, 'El nombre de usuario ya existe, elije otro']
    end
    flash[tipo_flash] = mensaje
    redirect '/'
  end
end
<file_sep>/app/controllers/ver_conferencias.rb
LaReConference::App.controllers :ver_conferencias do

  get :ver_conferencias, :map => '/ver_conferencias' do
    @ver_conferencias_active = "active"
    if usuario_actual.role == "evaluador"
      @conferencias = usuario_actual.conferencias
    else
      @conferencias = Conferencia.all.reverse
    end
    render 'ver_conferencias/index'
  end
end
<file_sep>/app/helpers/home_helper.rb
# Helper methods defined here can be accessed in any controller or view in the application
module LaReConference
  class App
    module HomeHelper
      def usuario_actual=(usuario)
        @usuario_actual = usuario
      end

      def usuario_actual
        @usuario_actual ||= Account.first(id: session[:usuario_actual])
      end

      def login(usuario)
        session[:usuario_actual] = usuario.id
        self.usuario_actual = usuario
      end

      def logout
        session.delete(:usuario_actual)
      end

      def logueado?
        !usuario_actual.nil?
      end
    end

    helpers HomeHelper
  end
end
<file_sep>/app/controllers/home.rb
LaReConference::App.controllers :home do

  get :index, :map => '/' do
    @home_active = "active"
    @crear_usuario_active = "active"
    @usuario = Account.new
    render 'home/index'
  end

  post :login do
    nombre = params[:account][:nombre]
    password = params[:account][:password]
    @usuario = Account.autenticar(nombre, password)
    if (!@usuario)
      flash[:danger] = 'Nombre de usuario y/o contrasenia invalida'
      redirect '/'
    else
      login @usuario
      flash[:success] = "El usuario #{usuario_actual.nombre} ha sido logueado correctamente"
      redirect '/'
    end
  end

  get :logout, :map => '/logout' do
    logout
    redirect '/'
  end
end
<file_sep>/app/models/conferencia.rb
class Conferencia
  include DataMapper::Resource

  # property <name>, <type>
  property :id, Serial, unique: true
  property :titulo, String, required: true
  property :descripcion, String, required: true
  property :fecha, Date, required: true

  has n, :accounts, :through => Resource
  has n, :propuestas
end
<file_sep>/app/controllers/ver_una_propuesta.rb
LaReConference::App.controllers :ver_una_propuesta do

  get :ver_propuesta, :map => '/ver_una_propuesta/:id_propuesta' do
    @ver_una_conferencia_active = "active"
    @propuesta = Propuesta.first(id: params[:id_propuesta])
    @evaluador = Account.new
    @comentario = Comentario.new
    @evaluadores_asignados = @propuesta.conferencia.accounts
    @comentarios = @propuesta.comentarios
    render 'ver_una_propuesta/index'
  end

  post :comentar, with: :id_propuesta do
    evaluador = usuario_actual
    contenido = params[:comentario][:contenido]
    propuesta = Propuesta.first(id: params[:id_propuesta])
    comentario = Comentario.create(contenido: contenido, account: evaluador)
    propuesta.comentarios.push(comentario)
    if comentario.save
      flash[:success] = "El comentario ha sido ingresado exitosamente"
      redirect "ver_una_propuesta/#{propuesta.id}"
    else
      flash[:danger] = "Error al intentar agregar un comentario"
      redirect "ver_una_propuesta/#{propuesta.id}"
    end
  end
end<file_sep>/features/support/dependencies.rb # require 'debugger' require 'watir-webdriver' require 'headless'<file_sep>/app/controllers/crear_conferencia.rb LaReConference::App.controllers :crear_conferencia do get :crear_conferencia, :map => '/crear_conferencia' do @crear_conferencia_active = "active" @conferencia = Conferencia.new render 'crear_conferencia/index' end post :create do titulo = params[:conferencia][:titulo] descripcion = params[:conferencia][:descripcion] fecha = Date::strptime(params[:conferencia][:fecha], "%Y-%m-%d") conferencia = Conferencia.create(titulo: titulo, descripcion: descripcion, fecha: fecha) if conferencia.save flash[:success] = 'Conferencia creada' redirect '/' else flash.now[:danger] = 'No se ha podido crear la conferencia' render 'crear_conferencia/index' end end end<file_sep>/db/migrate/004_create_comentario.rb migration 3, :create_comentario do up do create_table :comentario do column :id, Integer, :serial => true column :contenido, DataMapper::Property::String, :length => 255 end end down do drop_table :comentario end end <file_sep>/app/models/account.rb class Account include DataMapper::Resource # Available roles ORGANIZADOR = 'organizador' EVALUADOR = 'evaluador' ORADOR = 'orador' # property <name>, <type> property :id, Serial, unique: true property :nombre, String property :password, String property :email, String property :role, String has n, :conferencias, :through => Resource def pass_segura? true end def self.validate_password(password) password[/[[:lower:]]/] and password[/[[:upper:]]/] and password[/\d/] and (password.length >= 8) end def password=(password) super ::BCrypt::Password.create(password) end def password_igual_a(password) ::BC<PASSWORD>.new(self.<PASSWORD>) == password end def self.autenticar(nombre, password) usuario = Account.first(nombre: nombre) return nil if !usuario usuario.password_igual_a(password) ? 
usuario : nil end def self.find_by_id(id) get(id) rescue nil end def self.find_by_roles(role) self.all(role: role) end end <file_sep>/features/step_definitions/loguear_usuario_step.rb And(/^me logueo como (.+) con contrasenia (.+)$/) do |usuario, contrasenia| @browser.text_field(id: "account_nombre").set usuario @browser.text_field(id: "account_password").set contrasenia @browser.button(id: "usuario_login").click end And(/^me deslogueo$/) do @browser.link(id: "usuario_logout").click end Then(/^me redirecciona a la pagina principal donde me dice que el logueo ha sido correcto para el usuario (.+)$/) do |usuario| expect(@browser.div(class: "exito").h3.div.text).to match /El usuario #{usuario} ha sido logueado correctamente/ end Then(/^me redirecciona a la pagina principal donde me dice que el logueo ha sido incorrecto$/) do expect(@browser.div(class: "error").h3.div.text).to match /^Nombre de usuario y\/o contrasenia invalida$/ end<file_sep>/db/migrate/001_create_conferencia.rb migration 1, :create_conferencia do up do create_table :conferencia do column :id, Integer, :serial => true column :titulo, DataMapper::Property::String, :length => 255 column :descripcion, DataMapper::Property::String, :length => 255 column :fecha, DataMapper::Property::Date end end down do drop_table :conferencia end end <file_sep>/app/controllers/ver_una_conferencia.rb LaReConference::App.controllers :ver_una_conferencia do get :ver_conferencia, :map => '/ver_una_conferencia/:id_conferencia' do @ver_una_conferencia_active = "active" @conferencia = Conferencia.first(id: params[:id_conferencia]) @evaluador = Account.new @evaluadores_asignados = @conferencia.accounts @propuestas = @conferencia.propuestas @evaluadores_select = Account.find_by_roles("evaluador").inject([]) do |array, evaluador| array.push(evaluador.nombre) if !@evaluadores_asignados.include?(evaluador) array end render 'ver_una_conferencia/index' end post :asignar_evaluador, with: :id_conferencia do nombre_evaluador = 
params[:account][:evaluador] id_conferencia = params[:id_conferencia] conferencia = Conferencia.first(id: id_conferencia) evaluador = Account.first(nombre: nombre_evaluador) conferencia.accounts.push(evaluador) evaluador.conferencias.push(conferencia) if conferencia.save flash[:success] = "El evaluador ha sido asignado a la conferencia" redirect "ver_una_conferencia/#{conferencia.id}" else flash[:danger] = "Ha habido un error para guardar la conferencia o el evaluador" redirect "ver_una_conferencia/#{conferencia.id}" end end end<file_sep>/features/step_definitions/ver_conferencia_step.rb Given(/^que me dirijo a la pagina de ver conferencias$/) do @browser.goto("localhost:3000/ver_conferencias") end Then(/^puedo ver la conferencia que he creado$/) do expect(@browser.table.tr(index: 1).td.text).to match(/RubyConfTestVerConferencia/) end Then(/^no puedo ver las conferencias que se han creado$/) do expect(@browser.table.trs.size).to eq(1) end<file_sep>/features/step_definitions/registrar_usuario_steps.rb Given(/^que me dirijo a la pagina de registro de usuario$/) do @browser.goto("localhost:3000/registrar_usuario") end Given(/^seteo el rol "(.*?)"$/) do |rol| @browser.select_list(:id, "account_rol_de_usuario").select_value(rol) end Given(/^ingreso "(.*?)" como nombre de usuario$/) do |nombre| @browser.text_field(id: "account_nombre").set nombre end Given(/^ingreso "(.*?)" como email$/) do |email| @browser.text_field(id: "account_email").set email end Given(/^ingreso "(.*?)" como contrasenia$/) do |password| @browser.text_field(id: "account_password").set password end Given(/^confirmo la contrasenia "(.*?)"$/) do |confirm| @browser.text_field(id: "account_confirme_password").set confirm end When(/^clickeo el boton de crear usuario$/) do @browser.button(id: "usuario_crear").click end Then(/^sigo en la pagina de creacion de usuario$/) do expect(@browser.url).to match(/localhost:3000\/registrar_usuario/) end Then(/^me redirecciona a la pagina donde me dice que el usuario 
ha sido creado$/) do expect(@browser.div(class: "exito").h3.div.text).to match /^Usuario creado$/ end Then(/^me redirecciona a la pagina donde me dice un error (.+)$/) do |mensaje| expect(@browser.div(class: "error").h3.div.text).to eq(mensaje) end
ddc838bc2bb09de0b2eb25cbd477e63d76d9e718
[ "Ruby" ]
28
Ruby
MartinCarniello/Re-conference
ab2850022e58ecb0b892837d12bc2b97aea1f3d3
98fa3cabc1a920aba114d4231df32ef4da97c540
refs/heads/main
<repo_name>tungedison/EddystoneBeaconScanner<file_sep>/src/beacon/index.ts
export * from './Beacon';
export * from './BeaconService';
<file_sep>/src/index.ts
export * from './beacon';
export * from './Eddystone';
<file_sep>/demo/src/logic/try-it-out.js
import { Eddystone } from 'eddystone-web-bluetooth';

export default function tryItOut() {
  var eddystone = new Eddystone();
  var beacon, service;
  eddystone.request() // Scan for Eddystone beacons.
    .then((newBeacon) => {
      beacon = newBeacon;
      return beacon.connect(); // Connect to the Beacon's GATT service.
    })
    .then((newService) => {
      service = newService;
      return service.isLocked(); // Check if the beacon is locked.
    })
    .then((isLocked) => {
      if (isLocked) {
        return Promise.reject('The beacon is locked. Can\'t write new URL');
      }
      // Beacon's not locked. We can proceed with the recording of the new URL.
      // Keep in mind that the encoded URL must NOT be longer than 18 characters.
      return service.writeUrl('https://goo.gl/XXw2hi');
    })
    .then(() => {
      beacon.disconnect();
      alert('Beacon has been written!');
    });
}
<file_sep>/README.md
# Eddystone Web Bluetooth

> Web Bluetooth Eddystone made easier

[![Build Status](https://travis-ci.org/zurfyx/eddystone-web-bluetooth.svg?branch=master)](https://travis-ci.org/zurfyx/eddystone-web-bluetooth)
[![David](https://david-dm.org/zurfyx/eddystone-web-bluetooth.svg)](https://david-dm.org/zurfyx/eddystone-web-bluetooth)
[![David](https://david-dm.org/zurfyx/eddystone-web-bluetooth/dev-status.svg)](https://david-dm.org/zurfyx/eddystone-web-bluetooth#info=devDependencies)
[![Code Climate](https://codeclimate.com/github/zurfyx/eddystone-web-bluetooth/badges/gpa.svg)](https://codeclimate.com/github/zurfyx/eddystone-web-bluetooth)

<p align="center">
  <img src="./assets/demo.gif" width="500" /><br />
  <a href="#getting-started">Getting started source-code</a> using <a href="https://twitter.com/ThePhysicalWeb/status/770262699766755329">Physical Web beacons</a>
</p>

## Features

- [x] Scan Eddystone beacons
- [x] Connect / Disconnect
- [x] Monitor connection status
- [ ] Read Capabilities
- [ ] Read / Write Active Slot
- [x] Read / Write Advertising Interval
- [x] Read / Write Radio Tx Power
- [x] Read / Write Advertised Tx Power
- [x] Read Lock State
- [ ] Write Lock State
- [ ] Read / Write Unlock
- [ ] Read Public ECDH Key
- [ ] Read EID Identity Key
- [x] Read / Write ADV Slot Data
- [x] Write Factory reset
- [ ] Read / Write Remain Connectable

## Getting started

```
npm install --save eddystone-web-bluetooth
```

```javascript
var eddystone = new Eddystone();
var beacon, service;
eddystone.request() // Scan for Eddystone beacons.
  .then((newBeacon) => {
    beacon = newBeacon;
    return beacon.connect(); // Connect to the Beacon's GATT service.
  })
  .then((newService) => {
    service = newService;
    return service.isLocked(); // Check if the beacon is locked.
  })
  .then((isLocked) => {
    if (isLocked) {
      return Promise.reject('The beacon is locked. Can\'t write new URL');
    }
    // Beacon's not locked. We can proceed with the recording of the new URL.
    // Keep in mind that the encoded URL must NOT be longer than 18 characters.
    return service.writeUrl('https://www.google.com');
  })
  .then(() => {
    beacon.disconnect();
    alert('OK!');
  });
```

See the rest of the services [here](https://github.com/zurfyx/eddystone-web-bluetooth/blob/master/src/beacon/BeaconService.ts).

## Development

Eddystone Web Bluetooth implementation is based on the official specifications: [https://github.com/google/eddystone/tree/master/configuration-service](https://github.com/google/eddystone/tree/master/configuration-service)

## Contributions

Contributions are very welcome.

## License

MIT © [<NAME>](//zurfyx.com)

----

Special thanks to @beaufortfrancois for providing https://github.com/beaufortfrancois/sandbox/blob/gh-pages/web-bluetooth/eddystone-url-config/app.js magnificent example source code.
<file_sep>/src/constants.ts
export default Object.freeze({
  EDDYSTONE_CONFIG_SERVICE_UUID: 'a3c87500-8ed3-4bdf-8a39-a01bebede295',
  CAPABILITIES_CHARACTERISTIC_UUID: 'a3c87501-8ed3-4bdf-8a39-a01bebede295',
  ACTIVE_SLOT_CHARACTERISTIC_UUID: 'a3c87502-8ed3-4bdf-8a39-a01bebede295',
  ADVERTISING_INTERVAL_CHARACTERISTIC_UUID: 'a3c87503-8ed3-4bdf-8a39-a01bebede295',
  RADIO_TX_POWER_CHARACTERISTIC_UUID: 'a3c87504-8ed3-4bdf-8a39-a01bebede295',
  ADVANCED_ADVERTISED_TX_POWER_CHARACTERISTIC_UUID: 'a3c87505-8ed3-4bdf-8a39-a01bebede295',
  EDDYSTONE_LOCK_STATE_CHARACTERISTIC_UUID: 'a3c87506-8ed3-4bdf-8a39-a01bebede295',
  EDDYSTONE_UNLOCK_CHARACTERISTIC_UUID: 'a3c87507-8ed3-4bdf-8a39-a01bebede295',
  PUBLIC_ECDH_KEY_CHARACTERISTIC_UUID: 'a3c87508-8ed3-4bdf-8a39-a01bebede295',
  EID_IDENTITY_KEY_CHARACTERISTIC_UUID: 'a3c87509-8ed3-4bdf-8a39-a01bebede295',
  ADV_SLOT_DATA_CHARACTERISTIC_UUID: 'a3c8750a-8ed3-4bdf-8a39-a01bebede295',
  ADVANCED_FACTORY_RESET_CHARACTERISTIC_UUID: 'a3c8750b-8ed3-4bdf-8a39-a01bebede295',
  ADVANCED_REMAIN_CONNECTABLE_CHARACTERISTIC_UUID: 'a3c8750c-8ed3-4bdf-8a39-a01bebede295',
});
<file_sep>/demo/src/App.js
import React, { Component } from 'react';
import tryItOut from './logic/try-it-out';
import './App.css';

class App extends Component {
  render() {
    return (
      <div className="App">
        <h1>Eddystone Web Bluetooth</h1>
        <button className="big" onClick={() => tryItOut()}>Try it out!</button>
      </div>
    );
  }
}

export default App;
<file_sep>/demo/README.md
# Demo

This folder contains working examples of `eddystone-web-bluetooth`, running on a very simple React setup.

The relevant generic bits of code are located in `src/logic`.
<file_sep>/src/beacon/enums.ts
enum LOCK_VALUES {
  LOCKED = 0x00,
  UNLOCKED = 0x01,
  UNLOCKED_AND_AUTOMATIC_RELOCK_DISABLED = 0x02,
}

enum DATA_VALUES {
  UID = 0x00,
  URL = 0x10,
  TLM = 0x20,
  EID = 0x40,
}

export {
  LOCK_VALUES,
  DATA_VALUES,
};
<file_sep>/src/beacon/__tests__/url.test.ts
import { decodeUrl, encodeUrl } from '../url';

const URL1 = 'https://example.com';
const URL2 = 'https://www.npmjs.com/features';

it('should decode the same value it has encoded', () => {
  const result1 = decodeUrl(encodeUrl(URL1));
  expect(result1).toBe(URL1);

  const result2 = decodeUrl(encodeUrl(URL2));
  expect(result2).toBe(URL2);
});

it('the encoded value should be as short as possible (by using prefixed codes)', () => {
  const result1 = encodeUrl(URL1);
  expect(result1.byteLength).toBe(9);

  const result2 = encodeUrl(URL2);
  expect(result2.byteLength).toBe(15);
});
<file_sep>/src/Eddystone.ts
import constants from './constants';
import { Beacon } from './beacon';

export class Eddystone {
  async request(): Promise<Beacon> {
    const bluetooth: Bluetooth = navigator.bluetooth;
    if (!bluetooth) {
      return Promise.reject('Your browser does not support Web Bluetooth.');
    }
    const requestOptions = { filters: [{ services: [constants.EDDYSTONE_CONFIG_SERVICE_UUID] }] };
    const device = await bluetooth.requestDevice(requestOptions);
    return new Beacon(device);
  }
}
<file_sep>/src/beacon/Beacon.ts
import constants from '../constants';
import { BeaconService } from './BeaconService';

export class Beacon {
  constructor(public device: BluetoothDevice) {}

  onDisconnect(listener: (this: this, ev: Event) => any) {
    this.device.addEventListener('gattserverdisconnected', listener);
  }

  async connect(): Promise<BeaconService> {
    if (!this.device.gatt) {
      return Promise.reject('Bluetooth device is probably not a beacon - it does not support GATT');
    }
    const bluetoothGattServer = await this.device.gatt.connect();
    const service = await bluetoothGattServer
      .getPrimaryService(constants.EDDYSTONE_CONFIG_SERVICE_UUID);
    return new BeaconService(service);
  }

  disconnect(): void {
    const gatt: BluetoothRemoteGATTServer | undefined = this.device.gatt;
    if (!(gatt && gatt.connected)) {
      console.warn('Ignored disconnection request. You are not connected!');
      return;
    }
    gatt.disconnect();
  }
}
<file_sep>/src/beacon/BeaconService.ts
import constants from '../constants';
import { LOCK_VALUES, DATA_VALUES } from './enums';
import { decodeUrl, encodeUrl } from './url';

export class BeaconService {
  constructor(public service: BluetoothRemoteGATTService) {}

  private async readCharacteristic(uuid: string): Promise<DataView> {
    const characteristic = await this.service.getCharacteristic(uuid);
    return characteristic.readValue();
  }

  private async writeCharacteristic(uuid: string, value: BufferSource): Promise<void> {
    const characteristic = await this.service.getCharacteristic(uuid);
    return characteristic.writeValue(value);
  }

  /**
   * Interval.
   */
  async readInterval(): Promise<number> {
    const uuid = constants.ADVERTISING_INTERVAL_CHARACTERISTIC_UUID;
    const rawVal = await this.readCharacteristic(uuid);
    const val = rawVal.getUint16(0, false); // Big-Endian.
    return val;
  }

  async writeInterval(ms: number): Promise<void> {
    const uuid = constants.ADVERTISING_INTERVAL_CHARACTERISTIC_UUID;
    const rawMs = new DataView(new ArrayBuffer(2)); // 2 * 8bit
    rawMs.setUint16(0, ms, false);
    return this.writeCharacteristic(uuid, rawMs);
  }

  /**
   * LOCK
   */
  async isLocked(): Promise<boolean> {
    const uuid = constants.EDDYSTONE_LOCK_STATE_CHARACTERISTIC_UUID;
    const rawVal = await this.readCharacteristic(uuid);
    const val = rawVal.getUint8(0);
    return val === LOCK_VALUES.LOCKED;
  }

  /**
   * RADIO
   */
  async readRadioTxPower(): Promise<number> {
    const uuid = constants.RADIO_TX_POWER_CHARACTERISTIC_UUID;
    const rawVal = await this.readCharacteristic(uuid);
    const val = rawVal.getInt8(0);
    return val;
  }

  /**
   * Writes Radio Tx Power.
   * @param dbm Tx power. Values should range between -100 and +20 dBm.
   * If a power is selected that is not supported by the radio, the beacon should select
   * the next highest power supported, or else the maximum power.
   * @see https://github.com/google/eddystone/blob/master/eddystone-url/README.md#tx-power-level
   */
  async writeRadioTxPower(dbm: number): Promise<void> {
    const uuid = constants.RADIO_TX_POWER_CHARACTERISTIC_UUID;
    const dbmByte = new Int8Array([dbm]);
    return this.writeCharacteristic(uuid, dbmByte);
  }

  async readAdvertisedTxPower(): Promise<number> {
    const uuid = constants.ADVANCED_ADVERTISED_TX_POWER_CHARACTERISTIC_UUID;
    const rawVal = await this.readCharacteristic(uuid);
    const val = rawVal.getInt8(0);
    return val;
  }

  async writeAdvertisedTxPower(dbm: number): Promise<void> {
    const uuid = constants.ADVANCED_ADVERTISED_TX_POWER_CHARACTERISTIC_UUID;
    const dbmByte = new Int8Array([dbm]);
    return this.writeCharacteristic(uuid, dbmByte);
  }

  /**
   * URL
   */
  async readUrl(): Promise<string> {
    const uuid = constants.ADV_SLOT_DATA_CHARACTERISTIC_UUID;
    const rawVal = await this.readCharacteristic(uuid);
    const type = rawVal.getUint8(0);
    if (type !== DATA_VALUES.URL) {
      return Promise.reject('Advertised data is not a URL');
    }
    const rawUrl = new DataView(rawVal.buffer, 2); // w/o type.
    return decodeUrl(rawUrl);
  }

  async writeUrl(url: string): Promise<void> {
    const uuid = constants.ADV_SLOT_DATA_CHARACTERISTIC_UUID;
    const raw = encodeUrl(url);
    if (raw.byteLength > 18) {
      return Promise.reject('Encoded URL is longer than 18 bytes');
    }
    const urlBytes = Array.from(Array(raw.byteLength).keys()).map((bytePos) => {
      return raw.getUint8(bytePos);
    });
    const fullBytes = new Uint8Array([DATA_VALUES.URL, ...urlBytes]); // With URL type preceding.
    return this.writeCharacteristic(uuid, fullBytes);
  }

  async clearUrl(): Promise<void> {
    const uuid = constants.ADV_SLOT_DATA_CHARACTERISTIC_UUID;
    const clearByte = new Uint8Array([0x00]);
    return this.writeCharacteristic(uuid, clearByte);
  }

  /**
   * MISC
   */
  async factoryReset(): Promise<void> {
    const uuid = constants.ADVANCED_FACTORY_RESET_CHARACTERISTIC_UUID;
    const factoryResetByte = new Uint8Array([0x0B]);
    return this.writeCharacteristic(uuid, factoryResetByte);
  }
}
<file_sep>/src/beacon/url.ts
/**
 * https://github.com/google/eddystone/tree/master/eddystone-url
 */
import { TextEncoder } from 'text-encoding';

type HexTypes = { [code: number]: string };

const URL_SCHEMES: HexTypes = {
  0x00: 'http://www.',
  0x01: 'https://www.',
  0x02: 'http://',
  0x03: 'https://',
};

const URL_CODES: HexTypes = {
  0x00: '.com/',
  0x01: '.org/',
  0x02: '.edu/',
  0x03: '.net/',
  0x04: '.info/',
  0x05: '.biz/',
  0x06: '.gov/',
  0x07: '.com',
  0x08: '.org',
  0x09: '.edu',
  0x0a: '.net',
  0x0b: '.info',
  0x0c: '.biz',
  0x0d: '.gov',
};

function decodeUrl(raw: DataView): string {
  const scheme: string = URL_SCHEMES[raw.getUint8(0)];
  const url = Array.from(Array(raw.byteLength).keys())
    .slice(1)
    .map((bytePos) => {
      const byteVal: number = raw.getUint8(bytePos);
      return URL_CODES[byteVal] || String.fromCharCode(byteVal);
    })
    .join('');
  return `${scheme}${url}`;
}

function encodeUrl(val: string): DataView {
  const encoder = new TextEncoder('utf-8');
  const encoded: number[] = [];
  for (let i = 0; i < val.length; i += 1) {
    // Try shorten the result as much as possible by using the above references.
    const shortEncoded = shortEncode(val.slice(i));
    if (shortEncoded) {
      encoded.push(shortEncoded.code);
      i += shortEncoded.jump - 1;
      continue;
    }
    // If it can't be shortened, simply encode the character.
    encoded.push(encoder.encode(val[i])[0]);
  }
  const buffer = new ArrayBuffer(encoded.length);
  const raw = new DataView(buffer);
  encoded.forEach((character, i) => raw.setUint8(i, character));
  return raw;
}

function shortEncode(val: string): { code: number, jump: number } | undefined {
  return shortEncodeWithDict(val, URL_SCHEMES) || shortEncodeWithDict(val, URL_CODES);
}

function shortEncodeWithDict(val: string, hexTypes: HexTypes)
    : { code: number, jump: number } | undefined {
  const matching: string[] = Object.keys(hexTypes).filter((codeIndex: string) => {
    const code = Number(codeIndex);
    return val.startsWith(hexTypes[code]);
  });
  if (matching.length === 0) {
    return undefined;
  }
  matching.sort();
  const bestMatch = Number(matching[0]);
  return {
    code: bestMatch,
    jump: hexTypes[bestMatch].length,
  };
}

export {
  encodeUrl,
  decodeUrl,
};
e81294dbccd23de7c35ff826575da8b315ec84a6
[ "JavaScript", "TypeScript", "Markdown" ]
13
TypeScript
tungedison/EddystoneBeaconScanner
398bf50f16b59d37714f849c35c46c761c6887ac
e9a0e12ec0e4bedc738a65808b28c2f4a9bcc9b8
refs/heads/master
<file_sep>package com.cg.ecom.dao;

import com.cg.ecom.dto.Login;
import com.cg.ecom.exception.LoginException;
import com.cg.ecom.repo.LoginRepository;

public class LoginDaoImpl implements LoginDao {

    /* Checks if the User ID passed into the method is present in the repository.
     * If yes, the values mapped to the ID (of type Login) are returned.
     * If no, a LoginException is raised. */
    @Override
    public Login getLoginDetails(String userid) throws LoginException {
        if (LoginRepository.cmap.containsKey(userid)) {
            return LoginRepository.cmap.get(userid);
        } else
            throw new LoginException("Invalid User ID");
    }
}
<file_sep>package com.cg.ecom.repo;

import java.util.HashMap;
import java.util.Map;

import com.cg.ecom.dto.Login;

public class LoginRepository {

    /*************** Repository of existing users ****************/
    public static Map<String, Login> cmap = new HashMap<>();

    static {
        // The ID of the user is the key; values are of type Login, holding id, password, user name and role.
        cmap.put("Poorva123", new Login("Poorva123", "Poorva$123", "Poorva", "admin"));
        cmap.put("Keshav123", new Login("Keshav123", "Keshav$123", "Keshav", "user"));
        cmap.put("Pritam123", new Login("Pritam123", "Pritam$123", "Pritam", "user"));
        cmap.put("Rahhi123", new Login("Rahhi123", "Rahhi$123", "Rahhi", "user"));
        cmap.put("Sudheer123", new Login("Sudheer123", "Sudheer$123", "Sudheer", "user"));
        cmap.put("Sohail123", new Login("Sohail123", "Sohail$123", "Sohail", "user"));
    }
}
[ "Java" ]
2
Java
PoorvaVats/Sprint
df153b04f07ad599e87aee5533ea409406b09bf5
5b722cbb1b6455d01d89afe713de7fdc500b7418
refs/heads/master
<file_sep>const mapboxgl = require("mapbox-gl");

const markerMaker = function(type, coordinates) {
  const mark = document.createElement("div");
  mark.style.backgroundSize = "contain";
  mark.style.width = "50px";
  mark.style.height = "50px";
  mark.style.backgroundRepeat = "no-repeat";
  if (type === 'Activity') {
    mark.style.backgroundImage = "url(http://i.imgur.com/WbMOfMl.png)"
  } else if (type === 'Hotel') {
    mark.style.backgroundImage = "url(http://i.imgur.com/D9574Cu.png)"
  } else if (type === 'Restaurant') {
    mark.style.backgroundImage = "url(https://cdn4.iconfinder.com/data/icons/map-pins-2/256/21-512.png)"
  } else {
    mark.style.backgroundImage = "url(https://cdn4.iconfinder.com/data/icons/small-n-flat/24/map-marker-512.png)"
  }
  return new mapboxgl.Marker(mark).setLngLat(coordinates)
}

module.exports = markerMaker
538134e784f7e7e3ca829058f9db18f88cc15a60
[ "JavaScript" ]
1
JavaScript
rkadilliu/Trip-Planner-1
42358163faa3face95ad5b3b45578407107d2b35
3a1f7be8a3d77605431de8e3727267a4b87bd09d
refs/heads/master
<repo_name>joao-arthur-moreira/mongodb-test<file_sep>/src/main/resources/application.properties
spring.data.mongodb.uri=mongodb://localhost/jajm
spring.data.mongodb.username=
spring.data.mongodb.password=
<file_sep>/src/main/java/br/com/jajm/mongo/infra/repository/LancamentoRepository.java
package br.com.jajm.mongo.infra.repository;

import java.time.LocalDate;
import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

import br.com.jajm.mongo.domain.model.Lancamento;

public interface LancamentoRepository extends MongoRepository<Lancamento, String> {

    List<Lancamento> findByDescricaoContainingAndDataVencimentoBetween(String descricao, LocalDate davencimentoDe, LocalDate dataVencimentoAte);

}
<file_sep>/src/main/java/br/com/jajm/mongo/controller/LancamentoController.java
package br.com.jajm.mongo.controller;

import java.util.List;
import java.util.Optional;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

import br.com.jajm.mongo.domain.model.Categoria;
import br.com.jajm.mongo.domain.model.Lancamento;
import br.com.jajm.mongo.domain.model.Pessoa;
import br.com.jajm.mongo.domain.model.filtros.FiltroLancamento;
import br.com.jajm.mongo.infra.repository.CategoriaRepository;
import br.com.jajm.mongo.infra.repository.LancamentoRepository;
import br.com.jajm.mongo.infra.repository.PessoaRepository;

@RestController
@RequestMapping(path = "/lancamentos")
@CrossOrigin
public class LancamentoController {

    @Autowired
    private LancamentoRepository lancamentoRepository;

    @Autowired
    private CategoriaRepository categoriaRepository;

    @Autowired
    private PessoaRepository pessoaRepository;

    @GetMapping
    public Page<Lancamento> listarTodos(Pageable pageable) {
        return lancamentoRepository.findAll(pageable);
    }

    @GetMapping("/resumo")
    public List<Lancamento> filtrar(FiltroLancamento filtroLancamento) {
        return lancamentoRepository.findByDescricaoContainingAndDataVencimentoBetween(filtroLancamento.getDescricao(),
                filtroLancamento.getDataVencimentoDe(), filtroLancamento.getDataVencimentoAte());
    }

    @GetMapping("/{codigo}")
    public ResponseEntity<Lancamento> porCodigo(@PathVariable String codigo) {
        Optional<Lancamento> lancamentoOptional = lancamentoRepository.findById(codigo);
        return lancamentoOptional.isPresent() ? ResponseEntity.ok(lancamentoOptional.get()) : ResponseEntity.notFound().build();
    }

    @PostMapping
    @ResponseStatus(value = HttpStatus.CREATED)
    public Lancamento adicionar(@RequestBody Lancamento lancamento) {
        Optional<Pessoa> pessoaOptional = pessoaRepository.findById(lancamento.getPessoa().getCodigo());
        Optional<Categoria> categoriaOptional = categoriaRepository.findById(lancamento.getCategoria().getCodigo());
        lancamento.setPessoa(pessoaOptional.get());
        lancamento.setCategoria(categoriaOptional.get());
        return lancamentoRepository.save(lancamento);
    }

    @DeleteMapping("/{codigo}")
    public ResponseEntity<?> remover(@PathVariable String codigo) {
        Optional<Lancamento> lancamentoOptional = lancamentoRepository.findById(codigo);
        if (lancamentoOptional.isPresent()) {
            lancamentoRepository.deleteById(codigo);
            return ResponseEntity.noContent().build();
        } else {
            return ResponseEntity.notFound().build();
        }
    }

    @PutMapping
    public Lancamento atualizar(@RequestBody Lancamento lancamento) {
        return lancamentoRepository.save(lancamento);
    }
}
<file_sep>/src/main/java/br/com/jajm/mongo/controller/PessoaController.java
package br.com.jajm.mongo.controller;

import java.util.List;
import java.util.Optional;

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.CrossOrigin;
import org.springframework.web.bind.annotation.DeleteMapping;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.PutMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

import br.com.jajm.mongo.domain.model.Pessoa;
import br.com.jajm.mongo.infra.repository.PessoaRepository;

@RestController
@RequestMapping(path = "/pessoas")
@CrossOrigin
public class PessoaController {

    @Autowired
    private PessoaRepository pessoaRepository;

    @GetMapping
    public List<Pessoa> listarTodas() {
        return pessoaRepository.findAll();
    }

    @GetMapping("/por-nome")
    public List<Pessoa> listarPorNome(String nome) {
        return pessoaRepository.findByNomeContainingIgnoreCase(nome);
    }

    @GetMapping("/{codigo}")
    public ResponseEntity<Pessoa> porCodigo(@PathVariable String codigo) {
        Optional<Pessoa> pessoaOptional = pessoaRepository.findById(codigo);
        return pessoaOptional.isPresent() ? ResponseEntity.ok(pessoaOptional.get()) : ResponseEntity.notFound().build();
    }

    @PostMapping
    @ResponseStatus(value = HttpStatus.CREATED)
    public Pessoa adicionar(@RequestBody Pessoa pessoa) {
        return pessoaRepository.save(pessoa);
    }

    @DeleteMapping("/{codigo}")
    public ResponseEntity<?> remover(@PathVariable String codigo) {
        Optional<Pessoa> pessoaOptional = pessoaRepository.findById(codigo);
        if (pessoaOptional.isPresent()) {
            pessoaRepository.deleteById(codigo);
            return ResponseEntity.noContent().build();
        } else {
            return ResponseEntity.notFound().build();
        }
    }

    @PutMapping
    public Pessoa atualizar(@RequestBody Pessoa pessoa) {
        return pessoaRepository.save(pessoa);
    }

    @PutMapping("/{codigo}/ativar")
    @ResponseStatus(value = HttpStatus.NO_CONTENT)
    public void ativar(@PathVariable String codigo) {
        ativarInativar(codigo, true);
    }

    @PutMapping("/{codigo}/inativar")
    @ResponseStatus(value = HttpStatus.NO_CONTENT)
    public void inativar(@PathVariable String codigo) {
        ativarInativar(codigo, false);
    }

    private void ativarInativar(String codigo, boolean status) {
        Optional<Pessoa> pessoaOptional = pessoaRepository.findById(codigo);
        Pessoa pessoa = pessoaOptional.get();
        pessoa.setStatus(status);
        pessoaRepository.save(pessoa);
    }
}
<file_sep>/src/main/java/br/com/jajm/mongo/domain/model/Categoria.java
package br.com.jajm.mongo.domain.model;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

import lombok.Data;

@Data
@Document
public class Categoria {

    @Id
    private String codigo;
    private String nome;

}
<file_sep>/src/main/java/br/com/jajm/mongo/infra/repository/PessoaRepository.java
package br.com.jajm.mongo.infra.repository;

import java.util.List;

import org.springframework.data.mongodb.repository.MongoRepository;

import br.com.jajm.mongo.domain.model.Pessoa;

public interface PessoaRepository extends MongoRepository<Pessoa, String> {

    List<Pessoa> findByNomeContainingIgnoreCase(String nome);

}
<file_sep>/src/main/java/br/com/jajm/mongo/domain/model/Pessoa.java
package br.com.jajm.mongo.domain.model;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

import lombok.Data;

@Data
@Document
public class Pessoa {

    @Id
    private String codigo;
    private String nome;
    private Boolean status;
    private String cidade;
    private String estado;
    private String logradouro;
    private String numero;
    private String complemento;
    private String bairro;
    private String cep;

}
e1fa87db75a128b91136a4f2a44e62ae42c9ff4f
[ "Java", "INI" ]
7
INI
joao-arthur-moreira/mongodb-test
c9c4ee22aa1e17abab9b0da5d0b9245f6b869d9d
67a8abf1b83df9f6d0d663a3774d4bf1c4620811
refs/heads/master
<repo_name>JagdishParyani/RetrofitWithApiArchitecture<file_sep>/ApiService.kt package com.example.retrofit_kotlin import retrofit2.Call import retrofit2.http.GET interface ApiService { @GET("/json/movies.json") fun getMoviesData() : Call<String> @GET("/contacts") fun getAllContacts() : Call<String> //https://api.androidhive.info/contacts/ }<file_sep>/ApiCLient.kt package com.example.retrofit_kotlin import com.google.gson.* import retrofit2.Retrofit import retrofit2.converter.gson.GsonConverterFactory import java.lang.reflect.Type class ApiCLient() { private lateinit var mRetrofit: Retrofit private val mBaseUrl : String = "https://api.androidhive.info" fun getClient(): ApiService { val gson = GsonBuilder() .registerTypeAdapter(String::class.java, StringDeserializer()) .create() mRetrofit = Retrofit.Builder() .baseUrl(mBaseUrl) .addConverterFactory(GsonConverterFactory.create(gson)) .build() return mRetrofit.create(ApiService::class.java) } } class StringDeserializer : JsonDeserializer<String> { @Throws(JsonParseException::class) override fun deserialize(json: JsonElement, typeOfT: Type, context: JsonDeserializationContext): String { return json.toString() } } <file_sep>/README.md # RetrofitWithApiArchitecture This is a simple Retrofit API-calling demo with a simple architecture. Here I have uploaded my files directly rather than inside folders, so to follow the structure I suggest creating the relevant packages. E.g., all controller files should go in a controller package.
Thanks <file_sep>/BaseController.kt package com.example.apiarchitecture.controller import android.util.Log import com.example.apiarchitecture.model.BaseModel import com.example.retrofit_kotlin.ApiCLient import com.example.retrofit_kotlin.ApiService import org.json.JSONArray import org.json.JSONObject import retrofit2.Call import retrofit2.Callback import retrofit2.Response abstract open class BaseController { val TAG = "BaseController" lateinit var mCallBackListener: CallBackListener lateinit var mBaseModel: BaseModel lateinit var mApiCLient: ApiCLient lateinit var mApiService: ApiService fun callRequestToServer(modelCall: Call<String>) { //var modelCall: Call<String> = mApiService.getMoviesData() modelCall.enqueue(object : Callback<String> { override fun onFailure(call: Call<String>, t: Throwable) { mCallBackListener.networkConnectionError() Log.e(TAG, t.message) } override fun onResponse(call: Call<String>, response: Response<String>) { if (response.code() == 200) { var obj : JSONObject = JSONObject() var jsonResp = JSONArray(response.body()) obj.put("data", jsonResp) onPopulate(obj) //onPopulate(JSONObject(response.body())) } else { Log.e(TAG, response.code().toString()) } } }) } open fun startFetching(callBackListner: CallBackListener) { //Log.e(TAG, "Request :" + Gson().toJson(model)) this.mCallBackListener = callBackListner mApiCLient = ApiCLient() mApiService = mApiCLient.getClient() } abstract fun onPopulate(objJson: JSONObject?) 
}<file_sep>/MainActivity.kt package com.example.apiarchitecture import android.os.Bundle import android.support.v7.app.AppCompatActivity import android.support.v7.widget.LinearLayoutManager import android.widget.Toast import com.example.apiarchitecture.adapter.ContactsAdapter import com.example.apiarchitecture.adapter.MoviesAdapter import com.example.apiarchitecture.controller.CallBackListener import com.example.apiarchitecture.controller.ContactsController import com.example.apiarchitecture.controller.MoviesController import com.example.apiarchitecture.model.BaseModel import com.example.apiarchitecture.model.Movies import com.example.apiarchitecture.model.ResModelContacts import kotlinx.android.synthetic.main.activity_main.* class MainActivity : AppCompatActivity(), CallBackListener { override fun handleSuccessData(resModel: BaseModel) { when (resModel) { is Movies -> { rcv.apply { layoutManager = LinearLayoutManager(this@MainActivity) adapter = MoviesAdapter((resModel as Movies).data, this@MainActivity) } } is ResModelContacts -> { rcv.apply { layoutManager = LinearLayoutManager(this@MainActivity) adapter = ContactsAdapter((resModel as ResModelContacts).contacts, this@MainActivity) } } } /*if (resModel is ResModelMovies) { rcv.apply { layoutManager = LinearLayoutManager(this@MainActivity) adapter = MoviesAdapter((resModel as Movies).data, this@MainActivity) } } else if (resModel is ResModelContacts){ rcv.apply { layoutManager = LinearLayoutManager(this@MainActivity) adapter = ContactsAdapter((resModel as ResModelContacts).contacts,this@MainActivity) } }*/ } override fun networkConnectionError() { ShowToast("Network Connection Error") } override fun onserverConnectionError() { ShowToast("Server Connection Error") } override fun handleErrorDataFromServer(errorModel: BaseModel) { ShowToast("Error Data From Server") } fun ShowToast(msg: String) { Toast.makeText(this, msg, Toast.LENGTH_SHORT).show() } override fun onCreate(savedInstanceState: Bundle?) 
{ super.onCreate(savedInstanceState) setContentView(R.layout.activity_main) btn.setOnClickListener { callAPI() } btn_cont.setOnClickListener { callContactsApi() } } private fun callContactsApi() { ContactsController().startFetching(this) } private fun callAPI() { MoviesController().startFetching(this) } } <file_sep>/ContactsController.kt package com.example.apiarchitecture.controller import com.example.apiarchitecture.model.Contacts import com.example.apiarchitecture.model.ResModelContacts import com.google.gson.Gson import org.json.JSONObject class ContactsController : BaseController() { override fun onPopulate(objJson: JSONObject?) { var contacts : ResModelContacts = Gson().fromJson(objJson.toString(),ResModelContacts::class.java) this.mCallBackListener.handleSuccessData(contacts) } override fun startFetching(callBackListner: CallBackListener) { super.startFetching(callBackListner) this.mCallBackListener = callBackListner callRequestToServer(mApiService.getAllContacts()) } }<file_sep>/MoviesAdapter.kt package com.example.apiarchitecture.adapter import android.content.Context import android.support.v7.widget.RecyclerView import android.view.LayoutInflater import android.view.View import android.view.ViewGroup import android.widget.ImageView import android.widget.TextView import android.widget.Toast import com.bumptech.glide.Glide import com.example.apiarchitecture.R import com.example.apiarchitecture.model.ResModelMovies class MoviesAdapter(private val result: List<ResModelMovies>, private val mContext: Context) : RecyclerView.Adapter<MoviesAdapter.ViewHolder>() { override fun onCreateViewHolder(parent: ViewGroup, position: Int): ViewHolder { val inflater = LayoutInflater.from(parent.context) val v = inflater.inflate(R.layout.list_item, null) return ViewHolder(v) } override fun getItemCount(): Int = result.size override fun onBindViewHolder(holder: ViewHolder, position: Int) { val movie: ResModelMovies = result[position] holder.bind(movie, mContext, position) 
holder.titlesTxt.setOnClickListener { Toast.makeText(mContext,position.toString(),Toast.LENGTH_SHORT).show() } } class ViewHolder(itemView: View) : RecyclerView.ViewHolder(itemView) { var titlesTxt: TextView var ratingsTxt: TextView var releaseYearsTxt: TextView var genresTxt: TextView var imgs: ImageView init { titlesTxt = itemView.findViewById(R.id.titleTxt) ratingsTxt = itemView.findViewById(R.id.ratingTxt) releaseYearsTxt = itemView.findViewById(R.id.releaseTxt) genresTxt = itemView.findViewById(R.id.genreTxt) imgs = itemView.findViewById(R.id.img) } fun bind(movie: ResModelMovies, mContext: Context,pos : Int) { titlesTxt.text = "$pos "+(movie.title) ratingsTxt.text = "Rating :" + movie.rating.toString() releaseYearsTxt.text = "Year :"+ movie.releaseYear.toString() var s = "" for (i in movie.genre!!) { s = s + i + "," } s = s.substring(0, s.length - 1) genresTxt.text = s Glide.with(mContext).load(movie.image).into(imgs); } } }<file_sep>/CallBackListener.kt package com.example.apiarchitecture.controller import com.example.apiarchitecture.model.BaseModel interface CallBackListener { fun handleSuccessData(resModel: BaseModel) fun networkConnectionError() fun onserverConnectionError() fun handleErrorDataFromServer(errorModel: BaseModel) }<file_sep>/ReqModelContacts.kt package com.example.apiarchitecture.model class ReqModelContacts { }<file_sep>/ContactsAdapter.kt package com.example.apiarchitecture.adapter import android.content.Context import android.support.v7.widget.RecyclerView import android.view.LayoutInflater import android.view.View import android.view.ViewGroup import android.widget.TextView import com.example.apiarchitecture.R import com.example.apiarchitecture.model.Contacts class ContactsAdapter(private val result: List<Contacts>, private val mContext: Context) : RecyclerView.Adapter<ContactsAdapter.ContactsViewHolder>() { override fun onCreateViewHolder(parent: ViewGroup, pos: Int): ContactsAdapter.ContactsViewHolder { val v = 
LayoutInflater.from(parent.context).inflate(R.layout.list_contacts, null) return ContactsViewHolder(v) } override fun getItemCount(): Int = result.size override fun onBindViewHolder(holder: ContactsViewHolder, pos: Int) { val contacts : Contacts =result.get(pos) holder.bind(contacts) } class ContactsViewHolder(itemView: View) : RecyclerView.ViewHolder(itemView) { var id: TextView var name: TextView var email: TextView var gender: TextView var mobile: TextView init { id = itemView.findViewById(R.id.ids) name = itemView.findViewById(R.id.name) email = itemView.findViewById(R.id.email) gender = itemView.findViewById(R.id.gender) mobile = itemView.findViewById(R.id.mobile) } fun bind(contacts: Contacts) { id.text = contacts.id name.text = contacts.name email.text = contacts.email gender.text = contacts.gender mobile.text = contacts.phone.mobile } } } <file_sep>/ResModelContacts.kt package com.example.apiarchitecture.model data class ResModelContacts(var contacts : List<Contacts>) : BaseModel() data class Contacts ( var id : String, var name : String, var email : String, var address : String, var gender : String, var phone : Phone ) : BaseModel() data class Phone ( var mobile : String, var home : String, var office : String ) : BaseModel()<file_sep>/ReqModelMovies.kt package com.example.apiarchitecture.model class ReqModelMovies : BaseModel(){ }<file_sep>/BaseModel.kt package com.example.apiarchitecture.model import java.io.Serializable open class BaseModel : Serializable { }<file_sep>/ResModelMovies.kt package com.example.apiarchitecture.model data class Movies(var data: ArrayList<ResModelMovies>) : BaseModel() data class ResModelMovies( var title: String, var image: String, var rating: String, var releaseYear: String, var genre: List<String> ) : BaseModel()<file_sep>/MoviesController.kt package com.example.apiarchitecture.controller import com.example.apiarchitecture.model.BaseModel import com.example.apiarchitecture.model.Movies import 
com.example.apiarchitecture.model.ReqModelMovies import com.example.apiarchitecture.model.ResModelMovies import com.google.gson.Gson import org.json.JSONArray import org.json.JSONObject class MoviesController : BaseController() { override fun onPopulate(objJson: JSONObject?) { //var objArr: JSONArray = objJson!!.getJSONArray("data") var resModelMovies : Movies = Gson().fromJson(objJson.toString(),Movies::class.java) this.mCallBackListener.handleSuccessData(resModelMovies) } override fun startFetching(callBackListner: CallBackListener) { super.startFetching(callBackListner) this.mCallBackListener = callBackListner callRequestToServer(mApiService.getMoviesData()) } }
d67f6a771375eb43f9875a9ee06221efa93f7f23
[ "Markdown", "Kotlin" ]
15
Kotlin
JagdishParyani/RetrofitWithApiArchitecture
fe9944953be259cf4bd750ca4ff656150f5b22f6
a1918bf1a0bfc82fa175d76c4215d702e1017772
refs/heads/master
<file_sep>package com.wy.reptile.common.connection.pool; import java.sql.Connection; import java.sql.SQLException; import java.sql.Statement; public class DBUtil { public static void closeConnection(Connection con) { if (con != null) { try { con.close(); } catch (SQLException e) { e.printStackTrace(); } } } public static void closeStatement(Statement stat) { if (stat != null) { try { stat.close(); } catch (SQLException e) { e.printStackTrace(); } } } }
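The DBUtil record above wraps null-safe `close()` calls for JDBC resources. A minimal runnable sketch of the same pattern — `DBUtilDemo` and `closeQuietly` are illustrative names, not part of the repo — using a `java.lang.reflect.Proxy` as a stand-in `Connection` so no real JDBC driver is needed. Note that since Java 7 both `Connection` and `Statement` are `AutoCloseable`, so try-with-resources can replace helpers like this entirely.

```java
import java.lang.reflect.Proxy;
import java.sql.Connection;
import java.util.concurrent.atomic.AtomicBoolean;

public class DBUtilDemo {
    // Null-safe close helper, mirroring the repo's DBUtil.closeConnection/closeStatement.
    static void closeQuietly(AutoCloseable c) {
        if (c != null) {
            try { c.close(); } catch (Exception e) { e.printStackTrace(); }
        }
    }

    public static void main(String[] args) throws Exception {
        AtomicBoolean closed = new AtomicBoolean(false);
        // Stand-in Connection built with a dynamic proxy; it only records
        // that close() was invoked, so the demo runs without a database.
        Connection con = (Connection) Proxy.newProxyInstance(
            Connection.class.getClassLoader(),
            new Class<?>[] { Connection.class },
            (proxy, method, a) -> {
                if (method.getName().equals("close")) closed.set(true);
                return null; // only close() (void) is ever called here
            });
        closeQuietly(null); // null-safe: no NullPointerException
        closeQuietly(con);  // forwards to the proxy's close()
        if (!closed.get()) throw new AssertionError("close() was not forwarded");
        System.out.println(closed.get());
    }
}
```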
9d69bef8c93df1c33ccc150e30dc7934a0f5d3bc
[ "Java" ]
1
Java
296075945/reptile
89cee420e6f3d7b2a24b2aa5c1bb8a5f36a82107
782287e22732fd21371bce145c11fc7ad4812b21
refs/heads/master
<file_sep># alexsv-test-ci3 [![CircleCI](https://circleci.com/gh/a-sukhodolsky/alexsv-test-ci3.svg?style=shield)](https://circleci.com/gh/a-sukhodolsky/alexsv-test-ci3) [![Dashboard alexsv-test-ci3](https://img.shields.io/badge/dashboard-alexsv_test_ci3-yellow.svg)](https://dashboard.pantheon.io/sites/7ed6c8d2-d8d0-4259-9305-5f4d503921d7#dev/code) [![Dev Site alexsv-test-ci3](https://img.shields.io/badge/site-alexsv_test_ci3-blue.svg)](http://dev-alexsv-test-ci3.pantheonsite.io/)<file_sep><?php /** * Load services definition file. */ $settings['container_yamls'][] = __DIR__ . '/services.yml'; /** * Include the Pantheon-specific settings file. * * n.b. The settings.pantheon.php file makes some changes * that affect all environments that this site * exists in. Always include this file, even in * a local development environment, to ensure that * the site settings remain consistent. */ include __DIR__ . "/settings.pantheon.php"; /** * Skipping permissions hardening will make scaffolding * work better, but will also raise a warning when you * install Drupal. * * https://www.drupal.org/project/drupal/issues/3091285 */ // $settings['skip_permissions_hardening'] = TRUE; /** * If there is a local settings file, then include it */ $local_settings = __DIR__ . "/settings.local.php"; if (file_exists($local_settings)) { include $local_settings; } // IPv4: Single IPs and CIDR. // See https://en.wikipedia.org/wiki/Classless_Inter-Domain_Routing $request_ip_blacklist = [ '192.0.2.38', '172.16.58.3', '172.16.58.3/30', '192.168.3.11/24', '192.168.3.11', ]; $request_remote_addr = $_SERVER['REMOTE_ADDR']; // Check if this IP is in black list. if (!$request_ip_forbidden = in_array($request_remote_addr, $request_ip_blacklist)) { // Check if this IP is in CIDR black list. 
foreach ($request_ip_blacklist as $_cidr) { if (strpos($_cidr, '/') !== FALSE) { $_ip = ip2long($request_remote_addr); list ($_net, $_mask) = explode('/', $_cidr, 2); $_ip_net = ip2long($_net); $_ip_mask = ~((1 << (32 - $_mask)) - 1); if ($request_ip_forbidden = ($_ip & $_ip_mask) == ($_ip_net & $_ip_mask)) { break; } } } } if ($request_ip_forbidden) { header('HTTP/1.0 403 Forbidden'); exit; }
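The blacklist loop in the settings.php above tests the request IP against each CIDR range using `ip2long` and a bitmask. As a hedged illustration of that same arithmetic (the `CidrCheck`, `inCidr`, and `ipToLong` names are hypothetical, not part of the site code), the check can be sketched in Java:

```java
public class CidrCheck {
    // Equivalent of PHP's ip2long(): pack four dotted-quad octets into one integer.
    static long ipToLong(String ip) {
        long v = 0;
        for (String part : ip.split("\\.")) {
            v = (v << 8) | Long.parseLong(part);
        }
        return v;
    }

    // Same test as the settings.php loop: mask both addresses with the
    // network mask derived from the prefix length, then compare network parts.
    static boolean inCidr(String ip, String cidr) {
        String[] parts = cidr.split("/", 2);
        int prefix = Integer.parseInt(parts[1]);
        long mask = prefix == 0 ? 0 : ~((1L << (32 - prefix)) - 1) & 0xFFFFFFFFL;
        return (ipToLong(ip) & mask) == (ipToLong(parts[0]) & mask);
    }

    public static void main(String[] args) {
        System.out.println(inCidr("192.168.3.50", "192.168.3.11/24")); // true
        System.out.println(inCidr("192.168.4.1", "192.168.3.11/24"));  // false
    }
}
```

As in the PHP original, the range is identified by any address inside it (192.168.3.11/24 behaves like 192.168.3.0/24) because both sides are masked before comparison.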
cc59f4df4706b8701b7f8bfe167606507fda945b
[ "Markdown", "PHP" ]
2
Markdown
a-sukhodolsky/alexsv-test-ci3
7129dfad759468eda5519da484485f018b94da38
62bcf352ead3bafd438b10b4e91aa02e78516724
refs/heads/master
<file_sep>#!/usr/bin/env python #-*- coding: utf8 -*- import os import time import sys print(""" R4PORT ToolBox Hoşgeldiniz Gerekli Dosyalari Kuralim mi? (y/n) """) welcome = raw_input("Gerekli islem komutunu gir karşim (y/n) : ") if welcome.lower() in ("y", "yes"): os.system("sudo apt-get update") os.system("clear") elif welcome.lower() in ("n", "no"): print("Gerekli Dosyalari kurmaktan vazgecildi.") else: time.sleep(1) print(""" Körmüsün? Orda yazıyor ne yapıcağın. """) sys.exit() print(""" Lütfen Giriş yapmak için Kullanıcı adı ve şifre giriniz (default: Kullanıcı adı: root şifre: R4PORT ) """) kuladi = "root" sifresi = "R4PORT" isletimsistemi = os.name posix = "xxx/Linux" nt = "/Windows xxx" if isletimsistemi != "posix": os.system("clear") print(""" Gerizekalı sen önce linux indir aq Rahatlığa bak """) time.sleep(1) sys.exit() kullanici_adi=raw_input("Kullanıcı adınızı giriniz : ") sifre=raw_input("Sifrenizi giriniz : ") if kullanici_adi==kuladi and sifre==sifresi: print("Giriş Başarılı...") time.sleep(2) print ("Kısa Bir süre bekleticez.. işletim sisteminiz. " + posix) time.sleep(2) os.system("clear") else: os.system("clear") print(""" Hatalı Giriş Kardeşim... Sana Bilgisayar Öğreteni sikim aq """) time.sleep(1) sys.exit() print (""" ____ _ _ ____ ___ ____ _____ _____ ___ ___ _ ____ _____ __ | _ \| || | | _ \ / _ \| _ \_ _| |_ _/ _ \ / _ \| | | __ ) / _ \ \/ / | |_) | || |_| |_) | | | | |_) || | | || | | | | | | | | _ \| | | \ / | _ <|__ _| __/| |_| | _ < | | | || |_| | |_| | |___| |_) | |_| / \ |_| \_\ |_| |_| \___/|_| \_\|_| |_| \___/ \___/|_____|____/ \___/_/\_\ iletişim: https://www.instagram.com/onurcan.root web site: https://www.teknoalerji.com github: https://github.com/R4PORT/R4PORT-TOOL-BOX Coded by: R4PORT (<NAME>) TR 01 - PORT TARAMA 02 - SQL Zafiyeti 03 - Site Tarama 04 - Trojan Oluşturma 05 - Güvenlik Duvarı Tespit 06 - AĞ Haritası 07 - Derleme 08 - Exploit Search 09 - Wifi 10 - Turkish Founder 11 - Şehitler 12 - İletişim """) secenek = raw_input("Yapmak isteğiniz komutu giriniz : ") if (secenek=="1"): os.system("clear") print(""" 01 - Pingleme 02 - Hızlı Tarama 03 - version Tarama 04 - Tcp Tarama 05 - İşletim Sistemi Tarama 06 - Port Tarama 07 - AĞ Haritası 08 - Çıkış """) tarama = raw_input("Yapmak istediğiniz komutu giriniz : ") if (tarama=="1"): ip1 = raw_input("İp Adresini giriniz : ") print("Taranıyor...") os.system("ping "+ ip1) if (tarama=="2"): ip2 = raw_input("İp Adresini Giriniz : ") print(""" Taranıyor...
""") os.system("nmap -sS "+ ip2) if (tarama=="3"): ip3 = raw_input("İP Adresini Giriniz : ") os.system("nmap -sS -sV -O " + ip3 ) if (tarama=="4"): ip3333 = raw_input("İP Adresini Giriniz : ") os.system("nmap -sT -sV -O " + ip3333) if (tarama=="5"): ip5 = raw_input("Yapmak isteğiniz işlemi giriniz : ") os.system("nmap -O " + ip5) if (tarama=="6"): ip6 = raw_input("İp Adresini girniz : ") port6 = raw_input("Port Giriniz : ") os.system("nmap -sT -n -v -sV -p" + port6 + " " + ip6) elif (tarama=="7"): ip7 = raw_input("İP Adresi giriniz : ") print("Ağ Haritası Çıkarılıyor Lütfen Bekleyiniz..") time.sleep(2) os.system("traceroute " + ip7 ) if (secenek=="3"): print(""" 01 - Nikto İle Tarana 02 - Wordpress ile tarama 03 - nmap ile Tarama 04 - Dmitry Tarama 05 - dirbuster Tarama (Web sitenin alt dizinleri) """) niktohost = raw_input("Yapmak İsteğiniz işlemi yazınız.") if (niktohost=="1"): ipgir = raw_input("İp Adresini giriniz : ") os.system("nikto -h "+ ipgir) elif (niktohost=="2"): print(""" 01 - Tam Tarama 02 - Tema Tarama 03 - Plugin tarama 04 - users tarama """) wordpressip = raw_input("host/ip adresini giriniz : ") wordpress = raw_input("Yapmak isteğiniz işlemi giriniz : ") if (wordpress=="1"): os.system("clear") os.system("wpscan --url " + wordpressip) elif (wordpress=="2"): os.system("clear") os.system("wpscan --url " + wordpressip + " --enumerate t") elif (wordpress=="3"): os.system("clear") os.system("wpscan --url " + wordpressip + " --enumerate p") elif (wordpress=="4"): os.system("clear") os.system("wpscan --url " + wordpressip + " --enumerate u") elif (niktohost=="3"): ipgirnmapp = raw_input("domain/ip adresini giriniz : ") os.system("nmap -sT --script=vuln -p80 " + ipgirnmapp) elif (niktohost=="4"): dmitry = raw_input("Domain/ip adresini giriniz : ") os.system("dmitry -winsepfb " + dmitry) elif (niktohost=="5"): os.system("dirbuster") if (secenek=="4"): print(""" 01 - windows/meterpreter/reverse_tcp 02 - android/meterpreter/reverse_tcp """) trojan = 
raw_input("Yapmak isteğiniz komutu giriniz : ") ip123 = raw_input("LOCAL/DIŞ Ip Adresini Giriniz : ") port1233 = raw_input("port giriniz : ") konum = raw_input("trojanın konumunu belirtiniz (root/Desktop/) : ") if (trojan=="1"): os.system("msfvenom -p windows/meterpreter/reverse_tcp LHOST=" + ip123 + " LPORT=" + port1233 + " -o " + konum + "output.exe") if (trojan=="2"): os.system("msfvenom -p android/meterpreter/reverse_tcp LHOST=" + ip123 + " LPORT=" + port1233 + " -o " + konum + "output.apk") if (secenek=="5"): print(""" 01 - wafW00f 02 - Nmap İle Tarama """) protectdetection = raw_input("Yapmak isteğiniz işlemi giriniz : ") if (protectdetection=="1"): os.system("clear") time.sleep(1) ip00 = raw_input("İp Adresini giriniz : ") print(" Taranıyor ") os.system("wafw00f " + ip00) if (protectdetection=="2"): os.system("clear") ipver = raw_input("ip Adresini giriniz : ") print("Taranıyor...") time.sleep(1) os.system("nmap -p80 --script=vuln " + ipver) if (secenek=="6"): ipver1 = raw_input("İp Adresini giriniz : ") os.system("traceroute " + ipver1) print("Ağ Haritası Çıkarıldı.") if (secenek=="7"): derleme = raw_input("Derlenicek dosyanın konumunu giriniz > ") print("Daha Kodlanmadı...") if (secenek=="8"): os.system("clear") print ("Lütfen Bekleyiniz.") time.sleep(1) bilgi1istek = raw_input("Exploiti Aratınız : ") print("Sonuçlar.") time.sleep(1) os.system("searchsploit " + bilgi1istek) elif (secenek =="9"): agkarti = raw_input("Ağ Kartını(interface) biliyormusunuz? (y/n) : ") if (agkarti =="y"): interfacegir = raw_input("interface giriniz (orn: wlan0) : ") print(""" 01 interface bulma 02 Airmon-ng monitor moduna alma 03 Airmon-ng monitor modunu kapatma 04 Airodump-ng Dinleme Moduna Alma 05 Aireplay-ng Yetkisizlendirme saldırısı 06 Airodump-ng handshake alma """) wifi = raw_input("yapmak istediginiz islemi giriniz : ") if (wifi =="1"): print ("interface bulmaniz icin ifconfig ciktisinda aygita verilen ismi bulunuz") os.system("ifconfig") elif (wifi =="2"): print("\n lütfen bekleyiniz...") time.sleep(1) os.system("airmon-ng start " + interfacegir) elif (wifi =="3"): os.system("airmon-ng stop " + interfacegir + "mon") elif (wifi =="4"): os.system("airodump-ng " + interfacegir + "mon") elif (wifi =="5"): print(""" Gerekli bilgiler Kablosuz ağ mac İstemci mac """) istemciair = raw_input("istemci(client) Mac Adresini girin : ") paketsayisiair = raw_input("paket sayısını girin : ") accesspointair = raw_input("AccessPoint mac adresini girin : ") dinleme = raw_input("interface girin (mon) : ") os.system("aireplay-ng --deauth " + paketsayisiair + " -a " + accesspointair + " -c " + istemciair + " " + dinleme) elif (wifi =="6"): accesspoint = raw_input("AccessPoint Mac Adresini girin : ") accesspointchannel = raw_input("Channel (Kanal 'C') giriniz : ") os.system("airodump-ng --bssid " + accesspoint + " -c " + accesspointchannel + " " + interfacegir + "mon") elif (secenek =="12"): os.system("clear") print (""" iletisim bilgileri : <EMAIL> instagram : https://www.instagram.com github : https://github.com/R4PORT/R4PORT-TOOL-BOX """) elif (secenek =="11"): print("Şehitler Listesi.") print ("<NAME>") print ("<NAME>. Bçvş.
<NAME>") print ("<NAME>, <NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>, <NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("H<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Ak<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Ceng<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Ak<NAME>") print("Çet<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Su<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Uf<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Erol İnce") print("Bi<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") 
print("<NAME>") print("<NAME>") print("Köksal Kaşaltı") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Y<NAME>") print("<NAME>") print("<NAME>") print("Sev<NAME>üngör") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("unus Uğur") print("<NAME>") print("Ay<NAME>az") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>anoğlu") print("Onur Kılıç") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("Şü<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>. Al<NAME>") print("Topçu Astsb. Kd. Bçvş. B<NAME>") print("P. Uzm. Çvş. 
<NAME>") print("Rüstem Resul Perçini") print("Mesut Acu") print("Resul Kaptancı") print("<NAME>") print("<NAME>") print("Sevgi Yeşilyurt") print("Şenol Sağman") print("Zekeriya Bitmez") print("Yılmaz Ercan") print("<NAME>") print("<NAME>") print("br<NAME>") print("<NAME>") print("<NAME>") print("D<NAME>am") print("<NAME>") print("<NAME>cioğlu") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print("<NAME>") print ("Lütfi Gülşen") print ("Mesut Yağan") print ("<NAME>") print ("<NAME>") print ("<NAME>") print ("Medet İkizceli") print ("<NAME>") print ("Bül<NAME>alı") print ("Hüseyin Güntekin") elif (secenek =="10"): print(""" NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNmmNNNmmNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNmmmNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNNNNNNNNNNNmmNmNNNNNNNNNNNmNmmNNNNmmNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNmmNNNmNNNmmmddmmmmmNNNNNNNNNNNNNNNmmNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNmNNmmmmdmdyyhhydmmmNNNNNNNNNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNmmmmddhhyo // + sssyyymmmhyydmNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNNNNNNNNNmmdmmho /// ::: / O / + / ++ SHS + hdmNNNNNNNNNNNNNNNNNNNNN NNNNNNNNNNNNNmmNNNNNmmmmmmh /: - ::::: /////////: shmNNNNNmmNNNNNNNNNNNNNN NNNNNNNNNNNNmmmmmNmmysyhhddooosssossoo ++ / :: + / shmNNNNNNNNNmmNNNNNNNNNN NNNNNNNNNNNNmmmmmNm +: -: / shdhhyysoo + / + ooosyyyooo + ohmNNNNNNNNNNNNNNNNNN NNNNNNNNNNNmmNNmmmy ::: ++ ohyyo + /: ---- :::: / + syhyys + / ymmmNNNNNNNNNNNNNNN NNNNNNNNNNNmdddhhy /: + s + - + // ---.....-----: / ohdhyoooymmmdmmNNNNNNNNNNN NNNNNNNNNNmmdhs /:/::+/-...............---::/ shhysssyhdhosdmNNNNNNNNNN NNNNNNNNNNNmmh + /: ----..................----: + yhysoossyhoshdmNNNNNNNNN NNNNNNNNNNNmmho /---......................---/++ ssssydyhhsyhdmNNNNNNNN NNNNNNNNNNNmmdyo 
[garbled ASCII-art portrait omitted: the remainder of this banner art was corrupted in extraction and is not recoverable] <NAME>.
""") <file_sep># R4PORT-TOOL-BOX

[R4PORT TOOLBOX ASCII-art banner]

contact: https://wwww.instagram.com/onurcan.root
web site: https://www.teknoalerji.com
github: https://github.com/R4PORT/R4PORT-TOOL-BOX
Coded by: R4PORT (<NAME>)
python2

Username: root  Password: R4PORT
username : root Password: <PASSWORD>

To update the repository, use the `apt-get update` parameter. The toolbox software supports Kali 2018.x.

Contents:

01 - Port Scanning
02 - SQL Vulnerability
03 - Site Scanning
04 - Trojan Creation
05 - Firewall Detection
06 - Network Mapping
07 - Compilation
08 - Exploit Search
09 - Wifi
10 - Turkish Founder
11 - Martyrs
12 - Contact

List of Martyrs.

<NAME> <NAME>. <NAME> İlhan Varank, Erol Olçok Abdull<NAME> Olçok <NAME> <NAME>aplan Ü<NAME> <NAME> <NAME> <NAME> <NAME> <NAME> Yasin Yılmaz <NAME> Recep Gündüz Hüseyin Kısa Hal<NAME>ldırım Fazıl Gürs, Metin Arslan Osman Yılmaz Mehmet Oruç Lokman Oktay Mahmut Coşkunsu Muhammed Ali Aksu Muhammed Ambar <NAME> <NAME>ı Yasin Naci Ağaroğlu <NAME>cı <NAME>ıkgöz <NAME> <NAME> <NAME> İbrahim Yılmaz Muham<NAME> <NAME>ın <NAME> Tolga Ecebalın Ümit Çoban <NAME> <NAME> Yusuf Elitaş Emrah Sapa Hasan Yılmaz <NAME> <NAME> <NAME> Yasin Yılmaz Ali Anar Eyyüp Oğuz Nedip Cengiz Eker <NAME> <NAME> Bülent Yurtseven Murat Alkan Ahmet Oruç Cüneyt Bursa Mucip Arıgan Burak Cantürk Fahrettin Yavuz Hakan Yorulmaz Adil Büyükcengiz Burhan Öner Haki Aras Ahmet Kara Fatih Kalu Askeri Çoban Celaleddin İbiş <NAME> Fatih Satır Halil Işılar Akın Sertçelik Ayhan Keleş Cemal Demir <NAME>ı Cengiz Polat İhsan Yıldız İzzet Özkan Mehmet Şefik Akif Kapaklı Çetin Can Hakan Ünver Hasan Kaya İsmail Kefal Lokman Biçinci Mete Sertbaş Must<NAME> Yunus <NAME> Salih
Alışkan Suat Aloğlu Tim<NAME>ur Ömer Takdemir Sümer Deniz Yusuf Çelik Dursun Acar Alpaslan Yazıcı Akif Altay Münir Murat Ertekin Must<NAME> Önder Güzel Cennet Yiğit Gülşah Güler Ufuk Baysan Fikret Metin Öztürk Kübra Doğanay Mu<NAME>mitçi Zeynep Sağır Demet Sezen Erol İnce Birol Yavuz Faruk Demir Hal<NAME> Hüseyin Gora Hurşit Uzel Hüseyin Kalkan Fevzi Başaran Hakan Yorulmaz <NAME>aya Niyazi Ergüven Must<NAME> Mu<NAME>ınç Mehmet Karacatilki Murat Ellik Seher Yaşar Mehmet Demir Köksal Kaşaltı Mehmet Çetin Münir Alkan Mehmet Şevket Uzun Ozan Özen <NAME> Halit Gülser Zafer Koyuncu Hüseyin Goral Hüseyin Kalkan Serhat Koç Varol Tosun Edip Zengin Velit Bekdaş Yakup Sürüc Turgut Solak Seyit Ahmet Çakır Sevda Güngör Mehmet Demir Kemal Tosun Hasan Gülhan Meriç Alemdar Mehmet Akif Sancar unus Uğur Fırat Bulut Ayşe Aykaz Barış Efe Mehmet Ali Kılıç Mahir Ayabak Murat Mertel Murat Naiboğlu Ahmet Kocabay Ahmet Özsoy Mehmet Yılmaz Onur Ensar Ayanoğlu Onur Kılıç Cuma Dağ <NAME> <NAME> Mehmet Kocakaya Erkan Yiğit <NAME> Fuat Bozkurt Oğuzhan Yaşar Aydın Çopur Beytullah Yeşilay <NAME> Erkan Er Gökhan Eser Has<NAME>ın Mehmet Kocakaya Me<NAME> Mehmet Ali Urel Hasan Yılmaz Yıldız Gürsoy Uhud Kadir Işık Türkmen Tekin Suat Akıncı Ali Alıtkan Aytekin Kuru Ahmet Oruç Mehmet Oruç Yusuf Çelik Ömer İpek Murat İnci <NAME> <NAME> Köksal Karmil Vahit Kaşçıoğlu Vedat Barceğci Mutlu Can Kılıç Tahsin Gerekli Şükrü Bayrakçı Ömer Cankatar Recep Büyük Batuhan Ergin <NAME> Kader Sivri Orhun Göytan Ömer Cankatar Samet Uslu Battal İlgün Şeyhmus Demir Şirin Diril Özgür Gençer Vedat Büyüköztaş P. Kur. Alb. Sait Ertürk Topçu Astsb. Kd. Bçvş. <NAME> P. Uzm. Çvş. 
<NAME>ş<NAME> Rüstem Resul Perçini Mesut Acu Resul Kaptancı Fatih Dalgıç <NAME> Sevgi Yeşilyurt Şenol Sağman Zekeriya Bitmez Yılmaz Ercan Jouad Merroune <NAME>ye brahim Ateş <NAME>doğdu <NAME> Davut Karaçam <NAME>çı Necmi Bahadır Denizcioğlu Mehmet Şengül Öz<NAME>i <NAME>ülşen <NAME> <NAME>lu Lütfi Gülşen <NAME> <NAME>ım <NAME> <NAME> <NAME>kizceli <NAME> <NAME> <NAME> <file_sep>import os  # needed for os.system below; missing in the original script

print("""
R4PORT TOOLBOX
Updating the software repository...
""")
os.system("git clone https://github.com/R4PORT/R4PORT-TOOL-BOX.git")
ff9520f573866c2ad421b8143f548c5ede6b5600
[ "Markdown", "Python" ]
3
Python
R4PORT/R4PORT-TOOL-BOX
d349ccc985325214abac6cf1c6c6a693fb5747b7
d0707697d5416266763fb5838676b741ad4b35d3
refs/heads/master
<file_sep># Generated by Django 3.1.6 on 2021-09-05 13:47 from django.conf import settings from django.db import migrations, models import django.db.models.deletion class Migration(migrations.Migration): initial = True dependencies = [ migrations.swappable_dependency(settings.AUTH_USER_MODEL), ] operations = [ migrations.CreateModel( name='course', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('code', models.CharField(max_length=10)), ('name', models.CharField(max_length=20)), ('course_student', models.JSONField()), ], ), migrations.CreateModel( name='teacher', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('user', models.OneToOneField(on_delete=django.db.models.deletion.CASCADE, to=settings.AUTH_USER_MODEL)), ], ), migrations.CreateModel( name='lecture', fields=[ ('id', models.AutoField(auto_created=True, primary_key=True, serialize=False, verbose_name='ID')), ('topic', models.CharField(max_length=20)), ('date', models.DateTimeField()), ('duration', models.DurationField()), ('attendance', models.JSONField()), ('lecturer', models.ForeignKey(on_delete=django.db.models.deletion.DO_NOTHING, to='teacher.teacher')), ('subject', models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='teacher.course')), ], ), migrations.AddField( model_name='course', name='course_teacher', field=models.ForeignKey(on_delete=django.db.models.deletion.CASCADE, to='teacher.teacher'), ), ] <file_sep># Generated by Django 3.1.6 on 2021-09-05 17:15 from django.db import migrations, models class Migration(migrations.Migration): dependencies = [ ('teacher', '0001_initial'), ] operations = [ migrations.RemoveField( model_name='lecture', name='duration', ), migrations.AlterField( model_name='lecture', name='attendance', field=models.JSONField(default=list), ), ] <file_sep># Generated by Django 3.1.6 on 2021-09-05 15:06 from django.db import migrations class 
Migration(migrations.Migration): dependencies = [ ('student', '0002_auto_20210905_1917'), ] operations = [ migrations.RemoveField( model_name='student', name='rollno', ), ] <file_sep>{% extends 'base.html' %} {% load static %} {% block title%} Portal {% endblock%} {% block content%} <div class="container-fluid"> {% if lectures %} <div class="row"> <div class="col-md-12"> <div class="card"> <div class="card-header card-header-primary"> <h4 class="card-title ">Lectures</h4> <p class="card-category"></p> </div> <div class="card-body"> <div class="table-responsive"> <table class="table"> <thead class=" text-primary"> <th> Topic </th> <th> Subject </th> <th> Date </th> <th> Attendance </th> </thead> <tbody> {% for i in lectures %} <tr> <td>{{i.topic}} </td> <td>{{i.subject}} </td> <td>{{i.date}} </td> <td> {% if i.approved is True %}Approved {% else %} <a href="{% url 'teacher:approve' i.topic %}" class="btn btn-primary btn-sm">Give approval </a> {% endif %} </tr> {% endfor %} </tbody> </table> </div> </div> </div> </div> </div> {% endif %} </div> {% endblock %} <file_sep> from django.contrib import admin from django.urls import path from . import views app_name = 'teacher' urlpatterns = [ path('', views.TeacherPage.as_view(),name = 'TeacherPage'), path('Admin/', views.AdminPage.as_view(),name = 'AdminPage'), path('scheduleLec/', views.scheduleLec.as_view(),name = 'scheduleLec'), path('markAttend/<str:topic>/', views.markAttend.as_view(),name = 'markAttend'), path('attendance/<topic>/', views.attendance.as_view(),name = 'attendance'), path('approve/<topic>/', views.approve.as_view(),name = 'approve'), ] <file_sep>from django.shortcuts import render,redirect from django.views.generic.base import View from django.contrib.auth import authenticate, login, logout from django.contrib.auth.models import Group, User # Create your views here. 
class StudentPage(View): def get(self,request,template_name='studentpage.html'): return render(request,template_name)<file_sep># Attendance Portal ## User roles: ### Teacher 1. Schedule lecture 2. Mark Attendance 3. View Attendance ### Admin 1. Create course 2. Allocate teacher for the course 3. Enroll students in the course 4. Approve Attendance Refer demo video file. <file_sep>from django.db import models from django.contrib.auth.models import User from django.db.models.fields.related import ForeignKey from django.db.models import OneToOneField # Create your models here. class student(models.Model): user: OneToOneField = models.OneToOneField(User, on_delete=models.CASCADE) def __str__(self): return self.user.first_name+" "+self.user.last_name<file_sep>from django.shortcuts import render,redirect from django.views.generic.base import View from django.contrib.auth import authenticate, login, logout from django.contrib.auth.models import Group, User from .models import course, lecture, teacher # Create your views here. 
class AdminPage(View): def get(self,request,template_name='adminpage.html'): message={} lectures = lecture.objects.all() message['lectures']=lectures return render(request,template_name,message) class TeacherPage(View): def get(self,request,template_name='teacherpage.html'): message={} thisteacher = teacher.objects.filter(user = request.user) thisteacher = thisteacher[0] lectures = lecture.objects.filter(lecturer = thisteacher) message['lectures']=lectures return render(request,template_name,message) class scheduleLec(View): def get(self,request,template_name='schedule_lec.html'): return render(request,template_name) def post(self,request,template_name='schedule_lec.html'): message={} topic = request.POST.get('topic') subject = request.POST.get('subject') Subject = course.objects.all().filter(code = subject) attendance = ["0"] date = request.POST.get('date') lecturer = teacher.objects.filter(user = request.user) try: lectureData = lecture(topic=topic,subject=Subject[0],date=date,lecturer=lecturer[0],attendance=attendance) lectureData.save() message['error']="Lecture scheduled." except: message['error']="Something went wrong!" 
return render(request,template_name,message) class markAttend(View): def get(self,request,topic,template_name="mark_attendance.html"): message={} Lecture = lecture.objects.filter(topic=topic) message['lecture']= Lecture[0] Course = Lecture[0].subject students = Course.course_student message['students']= students return render(request,template_name,message) def post(self,request,topic,template_name="mark_attendance.html"): attendance = request.POST.getlist('list') lecture.objects.filter(topic=topic).update(attendance=attendance) Lecture = lecture.objects.filter(topic=topic) print(Lecture) print(attendance) message={} message['lecture']=Lecture return render(request,template_name,message) class attendance(View): def get(self,request,topic,template_name="attendance.html"): message={} Lecture = lecture.objects.filter(topic=topic) message['lecture']= Lecture[0] Course = Lecture[0].subject students = Course.course_student message['students']= students return render(request,template_name,message) class approve(View): def get(self,request,topic,template_name="approve.html"): message={} Lecture = lecture.objects.filter(topic=topic) message['lecture']= Lecture[0] Course = Lecture[0].subject students = Course.course_student message['students']= students return render(request,template_name,message) def post(self,request,topic,template_name="approve.html"): lecture.objects.filter(topic=topic).update(approved=True) return redirect('teacher:AdminPage')<file_sep>from typing import DefaultDict from django.db import models from django.db.models import OneToOneField from django.contrib.auth.models import User from django.db.models.deletion import CASCADE, DO_NOTHING # Create your models here. 
class teacher(models.Model): user: OneToOneField = models.OneToOneField(User, on_delete=models.CASCADE) def __str__(self): return self.user.first_name+" "+self.user.last_name class course(models.Model): code = models.CharField(max_length=10) name = models.CharField(max_length=20) course_teacher = models.ForeignKey(teacher, on_delete=CASCADE) course_student = models.JSONField(default=list) def __str__(self): return self.name class lecture(models.Model): topic = models.CharField(max_length=20) subject = models.ForeignKey(course, on_delete=CASCADE) date = models.DateTimeField() attendance = models.JSONField(default=list) lecturer = models.ForeignKey(teacher, on_delete=DO_NOTHING) approved = models.BooleanField(default=False) def __str__(self): return self.topic<file_sep> from django.contrib import admin from django.urls import path from . import views app_name = 'student' urlpatterns = [ path('', views.StudentPage.as_view(),name = 'StudentPage'), ] <file_sep># Generated by Django 3.1.6 on 2021-09-05 21:36 from django.db import migrations, models class Migration(migrations.Migration): dependencies = [ ('teacher', '0002_auto_20210905_2245'), ] operations = [ migrations.AddField( model_name='lecture', name='approved', field=models.BooleanField(default=False), ), migrations.AlterField( model_name='course', name='course_student', field=models.JSONField(default=list), ), ] <file_sep>from django.contrib import admin from .models import course, lecture, teacher # Register your models here. admin.site.register(teacher) admin.site.register(course) admin.site.register(lecture)<file_sep>from django.shortcuts import render,redirect from django.views.generic.base import View from django.contrib.auth import authenticate, login, logout from django.contrib.auth.models import Group, User # Create your views here. 
class landing(View): def get(self, request, template_name='landing.html'): return render(request, template_name) class Login(View): def get(self, request, template_name='login.html'): return render(request,template_name) def post(self, request, template_name='login.html'): username = request.POST.get('username') password = request.POST.get('password') user = authenticate(username=username,password=<PASSWORD>) group = None if user is not None: if user.is_active: login(request, user) group = user.groups.all()[0].name if group == 'student_group': return redirect('student:StudentPage') if group == 'teacher_group': return redirect('teacher:TeacherPage') if group == 'admin_group': return redirect('teacher:AdminPage') else: return render(request, 'landing.html') else: return render(request, template_name, {'error_message': 'Your account has been disabled'}) else: return render(request, template_name, {'error_message': 'Invalid Login'}) class Logout(View): def get(self, request,template_name="landing.html"): logout(request) return render(request, 'landing.html')
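The `Login` view above redirects a freshly authenticated user according to the name of their first Django auth group. As a minimal, framework-free sketch of that dispatch (the group names and named routes are taken from the view; the dict and helper are hypothetical, not part of the repo):

```python
# Hypothetical standalone sketch of the group-to-route dispatch in Login.post.
# Group names and route names mirror the view; everything else is illustrative.
GROUP_ROUTES = {
    'student_group': 'student:StudentPage',
    'teacher_group': 'teacher:TeacherPage',
    'admin_group': 'teacher:AdminPage',
}

def landing_route(group_name: str, default: str = 'landing') -> str:
    """Return the named route a logged-in user should be redirected to."""
    return GROUP_ROUTES.get(group_name, default)
```

In the actual view this mapping is written as a chain of `if` statements; a dict lookup keeps the same behavior while making the fall-through to the landing page explicit.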
a8f047afad8e144fdf44074480c1523051fbee65
[ "Markdown", "Python", "HTML" ]
14
Python
VaishnaviM411/portal
a69978d329c0bb6033bcd0a1fcfa0d2dcd611c83
dfa449a2d1050c9282eec62edc84e6e2a675ea9a
refs/heads/master
<file_sep><?php namespace Database\Seeders; use Illuminate\Database\Seeder; use Illuminate\Support\Facades\DB; class PostSeeder extends Seeder { /** * Run the database seeds. * * @return void */ public function run() { $posts = []; $now = now(); $body = file_get_contents(__DIR__ . '/data/post.json'); $summary = 'This is meaningless text meant to demonstrate an actual post summary for the blog layout.'; for ($i = 1; $i <= 10; $i++) { $posts[] = [ 'category' => "tech", 'title' => "Post $i", 'slug' => "post-$i", 'summary' => "This Post $i on Maseno Hub blog. $summary", 'body' => $body, 'author_id' => 1, 'published_at' => random_int(0, 10) < 7 ? $now : null, 'created_at' => $now, 'updated_at' => $now ]; } DB::table('posts')->insert($posts); } } <file_sep><p align="center"> <a href="#" target="_blank"> <img src="public/logo.svg" width="256"> </a> </p> <p align="center"> <a href="https://github.com/sixpeteunder/website/actions/workflows/code-quality.yml" target="_blank"> <img src="https://github.com/sixpeteunder/website/actions/workflows/code-quality.yml/badge.svg"> </a> </p> ## Website Maseno Hub website concept. Built on [TALL](tallstack.dev/) stack: - [TailwindCSS 2.x](https://tailwindcss.com/) - [AlpineJS 2.x](https://github.com/alpinejs/alpine) - [Laravel 8.x](https://laravel.com) - [Livewire 2.x](https://laravel-livewire.com) ## Project setup ```bash composer install npm install ``` ## Build frontend assets ```bash npm run dev ``` ## Migrate database ```bash php artisan migrate ``` ## Start a local development server ```bash php artisan serve ``` ## Contributing - Always start a feature branch, avoid pushing directly to master. - Make sure all checks are passing (use the `composer code:check` and `composer code:fix` helpers) ## License The Maseno Hub website is open-sourced software licensed under the [MIT license](https://opensource.org/licenses/MIT). 
<file_sep>import Quill from "quill"; const quill = new Quill("#body", { readOnly: true }); quill.setContents(JSON.parse(body));<file_sep><?php namespace App\Http\Livewire\Admin; use App\Models\User; use Mediconesystems\LivewireDatatables\Http\Livewire\LivewireDatatable; class UsersTable extends LivewireDatatable { public $model = User::class; public $exportable = true; public $searchable = 'title'; public $hideable = 'select'; public $exclude = ['current_team_id']; public $sort = 'id|asc'; // public function columns() // { // // // } } <file_sep><?php namespace App\Models; use Illuminate\Database\Eloquent\Factories\HasFactory; use Illuminate\Database\Eloquent\Model; use PhpParser\Node\Expr\PostDec; /** * @property User $author * @property Post $newer * @property Post $older */ class Post extends Model { use HasFactory; /** * The attributes that should be cast to native types. * * @var array */ protected $casts = [ 'published_at' => 'datetime', ]; /** * Get the author that owns the post. 
*/ public function author() { return $this->belongsTo(User::class, 'author_id'); } public function getNewerAttribute() { return self::where('id', '>', $this->id)->orderBy('id', 'asc')->first(); } public function getOlderAttribute() { return self::where('id', '<', $this->id)->orderBy('id', 'desc')->first(); } } <file_sep><?php namespace App\Http\Controllers; use App\Models\Question; use Illuminate\Http\Request; use Illuminate\Support\Facades\Auth; class QuestionController extends Controller { public function index() { return view('questions.index', [ 'questions' => Question::with(['author', 'answers'])->get() ]); } public function new() { return view('questions.new'); } public function create(Request $request) { $data = $request->validate([ 'title' => ['required', 'string', 'max:255'], 'body' => ['required', 'json',] ]); $question = new Question(); $question->title = $data['title']; $question->body = $data['body']; $question->author()->associate(Auth::user()); $question->save(); return redirect()->route('questions.show', ['question' => $question->id]); } public function show($question) { return view('questions.show', [ 'question' => Question::with(['author', 'answers'])->find($question) ]); } } <file_sep><?php namespace App\Http\Controllers; use App\Models\Post; use Illuminate\Http\Request; class PostController extends Controller { public function index() { return view('posts.index', [ 'posts' => Post::with(['author'])->whereNotNull(['published_at'])->orderBy('id', 'desc')->get() ]); } public function show($post) { return view('posts.show', [ 'post' => Post::with(['author'])->find($post)->append(['newer', 'older']) ]); } } <file_sep><?php namespace App\Models; use Illuminate\Database\Eloquent\Collection; use Illuminate\Database\Eloquent\Factories\HasFactory; use Illuminate\Database\Eloquent\Model; use Spatie\Sluggable\{HasSlug, SlugOptions}; /** * @property Collection $answers * @property User $author */ class Question extends Model { use HasFactory; use HasSlug; 
/** * Get the answers for the question. */ public function answers() { return $this->hasMany(Answer::class); } /** * Get the author that owns the question. */ public function author() { return $this->belongsTo(User::class, 'author_id'); } public function getSlugOptions(): SlugOptions { return SlugOptions::create() ->generateSlugsFrom('title') ->saveSlugsTo('slug'); } } <file_sep><?php namespace Database\Seeders; use Illuminate\Database\Seeder; use Illuminate\Support\Facades\DB; class QuestionSeeder extends Seeder { /** * Run the database seeds. * * @return void */ public function run() { $questions = []; $now = now(); $body = file_get_contents(__DIR__ . '/data/question.json'); for ($i = 1; $i <= 10; $i++) { $questions[] = [ 'category' => random_int(0, 10) > 5 ? 'programming' : 'tech support', 'title' => "Question $i", 'slug' => "question-$i", 'body' => $body, 'author_id' => 1, 'created_at' => $now, 'updated_at' => $now ]; } DB::table('questions')->insert($questions); } } <file_sep>/******/ (() => { // webpackBootstrap /******/ "use strict"; /******/ var __webpack_modules__ = ({ /***/ "./node_modules/timeago.js/esm/format.js": /*!***********************************************!*\ !*** ./node_modules/timeago.js/esm/format.js ***! \***********************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "format": () => (/* binding */ format) /* harmony export */ }); /* harmony import */ var _utils_date__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ./utils/date */ "./node_modules/timeago.js/esm/utils/date.js"); /* harmony import */ var _register__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! 
./register */ "./node_modules/timeago.js/esm/register.js"); /** * format a TDate into string * @param date * @param locale * @param opts */ var format = function (date, locale, opts) { // diff seconds var sec = (0,_utils_date__WEBPACK_IMPORTED_MODULE_0__.diffSec)(date, opts && opts.relativeDate); // format it with locale return (0,_utils_date__WEBPACK_IMPORTED_MODULE_0__.formatDiff)(sec, (0,_register__WEBPACK_IMPORTED_MODULE_1__.getLocale)(locale)); }; //# sourceMappingURL=format.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/index.js": /*!**********************************************!*\ !*** ./node_modules/timeago.js/esm/index.js ***! \**********************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "format": () => (/* reexport safe */ _format__WEBPACK_IMPORTED_MODULE_3__.format), /* harmony export */ "render": () => (/* reexport safe */ _realtime__WEBPACK_IMPORTED_MODULE_4__.render), /* harmony export */ "cancel": () => (/* reexport safe */ _realtime__WEBPACK_IMPORTED_MODULE_4__.cancel), /* harmony export */ "register": () => (/* reexport safe */ _register__WEBPACK_IMPORTED_MODULE_2__.register) /* harmony export */ }); /* harmony import */ var _lang_en_US__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ./lang/en_US */ "./node_modules/timeago.js/esm/lang/en_US.js"); /* harmony import */ var _lang_zh_CN__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ./lang/zh_CN */ "./node_modules/timeago.js/esm/lang/zh_CN.js"); /* harmony import */ var _register__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! ./register */ "./node_modules/timeago.js/esm/register.js"); /* harmony import */ var _format__WEBPACK_IMPORTED_MODULE_3__ = __webpack_require__(/*! 
./format */ "./node_modules/timeago.js/esm/format.js"); /* harmony import */ var _realtime__WEBPACK_IMPORTED_MODULE_4__ = __webpack_require__(/*! ./realtime */ "./node_modules/timeago.js/esm/realtime.js"); /** * Created by hustcc on 18/5/20. * Contract: <EMAIL> */ (0,_register__WEBPACK_IMPORTED_MODULE_2__.register)('en_US', _lang_en_US__WEBPACK_IMPORTED_MODULE_0__.default); (0,_register__WEBPACK_IMPORTED_MODULE_2__.register)('zh_CN', _lang_zh_CN__WEBPACK_IMPORTED_MODULE_1__.default); //# sourceMappingURL=index.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/lang/en_US.js": /*!***************************************************!*\ !*** ./node_modules/timeago.js/esm/lang/en_US.js ***! \***************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "default": () => (/* export default binding */ __WEBPACK_DEFAULT_EXPORT__) /* harmony export */ }); var EN_US = ['second', 'minute', 'hour', 'day', 'week', 'month', 'year']; /* harmony default export */ function __WEBPACK_DEFAULT_EXPORT__(diff, idx) { if (idx === 0) return ['just now', 'right now']; var unit = EN_US[Math.floor(idx / 2)]; if (diff > 1) unit += 's'; return [diff + " " + unit + " ago", "in " + diff + " " + unit]; } //# sourceMappingURL=en_US.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/lang/zh_CN.js": /*!***************************************************!*\ !*** ./node_modules/timeago.js/esm/lang/zh_CN.js ***! 
\***************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "default": () => (/* export default binding */ __WEBPACK_DEFAULT_EXPORT__) /* harmony export */ }); var ZH_CN = ['秒', '分钟', '小时', '天', '周', '个月', '年']; /* harmony default export */ function __WEBPACK_DEFAULT_EXPORT__(diff, idx) { if (idx === 0) return ['刚刚', '片刻后']; var unit = ZH_CN[~~(idx / 2)]; return [diff + " " + unit + "\u524D", diff + " " + unit + "\u540E"]; } //# sourceMappingURL=zh_CN.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/realtime.js": /*!*************************************************!*\ !*** ./node_modules/timeago.js/esm/realtime.js ***! \*************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "cancel": () => (/* binding */ cancel), /* harmony export */ "render": () => (/* binding */ render) /* harmony export */ }); /* harmony import */ var _utils_dom__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! ./utils/dom */ "./node_modules/timeago.js/esm/utils/dom.js"); /* harmony import */ var _utils_date__WEBPACK_IMPORTED_MODULE_1__ = __webpack_require__(/*! ./utils/date */ "./node_modules/timeago.js/esm/utils/date.js"); /* harmony import */ var _register__WEBPACK_IMPORTED_MODULE_2__ = __webpack_require__(/*! 
./register */ "./node_modules/timeago.js/esm/register.js"); // all realtime timer var TIMER_POOL = {}; /** * clear a timer from pool * @param tid */ var clear = function (tid) { clearTimeout(tid); delete TIMER_POOL[tid]; }; // run with timer(setTimeout) function run(node, date, localeFunc, opts) { // clear the node's exist timer clear((0,_utils_dom__WEBPACK_IMPORTED_MODULE_0__.getTimerId)(node)); var relativeDate = opts.relativeDate, minInterval = opts.minInterval; // get diff seconds var diff = (0,_utils_date__WEBPACK_IMPORTED_MODULE_1__.diffSec)(date, relativeDate); // render node.innerText = (0,_utils_date__WEBPACK_IMPORTED_MODULE_1__.formatDiff)(diff, localeFunc); var tid = setTimeout(function () { run(node, date, localeFunc, opts); }, Math.min(Math.max((0,_utils_date__WEBPACK_IMPORTED_MODULE_1__.nextInterval)(diff), minInterval || 1) * 1000, 0x7fffffff)); // there is no need to save node in object. Just save the key TIMER_POOL[tid] = 0; (0,_utils_dom__WEBPACK_IMPORTED_MODULE_0__.setTimerId)(node, tid); } /** * cancel a timer or all timers * @param node - node hosting the time string */ function cancel(node) { // cancel one if (node) clear((0,_utils_dom__WEBPACK_IMPORTED_MODULE_0__.getTimerId)(node)); // cancel all // @ts-ignore else Object.keys(TIMER_POOL).forEach(clear); } /** * render a dom realtime * @param nodes * @param locale * @param opts */ function render(nodes, locale, opts) { // by .length // @ts-ignore var nodeList = nodes.length ? nodes : [nodes]; nodeList.forEach(function (node) { run(node, (0,_utils_dom__WEBPACK_IMPORTED_MODULE_0__.getDateAttribute)(node), (0,_register__WEBPACK_IMPORTED_MODULE_2__.getLocale)(locale), opts || {}); }); return nodeList; } //# sourceMappingURL=realtime.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/register.js": /*!*************************************************!*\ !*** ./node_modules/timeago.js/esm/register.js ***! 
\*************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "register": () => (/* binding */ register), /* harmony export */ "getLocale": () => (/* binding */ getLocale) /* harmony export */ }); /** * Created by hustcc on 18/5/20. * Contract: <EMAIL> */ /** * All supported locales */ var Locales = {}; /** * register a locale * @param locale * @param func */ var register = function (locale, func) { Locales[locale] = func; }; /** * get a locale, default is en_US * @param locale * @returns {*} */ var getLocale = function (locale) { return Locales[locale] || Locales['en_US']; }; //# sourceMappingURL=register.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/utils/date.js": /*!***************************************************!*\ !*** ./node_modules/timeago.js/esm/utils/date.js ***! \***************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "toDate": () => (/* binding */ toDate), /* harmony export */ "formatDiff": () => (/* binding */ formatDiff), /* harmony export */ "diffSec": () => (/* binding */ diffSec), /* harmony export */ "nextInterval": () => (/* binding */ nextInterval) /* harmony export */ }); /** * Created by hustcc on 18/5/20. 
* Contract: <EMAIL> */ var SEC_ARRAY = [ 60, 60, 24, 7, 365 / 7 / 12, 12, ]; /** * format Date / string / timestamp to timestamp * @param input * @returns {*} */ function toDate(input) { if (input instanceof Date) return input; // @ts-ignore if (!isNaN(input) || /^\d+$/.test(input)) return new Date(parseInt(input)); input = (input || '') // @ts-ignore .trim() .replace(/\.\d+/, '') // remove milliseconds .replace(/-/, '/') .replace(/-/, '/') .replace(/(\d)T(\d)/, '$1 $2') .replace(/Z/, ' UTC') // 2017-2-5T3:57:52Z -> 2017-2-5 3:57:52UTC .replace(/([+-]\d\d):?(\d\d)/, ' $1$2'); // -04:00 -> -0400 return new Date(input); } /** * format the diff second to *** time ago, with setting locale * @param diff * @param localeFunc * @returns */ function formatDiff(diff, localeFunc) { /** * if locale is not exist, use defaultLocale. * if defaultLocale is not exist, use build-in `en`. * be sure of no error when locale is not exist. * * If `time in`, then 1 * If `time ago`, then 0 */ var agoIn = diff < 0 ? 1 : 0; /** * Get absolute value of number (|diff| is non-negative) value of x * |diff| = diff if diff is positive * |diff| = -diff if diff is negative * |0| = 0 */ diff = Math.abs(diff); /** * Time in seconds */ var totalSec = diff; /** * Unit of time */ var idx = 0; for (; diff >= SEC_ARRAY[idx] && idx < SEC_ARRAY.length; idx++) { diff /= SEC_ARRAY[idx]; } /** * Math.floor() is alternative of ~~ * * The differences and bugs: * Math.floor(3.7) -> 4 but ~~3.7 -> 3 * Math.floor(1559125440000.6) -> 1559125440000 but ~~1559125440000.6 -> 52311552 * * More information about the performance of algebraic: * https://www.youtube.com/watch?v=65-RbBwZQdU */ diff = Math.floor(diff); idx *= 2; if (diff > (idx === 0 ? 9 : 1)) idx += 1; return localeFunc(diff, idx, totalSec)[agoIn].replace('%s', diff.toString()); } /** * calculate the diff second between date to be formatted an now date. 
* @param date * @param relativeDate * @returns {number} */ function diffSec(date, relativeDate) { var relDate = relativeDate ? toDate(relativeDate) : new Date(); return (+relDate - +toDate(date)) / 1000; } /** * nextInterval: calculate the next interval time. * - diff: the diff sec between now and date to be formatted. * * What's the meaning? * diff = 61 then return 59 * diff = 3601 (an hour + 1 second), then return 3599 * make the interval with high performance. **/ function nextInterval(diff) { var rst = 1, i = 0, d = Math.abs(diff); for (; diff >= SEC_ARRAY[i] && i < SEC_ARRAY.length; i++) { diff /= SEC_ARRAY[i]; rst *= SEC_ARRAY[i]; } d = d % rst; d = d ? rst - d : rst; return Math.ceil(d); } //# sourceMappingURL=date.js.map /***/ }), /***/ "./node_modules/timeago.js/esm/utils/dom.js": /*!**************************************************!*\ !*** ./node_modules/timeago.js/esm/utils/dom.js ***! \**************************************************/ /***/ ((__unused_webpack_module, __webpack_exports__, __webpack_require__) => { __webpack_require__.r(__webpack_exports__); /* harmony export */ __webpack_require__.d(__webpack_exports__, { /* harmony export */ "getDateAttribute": () => (/* binding */ getDateAttribute), /* harmony export */ "setTimerId": () => (/* binding */ setTimerId), /* harmony export */ "getTimerId": () => (/* binding */ getTimerId) /* harmony export */ }); var ATTR_TIMEAGO_TID = 'timeago-id'; /** * get the datetime attribute, `datetime` are supported. 
* @param node * @returns {*} */ function getDateAttribute(node) { return node.getAttribute('datetime'); } /** * set the node attribute, native DOM * @param node * @param timerId * @returns {*} */ function setTimerId(node, timerId) { // @ts-ignore node.setAttribute(ATTR_TIMEAGO_TID, timerId); } /** * get the timer id * @param node */ function getTimerId(node) { return parseInt(node.getAttribute(ATTR_TIMEAGO_TID)); } //# sourceMappingURL=dom.js.map /***/ }) /******/ }); /************************************************************************/ /******/ // The module cache /******/ var __webpack_module_cache__ = {}; /******/ /******/ // The require function /******/ function __webpack_require__(moduleId) { /******/ // Check if module is in cache /******/ if(__webpack_module_cache__[moduleId]) { /******/ return __webpack_module_cache__[moduleId].exports; /******/ } /******/ // Create a new module (and put it into the cache) /******/ var module = __webpack_module_cache__[moduleId] = { /******/ // no module.id needed /******/ // no module.loaded needed /******/ exports: {} /******/ }; /******/ /******/ // Execute the module function /******/ __webpack_modules__[moduleId](module, module.exports, __webpack_require__); /******/ /******/ // Return the exports of the module /******/ return module.exports; /******/ } /******/ /************************************************************************/ /******/ /* webpack/runtime/define property getters */ /******/ (() => { /******/ // define getter functions for harmony exports /******/ __webpack_require__.d = (exports, definition) => { /******/ for(var key in definition) { /******/ if(__webpack_require__.o(definition, key) && !__webpack_require__.o(exports, key)) { /******/ Object.defineProperty(exports, key, { enumerable: true, get: definition[key] }); /******/ } /******/ } /******/ }; /******/ })(); /******/ /******/ /* webpack/runtime/hasOwnProperty shorthand */ /******/ (() => { /******/ __webpack_require__.o = (obj, prop) 
=> (Object.prototype.hasOwnProperty.call(obj, prop)) /******/ })(); /******/ /******/ /* webpack/runtime/make namespace object */ /******/ (() => { /******/ // define __esModule on exports /******/ __webpack_require__.r = (exports) => { /******/ if(typeof Symbol !== 'undefined' && Symbol.toStringTag) { /******/ Object.defineProperty(exports, Symbol.toStringTag, { value: 'Module' }); /******/ } /******/ Object.defineProperty(exports, '__esModule', { value: true }); /******/ }; /******/ })(); /******/ /************************************************************************/ var __webpack_exports__ = {}; // This entry need to be wrapped in an IIFE because it need to be isolated against other modules in the chunk. (() => { /*!*************************************!*\ !*** ./resources/js/posts/index.js ***! \*************************************/ __webpack_require__.r(__webpack_exports__); /* harmony import */ var timeago_js__WEBPACK_IMPORTED_MODULE_0__ = __webpack_require__(/*! timeago.js */ "./node_modules/timeago.js/esm/index.js"); var nodes = document.querySelectorAll(".timeago"); (0,timeago_js__WEBPACK_IMPORTED_MODULE_0__.render)(nodes, "en_US"); })(); /******/ })() ;<file_sep><?php use App\Http\Controllers\{ EventController, PostController, ProjectController, QuestionController }; use App\Models\{ Answer, Event, Post, Project, Question, User }; use Illuminate\Support\Facades\{Auth, Route}; /* |-------------------------------------------------------------------------- | Web Routes |-------------------------------------------------------------------------- | | Here is where you can register web routes for your application. These | routes are loaded by the RouteServiceProvider within a group which | contains the "web" middleware group. Now create something great! 
| */ Route::get('/', fn () => view('home', [ 'event' => Event::with(['organizer'])->where('start', '>', now())->latest()->first(), 'stats' => [ 'events' => Event::count(), 'projects' => Project::count(), 'questions' => Question::count(), 'posts' => Post::whereNotNull(['published_at'])->count() ] ]))->name('home'); Route::middleware(['auth:sanctum', 'verified']) ->get('/dashboard', fn () => view('dashboard')) ->name('dashboard'); Route::prefix('events')->name('events.')->group(function () { Route::get('', [EventController::class, 'index']) ->name('index'); Route::get('/{event}', [EventController::class, 'show']) ->name('show') ->whereNumber('event'); }); Route::prefix('projects')->name('projects.')->group(function () { Route::get('', [ProjectController::class, 'index'])->name('index'); Route::get('/{project}', [ProjectController::class, 'show']) ->name('show') ->whereNumber('project'); }); Route::prefix('questions')->name('questions.')->group(function () { Route::get('', [QuestionController::class, 'index'])->name('index'); Route::middleware(['auth:sanctum', 'verified']) ->get('/new', [QuestionController::class, 'new']) ->name('new'); Route::middleware(['auth:sanctum', 'verified']) ->post('/new', [QuestionController::class, 'create']); Route::get('/{question}', [QuestionController::class, 'show']) ->name('show') ->whereNumber('question'); }); Route::prefix('posts')->name('posts.')->group(function () { Route::get('', [PostController::class, 'index'])->name('index'); Route::get('/{post}', [PostController::class, 'show']) ->name('show') ->whereNumber('post'); }); Route::prefix('users')->name('users.')->group(function () { Route::get('/{id}', fn ($id) => view('users.show', [ 'user' => User::with(['events', 'projects', 'questions', 'answers', 'posts']) ->find($id), 'questions' => Question::all(), 'answers' => Answer::with(['question'])->get(), 'posts' => Post::whereNotNull(['published_at'])->orderBy('id', 'desc')->get() ])) ->name('show'); });
Route::middleware(['auth:sanctum', 'verified']) ->get('/me', fn () => redirect(route('users.show', ['id' => Auth::id()]))) ->name('me'); Route::get('/about', fn () => view('about'))->name('about'); // Administrative routes Route::middleware([ 'auth:sanctum', 'verified', 'password.confirm', 'can:administrate' ]) ->prefix('admin') ->name('admin.') ->group(function () { Route::get('', fn () => view('admin.index', [ 'users' => User::count(), 'events' => Event::count(), 'projects' => Project::count(), 'questions' => Question::count(), 'answers' => Answer::count(), 'posts' => Post::whereNotNull(['published_at'])->count() ]))->name('index'); Route::prefix('users')->name('users.')->group(function () { Route::get('', fn () => view('admin.users.index'))->name('index'); }); Route::prefix('events')->name('events.')->group(function () { Route::get('', fn () => view('admin.events.index'))->name('index'); }); Route::prefix('projects')->name('projects.')->group(function () { Route::get('', fn () => view('admin.projects.index'))->name('index'); }); Route::prefix('questions')->name('questions.')->group(function () { Route::get('', fn () => view('admin.questions.index'))->name('index'); }); Route::prefix('posts')->name('posts.')->group(function () { Route::get('', fn () => view('admin.posts.index'))->name('index'); }); }); <file_sep><?php namespace Database\Seeders; use Carbon\CarbonImmutable; use Illuminate\Database\Seeder; use Illuminate\Support\Facades\DB; class EventSeeder extends Seeder { /** * Run the database seeds. 
* * @return void */ public function run() { $events = []; $now = now(); for ($i = 1; $i <= 10; $i++) { $start = CarbonImmutable::createFromDate(month: $i); $events[] = [ 'title' => "Event $i", 'description' => "Event $i starts on $start", 'venue' => "The usual place", 'start' => $start, 'end' => $start->addHours(4), 'organizer_id' => 1, 'created_at' => $now, 'updated_at' => $now ]; } DB::table('events')->insert($events); } } <file_sep><?php namespace App\Models; use Illuminate\Database\Eloquent\Factories\HasFactory; use Illuminate\Database\Eloquent\Model; /** * @property User $organizer */ class Event extends Model { use HasFactory; /** * The attributes that should be cast to native types. * * @var array */ protected $casts = [ 'start' => 'datetime', 'end' => 'datetime', ]; /** * Get the organizer that owns the event. */ public function organizer() { return $this->belongsTo(User::class, 'organizer_id'); } } <file_sep><?php namespace App\Http\Livewire\Admin; use App\Models\Question; use Mediconesystems\LivewireDatatables\Http\Livewire\LivewireDatatable; class QuestionsTable extends LivewireDatatable { public $model = Question::class; public $exportable = true; public $searchable = 'title'; public $hideable = 'select'; public $exclude = ['body']; public $sort = 'id|asc'; // public function columns() // { // // // } } <file_sep><?php namespace App\Http\Controllers; use App\Models\Event; use Illuminate\Http\Request; use Spatie\CalendarLinks\Link; class EventController extends Controller { public function index() { return view('events.index', [ 'events' => Event::with(['organizer'])->get() ]); } public function show($eventId) { $event = Event::with(['organizer'])->find($eventId); return view('events.show', [ 'event' => $event, 'link' => Link::create($event->title, $event->start, $event->end) ->address($event->venue) ->description($event->description) ]); } } <file_sep>import { Calendar } from "@fullcalendar/core"; import dayGridPlugin from "@fullcalendar/daygrid"; 
import timeGridPlugin from "@fullcalendar/timegrid"; import listPlugin from "@fullcalendar/list"; const calendarElement = document.getElementById("calendar"); const calendar = new Calendar(calendarElement, { plugins: [dayGridPlugin, timeGridPlugin, listPlugin], initialView: "dayGridMonth", headerToolbar: { left: "prev,next today", center: "title", right: "dayGridMonth,listWeek,timeGridDay", }, events: events, weekNumbers: true, navLinks: true, // businessHours: { // daysOfWeek: [1, 2, 3, 4, 5], // startTime: "07:00", // endTime: "16:00", // }, dayMaxEventRows: true, // eventClick: function (info) { // const event = info.event; // }, }); calendar.render();<file_sep>import Quill from "quill"; const quill = new Quill("#body", { placeholder: 'Give additional information that would enable respondents to understand your question.', theme: "snow" }); const form = document.getElementById("question"); const body = document.querySelector("input[name=body]"); form.onsubmit = function() { body.value = JSON.stringify(quill.getContents()); return true; }; <file_sep><?php namespace Database\Seeders; use Illuminate\Database\Seeder; use Illuminate\Support\Facades\DB; class AnswerSeeder extends Seeder { /** * Run the database seeds. 
* * @return void */ public function run() { $answers = []; $now = now(); for ($i = 0; $i < 100; $i++) { $question = floor($i / 10); $questionAnswer = ($i - ($question * 10)) + 1; $question++; $answers[] = [ 'text' => "Answer #$questionAnswer to question #$question.", 'author_id' => 1, 'question_id' => $question, 'created_at' => $now, 'updated_at' => $now ]; } DB::table('answers')->insert($answers); } } <file_sep><?php namespace App\Providers; use App\Models\User; use Illuminate\Foundation\Support\Providers\AuthServiceProvider as ServiceProvider; use Illuminate\Support\Facades\Gate; class AuthServiceProvider extends ServiceProvider { /** * List of all administrative user types */ private const ADMINISTRATORS = ['admin', 'editor']; /** * The policy mappings for the application. * * @var array */ protected $policies = [ // 'App\Models\Model' => 'App\Policies\ModelPolicy', ]; /** * Register any authentication / authorization services. * * @return void */ public function boot() { $this->registerPolicies(); // This Gate checks whether the user has any business in the admin dashboard Gate::define('administrate', function (User $user) { return in_array($user->role, self::ADMINISTRATORS) || $user->events->count() > 0 || $user->posts->count() > 0; }); } } <file_sep><?php namespace App\Http\Livewire\Admin; use App\Models\Project; use Mediconesystems\LivewireDatatables\Http\Livewire\LivewireDatatable; class ProjectsTable extends LivewireDatatable { public $model = Project::class; public $exportable = true; public $searchable = 'title'; public $hideable = 'select'; public $sort = 'id|asc'; // public function columns() // { // // // } } <file_sep><?php namespace Database\Seeders; use Illuminate\Database\Seeder; use Illuminate\Support\Facades\DB; class ProjectSeeder extends Seeder { /** * Run the database seeds. 
* * @return void */ public function run() { $projects = []; $now = now(); for ($i = 1; $i <= 10; $i++) { $projects[] = [ 'title' => "Project $i", 'url' => "https://github.com/masenohub", 'description' => "Project $i by Maseno Hub organization.", 'lead_id' => 1, 'created_at' => $now, 'updated_at' => $now ]; } DB::table('projects')->insert($projects); } } <file_sep><?php namespace App\Http\Controllers; use App\Models\Project; use Illuminate\Http\Request; use OpenGraph; use shweshi\OpenGraph\Exceptions\FetchException; class ProjectController extends Controller { public function index() { return view('projects.index', [ 'projects' => Project::with(['lead'])->get() ]); } public function show($project) { $project = Project::with(['lead'])->find($project); try { // TODO: Implement OpenGraph stuff in frontend // @phpstan-ignore-next-line $project->open_graph = OpenGraph::fetch($project->url); } catch (FetchException) { } return view('projects.show', [ 'project' => $project ]); } }
960aff7cbff8f476748a371c56cd44c73eabee7c
[ "Markdown", "JavaScript", "PHP" ]
22
PHP
MasenoHub/website-old
f68eadaf558c4763e5d20b339d286febe3115c4a
b33a729b506cc538dcb64e1c88ab1af7dac8e569
refs/heads/master
<repo_name>YellowOnion/vermintide-compass<file_sep>/patch/patch.lua local mod, mod_name, oi = Mods.new_mod("Compass") local scenegraph_def = { root = { scale = "fit", size = { 1920, 1080 }, position = { 0, 0, UILayer.hud } } } local compass_ui_def = { scenegraph_id = "root", element = { passes = { { style_id = "compass_text", pass_type = "text", text_id = "compass_text", } } }, content = { compass_text = "NW", }, style = { compass_text = { font_type = "hell_shark", font_size = 32, vertical_alignment = "center", horizontal_alignment = "center", offset = { 0, 500, 0 } } }, offset = { 0, 0, 0 } } local fake_input_service = { get = function () return end, has = function () return end } mod.init = function() if mod.ui_widget then return end local world = Managers.world:world("top_ingame_view") mod.ui_renderer = UIRenderer.create(world, "material", "materials/fonts/gw_fonts") mod.ui_scenegraph = UISceneGraph.init_scenegraph(scenegraph_def) mod.ui_widget = UIWidget.init(compass_ui_def) end mod.get_rotation = function() local player = Managers.player:local_player() local unit = player.player_unit local first_person_extension = ScriptUnit.extension(unit, "first_person_system") local rot = Quaternion.yaw(first_person_extension:current_rotation()) * (-180/math.pi) if rot < 0.0 then rot = 360.0 + rot end return rot end mod.rot_to_cardinal = function(d) local a = 360/16 if d <= a or d >= a*15 then return "N" elseif d <= a*3 then return "NE" elseif d <= a*5 then return "E" elseif d <= a*7 then return "SE" elseif d <= a*9 then return "S" elseif d <= a*11 then return "SW" elseif d <= a*13 then return "W" elseif d <= a*15 then return "NW" else return "IDK" end end local draw_hook = function(func, self, dt, t, my_player) if not mod.ui_widget then mod.init() end local rot = 999999 if not pcall(function () rot = mod.get_rotation() end) then return func(self, dt, t, my_player) end local widget = mod.ui_widget local renderer = mod.ui_renderer local scenegraph = mod.ui_scenegraph 
widget.content.compass_text = mod.rot_to_cardinal(rot) UIRenderer.begin_pass(renderer, scenegraph, fake_input_service, dt) UIRenderer.draw_widget(renderer, widget) UIRenderer.end_pass(renderer) return func(self, dt, t, my_player) end Mods.hook.set(mod_name, "IngameHud.update", draw_hook) <file_sep>/build.sh #!/usr/bin/env bash set -euo pipefail build() { 7z a ../Compass.zip * mv ../Compass.zip ../Compass.mod cp ../Compass.mod ./ } loop() { while read dir events fname; do echo "file change $fname; rebuilding" build done } build inotifywait -q -e close_write -rm . | loop
78fb0a37acad9fc3bcb55e39cfd779bd7af851ab
[ "Shell", "Lua" ]
2
Lua
YellowOnion/vermintide-compass
3cc004087b2de248d1189ee0efbef67f84234686
05ec0888e274f9f985fae60034aa403060fdfae5
HEAD
<file_sep>from django.contrib import admin

from .models import account


# Register your models here.
class accountAdmin(admin.ModelAdmin):
    list_display = ['username', '<PASSWORD>', 'email']


admin.site.register(account, accountAdmin)
<file_sep>import re
import datetime

from django.db import models
from django.contrib.auth.models import (AbstractBaseUser, PermissionsMixin,
                                        BaseUserManager)
from django.core import validators
from django.conf import settings
from django.utils import timezone
from django.utils.translation import ugettext_lazy as _
from django.core.mail import send_mail, EmailMultiAlternatives
from django.template.loader import render_to_string

from .signals import account_signed_up

SEX_CHOICE = (
    ('M', u'男'),
    ('F', u'女'),
)


class accountManager(BaseUserManager):
    def _create_account(self, username, password, email, nickname, sex,
                        activation_key, is_active=False, is_staff=False,
                        is_super=False, **extra_fields):
        now = timezone.now()
        if not username:
            raise ValueError('The given username must be set')
        email = self.normalize_email(email)
        user = self.model(username=username, email=email,
                          activation_key=activation_key, is_super=is_super,
                          is_staff=is_staff, is_active=is_active,
                          join_date=now, update=now, **extra_fields)
        user.set_password(password)
        user.save(using=self._db)
        return user

    def create_account(self, username, password, email, nickname, sex,
                       activation_key, **extra_fields):
        account = self._create_account(username, password, email, nickname,
                                       sex, activation_key, **extra_fields)
        return account

    def create_superuser(self, username, password, email, nickname, sex,
                         **extra_fields):
        return self._create_account(username, password, email, nickname, sex,
                                    'ACTIVATE', True, True, True,
                                    **extra_fields)


class account(AbstractBaseUser, PermissionsMixin):
    username = models.CharField(u'用户名', max_length=30, unique=True,
                                validators=[validators.RegexValidator(
                                    re.compile('^[\w.@+-]+$'),
                                    _('Enter a valid username.'),
                                    _('invalid'))])
    state = models.CharField(u'状态', max_length=10, default='unlock')
    email = models.EmailField(u'邮箱')
    is_active = models.BooleanField(u'激活状态', default=False)
    join_date = models.DateTimeField(u'注册时间', default=timezone.now)
    update = models.DateTimeField(u'上次登陆时间')
    is_staff = models.BooleanField(u'职员身份', default=False)
    is_super = models.BooleanField(u'超级管理员', default=False)
    activation_key = models.CharField(u'激活码', max_length=40)

    objects = accountManager()

    USERNAME_FIELD = 'username'
    REQUIRED_FIELDS = ['email', ]

    def __unicode__(self):
        return self.username

    def activate_account(self):
        self.is_active = True
        self.save(using=self._db)

    def get_full_name(self):
        return self.username

    def get_short_name(self):
        return self.username

    def get_state(self):
        return self.state

    def set_state(self, state):
        self.state = state
        self.save(using=self._db)

    def is_activate_expired(self):
        expired_day = self.join_date + datetime.timedelta(
            days=settings.ACCOUNT_ACTIVATION_DAYS)
        return self.activation_key == 'ACCEPT' or timezone.now() <= expired_day


class profile(models.Model):
    user = models.OneToOneField(account)
    nickname = models.CharField(u'昵称', max_length=30)
    sex = models.CharField(u'性别', max_length=1, choices=SEX_CHOICE,
                           null=True, blank=True)
    level = models.DecimalField(u'等级', max_digits=10, decimal_places=0,
                                default=0)
    money = models.DecimalField(u'金币', max_digits=10, decimal_places=0,
                                default=0)
    state = models.CharField(u'状态', max_length=10, default='unlock')
    points = models.DecimalField(u'积分', max_digits=10, decimal_places=0,
                                 default=100)


def handle_signed_up(sender, **kwargs):
    account = kwargs.get('account')
    email = account.email
    if isinstance(email, unicode):
        email = email.encode('utf-8')
    ctx_dict = {
        'activation_key': account.activation_key,
        'expiration_days': settings.ACCOUNT_ACTIVATION_DAYS,
    }
    html_content = render_to_string('send_email.html', ctx_dict)
    text_content = '这是forum论坛注册激活邮件,点击激活'
    subject = 'forum论坛注册激活邮件'
    from_mail = '<EMAIL>'
    msg = EmailMultiAlternatives(subject, text_content, from_mail, [email])
    msg.attach_alternative(html_content, "text/html")
    msg.send()
    return True


# signals for signup
account_signed_up.connect(handle_signed_up, dispatch_uid='account_signed_up_id')
<file_sep># -*- coding: utf-8 -*-
"""
"""
from django.conf.urls import patterns, url

urlpatterns = patterns(
    'forum.account.views',
    url(r'^signup/$', 'signup', name='signup'),
    url(r'^login/$', 'login', name='login'),
    url(r'^logout/$', 'logout', name='logout'),
    url(r'^activate_account/(?P<activation_key>\w+)/$', 'activate_account',
        name='activate'),
    url(r'^reset_password/(?P<activation_key>\w+)/$', 'activate_account',
        name='reset_password'),
    url(r'^change_password/(?P<activation_key>\w+)/$', 'activate_account',
        name='change_password'),
)
<file_sep>#!/usr/bin/env python
# -*- coding: utf-8 -*-
"""
"""
from django import forms
from django.contrib import auth

from .models import account


class loginForm(forms.Form):
    error_messages = {
        'invalid_login': u'用户名或密码错误',
        'inactive': u'该账号尚未激活',
    }

    username = forms.CharField(required=True, label='用户名',
                               error_messages={
                                   'required': u'请正确填写您的用户名',
                                   'invalid': u'请正确填写您的用户名'})
    password = forms.CharField(required=True, label='密码',
                               widget=forms.PasswordInput(render_value=False),
                               error_messages={
                                   'required': u'请正确填写您的用户名和密码'})

    def clean(self):
        username = self.cleaned_data.get('username')
        password = self.cleaned_data.get('password')
        if username and password:
            self.user_cache = auth.authenticate(
                username=username, password=password)
            if self.user_cache is None:
                raise forms.ValidationError(
                    self.error_messages['invalid_login'],
                    code='invalid_login',
                    params={'username': username})
            elif not self.user_cache.is_active:
                raise forms.ValidationError(
                    self.error_messages['inactive'],
                    code='inactive')
        return self.cleaned_data


class signupForm(forms.Form):
    username = forms.RegexField(required=True, label='用户名', max_length=30,
                                regex='^[\w.@+-]+$',
                                error_messages={'invalid': '格式错误'})
    password1 = forms.CharField(required=True, label='密码',
                                widget=forms.PasswordInput(render_value=False))
    password2 = forms.CharField(required=True, label='密码',
                                widget=forms.PasswordInput(render_value=False))
    nickname = forms.CharField(required=True, label='昵称', max_length=30)
    email = forms.EmailField(required=True, label='',
                             error_messages={
                                 'required': u'请填写有效的电子邮箱',
                                 'invalid': u'请正确填写您的电子邮箱'})

    def clean_password2(self):
        password1 = self.cleaned_data['password1']
        password2 = self.cleaned_data['password2']
        if password1 and password2 and password1 != password2:
            raise forms.ValidationError(u'两次输入的密码不一致',
                                        code='password_dismatch')
        return password2

    def clean_username(self):
        username = self.cleaned_data['username']
        try:
            user = account.objects.get(username=username)
        except account.DoesNotExist:
            return username
        raise forms.ValidationError(u'用户名已存在',
                                    code='duplicate_username')

    def clean_email(self):
        email = self.cleaned_data['email']
        try:
            user = account.objects.get(email=email)
        except account.DoesNotExist:
            return email
        raise forms.ValidationError(u'邮箱已被注册',
                                    code='duplicate_email')
<file_sep>#!/usr/bin/env python
# -*- coding: utf-8 -*-
'''
'''
from django.dispatch import Signal

account_signed_up = Signal(providing_args=['account'])
<file_sep>#!/usr/bin/env python
# -*- coding: utf-8 -*-
import hashlib
import random

from django.shortcuts import redirect, render_to_response
from django.template import RequestContext
from django.contrib import messages
from django.contrib import auth

from forum import settings
from .forms import loginForm, signupForm
from .models import account
from .signals import account_signed_up


def login(request):
    if request.user.is_authenticated():
        return redirect('home')
    context = RequestContext(request)
    if request.method == 'POST':
        form = loginForm(request.POST)
        if form.is_valid():
            auth.login(request, form.user_cache)
            return redirect('home')
    else:
        form = loginForm()
    return render_to_response('', context_instance=context)


def logout(request):
    auth.logout(request)
    return redirect('home')


def signup(request):
    context = RequestContext(request)
    if request.method == 'POST':
        form = signupForm(request.POST)
        if form.is_valid():
            username = form.cleaned_data['username']
            password = form.cleaned_data['password1']
            email = form.cleaned_data['email']
            nickname = form.cleaned_data['nickname']
            salt = hashlib.sha1(str(random.random())).hexdigest()[:5]
            activate_key = hashlib.sha1(salt + username).hexdigest()
            print '--------> activate_key', activate_key
            user = account.objects.create_account(username, password, email,
                                                  nickname, None, activate_key)
            if user is not None:
                account_signed_up.send(sender=signupForm, account=user)
            return redirect('home')
    else:
        form = signupForm()
    return render_to_response('', context_instance=context)


def activate_account(request, activation_key):
    account = request.user
    if account.is_active:
        return redirect('home')
    if account.activation_key == activation_key and account.is_activate_expired():
        account.activate_account()
        account.activation_key = 'ACCEPT'
        account.save()
        return redirect('home')
    return redirect('home')
be1d1feec47605e5b9ac68377377d29c50cb0768
[ "Python" ]
6
Python
Lickyst2013/forum
b92bae1d2c7f4d78b3b5773c870cbef150d7609a
f0df84153865629db2de9631f989ba6ab2e29164
refs/heads/master
<file_sep>/* server.js main server script for the TCP chat demo */ var net = require('net'); //create a new network server var server = net.createServer(); //array of connected clients var clients = []; server.on('connection', function(socket) { var name; function broadcast(name, message) { clients.forEach(function(client) { if (client !== socket) { client.write('[' + name + '] ' + message); } }); } clients.push(socket); //remove this client from the list when its connection closes, so //broadcast() never writes to an ended socket socket.on('close', function() { var index = clients.indexOf(socket); if (index !== -1) { clients.splice(index, 1); } }); socket.write('Hello! What is your name?\n'); socket.on('data', function(data) { if (!name) { name = data.toString().trim(); socket.write('Hello ' + name + '!\n'); } else { var message = data.toString(); if (message.trim() === 'exit') { socket.end(); } else { broadcast(name, data.toString()); } } }); }); server.on('listening', function() { console.log('server listening on port 3000'); }); server.listen(3000);<file_sep># SQLite Database Directory The first time our server starts up, it will create a SQLite database in this directory to store our data. For more information about SQLite, see: - [SQLite web site](https://www.sqlite.org/) - [Node.js driver for SQLite](https://github.com/mapbox/node-sqlite3) <file_sep>/* script file for the Tasks application */ angular.module('TasksApp', []) .controller('TasksController', function($scope, $http) { function handleApiError(err) { $scope.showSpinner = false; $scope.error = err; } $scope.newTask = {}; $scope.showSpinner = true; $http.get('/api/tasks') .then(function(results) { $scope.showSpinner = false; $scope.tasks = results.data; }, handleApiError); $scope.addTask = function() { $scope.showSpinner = true; $http.post('/api/tasks', $scope.newTask) .then(function(results) { $scope.showSpinner = false; //new task object is returned, including new rowid $scope.tasks.push(results.data); $scope.newTask = {}; }, handleApiError); }; //addTask() $scope.toggleDone = function(task) { var updateTask = angular.copy(task); updateTask.done = !updateTask.done; $scope.showSpinner = true; $http.put('/api/tasks/' + 
updateTask.rowid, updateTask) .then(function(results) { $scope.showSpinner = false; angular.copy(updateTask, task); }, handleApiError); }; });<file_sep>/* server.js main server script for our task list web service */ var port = 8080; //load all modules we need //express web server framework var express = require('express'); //sqlite library var sqlite = require('sqlite3'); //body parser library var bodyParser = require('body-parser'); //create a new express app var app = express(); //tell express to server static files firom the /static subdir app.use(express.static(__dirname + '/static')); //tell express to parse post body to data as json app.use(bodyParser.json()); //api route for getting tasks app.get('/api/tasks', function(req, res, next) { var sql = 'select rowid,title,done,createdOn from tasks where done !=1'; db.all(sql, function(err, rows) { if (err) { return next(err); } //send rows back to client as JSON res.json(rows); }); }); //when someone posts to /api/tasks... app.post('/api/tasks', function(req, res, next) { var newTask = { title: req.body.title, done: false, createdOn: new Date() }; var sql = 'insert into tasks(title,done,createdOn) values(?,?,?)'; db.run(sql, [newTask.title, newTask.done, newTask.createdOn], function(err) { if (err) { return next(err); } res.status(201).json(newTask); }); }); //when someone PUTS to /api/<tasks/task-id>... app.put('/api/tasks/:rowid', function(req, res, next) { var sql = 'update tasks set done=? 
where rowid=?'; db.run(sql, [req.body.done, req.params.rowid], function(err) { if (err) { return next(err); } res.json(req.body); }); }); //create new database var db = new sqlite.Database(__dirname + '/data/tasks.db', function(err) { if (err) { throw err; } var sql = 'create table if not exists tasks(title string, done int, createdOn datetime)'; db.run(sql, function(err) { if (err) { throw err; } }); //start the server app.listen(port, function() { console.log('server is listening on http://localhost:' + port); }); }); <file_sep># Instructions This repo contains the starter files for my Intro to Node.js lecture. To get setup, please follow these steps. 1. [Fork this repo](https://help.github.com/articles/fork-a-repo/) into your own GitHub account 1. Clone your forked repo to your lab machine or personal computer 1. Open the project directory in your preferred editor We will implement the server(s) together, and I will explain everything as I go along. # Viewing the Completed Version To view the completed version of this exercise, switch to the `completed` branch by executing the following command from within your project directory. Note that if you have uncommitted changes, git will make you commit or stash them before you can checkout the other branch. ```bash $ git checkout completed ``` To switch back to the master branch, execute this command: ```bash $ git checkout master ```
e83e164907ce1e6ba08c3ec0d0dfa80f982602b2
[ "JavaScript", "Markdown" ]
5
JavaScript
PeterCLu/node-demo
7b9d67680757f2070cb82e8c6cff5db8a6628d00
60b68bf05578818be4a6038b0a48fabbe676f883
refs/heads/master
<file_sep>using System; using System.Collections.Generic; namespace ItzWarty.Comparers { public class ReverseComparer<T> : IComparer<T> where T : IComparable<T> { private readonly IComparer<T> m_originalComparer; public ReverseComparer(IComparer<T> originalComparer) { m_originalComparer = originalComparer; } public ReverseComparer() { m_originalComparer = null; } public int Compare(T x, T y) { if (m_originalComparer == null) return y.CompareTo(x); else return m_originalComparer.Compare(y, x); } } } <file_sep>using System; using System.Collections.Generic; namespace ItzWarty.Collections { public class CollectionFactory : ICollectionFactory { public IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>() => new ConcurrentDictionary<K, V>(); public IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEnumerable<KeyValuePair<K, V>> collection) => new ConcurrentDictionary<K, V>(collection); public IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEqualityComparer<K> comparer) => new ConcurrentDictionary<K, V>(comparer); public IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEnumerable<KeyValuePair<K, V>> collection, IEqualityComparer<K> comparer) => new ConcurrentDictionary<K, V>(collection, comparer); public IConcurrentSet<T> CreateConcurrentSet<T>() => new ConcurrentSet<T>(); public IConcurrentSet<T> CreateConcurrentSet<T>(IEnumerable<T> collection) => new ConcurrentSet<T>(collection); public IConcurrentSet<T> CreateConcurrentSet<T>(IEqualityComparer<T> comparer) => new ConcurrentSet<T>(comparer); public IConcurrentSet<T> CreateConcurrentSet<T>(IEnumerable<T> collection, IEqualityComparer<T> comparer) => new ConcurrentSet<T>(collection, comparer); public IConcurrentBag<T> CreateConcurrentBag<T>() => new ConcurrentBag<T>(); public IConcurrentBag<T> CreateConcurrentBag<T>(IEnumerable<T> collection) => new ConcurrentBag<T>(collection); public IHashSet<T> CreateHashSet<T>() => new HashSet<T>(); public IHashSet<T> 
CreateHashSet<T>(IEnumerable<T> collection) => new HashSet<T>(collection); public IHashSet<T> CreateHashSet<T>(IEqualityComparer<T> comparer) => new HashSet<T>(comparer); public IHashSet<T> CreateHashSet<T>(IEnumerable<T> collection, IEqualityComparer<T> comparer) => new HashSet<T>(collection, comparer); public ISortedSet<T> CreateSortedSet<T>() => new SortedSet<T>(); public ISortedSet<T> CreateSortedSet<T>(IEnumerable<T> collection) => new SortedSet<T>(collection); public ISortedSet<T> CreateSortedSet<T>(IComparer<T> comparer) => new SortedSet<T>(comparer); public ISortedSet<T> CreateSortedSet<T>(IEnumerable<T> collection, IComparer<T> comparer) => new SortedSet<T>(collection, comparer); public IReadOnlyCollection<T> CreateImmutableCollection<T>() => ImmutableCollection.Of<T>(); public IReadOnlyCollection<T> CreateImmutableCollection<T>(params T[] args) => ImmutableCollection.Of<T>(args); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>() => ImmutableDictionary.Of<K, V>(); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1) => ImmutableDictionary.Of<K, V>(k1, v1); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, v5); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, 
v5, k6, v6); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, v5, k6, v6, k7, v7); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, v5, k6, v6, k7, v7, k8, v8); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8, K k9, V v9) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, v5, k6, v6, k7, v7, k8, v8, k9, v9); public IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8, K k9, V v9, K k10, V v10) => ImmutableDictionary.Of<K, V>(k1, v1, k2, v2, k3, v3, k4, v4, k5, v5, k6, v6, k7, v7, k8, v8, k9, v9, k10, v10); public IReadOnlySet<T> CreateImmutableSet<T>() => ImmutableSet.Of<T>(); public IReadOnlySet<T> CreateImmutableSet<T>(params T[] args) => ImmutableSet.Of<T>(args); public IMultiValueDictionary<K, V> CreateMultiValueDictionary<K, V>() => new MultiValueDictionary<K, V>(); public IMultiValueDictionary<K, V> CreateMultiValueDictionary<K, V>(IEqualityComparer<K> comparer) => new MultiValueDictionary<K, V>(comparer); public IMultiValueSortedDictionary<K, V> CreateMultiValueSortedDictionary<K, V>() => new MultiValueSortedDictionary<K, V>(); public IMultiValueSortedDictionary<K, V> CreateMultiValueSortedDictionary<K, V>(IComparer<K> comparer) => new MultiValueSortedDictionary<K, V>(comparer); public IOrderedDictionary<K, V> CreateOrderedDictionary<K, V>() => new OrderedDictionary<K, V>(); public IOrderedDictionary<K, V> CreateOrderedDictionary<K, V>(IEqualityComparer<K> comparer) => new 
OrderedDictionary<K, V>(comparer); public IOrderedMultiValueDictionary<K, V> CreateOrderedMultiValueDictionary<K, V>(ValuesSortState valuesSortState = ValuesSortState.Unsorted) => new OrderedMultiValueDictionary<K, V>(valuesSortState); public IQueue<T> CreateQueue<T>() => new Queue<T>(); public IPriorityQueue<TValue, TPriority> CreatePriorityQueue<TValue, TPriority>(int capacity) where TPriority : IComparable<TPriority>, IEquatable<TPriority> where TValue : class, IPriorityQueueNode<TPriority> => new HeapPriorityQueue<TValue, TPriority>(capacity); public IConcurrentQueue<T> CreateConcurrentQueue<T>() => new ConcurrentQueue<T>(); public IConcurrentQueue<T> CreateSingleConsumerSingleProducerConcurrentQueue<T>() where T : class => new SingleConsumerSingleProducerConcurrentQueue<T>(); public IUniqueIdentificationSet CreateUniqueIdentificationSet(bool filled) => new UniqueIdentificationSet(filled); public IUniqueIdentificationSet CreateUniqueIdentificationSet(uint low, uint high) => new UniqueIdentificationSet(low, high); public IListDictionary<K, V> CreateListDictionary<K, V>() => new ListDictionary<K, V>(); public IDictionary<K, V> CreateDictionary<K, V>() => new Dictionary<K, V>(); public IDictionary<K, V> CreateDictionary<K, V>(IEqualityComparer<K> comparer) => new Dictionary<K, V>(comparer); public IDictionary<K, V> CreateSortedDictionary<K, V>() => new SortedDictionary<K, V>(); public IDictionary<K, V> CreateSortedDictionary<K, V>(IComparer<K> comparer) => new SortedDictionary<K, V>(comparer); } }<file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Net; namespace ItzWarty { /* * Deprecated, needs housekeeping public static class XHR { private static WebClient webclient = new WebClient(); public static string GetURL(string url) { lock (webclient) { return webclient.DownloadString(url); } } public static string GetString(string url) { return GetURL(url); } } */ } <file_sep>using System; namespace ItzWarty { 
public static class StaticRandom { private static readonly object s_lock = new object(); private static readonly Random s_random = new Random(10); public static int Next(int exclusiveUpperBound) { lock (s_lock) return s_random.Next(exclusiveUpperBound); } public static int Next(int inclusiveLowerBound, int exclusiveUpperBound) { lock (s_lock) return s_random.Next(inclusiveLowerBound, exclusiveUpperBound); } public static float NextFloat(float exclusiveUpperBound) { lock (s_lock) return (float)NextDouble(exclusiveUpperBound); } public static float NextFloat(float inclusiveLowerBound, float exclusiveUpperBound) { lock (s_lock) return (float)NextDouble(inclusiveLowerBound, exclusiveUpperBound); } public static double NextDouble() { lock(s_lock) return s_random.NextDouble(); } public static double NextDouble(double exclusiveUpperBound) { lock(s_lock) return s_random.NextDouble() * exclusiveUpperBound; } public static double NextDouble(double inclusiveLowerBound, double exclusiveUpperBound) { lock (s_lock) return inclusiveLowerBound + s_random.NextDouble() * (exclusiveUpperBound - inclusiveLowerBound); } public static Random NextRandom() { var buffer = new byte[4]; lock(s_lock) s_random.NextBytes(buffer); return new Random(BitConverter.ToInt32(buffer, 0)); } public static bool NextBoolean() { lock (s_lock) return s_random.Next() % 2 == 0; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using ItzWarty; using ItzWarty.Database; namespace ItzWarty.Database { public class wDBTable { string rootLocation = ""; DatabaseClient dbc; public wDBTable(string name, DatabaseClient root) { rootLocation = root.currentFolderLocation; dbc = root.Clone(); dbc.SelectDB("$" + name); dbc.SetCurrentDBAsRoot(); dbc.ReturnToRoot(); } public wDBRowCollection GetRows() { dbc.ReturnToRoot(); string[] rowNums = dbc.ListDatabases(); List<wDBTableRow> rows = new List<wDBTableRow>(); for (int i = 0; i < rowNums.Length; i++) { string rowNum = 
rowNums[i]; dbc.SelectDB(rowNum); rows.Add( new wDBTableRow(dbc) ); dbc.ReturnToRoot(); } return new wDBRowCollection(rows); } public static void CreateTable(string tableName, DatabaseClient parentDB) { if (parentDB.ExistDB("$" + tableName) == DatabaseClient.DBResponse.Exists) return; parentDB.CreateDB("$" + tableName); DatabaseClient dbc = parentDB.Clone(); dbc.SelectDB("$" + tableName); dbc.SetValue(".rowsAdded", "0"); } public void AddRow(string[][] keyValues) { int rowsAdded = int.Parse(dbc.GetValue(".rowsAdded")); dbc.ReturnToRoot(); dbc.CreateDB(rowsAdded.ToString()); dbc.SelectDB(rowsAdded.ToString()); wDBTableRow row = new wDBTableRow(dbc); for (int i = 0; i < keyValues.Length; i++) row[keyValues[i][0]] = keyValues[i][1]; dbc.ReturnToRoot(); dbc.SetValue(".rowsAdded", rowsAdded + 1); } } public class wDBTableRow { DatabaseClient dbc; public wDBTableRow(DatabaseClient dbc) { this.dbc = dbc.Clone(); } public string this[string columnName] { get { return dbc.GetValue(columnName); } set { dbc.SetValue(columnName, value); } } } } <file_sep>using System; using System.Collections; using System.Collections.Generic; using System.Linq; using System.Text; using System.Threading; using System.Threading.Tasks; namespace ItzWarty.Collections { public class ConcurrentQueue<T> : System.Collections.Concurrent.ConcurrentQueue<T>, IConcurrentQueue<T> { public ConcurrentQueue() { } public ConcurrentQueue(IEnumerable<T> collection) : base(collection) { } } } <file_sep>using System; using System.Collections; using System.Collections.Generic; using System.Diagnostics; using System.IO; using System.Linq; using System.Reflection; using System.Runtime.CompilerServices; using System.Text; using System.Text.RegularExpressions; using System.Threading; namespace ItzWarty { public class GeneratorExitException : Exception { public GeneratorExitException() : base("The Generator is unable to produce more results. 
Perhaps, there is nothing left to produce?") {} } public unsafe static class Util { /// <summary> /// Returns whether or not the given value is within (inclusive) the other two parameters /// </summary> [MethodImpl(MethodImplOptions.AggressiveInlining)] public static bool IsBetween(Double a, Double value, Double b) { return (a <= value && value <= b) || (b <= value && value <= a); } /// <summary> /// Generates a string in a stupid way... /// lol /// </summary> public static string GenerateString(int length) { StringBuilder temp = new StringBuilder(); while(temp.Length < length) temp.Append(Guid.NewGuid().ToByteArray().ToHex()); return temp.ToString().Substring(0, length); } /// <summary> /// Creates an array using the given function N times. /// The function takes a parameter i, from 0 to count, and returns T. /// </summary> public static T[] Generate<T>(int count, Func<int, T> generator) { if (count < 0) throw new ArgumentOutOfRangeException("count < 0"); if (generator == null) throw new ArgumentNullException("generator"); T[] result = new T[count]; for (int i = 0; i < count; i++) result[i] = generator(i); return result; } /// <summary> /// Creates an array using the given function N times. /// The function takes a parameter a from 0 to countA and a parameter b, from 0 to countB, and returns T. /// </summary> public static T[] Generate<T>(int countA, int countB, Func<int, int, T> generator) { if (countA < 0) throw new ArgumentOutOfRangeException("countA < 0"); if (countB < 0) throw new ArgumentOutOfRangeException("countB < 0"); if (generator == null) throw new ArgumentNullException("generator"); T[] result = new T[countA * countB]; for (int a = 0; a < countA; a++) for (int b = 0; b < countB; b++) result[a * countB + b] = generator(a, b); return result; } /// <summary> /// Creates an array using the given function N times. 
/// </summary>
public static T[] Generate<T>(int countA, int countB, int countC, Func<int, int, int, T> generator) {
   if (countA < 0) throw new ArgumentOutOfRangeException("countA < 0");
   if (countB < 0) throw new ArgumentOutOfRangeException("countB < 0");
   if (countC < 0) throw new ArgumentOutOfRangeException("countC < 0");
   if (generator == null) throw new ArgumentNullException("generator");
   T[] result = new T[countA * countB * countC];
   int i = 0;
   for (int a = 0; a < countA; a++)
      for (int b = 0; b < countB; b++)
         for (int c = 0; c < countC; c++)
            result[i++] = generator(a, b, c);
   return result;
}

/// <summary>
/// Generates a given output. Returns true if the generator is done after this iteration.
/// Throws GeneratorExitException if there is nothing left to produce.
/// </summary>
public delegate bool GeneratorDelegate<T>(int i, out T output);

public static T[] Generate<T>(GeneratorDelegate<T> generator) where T : class {
   List<T> result = new List<T>();
   bool done = false;
   int i = 0;
   try {
      while (!done) {
         T output = null;
         done = generator(i++, out output);
         result.Add(output);
      }
   } catch (GeneratorExitException) {
      // generator signalled that it has nothing left to produce
   }
   return result.ToArray();
}

public static T[] Concat<T>(params object[] args) {
   var result = new List<T>();
   foreach (var element in args) {
      if (element is T) result.Add((T)element);
      else {
         foreach (var subElement in (IEnumerable<T>)element) result.Add(subElement);
      }
   }
   return result.ToArray();
}

/// <summary>
/// Creates a variable of the given value repeated [count] times.
/// Note that this just copies the reference if T is a reference type.
/// </summary> public static T[] Repeat<T>(int count, T t) { T[] result = new T[count]; for(int i = 0; i < count; i++) result[i] = t; return result; } public static byte FindMaximum(byte[] bytes) { byte max = bytes[0]; for(int i = 1; i < bytes.Length; i++) { if(max < bytes[i]) max = bytes[i]; } return max; } public static byte FindMinimum(byte[] bytes) { byte min = bytes[0]; for(int i = 1; i < bytes.Length; i++) { if(min > bytes[i]) min = bytes[i]; } return min; } public static bool ByteArraysEqual(byte[] param1, byte[] param2) { return ByteArraysEqual(param1, 0, param1.Length, param2, 0, param2.Length); } public static bool ByteArraysEqual(byte[] a, int aOffset, byte[] b, int bOffset, int length) { return ByteArraysEqual(a, aOffset, length, b, bOffset, length); } public static bool ByteArraysEqual(byte[] a, int aOffset, int aLength, byte[] b, int bOffset, int bLength) { if (aOffset + aLength > a.Length) { throw new IndexOutOfRangeException("aOffset + aLength > a.Length"); } else if (bOffset + bLength > b.Length) { throw new IndexOutOfRangeException("bOffset + bLength > b.Length"); } else if (aOffset < 0) { throw new IndexOutOfRangeException("aOffset < 0"); } else if (bOffset < 0) { throw new IndexOutOfRangeException("bOffset < 0"); } else if (aLength < 0) { throw new IndexOutOfRangeException("aLength < 0"); } else if (bLength < 0) { throw new IndexOutOfRangeException("bLength < 0"); } if (aLength != bLength) { return false; } else if (a == b && aOffset == bOffset && aLength == bLength) { return true; } fixed (byte* pABase = a) fixed (byte* pBBase = b) { byte* pACurrent = pABase + aOffset, pBCurrent = pBBase + bOffset; var length = aLength; int longCount = length / 8; for (var i = 0; i < longCount; i++) { if (*(ulong*)pACurrent != *(ulong*)pBCurrent) { return false; } pACurrent += 8; pBCurrent += 8; } if ((length & 4) != 0) { if (*(uint*)pACurrent != *(uint*)pBCurrent) { return false; } pACurrent += 4; pBCurrent += 4; } if ((length & 2) != 0) { if 
(*(ushort*)pACurrent != *(ushort*)pBCurrent) { return false; } pACurrent += 2; pBCurrent += 2; } if ((length & 1) != 0) { if (*pACurrent != *pBCurrent) { return false; } pACurrent += 1; pBCurrent += 1; } return true; } } public static void SubscribeToEventOnce<T>(ref EventHandler<T> @event, EventHandler<T> callback) where T : EventArgs { var signal = new CountdownEvent(2); var accessLock = new object(); var done = false; var handler = new EventHandler<T>( (o, e) => { //Ensure no concurrent invocations of the event, though I'm not sure if .net allows for that lock(accessLock) { //Check if we're done calling the event once. If so, we don't want to invoke twice. if(!done) { //We're now done. Set the flag so we aren't called again. done = true; //Invoke the user's code for the one-time event subscription callback(o, e); //Signal that the user's code is done running, so the SubscribeToEventOnce caller //thread can be unblocked. signal.Signal(); } } } ); //Subscribe to the event which we are trying to listen to once @event += handler; //Signal the countdown event once to tell threads that we're done. In a case like this where we're //really only running 1 thing at a time, it's not important. If we had more than one thread, and were //trying to synchronize all of them, this would be more helpful. For now, this allows us to //wait until the user code has been invoked before we allow this method to return. signal.Signal(); //Wait for the user's callback event to be invoked signal.Wait(); //Unsubscribe to the event. 
@event -= handler; } public class SingleSubscription { internal CountdownEvent m_countdown = new CountdownEvent(1); internal void Signal() { m_countdown.Signal(); } public void Wait() { m_countdown.Wait(); } } public static SingleSubscription SubscribeToEventOnceAsync<T>(Action<EventHandler<T>> subscribe, Action<EventHandler<T>> unsubscribe, EventHandler<T> callback) where T : EventArgs { var result = new SingleSubscription(); var accessLock = new object(); var done = false; EventHandler<T> handler = null; handler = new EventHandler<T>( (o, e) => { //Ensure no concurrent invocations of the event, though I'm not sure if .net allows for that lock(accessLock) { //Check if we're done calling the event once. If so, we don't want to invoke twice. if(!done) { //We're now done. Set the flag so we aren't called again. done = true; //Invoke the user's code for the one-time event subscription callback(o, e); //Signal that the user's code is done running, so the SubscribeToEventOnce caller //thread can be unblocked. result.Signal(); //Yay closures unsubscribe(handler); } } } ); //Subscribe to the event which we are trying to listen to once subscribe(handler); return result; } /// <SUMMARY> /// FROM: http://blogs.msdn.com/b/toub/archive/2006/05/05/590814.aspx /// Computes the Levenshtein Edit Distance between two enumerables.</SUMMARY> /// <TYPEPARAM name="T">The type of the items in the enumerables.</TYPEPARAM> /// <PARAM name="x">The first enumerable.</PARAM> /// <PARAM name="y">The second enumerable.</PARAM> /// <RETURNS>The edit distance.</RETURNS> public static int EditDistance<T>(IEnumerable<T> x, IEnumerable<T> y) where T : IEquatable<T> { // Validate parameters if(x == null) throw new ArgumentNullException("x"); if(y == null) throw new ArgumentNullException("y"); // Convert the parameters into IList instances // in order to obtain indexing capabilities IList<T> first = x as IList<T> ?? new List<T>(x); IList<T> second = y as IList<T> ?? 
new List<T>(y); // Get the length of both. If either is 0, return // the length of the other, since that number of insertions // would be required. int n = first.Count, m = second.Count; if(n == 0) return m; if(m == 0) return n; // Rather than maintain an entire matrix (which would require O(n*m) space), // just store the current row and the next row, each of which has a length m+1, // so just O(m) space. Initialize the current row. int curRow = 0, nextRow = 1; int[][] rows = new int[][] {new int[m + 1], new int[m + 1]}; for(int j = 0; j <= m; ++j) rows[curRow][j] = j; // For each virtual row (since we only have physical storage for two) for(int i = 1; i <= n; ++i) { // Fill in the values in the row rows[nextRow][0] = i; for(int j = 1; j <= m; ++j) { int dist1 = rows[curRow][j] + 1; int dist2 = rows[nextRow][j - 1] + 1; int dist3 = rows[curRow][j - 1] + (first[i - 1].Equals(second[j - 1]) ? 0 : 1); rows[nextRow][j] = Math.Min(dist1, Math.Min(dist2, dist3)); } // Swap the current and next rows if(curRow == 0) { curRow = 1; nextRow = 0; } else { curRow = 0; nextRow = 1; } } // Return the computed edit distance return rows[curRow][m]; } /// <summary> /// Takes fileName like annieSquare.dds, AnnieSquare.dds, annie_square_dds, ANNIE_SQUARE.dds and /// outputs an array such as ["annie", "square", "dds"]. Non-alphanumeric values are deemed /// as delimiters as well. 
/// /// Delimiters: /// non-alphanumerics /// In the middle of two (and only two) uppercase characters that are followed by lowercase characters /// Ie: "ACar" => ["A", "Car"] /// On switch from uppercase string of 3+ to lowercase /// Ie: "MANmode" => ["MAN", "mode"] /// On switch from lowercase string to uppercase /// Ie: "ExampleText" => ["Example", "Text"] /// On switch from number to alphabet or vice versa /// Ie: "IHave10Apples" => ["I", "Have", "10", "Apples"] /// On reaching a non-alphanumeric value /// Ie; "RADS_USER_Kernel.exe" => ["RADS", "USER", "Kernel", "exe"] /// </summary> /// <param name="name"></param> /// <returns></returns> public static IEnumerable<string> ExtractFileNameTokens(string fileName) { StringBuilder sb = new StringBuilder(); // We start as if we were just at position -1 CharType lastlastCharType = CharType.Invalid; CharType lastCharType = CharType.Invalid; CharType charType = CharType.Invalid; CharType nextCharType = fileName.Length >= 1 ? GetCharType(fileName[0]) : CharType.Invalid; for (int i = 0; i < fileName.Length; i++) { lastlastCharType = lastCharType; lastCharType = charType; charType = nextCharType; nextCharType = fileName.Length > i + 1 ? 
GetCharType(fileName[i + 1]) : CharType.Invalid; char c = fileName[i]; //Console.WriteLine("Got char " + c + " current sb " + sb.ToString()); if (sb.Length == 0) { if (charType != CharType.Invalid) sb.Append(c); } else { // Check delimit condition: In the middle of two (and only two) uppercase characters that are followed by lowercase characters if (lastlastCharType != CharType.Uppercase && //e, current string builder = "A" lastCharType == CharType.Uppercase && //A charType == CharType.Uppercase && //C nextCharType == CharType.Lowercase) //a { yield return sb.ToString(); sb.Clear(); sb.Append(c); } else // Check delimit condition: On switch from uppercase string of 3+ to lowercase if (lastlastCharType == CharType.Uppercase && //M, current string builder = "A" lastCharType == CharType.Uppercase && //A charType == CharType.Uppercase && //N nextCharType == CharType.Lowercase) //m { sb.Append(c); yield return sb.ToString(); sb.Clear(); } else // Check delimit condition: On switch from lowercase string to uppercase if (charType == CharType.Lowercase && //n nextCharType == CharType.Uppercase) //M { sb.Append(c); yield return sb.ToString(); sb.Clear(); } else // Check delimit condition: On switch from number to alphabet or vice versa if ((charType == CharType.Number && (nextCharType == CharType.Lowercase || nextCharType == CharType.Uppercase)) || (nextCharType == CharType.Number && (charType == CharType.Lowercase || charType == CharType.Uppercase))) { sb.Append(c); yield return sb.ToString(); sb.Clear(); } else // Check delimit condition: On reaching a non-alphanumeric value if (charType == CharType.Invalid) { if (sb.Length > 0) yield return sb.ToString(); sb.Clear(); } else // Check delimit condition: On reaching a non-alphanumeric value if(nextCharType == CharType.Invalid) { sb.Append(c); yield return sb.ToString(); sb.Clear(); } else // Didn't get delimited! 
{ // Console.WriteLine("Appending " + c + " " + lastlastCharType + " " + lastCharType + " " + charType + " " + nextCharType); sb.Append(c); } } } // for if (sb.Length > 0) yield return sb.ToString(); yield break; } public static string ToTitleCase(this string s) { return ExtractFileNameTokens(s).Select(token => char.ToUpper(token[0]) + token.Substring(1)).Join(" "); } private static CharType GetCharType(char c) { if ('a' <= c && c <= 'z') return CharType.Lowercase; else if ('A' <= c && c <= 'Z') return CharType.Uppercase; else if ('0' <= c && c <= '9') return CharType.Number; else return CharType.Invalid; } private enum CharType { Invalid, Lowercase, Uppercase, Number } /// <summary> /// Formats a name from UpperCamelCase to Upper Camel Case /// </summary> /// <param name="name"></param> /// <returns></returns> public static string FormatName(string name) { name = name + " "; name = name[0].ToString().ToUpper() + name.Substring(1); //http://stackoverflow.com/questions/4511087/regex-convert-camel-case-to-all-caps-with-underscores string _RESULT_VAL = Regex.Replace(name, @"(?x)( [A-Z][a-z,0-9]+ | [A-Z]+(?![a-z]) )", "_$0"); //Console.WriteLine("* " + _RESULT_VAL); string RESULT_VAL = _RESULT_VAL.Substring(1); //Console.WriteLine("# " + RESULT_VAL); var result = from part in RESULT_VAL.Split(new char[]{ '_', ' '}) let partPad = part + " " let firstChar = part.Length > 3 ? 
partPad[0].ToString().ToUpper() : partPad[0].ToString().ToLower() select (firstChar + partPad.Substring(1).ToLower()).Trim(); string resultString = string.Join(" ", result.Where((s)=> !string.IsNullOrWhiteSpace(s)) .ToArray()).Trim(); //Make the first letter of the first term capitalized resultString = resultString[0].ToString().ToUpper() + resultString.Substring(1); //Replace multiple space occurrences string realResult = string.Join(" ", resultString.QASS(' ')); //Console.WriteLine("> " + realResult); return realResult; } public static string RemoveNonalphanumeric(this string s) { char[] arr = s.ToCharArray(); arr = Array.FindAll<char>(arr, (c => (char.IsLetterOrDigit(c) || char.IsWhiteSpace(c) || c == '-'))); return new string(arr); } /// <summary> /// http://stackoverflow.com/questions/221925/creating-a-byte-array-from-a-stream /// </summary> /// <param name="input"></param> /// <returns></returns> public static byte[] ReadFully(Stream input) { byte[] buffer = new byte[16 * 1024]; using (MemoryStream ms = new MemoryStream()) { int read; while ((read = input.Read(buffer, 0, buffer.Length)) > 0) { ms.Write(buffer, 0, read); } return ms.ToArray(); } } /// <summary> /// Returns an array containing numbers spaced between 0 and the given maximum value /// </summary> /// <param name="maximum"> /// The number which the result approaches from 0 to its last index /// </param> /// <param name="numElements"> /// The number of elements in the result (includes 0 and maximum) /// </param> /// <param name="uniformityFactor"> /// Greater than 0 /// </param> /// <param name="getRandom">Returns a value in [0.0, 1.0)</param> /// <returns></returns> public static double[] GenerateRandomCumulativeDistribution( double maximum, int numElements, double uniformityFactor, Func<double> getRandom) { var weights = new double[numElements]; weights[0] = 0.0; // actually implicit, but here for readability for (int i = 1; i < weights.Length; i++) weights[i] = getRandom() + uniformityFactor; // 
:: every element equals the sum of the elements before it for (int i = 1; i < weights.Length; i++) weights[i] += weights[i - 1]; // :: normalize all elements to maximum value keysRemaining for (int i = 0; i <= weights.Length - 2; i++) weights[i] = maximum * weights[i] / weights[weights.Length - 1]; weights[weights.Length - 1] = maximum; return weights; } public static double[] GenerateRandomCumulativeDistribution( double maximum, int numElements, double uniformityFactor) { return GenerateRandomCumulativeDistribution( maximum, numElements, uniformityFactor, StaticRandom.NextDouble ); } /// <summary> /// Gets the attribute of Enum value /// </summary> /// <typeparam name="TAttribute"></typeparam> /// <param name="enumValue"></param> /// <returns></returns> public static TAttribute GetAttributeOrNull<TAttribute>(this Enum enumValue) where TAttribute : Attribute { var enumType = enumValue.GetType(); var memberInfo = enumType.GetTypeInfo().DeclaredMembers.First(member => member.Name.Equals(enumValue.ToString())); var attributes = memberInfo.GetCustomAttributes(typeof(TAttribute), false); return (TAttribute)attributes.FirstOrDefault(); } public static TAttribute GetAttributeOrNull<TAttribute>(this object instance) where TAttribute : Attribute { var instanceType = instance as Type ?? 
instance.GetType(); return GetAttributeOrNull<TAttribute>(instanceType); } public static TAttribute GetAttributeOrNull<TAttribute>(this Type type) where TAttribute : Attribute { var typeInfo = type.GetTypeInfo(); return GetAttributeOrNull<TAttribute>(typeInfo); } public static TAttribute GetAttributeOrNull<TAttribute>(this TypeInfo typeInfo) where TAttribute : Attribute { var attributes = typeInfo.GetCustomAttributes(typeof(TAttribute), false); return (TAttribute)attributes.FirstOrDefault(); } public static bool IsThrown<TException>(Action action) where TException : Exception { try { action(); return false; } catch (TException) { return true; } } public static TValue KeepExisting<TKey, TValue>(TKey key, TValue value) { return value; } public static long GetUnixTimeMilliseconds() { return (long)(DateTime.UtcNow - new DateTime(1970, 1, 1, 0, 0, 0, 0, DateTimeKind.Utc)).TotalMilliseconds; } public static string NextToken(string input, out string token) { input = input.Trim(); var firstSpaceIndex = input.IndexOf(' '); string remaining; if (firstSpaceIndex < 0) { token = input; remaining = ""; } else { token = input.Substring(0, firstSpaceIndex); remaining = input.Substring(firstSpaceIndex + 1); } return remaining; } } } <file_sep>using Dargon.Ryu; using ItzWarty.Collections; using ItzWarty.Pooling; using NMockito; using Xunit; namespace ItzWarty { public class RyuPackageTests : NMockitoInstance { [Fact] public void Run() { var ryu = new RyuFactory().Create(); ryu.Touch<ItzWartyCommonsRyuPackage>(); ryu.Setup(); AssertTrue(ryu.Get<ICollectionFactory>() is CollectionFactory); AssertTrue(ryu.Get<ObjectPoolFactory>() is DefaultObjectPoolFactory); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Threading.Tasks; using Dargon.Ryu; using ItzWarty.Collections; using ItzWarty.Pooling; namespace ItzWarty { public class ItzWartyCommonsRyuPackage : RyuPackageV1 { public ItzWartyCommonsRyuPackage() { 
Singleton<ICollectionFactory, CollectionFactory>(); Singleton<ObjectPoolFactory, DefaultObjectPoolFactory>(); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.IO; namespace ItzWarty.Database { public class DatabaseClient { /// <summary> /// Root is the root database [the location of the database folder] /// </summary> public string root; /// <summary> /// Folderstack is pushed into whenever you enter, popped when you go up, and cleared when you exit... /// </summary> public List<string> folderStack = new List<string>(); /// <summary> /// When we jump to root, sometimes we want to define our root to actually be a subdatabase of our root database, so we push to here /// When we reset, we will copy all contents from here to the folderstack again. /// </summary> public List<string> defaultFolderStack = new List<string>(); public string currentDBPath { get { if (folderStack.Count == 0) return dbName; else { string final = dbName; for (int i = 0; i < folderStack.Count; i++) final += "." + folderStack[i]; return final; } } } public string currentFolderLocation { get { string final = root + "\\"; for (int i = 0; i < folderStack.Count; i++) final += "\\" + folderStack[i]; return final; } } public string dbName; /// <summary> /// Sets the DB Root, if it doesnt exist, create the DB Root /// </summary> /// <param name="DBNAME"></param> public DatabaseClient(string DBNAME) { this.root = Directory.GetCurrentDirectory() + "\\" + DBNAME; this.dbName = DBNAME; this.defaultFolderStack.Clear(); //We want to be at root, so we shouldn't be inside any subfolder... 
this.ReturnToRoot(); if (!Directory.Exists(root)) { Directory.CreateDirectory(root); Console.WriteLine("Database \"" + dbName + "\" created"); } ReturnToRoot(); } public void ReturnToRoot() { this.folderStack.Clear(); this.folderStack = new List<string>(this.defaultFolderStack.ToArray()); } public void ReturnToSuperParent() { this.folderStack.Clear(); this.defaultFolderStack.Clear(); } public enum DBResponse { InvalidName, Exists, doesNotExist, Selected, Created, moved, Deleted } public string[] GetExistingDBNames() { string[] toreturn = Directory.GetDirectories(currentFolderLocation); string dirToReplaceToNothing = ""; ReturnToRoot(); SelectDB("Channels"); dirToReplaceToNothing = currentFolderLocation; for (int i = 0; i < toreturn.Length; i++) toreturn[i] = toreturn[i].Replace(dirToReplaceToNothing, ""); return toreturn; } public System.Collections.ArrayList dirList = new System.Collections.ArrayList(); public string[] OutputListofDBs() { dirList.Clear(); //Console.WriteLine(currentFolderLocation); foreach (string dirName in Directory.GetDirectories(currentFolderLocation)) { SubOutputListofDBs(dirName); } string[] final = new string[dirList.Count]; for (int i = 0; i < dirList.Count; i++) { //Console.WriteLine((string)dirList[i]); final[i] = ((string)(dirList[i])); } return final; } public void SubOutputListofDBs(string curDir) { //Console.WriteLine(curDir.Replace(currentFolderLocation, "")); dirList.Add(curDir.Replace(currentFolderLocation, "")); foreach (string dirName in Directory.GetDirectories(curDir)) { SubOutputListofDBs(dirName); } } public bool IsWinNameOkay(string val) { //NTFS does not allow / ? 
< > \ : * | if (string.IsNullOrEmpty(val)) return false; foreach (char invalid in "/?<>\\:*|") if (val.IndexOf(invalid) >= 0) return false; //FAT doesn't allow the above and ^ if (val.IndexOf('^') >= 0) return false; //reserved device names under windows string[] reserved = { "com1", "com2", "com3", "com4", "com5", "com6", "com7", "com8", "com9", "lpt1", "lpt2", "lpt3", "lpt4", "lpt5", "lpt6", "lpt7", "lpt8", "lpt9", "con", "nul", "prn" }; if (reserved.Contains(val.ToLowerInvariant())) return false; //You can't have a period or a space at the end of a file name if (val.EndsWith(" ") || val.EndsWith(".")) return false; //we should be fine o_O return true; } #region dbAction /// <summary> /// Renames a database (folder) under the current location. /// </summary> /// <param name="oldDBNAME"></param> /// <param name="newDBNAME"></param> /// <returns>Invalid Name, exists,
does not exist, moved</returns> public DBResponse RenameDB(string oldDBNAME, string newDBNAME) { if (!IsWinNameOkay(newDBNAME)) return DBResponse.InvalidName; if (!IsWinNameOkay(oldDBNAME)) return DBResponse.InvalidName; if (!Directory.Exists(currentFolderLocation+"\\"+oldDBNAME)) return DBResponse.doesNotExist; if (Directory.Exists(currentFolderLocation + "\\" + newDBNAME)) return DBResponse.Exists; Directory.Move(currentFolderLocation + "\\" + oldDBNAME, currentFolderLocation + "\\" + newDBNAME); return DBResponse.moved; } /// <summary> /// /// </summary> /// <param name="targetDB"></param> /// <returns>DBResponses: Exists, doesNotExist</returns> public DBResponse ExistDB(string targetDB) { if (Directory.Exists(currentFolderLocation + "\\"+targetDB)) return DBResponse.Exists; return DBResponse.doesNotExist; } /// <summary> /// /// </summary> /// <param name="targetDB"></param> /// <returns>Exists, Created, InvalidName</returns> public DBResponse CreateDB(string targetDB) { if (ExistDB(targetDB) == DBResponse.Exists) return DBResponse.Exists; if (!IsWinNameOkay(targetDB)) return DBResponse.InvalidName; //create the DB Directory.CreateDirectory(currentFolderLocation + "\\" + targetDB.Trim()); //dummy file for zip archives //File.WriteAllLines(currentFolderLocation + "\\" + targetDB.Trim() + "\\Dummy", new string[] { "dummy" }); Console.WriteLine("Database \"" + currentDBPath + "." 
+ targetDB + "\" created"); return DBResponse.Created; } //DB = folder //Value = file public DBResponse SelectDB(string targetDB) { return SelectDB(targetDB, false); } public DatabaseClient SelectDB2(string targetDB) { DatabaseClient result = this.Clone(); DBResponse response = result.SelectDB(targetDB); return (response == DBResponse.Selected) ? result : null; } public DatabaseClient SelectDB2(string targetDB, bool autocreate) { DatabaseClient result = this.Clone(); DBResponse response = result.SelectDB(targetDB, autocreate); return (response == DBResponse.Selected) ? result : null; } /// <summary> /// /// </summary> /// <param name="targetDB"></param> /// <returns>DBResponses: Selected, doesNotExist</returns> public DBResponse SelectDB(string targetDB, bool autocreate) { if (autocreate) { if (ExistDB(targetDB) == DBResponse.doesNotExist) CreateDB(targetDB); } if (Directory.Exists(currentFolderLocation + "\\" + targetDB)) { folderStack.Add(targetDB); return DBResponse.Selected; } else { Console.WriteLine(currentFolderLocation + "\\" + targetDB + " does not exist"); return DBResponse.doesNotExist; } } public DBResponse DeleteDB(string targetDB) { if (ExistDB(targetDB) == DBResponse.doesNotExist) return DBResponse.doesNotExist; if (!IsWinNameOkay(targetDB)) return DBResponse.InvalidName; Directory.Delete(currentFolderLocation + "\\" + targetDB.Trim(), true); return DBResponse.Deleted; } #endregion #region valueAction public bool ValueExists(string valueName) { string loc = currentFolderLocation + "\\" + valueName; return File.Exists(loc); } public string GetValue(string valueName) { string loc = currentFolderLocation + "\\" + valueName; if (!File.Exists(loc)) throw new Exception("Value " + valueName + " does NOT exist in " + loc); return File.ReadAllText(loc); } /// <summary> /// Stores the value as the
(string)content /// </summary> /// <param name="valueName"></param> /// <param name="valueContent"></param> public void SetValue(string valueName, object valueContent) { if (!IsWinNameOkay(valueName)) throw new Exception(valueName + " is an INVALID windows name at " + currentFolderLocation); string loc = currentFolderLocation + "\\" + valueName; if (File.Exists(loc)) File.Delete(loc); //start anew FileStream myFS = File.Create(loc); byte[] myBytes = Encoding.ASCII.GetBytes(valueContent.ToString()); myFS.Write(myBytes, 0, myBytes.Length); myFS.Close(); Console.WriteLine(currentDBPath + "." + valueName + " now set to " + valueContent); } public bool GetBool(string valueName) { return GetValue(valueName) == "1"; } public void SetBool(string valueName, bool value) { SetValue(valueName, (value) ? "1" : "0"); } public int GetInt(string intName) { return int.Parse(GetValue(intName)); } public void SetInt(string intName, int value) { SetValue(intName, value.ToString()); } #endregion public bool FileExists(string fileName) { string loc = currentFolderLocation + "\\" + fileName; return File.Exists(loc); } public void WriteFile(string fileName, byte[] data) { string loc = currentFolderLocation + "\\" + fileName; FileStream fs = File.Open(loc, FileMode.Create, FileAccess.ReadWrite, FileShare.None); //While we are writing, you can't do anything with it, sorry.
fs.Write(data, 0, data.Length); fs.Close(); } public DatabaseClient Clone() { DatabaseClient copy = new DatabaseClient(this.dbName); copy.defaultFolderStack = new List<string>(this.defaultFolderStack.ToArray()); copy.folderStack = new List<string>(this.folderStack.ToArray()); return copy; } public void SetCurrentDBAsRoot() { defaultFolderStack.Clear(); for (int i = 0; i < folderStack.Count; i++) defaultFolderStack.Add(folderStack[i]); } public void CreateTable(string tableName, string[] columns) { wDBTable.CreateTable(tableName, this); } public wDBTable GetTable(string name) { return new wDBTable(name, this); } public string[] ListDatabases() { string[] childDirs = Directory.GetDirectories(currentFolderLocation); for (int i = 0; i < childDirs.Length; i++) childDirs[i] = childDirs[i].Replace(currentFolderLocation, ""); return childDirs; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Runtime.CompilerServices; using System.Text; using System.Threading.Tasks; namespace ItzWarty { public static partial class Extensions { [MethodImpl(MethodImplOptions.AggressiveInlining)] public static bool WithinII(this int value, int lower, int upper) { return value >= lower && value <= upper; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static bool WithinIE(this int value, int lower, int upper) { return value >= lower && value < upper; } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Threading.Tasks; namespace ItzWarty.Collections { public interface ICollectionFactory { IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(); IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEnumerable<KeyValuePair<K, V>> collection); IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEqualityComparer<K> comparer); IConcurrentDictionary<K, V> CreateConcurrentDictionary<K, V>(IEnumerable<KeyValuePair<K, V>> collection, IEqualityComparer<K>
comparer); IConcurrentSet<T> CreateConcurrentSet<T>(); IConcurrentSet<T> CreateConcurrentSet<T>(IEnumerable<T> collection); IConcurrentSet<T> CreateConcurrentSet<T>(IEqualityComparer<T> comparer); IConcurrentSet<T> CreateConcurrentSet<T>(IEnumerable<T> collection, IEqualityComparer<T> comparer); IConcurrentBag<T> CreateConcurrentBag<T>(); IConcurrentBag<T> CreateConcurrentBag<T>(IEnumerable<T> collection); IHashSet<T> CreateHashSet<T>(); IHashSet<T> CreateHashSet<T>(IEnumerable<T> collection); IHashSet<T> CreateHashSet<T>(IEqualityComparer<T> comparer); IHashSet<T> CreateHashSet<T>(IEnumerable<T> collection, IEqualityComparer<T> comparer); ISortedSet<T> CreateSortedSet<T>(); ISortedSet<T> CreateSortedSet<T>(IEnumerable<T> collection); ISortedSet<T> CreateSortedSet<T>(IComparer<T> comparer); ISortedSet<T> CreateSortedSet<T>(IEnumerable<T> collection, IComparer<T> comparer); IReadOnlyCollection<T> CreateImmutableCollection<T>(); IReadOnlyCollection<T> CreateImmutableCollection<T>(params T[] args); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8); IReadOnlyDictionary<K, 
V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8, K k9, V v9); IReadOnlyDictionary<K, V> CreateImmutableDictionary<K, V>(K k1, V v1, K k2, V v2, K k3, V v3, K k4, V v4, K k5, V v5, K k6, V v6, K k7, V v7, K k8, V v8, K k9, V v9, K k10, V v10); IReadOnlySet<T> CreateImmutableSet<T>(); IReadOnlySet<T> CreateImmutableSet<T>(params T[] args); IMultiValueDictionary<K, V> CreateMultiValueDictionary<K, V>(); IMultiValueDictionary<K, V> CreateMultiValueDictionary<K, V>(IEqualityComparer<K> comparer); IMultiValueSortedDictionary<K, V> CreateMultiValueSortedDictionary<K, V>(); IMultiValueSortedDictionary<K, V> CreateMultiValueSortedDictionary<K, V>(IComparer<K> comparer); IOrderedDictionary<K, V> CreateOrderedDictionary<K, V>(); IOrderedDictionary<K, V> CreateOrderedDictionary<K, V>(IEqualityComparer<K> comparer); IOrderedMultiValueDictionary<K, V> CreateOrderedMultiValueDictionary<K, V>(ValuesSortState valuesSortState = ValuesSortState.Unsorted); IQueue<T> CreateQueue<T>(); IPriorityQueue<TValue, TPriority> CreatePriorityQueue<TValue, TPriority>(int capacity) where TPriority : IComparable<TPriority>, IEquatable<TPriority> where TValue : class, IPriorityQueueNode<TPriority>; IConcurrentQueue<T> CreateConcurrentQueue<T>(); IConcurrentQueue<T> CreateSingleConsumerSingleProducerConcurrentQueue<T>() where T : class; IUniqueIdentificationSet CreateUniqueIdentificationSet(bool filled); IUniqueIdentificationSet CreateUniqueIdentificationSet(uint low, uint high); IListDictionary<K, V> CreateListDictionary<K, V>(); IDictionary<K, V> CreateDictionary<K, V>(); IDictionary<K, V> CreateDictionary<K, V>(IEqualityComparer<K> comparer); IDictionary<K, V> CreateSortedDictionary<K, V>(); IDictionary<K, V> CreateSortedDictionary<K, V>(IComparer<K> comparer); } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace ItzWarty.Database { public enum 
ComparisonOperator { Is, IsNot, LessThan, LessThanEqualTo, GreaterThan, GreaterThanEqualTo, StartsWith } public class wDBRowCollection { public List<wDBTableRow> rows = new List<wDBTableRow>(); public wDBRowCollection(){} public wDBRowCollection(List<wDBTableRow> rows) { this.rows = rows; } public wDBRowCollection Where(string what, ComparisonOperator op, object operand) { string operandTwo = ""; if (operand is int) operandTwo = ((int)operand).ToString(); else if (operand is string) operandTwo = (string)operand; else if (operand is bool) operandTwo = ((bool)operand) ? "1" : "0"; wDBRowCollection newCollection = new wDBRowCollection(); for (int i = 0; i < this.rows.Count; i++) { string operandOne = this.rows[i][what]; double opOne = 0, opTwo = 0; double.TryParse(operandOne, out opOne); double.TryParse(operandTwo, out opTwo); switch(op) { case ComparisonOperator.Is: if(operandOne == operandTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.IsNot: if(operandOne != operandTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.LessThan: if(opOne < opTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.LessThanEqualTo: if(opOne <= opTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.GreaterThan: if(opOne > opTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.GreaterThanEqualTo: if(opOne >= opTwo) newCollection.rows.Add(this.rows[i]); break; case ComparisonOperator.StartsWith: { if (operandOne.StartsWith(operandTwo)) { newCollection.rows.Add(this.rows[i]); } break; } } } return newCollection; } public int Count { get { return this.rows.Count; } } public wDBTableRow this[int i] { get { if (i >= this.rows.Count) return null; return this.rows[i]; } set { this.rows[i] = value; } } } } <file_sep>using System; using System.Collections.Generic; using System.Diagnostics; using System.Runtime.CompilerServices; using System.Text; using ItzWarty.FormatProviders; namespace
ItzWarty { public static class Extend { public static T[] Cast<T, U>(this U[] values, Func<U, T> cast) { T[] result = new T[values.Length]; for (int i = 0; i < result.Length; i++) result[i] = cast(values[i]); return result; } public static KeyValuePair<TKey, TValue> PairValue<TKey, TValue>(this TKey key, TValue value) { return new KeyValuePair<TKey, TValue>(key, value); } public static KeyValuePair<TKey, TValue> PairKey<TKey, TValue>(this TValue value, TKey key) { return key.PairValue(value); } /// <summary> /// Calls the given function, passing self as the argument. /// </summary> public static T With<T>(this T self, Action<T> func) { func(self); return self; } /// <summary> /// Calls the given function, passing self as the argument. /// </summary> public static U With<T, U>(this T self, Func<T, U> func) { return func(self); } /// <summary> /// Runs self through the function, and returns the result. /// </summary> /// <typeparam name="T">The type of the fileName parameter</typeparam> /// <typeparam name="U">The type of the output result</typeparam> /// <param name="self">The fileName parameter which is passed through func</param> /// <param name="func">The function which we pass our fileName parameter through.</param> /// <returns>func(self)</returns> public static U Pass<T, U>(this T self, Func<T, U> func) { return func(self); } /// <summary> /// Checks whether argument is <see langword="null"/> and throws <see cref="ArgumentNullException"/> if so. 
/// </summary> /// <param name="argument">Argument to check on <see langword="null"/>.</param> /// <param name="argumentName">Argument name to pass to Exception constructor.</param> /// <returns>Specified argument.</returns> /// <exception cref="ArgumentNullException"/> [DebuggerStepThrough] public static T ThrowIfNull<T>(this T argument, string argumentName) where T : class { if (argument == null) { throw new ArgumentNullException(argumentName); } else { return argument; } } public static TSource MinBy<TSource, TKey>(this IEnumerable<TSource> source, Func<TSource, TKey> selector) { return source.MinBy(selector, Comparer<TKey>.Default); } public static TSource MinBy<TSource, TKey>(this IEnumerable<TSource> source, Func<TSource, TKey> selector, IComparer<TKey> comparer) { source.ThrowIfNull("source"); selector.ThrowIfNull("selector"); comparer.ThrowIfNull("comparer"); using (IEnumerator<TSource> sourceIterator = source.GetEnumerator()) { if (!sourceIterator.MoveNext()) { throw new InvalidOperationException("Sequence was empty"); } TSource min = sourceIterator.Current; TKey minKey = selector(min); while (sourceIterator.MoveNext()) { TSource candidate = sourceIterator.Current; TKey candidateProjected = selector(candidate); if (comparer.Compare(candidateProjected, minKey) < 0) { min = candidate; minKey = candidateProjected; } } return min; } } public static TSource MaxBy<TSource, TKey>(this IEnumerable<TSource> source, Func<TSource, TKey> selector) { return source.MaxBy(selector, Comparer<TKey>.Default); } public static TSource MaxBy<TSource, TKey>(this IEnumerable<TSource> source, Func<TSource, TKey> selector, IComparer<TKey> comparer) { source.ThrowIfNull("source"); selector.ThrowIfNull("selector"); comparer.ThrowIfNull("comparer"); using (IEnumerator<TSource> sourceIterator = source.GetEnumerator()) { if (!sourceIterator.MoveNext()) { throw new InvalidOperationException("Sequence was empty"); } TSource max = sourceIterator.Current; TKey maxKey = selector(max); while 
(sourceIterator.MoveNext()) { TSource candidate = sourceIterator.Current; TKey candidateProjected = selector(candidate); if (comparer.Compare(candidateProjected, maxKey) > 0) { max = candidate; maxKey = candidateProjected; } } return max; } } // http://stackoverflow.com/questions/311165/how-do-you-convert-byte-array-to-hexadecimal-string-and-vice-versa public static string ToHex(this byte[] a) { var hex = new StringBuilder(a.Length * 2); foreach (byte b in a) hex.AppendFormat("{0:x2}", b); return hex.ToString(); } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static long GetUnixTime(this DateTime dateTime) { return (long)(dateTime - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalSeconds; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static long GetUnixTimeMilliseconds(this DateTime dateTime) { return (long)(dateTime - new DateTime(1970, 1, 1, 0, 0, 0, DateTimeKind.Utc)).TotalMilliseconds; } //http://stackoverflow.com/questions/128618/c-file-size-format-provider public static string ToFileSize(this long l) { return String.Format(new FileSizeFormatProvider(), "{0:fs}", l); } private delegate K TryGetValueDelegate<K, V>(K key, out V value); public static bool Within(this double a, double b, double epsilon) { return Math.Abs(a - b) <= epsilon; } public static bool Within(this float a, float b, float epsilon) { return Math.Abs(a - b) <= epsilon; } } } <file_sep>using System; using System.Collections.Generic; using System.Runtime.CompilerServices; using ItzWarty.Collections; namespace ItzWarty { public static partial class Extensions { public static void ForEach<T>(this IEnumerable<T> enumerable, Action<T> action) { foreach (var element in enumerable) { action(element); } } /// <summary> /// Gets a subarray of the given array /// http://stackoverflow.com/questions/943635/c-arrays-getting-a-sub-array-from-an-existing-array /// </summary> public static T[] SubArray<T>(this T[] data, int index, int length = -1) { if (length == -1) length 
= data.Length - index; T[] result = new T[length]; Array.Copy(data, index, result, 0, length); return result; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static T Get<T>(this T[] collection, int index) { return collection[index]; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static T Get<T>(this IList<T> collection, int index) { return collection[index]; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static V Get<K, V>(this IDictionary<K, V> dict, K key) { return dict[key]; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static V GetValueOrDefault<K, V>(this Dictionary<K, V> dict, K key) { return ((IDictionary<K, V>)dict).GetValueOrDefault(key); } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static V GetValueOrDefault<K, V>(this IDictionary<K, V> dict, K key) { V result; dict.TryGetValue(key, out result); return result; } [MethodImpl(MethodImplOptions.AggressiveInlining)] public static V GetValueOrDefault<K, V>(this IReadOnlyDictionary<K, V> dict, K key) { V result; dict.TryGetValue(key, out result); return result; } public static bool TryAdd<K, V>(this IConcurrentDictionary<K, V> dict, K key, Func<V> valueFactory) { bool added = false; dict.AddOrUpdate(key, (k) => { added = true; return valueFactory(); }, Util.KeepExisting); return added; } public static bool TryAdd<K, V>(this IConcurrentDictionary<K, V> dict, K key, Func<K, V> valueFactory) { bool added = false; dict.AddOrUpdate(key, (k) => { added = true; return valueFactory(k); }, Util.KeepExisting); return added; } public static bool TryRemove<K, V>(this IConcurrentDictionary<K, V> dict, K key, V value) { return dict.Remove(new KeyValuePair<K, V>(key, value)); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Threading.Tasks; using ItzWarty.Collections; namespace ItzWarty.Pooling { public interface ObjectPoolFactory { ObjectPool<T> CreatePool<T>(Func<T> generator); ObjectPool<T> 
CreatePool<T>(Func<T> generator, string name); } public class DefaultObjectPoolFactory : ObjectPoolFactory { private readonly ICollectionFactory collectionFactory; public DefaultObjectPoolFactory(ICollectionFactory collectionFactory) { this.collectionFactory = collectionFactory; } public ObjectPool<T> CreatePool<T>(Func<T> generator) { return new ObjectPoolImpl<T>(generator, collectionFactory.CreateConcurrentBag<T>()); } public ObjectPool<T> CreatePool<T>(Func<T> generator, string name) { return new ObjectPoolImpl<T>(generator, collectionFactory.CreateConcurrentBag<T>(), name); } } } <file_sep>//http://stackoverflow.com/questions/3130922/sortedsett-and-anonymous-icomparert-in-the-constructor-is-not-working using System; using System.Collections; using System.Collections.Generic; namespace ItzWarty.Comparers { public class LambdaComparer<T> : IComparer<T> { private readonly Comparison<T> comparison; public LambdaComparer(Comparison<T> comparison) { this.comparison = comparison; } public int Compare(T x, T y) { return comparison(x, y); } } public class LambdaComparer : IComparer { private readonly Func<object, object, int> comparison; public LambdaComparer(Func<object, object, int> comparison) { this.comparison = comparison; } public int Compare(object x, object y) { return comparison(x, y); } } public class LambdaEqualityComparer<T> : IEqualityComparer<T> { private readonly Func<T, int> hashcode; private readonly Func<T, T, bool> equals; public LambdaEqualityComparer(Func<T, T, bool> equals, Func<T, int> hashcode) { this.hashcode = hashcode; this.equals = equals; } public bool Equals(T x, T y) { return equals(x, y); } public int GetHashCode(T obj) { return hashcode(obj); } } } <file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Threading.Tasks; using SCC = System.Collections.Concurrent; namespace ItzWarty.Collections { public class ConcurrentBag<T> : SCC.ConcurrentBag<T>, IConcurrentBag<T> { public 
ConcurrentBag() : base() { } public ConcurrentBag(IEnumerable<T> collection) : base(collection) { } } } <file_sep>using System; using ItzWarty.Collections; namespace ItzWarty.Pooling { public class ObjectPoolImpl<T> : ObjectPool<T> { private readonly Func<T> generator; private readonly IConcurrentBag<T> container; private readonly string name; public ObjectPoolImpl(Func<T> generator) : this(generator, new ConcurrentBag<T>(), null) {} public ObjectPoolImpl(Func<T> generator, IConcurrentBag<T> container) : this(generator, container, null) { } public ObjectPoolImpl(Func<T> generator, string name) : this(generator, new ConcurrentBag<T>(), name) { } public ObjectPoolImpl(Func<T> generator, IConcurrentBag<T> container, string name) { generator.ThrowIfNull("generator"); container.ThrowIfNull("container"); this.generator = generator; this.container = container; this.name = name; } public string Name => name; public int Count => container.Count; public T TakeObject() { T result; if (!container.TryTake(out result)) { result = generator(); } return result; } public void ReturnObject(T item) { container.Add(item); } } }<file_sep>using System; using System.Collections.Generic; using System.Linq; using System.Text; namespace ItzWarty { /// <summary> /// Timer class which is to be used in the using() block /// </summary> public class UsingTimer : IDisposable { private readonly Action<UsingTimer> m_onDispose; public DateTime StartTime { get; private set; } /// <summary> /// Creates a new using timer and sets the start time to the current time /// </summary> public UsingTimer(Action<UsingTimer> onDispose) { m_onDispose = onDispose; StartTime = DateTime.Now; } /// <summary> /// Disposes of the using timer. 
/// </summary> public void Dispose() { if (m_onDispose != null) m_onDispose(this); } /// <summary> /// The time that has elapsed since the Using Timer was created /// </summary> public TimeSpan ElapsedTime { get { return DateTime.Now - StartTime; } } // /// <summary> // /// Returns a method that formats the given string, swapping {0} for milliseconds // /// </summary> // /// <param name="format"></param> // /// <returns></returns> // public static Action<UsingTimer> PrintTimeSpanOnDispose(string s = "{0} ms") // { // return delegate(UsingTimer timer) // { // Console.WriteLine(s, timer.ElapsedTime.TotalMilliseconds); // }; // } /// <summary> /// Returns a method that does nothing on dispose. /// /// This simply returns null as the method. /// </summary> public static Action<UsingTimer> DoNothingOnDispose() { return null; } } } <file_sep>using System; using System.Diagnostics; using System.Linq; using System.Reflection; using System.Runtime.InteropServices; namespace ItzWarty.Utilities { public static class AttributeUtilities { private static readonly AttributeUtilitiesInterface instance = new AttributeUtilitiesImpl(); public static bool TryGetInterfaceGuid(Type interfaceType, out Guid guid) { return instance.TryGetInterfaceGuid(interfaceType, out guid); } } public interface AttributeUtilitiesInterface { bool TryGetInterfaceGuid(Type interfaceType, out Guid guid); } public class AttributeUtilitiesImpl : AttributeUtilitiesInterface { public bool TryGetInterfaceGuid(Type interfaceType, out Guid guid) { var attribute = interfaceType.GetAttributeOrNull<GuidAttribute>(); if (attribute == null) { guid = Guid.Empty; return false; } else { guid = Guid.Parse(attribute.Value); return true; } } } } <file_sep>//http://stackoverflow.com/questions/3130922/sortedsett-and-anonymous-icomparert-in-the-constructor-is-not-working using System; using System.Collections; using System.Collections.Generic; using System.Linq; using System.Text; namespace ItzWarty { public class 
FuncComparer<T> : IComparer<T> { private readonly Comparison<T> comparison; public FuncComparer(Comparison<T> comparison) { this.comparison = comparison; } public int Compare(T x, T y) { return comparison(x, y); } } public class FuncComparer : IComparer { private readonly Func<object, object, int> comparison; public FuncComparer(Func<object, object, int> comparison) { this.comparison = comparison; } public int Compare(object x, object y) { return comparison(x, y); } } } <file_sep>using System; using System.Collections; using System.Collections.Generic; namespace ItzWarty.Comparers { public class EqualityComparer<T> : IEqualityComparer<T>, IEqualityComparer { private readonly Func<T, T, bool> m_equalityComparer; private readonly Func<T, int> m_hasher; public EqualityComparer(Func<T, T, bool> equalityComparer, Func<T, int> hasher) { m_equalityComparer = equalityComparer; m_hasher = hasher; } public bool Equals(T x, T y) { return m_equalityComparer(x, y); } public int GetHashCode(T obj) { return m_hasher(obj); } bool IEqualityComparer.Equals(object x, object y) { return m_equalityComparer((T)x, (T)y); } public int GetHashCode(object obj) { return m_hasher((T)obj); } } }
657f68eb12b04b56e448437805b069ca9258e1a3
[ "C#" ]
23
C#
miyu/commons
db9df9f5a3bd7260bb85c0cfe8ce600996dbc30c
5c21f768333f638131b6f49c9aa0c0cdda1919c8
refs/heads/master
<repo_name>dhruvsai/cricScoreR<file_sep>/RScripts/DataPrep.R # Set this to be the folder where raw data is stored. Get raw data from http://cricsheet.org/ setwd("C:/Users/admin/Downloads/Use_Case_Dhruv/t20_csv") # Refine the data: strip the initial match-info and version lines from every csv file. files <- list.files(pattern="\\.csv$") t<-lapply(files, function(x){ filex <- readLines(x) filex <- as.character(sapply(filex,function(y)if(!(grepl("info",y)||grepl("version",y)))return(y))) filex <- filex[filex!="NULL"] writeLines(filex,paste0("C://Users/admin/Downloads/Use_Case_Dhruv/Refined_t20data/",x)) }) testfiles <- sample(files,round(0.2*length(files))) trainfiles <- setdiff(files,testfiles) traindf = do.call(rbind, lapply(trainfiles, function(x) read.csv(paste0("C://Users/admin/Downloads/Use_Case_Dhruv/Refined_t20data/",x), header=FALSE,stringsAsFactors = FALSE))) # Add column names to each refined csv file. filelist = list.files("E:\\UseCase\\Refined_ipldata") lapply(1:length(filelist),function(x){ prev = read.csv(paste0("E:\\UseCase\\Refined_ipldata\\",filelist[[x]])) colnames(prev) = c("Ball","Inning","Over","Team","StrikeBatsman", "NonStrikeBatsman","Bowler","RunsScored","Extras","DismissalMethod", "BatsmanOut") write.csv(prev,paste0("E:\\UseCase\\Refined_ipldata\\",filelist[[x]]),row.names = FALSE) }) # Bind all refined csv files into one data frame and write it out as a single csv. FinalDataFrame=read.csv(paste0("E:\\UseCase\\Refined_ipldata\\",filelist[[1]])) for(x in 2:length(filelist)){ temp = read.csv(paste0("E:\\UseCase\\Refined_ipldata\\",filelist[[x]])) FinalDataFrame = rbind(FinalDataFrame,temp) } write.csv(FinalDataFrame,"E:\\UseCase\\BindIPLData.csv")
bac8c9a363cd08985161472a3483cc432b35a503
[ "R" ]
1
R
dhruvsai/cricScoreR
db406a03d416813edf34794140aafaca6594f95b
1b8384ba10ff28c9f22e9305e15ec7461330d835
refs/heads/master
<repo_name>magarnier/tuleap<file_sep>/plugins/tracker/tests/Artifact/Attachment/TemporaryFileManagerTest.php <?php /** * Copyright (c) Enalean, 2014. All Rights Reserved. * * This file is a part of Tuleap. * * Tuleap is free software; you can redistribute it and/or modify * it under the terms of the GNU General Public License as published by * the Free Software Foundation; either version 2 of the License, or * (at your option) any later version. * * Tuleap is distributed in the hope that it will be useful, * but WITHOUT ANY WARRANTY; without even the implied warranty of * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the * GNU General Public License for more details. * * You should have received a copy of the GNU General Public License * along with Tuleap. If not, see <http://www.gnu.org/licenses/>. */ require_once __DIR__ .'/../../bootstrap.php'; class TemporaryFileManager_BaseTest extends TuleapTestCase { protected $file_manager; protected $cache_dir; public function setUp() { parent::setUp(); ForgeConfig::store(); $this->cache_dir = trim(`mktemp -d -p /var/tmp cache_dir_XXXXXX`); ForgeConfig::set('codendi_cache_dir', $this->cache_dir); $user = aUser()->withId(101)->build(); $dao = mock('Tracker_Artifact_Attachment_TemporaryFileManagerDao'); stub($dao)->create()->returns(1); $file_info_factory = mock('Tracker_FileInfoFactory'); $this->file_manager = new Tracker_Artifact_Attachment_TemporaryFileManager($user, $dao, $file_info_factory); } public function tearDown() { exec('rm -rf '. 
escapeshellarg($this->cache_dir)); ForgeConfig::restore(); parent::tearDown(); } } class TemporaryFileManager_getDiskUsageTest extends TemporaryFileManager_BaseTest { public function itReturns0WhenNoFiles() { $this->assertEqual(0, $this->file_manager->getDiskUsage()); } public function itReturnsTheSizeOfTheOnlyFile() { file_put_contents($this->cache_dir .'/rest_attachement_temp_101_mona_lisa.png', 'Content'); $this->assertEqual(7, $this->file_manager->getDiskUsage()); } public function itSumsUpAllTheFiles() { file_put_contents($this->cache_dir .'/rest_attachement_temp_101_mona_lisa.png', 'Content'); file_put_contents($this->cache_dir .'/rest_attachement_temp_101_liza_monet.png', 'Another content'); $this->assertEqual(22, $this->file_manager->getDiskUsage()); } public function itSumsOnlyCurrentUserFiles() { file_put_contents($this->cache_dir .'/rest_attachement_temp_101_mona_lisa.png', 'Content'); file_put_contents($this->cache_dir .'/rest_attachement_temp_101_liza_monet.png', 'Another content'); file_put_contents($this->cache_dir .'/rest_attachement_temp_102_hannibal_lecteur.png', 'Whatever'); $this->assertEqual(22, $this->file_manager->getDiskUsage()); } } class TemporaryFileManager_saveTest extends TemporaryFileManager_BaseTest { public function setUp() { parent::setUp(); ForgeConfig::set('sys_max_size_upload', 10); } public function itCanSaveATemporaryFilesIfQuotaIsNotExceeded() { file_put_contents($this->cache_dir .'/rest_attachement_temp_101_mona_lisa.png', 'Content'); $temporary = $this->file_manager->save('jette_lit.png', 'Mugshot', 'image/png'); $this->assertEqual('jette_lit.png', $temporary->getName()); } public function itCanSaveATemporaryFilesIfQuotaIsExceededBySomeoneElse() { file_put_contents($this->cache_dir .'/rest_attachement_temp_102_mona_lisa.png', 'Content that exceed quota'); $temporary = $this->file_manager->save('jette_lit.png', 'Mugshot', 'image/png'); $this->assertEqual('jette_lit.png', $temporary->getName()); } public function 
itCannotSaveATemporaryFilesIfQuotaIsExceeded() { file_put_contents($this->cache_dir .'/rest_attachement_temp_101_mona_lisa.png', 'Content that exceed quota'); $this->expectException('Tuleap\Tracker\Artifact\Attachment\QuotaExceededException'); $this->file_manager->save('jette_lit.png', 'Mugshot', 'image/png'); } } class TemporaryFileManager_appendChunkTest extends TemporaryFileManager_BaseTest { private $empty_file; private $wrong_path_file; public function setUp() { parent::setUp(); ForgeConfig::set('sys_max_size_upload', 10); $this->empty_file = new Tracker_Artifact_Attachment_TemporaryFile( 1, 'jette_lit.png', 'random_tmpname', 'Mugshot', 0, 0, 101, 0, 'image/png' ); touch($this->cache_dir .'/rest_attachement_temp_101_'. $this->empty_file->getTemporaryName()); $this->wrong_path_file = new Tracker_Artifact_Attachment_TemporaryFile( 1, 'jette_lit.png', 'wrong_path', 'Mugshot', 0, 0, 101, 0, 'image/png' ); } public function itThrowsExceptionIfOffsetIsNotValid() { $this->expectException('Tracker_Artifact_Attachment_InvalidOffsetException'); $this->file_manager->appendChunk(base64_encode('le content'), $this->empty_file, 2); } public function itThrowsExceptionIfFileDoesNotExist() { $this->expectException('Tracker_Artifact_Attachment_InvalidPathException'); $this->file_manager->appendChunk(base64_encode('le content'), $this->wrong_path_file, 1); } public function itWritesChunkOnTheDisk() { $filepath = $this->cache_dir .'/rest_attachement_temp_101_'. $this->empty_file->getTemporaryName(); $this->file_manager->appendChunk(base64_encode('le content'), $this->empty_file, 1); $this->assertEqual('le content', file_get_contents($filepath)); } public function itThrowsExceptionIfChunkIsTooBig() { $filepath = $this->cache_dir .'/rest_attachement_temp_101_'. 
$this->empty_file->getTemporaryName(); $this->expectException('Tuleap\Tracker\Artifact\Attachment\QuotaExceededException'); $this->file_manager->appendChunk(base64_encode('le too big content'), $this->empty_file, 1); $this->assertEqual('', file_get_contents($filepath)); } }
64587c09f76c954849c47e1058e7de846fadb80d
[ "PHP" ]
1
PHP
magarnier/tuleap
e84f457c13e5a5ece1a08722a2b3bad4f86b54d0
9daebd5bc479f5e392fb1a0732c29029dd0a5ab2
refs/heads/master
<repo_name>Bitlark/code_gist<file_sep>/2019-08-21 async递归/code.js let requestTimes = 1; // the recursion depends on a module-level global async function getBookingValidationResult(ctx, soaParams) { let interval = 3000; let soaRes = await SOA({ ctx, serviceCode: "12672", methodName: "BookingValidation", params: soaParams }, false); if (!soaRes || !soaRes.bookingValidationResult || !soaRes.bookingValidationResult.BookabilityInfo || !soaRes.bookingValidationResult.BookabilityInfo.BookabilityStatus || soaRes.bookingValidationResult.BookabilityInfo.BookabilityStatus === "IN_PROCESS" || soaRes.bookingValidationResult.BookabilityInfo.BookabilityStatus === "Fail") { let setTimeoutNo = setTimeout(() => { if (requestTimes <= 3) { soaRes = getBookingValidationResult(ctx, soaParams); requestTimes += 1; } else { clearTimeout(setTimeoutNo); return; // failure return: either return some value, or throw an exception } }, i === 1 ? interval : 0); // where does this i come from? } return soaRes; } // incidentally: // if an exception is thrown, the initial caller needs to catch it // call getBookingValidationResult
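The snippet above keeps its retry counter in a module-level variable, references an undefined `i` inside the `setTimeout` callback, and assigns a promise (not a result) to `soaRes` in the retry branch. A hypothetical rewrite of the same retry idea with a local attempt counter and an awaited delay — `getWithRetry`, `callService`, and `isValid` are stand-in names, not part of the gist:

```javascript
// Promisified sleep, so the retry delay can simply be awaited.
const delay = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry `callService` until `isValid` accepts the result, up to `maxAttempts`
// times, waiting `interval` ms between attempts. The counter is local, so
// concurrent calls cannot interfere with each other.
async function getWithRetry(callService, isValid, maxAttempts = 3, interval = 3000) {
  let lastResult;
  for (let attempt = 1; attempt <= maxAttempts; attempt += 1) {
    lastResult = await callService();
    if (isValid(lastResult)) return lastResult;
    if (attempt < maxAttempts) await delay(interval);
  }
  // Failure path: throwing forces the initial caller to handle it explicitly.
  throw new Error(`still not valid after ${maxAttempts} attempts`);
}
```

This also answers the last comment in the gist: with the throwing failure path, the initial caller wraps the call in try/catch and is the one place that decides what a failed validation means.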
34c460c5d92f4e65181f53228bed9a7bac33699d
[ "JavaScript" ]
1
JavaScript
Bitlark/code_gist
ae3554891db0d929c4238fe4d491361e5f62933a
adf87fc4dc4df48a1ac90877d1f1a8c648e58fb1
refs/heads/master
<file_sep><!-- Created By: <NAME> "version": "1.1.0", "description": "Modern way to optimize static HTML and Assets", "license": "ISC" --> # OPTIMIZEWEB Installation & Configuration: A modern way to optimize HTML & assets ### Uses and Benefits: - It is possible to merge multiple components into one HTML file - All JPG and PNG files are optimized; no need to upload them to TinyPNG or other websites. - Custom and third-party JavaScript is minified. - Custom and third-party CSS is minified. - Any modification to the HTML or SCSS will trigger the **watch** task and update the build folder. **Note:** Once you have completed all the steps below, please change the source file paths in gulpfile.js to match your project. Currently, they are set for the SRC folder, which is included in this repository. ### Step 1: Install NodeJS **[NodeJS](https://nodejs.org/en/)** ### Step 2: Open the command prompt and install the gulp-cli plugin **[gulp-cli](https://www.npmjs.com/package/gulp-cli)** or use directly on the CLI: **npm i gulp-cli -g** ### Step 3: Open the project in Visual Studio Code and create package.json Create a package file on the CLI: npm init ### Step 4: Install gulp plugins from the VS Code terminal **[gulp](https://www.npmjs.com/package/gulp)** or use directly on the CLI: **npm i gulp -D** ### Step 5: After completing the above steps, create a gulpfile.js: Create a gulpfile using: **echo.> gulpfile.js** It will ask for a few details like package name, version, etc. You can fill them in or just skip them using the enter key ### Step 6: Install the remaining plugins that support optimization: Install plugins from the CLI: **npm i gulp-autoprefixer gulp-html-partial gulp-line-ending-corrector gulp-sass gulp-sourcemaps gulp-uglify gulp-image gulp-uglifycss --save-dev** ### Step 7: Download the gulpfile.js from this repo and replace it in your project: Please make sure the assets folder paths for CSS, JS & images are correct.
If needed, you can change the paths in gulpfile.js ### Step 8: Now, just run the project: All steps are done. Now you can run the command **gulp** or **gulp develop**, and you can check that a new **build** folder has been created in your project with the optimized files (CSS, JS, images, etc.). #### Difference between the "**gulp**" and "**gulp develop**" CLI commands: **gulp develop**: use this when you do not need to optimize the images each time. **gulp**: optimizes the images each time. ### How to make a partial component You can include any HTML file in the parent HTML using <br> \<partial src="shared/header.html" title="Header Component"></partial> <file_sep>/*** Created By: <NAME> "version": "1.1.0", "description": "Modern way to optimize static HTML and Assets", "license": "ISC" ****/ var gulp = require('gulp'), autoprefixer = require('gulp-autoprefixer'), sass = require('gulp-sass'), sourcemaps = require('gulp-sourcemaps'), uglify = require('gulp-uglify'), uglifycss = require('gulp-uglifycss'), image = require('gulp-image'), lineec = require('gulp-line-ending-corrector'), htmlPartial = require('gulp-html-partial'); var root = './src/' var preBuild = root + 'assets/'; var htmlPath = root + '**/*.html'; var htmlPathExclude = '!' 
+ root + 'components/**/*.html'; var fontPath = preBuild + 'fonts/**/*'; var scssPath = preBuild + 'style/**/*.scss'; var cssPath = preBuild + 'style/**/*.css'; var imgPath = preBuild + 'images/**/*.+(jpg|jpeg|png)'; var mediaPath = preBuild + 'images/**/*.+(gif|mp4)'; var jsPath = preBuild + 'javascript/*'; var rootDest = 'build/' var scssDest = rootDest + 'assets/style/'; var imgDest = rootDest + 'assets/images/'; var mediaDest = rootDest + 'assets/images/'; var jsDest = rootDest + 'assets/javascript/'; var fontDest = rootDest + 'assets/fonts/'; gulp.task('html', async function () { gulp.src([htmlPath, htmlPathExclude]) .pipe(htmlPartial({ basePath: 'src/' })) .pipe(gulp.dest('build')); }); gulp.task('fonts', async function () { return gulp.src(fontPath) .pipe(gulp.dest(fontDest)); }); gulp.task('sass', async function () { return gulp.src(scssPath) .pipe(sourcemaps.init({ loadMaps: true })) .pipe(sass().on('error', sass.logError)) .pipe(autoprefixer('last 2 versions')) .pipe(sourcemaps.write()) .pipe(lineec()) .pipe(uglifycss({ "maxLineLen": 80, "uglyComments": true })) .pipe(gulp.dest(scssDest, { append: true })); }); gulp.task('css', async function () { return gulp.src(cssPath) .pipe(sourcemaps.init({ loadMaps: true })) .pipe(sass().on('error', sass.logError)) .pipe(autoprefixer('last 2 versions')) .pipe(sourcemaps.write()) .pipe(lineec()) .pipe(uglifycss({ "maxLineLen": 80, "uglyComments": true })) .pipe(gulp.dest(scssDest, { append: true })); }); gulp.task('imageOptimize', async function () { return gulp.src(imgPath) .pipe(image()) .pipe(gulp.dest(imgDest, { append: true })) }); gulp.task('mediaFiles', async function () { return gulp.src(mediaPath) .pipe(gulp.dest(mediaDest, { append: true })); }); gulp.task('uglifyJS', async function () { return gulp.src(jsPath) .pipe(uglify()) .pipe(gulp.dest(jsDest, { append: true })) }); gulp.task('watch', async function () { gulp.watch('src/**/*.html', gulp.series('html')); gulp.watch(fontPath, gulp.series('fonts')); 
gulp.watch(scssPath, gulp.series('sass')); gulp.watch(cssPath, gulp.series('css')); gulp.watch(imgPath, gulp.series('imageOptimize')); gulp.watch(mediaPath, gulp.series('mediaFiles')); gulp.watch(jsPath, gulp.series('uglifyJS')); }) gulp.task('develop', gulp.series('html', 'fonts', 'sass', 'css', 'mediaFiles', 'uglifyJS', 'watch')); gulp.task('default', gulp.series('develop', 'imageOptimize', 'watch'));
9439c9abe97151294ce5d8a1d03002fea02e2bb4
[ "Markdown", "JavaScript" ]
2
Markdown
suyogn/OPTIMIZEWEB
6357b9aa6d03174cfe79e0d8b4ea83904d65dab3
4f74440e731d2ce62ea42c7336c680c2440be35a
refs/heads/master
<repo_name>VituBR19/react-drag-and-drop<file_sep>/README.md # react-drag-and-drop This project is a study: a React todo list using a drag-and-drop feature. # How to install Just run npm install in your terminal to install all modules. # How to use All you have to do is create a new card by clicking the '+' button, then write your todo item and click 'Create'. - You can sort your list by drag and drop - You can delete a card if you want, or when you have finished its item - If you double-click a card's text you can edit it; press `Enter` to apply # Storage The storage was made using just the browser's Local Storage <file_sep>/src/components/Board/index.js import React, { useState, useEffect } from 'react'; import './index.css'; import { DragDropContext, Droppable } from 'react-beautiful-dnd'; import { getListStyle } from '../../styles'; import { getLocalData, setLocalData } from '../../LocalData'; import Card from '../Card'; function Board(props) { const { boardName } = props; const [itens, setItens] = useState(getLocalData() ? 
getLocalData() : []); const [activeAdd, setActiveAdd] = useState('hidden'); const [textContent, setTextContent] = useState(''); function dragEnd(result, itens, setItens) { if(!result.destination) return; const { source, destination } = result; const copiedItems = itens; const [removed] = copiedItems.splice(source.index, 1); copiedItems.splice(destination.index, 0, removed); setItens(copiedItems); setLocalData(itens); } function addTask() { setActiveAdd( 'show' ); } function handleCloseModal(e) { if(e.keyCode === 27) { setActiveAdd('hidden'); setTextContent(''); return; } if(e.target.className.startsWith('text')) { setActiveAdd('hidden'); setTextContent(''); return; } } function handleTaskcontent(e) { setTextContent( e.target.value ); } function handleNewTask() { setActiveAdd( 'hidden' ); if(textContent) { const newItem = [...itens, { id:`${Date.now()}`, content: `${textContent}` }]; setLocalData(newItem); const savedNewItem = getLocalData(); setItens(savedNewItem); } setTextContent(''); } return ( <div className="board"> <DragDropContext onDragEnd={result => dragEnd(result, itens, setItens)}> <span className="board-title" >{`${boardName}`}</span> <button className="add" onClick={addTask}>+</button> {/* area to add some text to a card */} <div onClick={handleCloseModal} className={`text-card ${activeAdd}`}> <div className="text-area-container"> <textarea onChange={handleTaskcontent} onKeyDown={handleCloseModal} value={textContent} className="add-text"></textarea> <button onClick={handleNewTask} className="apply-text">Create</button> </div> </div> <Droppable droppableId='droppable'> {(provided, snapshot) => ( <div {...provided.droppableProps} ref={provided.innerRef} style={getListStyle(snapshot.isDraggingOver)} > { itens.map((item, index) => ( <Card item={item} setItens={setItens} key={index} index={index} /> )) } {provided.placeholder} </div> )} </Droppable> </DragDropContext> </div> ) } export default Board; <file_sep>/src/components/Card/index.js import React, { 
useState } from 'react'; import './index.css'; import { Draggable } from 'react-beautiful-dnd'; import { getItemStyle } from '../../styles'; import { getLocalData, setLocalData } from '../../LocalData'; function Card(props) { const {item, index, setItens} = props; const [activeInput, setActiveInput] = useState('hidden'); const [activeText, setActiveText] = useState('show'); const [ contentText, setContentText] = useState(''); function handleDelete(e) { const allItems = getLocalData(); const itemIndexToRemove = e.target.id; allItems.splice(itemIndexToRemove, 1); setLocalData(allItems); setItens(allItems); } function handleDoubleClick() { setContentText(item.content); setActiveInput('show'); setActiveText('hidden'); } function handleChange(e) { setContentText( e.target.value ); } function handleKeyDown(e) { if(e.keyCode === 27) { setActiveInput('hidden'); setActiveText('show'); return; } if(e.keyCode !== 13) return; const allItems = getLocalData(); const foundIndex = allItems.findIndex(i => i.id === item.id); allItems[foundIndex].content = contentText; setItens(allItems); setLocalData(allItems); setActiveInput('hidden'); setActiveText('show'); } return ( <> <Draggable className='card' key={item.id} draggableId={item.id} index={index}> { (provided, snapshot) => ( <div ref={provided.innerRef} {...provided.draggableProps} {...provided.dragHandleProps} style={getItemStyle(snapshot.isDragging, provided.draggableProps.style)} > <div className='content-container'> <textarea type="text" col="25" onKeyDown={handleKeyDown} onChange={handleChange} className="edit-text" value={contentText} id={activeInput} autoFocus/> <span onDoubleClick={handleDoubleClick} id={item.id} className={`item-text ${activeText}`}> {item.content} </span> <div className="actions"> {/* delete */} <a id={index} onClick={handleDelete}> x </a> </div> </div> </div> ) } </Draggable> </> ) } export default Card; <file_sep>/src/LocalData.js const getLocalData = () => { return 
JSON.parse(localStorage.getItem('items')); }; const setLocalData = (item) => { localStorage.setItem('items', JSON.stringify(item)); }; module.exports = { getLocalData, setLocalData };
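One note on the storage module above: `getLocalData` returns `null` until something has been saved, which is why `Board` guards its initial state with a ternary. A small hypothetical sketch of a fallback-to-empty-array variant — the in-memory `storage` object is a stand-in for the browser's `localStorage`, so the sketch runs outside a browser:

```javascript
// Stand-in for window.localStorage (same getItem/setItem shape).
const storage = (() => {
  const data = {};
  return {
    getItem: (k) => (k in data ? data[k] : null),
    setItem: (k, v) => { data[k] = String(v); },
  };
})();

// JSON.parse(null) yields null, so `|| []` turns the "nothing saved yet"
// case into an empty array and callers no longer need their own null check.
const getLocalData = () => JSON.parse(storage.getItem('items')) || [];
const setLocalData = (items) => storage.setItem('items', JSON.stringify(items));

// Usage: before anything is saved, callers get [] instead of null.
console.log(getLocalData()); // []
setLocalData([{ id: '1', content: 'demo' }]);
console.log(getLocalData()[0].content); // demo
```

With this shape, `useState(getLocalData())` would be enough on the consumer side; the ternary guard becomes redundant.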
5f8e7476d2c90a6c7c1d0fe24dccd2f253f84ac4
[ "Markdown", "JavaScript" ]
4
Markdown
VituBR19/react-drag-and-drop
5b61caa72311e037f57b30cfa04078713a39517b
36d63a0c36281b72b1f8d1fe4f072fcd88efcf92
refs/heads/master
<file_sep>library(highcharter) library(ggplot2) # some code ggplot(mtcars) + geom_point(aes(mpg, cyl)) mtcars %>% View()
32d97c8fbd98112f7aa0793a6c639b723c4d69f5
[ "R" ]
1
R
testhtatthat/Heights
993ab83130cdfea00a46e186d7814091c2c393ec
48b2d324b1224481f7db89ba32635a14154cb652
refs/heads/master
<file_sep>#include "node.h" #include "list.h" #include "studente.h" #include <iostream> using namespace std; int main() { List<int> list; for(int i=1; i<=5; i++) list.insertInOrder(i, List<int>::DECRESC); // for(int i=5; i>=0; i--) // list.insertInOrder(i, List<int>::CRESC); // list.insertInOrder(6, List<int>::CRESC); //1 2 3 4 5 6 // cout<<list; // list.insertInOrder(3, List<int>::DECRESC); //3 1 2 3 4 5 6 // list.insertInOrder(5, List<int>::DECRESC); //5 3 1 2 3 4 5 6 // list.insertInOrder(4, List<int>::DECRESC); //5 3 4 1 2 3 4 5 6 // list.insertInOrder(6, List<int>::CRESC); //5 3 4 1 2 3 4 5 6 6 // cout<<list; //0 decresc //5 4 3 2 1 } <file_sep>#include "moto.h" #include "automobile.h" #include "gara.h" #include<iostream> #include<cstdlib> #include<ctime> int main() { int step=10; int max_partecipanti = 5; // parameters: STEP - PARTICIPANTS Gara gara(step, max_partecipanti); // RANDOMLY ADD CARS/MOTORCYCLES srand(time(0)); for(int i=0; i<max_partecipanti; i++){ Veicolo *v; if( rand()%100<50) // 50% chance of a motorcycle v = new Moto(120, 5000, "Ducati", "Motorizzazione Delca"); else v = new Auto(180, 10000, "Lamborghini", "Motorizzazione Dellami"); gara.aggiungiPartecipante(*v); } gara.partenza(); }
7b3e0f0dda0154c5c46ff15798027ebe9823718b
[ "C++" ]
2
C++
LucaStrano/Programmazione_II
82c51fb90052bf28fea04031597d090e82fafec2
d5adf9a48ae81e8366d548f576ee4da1c09d12d2
refs/heads/master
<repo_name>forwardToday/goweb<file_sep>/minus/router.go package minus import ( "fmt" "net/http" "reflect" "strings" ) type controllerInfo struct { // regex *regexp.Regexp // params map[int]string controllerType reflect.Type } type ControllerRegistor struct { routers []*controllerInfo App *App } func NewControllerRegistor() *ControllerRegistor { return &ControllerRegistor{} } func (p *ControllerRegistor) Add(pattern string, c ControllerInterface) { t := reflect.Indirect(reflect.ValueOf(c)).Type() route := &controllerInfo{} route.controllerType = t p.routers = append(p.routers, route) } func (app *App) SetStaticPath(url string, path string) *App { Mapp.StaticDirs[url] = path return app } func (p *ControllerRegistor) ServeHTTP(w http.ResponseWriter, r *http.Request) { fmt.Printf("ServeHTTP: %v", p.routers) fmt.Printf("ServeHTTP: %s", r.Method) var started bool for prefix, staticDir := range Mapp.StaticDirs { if strings.HasPrefix(r.URL.Path, prefix) { file := staticDir + r.URL.Path[len(prefix):] http.ServeFile(w, r, file) started = true return } } // requestPath := r.URL.Path //find a matching Route for _, route := range p.routers { //Invoke the request handler vc := reflect.New(route.controllerType) init := vc.MethodByName("Init") in := make([]reflect.Value, 2) ct := &Context{ResponseWriter: w, Request: r} in[0] = reflect.ValueOf(ct) in[1] = reflect.ValueOf(route.controllerType.Name()) init.Call(in) in = make([]reflect.Value, 0) method := vc.MethodByName("Prepare") method.Call(in) if r.Method == "GET" { fmt.Printf("GET ==============") method = vc.MethodByName("Get") method.Call(in) } else if r.Method == "POST" { method = vc.MethodByName("Post") method.Call(in) } else if r.Method == "HEAD" { method = vc.MethodByName("Head") method.Call(in) } else if r.Method == "DELETE" { method = vc.MethodByName("Delete") method.Call(in) } else if r.Method == "PUT" { method = vc.MethodByName("Put") method.Call(in) } else if r.Method == "PATCH" { method = vc.MethodByName("Patch") 
method.Call(in) } else if r.Method == "OPTIONS" { method = vc.MethodByName("Options") method.Call(in) } // if AutoRender { // method = vc.MethodByName("Render") // method.Call(in) // } method = vc.MethodByName("Finish") method.Call(in) started = true break } //if no matches to url, throw a not found exception if started == false { http.NotFound(w, r) } } <file_sep>/router/router.go package router import ( "minusblog/controllers" "minusblog/minus" ) func init() { minus.Mapp.Handlers.Add("/", &controllers.MinusController{}) } <file_sep>/minus/context.go package minus import ( "net/http" "net/url" ) type Context struct { ResponseWriter http.ResponseWriter Request *http.Request Multipart bool Form url.Values } <file_sep>/controllers/minusController.go package controllers import ( "html/template" "minusblog/minus" ) type MinusController struct { minus.Controller } func (c *MinusController) Get() { t, _ := template.ParseFiles("./view/minus.gtpl") t.Execute(c.Ct.ResponseWriter, nil) // fmt.Fprintf(c.Ct.ResponseWriter, "召唤神龙") } <file_sep>/minus/config.go package minus import ( "time" ) type Config struct { HttpAddr string HttpPort int TemplatePath string RecoverPanic bool RunMode int8 //0=prod,1=dev UseFcgi bool ReadTimeout time.Duration // maximum duration before timing out read of the request, 默认:5*time.Second(5秒超时) WriteTimeout time.Duration // maximum duration before timing out write of the response, 默认:0(不超时) } const ( RunModeProd int8 = 0 RunModeDev int8 = 1 ) <file_sep>/minus/minus.go package minus import ( "fmt" "log" "net" "net/http" "net/http/fcgi" "os" "time" ) const ( Version = "1.0.0" ) var ( Mapp *App AppPath string ) func init() { Mapp = NewApp(nil) AppPath, _ = os.Getwd() } type App struct { Handlers *ControllerRegistor config *Config StaticDirs map[string]string // TemplateRegister *TemplaterRegister } func NewApp(config *Config) *App { cr := NewControllerRegistor() app := &App{ Handlers: cr, config: config, StaticDirs: make(map[string]string), // 
TemplateRegistor: NewTemplateRegistor(), } cr.App = app return app } func (app *App) Run() { if app.config.HttpAddr == "" { app.config.HttpAddr = "192.168.127.12" } addr := fmt.Sprintf("%s:%d", app.config.HttpAddr, app.config.HttpPort) var err error // err = httpListenAndServe(addr, app.Handlers, app.config.ReadTimeout, app.config.WriteTimeout) for { if app.config.UseFcgi { l, e := net.Listen("tcp", addr) if e != nil { log.Print("Listen: ", e) } //log.Print("UseFcgi, fcgi.Serve") err = fcgi.Serve(l, app.Handlers) } else { //log.Print("http.ListenAndServe") //err = http.ListenAndServe(addr, app.Handlers) err = httpListenAndServe(addr, app.Handlers, app.config.ReadTimeout, app.config.WriteTimeout) } if err != nil { log.Print("ListenAndServe: ", err) //panic(err) } time.Sleep(time.Second * 2) } } func httpListenAndServe(addr string, handler http.Handler, readTimeout time.Duration, writeTimeout time.Duration) error { if readTimeout == 0 { readTimeout = 5 * time.Second } server := &http.Server{ Addr: addr, Handler: handler, ReadTimeout: readTimeout, WriteTimeout: writeTimeout, } return server.ListenAndServe() } func Run(config *Config) { Mapp.config = config Mapp.Run() } <file_sep>/main.go package main import ( "fmt" "minusblog/minus" _ "minusblog/router" "time" ) func main() { fmt.Print("main") config := &minus.Config{ HttpAddr: "localhost", HttpPort: 9090, ReadTimeout: 10 * time.Second, WriteTimeout: 10 * time.Second, } minus.Run(config) } <file_sep>/minus/template.go package minus import () // type TemplaterRegister struct { // }
0c2ca6465629244b1201f8f4266515968f95724f
[ "Go" ]
8
Go
forwardToday/goweb
79ea191d8b881bcda2230103cf57089cacf93cb3
a73bfcb1d374e0c4f93caf2d44e85bc83b54f7a7
refs/heads/master
<repo_name>Julien-Mialon/ConcoursIUT2018<file_sep>/env/Makefile default: @echo "Try make build, make build-nocache or make push." build: docker build -t mpoquet/bashbot:latest \ -t mpoquet/bashbot:$$(date --iso-8601) \ . build-nocache: docker build -t mpoquet/bashbot:latest \ -t mpoquet/bashbot:$$(date --iso-8601) \ --no-cache \ . push: docker push mpoquet/bashbot <file_sep>/bot/sample.sh #!/bin/bash arr=(0 1 2 3 4 5 6) echo ${arr[0]} echo ${arr[1]} arr[12]=42 arr[1]=8 echo ${arr[1]} echo ${arr[12]} echo "#"${arr[91]}"#"<file_sep>/env/Dockerfile # Base image with Nix installed: https://github.com/LnL7/nix-docker FROM lnl7/nix:1.11.16 # Survival kit RUN nix-env -i git gnutar gzip # Bash fun RUN nix-env -i coreutils binutils findutils gnused gawk netcat-gnu jq # Retrieve code RUN git clone https://github.com/Julien-Mialon/ConcoursIUT2018.git /bot <file_sep>/bot/run_docker.sh #!/usr/bin/env bash usage="run_docker.sh [host] [port]" docker run -w /bot/bot \ -P \ -ti mpoquet/bashbot \ /bot/bot/run.sh $@ <file_sep>/bot/run.sh #!/usr/bin/env bash # Argument parsing usage="run.sh [host] [port]" case "$#" in 0) host='localhost' port='8889' ;; 1) host="$1" port='8889' ;; 2) host="$1" port="$2" ;; esac echo "host=${host}" echo "port=${port}" # Bot source ./bot.sh exec 3<>/dev/tcp/${host}/${port} echo -e '{"nickname": "BashBinder"}' >&3 currentType="init" while true; do line=$(head -n1 <&3) run "$line" $currentType case $currentType in init) currentType="map" ;; map) currentType="turn" ;; turn) #echo "result:" $RESULT_IA echo -e $RESULT_IA >&3 ;; esac done <file_sep>/README.md # ConcoursIUT2018 IUT contest 2018 <file_sep>/bot/bot.sh #!/bin/bash #set -eux #set -ux declare -A mapArray declare -A playerPositions declare -A playerDirections declare -A playerScores declare -A projectilesPositions declare -A projectilesDirections declare -A mapFlower myProjectile=0 myIdPlayer=0 # hard-coded directions. 
Convention: XOFFSETzYOFFSET LEFT="-1z0" TOP="0z-1" RIGHT="1z0" BOTTOM="0z1" DONOTMOVE="0z0" # hard-coded rotations of positions declare -A rotateClockwise rotateClockwise[${LEFT}]=${TOP} rotateClockwise[${TOP}]=${RIGHT} rotateClockwise[${RIGHT}]=${BOTTOM} rotateClockwise[${BOTTOM}]=${LEFT} declare -A rotateCounterclockwise rotateCounterclockwise[${LEFT}]=${BOTTOM} rotateCounterclockwise[${BOTTOM}]=${RIGHT} rotateCounterclockwise[${RIGHT}]=${TOP} rotateCounterclockwise[${TOP}]=${LEFT} declare -A movementsMap # DIRECTIONyWANTEDMOVEMENT » action movementsMap[${LEFT}y${LEFT}]='move' movementsMap[${LEFT}y${TOP}]='hrotate' movementsMap[${LEFT}y${BOTTOM}]='trotate' movementsMap[${LEFT}y${RIGHT}]='trotate' movementsMap[${TOP}y${LEFT}]='trotate' movementsMap[${TOP}y${TOP}]='move' movementsMap[${TOP}y${BOTTOM}]='trotate' movementsMap[${TOP}y${RIGHT}]='hrotate' movementsMap[${BOTTOM}y${LEFT}]='hrotate' movementsMap[${BOTTOM}y${TOP}]='trotate' movementsMap[${BOTTOM}y${BOTTOM}]='move' movementsMap[${BOTTOM}y${RIGHT}]='trotate' movementsMap[${RIGHT}y${LEFT}]='trotate' movementsMap[${RIGHT}y${TOP}]='trotate' movementsMap[${RIGHT}y${BOTTOM}]='hrotate' movementsMap[${RIGHT}y${RIGHT}]='move' # Determines in which direction we want to go # - IGNORES DIRECTION ! 
# - the two cells must be next to each other function findDirection { currentX=${1} currentY=${2} wantedX=${3} wantedY=${4} diffx=$((wantedX - currentX)) diffy=$((wantedY - currentY)) funResult_findDirection="${diffx}z${diffy}" } # Determines which action should be done to go in a direction, taking into # account current direction function findMovementAction { currentDirection=${1} wantedMovement=${2} if [ "${wantedMovement}" = "${DONOTMOVE}" ]; then funResult_findMovementAction='' return 0 else funResult_findMovementAction=${movementsMap[${currentDirection}y${wantedMovement}]} fi } function handleInit { echo "Handle init" json=$1 idJoueur=$(echo $json | jq .idJoueur) myIdPlayer=$idJoueur echo $idJoueur } function handleMap { echo "Handle map" json=$1 idJoueur=$(echo $json | jq .idJoueur) joueurs=$(echo $json | jq .joueurs) echo "id:" $idJoueur echo "joueurs:" $joueurs for mapJson in $(echo $json | jq -c ".map[]") ; do #echo "line:" $mapJson point=$(echo $mapJson | jq .points) cassable=$(echo $mapJson | jq .cassable) posX=$(echo $mapJson | jq .pos[0]) posY=$(echo $mapJson | jq .pos[1]) type="none" if [ $point != "null" ] ; then type="p" #bonus without point ($point) fi if [ $cassable = "false" ] ; then type="B" #mur non cassable fi if [ $cassable = "true" ] ; then type="x" #mur cassable fi key=$(echo $posX"z"$posY) mapArray["$key"]=$type done for playerJson in $(echo $json | jq -c ".joueurs[]") ; do playerId=$(echo $playerJson | jq .id) playerX=$(echo $playerJson | jq .position[0]) playerY=$(echo $playerJson | jq .position[1]) playerDirectionX=$(echo $playerJson | jq .direction[0]) playerDirectionY=$(echo $playerJson | jq .direction[1]) playerScore=$(echo $playerJson | jq .score) playerPositions[$playerId]=$(echo $playerX" "$playerY) playerDirections[$playerId]=$(echo $playerDirectionX" "$playerDirectionY) playerScores[$playerId]=$playerScore key=$(echo $playerX"z"$playerY) mapArray[$key]=$playerId done } function updatePlayerPosition { id=$1 currentX=$2 
currentY=$3 dirX=$4 dirY=$5 newX=$(($currentX + $dirX)) newY=$(($currentY + $dirY)) currentPosKey=$(echo $currentX"z"$currentY) mapArray[$currentPosKey]="" playerPositions[$id]=$(echo $newX $newY) newPosKey=$(echo $newX"z"$newY) mapArray[$newPosKey]=$id echo "Move to: " $newX $newY } function rotatePlayer { id=$1 dirX=$2 dirY=$3 playerDirections[$id]=$(echo $dirX $dirY) echo "Rotate player: " $id " to " ${playerDirections[$id]} } function removeBonus { posX=$1 posY=$2 posKey=$(echo $posX"z"$posY) value=${mapArray[$posKey]} if [ $value = "p" ]; then mapArray[$posKey]="" fi echo "remove bonus: " $posX $posY } function newShoot { id=$1 posX=$2 posY=$3 dirX=$4 dirY=$5 projectilesPositions[$id]=$(echo $posX $posY) projectilesDirections[$id]=$(echo $dirX $dirY) echo "new shoot: " $id $posX $posY $dirX $dirY } function respawnPlayer { id=$1 currentX=$2 currentY=$3 newX=$4 newY=$5 currentPosKey=$(echo $currentX"z"$currentY) mapArray[$currentPosKey]="" playerPositions[$id]=$(echo $newX $newY) newPosKey=$(echo $newX"z"$newY) mapArray[$newPosKey]=$id echo "Respawn" $id " from: " $currentX $currentY "to: " $newX $newY } function moveShoot { id=$1 posX=$2 posY=$3 projectilesPositions[$id]=$(echo $posX $posY) echo "move shoot: " $id $posX $posY } function explodeShoot { id=$1 posX=$2 posY=$3 unset projectilesPositions[$id] echo "explode shoot: " $id $posX $posY } function removeWall { posX=$1 posY=$2 key=$(echo $posX"z"$posY) value=${mapArray[$key]} if [ $value = "x" ]; then echo "wall removed" mapArray[$key]="" fi echo "remove wall: " $posX $posY "(" $value ")" } function flowApplyOnCase { incX=$1 incY=$2 newX=$(($3 + $incX)) newY=$(($4 + $incY)) key=$(echo $newX"z"$newY) flowMapValue=${mapArray[$key]} mapFlower[$key]=$5 if [ -z $flowMapValue ]; then flowArrayPositions[$writeIndex]=$newX flowArrayPositions[$writeNextIndex]=$newY writeIndex=$(($writeIndex + 1)) writeNextIndex=$(($writeNextIndex + 1)) elif [ $flowMapValue = "p" ]; then funResult_flowNow=$(echo $newX $newY) elif [ 
$flowMapValue = "x" ]; then flowArrayPositions[$writeIndex]=$newX flowArrayPositions[$writeNextIndex]=$newY writeIndex=$(($writeIndex + 1)) writeNextIndex=$(($writeNextIndex + 1)) fi } # $1 = x ; $2 = y function flowBackTracking { posX=$1 posY=$2 key=$(echo $posX"z"$posY) curDepth=${mapFlower[$key]} echo "Pos0:" $posX $posY $curDepth while [ $curDepth -gt 1 ] do echo "Pos:" $posX $posY $curDepth nextDepth=$(($curDepth - 1)) incX=0 incY=1 newX=$(($incX + $posX)) newY=$(($incY + $posY)) key=$(echo $newX"z"$newY) curDepth=${mapFlower[$key]} if [ -n "$curDepth" ]; then if [ $curDepth -eq $nextDepth ]; then posX=$newX posY=$newY continue fi fi incX=0 incY=-1 newX=$(($incX + $posX)) newY=$(($incY + $posY)) key=$(echo $newX"z"$newY) curDepth=${mapFlower[$key]} if [ -n "$curDepth" ]; then if [ $curDepth -eq $nextDepth ]; then posX=$newX posY=$newY continue fi fi incX=1 incY=0 newX=$(($incX + $posX)) newY=$(($incY + $posY)) key=$(echo $newX"z"$newY) curDepth=${mapFlower[$key]} if [ -n "$curDepth" ]; then if [ $curDepth -eq $nextDepth ]; then posX=$newX posY=$newY continue fi fi incX=-1 incY=0 newX=$(($incX + $posX)) newY=$(($incY + $posY)) key=$(echo $newX"z"$newY) curDepth=${mapFlower[$key]} if [ -n "$curDepth" ]; then if [ $curDepth -eq $nextDepth ]; then posX=$newX posY=$newY continue fi fi done if [ $curDepth -eq 1 ]; then funResult_flowBackTracking=$(echo $posX $posY) return 0 fi } # $1 = x ; $2 = y ; $3 = dirX ; $4 = dirY function flowNow { flowArrayPositions=($1 $2) index=0 nextIndex=1 writeIndex=2 writeNextIndex=3 flowerKey=$(echo $1"z"$2) mapFlower[$flowerKey]=0 while true do posX=${flowArrayPositions[$index]} posY=${flowArrayPositions[$nextIndex]} if [ -z $posX ]; then break fi flowerKey=$(echo $posX"z"$posY) curDepth=${mapFlower[$flowerKey]} nextDepth=$(($curDepth + 1)) if [ $nextDepth -gt 7 ]; then break fi flowApplyOnCase 1 0 $posX $posY $nextDepth if [ -n "$funResult_flowNow" ]; then break; fi flowApplyOnCase -1 0 $posX $posY $nextDepth if [ -n "$funResult_flowNow" ]; then break; fi 
flowApplyOnCase 0 1 $posX $posY $nextDepth if [ -n "$funResult_flowNow" ]; then break; fi flowApplyOnCase 0 -1 $posX $posY $nextDepth if [ -n "$funResult_flowNow" ]; then break; fi index=$(($index + 1)) nextIndex=$(($nextIndex + 1)) done if [ -n "$funResult_flowNow" ]; then flowBackTracking $funResult_flowNow if [ -n "$funResult_flowBackTracking" ]; then findDirection ${playerPositions[$myIdPlayer]} $funResult_flowBackTracking findMovementAction $(echo $3"z"$4) $funResult_findDirection fi fi } function findBestDirection { unset mapFlower funResult_flowNow="" funResult_flowBackTracking="" funResult_findDirection="" funResult_findMovementAction="" declare -A mapFlower depth=8 flowNow ${playerPositions[$myIdPlayer]} ${playerDirections[$myIdPlayer]} if [ -n "$funResult_flowNow" ]; then echo "Find something: " $funResult_flowNow fi } #$1 = x ; $2 = y ; $3 = dirX ; $4 = dirY function canMoveForward { newX=$(($1 + $3)) newY=$(($2 + $4)) currentPosKey=$(echo $newX"z"$newY) value=${mapArray[$currentPosKey]} echo "plop: #"$value"#" funResult_canMoveForward="false" funResult_shootFirst="false" if [ -z $value ]; then funResult_canMoveForward="true" elif [ $value = "p" ]; then funResult_canMoveForward="true" elif [ $value = "B" ]; then funResult_canMoveForward="false" else funResult_canMoveForward="true" funResult_shootFirst="true" #elif [ $value = "x" ]; then || value = player fi } function updateLine { #echo "line:arg1: #"$1"#" #echo "line:arg2: #"$2"#" case $1 in "\"joueur\"") case $2 in "\"move\"") updatePlayerPosition $3 ${playerPositions[$3]} ${playerDirections[$3]} ;; "\"rotate\"") rotatePlayer $3 $(echo $5 | cut -d"," -f1) $6 ;; "\"recupere_bonus\"") removeBonus $(echo $5 | cut -d"," -f1) $6 ;; "\"shoot\"") newShoot $4 $(echo $6 | cut -d"," -f1) $7 $(echo ${10} | cut -d"," -f1) ${11} if [ $3 = $myIdPlayer ]; then myProjectile=0 fi ;; "\"respawn\"") respawnPlayer $3 ${playerPositions[$3]} $(echo $5 | cut -d"," -f1) $6 ;; esac ;; "\"projectile\"") case $2 in "\"move\"") 
moveShoot $3 $(echo $5 | cut -d"," -f1) $6 ;; "\"explode\"") explodeShoot $3 $(echo $5 | cut -d"," -f1) $6 if [ $9 = "[" ]; then removeWall $(echo ${10} | cut -d"," -f1) ${11} fi ;; esac ;; esac } function handleTurn { echo "Handle turn" json=$1 echo $json for itemJson in $(echo $json | jq -c ".[]") ; do line=$(echo $itemJson | jq ".[]") updateLine $line done findBestDirection if [ -z $funResult_findMovementAction ]; then canMoveForward ${playerPositions[$myIdPlayer]} ${playerDirections[$myIdPlayer]} if [ $funResult_canMoveForward = "true" ]; then if [ $funResult_shootFirst = "true" ]; then RESULT_IA='["shoot", "move"]' else RESULT_IA='["move", "shoot"]' fi else if [ $funResult_shootFirst = "true" ]; then RESULT_IA='["shoot", "hrotate"]' else RESULT_IA='["hrotate", "shoot"]' fi fi else RESULT_IA='["shoot", "'$funResult_findMovementAction'"]' fi } # $1 : line json $2 : enum (init, map, turn) function run { json=$1 if [ $2 = "init" ]; then handleInit "$json" fi if [ $2 = "map" ]; then handleMap "$json" fi if [ $2 = "turn" ]; then handleTurn "$json" fi } <file_sep>/env/README.md Build ===== To use docker cache, simply run ``make build``. To disable docker cache: ``make build-nocache``. Push to Docker Hub ================== ``` bash make push ```
9e4cfc86bea0c0b61b38654c99ab7029357d791a
[ "Markdown", "Makefile", "Dockerfile", "Shell" ]
8
Makefile
Julien-Mialon/ConcoursIUT2018
12ab7f00c242796325a833aa6050e447ff5052c9
0ec7a50df60b3643437dda4bec6e5a755e046a15
refs/heads/master
<file_sep>package net.prosavage.baseplugin.strings; /** * A utility class for placeholder management */ public class Placeholder { private String key; private String value; /** * Create a placeholder. * * @param key - The text to replace. * @param value - The text will change to this. */ public Placeholder(String key, String value) { this.key = "{" + key + "}"; this.value = value; } /** * Processes placeholders in a line. * * @param line - the line to process. * @return - A line with all the placeholders replaced. */ public String process(String line) { return line.replace(key, value); } } <file_sep>package net.prosavage.baseplugin.strings; import org.bukkit.ChatColor; import java.util.Arrays; import java.util.Collections; import java.util.List; import java.util.stream.Collectors; public class StringProcessor { /** * Colors a string. * * @param string - string to color. * @return - returns a colored string. */ public static String color(String string) { return ChatColor.translateAlternateColorCodes('&', string); } /** * Colors a list of strings. * * @param strings - list of strings to color. * @return - returns a colored list of strings. */ public static List<String> color(List<String> strings) { return strings.stream().map(StringProcessor::color).collect(Collectors.toList()); } /** * Processes Placeholder Objects in a string. * * @param line - line to parse placeholders for. * @param placeholders - Placeholder objects to use in the parsing. * @return - returns the parsed line. */ public static String processMultiplePlaceholders(String line, Placeholder... placeholders) { for (Placeholder placeholder : placeholders) { line = color(placeholder.process(line)); } return line; } /** * Processes Placeholder Objects in a string. * * @param lines - lines to parse placeholders for. * @param placeholders - Placeholder objects to use in the parsing. * @return - returns the parsed lines. 
*/ public static List<String> processMultiplePlaceholders(List<String> lines, Placeholder... placeholders) { // strings are immutable, so write the processed line back into the list for (int i = 0; i < lines.size(); i++) { lines.set(i, processMultiplePlaceholders(lines.get(i), placeholders)); } return lines; } } <file_sep>package net.prosavage.baseplugin.serializer; import net.prosavage.baseplugin.SavagePlugin; public class Serializer { /** * Saves your class to a .json file. */ public void save(Object instance) { SavagePlugin.getPersist().save(instance); } /** * Loads your class from a .json file. */ public <T> T load(T def, Class<T> clazz, String name) { return SavagePlugin.getPersist().loadOrSaveDefault(def, clazz, name); } }
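The `Placeholder` class above wraps a key in braces at construction time and substitutes it with `String.replace`. A minimal self-contained sketch of that substitution flow (the inner `Placeholder` here re-declares the same logic so the demo runs without the Bukkit/baseplugin classpath; player name and score values are made up for illustration):

```java
// Standalone demo of the brace-wrapped placeholder substitution shown above.
// The nested Placeholder mirrors net.prosavage.baseplugin.strings.Placeholder,
// minus the Bukkit color handling.
public class PlaceholderDemo {
    static class Placeholder {
        private final String key;
        private final String value;

        Placeholder(String key, String value) {
            this.key = "{" + key + "}"; // key is wrapped in braces, as in the class above
            this.value = value;
        }

        String process(String line) {
            return line.replace(key, value);
        }
    }

    public static void main(String[] args) {
        Placeholder name = new Placeholder("player", "Steve"); // hypothetical values
        Placeholder score = new Placeholder("score", "42");
        String line = "Well played, {player}! Your score is {score}.";
        // Apply each placeholder in turn, like processMultiplePlaceholders does.
        for (Placeholder p : new Placeholder[]{name, score}) {
            line = p.process(line);
        }
        System.out.println(line); // Well played, Steve! Your score is 42.
    }
}
```

Because `process` returns a new string, the caller must reassign the result on each pass — the same property that makes the list overload write each processed line back with `set`.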
83a23461f8cee6287698f69f1187a8fad89ae480
[ "Java" ]
3
Java
Tominous/SavageFramework
e91e3b6edfb8e5258fb1d7b5b464911d6685dd9a
b660fd924aef6d8c18f7661b15bd0bf747388191
refs/heads/master
<repo_name>Sergey73/react<file_sep>/app/src/js/index.js var data = [ {author: "<NAME>", text: "This is one comment"}, {author: "<NAME>", text: "This is *another* comment"} ]; var a = 0; /*setInterval(function () { text = data[a].author console.log(a); console.log(text); }, 2000);*/ function getData () { var res = []; a ? a-- : a++; res.push(data[a]); return res; } var CommentBox = React.createClass({ loadCommentsFromServer: function() { this.setState({data: getData()}); }, handleCommentSubmit: function(comment) { // TODO: submit to the server and refresh the list console.dir(comment); this.setState({data: [comment]}); }, getInitialState: function () { return {data: []}; }, componentDidMount: function() { this.loadCommentsFromServer(); setInterval(this.loadCommentsFromServer, this.props.pollInterval); }, render: function() { return ( <div className="commentBox"> <h1>Comments</h1> <CommentList data={this.state.data} /> <CommentForm onCommentSubmit={this.handleCommentSubmit} /> </div> ); } }); var CommentList = React.createClass({ render: function() { var commentNodes = this.props.data.map(function (comment) { return ( <Comment author = {comment.author}> {comment.text} </Comment> ); }); return ( <div className="commentList"> {commentNodes} </div> ); } }); var Comment = React.createClass({ rawMarkup: function() { var rawMarkup = marked(this.props.children.toString(), {sanitize: true}); return { __html: rawMarkup }; }, render: function() { return ( <div className="comment"> <h2 className="commentAuthor"> {this.props.author} </h2> <span dangerouslySetInnerHTML={this.rawMarkup()} /> </div> ); } }); var CommentForm = React.createClass({ handleSubmit: function (e) { e.preventDefault(); var author = this.refs.author.value.trim(); var text = this.refs.text.value.trim(); if (!text || !author) { return; } this.props.onCommentSubmit({author: author, text: text}); this.refs.author.value = ''; this.refs.text.value = ''; }, render: function() { return ( <form 
className="commentForm" onSubmit={this.handleSubmit}> <input type="text" placeholder="<NAME>" ref="author"/> <input type="text" placeholder="Say something..." ref="text"/> <input type="submit" value="Post" /> </form> ); } }); ReactDOM.render( <div> <CommentBox url='/api/comments' pollInterval={1000 * 60 * 60 * 24}></CommentBox> </div>, document.getElementById('content') );<file_sep>/README.md babel app/src/js --watch --out-dir app/es5code
e5c3828d4a654b6b0294781fb21913ebf290f0ff
[ "JavaScript", "Markdown" ]
2
JavaScript
Sergey73/react
81b67ddb84d76f1b8581bbd9faf437869b20a164
37933810ec46ae93c60ed560fc7af22204471c52
refs/heads/master
<repo_name>Cl2rk3Gr1ff1n/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations<file_sep>/siamese_model_2nd.py import torch.nn as nn import torch import torch.nn.functional as f class Siamese(nn.Module): def __init__(self): super(Siamese,self).__init__() # 3x96x96 self.conv11 = nn.Sequential(nn.Conv2d(3, 64, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(64), nn.ELU(True)) # 3x96x96 conv1r has RES self.conv1r = nn.Sequential(nn.Conv2d(3, 64, kernel_size=1, stride=1, padding=0)) # 64x96x96 self.conv12 = nn.Sequential(nn.Conv2d(64, 64, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(64)) # sum with residual, then pass the ELU block self.elu12 = nn.ELU(True) #-------------------------------------------1st layer # 64x96x96 conv21 has RES self.conv21 = nn.Sequential(nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.BatchNorm2d(128), nn.ELU(True)) # 128x48x48 self.conv22 = nn.Sequential(nn.Conv2d(128, 64, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(64), nn.ELU(True)) # 64x48x48 self.conv23 = nn.Sequential(nn.Conv2d(64, 128, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(128)) # sum with residual, then pass to ELU block self.elu23 = nn.ELU(True) #-------------------------------------------2nd layer # 128x48x48 conv31 has RES self.conv31 = nn.Sequential(nn.Conv2d(128, 192, kernel_size=3, stride=2, padding=1), nn.BatchNorm2d(192), nn.ELU(True)) # 192x24x24 self.conv32 = nn.Sequential(nn.Conv2d(192, 96, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(96), nn.ELU(True)) # 96x24x24 self.conv33 = nn.Sequential(nn.Conv2d(96, 192, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(192)) # sum with residual, then pass to ELU block self.elu33 = nn.ELU(True) #-------------------------------------------3rd layer #192x24x24 conv41 has RES self.conv41 = nn.Sequential(nn.Conv2d(192, 256, kernel_size=3, stride=2, padding=1), nn.BatchNorm2d(256), nn.ELU(True)) #256x12x12 self.conv42 = nn.Sequential(nn.Conv2d(256, 128, kernel_size=3, stride=1, 
padding=1), nn.BatchNorm2d(128), nn.ELU(True)) #128x12x12 self.conv43 = nn.Sequential(nn.Conv2d(128, 256, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(256)) # sum with residual, then pass to ELU block self.elu43 = nn.ELU(True) #-------------------------------------------4th layer #256x12x12 self.conv51 = nn.Sequential(nn.Conv2d(256, 320, kernel_size=3, stride=2, padding=1), nn.BatchNorm2d(320), nn.ELU(True)) #320x6x6 self.conv52 = nn.Sequential(nn.Conv2d(320, 160, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(160), nn.ELU(True)) #160x6x6 self.conv53 = nn.Sequential(nn.Conv2d(160, 320, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(320)) # sum with residual, then pass to ELU block self.elu53 = nn.ELU(True) #output size 320x6x6 #-------------------------------------------5th layer #-------------------------------------------decomposition #320x6x6 # view->240 + 80 # #240x6x6 Image_1 and Image_2 compute contrastive loss # 80 passes through conv to become 29 # 80x6x6 self.convfc = nn.Conv2d(80, 29, kernel_size=6, stride=1, padding=0) #29x1x1 #29 slices into 9x1x1 (pose) + 20x1x1 (light) # self.fc_pose = nn.Linear(27 , 20) # self.fc_light = nn.Linear(27, 7) #compute contrastive loss #compute Softmax of pose and light #predict Label of pose and light #-------------------------------------------decoder------------------------------------------------------------------- #240x6x6 self.dconv52 = nn.Sequential(nn.ConvTranspose2d(240, 160, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(160), nn.ELU(True)) #160x6x6 self.dconv51 = nn.Sequential(nn.ConvTranspose2d(160, 256, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(256), nn.ELU(True)) #256x6x6 bilinear interpolation upsampling #self.upsampling43 = nn.UpsamplingBilinear2d(scale_factor=2) #256x12x12 self.dconv43 = nn.Sequential(nn.ConvTranspose2d(256 , 256, kernel_size = 3, stride=2, padding=1, output_padding=1), nn.BatchNorm2d(256), nn.ELU(True)) #256x12x12 self.dconv42 = nn.Sequential(nn.ConvTranspose2d(256, 128, 
kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(128), nn.ELU(True)) #128x12x12 self.dconv41 = nn.Sequential(nn.ConvTranspose2d(128, 192, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(192), nn.ELU(True)) #192x12x12 bilinear interpolation upsamping #self.upsampling33 = nn.UpsamplingBilinear2d(scale_factor=2) #192x24x24 self.dconv33 = nn.Sequential(nn.ConvTranspose2d(192 , 192, kernel_size = 3, stride=2, padding=1, output_padding=1), nn.BatchNorm2d(192), nn.ELU(True)) #192x24x24 self.dconv32 = nn.Sequential(nn.ConvTranspose2d(192, 96, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(96), nn.ELU(True)) #96x24x24 self.dconv31 = nn.Sequential(nn.ConvTranspose2d(96, 128, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(128), nn.ELU(True)) #128x24x24 bilinear interpolation upsamping #self.upsampling23 = nn.UpsamplingBilinear2d(scale_factor=2) #128x48x48 self.dconv23 = nn.Sequential(nn.ConvTranspose2d(128 , 128, kernel_size = 3, stride=2, padding=1, output_padding=1), nn.BatchNorm2d(128), nn.ELU(True)) #128x48x48 self.dconv22 = nn.Sequential(nn.ConvTranspose2d(128, 64, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(64), nn.ELU(True)) #64x48x48 self.dconv21 = nn.Sequential(nn.ConvTranspose2d(64, 64, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(64), nn.ELU(True)) #64x48x48 bilinear interpolation upsamping #self.upsampling13 = nn.UpsamplingBilinear2d(scale_factor=2) #64x96x96 self.dconv12 = nn.Sequential(nn.ConvTranspose2d(64 , 64, kernel_size = 3, stride=2, padding=1, output_padding=1), nn.BatchNorm2d(64), nn.ELU(True)) #64x96x96 self.dconv11 = nn.Sequential(nn.ConvTranspose2d(64, 32, kernel_size=3, stride=1, padding=1), nn.BatchNorm2d(32), nn.ELU(True)) #32x96x96 self.output = nn.Sequential(nn.ConvTranspose2d(32, 3, kernel_size=3, stride=1, padding=1), nn.Tanh()) #3x96x96 #self.tanh = nn.tanh() #compute the L1 loss(Lrecon) between GT and reconstruction def forward(self, x, x_2): #-----layer 1------------------------------ out_res_1 = 
self.conv1r(x) out = self.conv11(x) out = self.conv12(out) out = out + out_res_1 out = self.elu12(out) out_res_1_2 = self.conv1r(x_2) out_2 = self.conv11(x_2) out_2 = self.conv12(out_2) out_2 = out_2 + out_res_1_2 out_2 = self.elu12(out_2) #-----layer 2------------------------------ out_res_2 = self.conv21(out) out = self.conv22(out_res_2) out = self.conv23(out) out = out + out_res_2 out = self.elu23(out) out_res_2_2 = self.conv21(out_2) out_2 = self.conv22(out_res_2_2) out_2 = self.conv23(out_2) out_2 = out_2 + out_res_2_2 out_2 = self.elu23(out_2) #-----layer 3------------------------------- out_res_3 = self.conv31(out) out = self.conv32(out_res_3) out = self.conv33(out) out = out + out_res_3 out = self.elu33(out) out_res_3_2 = self.conv31(out_2) out_2 = self.conv32(out_res_3_2) out_2 = self.conv33(out_2) out_2 = out_2 + out_res_3_2 out_2 = self.elu33(out_2) #-----layer 4-------------------------------- out_res_4 = self.conv41(out) out = self.conv42(out_res_4) out = self.conv43(out) out = out + out_res_4 out = self.elu43(out) out_res_4_2 = self.conv41(out_2) out_2 = self.conv42(out_res_4_2) out_2 = self.conv43(out_2) out_2 = out_2 + out_res_4_2 out_2 = self.elu43(out_2) #-----layer 5-------------------------------- out_res_5 = self.conv51(out) out = self.conv52(out_res_5) out = self.conv53(out) out = out + out_res_5 out = self.elu53(out) out_res_5_2 = self.conv51(out_2) out_2 = self.conv52(out_res_5_2) out_2 = self.conv53(out_2) out_2 = out_2 + out_res_5_2 out_2 = self.elu53(out_2) #----------------------------------------decomposition---------------------------------------------- out_240 = out.narrow(1,0,240) #second dimension[channels]slices from 0 to 239 out_80 = out.narrow(1,240,80) #second dimension[channels]slices from 240 to 319 out_240_2 = out_2.narrow(1, 0, 240) out_80_2 = out_2.narrow(1, 240, 80) out_29 = self.convfc(out_80) out_29_2 = self.convfc(out_80_2) #29x1x1 out_pose = out_29.narrow(1,0,9) #second dimension[channels]slices from 0 to 8 for pose 
#9x1x1 out_pose_2 = out_29_2.narrow(1, 0, 9) out_light = out_29.narrow(1,9,20) #second dimension[channels]slices from 9 to 28 for light #20x1x1 out_light_2 = out_29_2.narrow(1, 9 , 20) #-----------------------------------------decoder--------------------------------------------------- out = self.dconv52(out_240) out = self.dconv51(out) # out = f.upsample(out, scale_factor = 2, mode='bilinear') out = self.dconv43(out) out = self.dconv42(out) out = self.dconv41(out) # out = f.upsample(out, scale_factor = 2, mode='bilinear') out = self.dconv33(out) out = self.dconv32(out) out = self.dconv31(out) # out = f.upsample(out, scale_factor = 2, mode='bilinear') out = self.dconv23(out) out = self.dconv22(out) out = self.dconv21(out) # out = f.upsample(out, scale_factor = 2, mode='bilinear') out = self.dconv12(out) out = self.dconv11(out) out = self.output(out) # out = out.tanh() out_pose = out_pose.contiguous() out_pose_2 = out_pose_2.contiguous() out_light = out_light.contiguous() out_light_2 = out_light_2.contiguous() out_pose = out_pose.view(out_pose.size(0),-1) out_pose_2 = out_pose_2.view(out_pose_2.size(0),-1) out_light = out_light.view(out_light.size(0),-1) out_light_2 = out_light_2.view(out_light_2.size(0),-1) out_240 = out_240.contiguous() out_240 = out_240.view(64, -1) # 64x8640 out_240_2 = out_240_2.contiguous() out_240_2 = out_240_2.view(64, -1) #--------------------------------------------LOSS # contrastive loss between gt and image_1's identity return out_pose, out_pose_2, out_light, out_light_2, out_240, out_240_2, out <file_sep>/mask/multi_mask.py from PIL import Image from pylab import * import numpy as np im = Image.open('249_01_01_051_00.png') im = im.resize((96, 96), Image.BILINEAR) # width, height = im.size # print width,height # 96x96 img = array(im) imshow(img) coordinate = ginput(300) mask = np.load('mask_19.npy') for i in coordinate: y = int(i[0]) x = int(i[1]) mask[x, y, 0] = 5 mask[x, y, 1] = 5 mask[x, y, 2] = 5 # use print img[x, y, 0] & print 
img[1,60:90,:] to check # the pixel location and the coordinate are inverted: # coordinate (82,1) is the top right, and pixel[1,82,:] also means the top right np.save("mask_20", mask) print mask show() <file_sep>/README.md # Siamese-Network-for-Frontal-Face-Synthesis-disentangle-pose-and-light-interference- All the input data come from the MultiPIE dataset: 9 faces with different identities each time, under different poses and illuminations. ![input](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/input_samples_iteration_200.png) ## First version: completes the basic functions; the frontal face image has a clear identity, but the boundary is blurry and glasses still cannot be fully synthesized. >siamese_out_largeP&LclassW: these are the outputs with relatively large class_P and class_L loss weights. ![first_L](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/siamese_out_largeP%26LclassW/fake_samples_iteration_70000.png) >siamese_out_sameW: these are the outputs with the same weight for every loss. ![first_Same](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/siamese_out_sameW/fake_samples_iteration_70000.png) ## 2nd update at 30th June * load the June-20 pretrained model and keep only the L1 loss, dropping the other losses >2nd_step_only_L1: these are the 2nd update's outputs with 'Step_decay_learning_rate' ![second_l1](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/2nd_step_only_L1/fake_samples_iteration_40000.png) ## 3rd update: The mask was sampled manually from 20 frontal face images; the model and weights were based on the 1st version. * Add a mask at the frontal face's boundary to make netS more sensitive to the boundary region * Freeze the pre-trained Siamese net's encoder part, which captures the input images' feature maps, and train the decoder part of the network. 
>3rd_step_L1_Mask_FreezEnc: These are the 3rd update's outputs with 'Step_decay_learning_rate', still using L1 as the general supervision loss. ![third_l1](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/3rd_step_L1_Mask_FreezEnc/fake_samples_iteration_30000.png) >3rd_step_L2_Mask_FreezEnc: These are the 3rd update's outputs with 'Step_decay_learning_rate', still using L2 as the general supervision loss. Problem: there are still many artifacts on the synthesized images; they are not 'real' enough. ![third_l2](https://github.com/danny95333/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations/blob/master/3rd_step_L2_Mask_FreezEnc/fake_samples_iteration_30000.png) <file_sep>/siamese_train_M_3.py #from __future__ import print_function import cv2 import argparse import os import random import torch import torch.nn as nn import torch.backends.cudnn as cudnn import torch.optim as optim import torchnet.meter as meter import torchvision.datasets as dset import torchvision.transforms as transforms import torchvision.utils as vutils from torch.autograd import Variable from dataset import multiPIE from siamese_model_2nd import Siamese from contrastive import ContrastiveLoss import numpy as np # import cv2 #from pycrayon import CrayonClient #for plotting loss import matplotlib matplotlib.use('agg') import matplotlib.pyplot as plt import matplotlib.ticker as ticker import time,math from logger import Logger # from models_Parsing import ParseNet saveFile = open('/home/shumao/wyw_files/siamese_output_M_3/record.txt', 'w') saveFile.write("niter:" + str(50000) + "\n") saveFile.write("---lr:" + str(0.0001) + "\n") saveFile.write("beta1:" + str(0.7) + "\n") saveFile.write("W:-1-x-x-x-x-x-" + "\n") saveFile.write("L1 Loss" + "\n") saveFile.write("after load model from: train-3-28000pth") logger = Logger('./log_1'); parser = argparse.ArgumentParser() parser.add_argument('--batchSize', type=int, default=64, help='input batch size') 
parser.add_argument('--loadSize', type=int, default=100, help='the height / width of the input image to network') parser.add_argument('--fineSize', type=int, default=96, help='the height / width of the input image to network') parser.add_argument('--id_num', type=int, default=200, help='Total training identity.') parser.add_argument('--pose_num', type=int, default=9, help='Total training pose.') parser.add_argument('--light_num', type=int, default=20, help='Total training light.') parser.add_argument('--niter', type=int, default=50000, help='number of iterations to train for') parser.add_argument('--lr', type=float, default=0.0001, help='learning rate, default=0.0001') parser.add_argument('--beta1', type=float, default=0.7, help='beta1 for adam. default=0.7') parser.add_argument('--cuda', action='store_true', help='enables cuda') parser.add_argument('--outf', default='/home/shumao/wyw_files/siamese_output_M_3', help='folder to output images and model checkpoints') parser.add_argument('--manualSeed', type=int, help='manual seed') parser.add_argument('--dataPath', default='/home/shumao/dr-gan/Data_new_realigned2/setting2/train/', help='which dataset to train on') parser.add_argument('--modelPath', default='/home/shumao/wyw_files/siamese_output_3/netS_28000.pth', help='path to the pretrained netS weights to load') parser.add_argument('--save_step', type=int, default=400, help='save weights every 400 iterations ') parser.add_argument('--labelPath', default='/home/shumao/dr-gan/Data_new_realigned2/setting2/Facedata/', help='path to the label files') opt = parser.parse_args() print(opt) # print all parser arguments # print(opt.niter) try: os.makedirs(opt.outf) except OSError: pass w_r = 1 # w_cL = 0.02 # w_cP = 0.02 # w_cI = 0.02 # w_P = 0.02 # w_L = 0.02 if opt.manualSeed is None: opt.manualSeed = random.randint(1, 10000) print("Random Seed: ", opt.manualSeed) random.seed(opt.manualSeed) torch.manual_seed(opt.manualSeed) if opt.cuda: torch.cuda.manual_seed_all(opt.manualSeed) 
cudnn.benchmark = True #---------------------Load Mask------------------- mask = np.load('mask_20.npy') mask = mask.astype(np.float32) M = torch.from_numpy(mask.transpose((2, 0, 1))) FinalMask = M.expand(opt.batchSize,3,96,96) # print m.size() # 3x96x96 #---------------------Load DATA------------------------- dataset_1 = multiPIE(opt.dataPath,opt.loadSize,opt.fineSize,labelPath = opt.labelPath) # dataset_2 = multiPIE(opt.dataPath,opt.loadSize,opt.fineSize,opt.labelPath) dataset_test = multiPIE('/home/shumao/dr-gan/comparison/',opt.loadSize,opt.fineSize,labelPath = opt.labelPath) loader_train_1 = torch.utils.data.DataLoader(dataset=dataset_1, batch_size = opt.batchSize, shuffle=True, num_workers=4, drop_last = True) # loader_train_2 = torch.utils.data.Dataloader(dataset=dataset_1, # batch_size = opt.batchSize, # shuffle=True, # num_workers=4) loader_test = torch.utils.data.DataLoader(dataset=dataset_test, batch_size = 9, shuffle=False, num_workers=4) data_train_1 = iter(loader_train_1) # data_train_2 = iter(loader_train_2) data_test = iter(loader_test) #----------------------Parameters----------------------- num_pose = opt.pose_num num_light = opt.light_num num_iden = opt.id_num def weights_init(m): classname = m.__class__.__name__ if classname.find('Conv') !=-1: m.weight.data.normal_(0.0, 0.02) elif classname.find('BatchNorm') !=-1: m.weight.data.normal_(1.0, 0.02) m.bias.data.fill_(0) netS = Siamese() netS.load_state_dict(torch.load(opt.modelPath)) #-----------------params freeze----------------- for param in netS.conv11.parameters(): param.requires_grad = False for param in netS.conv1r.parameters(): param.requires_grad = False for param in netS.conv12.parameters(): param.requires_grad = False for param in netS.conv21.parameters(): param.requires_grad = False for param in netS.conv22.parameters(): param.requires_grad = False for param in netS.conv23.parameters(): param.requires_grad = False for param in netS.conv31.parameters(): param.requires_grad = False for 
param in netS.conv32.parameters(): param.requires_grad = False for param in netS.conv33.parameters(): param.requires_grad = False for param in netS.conv41.parameters(): param.requires_grad = False for param in netS.conv42.parameters(): param.requires_grad = False for param in netS.conv43.parameters(): param.requires_grad = False for param in netS.conv51.parameters(): param.requires_grad = False for param in netS.conv52.parameters(): param.requires_grad = False for param in netS.conv53.parameters(): param.requires_grad = False for param in netS.convfc.parameters(): param.requires_grad = False #-----------------params freeze----------------- if(opt.cuda): netS.cuda() #-------------------Loss & Optimization optimizerS = torch.optim.Adam(filter(lambda p: p.requires_grad, netS.parameters()),lr=opt.lr, betas=(opt.beta1, 0.999)) poss_contrastive_loss = ContrastiveLoss() # load from the begining light_contrastive_loss = ContrastiveLoss() identity_contrastive_loss = ContrastiveLoss() reconstructe_loss = nn.L1Loss() pose_class_loss = nn.CrossEntropyLoss() light_class_loss = nn.CrossEntropyLoss() #------------------ Global Variables------------------ input_pose_1 = torch.LongTensor(opt.batchSize) input_light_1 = torch.LongTensor(opt.batchSize) # input_pose_2 = torch.LongTensor(opt.batchSize) # input_light_2 = torch.LongTensor(opt.batchSize) inputImg_1 = torch.FloatTensor(opt.batchSize, 3, opt.fineSize, opt.fineSize) inputImg_2 = torch.FloatTensor(opt.batchSize, 3, opt.fineSize, opt.fineSize) GT = torch.FloatTensor(opt.batchSize, 3,opt.fineSize, opt.fineSize) same_pose = torch.FloatTensor(opt.batchSize) same_iden = torch.FloatTensor(opt.batchSize) same_light = torch.FloatTensor(opt.batchSize) # w_1 = torch.FloatTensor(1) # w_2 = torch.FloatTensor(20) # w_3 = torch.FloatTensor(10) # w_4 = torch.FloatTensor(10) # w_5 = torch.FloatTensor(10) # w_6 = torch.FloatTensor(20) # output_pose_1_label = torch.LongTensor(opt.batchSize) # output_pose_2_label = 
torch.LongTensor(opt.batchSize) # output_light_1_label = torch.LongTensor(opt.batchSize) # output_light_2_label = torch.LongTensor(opt.batchSize) input_pose_1 = Variable(input_pose_1) # input_pose_2 = Variable(input_pose_2) input_light_1 = Variable(input_light_1) # input_light_2 = Variable(input_light_2) inputImg_1 = Variable(inputImg_1) inputImg_2 = Variable(inputImg_2) GT = Variable(GT) same_pose = Variable(same_pose) same_iden = Variable(same_iden) same_light = Variable(same_light) FinalMask = Variable(FinalMask) # w_1 = Variable(w_1, requires_grad = False) # w_2 = Variable(w_2, requires_grad = False) # w_3 = Variable(w_3, requires_grad = False) # w_4 = Variable(w_4, requires_grad = False) # w_5 = Variable(w_5, requires_grad = False) # w_6 = Variable(w_6, requires_grad = False) pose_mtr = meter.ConfusionMeter(k=opt.pose_num) light_mtr = meter.ConfusionMeter(k=opt.light_num) if(opt.cuda): input_pose_1 = input_pose_1.cuda() # input_pose_2 = input_pose_2.cuda() input_light_1 = input_light_1.cuda() # input_light_2 = input_light_2.cuda() inputImg_1 = inputImg_1.cuda() inputImg_2 = inputImg_2.cuda() GT = GT.cuda() same_pose = same_pose.cuda() same_light = same_light.cuda() same_iden = same_iden.cuda() FinalMask = FinalMask.cuda() # w_1 = w_1.cuda() # w_2 = w_1.cuda() # w_3 = w_1.cuda() # w_4 = w_1.cuda() # w_5 = w_1.cuda() # w_6 = w_1.cuda() # poss_contrastive_loss.cuda() # light_contrastive_loss.cuda() # identity_contrastive_loss.cuda() pose_class_loss.cuda() light_class_loss.cuda() reconstructe_loss.cuda() #------------------test--------- # k = 0 # for meter err_total = 0 err_recon = 0 err_contraL = 0 err_contraP = 0 err_contraI = 0 err_classP = 0 err_classL = 0 def test(iteration, data_test, loader_test): try: images_1,po_1,li_1,GT_1,by_image,same_po,same_li,same_id = data_test.next() except StopIteration: data_test = iter(loader_test) images_1,po_1,li_1,GT_1,by_image,same_po,same_li,same_id = data_test.next() GT.data.resize_(GT_1.size()).copy_(GT_1) 
inputImg_1.data.resize_(images_1.size()).copy_(images_1) inputImg_2.data.resize_(by_image.size()).copy_(by_image) input_pose_1.data.resize_(po_1.size()).copy_(po_1) input_light_1.data.resize_(li_1.size()).copy_(li_1) output_pose_1, output_pose_2, output_light_1, output_light_2, out_f_1, out_f_2, out = netS(inputImg_1, inputImg_2) vutils.save_image(out.data, '%s/fake_samples_iteration_%03d.png' % (opt.outf, iteration), normalize=True) vutils.save_image(inputImg_1.data, '%s/input_samples_iteration_%03d.png' % (opt.outf, iteration), normalize=True) #-------------------train---------------------- for iteration in range(1,opt.niter+1): running_corrects = 0 running_corrects_light = 0 try: images_1,po_1,li_1,GT_1,by_image,same_po,same_li,same_id= data_train_1.next() except StopIteration: data_train_1 = iter(loader_train_1) images_1,po_1,li_1,GT_1,by_image,same_po,same_li,same_id = data_train_1.next() GT.data.resize_(GT_1.size()).copy_(GT_1) inputImg_1.data.resize_(images_1.size()).copy_(images_1) inputImg_2.data.resize_(by_image.size()).copy_(by_image) input_pose_1.data.resize_(po_1.size()).copy_(po_1) input_light_1.data.resize_(li_1.size()).copy_(li_1) same_pose.data.resize_(same_po.size()).copy_(same_po) same_light.data.resize_(same_li.size()).copy_(same_li) same_iden.data.resize_(same_id.size()).copy_(same_id) netS.zero_grad() output_pose_1, output_pose_2, output_light_1, output_light_2, out_f_1, out_f_2, out = netS(inputImg_1, inputImg_2) #-----------------mask test area----------------------------- # print out.data.type() # print GT.data.type() # print FinalMask.data.type() same # print FinalMask.data.size() 64x3x96x96 # print FinalMask.data.size() # print out.data.size() Final_out = FinalMask * out Final_GT = FinalMask * GT #-----------------mask test area----------------------------- # f_1 & f_2 variable # same_iden variable err_recon = reconstructe_loss(Final_out, Final_GT) err_contraI = identity_contrastive_loss(out_f_1, out_f_2, same_iden) err_contraP = 
poss_contrastive_loss(output_pose_1, output_pose_2, same_pose) err_contraL = light_contrastive_loss(output_light_1,output_light_2, same_light) err_classL = light_class_loss(output_light_1, input_light_1) err_classP = pose_class_loss(output_pose_1, input_pose_1) # print(err_recon.data.size()) # print(err_contraL.data.size()) # print(err_classP.data.size()) # modify the contrastive loss function to make contrastive loss be 1Lx1L # contrastive loss and Softmax and Loss1 are all requires_grad # err_total = 1 * err_recon + 10 * err_contraP + 10 * err_contraI + 10 * err_classP + 20 * err_classL # err_total = err_recon + err_contraI + err_contraP + err_contraL + err_classL + err_classP err_total = w_r * err_recon err_total.backward() optimizerS.step() #----------------------Visualize----------- if(iteration % 200 == 0): pose_mtr.add(output_pose_1.data, input_pose_1.data) pose_trainacc = pose_mtr.value().diagonal().sum()*1.0/opt.batchSize pose_mtr.reset() light_mtr.add(output_light_1.data, input_light_1.data) light_trainacc = light_mtr.value().diagonal().sum()*1.0/opt.batchSize light_mtr.reset() #----------------------------------------- test(iteration, data_test, loader_test) # #pose prediction # preds_pose = torch.max(output_pose_1.data, 1) # running_corrects += torch.sum(preds == input_pose_1) # print('pose_accuracy: %.2f' # % (running_corrects * 1.0/images.size(0))) # #light prediction # preds_light = torch.max(output_light_1.data, 1) # running_corrects_light += torch.sum(preds_light == input_light_1) # print('light_accuracy: %.2f' # % (running_corrects_light * 1.0/images.size(0))) print('----------------------------------------') print('[%d/%d] Loss_S: %.4f ' %(iteration, opt.niter, err_total.data[0])) print(' Reco_S: %.4f ' %(err_recon.data[0])) print(' conL_S: %.4f ' %(err_contraL.data[0])) print(' conP_S: %.4f ' %(err_contraP.data[0])) print(' conI_S: %.4f ' %(err_contraI.data[0])) print(' Clas_P: %.4f ' %(err_classP.data[0])) print(' Clas_L: %.4f ' 
%(err_classL.data[0])) if(iteration % opt.save_step == 0): torch.save(netS.state_dict(), '%s/netS_%d.pth' % (opt.outf,iteration))
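The total loss in the training loop above combines an L1 reconstruction term with margin-based contrastive terms on pose, light, and identity embeddings. The `ContrastiveLoss` class itself is defined elsewhere in this repo; below is a minimal pure-Python sketch of the standard margin contrastive loss such a class presumably implements — the margin value, the 0.5 scaling, and the per-pair (unreduced) form are all assumptions, not taken from this file.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two embedding vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def contrastive_loss(dist, same, margin=1.0):
    """Margin contrastive loss for a single pair.

    dist: distance between the two embeddings.
    same: 1.0 if the pair shares the attribute (pose/light/identity), else 0.0.
    Similar pairs are pulled together (quadratic in dist); dissimilar
    pairs are pushed apart until they are at least `margin` away.
    """
    pos = same * dist ** 2
    neg = (1.0 - same) * max(margin - dist, 0.0) ** 2
    return 0.5 * (pos + neg)
```

The hinge in the negative term is what keeps already-separated dissimilar pairs from contributing gradient, which matches how the script only penalizes pairs that are closer than the margin.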
11ce98cfc112e5902b638a58529d4a717a54e3f5
[ "Markdown", "Python" ]
4
Python
Cl2rk3Gr1ff1n/Synthesis-of-Frontal-Face-Disentangled-Poses-and-Illuminations
cd28d0fd1514e738dbe1fd2d5ece1db2f9b427be
7f488ca2a8a1d4d692da51df0eb4129dab2e55c9
refs/heads/master
<repo_name>SergeyRudi/HW_35<file_sep>/HW_35/src/main/java/core/TDG.java
package core;

public class TDG {
    static int max = 101;

    public static void main(String[] args) {
        for (int i = 2; i <= max; i++) {
            System.out.println("TC-01." + String.format("%02d", i - 1)
                    + " (" + i
                    + (PrimeNumberChecker.validate(i) ? " is prime number)," : " is not prime number),")
                    + i + "," + PrimeNumberChecker.validate(i));
        }
    }
}
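The generator above emits one `TC-01.NN (i is/is not prime number),i,true/false` row per integer from 2 to 101, driven by `PrimeNumberChecker.validate`. That checker is not part of this dump; the sketch below shows the trial-division primality test such a validator would typically use, plus the same row formatting, in Python (the `is_prime`/`test_rows` names are illustrative, not from the repo).

```python
def is_prime(n):
    """Trial division up to sqrt(n) -- the usual shape of a
    PrimeNumberChecker.validate(n) style check."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def test_rows(max_n=101):
    """Build the same test-data rows the Java main() prints."""
    rows = []
    for i in range(2, max_n + 1):
        verdict = "is prime number" if is_prime(i) else "is not prime number"
        rows.append("TC-01.%02d (%d %s),%d,%s"
                    % (i - 1, i, verdict, i, str(is_prime(i)).lower()))
    return rows
```

Stopping the divisor loop at `i * i <= n` is what keeps the check cheap for the 2..101 range this generator covers.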
1722347d88a66cef2bc80b9cce353bad1324f1f9
[ "Java" ]
1
Java
SergeyRudi/HW_35
5ae057c87b1f5a58564c6685c8142097a09f1be4
9a5fd47e0a3d027b799fe501854cc6dd0a4be705
refs/heads/main
<file_sep>package com.book.storiek.Database; import android.content.Context; import android.database.Cursor; import android.database.sqlite.SQLiteDatabase; import android.database.sqlite.SQLiteOpenHelper; import com.book.storiek.Model.Story; import com.squareup.picasso.Picasso; import java.io.File; import java.io.FileOutputStream; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; import java.util.ArrayList; import java.util.List; public class Database extends SQLiteOpenHelper { Context context; public Database(Context context) { super(context, info_db.DATABASE_NAME, null, info_db.DATABASE_VERSION); this.context = context; isDatabase(); } private void isDatabase() { File check = new File(info_db.PACKAGE); if (check.exists()) { } else { check.mkdir(); } check = context.getDatabasePath(info_db.DATABASE_NAME); if (check.exists()) { } else { try { copyDataBase(); } catch (IOException e) { e.printStackTrace(); } } } private void copyDataBase() throws IOException { InputStream myInput = context.getAssets().open(info_db.DATABASE_SOURCE); String outFileName = info_db.PACKAGE + info_db.DATABASE_NAME; OutputStream myOutput = new FileOutputStream(outFileName); byte[] buffer = new byte[1024]; int length; while ((length = myInput.read(buffer)) > 0) { myOutput.write(buffer, 0, length); } myOutput.flush(); myOutput.close(); myInput.close(); } public List<Story> getAllStory() { SQLiteDatabase db = this.getReadableDatabase(); List<Story> data = new ArrayList<>(); String query = "SELECT * FROM story"; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { do { Story story = new Story(); story.setId(cursor.getInt(cursor.getColumnIndex(info_db.DATA_ID))); story.setCategory(cursor.getString(cursor.getColumnIndex(info_db.DATA_CATEGORY))); story.setName(cursor.getString(cursor.getColumnIndex(info_db.DATA_NAME))); story.setField(cursor.getString(cursor.getColumnIndex(info_db.DATA_FIELD))); 
story.setFav(cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV))); story.setImage(cursor.getString(cursor.getColumnIndex(info_db.DATA_IMAGE))); story.setDisc(cursor.getString(cursor.getColumnIndex(info_db.DATA_DISC))); data.add(story); } while (cursor.moveToNext()); } cursor.close(); db.close(); return data; } public List<Story> getTanzStory() { SQLiteDatabase db = this.getReadableDatabase(); List<Story> data = new ArrayList<>(); String query = "SELECT * FROM story WHERE category='tanz'"; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { do { Story story = new Story(); story.setId(cursor.getInt(cursor.getColumnIndex(info_db.DATA_ID))); story.setCategory(cursor.getString(cursor.getColumnIndex(info_db.DATA_CATEGORY))); story.setName(cursor.getString(cursor.getColumnIndex(info_db.DATA_NAME))); story.setField(cursor.getString(cursor.getColumnIndex(info_db.DATA_FIELD))); story.setFav(cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV))); story.setImage(cursor.getString(cursor.getColumnIndex(info_db.DATA_IMAGE))); story.setDisc(cursor.getString(cursor.getColumnIndex(info_db.DATA_DISC))); data.add(story); } while (cursor.moveToNext()); } cursor.close(); db.close(); return data; } public List<Story> getMasalStory() { SQLiteDatabase db = this.getReadableDatabase(); List<Story> data = new ArrayList<>(); String query = "SELECT * FROM story WHERE category='masal'"; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { do { Story story = new Story(); story.setId(cursor.getInt(cursor.getColumnIndex(info_db.DATA_ID))); story.setCategory(cursor.getString(cursor.getColumnIndex(info_db.DATA_CATEGORY))); story.setName(cursor.getString(cursor.getColumnIndex(info_db.DATA_NAME))); story.setField(cursor.getString(cursor.getColumnIndex(info_db.DATA_FIELD))); story.setFav(cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV))); story.setImage(cursor.getString(cursor.getColumnIndex(info_db.DATA_IMAGE))); 
story.setDisc(cursor.getString(cursor.getColumnIndex(info_db.DATA_DISC))); data.add(story); } while (cursor.moveToNext()); } cursor.close(); db.close(); return data; } public List<Story> getDiniStory() { SQLiteDatabase db = this.getReadableDatabase(); List<Story> data = new ArrayList<>(); String query = "SELECT * FROM story WHERE category='dini'"; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { do { Story story = new Story(); story.setId(cursor.getInt(cursor.getColumnIndex(info_db.DATA_ID))); story.setCategory(cursor.getString(cursor.getColumnIndex(info_db.DATA_CATEGORY))); story.setName(cursor.getString(cursor.getColumnIndex(info_db.DATA_NAME))); story.setField(cursor.getString(cursor.getColumnIndex(info_db.DATA_FIELD))); story.setFav(cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV))); story.setImage(cursor.getString(cursor.getColumnIndex(info_db.DATA_IMAGE))); story.setDisc(cursor.getString(cursor.getColumnIndex(info_db.DATA_DISC))); data.add(story); } while (cursor.moveToNext()); } cursor.close(); db.close(); return data; } public List<Story> getFavStory() { SQLiteDatabase db = this.getReadableDatabase(); List<Story> data = new ArrayList<>(); String query = "SELECT * FROM story WHERE fav = 1"; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { do { Story story = new Story(); story.setId(cursor.getInt(cursor.getColumnIndex(info_db.DATA_ID))); story.setCategory(cursor.getString(cursor.getColumnIndex(info_db.DATA_CATEGORY))); story.setName(cursor.getString(cursor.getColumnIndex(info_db.DATA_NAME))); story.setField(cursor.getString(cursor.getColumnIndex(info_db.DATA_FIELD))); story.setFav(cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV))); story.setImage(cursor.getString(cursor.getColumnIndex(info_db.DATA_IMAGE))); story.setDisc(cursor.getString(cursor.getColumnIndex(info_db.DATA_DISC))); data.add(story); } while (cursor.moveToNext()); } cursor.close(); db.close(); return data; } public int fav_value(int id) { 
SQLiteDatabase db = this.getReadableDatabase(); String query = "SELECT " + info_db.DATA_FAV + " FROM story WHERE " + info_db.DATA_ID + "=" + id + ""; int value = 0; Cursor cursor = db.rawQuery(query, null); if (cursor.moveToFirst()) { value = cursor.getInt(cursor.getColumnIndex(info_db.DATA_FAV)); do { } while (cursor.moveToNext()); } db.close(); return value; } public void fav(int status, int id) { SQLiteDatabase db = this.getReadableDatabase(); String query = "UPDATE story SET " + info_db.DATA_FAV + "=" + status + " WHERE " + info_db.DATA_ID + "=" + id + ""; db.execSQL(query); db.close(); } @Override public void onCreate(SQLiteDatabase db) { } @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { } } <file_sep>package com.book.storiek; import android.os.Bundle; import android.view.LayoutInflater; import android.view.View; import android.view.ViewGroup; import androidx.annotation.NonNull; import androidx.annotation.Nullable; import androidx.fragment.app.Fragment; import androidx.recyclerview.widget.LinearLayoutManager; import androidx.recyclerview.widget.RecyclerView; import com.book.storiek.Adapter.StoryAdapter; import com.book.storiek.Database.Database; import com.book.storiek.Model.Story; import java.util.List; public class TanzFragment extends Fragment { View view; RecyclerView recyclerView; StoryAdapter adapter; @Nullable @Override public View onCreateView(@NonNull LayoutInflater inflater, @Nullable ViewGroup container, @Nullable Bundle savedInstanceState) { view = inflater.inflate(R.layout.recycler_main, container, false); setupViews(); return view; } @Override public void onCreate(@Nullable Bundle savedInstanceState) { super.onCreate(savedInstanceState); } private void setupViews() { recyclerView = view.findViewById(R.id.recycler_id); recyclerView.setLayoutManager(new LinearLayoutManager(getActivity())); Database db = new Database(getActivity()); List<Story> storyList = db.getTanzStory(); adapter = new 
StoryAdapter(getActivity(), storyList); recyclerView.setAdapter(adapter); } } <file_sep>package com.book.storiek; import androidx.appcompat.app.AppCompatActivity; import android.content.Intent; import android.content.SharedPreferences; import android.os.Bundle; import android.view.View; import android.widget.Button; import android.widget.EditText; import android.widget.Toast; public class RegisterActivity extends AppCompatActivity implements View.OnClickListener { EditText username, email; Button submit; boolean isLogin = false; private static final String LOGIN = "login"; String username_txt, email_text; @Override protected void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_register); setupViews(); } private void setupViews() { username = findViewById(R.id.Username); submit = findViewById(R.id.Submit); //email = findViewById(R.id.email); submit.setOnClickListener(this); } @Override public void onClick(View v) { int viewID = v.getId(); switch (viewID) { case R.id.Submit: nameValidation(); break; } } private void nameValidation() { username_txt = username.getText().toString().trim(); //email_text = email.getText().toString().trim(); //Boolean isEmail = email_text.contains("@gmail.com"); if (username_txt.isEmpty()) { Toast.makeText(this, "نام کاربری خود را وارد کنید", Toast.LENGTH_SHORT).show(); } /*else if (email_text.isEmpty()) { Toast.makeText(this, "ایمیل را وارد کنید", Toast.LENGTH_SHORT).show(); } else if (isEmail == false) { Toast.makeText(this, "ایمیل معتبر نیست", Toast.LENGTH_SHORT).show(); }*/ else { saveUsername(); } } private void saveUsername() { isLogin = !isLogin; SharedPreferences sharedPreferences = getSharedPreferences(LOGIN, MODE_PRIVATE); SharedPreferences.Editor editor = sharedPreferences.edit(); editor.putString("USERNAME", username_txt); //editor.putString("EMAIL", email_text); editor.putBoolean("ISLOGIN", isLogin); editor.apply(); goToHome(); } private void goToHome() { Intent intent 
= new Intent(RegisterActivity.this, MainActivity.class); startActivity(intent); finish(); } }
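The `Database.fav()` and `fav_value()` helpers earlier in this repo build their SQL by string concatenation of the id and status. A common alternative is bound parameters; the sketch below shows the same read-and-toggle of the favorite flag using Python's `sqlite3` with `?` placeholders. The `story` table and `fav`/`id` columns mirror the repo's schema constants, but the `toggle_fav` helper itself is hypothetical.

```python
import sqlite3

def toggle_fav(conn, story_id):
    """Read the current fav flag for a story and flip it,
    binding values instead of concatenating them into SQL."""
    cur = conn.execute("SELECT fav FROM story WHERE id = ?", (story_id,))
    row = cur.fetchone()
    if row is None:
        return None  # unknown story id
    new_value = 0 if row[0] else 1
    conn.execute("UPDATE story SET fav = ? WHERE id = ?",
                 (new_value, story_id))
    conn.commit()
    return new_value
```

Binding also sidesteps quoting issues entirely, which matters more once any of the substituted values comes from user input rather than an integer id.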
e8f40920b5efdb9ba438e95a3598bfef6fec412b
[ "Java" ]
3
Java
Radoon83/Storiek
6bd43f216d507f89d597e75e107b2f1e5a3811dd
ce7090f21d0d86fae10d575f4b98403ad3e6bdca
refs/heads/main
<file_sep># ProjectHG <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG001.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG002.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG003.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG004.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG005.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG006.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG007.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG008.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG009.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG010.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG011.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG012.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG013.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG014.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG015.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG016.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG017.png"> <img src = "./img/2020_JAVA_프로젝트_결과보고서_ProjectHG018.png"> <file_sep>package Server; import Dao.UserDAO; import Dto.UserDTO; import Info.*; import java.io.DataInputStream; import java.io.DataOutputStream; import java.io.IOException; import java.net.ServerSocket; import java.net.Socket; import java.util.ArrayList; import java.util.Collections; import java.util.HashMap; import java.util.Iterator; import java.util.Map; public class ServerBackground { private ServerSocket serverSocket; private Socket socket; private String msg; private Player player; private GameRoom gameroom; private RoomManage manage = new RoomManage(); private Map<String, DataOutputStream> clientsMap = new HashMap<String, DataOutputStream>(); private Map<String, Player> playerMap = new HashMap<String, Player>(); public void setting() throws IOException { Collections.synchronizedMap(clientsMap); serverSocket = new ServerSocket(7777); gameroom = new GameRoom(); gameroom.setMaxIn(10); gameroom.setRoomId("0"); 
gameroom.setRoomName("lobby"); manage.AddRoom(gameroom); System.out.println("서버를 시작합니다!"); while (true) { socket = serverSocket.accept(); System.out.println(socket.getInetAddress() + "호스트 접속됨."); Receiver receiver = new Receiver(socket); receiver.start(); } } public static void main(String[] args) throws IOException { ServerBackground serverBackground = new ServerBackground(); serverBackground.setting(); } public void addClient(String id, DataOutputStream out) throws IOException { clientsMap.put(id, out); System.out.println(id + " : 추가"); Player player = new Player(); UserDAO userDB = new UserDAO(); UserDTO user = new UserDTO(); user = userDB.getUserData(id); player.setId(user.getId()); player.setNickname(user.getNickname()); player.setWin(user.getWin()); player.setLose(user.getLose()); playerMap.put(id, player); System.out.println(player.getId()+player.getNickname()); } public void removeClient(String id) { clientsMap.remove(id); playerMap.remove(id); } public void sendMessage(String msg) { Iterator<String> it = clientsMap.keySet().iterator(); String key = ""; System.out.println(""); while (it.hasNext()) { key = it.next(); try { clientsMap.get(key).writeUTF(msg); } catch (IOException e) { System.out.println(e.getMessage()); } } } public void createRoom(String gameName, String id){ System.out.println("방생성"); gameroom = new GameRoom(); gameroom.setRoomName(gameName); gameroom.setRoomId(manage.setRoomId()); gameroom.setRoomMaker(playerMap.get(id).getNickname()); manage.AddRoom(gameroom); manage.AddPlayer(gameroom.getRoomId(), playerMap.get(id)); playerMap.get(id).setRoomId(gameroom.getRoomId()); System.out.println(gameroom.getRoomId()); try { clientsMap.get(id).writeUTF("num,1"); } catch (IOException e) { System.out.println(e.getMessage()); } sendRoomInfo(gameroom.getRoomId()); } public void roomjoin(String room_id, Player player){ player.setRoomId(room_id); manage.AddPlayer(room_id, player); for(GameRoom room : manage.getRoomlist()){ 
if(room.getRoomId().equals(room_id)){ for(Player user : room.getPlayerlist()){ System.out.println(user.getId()); try { clientsMap.get(player.getId()).writeUTF("num,2"); } catch (IOException e) { System.out.println(e.getMessage()); } } } } sendRoomInfo(room_id); } public void sendRoomInfo(String room_id){ GameRoom room = manage.getRoomById(room_id); try { for(int i=0;i<room.getPlayerlist().size();i++){ if(room.getPlayerlist().size()==1){ clientsMap.get(room.getPlayerlist().get(i).getId()).writeUTF("roominfo,"+room.getPlayerlist().get(0).getId()+",X"); }else{ clientsMap.get(room.getPlayerlist().get(i).getId()).writeUTF("roominfo,"+room.getPlayerlist().get(0).getId()+","+room.getPlayerlist().get(1).getId()); } } for(int i=0; i< room.getPlayerlist().size();i++){ Player player = room.getPlayerlist().get(i); System.out.println(room.getPlayerlist().size()); if(room.getPlayerlist().size()==1){ clientsMap.get(player.getId()).writeUTF("Readyrefresh,"+room.getReadylist().get(0)+",X"); }else{ clientsMap.get(player.getId()).writeUTF("Readyrefresh,"+room.getReadylist().get(0)+","+room.getReadylist().get(1)); } } } catch (IOException e) { System.out.println(e.getMessage()); } } public void roomExit(String id){ Player player = playerMap.get(id); GameRoom room = manage.getRoomById(player.getRoomId()); manage.playerOut(player); manage.deleteRoom(); } public void ready(String id){ GameRoom room = manage.getRoomById(playerMap.get(id).getRoomId()); room.Ready(); try { for(int i=0; i< room.getPlayerlist().size();i++){ Player player = room.getPlayerlist().get(i); if(player.getId().equals(id)) room.getReadylist().set(i, (room.getReadylist().get(i)+1)%2); } for(int i=0; i< room.getPlayerlist().size();i++){ Player player = room.getPlayerlist().get(i); if(room.getPlayerlist().size()==1){ clientsMap.get(player.getId()).writeUTF("Readyrefresh,"+room.getReadylist().get(0)+",X"); }else{ 
clientsMap.get(player.getId()).writeUTF("Readyrefresh,"+room.getReadylist().get(0)+","+room.getReadylist().get(1)); } } if(room.getReadylist().get(0)==1&&room.getReadylist().get(1)==1){ room.getReadylist().set(0, 0); room.getReadylist().set(1, 0); // 모두 레디 상태이면 > 게임 시작 int rand = (int)(Math.random()*2)+1; String[] whoseTurnList = new String[]{"my","your"}; String[] cards = new String[]{"b1","b2","b3","b4","b5","g1","g2","g3","g4","g5","i1","i2","i3","i4","i5","o1","o2","o3","o4","o5","w1","w2","w3","w4","w5","d1","d2","d3","d4","d5"}; ArrayList<String> u1 = new ArrayList<String>(); ArrayList<String> u2 = new ArrayList<String>(); for(int i=0;i<cards.length;i++) { int r = (int)(Math.random()*30); String tmp = cards[r]; cards[r] = cards[0]; cards[0] = tmp; } for(int i=0;i<15;i++){ u1.add(cards[i]); u2.add(cards[i+15]); } for(int i=0; i< room.getPlayerlist().size();i++){ Player player = room.getPlayerlist().get(i); String mycard = i==0 ? u1.toString().replace(",", "#"): u2.toString().replace(",", "#"); String yourcard = i==0 ? 
u2.toString().replace(",", "#"): u1.toString().replace(",", "#"); clientsMap.get(player.getId()).writeUTF("GameStart,"+whoseTurnList[(rand+i)%2]+","+mycard+","+yourcard); } } } catch (IOException e) { System.out.println(e.getMessage()); } for(Player player : room.getPlayerlist()){ String nickname = ""; for(Player name : room.getPlayerlist()){ nickname = name.getNickname(); if(!nickname.equals(player.getNickname())) break; } if(room.getReady() == 2){ try { clientsMap.get(player.getId()).writeUTF("start,"+nickname); } catch (IOException e) { System.out.println(e.getMessage()); } } } } public void refresh(){ try { Iterator<String> it = clientsMap.keySet().iterator(); String key = ""; String a = ""; for(GameRoom room : manage.getRoomlist()){ a += room.getRoomId(); a += ","; a += room.getRoomName(); a += ","; a += room.getRoomMaker(); a += ","; a += room.getNowIn(); a += ","; } while (it.hasNext()) { key = it.next(); clientsMap.get(key).writeUTF(a); } System.out.println(a); } catch (IOException e) { System.out.println(e.getMessage()); } } // ----------------------------------------------------------------------------- class Receiver extends Thread { private DataInputStream in; private DataOutputStream out; private String nick; public Receiver(Socket socket) throws IOException { out = new DataOutputStream(socket.getOutputStream()); in = new DataInputStream(socket.getInputStream()); nick = in.readUTF(); addClient(nick, out); } public void run() { try { while (in != null) { msg = in.readUTF().trim(); String[] a = msg.split(","); //System.out.println(msg); if(a[0].equals("f5")){ refresh(); } if(a[0].equals("create")){ createRoom(a[1], a[2]); } if(a[0].equals("join")){ Player player = playerMap.get(a[2]); roomjoin(a[1], player); } if(a[0].equals("ready")){ ready(a[1]); } if(a[0].equals("exitroom")){ roomExit(a[1]); if(a[0].equals("eixt")) removeClient(a[1]); } if(a[0].equals("endGame")){ GameRoom room = manage.getRoomById(playerMap.get(a[1]).getRoomId()); for(Player 
tmp:room.getPlayerlist()){ clientsMap.get(tmp.getId()).writeUTF("exitInGame,"); } } if(a[0].equals("sendmycard")){ GameRoom room = manage.getRoomById(playerMap.get(a[1]).getRoomId()); for(Player tmp:room.getPlayerlist()){ if(!tmp.getId().equals(a[1])) clientsMap.get(tmp.getId()).writeUTF("receiveyourcard,"); clientsMap.get(tmp.getId()).writeUTF("nextTurn,"); } } if(a[0].equals("BellNotClick")){ GameRoom room = manage.getRoomById(playerMap.get(a[1]).getRoomId()); if(room.getRoomMaker().equals(playerMap.get(a[1]).getNickname())){ for(Player tmp:room.getPlayerlist()){ clientsMap.get(tmp.getId()).writeUTF("nextTurn,"); } } } if(a[0].equals("bellClick")){ GameRoom room = manage.getRoomById(playerMap.get(a[2]).getRoomId()); String my = a[1].split("#")[0].trim(); String your = a[1].split("#")[1].trim(); if(my.substring(0,1).equals(your.substring(0,1))){ // 같은 종류의 카드이면 if((Integer.parseInt(my.substring(1))+Integer.parseInt(your.substring(1)))==5){ // 개수가 총 5개이면 for(Player tmp:room.getPlayerlist()){ if(tmp.getId().equals(a[2])){ // 벨 누른 본인이면 clientsMap.get(tmp.getId()).writeUTF("getAllCard,"); }else{ // 본인이 아니면 clientsMap.get(tmp.getId()).writeUTF("LoseAllCard,"); } } }else{ // 아닌데도 눌렀으면 for(Player tmp:room.getPlayerlist()){ if(tmp.getId().equals(a[2])){ // 벨 누른 본인이면 clientsMap.get(tmp.getId()).writeUTF("givemycard,"); }else{ // 본인이 아니면 clientsMap.get(tmp.getId()).writeUTF("getyourcard,"); } } } }else{ // 다른 종류의 카드이면 if(Integer.parseInt(my.substring(1))==5 || Integer.parseInt(your.substring(1))==5){ // 둘 중 하나가 5개이면 for(Player tmp:room.getPlayerlist()){ if(tmp.getId().equals(a[2])){ // 벨 누른 본인이면 clientsMap.get(tmp.getId()).writeUTF("getAllCard,"); }else{ // 본인이 아니면 clientsMap.get(tmp.getId()).writeUTF("LoseAllCard,"); } } }else{ // 아닌데도 눌렀으면 for(Player tmp:room.getPlayerlist()){ if(tmp.getId().equals(a[2])){ // 벨 누른 본인이면 clientsMap.get(tmp.getId()).writeUTF("givemycard,"); }else{ // 본인이 아니면 clientsMap.get(tmp.getId()).writeUTF("getyourcard,"); } } } } } 
System.out.println(msg); } } catch (IOException e) { removeClient(nick); } } } } <file_sep>package Client; import java.awt.Image; import java.io.DataInputStream; import java.io.DataOutputStream; import java.io.IOException; import java.net.Socket; import javax.swing.ImageIcon; import javax.swing.JTable; public class ClientBackground { private Socket socket; private DataInputStream in; private DataOutputStream out; private JTable gui; private String msg; private String nickName; private MainFrame main; private int num; public final void setGui(JTable gui) { this.gui = gui; } public final void setFrame(MainFrame main) { this.main = main; } public void connet() { try { socket = new Socket("localhost", 7777); System.out.println("서버 연결됨."); out = new DataOutputStream(socket.getOutputStream()); in = new DataInputStream(socket.getInputStream()); // 접속하자마자 닉네임 전송하면. 서버가 이걸 닉네임으로 인식을 하고서 맵에 집어넣겠지요? out.writeUTF(nickName); System.out.println(nickName + " : 클라이언트 : 메시지 전송완료"); Receiver receiver = new Receiver(socket); receiver.start(); } catch (IOException e) { System.out.println(e.getMessage()); } } public static void main(String[] args) { ClientBackground clientBackground = new ClientBackground(); clientBackground.connet(); } public void sendMessage(String msg2) { try { out.writeUTF(msg2+","+nickName); } catch (IOException e) { System.out.println(e.getMessage()); } } public void setNickname(String nickName) { this.nickName = nickName; } ImageIcon imgicon; Image img; public void Ready(String r1, String r2){ main.Ready(r1, r2); } public void GameStart(String nick){ // Game Start(); System.out.println("게임시작"); if(num == 1){ }else if(num == 2){ } } class Receiver extends Thread { private DataInputStream in; private DataOutputStream out; public Receiver(Socket socket) throws IOException { out = new DataOutputStream(socket.getOutputStream()); in = new DataInputStream(socket.getInputStream()); } public void run() { try { while (in != null) { msg = in.readUTF(); msg.trim(); String 
a[] = msg.split(","); if(a[0].equals("start")){ GameStart(a[1]); }else if(a[0].equals("num")){ num = Integer.parseInt(a[1]); }else if(a[0].equals("Readyrefresh")){ Ready(a[1],a[2]); }else if(a[0].equals("roominfo")){ main.userSet(a[1],a[2]); }else if(a[0].equals("exitInGame")){ main.ExitInGame(); }else if(a[0].equals("GameStart")){ main.initialGameSet(a[1],a[2],a[3]); }else if(a[0].equals("receiveyourcard")){ main.receiveCard(); }else if(a[0].equals("nextTurn")){ main.TurnChange(); }else if(a[0].equals("getAllCard")){ main.getAllCard(); }else if(a[0].equals("LoseAllCard")){ main.loseAllCard(); }else if(a[0].equals("givemycard")){ main.giveMycard(); }else if(a[0].equals("getyourcard")){ main.getYourcard(); }else{ for(int i=0; i<(a.length-4)/4; i++){ gui.setValueAt(a[4+(i*4)], i, 0); gui.setValueAt(a[5+(i*4)], i, 1); gui.setValueAt(a[6+(i*4)], i, 2); gui.setValueAt(a[7+(i*4)], i, 3); } } System.out.println(msg); } } catch (IOException e) { System.out.println(e.getMessage()); } } } } <file_sep>package Dao; import Dto.UserDTO; import java.sql.Connection; import java.sql.DriverManager; import java.sql.PreparedStatement; import java.sql.ResultSet; import java.sql.SQLException; public class UserDAO { private PreparedStatement pstmt; private ResultSet rs; private Connection conn; public UserDAO() { try { String schema = "javahg" ; // 본인 스키마 입력 String dbURL = "jdbc:mysql://127.0.0.1:3306/"+schema+"?serverTimezone=UTC"; String dbID = "root"; String dbPassword = "<PASSWORD>"; // 본인 패스워드 입력 Class.forName("com.mysql.cj.jdbc.Driver"); conn = DriverManager.getConnection(dbURL, dbID, dbPassword); } catch (Exception e) { System.out.println(e.getMessage()); } } public void close(){ try{ pstmt.close(); conn.close(); }catch(SQLException e){ System.out.println(e.getMessage()); } } // ID, PW 받아서 맞으면 True 틀리면 false 반환 public boolean login(String id, String pwd) { String sql = "select password from member where id = ?"; try { pstmt = conn.prepareStatement(sql); pstmt.setString(1, id); rs 
= pstmt.executeQuery(); if (rs.next()) { if (rs.getString(1).equals(pwd)) { rs.close(); return true; } else { rs.close(); return false; } } } catch (SQLException e) { System.out.println(e.getMessage()); } return false; } public UserDTO getUserData(String id) { String sql = "select id, nickname, win, lose, point, date from member where id = ?"; try { pstmt = conn.prepareStatement(sql); pstmt.setString(1, id); rs = pstmt.executeQuery(); rs.next(); UserDTO userinfo = new UserDTO(); userinfo.setId(rs.getString(1)); userinfo.setNickname(rs.getString(2)); userinfo.setWin(rs.getInt(3)); userinfo.setLose(rs.getInt(4)); userinfo.setPoint(rs.getInt(5)); userinfo.setDate(rs.getString(6).split(" ")[0]); rs.close(); return userinfo; } catch (SQLException e) { System.out.println(e.getMessage()); } return null; } public boolean getIDExist(String id){ try { pstmt = conn.prepareStatement("select * from member where id = ?"); pstmt.setString(1, id); rs = pstmt.executeQuery(); if(rs.next()){ rs.close(); return true; }else{ rs.close(); return false; } } catch (SQLException e) { System.out.println(e.getMessage()); } return true; } public boolean getNicknameExist(String nickname){ try { pstmt = conn.prepareStatement("select * from member where nickname = ?"); pstmt.setString(1, nickname); rs = pstmt.executeQuery(); if(rs.next()){ rs.close(); return true; }else{ rs.close(); return false; } } catch (SQLException e) { System.out.println(e.getMessage()); } return true; } public void singUpInsert(String id, String pwd, String nickname) { try { String sql = "insert into member(id, password, nickname) values(?, ?, ?)"; pstmt = conn.prepareStatement(sql); pstmt.setString(1, id); pstmt.setString(2, pwd); pstmt.setString(3, nickname); pstmt.executeUpdate(); } catch (SQLException e) { System.out.println(e.getMessage()); } } //비밀번호 변경 // 찾을 id, 변경할 pwd 인수 public void myPagePwdUpdate(String id, String pwd){ String sql = "update member set password = ? 
where id = ?"; try { pstmt = conn.prepareStatement(sql); pstmt.setString(1, pwd); pstmt.setString(2, id); pstmt.executeUpdate(); } catch (SQLException e) { System.out.println(e.getMessage()); } } public void winUpdate(String id){ String sql = "update member set point=point+100, win=win+1 where id = ?"; try { pstmt = conn.prepareStatement(sql); pstmt.setString(1, id); pstmt.executeUpdate(); } catch (SQLException e) { System.out.println(e.getMessage()); } } public void loseUpdate(String id){ String sql = "update member set point=point+50, lose=lose+1 where id = ?"; try { pstmt = conn.prepareStatement(sql); pstmt.setString(1, id); pstmt.executeUpdate(); } catch (SQLException e) { System.out.println(e.getMessage()); } } }
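The server's `bellClick` branch decides each round from the two face-up cards, encoded as strings like `"b3"` (kind `b`, count 3): with matching kinds the bell press wins when the counts sum to 5, with different kinds it wins when either card alone shows 5. A pure-Python sketch of that rule, using the same card encoding (the `bell_wins` name is illustrative):

```python
def bell_wins(my_card, your_card):
    """True if pressing the bell is correct for this pair of cards.

    Cards use the server's encoding, e.g. "b3" = kind 'b', count 3.
    Same kind: the counts must sum to exactly 5.
    Different kinds: either card alone must show 5.
    """
    my_kind, my_n = my_card[0], int(my_card[1:])
    your_kind, your_n = your_card[0], int(your_card[1:])
    if my_kind == your_kind:
        return my_n + your_n == 5
    return my_n == 5 or your_n == 5
```

Centralizing the rule in one predicate like this would also let the server's four response branches (`getAllCard`/`LoseAllCard` vs. `givemycard`/`getyourcard`) collapse to a single win/lose decision.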
9b4e5d3b4bad72319ffbb12865a7e196e6583cac
[ "Markdown", "Java" ]
4
Markdown
oh4842/ProjectHG
27f53ed5441ca996edded867b6e99c0d4fe52b94
6d3812f6610b66510e6e955f4ed66badb8f0d705
refs/heads/master
<repo_name>nippe/multiplikationstabellerna<file_sep>/app.js
const tables = [3, 4];
const NUMBER_OF_PROBLEMS = 25;

const getRandomInt = (min, max) => {
  min = Math.ceil(min);
  max = Math.floor(max);
  return Math.floor(Math.random() * (max - min)) + min; // The maximum is exclusive and the minimum is inclusive
};

console.log('Tabeller att öva på är:');
for (let i = 0; i < tables.length; i++) {
  console.log(` - ${tables[i]}:ans tabell`);
}
console.log('');
console.log('');

for (let i = 0; i < NUMBER_OF_PROBLEMS; i++) {
  const multiplier = getRandomInt(0, 10);
  // Pick the table from the configured `tables` list instead of the
  // hard-coded range getRandomInt(3, 5), so editing `tables` above
  // actually changes which tables are practiced.
  const table = tables[getRandomInt(0, tables.length)];
  console.log(`${table} * ${multiplier} = ?`);
}
2bc786e03b11b775c24f927b72c3eabedd213c4b
[ "JavaScript" ]
1
JavaScript
nippe/multiplikationstabellerna
27d69c865c328197d3357c975dc3c972a63d1d87
c3e03d6c7a4605557d94af3df03d8969cdf46766
refs/heads/master
<file_sep>import Foundation

struct Matrix: CustomStringConvertible {
    let rows: Int, columns: Int
    var grid: [Double]

    var description: String {
        var thing: String {
            var temp = ""
            for (index, item) in self.grid.enumerated() {
                if (index % self.columns == 0) {
                    temp += "\n\t"
                }
                temp += "\(NSString(format: "%.3f", item)) "
            }
            return temp
        }
        return "\n\(rows)r x \(columns)c matrix \(thing)\n"
    }

    init(rows: Int, columns: Int) {
        self.rows = rows
        self.columns = columns
        grid = Array(repeating: 0.0, count: rows * columns)
    }

    init(rows: Int, columns: Int, data: [Double]) {
        assert(data.count == rows * columns)
        self.rows = rows
        self.columns = columns
        grid = data
    }

    mutating func randomize() {
        grid = grid.map { _ in Double.random }
    }

    func indexIsValid(row: Int, column: Int) -> Bool {
        return row >= 0 && row < rows && column >= 0 && column < columns
    }

    subscript(row: Int, column: Int) -> Double {
        get {
            assert(indexIsValid(row: row, column: column), "Index out of range")
            return grid[(row * columns) + column]
        }
        set {
            assert(indexIsValid(row: row, column: column), "Index out of range")
            grid[(row * columns) + column] = newValue
        }
    }

    static func *(first: Matrix, second: Matrix) -> Matrix {
        assert(first.columns == second.rows) // row size == column size
        var outputData = Array(repeating: 0.0, count: first.rows * second.columns)
        let outputRows = first.rows
        let outputCols = second.columns
        for i in 0..<outputData.count {
            let row = i / (outputCols)
            let col = i % (outputCols)
            for j in 0..<first.columns {
                outputData[i] += first[row, j] * second[j, col]
            }
        }
        return Matrix(rows: outputRows, columns: outputCols, data: outputData)
    }

    static func *(first: Matrix, second: Vector) -> Vector {
        assert(first.columns == second.rows)
        let secondToMatrix = Matrix(rows: second.rows, columns: 1, data: second.values)
        let outputMatrix = first * secondToMatrix
        return Vector(rows: first.rows, data: outputMatrix.grid)
    }

    static func ==(first: Matrix, second: Matrix) -> Bool {
        return first.grid == second.grid && first.rows == second.rows && first.columns == second.columns
    }

    static func ==(first: Matrix, second: Vector) -> Bool {
        return first.grid == second.values && first.rows == second.rows && first.columns == 1
    }
}

struct Vector: CustomStringConvertible {
    let rows: Int
    var values: [Double]

    var description: String {
        var thing: String {
            var temp = ""
            for item in self.values {
                temp += "\t\(NSString(format: "%.3f", item))\n"
            }
            return temp
        }
        return "\n\(rows)r vector\n\(thing)\n"
    }

    init(rows: Int) {
        self.rows = rows
        values = Array(repeating: 0.0, count: rows)
    }

    init(rows: Int, data: [Double]) {
        self.rows = rows
        values = data
    }

    mutating func randomize() {
        values = values.map { _ in Double.random }
    }

    mutating func sigmoid() {
        values = values.map { $0.sigmoid }
    }

    func indexIsValid(index: Int) -> Bool {
        return index >= 0 && index < rows
    }

    subscript(index: Int) -> Double {
        get {
            assert(indexIsValid(index: index), "Index out of range")
            return values[index]
        }
        set {
            assert(indexIsValid(index: index), "Index out of range")
            values[index] = newValue
        }
    }

    static func +(first: Vector, second: Vector) -> Vector {
        assert(first.rows == second.rows)
        var output = Vector(rows: first.rows)
        for i in 0..<output.rows {
            output.values[i] = first.values[i] + second.values[i]
        }
        return output
    }

    static func ==(first: Vector, second: Vector) -> Bool {
        return first.values == second.values && first.rows == second.rows
    }

    static func ==(first: Vector, second: Matrix) -> Bool {
        return first.values == second.grid && first.rows == second.rows && second.columns == 1
    }
}

class Network {
    let num_layers: Int
    let sizes: [Int]
    var weights: [Matrix] = []
    var biases: [Vector] = []
    var layers: [Vector] = []

    init(sizes: [Int]) {
        num_layers = sizes.count
        self.sizes = sizes
        weights = initWeights()
        biases = initBiases()
        layers = initLayers()
    }

    func initWeights() -> [Matrix] {
        var array: [Matrix] = []
        for size_index in 0..<num_layers - 1 {
            var newMatrix = Matrix(rows: sizes[size_index + 1], columns: sizes[size_index])
            newMatrix.randomize()
            array.append(newMatrix)
        }
        return array
    }

    func initBiases() -> [Vector] {
        var array: [Vector] = []
        for size_index in 1..<num_layers {
            var newVector = Vector(rows: sizes[size_index])
            newVector.randomize()
            array.append(newVector)
        }
        return array
    }

    func initLayers() -> [Vector] {
        var array: [Vector] = []
        for size_index in 0..<num_layers {
            array.append(Vector(rows: sizes[size_index]))
        }
        return array
    }

    func feedForward(input: [Double]) -> [Double] {
        assert(input.count == sizes[0])
        assert((input.filter { $0 > 1.0 || $0 < 0.0 }).count == 0)
        layers[0] = Vector(rows: sizes[0], data: input)
        print("Set input data")
        for layerIndex in 0..<num_layers - 1 {
            print("\nStarting to compute layer \(layerIndex + 1)")
            print("Weights: \(weights[layerIndex])")
            print("Layer: \(layers[layerIndex])")
            let multiplicationResult = weights[layerIndex] * layers[layerIndex]
            var addBias = multiplicationResult + biases[layerIndex]
            addBias.sigmoid()
            layers[layerIndex + 1] = addBias
        }
        return layers[num_layers - 1].values
    }
}

extension Double {
    public static var random: Double {
        return Double(arc4random()) / Double(UINT32_MAX)
    }

    public static func random(min: Double, max: Double) -> Double {
        return Double.random * (max - min) + min
    }

    public var sigmoid: Double {
        return 1.0 / (1.0 + exp(self * -1))
    }
}

// UNIT TESTS
assert(
    Matrix(rows: 3, columns: 3, data: [1, 2, 3, 4, 5, 6, 7, 8, 9])
        * Vector(rows: 3, data: [0, 1, 0])
        == Vector(rows: 3, data: [2, 5, 8]),
    "Unit Test Failed")

assert(
    Matrix(rows: 2, columns: 3, data: [1, 2, 3, 4, 5, 6])
        * Matrix(rows: 3, columns: 2, data: [7, 8, 9, 10, 11, 12])
        == Matrix(rows: 2, columns: 2, data: [58, 64, 139, 154]),
    "Unit Test Failed")

// FEED FORWARD PROGRAM
var network = Network(sizes: [6, 3, 3, 1])
let result = network.feedForward(input: [0.9, 0.8, 0.6, 0.3, 0.1, 0.1])
print("Result:\n\(result)")
<file_sep>### A neural net in Swift.
I'm trying to understand how neural networks function, using resources like [this](http://neuralnetworksanddeeplearning.com/chap1.html) and [this](https://www.youtube.com/playlist?list=PLZHQObOWTQDNU6R1_67000Dx_ZCJB-3pi). Once I thought I understood what was going on, I wanted to see whether I could build a neural network in Swift. Most of the code defines Matrix and Vector structs that can be created and operated on, mostly so I could multiply, add, and compare vectors and matrices. The rest sets up layers, weights, and biases in a Network class. The Network.feedForward method runs a single forward pass of the network for a given set of inputs. Right now, weights and biases are initialized to random doubles between 0.0 and 1.0. Eventually I want to implement backpropagation and learning. If you stumble upon this code and want to contribute, please feel free to send me either critiques of my understanding of neural networks or critiques of the "swiftiness" of my code. Please don't send me pull requests.
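The forward pass this README describes — multiply the previous layer by a weight matrix, add the bias vector, apply the sigmoid — can be sketched independently of the Swift code. The Python below is an illustration of the same algorithm, not code from this repo; the names (`sigmoid`, `feed_forward`) and the fixed example weights are mine, chosen so the run is reproducible:

```python
import math

def sigmoid(x):
    # logistic activation, same formula as the Swift Double.sigmoid extension
    return 1.0 / (1.0 + math.exp(-x))

def feed_forward(weights, biases, inputs):
    """One forward pass: for each layer, activation = sigmoid(W @ a + b).

    weights is a list of matrices (each a list of rows); biases is a
    list of vectors, one per non-input layer.
    """
    activation = inputs
    for W, b in zip(weights, biases):
        activation = [
            sigmoid(sum(w_ij * a_j for w_ij, a_j in zip(row, activation)) + b_i)
            for row, b_i in zip(W, b)
        ]
    return activation

# A tiny 2-3-1 network with fixed (not random) parameters.
weights = [
    [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],  # 3x2: input -> hidden
    [[0.7, 0.8, 0.9]],                     # 1x3: hidden -> output
]
biases = [[0.0, 0.0, 0.0], [0.0]]
out = feed_forward(weights, biases, [1.0, 0.5])
print(out)  # single output activation in (0, 1)
```

Because every activation passes through the sigmoid, each layer's outputs stay in (0, 1), which is why the Swift `feedForward` can assert its inputs lie in that range.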
31e15dae8208061c0b1c5c9e79a0c24b0d815f70
[ "Swift", "Markdown" ]
2
Swift
jameslittle230/swift-neural-network
22ff0bd385072b4736b7042b945b2abc54c3d63d
778524a1efbef8dcc26eddac053cde0d2b6efe99
refs/heads/master
<repo_name>c24t/minimal-dotfiles<file_sep>/zshrc
ZSHRC=$HOME/.zshrc
export ALIASES=$HOME/.aliases
export AUTO_ALIASES=$ALIASES/auto
PYTHONRC=$HOME/.pythonrc
export EDITOR=vim

autoload -Uz compinit && compinit
autoload -U promptinit && promptinit
autoload -U colors && colors

# Lines configured by zsh-newuser-install
HISTFILE=$HOME/.history
HISTSIZE=10000
SAVEHIST=1000000
setopt autocd extendedglob
unsetopt beep
bindkey -v
# End of lines configured by zsh-newuser-install

# source ~/.aliases/**/*
for file in $(find $ALIASES -follow); do source $file; done

# edit this file, and source it
alias vizsh="vim $ZSHRC && source $ZSHRC"

# add an alias
aaa () {
	echo "alias $1=\"$2\"" >> $AUTO_ALIASES
	source $AUTO_ALIASES
	which $1
}

######### setopts ##############################################################
setopt INC_APPEND_HISTORY
setopt EXTENDED_HISTORY
setopt HIST_IGNORE_DUPS
setopt HIST_IGNORE_SPACE
setopt HIST_REDUCE_BLANKS
setopt ALWAYS_TO_END
setopt AUTO_NAME_DIRS
setopt AUTO_PUSHD
setopt NO_BEEP
setopt HIST_FIND_NO_DUPS
setopt EXTENDED_HISTORY
setopt HIST_EXPIRE_DUPS_FIRST # gets slow with a big history
setopt HIST_VERIFY
# clear rprompt on new commands, thank god for this
setopt transientrprompt

######### bindkeys #############################################################
bindkey -M viins '^?' backward-delete-char
bindkey -M viins '^H' backward-delete-char
bindkey "^[[A" history-beginning-search-backward
bindkey "^[[B" history-beginning-search-forward
# :nmap <Space> i
bindkey -M vicmd ' ' vi-insert

export LESS="-iFRX"

bindkey "^R" history-incremental-search-backward
bindkey "^T" push-line-or-edit

# `man zshcompsys`
# The following lines were added by compinstall
zstyle ':completion:*' completer _expand _complete _ignored _match _approximate
zstyle ':completion:*' completions 1
zstyle ':completion:*' format 'completing %d'
zstyle ':completion:*' glob 1
zstyle ':completion:*' group-name ''
zstyle ':completion:*' list-colors ''
zstyle ':completion:*' list-prompt %SAt %p: Hit TAB for more, or the character to insert%s
zstyle ':completion:*' matcher-list '' 'm:{[:lower:][:upper:]}={[:upper:][:lower:]} r:|[._-]=** r:|=** l:|=*'
zstyle ':completion:*' match-original both
zstyle ':completion:*' max-errors 3
zstyle ':completion:*' prompt '%e%e%e'
zstyle ':completion:*' substitute 1
#zstyle :compinstall filename '$HOME/compinstalled'
# End of lines added by compinstall

source $PYTHONRC

# vim:ft=zsh:noet:sw=2 ts=2
<file_sep>/script/uniq.py
#! /usr/bin/env python
import fileinput
import sys

if __name__ == "__main__":
    lines = set()
    for line in fileinput.input():
        if line not in lines:
            sys.stdout.write(line)
            lines.add(line)
<file_sep>/aliases/essentials
alias ls="ls --color=auto"
alias ll="ls -GFlh"
alias l="ll"
alias la="ls -lah"
alias lll="la"
alias t="tree"
alias g="grep"
alias psa="ps aux"
alias rmrf="rm -rf"
alias pull="git pull"
alias pul=pull
alias push="git push"

# dates
alias longstardate="date +%Y%m.%d.%H%M"
alias shortdate="date +%d%b"
alias epoch="date +%s"

# tmux
alias tl="tmux ls"
alias tt="tmux attach -t"

# findfile, findfilefollow
ff () { find . -type f -iname "*$1*" | grep -i "$1" }
fff () { find . -type f -iname "*$1*" -follow | grep -i "$1"}
fd () { find . -type d -iname "*$1*" | grep -i "$1" }
fdf () { find . -type d -iname "*$1*" -follow | grep -i "$1"}

# grepall, grepallfollow
ga () {find . -type f -print0 | xargs -0 egrep "$1" | echo "balls"}
gaa () {find . -type f -follow -print0 | xargs -0 egrep "$1"}

# grepall select by suffix
gas () {find . -type f -name "*.$2" -print0 | xargs -0 egrep -n "$1" --color=auto}
gaas () {find . -type f -follow -name "*.$2" -print0 | xargs -0 egrep -n "$1" --color=auto}

# `ps aux` cut -- get the pid
psac () {awk '{print $2}'}

alias s="screen -A -fa -U -D -R"
alias sls="screen -ls"

mcd () { mkdir -p $1 && cd $1 }

# dir stack management
alias d="dirs -lp"
alias pd="popd -q"
alias cdd="cd -P"

# git
alias gg="git grep"
alias gbh="git rev-parse --short HEAD"
alias gbhh="git rev-parse --short HEAD\^"
alias gbhhh="git rev-parse --short HEAD\^\^"
alias stash="git stash"
alias glg="git lg"
alias gbn="git symbolic-ref HEAD --short"
alias gg="git grep"

# archeology
hf () { < $HISTFILE | grep -i $1 | cut -d ';' -f 2- | ~/script/uniq.py | grep -i --color=auto $1 }

alias vialiases="( cd $ALIASES && vim essentials ) && source $ZSHRC"

# vim:ft=zsh:noet:sw=2 ts=2
<file_sep>/setup.zsh
#!/usr/bin/env zsh
sudo chsh -s $(which zsh) $USER

# go install coreutils if this line fails on OSX
dotdir=$(readlink -f $(pwd))
backupdir=dotfiles.$(date +%s).backup

function dotrelink () {
	if [[ -a $2 ]]; then mv -v $2 $backupdir; fi
	ln -svf $dotdir/$1 $2
}

(
	cd $HOME
	mkdir $backupdir
	mkdir -p .vim/swap/
	mkdir -p .vim/undo/
	dotrelink zshrc .zshrc
	dotrelink vimrc .vimrc
	dotrelink pythonrc .pythonrc
	dotrelink screenrc .screenrc
	dotrelink tmuxrc .tmux.conf
	dotrelink inputrc .inputrc
	mkdir -p .aliases
	mkdir -p script
	dotrelink script/uniq.py script/uniq.py
	mkdir -p .vim/autoload
	mkdir -p .vim/bundle
	dotrelink vim-pathogen/autoload/pathogen.vim .vim/autoload/pathogen.vim
	dotrelink vim-plugins/vim-fugitive .vim/bundle/vim-fugitive
	dotrelink vim-plugins/vim-surround .vim/bundle/vim-surround
	dotrelink vim-plugins/vim-commentary .vim/bundle/vim-commentary
	dotrelink vim-plugins/vim-unimpaired .vim/bundle/vim-unimpaired
	dotrelink aliases/essentials .aliases/essentials
	dotrelink aliases/auto .aliases/auto
	source .zshrc
)
221bb83d7c3f41608bfb67346a46bd341806e11c
[ "Python", "Shell" ]
4
Shell
c24t/minimal-dotfiles
f2293f7f356ee9672c205412c9fd7ff7ebec9dfb
d35f0419a963d64f390cb766cc11709c8768c4b0
refs/heads/master
<file_sep>package jayms.json;

import java.io.File;
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.math.BigInteger;
import java.text.DecimalFormatSymbols;
import java.util.ArrayList;
import java.util.Set;

public final class Utils {

    // static: Utils is never instantiated, so an inner (non-static) class would be unusable
    public static class Tuple<A, B> {

        private A a;
        private B b;

        public Tuple(A a, B b) {
            this.a = a;
            this.b = b;
        }

        public void setA(A a) { this.a = a; }
        public void setB(B b) { this.b = b; }
        public A getA() { return a; }
        public B getB() { return b; }
    }

    public static String toJSONString(Object value) {
        String result = "";
        if (value == null) {
            result = "null";
        } else if (value instanceof Boolean) {
            result = ((boolean) value) ? "true" : "false";
        } else if (value instanceof String) {
            result += "\"" + (String) value + "\"";
        } else if (value instanceof Character) {
            result += "\"" + ((Character) value).toString() + "\"";
        } else if (value instanceof Number) {
            result += value;
        } else if (value.getClass().isArray()) {
            result += new JSONArray(convertObjectToArray(value)).toJSONString();
        } else if (value instanceof JSONArray) {
            result += ((JSONArray) value).toJSONString();
        } else {
            result += new JSONObject(value).toJSONString();
        }
        return result;
    }

    public static String[] bools = { "false", "true" };

    public static String doubleRegExp = "[\\x00-\\x20]*[+-]?(((((\\p{Digit}+)(\\.)?((\\p{Digit}+)?)([eE][+-]?(\\p{Digit}+))?)|(\\.((\\p{Digit}+))([eE][+-]?(\\p{Digit}+))?)|(((0[xX](\\p{XDigit}+)(\\.)?)|(0[xX](\\p{XDigit}+)?(\\.)(\\p{XDigit}+)))[pP][+-]?(\\p{Digit}+)))[fFdD]?))[\\x00-\\x20]*";

    public static JSONParser parser = new JSONParser();

    public static Object fromJSONString(String json) {
        if (startsAndEnds(json, '"', '"')) {
            return json.substring(1, json.length() - 1);
        } else if (isNumeric(json)) {
            if (json.contains(Character.toString(getDecimalSeperator()))) {
                if (json.matches(doubleRegExp)) {
                    return Double.parseDouble(json);
                } else {
                    return Float.parseFloat(json);
                }
            }
            BigInteger bigInt = new BigInteger(json);
            long val = bigInt.longValue();
            if (val < Integer.MAX_VALUE) {
                return bigInt.intValue();
            } else {
                return val;
            }
        } else if (json.equals("false") || json.equals("true")) {
            return json.equals("true") ? true : false;
        } else if (json.equals("null")) {
            return null;
        } else if (startsAndEnds(json, '{', '}')) {
            return parser.parseJSONObject(json);
        } else if (startsAndEnds(json, '[', ']')) {
            return parser.parseJSONArray(json);
        } else {
            System.out.println("THROWING: " + json);
            throw new IllegalArgumentException("In-valid JSON parsed!");
        }
    }

    public static void invalidJSON() {
        throw new IllegalArgumentException("In-valid JSON parsed!");
    }

    public static char getMinusSign() {
        DecimalFormatSymbols symbols = DecimalFormatSymbols.getInstance();
        return symbols.getMinusSign();
    }

    public static char getDecimalSeperator() {
        DecimalFormatSymbols symbols = DecimalFormatSymbols.getInstance();
        return symbols.getDecimalSeparator();
    }

    public static boolean isNumeric(String str) {
        char minusSymbol = getMinusSign();
        if (str.charAt(0) == minusSymbol) {
            str = str.substring(1, str.length());
        }
        boolean dpFound = false;
        char dpChar = getDecimalSeperator();
        for (int i = 0; i < str.length(); i++) {
            char c = str.charAt(i);
            if (c == dpChar) {
                if (dpFound) {
                    return false;
                }
                dpFound = true; // was missing: without it a second separator was never rejected
                continue;
            }
            if (!Character.isDigit(c)) {
                return false;
            }
        }
        return true;
    }

    public static boolean startsAndEnds(String str, char start, char end) {
        if (str == null || str.isEmpty()) { // was str == "", which compares references, not content
            return false;
        }
        return str.charAt(0) == start && str.charAt(str.length() - 1) == end;
    }

    public static Object[] convertObjectToArray(Object o) {
        if (o.getClass().isArray()) {
            Class ofArray = o.getClass().getComponentType();
            if (ofArray.isPrimitive()) {
                ArrayList ar = new ArrayList();
                for (int i = 0; i < Array.getLength(o); i++) {
                    ar.add(Array.get(o, i));
                }
                return ar.toArray();
            } else {
                return (Object[]) o;
            }
        } else {
            throw new IllegalArgumentException("Tried to pass non-array object to 'convertObjectToArray(Object o)' method!");
        }
    }

    public static Field[] getAllFields(Object o, boolean includeStatic) {
        Class<?> objClass = o.getClass();
        Field[] objFields = getNonStaticFields(objClass);
        Class<?> current = objClass;
        ArrayList<Field> superFields = new ArrayList<>();
        while (current.getSuperclass() != null) {
            Class<?> superClass = current.getSuperclass();
            Field[] tempFields = getNonStaticFields(superClass);
            for (int i = 0; i < tempFields.length; i++) {
                superFields.add(tempFields[i]);
            }
            current = superClass;
        }
        Field[] result = new Field[objFields.length + superFields.size()];
        for (int i = 0; i < objFields.length; i++) {
            result[i] = objFields[i];
        }
        int baseInd = objFields.length; // was objFields.length - 1, which overwrote the last own field
        for (int i = 0; i < superFields.size(); i++) {
            result[baseInd + i] = superFields.get(i);
        }
        return result;
    }

    public static Field[] getNonStaticFields(Class<?> clazz) {
        ArrayList<Field> result = new ArrayList<>();
        Field[] fields = clazz.getDeclaredFields();
        for (int i = 0; i < fields.length; i++) {
            Field field = fields[i];
            if (Modifier.isStatic(field.getModifiers())) {
                continue;
            }
            result.add(field);
        }
        return result.toArray(new Field[result.size()]);
    }

    public static String getExtension(File f) {
        String relPath = f.getPath();
        int ind = relPath.lastIndexOf(".");
        return relPath.substring(ind, relPath.length());
    }

    public static boolean keysMatchToFields(Class<?> clazz, Set<String> keys) {
        Field[] fields = getNonStaticFields(clazz);
        for (Field f : fields) {
            if (!keys.contains(f.getName().toLowerCase())) {
                return false;
            }
        }
        return true;
    }

    public static void checkNull(Object value) {
        if (value == null) {
            throw new IllegalArgumentException("This key doesn't map to a value!");
        }
    }
}
<file_sep>package jayms.json.serial;

import java.io.File;

import jayms.json.JSONFile;

public class JSONDeserializer {

    private JSONFile file;

    public JSONDeserializer(File file) {
        this(new JSONFile(file));
    }

    public JSONDeserializer(JSONFile file) {
        this.file = file;
    }

    public <T> T deserialize(DeserializerTransformer<T> transformer) {
        return transformer.transform(file.read());
    }
}
<file_sep>package jayms.json.serial;

import jayms.json.JSONObject;

public interface SerializerTransformer<T> {

    JSONObject transform(T obj);
}
<file_sep>package jayms.json;

import java.lang.reflect.Field;
import java.util.HashMap;
import java.util.LinkedHashMap;
import java.util.Map.Entry;
import java.util.Set;
import java.util.UUID;

public class JSONObject extends LinkedHashMap<String, Object> implements JSONComponent {

    public JSONObject() {
    }

    public JSONObject(Object o) {
        if (o instanceof JSONObject) {
            HashMap<String, Object> map = (HashMap<String, Object>) o;
            for (Entry<String, Object> entry : map.entrySet()) {
                put(entry.getKey(), entry.getValue());
            }
            return;
        }
        if (o.getClass().isArray()) {
            throw new IllegalArgumentException("Tried to pass array to JSONObject! Use JSONArray for arrays!");
        }
        if (o.getClass().isPrimitive() || o instanceof String) {
            throw new IllegalArgumentException("Tried to pass primitive type or string to JSONObject!");
        }
        Field[] fields = Utils.getAllFields(o, false);
        for (Field f : fields) {
            String key = f.getName();
            Object value = null;
            try {
                f.setAccessible(true);
                value = f.get(o);
            } catch (Exception e) {
                e.printStackTrace();
            }
            put(key, value);
        }
    }

    public JSONObject(JSONObject o) {
        this((HashMap<String, Object>) o);
    }

    public JSONObject(HashMap<String, Object> map) {
        for (Entry<String, Object> entry : map.entrySet()) {
            put(entry.getKey(), entry.getValue());
        }
    }

    public UUID getUUID(Object key) {
        Object resultObj = this.get(key);
        if (resultObj == null) {
            return null;
        }
        if (!(resultObj instanceof String) && !(resultObj instanceof UUID)) {
            return null;
        }
        return resultObj instanceof UUID ? ((UUID) resultObj) : UUID.fromString((String) resultObj);
    }

    public String getString(Object key) {
        Object resultObj = this.get(key);
        if (resultObj == null) {
            return null;
        }
        if (!(resultObj instanceof String)) {
            return null;
        }
        return (String) resultObj;
    }

    public double getDouble(Object key) {
        Object resultObj = this.get(key);
        if (resultObj == null) {
            return Double.NaN;
        }
        if (!(resultObj instanceof Double)) {
            return Double.NaN;
        }
        return (double) resultObj;
    }

    public boolean getBoolean(Object key) {
        Object resultObj = this.get(key);
        if (resultObj == null) {
            return false;
        }
        if (!(resultObj instanceof String) && !(resultObj instanceof Boolean)) {
            return false;
        }
        return resultObj instanceof String ? Boolean.parseBoolean((String) resultObj) : (boolean) resultObj;
    }

    public <T> void replaceAsObject(String key, Class<T> clazz) {
        Object obj = get(key);
        if (!(obj instanceof JSONObject)) {
            throw new IllegalArgumentException("Tried to replace as object at key: " + key + " but that is not a JSONObject!");
        }
        put(key, ((JSONObject) obj).toObject(clazz));
    }

    public <T> T toObject(Class<T> clazz) {
        try {
            T result = clazz.newInstance();
            Field[] fields = Utils.getNonStaticFields(clazz);
            for (int i = 0; i < fields.length; i++) {
                Field f = fields[i];
                String fn = f.getName();
                if (!this.containsKey(fn)) {
                    System.out.println("Doesn't contain everything!");
                    return null;
                }
                f.setAccessible(true);
                Object toSet = get(fn);
                // TODO HANDLE JSONOBJECT
                f.set(result, toSet);
            }
            return result;
        } catch (InstantiationException | IllegalAccessException e) {
            e.printStackTrace();
        }
        return null;
    }

    @Override
    public String toJSONString() {
        StringBuilder sb = new StringBuilder();
        sb.append("{");
        Set<Entry<String, Object>> entries = this.entrySet();
        int iteration = 1;
        for (Entry<String, Object> entry : entries) {
            String toAppend = "\"" + entry.getKey() + "\":";
            Object value = entry.getValue();
            toAppend += Utils.toJSONString(value);
            if (!(iteration >= entries.size())) {
                toAppend += ",";
            }
            sb.append(toAppend);
            iteration++;
        }
        sb.append("}");
        return sb.toString();
    }
}
<file_sep>package jayms.json;

import java.util.ArrayList;

public class JSONParser {

    public JSONParser() {
    }

    public JSONObject parseJSONObject(String json) {
        JSONObject result = new JSONObject();
        if (!Utils.startsAndEnds(json, '{', '}')) {
            Utils.invalidJSON();
        }
        json = json.substring(1, json.length() - 1);
        String[] entries = breakToJSONString(json);
        for (int i = 0; i < entries.length; i++) {
            String entry = entries[i];
            int ind = 0;
            String key = null;
            String value = null;
            boolean gettingKey = false;
            boolean gotKey = false;
            ArrayList<Character> keyChars = new ArrayList<>();
            while (ind < entry.length()) {
                char c = entry.charAt(ind);
                ind++;
                if (c == '"' && !gotKey) {
                    if (gettingKey) {
                        StringBuilder sb = new StringBuilder();
                        for (int k = 0; k < keyChars.size(); k++) {
                            sb.append(keyChars.get(k));
                        }
                        key = sb.toString();
                        gettingKey = false;
                        gotKey = true;
                        c = entry.charAt(ind);
                        if (c != ':') {
                            Utils.invalidJSON();
                        }
                    } else {
                        gettingKey = true;
                    }
                    continue;
                }
                if (gettingKey) {
                    keyChars.add(c);
                    continue;
                }
                if (gotKey) {
                    if (c == ':') {
                        continue;
                    }
                    value = entry.substring(ind - 1, entry.length());
                    break;
                }
            }
            result.put(key, Utils.fromJSONString(value));
        }
        return result;
    }

    private String[] breakToJSONString(String json) {
        int ind = 0;
        ArrayList<String> resultList = new ArrayList<>();
        ArrayList<Character> built = new ArrayList<>();
        boolean dontBreakCurl = false;
        boolean dontBreakSquare = false;
        while (ind < json.length()) {
            char c = json.charAt(ind);
            ind++;
            switch (c) {
            case '{':
                dontBreakCurl = true;
                break;
            case '}':
                dontBreakCurl = false;
                break;
            case '[':
                dontBreakSquare = true;
                break;
            case ']':
                dontBreakSquare = false;
                break;
            }
            if (c == ',' && !dontBreakCurl && !dontBreakSquare) {
                appendBuiltToResult(built, resultList);
            } else {
                built.add(c);
                if (ind == json.length()) {
                    appendBuiltToResult(built, resultList);
                }
            }
        }
        return resultList.toArray(new String[resultList.size()]);
    }

    private void appendBuiltToResult(ArrayList<Character> built, ArrayList<String> resultList) {
        StringBuilder sb = new StringBuilder();
        for (Character chara : built) {
            sb.append(chara);
        }
        built.clear();
        resultList.add(sb.toString());
    }

    public JSONArray parseJSONArray(String json) {
        JSONArray result = new JSONArray();
        if (!Utils.startsAndEnds(json, '[', ']')) {
            Utils.invalidJSON();
        }
        json = json.substring(1, json.length() - 1);
        String[] values = breakToJSONString(json);
        for (int i = 0; i < values.length; i++) {
            result.add(Utils.fromJSONString(values[i]));
        }
        return result;
    }
}
<file_sep>package jayms.json.node;

import java.util.Collections;
import java.util.Map;

import jayms.json.Utils;

public class JSONNode {

    private JSONNode parent;
    private String name;
    private Object value;

    public JSONNode(JSONNode parent, String name, Object value) {
        this.parent = parent;
        this.name = name;
        this.value = value;
    }

    public String getName() {
        return name;
    }

    public Object getValue() {
        return value;
    }

    public void addChild(JSONNode JSONNode) {
        if (hasChild(JSONNode.getName())) {
            throw new IllegalArgumentException("A JSONNode with this name is aleady a child of this JSONNode!");
        }
        getChildNodes().put(JSONNode.getName(), JSONNode);
    }

    public void removeChild(String JSONNodeName) {
        if (hasChild(JSONNodeName)) {
            getChildNodes().remove(JSONNodeName);
        }
    }

    public boolean hasChild(String JSONNodeName) {
        if (!isParentNode()) {
            return false;
        }
        return getChildNodes().containsKey(JSONNodeName);
    }

    public JSONNode childNode(String JSONNodeName) {
        if (!isParentNode()) {
            throw new IllegalArgumentException("Can't get child JSONNode because this JSONNode isn't a parent!");
        }
        Map<String, JSONNode> children = getChildNodes();
        if (!children.containsKey(JSONNodeName)) {
            throw new IllegalArgumentException("Can't get child JSONNode because the requested JSONNode isn't a child of this JSONNode!");
        }
        return children.get(JSONNodeName);
    }

    public double getDoubleValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Double)) { // was instanceof String, which made the (double) cast below fail
            throw new IllegalArgumentException("Failed to cast value to double.");
        }
        return (double) result;
    }

    public float getFloatValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Float)) {
            throw new IllegalArgumentException("Failed to cast value to float.");
        }
        return (float) result;
    }

    public boolean getBooleanValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Boolean)) {
            throw new IllegalArgumentException("Failed to cast value to boolean.");
        }
        return (boolean) result;
    }

    public char getCharValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Character)) {
            throw new IllegalArgumentException("Failed to cast value to char.");
        }
        return (char) result;
    }

    public byte getByteValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Byte)) {
            throw new IllegalArgumentException("Failed to cast value to byte.");
        }
        return (byte) result;
    }

    public long getLongValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Long)) {
            throw new IllegalArgumentException("Failed to cast value to long.");
        }
        return (long) result;
    }

    public short getShortValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Short)) {
            throw new IllegalArgumentException("Failed to cast value to short.");
        }
        return (short) result;
    }

    public int getIntegerValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof Integer)) {
            throw new IllegalArgumentException("Failed to cast value to int.");
        }
        return (int) result;
    }

    public String getStringValue() {
        Object result = getValue();
        checkNull(result);
        if (!(result instanceof String)) {
            throw new IllegalArgumentException("Failed to cast value to string.");
        }
        return (String) result;
    }

    private void checkNull(Object value) {
        Utils.checkNull(value);
    }

    public JSONNode getParentJSONNode() {
        return parent;
    }

    public Map<String, JSONNode> getChildren() {
        return Collections.unmodifiableMap(getChildNodes());
    }

    private Map<String, JSONNode> getChildNodes() {
        if (!isParentNode()) {
            return null;
        }
        return (Map<String, JSONNode>) value;
    }

    public boolean isParentNode() {
        if (!(value instanceof Map)) {
            return false;
        }
        try {
            Map<String, JSONNode> map = (Map<String, JSONNode>) value;
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public boolean isChildNode() {
        return parent != null;
    }
}
417ed7916c3d29eae585e15b9f5eb2a1df406a44
[ "Java" ]
6
Java
jaymsDooku/JsonLib
1e51fb87f47f62a98e505c731f637014b5ee770f
d6c03c347e9aa8d1216fa44f7edbc7e746b23609
refs/heads/master
<repo_name>Todai88/profile<file_sep>/resources/scripts/charts.js
var designBarChartData = { labels: ["HTML5", "CSS", "jQuery", "JavaScript", "ASP.NET"], datasets: [{ label: 'Design', backgroundColor: ["rgba(255,82,0, 0.8)", "rgba(0,100,255,0.8)", "rgba(255,255,255,0.8)", "rgba(231,131,0, 0.8)", "rgba(0,122,204,0.8)"], borderColor: ['rgba(255,39,0,1)', 'rgba(0,20,255, 1)', 'rgba(0,0,0, 1)', 'rgba(232, 90,0, 1)', 'rgba(0,70,204, 1)'], borderWidth: 1, yAxisID: "y-axis-1", data: [30, 35, 25, 30, 40] }] };
var langBarChartData = { labels: ["Java", "C#", "Coffee drinking", "SQL", "VB.NET", "Japanese"], datasets: [{ label: 'Programming', backgroundColor: ["rgba(57,130,255,0.8)", "rgba(168,0,255,0.8)", "rgba(231,131,0, 0.8)", "rgba(0,0,0,1)", "rgba(0,140,220,0.8)", "rgba(230,0,64,0.8)", "rgba(57,66,255,0.8)"], borderColor: ['rgba(230,0,64,0.9)', 'rgba(212,98,197, 1)', 'rgba(232, 90,0, 1)', 'rgba(255,111,0, 1)', 'rgba(212,98,197, 1)', 'rgba(0,0,0,1)', 'rgba(140,194,255,1)'], borderWidth: 1, yAxisID: "y-axis-1", data: [40, 30, 95, 40, 60, 65] }] };
var theoryBarChartData = { labels: ["IT-Sec", "Management", "Scrum", "SW. Methodologies", "Algorithms"], datasets: [{ label: 'Theory', backgroundColor: ["rgba(0, 127, 253,0.8)", "rgba(104, 24, 0,0.8)", "rgba(255, 114 , 0, 0.8)", "rgba(0, 155, 0, 0.8)", "rgba(0,122,204,0.8)"], borderColor: "rgba(0,0,0,1)", borderWidth: 1, yAxisID: "y-axis-1", data: [40, 30, 50, 30, 20] }] };
Number.prototype.roundTo = function(nTo) { nTo = nTo || 10; return Math.round(this * (1 / nTo)) * nTo; }
var slides = $('#slider ul').children().length; var slideWidth = $('#slider').width(); var min = 0; var max = -((slides - 1) * slideWidth);
$("#slider ul").width(slides * slideWidth).draggable({ axis: 'x', drag: function(event, ui) { if (ui.position.left > min) ui.position.left = min; if (ui.position.left < max) ui.position.left = max; }, stop: function(event, ui) { $(this).animate({ 'left': (ui.position.left).roundTo(slideWidth) }) } });
window.onload = function() {
var ctx = document.getElementById("programming").getContext("2d");
window.myBar = Chart.Bar(ctx, { data: langBarChartData, options: { responsive: true, hoverMode: 'label', hoverAnimationDuration: 400, stacked: false, title: { display: true, text: "Programming", fontSize: 32 }, scales: { yAxes: [{ type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "left", id: "y-axis-1", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White-belt"; } if (value == 60) { return "Pretty Good"; } if (value == 80) { return "Smokin'"; } if (value == 100) { return "Sensei"; } } } } }, { type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "right", id: "y-axis-2", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White-belt"; } if (value == 60) { return "Pretty Good"; } if (value == 80) { return "Smokin'"; } if (value == 100) { return "Sensei"; } } } } }], } } });
window.myBar2 = Chart.Bar(document.getElementById("design").getContext("2d"), { data: designBarChartData, options: { responsive: true, hoverMode: 'label', hoverAnimationDuration: 400, stacked: false, title: { display: true, text: "Design", fontSize: 32 }, scales: { yAxes: [{ type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "left", id: "y-axis-1", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White belt"; } if (value == 60) { return "Samurai"; } if (value == 80) { return "Ninja"; } if (value == 100) { return "Sensei"; } } } } }, { type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "right", id: "y-axis-2", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White-belt"; } if (value == 60) { return "Pretty Good"; } if (value == 80) { return "Smokin'"; } if (value == 100) { return "Sensei"; } } } } }], } } });
window.myBar3 = Chart.Bar(document.getElementById("theory").getContext("2d"), { data: theoryBarChartData, options: { responsive: true, hoverMode: 'label', hoverAnimationDuration: 400, stacked: false, title: { display: true, text: "Theory", fontSize: 32 }, scales: { yAxes: [{ type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "left", id: "y-axis-1", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White-belt"; } if (value == 60) { return "Pretty Good"; } if (value == 80) { return "Smokin'"; } if (value == 100) { return "Sensei"; } } } } }, { type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "right", id: "y-axis-2", ticks: { max: 100, min: 0, userCallback: function (value, index, values) { if (value % 10 == 0){ if (value == 20){ return "Slow & Steady"; } if (value == 40) { return "White-belt"; } if (value == 60) { return "Pretty Good"; } if (value == 80) { return "Smokin'"; } if (value == 100) { return "Sensei"; } } } } }], } } });
};
Chart.defaults.global.legend.display = false; Chart.defaults.global.defaultFontSize = 18;
<file_sep>/resources/scripts/langChart.js
var barChartData = { labels: ["Core Java", "C/C++", "C#", "Relational-Databases", "VB.NET", "Erlang", "JavaScript"], datasets: [{ label: 'Programming', backgroundColor: ["rgba(102,61,0,0.9)", "rgba(5,32,74,0.9)", "rgba(176,0,255,0.8)", "rgba(101,0,0,0.8)", "rgba(0,115,255,0.8)", "rgba(146,0,0,0.9)", "rgba(236,211,0,0.8)"], yAxisID: "y-axis-1", data: [40, 30, 50, 40, 60, 20, 30] }] };
window.onload = function() {
var ctx = document.getElementById("canvas").getContext("2d");
window.myBar = Chart.Bar(ctx, { data: barChartData, options: { responsive: true, hoverMode: 'label', hoverAnimationDuration: 400, stacked: false, title:{ display:true, text:"Programming languages", fontSize: 18, fontStyle: 'bold' }, scales: { yAxes: [{ type: "linear", // only linear but allow scale type registration. This allows extensions to exist solely for log scale for instance
display: true, position: "left", id: "y-axis-1", ticks: { max: 100, min: 0 } }], } } });
};
Chart.defaults.global.legend.display = false;
Number.prototype.roundTo = function(nTo) { nTo = nTo || 10; return Math.round(this * (1 / nTo) ) * nTo; }
$(function() { var slides = $('#slider ul').children().length; var slideWidth = $('#slider').width(); var min = 0; var max = -((slides - 1) * slideWidth); $("#slider ul").width(slides*slideWidth).draggable({ axis: 'x', drag: function (event, ui) { if (ui.position.left > min) ui.position.left = min; if (ui.position.left < max) ui.position.left = max; }, stop: function( event, ui ) { $(this).animate({'left': (ui.position.left).roundTo(slideWidth)}) } }); });
a335c440ad48a36ace4380f7f52668c4944a3186
[ "JavaScript" ]
2
JavaScript
Todai88/profile
a460ac06da05cf8dfa79163af5e1be7618b66db8
ec8785ad5bda02cb0edc72ed09226754f3dd8383
refs/heads/master
<repo_name>momentum-cohort-2019-09/w3d1-customer-database-CharlesMcKelvey<file_sep>/main.js
// const capitalize = str => str[0].toUpperCase() + str.slice(1)
function customerToHtml(customer) { return ` <div class="customerCard"> <img class="customerImg" src="${customer.picture.large}" alt="${customer.name.first + " " + customer.name.last}"> <div class="name">${customer.name.first + " " + customer.name.last}</div> <div class="email">${customer.email}</div> <div class="address">${customer.location.street + " " + customer.location.city + " " + nameToAbbr(customer.location.state) + " " + customer.location.postcode}</div> <div class="dateOfBirth">Date of Birth: ${moment(customer.dob).format('MMM Do YYYY')}</div> <div class="customerSince">Customer Since: ${moment(customer.registered).format('MMM Do YYYY')}</div> </div> ` }
document.querySelector('#container').innerHTML = customers.map(customerToHtml).join('\n')
21dc454788550249560d3b1695f656a52483f711
[ "JavaScript" ]
1
JavaScript
momentum-cohort-2019-09/w3d1-customer-database-CharlesMcKelvey
514bbbc682fd194efb456427590f4d451cfde0da
176d1442ebfb8e6c6c194632e60402aa51303ce3
refs/heads/main
<file_sep>#it's my first long project
#please tell me if you like it
#made by osx447 with python
#------------------------------
import time

name = input("What's your name : ")
while True:
    try:
        age = int(input("what's your age " + name + " : "))
        break
    except:
        print("enter your age in numbers")
#now i got your name and age lets use them
print("try again later")
time.sleep(2)
print("just kidding")
#just a joke :)
while True:
    if age < 16:
        print(name + " you don't have a passport or id")
        dad_fake_id = input("give me your dad's id number " + name + " : ")
        confirm_id = input("your dad's id " + dad_fake_id + " are you sure? (y-n) ")
        # now you must confirm the id (i know it is fake :-)
        if confirm_id == "y":
            print("confirmed")
            print("just joking")
            print("bye " + name + ", it was nice to meet you")
            break
        else:
            print("please confirm")
    else:
        fake_id = input("please, give me your id " + name + " : ")
        confirm_your_id = input("your id is " + fake_id + " are you sure (y-n) :")
        if confirm_your_id == "y":
            print("you are being hacked..")
            time.sleep(4)
            print("just joking")
            print("bye " + name + ", it was nice to meet you")
            break
        else:
            print("please confirm")
#you must confirm to break the loop
#and that is it
#i know that *karim* will read the code
#then please tell me your opinion
#thanks for reading the code
#coding it took an hour
#bye
#:-):-):-):-):-):-):-):-):-):-):-):-)
<file_sep>#it is my last project but after editing
#and adding functions and making it more simple
#---------------------------------------------
import time

name = input("what's your name : ")
while True:
    try:
        age = int(input(f"please, enter your age {name} : "))
        break
    except:
        print("enter your age in numbers please")

def joke():
    print("try again later")
    time.sleep(2)
    print("why are you waiting?")
    print("yeah, for the rest of the program")

joke()
while True:
    if age < 16:
        print(name + " you don't have a passport or id")
        dad_fake_id = input("give me your dad's id number " + name + " : ")
        confirm_id = input("your dad's id " + dad_fake_id + " are you sure? (y-n) ")
        # now you must confirm the id (i know that the id you will enter is fake :-)
        if confirm_id == "y":
            print("confirmed")
            print("just joking")
            print("bye " + name + ", it was nice to meet you")
            break
        else:
            print("please confirm")
    else:
        fake_id = input("please, give me your id " + name + " : ")
        confirm_your_id = input("your id is " + fake_id + " are you sure (y-n) :")
        if confirm_your_id == "y":
            print("you are being hacked..")
            time.sleep(4)
            print("just joking")
            print("bye " + name + ", it was nice to meet you")
            break
        else:
            print("please confirm")
#---------------------------------
#that's it i hope that is better
#print("bye " + name)
#---------------------------------
f036e92dfda67bc27c1c6680999ad55f7246bd09
[ "Python" ]
2
Python
osx447/small_projects
88ae5010dd29d5397ea85ec6b5f39bad5cb3cab7
6dc528764bc51d573e197dcd409518c1ffd13d3c
refs/heads/master
<file_sep>package com.company;
import java.util.ArrayList;
import java.util.List;

public class UniversityCourses {
    List<String> students = new ArrayList<>();
    String course;
    String instructor;

    UniversityCourses(String courseName, String instructorName) {
        this.course = courseName;
        this.instructor = instructorName;
    }

    void registration(String studentName) {
        students.add(studentName);
    }

    void drop(String studentName) {
        students.remove(studentName);
    }
}
f37061ce7cb24543b83f1384af54bd6699f7df74
[ "Java" ]
1
Java
hussamhasi/Atypon
9c47422c38d2fe3db3e5a7ecbfbb23459814e5c8
8a0d2c4076196fe50036667671369709f111adec
refs/heads/master
<repo_name>IsaacYAGI/python-ejercicios<file_sep>/tema6-condicionales/ejercicios-propuestos/ejercicio_propuestos3_tema6.py ##Se ingresa por teclado un número positivo de uno o dos dígitos (1..99) ##mostrar un mensaje indicando si el número tiene uno o dos dígitos. ##(Tener en cuenta que condición debe cumplirse para tener dos dígitos ##un número entero) num = int(input("Ingrese un numero entero de uno o dos digitos: ")) if (num < 10): print("El numero ",num," tiene un digito") else: print("El numero ",num," tiene dos digitos") <file_sep>/tema5-programacion-secuencial/ejercicio3.py ##Leer por teclado dos numeros y calcular la suma y multiplicacion de los mismos. ##Mostrar los resultados por pantalla num1 = int(input("Ingrese numero 1: ")) num2 = int(input("Ingrese numero 2: ")) print ("La suma de los numeros es: ",num1+num2) print ("El producto de los numeros es: ",num1*num2) <file_sep>/tema7-condicionales-anidados/ejercicios-propuestos/ejercicio_propuestos4_tema7.py ''' Un postulante a un empleo, realiza un test de capacitación, se obtuvo la siguiente información: cantidad total de preguntas que se le realizaron y la cantidad de preguntas que contestó correctamente. Se pide confeccionar un programa que ingrese los dos datos por teclado e informe el nivel del mismo según el porcentaje de respuestas correctas que ha obtenido, y sabiendo que: Nivel máximo: Porcentaje>=90%. Nivel medio: Porcentaje>=75% y <90%. Nivel regular: Porcentaje>=50% y <75%. Fuera de nivel: Porcentaje<50%. 
''' cant_preguntas = int(input("Ingrese la cantidad total de preguntas: ")) cant_correctas = int(input("Ingrese la cantidad total de preguntas respondidas correctamente: ")) porcentaje = cant_correctas*100/cant_preguntas if porcentaje < 50: print("Fuera de nivel") else: if porcentaje < 75: print("Nivel regular") else: if porcentaje < 90: print("Nivel medio") else: print("Nivel maximo") <file_sep>/tema7-condicionales-anidados/ejercicios-propuestos/ejercicio_propuestos2_tema7.py ##Se ingresa por teclado un valor entero, #mostrar una leyenda que indique si el número es positivo, #negativo o nulo (es decir cero) num = int(input("Ingrese el numero: ")) if num > 0: print("El numero es positivo") else: if num < 0: print("El numero es negativo") else: print("El numero es nulo") <file_sep>/tema7-condicionales-anidados/ejercicios-propuestos/ejercicio_propuestos1_tema7.py #Se cargan por teclado tres números distintos. #Mostrar por pantalla el mayor de ellos. num1 = int(input("Ingrese el num 1: ")) num2 = int(input("Ingrese el num 2: ")) num3 = int(input("Ingrese el num 3: ")) mayor = 0 if num1 > mayor: mayor = num1 if num2 > mayor: mayor = num2 if num3 > mayor: mayor = num3 print("El numero mayor de los tres numeros ingresados es: ",mayor) <file_sep>/README.md # Ejercicios de Python Este repositorio tiene ejercicios básicos realizados en el lenguaje Python con el fin único de practicar. Los ejercicios realizados son obtenidos del sitio [Tutoriales de programacion Ya](https://www.tutorialesprogramacionya.com/pythonya/index.php?inicio=0). 
## Instalación
Para realizar los ejercicios se utilizó la versión **3.6.2** de Python que se puede descargar desde la [página oficial](https://www.python.org/downloads/).<file_sep>/tema5-programacion-secuencial/ejercicios-propuestos/ejercicio_propuestos1_tema5.py
##Realizar la carga del lado de un cuadrado, mostrar por pantalla el perímetro
##del mismo (El perímetro de un cuadrado se calcula multiplicando el valor
##del lado por cuatro)
lado_cuad = int(input("Ingrese la medida del lado del cuadrado: "))
print("El perimetro del cuadrado es: ", lado_cuad*4)
<file_sep>/tema39-biblioteca-estandar-python/ejercicios-propuestos/ejercicio_propuestos2_tema39.py
''' Confeccionar un programa con las siguientes funciones: 1) Generar una lista con 5 elementos enteros aleatorios comprendidos entre 1 y 3. 2) Controlar que el primer elemento de la lista sea un 1, en el caso que haya un 2 o 3 mezclar la lista y volver a controlar hasta que haya un 1. 3) Imprimir la lista.
''' import random def llenar_arreglo(li): for i in range(5): li.append(random.randint(1,3)) def arreglo_tiene_uno(li): for i in range (len(li)): if (li[i] == 1): return True return False def arreglo_tiene_uno_primero(li): return li[0] == 1 def vaciar_arreglo(li): del(li[0:]) def mezclar(li): random.shuffle(li) lista = [] array_valido = False while not array_valido: llenar_arreglo(lista) print("La lista generada es: ",lista) if (arreglo_tiene_uno(lista)): if (arreglo_tiene_uno_primero(lista)): print("Lista validada, lista: ",lista) array_valido = True else: while (not arreglo_tiene_uno_primero(lista)): print("La lista ",lista," no tiene 1 primero.") mezclar(lista) print("Lista validada, lista: ",lista) array_valido = True else: vaciar_arreglo(lista) print("Vaciando lista") <file_sep>/tema7-condicionales-anidados/ejercicios-propuestos/ejercicio_propuestos3_tema7.py #Confeccionar un programa que permita cargar un número entero positivo #de hasta tres cifras y muestre un mensaje indicando si tiene 1, 2, o 3 cifras. #Mostrar un mensaje de error si el número de cifras es mayor. num = int(input("Ingrese el numero: ")) if num < 10: print("El numero tiene una cifra") else: if num < 100: print("El numero tiene dos cifras") else: if num < 1000: print("El numero tiene tres cifras") else: print("Error, el numero tiene mas de tres cifras") <file_sep>/tema2-hola-mundo/ejercicio1.py ##Escribir hola mundo por pantalla print ("Hola mundo"); <file_sep>/tema13-procesamiento-cadenas/ejercicios-propuestos/ejercicio_propuestos3_tema13.py ''' Solicitar el ingreso de una clave por teclado y almacenarla en una cadena de caracteres. Controlar que el string ingresado tenga entre 10 y 20 caracteres para que sea válido, en caso contrario mostrar un mensaje de error. 
''' correcto = False while not correcto: password = input("Ingrese contraseña: ") if len(password) < 10 or len(password) > 20: print("Error, la contraseña debe tener minimo 10 caracteres y un maximo de 20 caracteres") else: correcto = True print("Contraseña ",password," aceptada.") <file_sep>/tema9-ciclo-while/ejercicios-propuestos/ejercicio_propuestos1_tema9.py ''' Escribir un programa que solicite ingresar 10 notas de alumnos y nos informe cuántos tienen notas mayores o iguales a 7 y cuántos menores. ''' notas_mayores = 0 notas_menores = 0 i = 1 while i <=10: # nota_act = float(input("Ingrese nota "+str(i)+": ")) nota_act = float(input(f"Ingrese nota {i}: ")) if nota_act >= 7: notas_mayores += 1 else: notas_menores += 1 i += 1 print("La cantidad de estudiantes con notas mayores a 7 es de: ",notas_mayores) print("La cantidad de estudiantes con notas menores a 7 es de: ",notas_menores) <file_sep>/tema3-calculo-superficie-cuadrado/ejercicio2.py ##Calculo de la superficie de un cuadrado conociendo uno de sus lados ##lado = input("Ingrese la medida del lado del cuadrado: ") ##lado = int(lado) lado = int(input("Ingrese la medida del lado del cuadrado: ")) superficie = lado * lado print("La superficie del cuadrado es ") print (superficie) <file_sep>/tema6-condicionales/ejercicios-propuestos/ejercicio_propuestos2_tema6.py ##Se ingresan tres notas de un alumno, si el promedio es mayor o igual a siete ##mostrar un mensaje "Promocionado". nota1 = int(input("Ingrese la nota 1: ")) nota2 = int(input("Ingrese la nota 2: ")) nota3 = int(input("Ingrese la nota 3: ")) if (((nota1+nota2+nota3)/3) >= 7): print("Promocionado!") <file_sep>/tema6-condicionales/ejercicios-propuestos/ejercicio_propuestos1_tema6.py ##Realizar un programa que solicite la carga por teclado de dos números, ##si el primero es mayor al segundo informar su suma y diferencia, ##en caso contrario informar el producto y la división ##del primero respecto al segundo. 
num1 = int(input("Ingrese el numero 1: ")) num2 = int(input("Ingrese el numero 2: ")) if (num1 > num2): print("La suma de ",num1," y ",num2," es de ", num1+num2) print("La diferencia de ",num1," y ",num2," es de ", num1-num2) else: print("El producto de ",num1," y ",num2," es de ", num1*num2) print("La division de ",num1," y ",num2," es de ", num1/num2) <file_sep>/tema13-procesamiento-cadenas/ejercicios-propuestos/ejercicio_propuestos1_tema13.py ''' Cargar una oración por teclado. Mostrar luego cuantos espacios en blanco se ingresaron. Tener en cuenta que un espacio en blanco es igual a " ", en cambio una cadena vacía es "" ''' cadena = input("Ingrese una oracion: ") long_cad = len(cadena) espaces = 0 for i in range(0,long_cad): if (cadena[i] == " "): espaces += 1 print("La oracion ingresada es: ",cadena) print("La cantidad de espacios en blanco es: ",espaces) <file_sep>/tema13-procesamiento-cadenas/ejercicios-propuestos/ejercicio_propuestos2_tema13.py ''' Ingresar una oración que pueden tener letras tanto en mayúsculas como minúsculas. Contar la cantidad de vocales. Crear un segundo string con toda la oración en minúsculas para que sea más fácil disponer la condición que verifica que es una vocal. ''' cadena = input("Ingrese una oracion: ") cadena_min = cadena.lower() vocales = "aeiou" long_cad = len(cadena) cant_voc = 0 for i in range(0,long_cad): #Buscamos si el caracter actual esta en la cadena de vocales if cadena_min[i] in vocales: #De existir, se suma uno al contador cant_voc += 1 print("La oracion ingresada es: ",cadena) print("La cantidad de vocales: ",cant_voc) <file_sep>/tema5-programacion-secuencial/ejercicios-propuestos/ejercicio_propuestos4_tema5.py ##Calcular el sueldo mensual de un operario conociendo la cantidad de horas ##trabajadas y el valor por hora. 
CANT_DIAS = 30 cant_horas = int(input("Ingrese la cantidad de horas trabajadas: ")) sueldo = float(input("Ingrese el sueldo por hora del trabajador: ")) print("El sueldo mensual del trabajador es: ",cant_horas*sueldo*CANT_DIAS) <file_sep>/tema39-biblioteca-estandar-python/ejercicios-propuestos/ejercicio_propuestos1_tema39.py ''' Confeccionar un programa que genere un número aleatorio entre 1 y 100 y no se muestre. El operador debe tratar de adivinar el número ingresado. Cada vez que ingrese un número mostrar un mensaje "Gano" si es igual al generado o "El número aleatorio el mayor" o "El número aleatorio es menor". Mostrar cuando gana el jugador cuantos intentos necesitó. ''' import random def generar_numero_aleatorio(): return random.randint(1,100) def es_el_numero(resp_usuario,resp_correc): return resp_usuario == resp_correc def numero_dado_es_mayor(resp_usuario,resp_correc): return resp_usuario > resp_correc def juego_terminado(numero_correcto,numero_intentos): print("El juego ha terminado!") print("El numero correcto era",numero_correcto,"y lo resolviste en",numero_intentos,"intentos.",sep=" ") def el_numero_es_mayor(): print("El numero que diste es mayor al correcto, intenta de nuevo!") def el_numero_es_menor(): print("El numero que diste es menor al correcto, intenta de nuevo!") def iniciar_juego(): gano = False intentos = 1 numero = 0 respuesta_correc = generar_numero_aleatorio() while (not gano): numero = int(input("Ingresa un numero: ")) if (es_el_numero(numero,respuesta_correc)): juego_terminado(respuesta_correc,intentos) gano = True else: if (numero_dado_es_mayor(numero,respuesta_correc)): el_numero_es_mayor() else: el_numero_es_menor() intentos += 1 iniciar_juego() <file_sep>/tema9-ciclo-while/ejercicios-propuestos/ejercicio_propuestos2_tema9.py ''' En una empresa trabajan n empleados cuyos sueldos oscilan entre $100 y $500, realizar un programa que lea los sueldos que cobra cada empleado e informe cuántos empleados cobran entre $100 y $300 y cuántos 
cobran más de $300. Además el programa deberá informar el importe que gasta la empresa en sueldos al personal. ''' cant_empl = int(input("Ingrese la cantidad de empleados: ")) importe_total = 0 i = 1 cant_may_300 = 0 while i <= cant_empl: sueldo = float(input(f"Ingrese sueldo de trabajador #{i}: ")) if (sueldo > 300): cant_may_300 += 1 importe_total += sueldo i += 1 print("La cantidad de empleados que ganan mas de 300$ es de: ",cant_may_300) print("La cantidad de empleados que ganan menos de 300$ es de: ",cant_empl - cant_may_300) print("El importe total de la empresa es: ",importe_total) <file_sep>/tema5-programacion-secuencial/ejercicios-propuestos/ejercicio_propuestos2_tema5.py ##Escribir un programa en el cual se ingresen cuatro números, calcular e ##informar la suma de los dos primeros y el producto del tercero y el cuarto. num1 = int(input("Ingrese el numero 1: ")) num2 = int(input("Ingrese el numero 2: ")) num3 = int(input("Ingrese el numero 3: ")) num4 = int(input("Ingrese el numero 4: ")) print("La suma de ",num1," y ",num2," es: ", num1+num2) print("El producto de ",num3," y ",num4," es: ", num3*num4) <file_sep>/tema9-ciclo-while/ejercicio28.py #Mostrar los numeros del 1 al 100 x = 1 while (x <= 100): print(x) x += 1 <file_sep>/tema8-condicionales-operadores-logicos/ejercicios-propuestos/ejercicio_propuestos6_tema8.py ''' De un operario se conoce su sueldo y los años de antigüedad. Se pide confeccionar un programa que lea los datos de entrada e informe: a) Si el sueldo es inferior a 500 y su antigüedad es igual o superior a 10 años, otorgarle un aumento del 20 %, mostrar el sueldo a pagar. b)Si el sueldo es inferior a 500 pero su antigüedad es menor a 10 años, otorgarle un aumento de 5 %. c) Si el sueldo es mayor o igual a 500 mostrar el sueldo en pantalla sin cambios. 
''' sueldo_oper = float(input("Ingrese el sueldo del operario: ")) anios_trab = int(input("Ingrese los anios de servicio del operario: ")) if sueldo_oper < 500 and anios_trab >= 10: print("Se aumenta el 20% del sueldo.") print("El nuevo sueldo es: ",sueldo_oper*0.20+sueldo_oper) else: if sueldo_oper < 500 and anios_trab < 10: print("Se aumenta el 5% del sueldo.") print("El nuevo sueldo es: ",sueldo_oper*0.05+sueldo_oper) else: if sueldo_oper > 500: print("El sueldo se mantiene sin cambios") print("El sueldo actual es: ",sueldo_oper) <file_sep>/tema5-programacion-secuencial/ejercicios-propuestos/ejercicio_propuestos3_tema5.py ##Realizar un programa que lea cuatro valores numéricos e informar su suma ##y promedio. num1 = float(input("Ingrese el numero 1: ")) num2 = float(input("Ingrese el numero 2: ")) num3 = float(input("Ingrese el numero 3: ")) num4 = float(input("Ingrese el numero 4: ")) suma = num1+num2+num3+num4; print("La suma de los numeros ingresados es: ",suma) print("El promedio de los numeros ingresados es: ",suma/4) <file_sep>/tema6-condicionales/ejercicio9.py ##Ingresar el sueldo de una persona, si supera los 3000 dolares mostrar un ##mensaje en pantalla indicando que debe abonar impuestos. sueldo = int(input("Ingrese cual es su sueldo: ")) if sueldo > 3000: print("Esta persona debe pagar impuestos") <file_sep>/tema10-ciclo-for/ejercicios-propuestos/ejercicio_propuestos3_tema10.py ''' Desarrollar un programa que muestre la tabla de multiplicar del 5 (del 5 al 50) ''' for i in range(5,51): print("5 x ",i," = ",5*i) <file_sep>/tema12-variables-int-float-cadenas/ejercicios-propuestos/ejercicio_propuestos1_tema12.py ''' Realizar la carga de dos nombres de personas distintos. Mostrar por pantalla luego ordenados en forma alfabética. 
''' nombre1 = input("Ingrese el primer nombre: ") nombre2 = input("Ingrese el segundo nombre: ") if nombre1 > nombre2: print("El orden es: ",nombre2,", ",nombre1) else: print("El orden es: ",nombre1,", ",nombre2) <file_sep>/tema7-condicionales-anidados/ejercicio14.py #Confeccionar un programa que pida por teclado tres notas de un alumno, #calcule el promedio e imprima alguno de estos mensajes: #Si el promedio es >=7 mostrar "Promocionado". #Si el promedio es >=4 y <7 mostrar "Regular". #Si el promedio es <4 mostrar "Reprobado". nota1 = int(input("Ingrese la nota 1: ")) nota2 = int(input("Ingrese la nota 2: ")) nota3 = int(input("Ingrese la nota 3: ")) promedio = (nota1+nota2+nota3)/3 if promedio >= 7: print("Promocionado") else: if promedio >= 4 and promedio < 7: print("Regular") else: print("Reprobado")
50b69776bee7ad3a6068235c6c8981c06a17629a
[ "Markdown", "Python" ]
30
Python
IsaacYAGI/python-ejercicios
fe8d9b96a7e947dd3fa992dfac39049fa68935c5
665db9b6286d4840e8fa9c2d9b42b1389753b806
refs/heads/main
<repo_name>Chrizlove/YoutubePlayer<file_sep>/app/src/main/java/com/example/youtubeplayer/MainActivity.kt package com.example.youtubeplayer import android.content.Context import android.content.Intent import androidx.appcompat.app.AppCompatActivity import android.os.Bundle import android.util.Log import android.view.inputmethod.InputMethodManager import android.widget.SearchView import android.widget.Toast import androidx.recyclerview.widget.LinearLayoutManager import com.google.android.youtube.player.YouTubeApiServiceUtil import com.google.android.youtube.player.YouTubeBaseActivity import com.google.android.youtube.player.YouTubeInitializationResult import com.google.android.youtube.player.YouTubePlayer import kotlinx.android.synthetic.main.activity_main.* import retrofit2.Call import retrofit2.Callback import retrofit2.Response class MainActivity : AppCompatActivity() { private lateinit var youtubeadapter: YoutubeVideoAdapter private lateinit var videosList: List<YoutubeVideo> override fun onCreate(savedInstanceState: Bundle?) 
{
super.onCreate(savedInstanceState)
setContentView(R.layout.activity_main)
//code for entire searchview to be clickable
searchbar.setOnClickListener { searchbar.isIconified = false }
//showing hint when searchview is tapped on
searchbar.queryHint = "Search for videos"
//preload some videos
getVideosList("")
//code for onsubmit of searchview query
searchbar.setOnQueryTextListener(object : SearchView.OnQueryTextListener{
override fun onQueryTextSubmit(keyword: String?): Boolean {
//fetch list of videos
keyword?.let { getVideosList(it) }
//hiding keyboard when submit button is pressed
val keyboard = getSystemService(Context.INPUT_METHOD_SERVICE) as InputMethodManager
keyboard.hideSoftInputFromWindow(currentFocus!!.windowToken, 0)
return true
}
override fun onQueryTextChange(p0: String?): Boolean { return false }
}) }
private fun setUpRecyclerView(videos: List<YoutubeVideo>) {
youtubeadapter = YoutubeVideoAdapter(this, videos)
val layoutManager = LinearLayoutManager(this)
ytVideosRecycler.layoutManager = layoutManager
ytVideosRecycler.adapter = youtubeadapter
//listener for video click and to change activity to play the video
youtubeadapter.setOnItemClickListener(object: YoutubeVideoAdapter.onItemClickListener{
override fun onItemClick(position: Int) {
val intent = Intent(applicationContext, YoutubeVideoPlayerActivity::class.java)
intent.putExtra("title",videos[position].snippet.title)
intent.putExtra("desc",videos[position].snippet.description)
intent.putExtra("videoId",videos[position].id.videoId)
startActivity(intent)
}
}) }
private fun getVideosList(keyword: String) {
val videos = RetrofitInstance.youtubeapi.getYoutubeVideos("snippet",keyword,"<KEY>",10,"video")
videos.enqueue(object : Callback<YoutubeAPIData?> {
override fun onResponse(call: Call<YoutubeAPIData?>, response: Response<YoutubeAPIData?>) {
videosList = response.body()?.items!!
//setting up recycler view
videosList?.let { setUpRecyclerView(it) }
}
override fun onFailure(call: Call<YoutubeAPIData?>, t: Throwable) {
Toast.makeText(applicationContext, "Unable to fetch results!", Toast.LENGTH_SHORT).show()
Log.d("APIError",t.toString())
}
}) } }<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeAPIData.kt
package com.example.youtubeplayer
data class YoutubeAPIData (val items: List<YoutubeVideo>)
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeListApi.kt
package com.example.youtubeplayer
import retrofit2.Call
import retrofit2.http.GET
import retrofit2.http.Query
interface YoutubeListApi {
@GET("search")
fun getYoutubeVideos(@Query ("part") part: String, @Query(value="q") keyword: String, @Query(value="key") key: String, @Query("maxResults") max: Int, @Query("type") type: String): Call<YoutubeAPIData>
}
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeAPIDatahigh.kt
package com.example.youtubeplayer
data class YoutubeAPIDatahigh (val url: String)
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeAPIid.kt
package com.example.youtubeplayer
data class YoutubeAPIid (val videoId: String)
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeAPIsnippet.kt
package com.example.youtubeplayer
data class YoutubeAPIsnippet( val title: String, val description: String, val thumbnails: YoutubeAPIthumbnails)
<file_sep>/app/src/main/java/com/example/youtubeplayer/RetrofitInstance.kt
package com.example.youtubeplayer
import retrofit2.Retrofit
import retrofit2.converter.gson.GsonConverterFactory
object RetrofitInstance {
const val BASE_URL = "https://www.googleapis.com/youtube/v3/"
private val retrofit by lazy { Retrofit.Builder() .baseUrl(BASE_URL) .addConverterFactory(GsonConverterFactory.create()) .build() }
val youtubeapi: YoutubeListApi by lazy { retrofit.create(YoutubeListApi::class.java) }
}
<file_sep>/README.md
# YoutubePlayer
A basic app with 2 activities: the 1st is a home screen where the user can search for videos and the fetched videos are displayed; the 2nd is a player screen where the video clicked on the home screen plays. Check out the app- https://drive.google.com/file/d/1QtgDb7dkjXZNhzGXOAQYjLfnpGuoUZ48/view?usp=sharing
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeAPIthumbnails.kt
package com.example.youtubeplayer
data class YoutubeAPIthumbnails(val high: YoutubeAPIDatahigh)
<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeVideoAdapter.kt
package com.example.youtubeplayer
import android.content.Context
import android.util.Log
import android.view.LayoutInflater
import android.view.View
import android.view.ViewGroup
import android.widget.AdapterView
import android.widget.ImageView
import android.widget.TextView
import androidx.recyclerview.widget.RecyclerView
import com.bumptech.glide.Glide
class YoutubeVideoAdapter(private val context: Context, private val videos: List<YoutubeVideo>): RecyclerView.Adapter<YoutubeVideoAdapter.YoutubeVideoHolder>() {
inner class YoutubeVideoHolder(itemView: View, listener: onItemClickListener): RecyclerView.ViewHolder(itemView) {
val videoTitle = itemView?.findViewById<TextView>(R.id.videoTitleTextView)
val vidThumbnail = itemView?.findViewById<ImageView>(R.id.thumbnailImageView)
init{ itemView.setOnClickListener { listener.onItemClick(adapterPosition) } }
}
override fun onCreateViewHolder(parent: ViewGroup, viewType: Int): YoutubeVideoHolder {
val view = LayoutInflater.from(context).inflate(R.layout.lyt_ytvideo, parent, false)
return YoutubeVideoHolder(view,mListener)
}
override fun onBindViewHolder(holder: YoutubeVideoHolder, position: Int) {
val currentVideo = videos[position]
holder?.videoTitle.text = currentVideo.snippet.title
Glide.with(holder?.vidThumbnail.context).load(currentVideo.snippet.thumbnails.high.url).into(holder?.vidThumbnail)
}
override fun getItemCount(): Int { return videos.size }
private lateinit var mListener: onItemClickListener
interface onItemClickListener{ fun
onItemClick(position: Int) } fun setOnItemClickListener(listener: onItemClickListener) { mListener = listener } }<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeVideoPlayerActivity.kt package com.example.youtubeplayer import androidx.appcompat.app.AppCompatActivity import android.os.Bundle import android.util.Log import android.widget.Toast import com.google.android.youtube.player.YouTubeBaseActivity import com.google.android.youtube.player.YouTubeInitializationResult import com.google.android.youtube.player.YouTubePlayer import kotlinx.android.synthetic.main.activity_youtube_video_player.* class YoutubeVideoPlayerActivity : YouTubeBaseActivity() { override fun onCreate(savedInstanceState: Bundle?) { super.onCreate(savedInstanceState) setContentView(R.layout.activity_youtube_video_player) // receiving intent extras val videoTitle: String = intent.getStringExtra("title").toString() val videoDesc: String = intent.getStringExtra("desc").toString() val videoId: String = intent.getStringExtra("videoId").toString() Log.d("title", videoTitle) // setting title and description of the video titleText.text = videoTitle descText.text = videoDesc // loading the player initializePlayer(videoId) } private fun initializePlayer(videoId: String) { youtubePlayerView.initialize(getString(R.string.youtube_api_key), object : YouTubePlayer.OnInitializedListener { override fun onInitializationSuccess( p0: YouTubePlayer.Provider?, p1: YouTubePlayer?, p2: Boolean ) { // playing video p1?.loadVideo(videoId) p1?.play() } override fun onInitializationFailure( p0: YouTubePlayer.Provider?, p1: YouTubeInitializationResult? ) { Toast.makeText(applicationContext, "Something went wrong", Toast.LENGTH_LONG).show() } }) } }<file_sep>/app/src/main/java/com/example/youtubeplayer/YoutubeVideo.kt package com.example.youtubeplayer data class YoutubeVideo(val id: YoutubeAPIid, val snippet: YoutubeAPIsnippet)
<file_sep># Apache-CXF-JAX-WS-Client After building the SOAP web service with Apache CXF JAX-WS as the server, we can interact with it as a client. Prerequisites: Apache CXF JAX-WS Enjoy it <file_sep>package com.pluralsight.cxfdemo.orders; import org.springframework.beans.factory.annotation.Autowired; import org.springframework.stereotype.Controller; import org.springframework.ui.ModelMap; import org.springframework.web.bind.annotation.RequestMapping; import org.springframework.web.bind.annotation.RequestMethod; import com.pluralsight.schema.order.OrderInquiryResponseType; import com.pluralsight.schema.order.OrderInquiryType; import com.pluralsight.service.orders.Orders; @Controller("/processOrderPlacement") public class OrdersController { @Autowired private Orders orders; @RequestMapping(method = RequestMethod.GET) public String processOrderPlacement(ModelMap model) throws Exception { OrderInquiryType orderInquiry = new com.pluralsight.schema.order.ObjectFactory().createOrderInquiryType(); orderInquiry.setAccountId(1234); orderInquiry.setEan13(1234567890123L); orderInquiry.setOrderQuantity(2); orderInquiry.setUniqueOrderId(999); OrderInquiryResponseType response = orders.processOrderPlacement(orderInquiry); model.addAttribute("orderStatus", response.getOrder().getOrderStatus().value()); return "processOrderPlacement"; } }
<repo_name>swischuk/operator_inference<file_sep>/Makefile .PHONY: test clean REMOVE = rm -rfv PYTHON = python3 PYTEST = pytest --cov --cov-report html TARGET = tests/*.py # test_core.py test: $(PYTHON) check_docs.py $(PYTEST) $(TARGET) open htmlcov/index.html clean: find . -type d -name "*.egg*" | xargs $(REMOVE) find . -type f -name ".coverage*" | xargs $(REMOVE) find . -type d -name ".pytest_cache" | xargs $(REMOVE) find . -type d -name "__pycache__" | xargs $(REMOVE) find . -type d -name ".ipynb_checkpoints" | xargs $(REMOVE) find . -type d -name "htmlcov" | xargs $(REMOVE) <file_sep>/examples/project_and_save.py # project_and_save.py """An old data management script for computing a POD basis, projecting data, computing velocities, and saving the projected data for later. """ import os import sys import h5py import numpy as np from sklearn.utils.extmath import randomized_svd import rom_operator_inference as roi def offdiag_penalizers(reg, r, m): """Construct a list of regularization matrices that penalize the off-diagonal elements of A and all elements of the remaining operators, where the model has the form dx / dt = c + Ax + H(x⊗x) + Bu. """ regI = np.sqrt(reg) * np.eye(1 + r + r*(r+1)//2 + m) Gs = [regI] * r for i in range(1, r+1): Gs[i][i,i] = 0 return Gs def compute_randomized_svd(data, savepath, n_components=500): """Compute and save randomized SVD following https://scikit-learn.org/stable/modules/generated/sklearn.utils.extmath.randomized_svd.html Parameters ---------- data : (n_variables, n_samples) ndarray Snapshot matrix for which the SVD will be computed. savepath : str Name of file to save the SVD to. n_components : float Number of singular values and vectors to compute and save. 
""" Vr,Sigma,_ = randomized_svd(data, n_components, n_iter=15, random_state=42) with h5py.File(savepath, 'w') as svd_h5file: svd_h5file.create_dataset("U", data=Vr) svd_h5file.create_dataset("S", data=Sigma) print(f"SVD computed and saved in {savepath}") return Vr,Sigma def project_data(data, projected_data_folder, projected_xdot_folder, Vr, dt, r_vals=-1): """ Compute xdot and project both data and xdot onto the POD basis. Projected data is saved as "<projected_data_folder>/data_reduced_%d.h5" Projected xdot data is saved as "<projected_xdot_folder>/xdot_reduced_%d.h5" Parameters ---------- data : (n_variables, n_samples) ndarray Data to be projected and used to compute, project xdot. projected_data_folder : str Folder to save projected data to. projected_xdot_folder : str Folder to save projected xdot data to. Vr: (n_variables, r) ndarray The POD basis. dt: float Timestep size between data (for computing xdot) r_vals: list of ints <= Vr.shape[1] (the number of columns in Vr). Basis sizes to compute projection for. 
""" if data.shape[1] < 2: raise ValueError("At least two snapshots required for x' computation") os.makedirs(os.path.dirname(projected_data_folder), exist_ok=True) os.makedirs(os.path.dirname(projected_xdot_folder), exist_ok=True) if r_vals == -1: r_vals = [Vr.shape[1]] for r in r_vals: print(f"r = {r}", flush=True) # select first r basis vectors VrT = Vr[:,:r].T # project snapshots print("\tProjecting snapshots...", end='', flush=True) data_reduced = VrT @ data print("done.") # project velocities print("\tComputing projected velocities...", end='', flush=True) if data.shape[1] <= 4: # Too few time steps for 4th order derivatives xdot_reduced = np.gradient(data_reduced, dt, edge_order=2, axis=-1) else: xdot_reduced = roi.pre.xdot_uniform(data_reduced, dt, order=4) print("done.") # save results print("\tSaving files...", end='', flush=True) fname1 = os.path.join(projected_data_folder, f"data_reduced_{r:d}.h5") with h5py.File(fname1, 'w') as savexf: savexf.create_dataset('data', data=data_reduced) fname2 = os.path.join(projected_xdot_folder, f"xdot_reduced_{r:d}.h5") with h5py.File(fname2, 'w') as savexdotf: savexdotf.create_dataset('xdot', data=xdot_reduced) print("done.") # Some custom error evaluators ------------------------------------------------ def mape(truth, pred): """Mean absolute prediction error.""" mask = truth > 1e-12 return (((np.abs(truth-pred)/truth)[mask]).sum()+pred[~mask].sum())/len(truth) def rpe(truth, pred): """Relative prediction error.""" mask = abs(truth) > 1e-10 rpe_mat = np.zeros(mask.shape) rpe_mat[mask] = (np.abs(truth-pred)/abs(truth))[mask] rpe_mat[~mask] = abs(pred[~mask]) return rpe_mat if __name__ == '__main__': # prompt for the training size and whether to compute the SVD trainsize = int(input("How many snapshots?")) svdfile = os.path.join("data", f"svd_nt{trainsize}.h5") compute_svd = input(f"Compute SVD? 
(True or False -- Type False if it exists as '{svdfile}')") fulldatapath = "data/data_renee.h5" with h5py.File(fulldatapath,'r') as hf: data = hf['data'][:,:] if compute_svd != "False": print("Computing svd...", end='') V,_ = compute_randomized_svd(data[:,:trainsize], svdfile) print("done.") else: print("Loading svd...", end='') with h5py.File(svdfile, 'r') as hfs: V = hfs['U'][:,:] print("done.") print("Computing xdot and reducing data...") target_folder = os.path.join("data", f"data_reduced_minmax_nt{trainsize}") project_data(data[:,:trainsize], target_folder, target_folder, V, 1e-7, [5,10,15]) <file_sep>/README.md # Operator Inference This is a Python implementation of Operator Inference for constructing projection-based reduced-order models of dynamical systems with a polynomial form. The procedure is **data-driven** and **non-intrusive**, making it a viable candidate for model reduction of black-box or complex systems. The methodology was introduced in [\[1\]](https://www.sciencedirect.com/science/article/pii/S0045782516301104). See [**References**](#references) for more papers that use or build on Operator Inference. **Contributors**: [<NAME>](https://github.com/swischuk), [<NAME>](https://github.com/shanemcq18), [<NAME>](https://github.com/elizqian), [<NAME>](https://github.com/bokramer), [<NAME>](https://kiwi.oden.utexas.edu/). See [this repository](https://github.com/elizqian/operator-inference) for a MATLAB implementation and [DOCUMENTATION.md](DOCUMENTATION.md) for the code documentation. ## Problem Statement Consider the (possibly nonlinear) system of _n_ ordinary differential equations with state variable **x**, input (control) variable **u**, and independent variable _t_: <p align="center"><img src="img/prb/eq1.svg"/></p> where <p align="center"><img src="img/prb/eq2.svg"/></p> This system is called the _full-order model_ (FOM). 
If _n_ is large, as it often is in high-consequence engineering applications, it is computationally expensive to numerically solve the FOM. This package provides tools for constructing a _reduced-order model_ (ROM) that is up to quadratic in the state **x** with optional linear control inputs **u**. The procedure is data-driven, non-intrusive, and relatively inexpensive. In the most general case, the code can construct and solve a reduced-order system with the polynomial form <p align="center"><img src="img/prb/eq3.svg"/></p> <!-- <p align="center"> <img src="https://latex.codecogs.com/svg.latex?\dot{\hat{\mathbf{x}}}(t)=\hat{A}\hat{\mathbf{x}}(t)+\hat{H}(\hat{\mathbf{x}}\otimes\hat{\mathbf{x}})(t)+\hat{B}\mathbf{u}(t)+\sum_{i=1}^m\hat{N}_{i}\hat{\mathbf{x}}(t)u_{i}(t)+\hat{\mathbf{c}},"/> </p> --> where now <p align="center"><img src="img/prb/eq4.svg"/></p> <p align="center"><img src="img/prb/eq5.svg"/></p> <!-- <p align="center"> <img src="https://latex.codecogs.com/svg.latex?\hat{A}\in\mathbb{R}^{r\times%20r},\qquad\hat{H}\in\mathbb{R}^{r\times%20r^2},\qquad\hat{B}\in\mathbb{R}^{r\times%20m},\qquad\hat{N}_{i}\in\mathbb{R}^{r\times%20r}."/> </p> --> This reduced low-dimensional system approximates the original high-dimensional system, but it is much easier (faster) to solve because of its low dimension _r_ << _n_. See [DETAILS.md](DETAILS.md) for more mathematical details and an index of notation. ## Quick Start #### Installation Install from the command line with the following single command (requires [`pip`](https://pypi.org/project/pip/) and [`git`](https://git-scm.com/)). ```bash $ pip3 install git+https://github.com/swischuk/rom-operator-inference-Python3.git ``` #### Usage Given a linear basis `Vr`, snapshot data `X`, and snapshot velocities `Xdot`, the following code learns a reduced model for a problem of the form _d**x**/dt = **c** + A**x**(t)_, then solves the reduced system for _0 ≤ t ≤ 1_. 
```python import numpy as np import rom_operator_inference as roi # Define a model of the form dx / dt = c + Ax(t). >>> model = roi.InferredContinuousROM(modelform="cA") # Fit the model to snapshot data X, the snapshot derivative Xdot, # and the linear basis Vr by solving for the operators c_ and A_. >>> model.fit(Vr, X, Xdot) # Simulate the learned model over the time domain [0,1] with 100 timesteps. >>> t = np.linspace(0, 1, 100) >>> X_ROM = model.predict(X[:,0], t) ``` ## Examples The [`examples/`](examples/) folder contains scripts and notebooks that set up and run several examples: - [`examples/tutorial.ipynb`](https://nbviewer.jupyter.org/github/Willcox-Research-Group/rom-operator-inference-Python3/blob/master/examples/tutorial.ipynb): A walkthrough of a very simple heat equation example. - [`examples/heat_1D.ipynb`](https://nbviewer.jupyter.org/github/Willcox-Research-Group/rom-operator-inference-Python3/blob/master/examples/heat_1D.ipynb): A more complicated one-dimensional heat equation example [\[1\]](https://www.sciencedirect.com/science/article/pii/S0045782516301104). - [`examples/data_driven_heat.ipynb`](https://nbviewer.jupyter.org/github/Willcox-Research-Group/rom-operator-inference-Python3/blob/master/examples/data_driven_heat.ipynb): A purely data-driven example using data generated from a one-dimensional heat equation \[4\]. <!-- - `examples/TODO.ipynb`: Burgers' equation [\[1\]](https://www.sciencedirect.com/science/article/pii/S0045782516301104). --> <!-- - `examples/TODO.ipynb`: Euler equation [\[2\]](https://arc.aiaa.org/doi/10.2514/6.2019-3707). --> <!-- This example uses MATLAB's Curve Fitting Toolbox to generate the random initial conditions. --> (More examples coming) ## References - \[1\] <NAME>. and <NAME>., [Data-driven operator inference for non-intrusive projection-based model reduction.](https://www.sciencedirect.com/science/article/pii/S0045782516301104) _Computer Methods in Applied Mechanics and Engineering_, Vol. 306, pp. 
196-215, 2016. ([Download](https://kiwi.oden.utexas.edu/papers/Non-intrusive-model-reduction-Peherstorfer-Willcox.pdf))<details><summary>BibTeX</summary><pre> @article{Peherstorfer16DataDriven, title = {Data-driven operator inference for nonintrusive projection-based model reduction}, author = {<NAME>. and <NAME>.}, journal = {Computer Methods in Applied Mechanics and Engineering}, volume = {306}, pages = {196--215}, year = {2016}, publisher = {Elsevier} }</pre></details> - \[2\] <NAME>., <NAME>., <NAME>., and <NAME>., [Transform & Learn: A data-driven approach to nonlinear model reduction](https://arc.aiaa.org/doi/10.2514/6.2019-3707). In the AIAA Aviation 2019 Forum & Exhibition, Dallas, TX, June 2019. ([Download](https://kiwi.oden.utexas.edu/papers/learn-data-driven-nonlinear-reduced-model-Qian-Willcox.pdf))<details><summary>BibTeX</summary><pre> @inbook{QKMW2019aviation, title = {Transform \\& Learn: A data-driven approach to nonlinear model reduction}, author = {<NAME>. and <NAME>. and <NAME>. and <NAME>.}, booktitle = {AIAA Aviation 2019 Forum}, doi = {10.2514/6.2019-3707}, URL = {https://arc.aiaa.org/doi/abs/10.2514/6.2019-3707}, eprint = {https://arc.aiaa.org/doi/pdf/10.2514/6.2019-3707} }</pre></details> - \[3\] <NAME>., <NAME>., <NAME>., and <NAME>., [Projection-based model reduction: Formulations for physics-based machine learning.](https://www.sciencedirect.com/science/article/pii/S0045793018304250) _Computers & Fluids_, Vol. 179, pp. 704-717, 2019. ([Download](https://kiwi.oden.utexas.edu/papers/Physics-based-machine-learning-swischuk-willcox.pdf))<details><summary>BibTeX</summary><pre> @article{swischuk2019projection, title = {Projection-based model reduction: Formulations for physics-based machine learning}, author = {<NAME>. and <NAME>. and <NAME>. 
and <NAME>.}, journal = {Computers \\& Fluids}, volume = {179}, pages = {704--717}, year = {2019}, publisher = {Elsevier} }</pre></details> - \[4\] <NAME>., [Physics-based machine learning and data-driven reduced-order modeling](https://dspace.mit.edu/handle/1721.1/122682). Master's thesis, Massachusetts Institute of Technology, 2019. ([Download](https://dspace.mit.edu/bitstream/handle/1721.1/122682/1123218324-MIT.pdf))<details><summary>BibTeX</summary><pre> @phdthesis{swischuk2019physics, title = {Physics-based machine learning and data-driven reduced-order modeling}, author = {<NAME>}, year = {2019}, school = {Massachusetts Institute of Technology} }</pre></details> - \[5\] <NAME>. [Sampling low-dimensional Markovian dynamics for pre-asymptotically recovering reduced models from data with operator inference](https://arxiv.org/abs/1908.11233). arXiv:1908.11233. ([Download](https://arxiv.org/pdf/1908.11233.pdf))<details><summary>BibTeX</summary><pre> @article{peherstorfer2019sampling, title = {Sampling low-dimensional Markovian dynamics for pre-asymptotically recovering reduced models from data with operator inference}, author = {<NAME>}, journal = {arXiv preprint arXiv:1908.11233}, year = {2019} }</pre></details> - \[6\] <NAME>., <NAME>., <NAME>., and <NAME>., [Learning physics-based reduced-order models for a single-injector combustion process](https://arc.aiaa.org/doi/10.2514/1.J058943). _AIAA Journal_, published online March 2020. Also in Proceedings of 2020 AIAA SciTech Forum & Exhibition, Orlando FL, January, 2020. Also Oden Institute Report 19-13. ([Download](https://kiwi.oden.utexas.edu/papers/learning-reduced-model-combustion-Swischuk-Kramer-Huang-Willcox.pdf))<details><summary>BibTeX</summary><pre> @article{SKHW2019_learning_ROMs_combustor, title = {Learning physics-based reduced-order models for a single-injector combustion process}, author = {<NAME>. and <NAME>. and <NAME>. 
and <NAME>.}, journal = {AIAA Journal}, volume = {}, pages = {Published Online: 19 Mar 2020}, url = {}, year = {2020} }</pre></details> - \[7\] <NAME>., <NAME>., <NAME>., and <NAME>. [Lift & Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems](https://www.sciencedirect.com/science/article/abs/pii/S0167278919307651). _Physica D: Nonlinear Phenomena_, Volume 406, May 2020, 132401. ([Download](https://kiwi.oden.utexas.edu/papers/lift-learn-scientific-machine-learning-Qian-Willcox.pdf))<details><summary>BibTeX</summary><pre> @article{QKPW2020_lift_and_learn, title = {Lift \\& Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems.}, author = {<NAME>. and <NAME>. and <NAME>. and <NAME>.}, journal = {Physica {D}: {N}onlinear {P}henomena}, volume = {406}, pages = {132401}, url = {https://doi.org/10.1016/j.physd.2020.132401}, year = {2020} }</pre></details> <file_sep>/check_docs.py # check_docs.py """Script for catching errors in README and DETAILS files.""" import os import re import difflib # Global variables ============================================================ README = "README.md" DETAILS = "DETAILS.md" DOCS = "DOCUMENTATION.md" SETUP = "setup.py" INIT = os.path.join("rom_operator_inference", "__init__.py") MDFILES = [README, DETAILS, DOCS] DOCFILES = [DETAILS, DOCS] VNFILES = [SETUP, INIT] REDSPACE = '\x1b[41m \x1b[49m' VERSION = re.compile(r'_{0,2}?version_{0,2}?\s*=\s*"([\d\.]+?)"', re.MULTILINE) TEX = re.compile( r'''<img\ src= # begin image tag "https://latex.codecogs.com/svg.latex\? # LaTeX web prefix (.+?) # LATEX CODE "/> # end image tag ''', re.VERBOSE | re.MULTILINE) # Tests ======================================================================= def check_latex_code_for_spaces(filelist=MDFILES): """Make sure there are no spaces in the latex code snippets, since that keeps them from displaying correctly on Github. """ # Read files and search for errors. 
errors = [] for filename in filelist: with open(filename, 'r') as infile: data = infile.read() for index,tex in enumerate(TEX.findall(data)): if ' ' in tex: errors.append((filename, index, tex)) # Report errors. for name,index,tex in errors: print(f"bad space in {name}, LaTeX occurrence #{index+1}:\n\t", tex.replace(' ', REDSPACE), sep='', end='\n') if errors: nerrs = len(errors) raise SyntaxError(f"{nerrs} LaTeX error{'s' if nerrs > 1 else ''}") def check_references_sections_are_the_same(filelist=MDFILES): """Make sure that the "References" sections in each doc file are the same. """ # Read both '## References' sections. refsections = [] for filename in filelist: with open(filename, 'r') as infile: data = infile.read() refsections.append(data[data.index("## References"):].splitlines()) # Compare sections and report errors. errors = False for i in range(len(filelist)-1): file1, file2 = filelist[i:i+2] for line in difflib.unified_diff(refsections[i], refsections[i+1], fromfile=file1, tofile=file2): print(line) errors = True if errors: raise SyntaxError(f"'References' of {file1} and {file2}" " do not match") def check_index_sections_are_the_same(filelist=DOCFILES): """Make sure that the "Index of Notation" sections in each doc file are the same. """ idxsections = [] for filename in filelist: with open(filename, 'r') as infile: data = infile.read() start = data.index("## Index of Notation") end = data.index("## References") idxsections.append(data[start:end].splitlines()) # Compare sections and report errors. 
errors = False for i in range(len(filelist)-1): file1, file2 = filelist[i:i+2] for line in difflib.unified_diff(idxsections[i], idxsections[i+1], fromfile=file1, tofile=file2): print(line) errors = True if errors: raise SyntaxError(f"'Index of Notation' of {file1} and {file2} " " do not match") def check_version_numbers_match(filelist=VNFILES): """Make sure that the version number in setup.py and __init__.py match.""" if len(filelist) != 2: raise ValueError("can only compare 2 files at a time") file1, file2 = filelist # Get the version number listed in each file. versions = [] for filename in filelist: with open(filename, 'r') as infile: data = infile.read() versions.append(VERSION.findall(data)[0]) if versions[0] != versions[1]: raise ValueError(f"Version numbers in {file1} and {file2} do " f"not match ('{versions[0]}' != '{versions[1]}')") if __name__ == "__main__": check_latex_code_for_spaces() check_references_sections_are_the_same() check_index_sections_are_the_same() check_version_numbers_match() <file_sep>/DETAILS.md # Summary of Mathematical Details This document gives a short explanation of the mathematical details behind the package. For a full treatment, see [\[1\]](https://www.sciencedirect.com/science/article/pii/S0045782516301104). However, note that some notation has been altered for coding convenience and clarity. **Contents** - [**Problem Statement**](#problem-statement) - [**Projection-based Model Reduction**](#projection-based-model-reduction) - [**Operator Inference via Least Squares**](#operator-inference-via-least-squares) - [**Index of Notation**](#index-of-notation) - [**References**](#references) ## Problem Statement Consider the (possibly nonlinear) system of _n_ ordinary differential equations with state variable **x**, input (control) variable **u**, and independent variable _t_: <p align="center"><img src="img/prb/eq1.svg"/></p> where <p align="center"><img src="img/prb/eq2.svg"/></p> This system is called the _full-order model_ (FOM). 
If _n_ is large, as it often is in high-consequence engineering applications, it is computationally expensive to numerically solve the FOM. This package provides tools for constructing a _reduced-order model_ (ROM) that is up to quadratic in the state **x** with optional linear control inputs **u**. The procedure is data-driven, non-intrusive, and relatively inexpensive. In the most general case, the code can construct and solve a reduced-order system with the polynomial form <p align="center"><img src="img/prb/eq3.svg"/></p> <!-- <p align="center"> <img src="https://latex.codecogs.com/svg.latex?\dot{\hat{\mathbf{x}}}(t)=\hat{A}\hat{\mathbf{x}}(t)+\hat{H}(\hat{\mathbf{x}}\otimes\hat{\mathbf{x}})(t)+\hat{B}\mathbf{u}(t)+\sum_{i=1}^m\hat{N}_{i}\hat{\mathbf{x}}(t)u_{i}(t)+\hat{\mathbf{c}},"/> </p> --> where now <p align="center"><img src="img/prb/eq4.svg"/></p> <p align="center"><img src="img/prb/eq5.svg"/></p> <!-- <p align="center"> <img src="https://latex.codecogs.com/svg.latex?\hat{A}\in\mathbb{R}^{r\times%20r},\qquad\hat{H}\in\mathbb{R}^{r\times%20r^2},\qquad\hat{B}\in\mathbb{R}^{r\times%20m},\qquad\hat{N}_{i}\in\mathbb{R}^{r\times%20r}."/> </p> --> This reduced low-dimensional system approximates the original high-dimensional system, but it is much easier (faster) to solve because of its low dimension _r_ << _n_. ## Projection-based Model Reduction Model reduction via projection occurs in three steps: 1. **Data Collection**: Gather snapshot data, i.e., solutions to the full-order model (the FOM) at various times / parameters. 2. **Compression**: Compute a low-rank basis (which defines a low-dimensional linear subspace) that captures most of the behavior of the snapshots. 3. **Projection**: Use the low-rank basis to construct a low-dimensional ODE (the ROM) that approximates the FOM. <!-- These steps comprise what is called the _offline phase_ in the literature, since they can all be done before the resulting ROM is simulated. 
--> This package focuses on step 3, constructing the ROM given the snapshot data and the low-rank basis from steps 1 and 2, respectively. Let _X_ be the _n_ x _k_ matrix whose _k_ columns are each solutions to the FOM of length _n_ (step 1), and let _V_<sub>_r_</sub> be an orthonormal _n_ x _r_ matrix representation for an _r_-dimensional subspace (step 2). For example, a common choice for _V_<sub>_r_</sub> is the POD Basis of rank _r_, the matrix comprised of the first _r_ singular vectors of _X_. We call _X_ the _snapshot matrix_ and _V_<sub>_r_</sub> the _reduced basis matrix_. The classical approach to the projection step is to make the Ansatz <p align="center"><img src="img/dtl/eq01.svg"/></p> Inserting this into the FOM and multiplying both sides by the transpose of _V_<sub>_r_</sub> yields <p align="center"><img src="img/dtl/eq02.svg"/></p> This new system is _r_-dimensional in the sense that <p align="center"><img src="img/dtl/eq03.svg"/></p> If the FOM operator **f** is known and has a nice structure, this reduced system can be solved cheaply by precomputing any involved matrices and then applying a time-stepping scheme. For example, if **f** is linear in **x** and there is no input **u**, then <p align="center"><img src="img/dtl/eq04.svg"/></p> where <p align="center"><img src="img/dtl/eq05.svg"/></p> However, _this approach breaks down if the FOM operator **f** is unknown, uncertain, or highly nonlinear_. ## Operator Inference via Least Squares Instead of directly computing the reduced operators, the Operator Inference framework takes a data-driven approach: assuming a specific structure of the ROM (linear, quadratic, etc.), solve for the involved operators that best fit the data. 
For example, suppose that we seek a ROM of the form <p align="center"><img src="img/dtl/eq06.svg"/></p> We have only the snapshot matrix _X_, the low-rank basis matrix _V_<sub>_r_</sub> (which was derived from _X_), the inputs _U_, and perhaps the snapshot velocities _X'_ (if not, these must be approximated). Here the (_ij_)<sup>th</sup> entry of _U_ is the _i_<sup>th</sup> component of **u** at the time corresponding to the _j_<sup>th</sup> snapshot. To solve for the linear operators on the right-hand side of the preceding equation, we project the snapshot data via the basis matrix, <p align="center"><img src="img/dtl/eq07.svg"/></p> then solve the least squares problem <p align="center"><img src="img/dtl/eq08.svg"/></p> where **1** is a _k_-vector of 1's and <p align="center"><img src="img/dtl/eq09.svg"/></p> For our purposes, the ⊗ operator between matrices denotes a column-wise Kronecker product (also called the [Khatri-Rao product](https://en.wikipedia.org/wiki/Kronecker_product#Khatri%E2%80%93Rao_product)). The minimization problem given above decouples into _r_ independent ordinary least-squares problems, one for each of the columns of _O<sup>T</sup>_: <p align="center"><img src="img/dtl/eq10.svg"/></p> The entire routine is relatively inexpensive to solve. The code also allows for a Tikhonov regularization matrix or list of matrices (the `P` keyword argument for `predict()` methods), in which case the problem being solved is <p align="center"><img src="img/dtl/eq11.svg"/></p> It can be shown [\[1\]](https://www.sciencedirect.com/science/article/pii/S0045782516301104) that, under some idealized assumptions, these inferred operators converge to the operators computed by explicit projection. The key idea, however, is that _the inferred operators can be cheaply computed without knowing the full-order model_. This is very convenient in situations where the FOM is given by a "black box," such as a legacy code for complex fluid simulations. 
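Because the problem decouples into independent ordinary least squares problems, it is straightforward to set up explicitly. The sketch below (the helper name `infer_operators` is introduced here purely for illustration; it is not part of this package's API) infers the operators of a model of the form d**x̂**/dt = **ĉ** + Â**x̂** + B̂**u** from projected data, solving the Tikhonov-regularized normal equations with a scalar penalty:

```python
import numpy as np

def infer_operators(X_, Xdot_, U, reg=0.0):
    """Infer c_, A_, B_ for d/dt x_ = c_ + A_ x_ + B_ u via least squares.

    X_    : (r, k) projected snapshot matrix
    Xdot_ : (r, k) projected snapshot velocity matrix
    U     : (m, k) input matrix
    reg   : scalar Tikhonov penalty (regularizer P = sqrt(reg) * I)
    """
    r, k = X_.shape
    # Each row of the data matrix D corresponds to one snapshot: [1, x_^T, u^T].
    D = np.hstack([np.ones((k, 1)), X_.T, U.T])     # shape (k, 1 + r + m)
    R = Xdot_.T                                     # right-hand side, (k, r)
    # Solve the (regularized) normal equations for the operator matrix O.
    d = D.shape[1]
    O = np.linalg.solve(D.T @ D + reg * np.eye(d), D.T @ R)   # (d, r)
    # Unpack the block rows of O into the learned operators.
    c_ = O[0]           # constant term, shape (r,)
    A_ = O[1:1+r].T     # state matrix, shape (r, r)
    B_ = O[1+r:].T      # input matrix, shape (r, m)
    return c_, A_, B_
```

Each column of `O` here is one of the decoupled least squares solutions described above; for a quadratic model the data matrix would gain (compact) Kronecker columns, and a production solver would use a more robust method than the normal equations.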
#### The Discrete Case The framework described above can also be used to construct reduced-order models for approximating _discrete_ dynamical systems. For instance, consider the full-order model <p align="center"><img src="img/dtl/eq12.svg"/></p> Instead of collecting snapshot velocities, we collect _k+1_ snapshots and let _X_ be the _n x k_ matrix whose columns are the first _k_ snapshots and _X'_ be the _n x k_ matrix whose columns are the last _k_ snapshots. That is, the columns **x**<sub>_k_</sub> of _X_ and **x**<sub>_k_</sub>' satisfy <p align="center"><img src="img/dtl/eq13.svg"/></p> Then we set up the same least squares problem as before, but now the right-hand side matrix is <p align="center"><img src="img/dtl/eq14.svg"/></p> The resulting reduced-order model has the form <p align="center"><img src="img/dtl/eq15.svg"><p> <!-- TODO: #### Re-projection and Recovering Intrusive Models --> #### Implementation Note: The Kronecker Product The vector [Kronecker product](https://en.wikipedia.org/wiki/Kronecker_product) ⊗ introduces some redundancies. For example, the product **x** ⊗ **x** contains both _x_<sub>1</sub>_x_<sub>2</sub> and _x_<sub>2</sub>_x_<sub>1</sub>. To avoid these redundancies, we introduce a "compact" Kronecker product <img src="img/dtl/eq16.svg" height=10/> which only computes the unique terms of the usual vector Kronecker product: <p align="center"><img src="img/dtl/eq17.svg"/></p> When the compact Kronecker product is used, we call the resulting operator _H<sub>c</sub>_ instead of _H_. Thus, the reduced order model becomes <p align="center"><img src="img/dtl/eq18.svg"/></p> and the corresponding Operator Inference least squares problem is <p align="center"><img src="img/dtl/eq19.svg"/></p> ## Index of Notation We generally denote scalars in lower case, vectors in bold lower case, matrices in upper case, and indicate low-dimensional quantities with a hat. 
In the code, the name of a low-dimensional (reduced) quantity ends with an underscore, so that the model classes follow some principles from the [scikit-learn](https://scikit-learn.org/stable/index.html) [API](https://scikit-learn.org/stable/developers/contributing.html#apis-of-scikit-learn-objects).

### Dimensions

| Symbol | Code | Description |
| :----: | :--- | :---------- |
| <img src="img/ntn/n.svg"/> | `n` | Dimension of the full-order system (large) |
| <img src="img/ntn/r.svg"/> | `r` | Dimension of the reduced-order system (small) |
| <img src="img/ntn/m.svg"/> | `m` | Dimension of the input **u** |
| <img src="img/ntn/k.svg"/> | `k` | Number of state snapshots, i.e., the number of training points |
| <img src="img/ntn/s.svg"/> | `s` | Number of parameter samples for parametric training |
| <img src="img/ntn/p.svg"/> | `p` | Dimension of the parameter space |
| <img src="img/ntn/d.svg"/> | `d` | Number of columns of the data matrix _D_ |
| <img src="img/ntn/nt.svg"/> | `nt` | Number of time steps in a simulation |

<!-- | <img src="https://latex.codecogs.com/svg.latex?\ell"/> | `l` | Dimension of the output **y** | -->

### Vectors

<!--
\sigma_j\in\text{diag}(\Sigma) &= \textrm{singular value of }X\\
\boldsymbol{\mu}\in\mathcal{P} &= \text{system parameter}\\
\mathcal{P}\subset\mathbb{R}^{p} &= \text{parameter space}\\
\Omega\subset\mathbb{R}^{d} &= \text{spatial domain}\\
% \omega\in\Omega &= \text{spatial point (one dimension)}\\
\boldsymbol{\omega}\in\Omega &= \text{spatial point}\\
t\ge 0 &= \text{time}\\
\hat{} &= \textrm{reduced variable, e.g., }\hat{\mathbf{x}}\textrm{ or }\hat{A}\\
\dot{} = \frac{d}{dt} &= \text{time derivative}
-->

| Symbol | Code | Size | Description |
| :----: | :--- | :--: | :---------- |
| <img src="img/ntn/x.svg"/> | `x` | <img src="img/ntn/n.svg"/> | Full-order state vector |
| <img src="img/ntn/xhat.svg"/> | `x_` | <img src="img/ntn/r.svg"/> | Reduced-order state vector |
| <img src="img/ntn/xhatdot.svg"/> | `xdot_` | <img src="img/ntn/r.svg"/> | Reduced-order state velocity vector |
| <img src="img/ntn/xrom.svg"/> | `x_ROM` | <img src="img/ntn/n.svg"/> | Approximation to **x** produced by the ROM |
| <img src="img/ntn/chat.svg"/> | `c_` | <img src="img/ntn/r.svg"/> | Learned constant term |
| <img src="img/ntn/u.svg"/> | `u` | <img src="img/ntn/m.svg"/> | Input vector |
| <img src="img/ntn/f.svg"/> | `f(t,x,u)` | <img src="img/ntn/n.svg"/> | Full-order system operator |
| <img src="img/ntn/fhat.svg"/> | `f_(t,x_,u)` | <img src="img/ntn/r.svg"/> | Reduced-order system operator |
| <img src="img/ntn/kronx.svg"/> | `np.kron(x,x)` | <img src="img/ntn/n2.svg"/> | Kronecker product of full state (quadratic terms) |
| <img src="img/ntn/kronxhat.svg"/> | `np.kron(x_,x_)` | <img src="img/ntn/r2.svg"/> | Kronecker product of reduced state (quadratic terms) |
| <img src="img/ntn/kronxhatc.svg"/> | `kron2c(x_)` | <img src="img/ntn/r2c.svg"/> | Compact Kronecker product of reduced state (quadratic terms) |
| <img src="img/ntn/vj.svg"/> | `vj` | <img src="img/ntn/n.svg"/> | _j_<sup>th</sup> subspace basis vector, i.e., column _j_ of _V_<sub>_r_</sub> |

<!-- | **y** | `y` | Output vector | -->
<!-- | **y_ROM**, **y~** | `y_ROM` | Approximation to **y** produced by ROM | -->

### Matrices

| Symbol | Code | Shape | Description |
| :----: | :--- | :---: | :---------- |
| <img src="img/ntn/Vr.svg"/> | `Vr` | <img src="img/ntn/nxr.svg"/> | Low-rank basis of rank _r_ (usually the POD basis) |
| <img src="img/ntn/XX.svg"/> | `X` | <img src="img/ntn/nxk.svg"/> | Snapshot matrix |
| <img src="img/ntn/XXdot.svg"/> | `Xdot` | <img src="img/ntn/nxk.svg"/> | Snapshot velocity matrix |
| <img src="img/ntn/UU.svg"/> | `U` | <img src="img/ntn/mxk.svg"/> | Input matrix (inputs corresponding to the snapshots) |
| <img src="img/ntn/XXhat.svg"/> | `X_` | <img src="img/ntn/rxk.svg"/> | Projected snapshot matrix |
| <img src="img/ntn/XXhatdot.svg"/> | `Xdot_` | <img src="img/ntn/rxk.svg"/> | Projected snapshot velocity matrix |
| <img src="img/ntn/DD.svg"/> | `D` | <img src="img/ntn/kxd.svg"/> | Data matrix |
| <img src="img/ntn/OO.svg"/> | `O` | <img src="img/ntn/dxr.svg"/> | Operator matrix |
| <img src="img/ntn/RR.svg"/> | `R` | <img src="img/ntn/kxr.svg"/> | Right-hand side matrix |
| <img src="img/ntn/PP.svg"/> | `P` | <img src="img/ntn/dxd.svg"/> | Tikhonov regularization matrix |
| <img src="img/ntn/AAhat.svg"/> | `A_` | <img src="img/ntn/rxr.svg"/> | Learned state matrix |
| <img src="img/ntn/HHhat.svg"/> | `H_` | <img src="img/ntn/rxr2.svg"/> | Learned matricized quadratic tensor |
| <img src="img/ntn/HHhatc.svg"/> | `Hc_` | <img src="img/ntn/rxr2c.svg"/> | Learned matricized quadratic tensor without redundancy (compact) |
| <img src="img/ntn/BBhat.svg"/> | `B_` | <img src="img/ntn/rxm.svg"/> | Learned input matrix |

<!-- | <img src="https://latex.codecogs.com/svg.latex?\hat{N}_i"/> | `Ni_` | <img src="https://latex.codecogs.com/svg.latex?r\times%20r"/> | Bilinear state-input matrix for _i_th input | -->
<!-- | <img src="https://latex.codecogs.com/svg.latex?\hat{C}"/> | `C_` | <img src="https://latex.codecogs.com/svg.latex?q\times%20r"/> | Learned output matrix | -->
<!-- I_{a\times%20a}\in\mathbb{R}^{a\times a} | | identity matrix\\ -->
<!-- \Sigma \in \mathbb{R}^{\ell\times\ell} &= \text{diagonal singular value matrix}\\ -->

## References

- \[1\] <NAME>. and <NAME>., [Data-driven operator inference for non-intrusive projection-based model reduction.](https://www.sciencedirect.com/science/article/pii/S0045782516301104) _Computer Methods in Applied Mechanics and Engineering_, Vol. 306, pp. 196-215, 2016. ([Download](https://kiwi.oden.utexas.edu/papers/Non-intrusive-model-reduction-Peherstorfer-Willcox.pdf))<details><summary>BibTeX</summary><pre>
@article{Peherstorfer16DataDriven,
    title     = {Data-driven operator inference for nonintrusive projection-based model reduction},
    author    = {<NAME>. and <NAME>.},
    journal   = {Computer Methods in Applied Mechanics and Engineering},
    volume    = {306},
    pages     = {196--215},
    year      = {2016},
    publisher = {Elsevier}
}</pre></details>
- \[2\] <NAME>., <NAME>., <NAME>., and <NAME>., [Transform & Learn: A data-driven approach to nonlinear model reduction](https://arc.aiaa.org/doi/10.2514/6.2019-3707). In the AIAA Aviation 2019 Forum & Exhibition, Dallas, TX, June 2019. ([Download](https://kiwi.oden.utexas.edu/papers/learn-data-driven-nonlinear-reduced-model-Qian-Willcox.pdf))<details><summary>BibTeX</summary><pre>
@inbook{QKMW2019aviation,
    title     = {Transform \& Learn: A data-driven approach to nonlinear model reduction},
    author    = {<NAME>. and <NAME>. and <NAME>. and <NAME>.},
    booktitle = {AIAA Aviation 2019 Forum},
    doi       = {10.2514/6.2019-3707},
    URL       = {https://arc.aiaa.org/doi/abs/10.2514/6.2019-3707},
    eprint    = {https://arc.aiaa.org/doi/pdf/10.2514/6.2019-3707}
}</pre></details>
- \[3\] <NAME>., <NAME>., <NAME>., and <NAME>., [Projection-based model reduction: Formulations for physics-based machine learning.](https://www.sciencedirect.com/science/article/pii/S0045793018304250) _Computers & Fluids_, Vol. 179, pp. 704-717, 2019. ([Download](https://kiwi.oden.utexas.edu/papers/Physics-based-machine-learning-swischuk-willcox.pdf))<details><summary>BibTeX</summary><pre>
@article{swischuk2019projection,
    title     = {Projection-based model reduction: Formulations for physics-based machine learning},
    author    = {<NAME>. and <NAME>. and <NAME>. and <NAME>.},
    journal   = {Computers \& Fluids},
    volume    = {179},
    pages     = {704--717},
    year      = {2019},
    publisher = {Elsevier}
}</pre></details>
- \[4\] <NAME>., [Physics-based machine learning and data-driven reduced-order modeling](https://dspace.mit.edu/handle/1721.1/122682). Master's thesis, Massachusetts Institute of Technology, 2019. ([Download](https://dspace.mit.edu/bitstream/handle/1721.1/122682/1123218324-MIT.pdf))<details><summary>BibTeX</summary><pre>
@phdthesis{swischuk2019physics,
    title  = {Physics-based machine learning and data-driven reduced-order modeling},
    author = {<NAME>},
    year   = {2019},
    school = {Massachusetts Institute of Technology}
}</pre></details>
- \[5\] <NAME>., [Sampling low-dimensional Markovian dynamics for pre-asymptotically recovering reduced models from data with operator inference](https://arxiv.org/abs/1908.11233). arXiv:1908.11233. ([Download](https://arxiv.org/pdf/1908.11233.pdf))<details><summary>BibTeX</summary><pre>
@article{peherstorfer2019sampling,
    title   = {Sampling low-dimensional Markovian dynamics for pre-asymptotically recovering reduced models from data with operator inference},
    author  = {<NAME>},
    journal = {arXiv preprint arXiv:1908.11233},
    year    = {2019}
}</pre></details>
- \[6\] <NAME>., <NAME>., <NAME>., and <NAME>., [Learning physics-based reduced-order models for a single-injector combustion process](https://arc.aiaa.org/doi/10.2514/1.J058943). _AIAA Journal_, published online March 2020. Also in Proceedings of the 2020 AIAA SciTech Forum & Exhibition, Orlando, FL, January 2020. Also Oden Institute Report 19-13. ([Download](https://kiwi.oden.utexas.edu/papers/learning-reduced-model-combustion-Swischuk-Kramer-Huang-Willcox.pdf))<details><summary>BibTeX</summary><pre>
@article{SKHW2019_learning_ROMs_combustor,
    title   = {Learning physics-based reduced-order models for a single-injector combustion process},
    author  = {<NAME>. and <NAME>. and <NAME>. and <NAME>.},
    journal = {AIAA Journal},
    volume  = {},
    pages   = {Published Online: 19 Mar 2020},
    url     = {},
    year    = {2020}
}</pre></details>
- \[7\] <NAME>., <NAME>., <NAME>., and <NAME>., [Lift & Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems](https://www.sciencedirect.com/science/article/abs/pii/S0167278919307651). _Physica D: Nonlinear Phenomena_, Volume 406, May 2020, 132401. ([Download](https://kiwi.oden.utexas.edu/papers/lift-learn-scientific-machine-learning-Qian-Willcox.pdf))<details><summary>BibTeX</summary><pre>
@article{QKPW2020_lift_and_learn,
    title   = {Lift \& Learn: Physics-informed machine learning for large-scale nonlinear dynamical systems},
    author  = {<NAME>. and <NAME>. and <NAME>. and <NAME>.},
    journal = {Physica {D}: {N}onlinear {P}henomena},
    volume  = {406},
    pages   = {132401},
    url     = {https://doi.org/10.1016/j.physd.2020.132401},
    year    = {2020}
}</pre></details>
<file_sep>/img/README.md
The files here are LaTeX-generated SVG images embedded into the markdown-based documentation.
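The notation tables above define all the ingredients of the operator-inference least-squares problem min<sub>O</sub> ||_DO_ − _R_||² + ||_PO_||². Here is a rough, self-contained NumPy sketch of how those quantities fit together; the `kron2c` ordering convention, the regularization value, and the solver choice are illustrative assumptions, not the package's actual implementation:

```python
import numpy as np

def kron2c(x):
    """Compact Kronecker product: the r(r+1)/2 unique quadratic terms
    x_i * x_j with i >= j (ordering assumed here for illustration)."""
    return np.concatenate([x[i] * x[:i + 1] for i in range(len(x))])

# Toy dimensions: n (full), r (reduced), m (input), k (snapshots).
n, r, m, k = 50, 4, 2, 200
rng = np.random.default_rng(0)

# Synthetic snapshot data X (n x k), velocities Xdot (n x k), inputs U (m x k).
X = rng.standard_normal((n, k))
Xdot = rng.standard_normal((n, k))
U = rng.standard_normal((m, k))

# Low-rank basis Vr (n x r) from the SVD of X (the POD basis).
Vr = np.linalg.svd(X, full_matrices=False)[0][:, :r]

# Projected snapshots and velocities (r x k).
X_ = Vr.T @ X
Xdot_ = Vr.T @ Xdot

# Data matrix D (k x d): one block per learned operator.
D = np.hstack([
    np.ones((k, 1)),                      # constant term (c_)
    X_.T,                                 # linear term (A_)
    np.array([kron2c(x) for x in X_.T]),  # compact quadratic term (Hc_)
    U.T,                                  # input term (B_)
])
d = D.shape[1]                            # d = 1 + r + r(r+1)/2 + m

# Right-hand side R (k x r) and a Tikhonov regularizer P (d x d).
R = Xdot_.T
P = 1e-2 * np.eye(d)

# Solve min ||D O - R||^2 + ||P O||^2 by stacking the regularizer under D.
O = np.linalg.lstsq(np.vstack([D, P]),
                    np.vstack([R, np.zeros((d, r))]), rcond=None)[0]

# Unpack the learned operators from O (d x r); rows follow D's column order.
c_ = O[0]                                       # (r,)
A_ = O[1:1 + r].T                               # (r, r)
Hc_ = O[1 + r:1 + r + r * (r + 1) // 2].T       # (r, r(r+1)/2)
B_ = O[1 + r + r * (r + 1) // 2:].T             # (r, m)
```

Stacking _P_ beneath _D_ (with zeros beneath _R_) turns the regularized problem into an ordinary least-squares solve, which is one standard way to implement Tikhonov regularization.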
swischuk/operator_inference
<repo_name>lolappscam/dotfiles<file_sep>/bash/bashrc
# Git completion (sourced only if the file is present, matching the RVM guard below)
[[ -s "$HOME/git-completion.bash" ]] && source "$HOME/git-completion.bash"

# Load RVM if present
[[ -s "$HOME/.rvm/scripts/rvm" ]] && source "$HOME/.rvm/scripts/rvm"