.. _formats:
=================
Supported Formats
=================
The following pages describe the currently supported file formats and their format-specific options. A full API is also included for advanced users.
.. toctree::
:maxdepth: 1
format_fits.rst
format_vo.rst
format_hdf5.rst
format_ipac.rst
format_ascii.rst
format_sql.rst
format_online.rst
Unless stated otherwise below, all table formats support the following column types (see the example after this list):
* booleans
* 8-, 16-, 32-, and 64-bit signed integer numbers.
* 8-, 16-, 32-, and 64-bit unsigned integer numbers.
* 32- and 64-bit floating point numbers.
* ASCII strings
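
For example, one column of each kind can be added via ``add_column`` (a
minimal sketch; the column names and values are hypothetical)::

    >>> import atpy, numpy as np
    >>> t = atpy.Table()
    >>> t.add_column('flag', np.array([True, False]))
    >>> t.add_column('id', np.array([1, 2], dtype=np.int64))
    >>> t.add_column('flux', np.array([1.5, 2.5], dtype=np.float32))
    >>> t.add_column('name', np.array(['a', 'b'], dtype='S8'))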
========================
Obtaining and Installing
========================
Requirements
============
ATpy requires the following:
- `Python <http://www.python.org>`_ 2.6 or later
- `Numpy <http://www.numpy.org/>`_ 1.5 or later
- `Astropy <http://www.astropy.org>`_ 0.2 or later
The following packages are optional, but are required to read/write to certain
formats:
- `h5py <http://www.h5py.org>`_ 1.3.0 or later (for HDF5 tables)
- `MySQL-python <http://sourceforge.net/projects/mysql-python>`_ 1.2.2 or later
(for MySQL tables)
- `PyGreSQL <http://www.pygresql.org/>`_ 3.8.1 or later (for PostgreSQL tables)
Stable version
==============
The latest stable release of ATpy can be downloaded from `PyPI <https://pypi.python.org/pypi/ATpy>`_. To install ATpy, use the standard installation procedure::
    tar xvzf ATpy-X.X.X.tar.gz
    cd ATpy-X.X.X/
    python setup.py install
Developer version
=================
Advanced users wishing to use the latest development ("unstable") version can check it out with::
git clone git://github.com/atpy/atpy.git
which can then be installed with::
cd atpy
python setup.py install
.. _api:
============================
Full API for Table class
============================
.. automodule:: atpy
Table initialization and I/O
============================
.. automethod:: Table.reset
.. automethod:: Table.read
.. automethod:: Table.write
Meta-data
=========
.. automethod:: Table.add_comment
.. automethod:: Table.add_keyword
.. automethod:: Table.describe
Column manipulation
===================
.. automethod:: Table.add_column
.. automethod:: Table.add_empty_column
.. automethod:: Table.remove_columns
.. automethod:: Table.keep_columns
.. automethod:: Table.rename_column
.. automethod:: Table.set_primary_key
Table manipulation and selection
================================
.. automethod:: Table.sort
.. automethod:: Table.row
.. automethod:: Table.rows
.. automethod:: Table.where
Guide for Migrating to Astropy
==============================
.. note:: If you encounter any other issues not described here when migrating to
Astropy, please `let us know <https://github.com/atpy/atpy/issues>`_
and we will update this guide accordingly.
Much of the functionality from ATpy has been incorporated into `Astropy
<http://www.astropy.org>`_ as the `astropy.table
<http://docs.astropy.org/en/stable/table>`_ sub-package. The Astropy ``Table``
class can be imported with::
>>> from astropy.table import Table
In the process of including the code in Astropy, the API has been changed in a
backward-incompatible way, and the present document aims to describe the main
changes.
Note that the Astropy ``Table`` class is more powerful than the ATpy
equivalent in many respects, and we do not describe all the features here,
only functionality of ATpy that has changed in Astropy. For a full overview of
the Astropy ``Table`` class, see `astropy.table
<http://docs.astropy.org/en/stable/table>`_.
Adding columns
--------------
In ATpy, columns are added with::
>>> from atpy import Table
>>> t = Table()
>>> t.add_column('a', [1, 2, 3])
In the Astropy 0.2.x ``Table`` class, the equivalent is::
>>> from astropy.table import Table, Column
>>> t = Table()
>>> t.add_column(Column(data=[1, 2, 3], name='a'))
This is a little more verbose, but from Astropy 0.3 onwards, this can be done
more simply with::
>>> from astropy.table import Table, Column
>>> t = Table()
>>> t['a'] = [1, 2, 3]
This is already implemented in the latest developer version of Astropy, and
will be the recommended way of adding columns.
Reading/writing
---------------
While it was possible to read a table directly into the ``Table`` class upon
initialization in ATpy::
>>> from atpy import Table
>>> t = Table('test.fits')
Astropy now requires reading to be done by a class method::
>>> from astropy.table import Table
>>> t = Table.read('test.fits')
Writing should be similar between ATpy and Astropy::
>>> t.write('test.fits')
As of Astropy 0.2, Astropy can read/write the same ASCII formats as ATpy, as
well as VO tables and HDF5. As of Astropy 0.3 (and in the current developer
version of Astropy), FITS reading/writing is also implemented. This means that
as of Astropy 0.3, the only features ATpy includes that are not supported by
Astropy directly are the SQL input/output and the online (VO and IRSA)
querying. However, the VO and IRSA querying will be possible with the new
`astroquery <http://astroquery.readthedocs.org>`_ package which is currently
under development.
Table Sets
----------
Table sets are not implemented in Astropy at this time, but it is possible to
simply loop over the tables in a file and construct a list or dictionary of
them.
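
For example, a minimal sketch (the filename is hypothetical) that collects all
table HDUs from a FITS file into a list using ``astropy.io.fits``::

    >>> from astropy.io import fits
    >>> from astropy.table import Table
    >>> tables = [Table(hdu.data) for hdu in fits.open('test.fits')
    ...           if isinstance(hdu, fits.BinTableHDU)]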
.. _manipulating:
================
Modifying tables
================
Manipulating table columns
==========================
Columns can be renamed or removed. To do this, one can use the
``remove_column``, ``remove_columns``, ``keep_columns`` and ``rename_column``
methods. For example, to rename a column ``time`` to ``space``, one can use::
>>> t.rename_column('time','space')
The ``keep_columns`` method essentially acts in the opposite way to
``remove_columns``: it specifies the subset of columns to *keep*, removing
all others, which can be useful for extracting specific columns from a large
table (see the sketch below). For more information, see the :ref:`api`.
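
For example, a minimal sketch (the column names are hypothetical)::

    >>> t.keep_columns(['id', 'ra', 'dec'])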
Sorting tables
==============
To sort a table, use the ``sort()`` method, along with the name of the column to sort by::
>>> t.sort('time')
Combining tables
================
Given two ``Table`` instances with the same column metadata, and the same number of columns, one table can be added to the other via the ``append`` method::
>>> t1 = Table(...)
>>> t2 = Table(...)
    >>> t1.append(t2)
.. _format_sql:
=============
SQL databases
=============
.. note::
    Structured Query Language (SQL) databases are widely used in web
    infrastructure, and are also used to store large datasets in science.
    Several flavors exist, the most popular of which are SQLite, MySQL, and
    PostgreSQL.
SQL databases are supported in ATpy thanks to the ``sqlite3`` module built into Python, the `MySQL-python <http://sourceforge.net/projects/mysql-python>`_ module, and the `PyGreSQL <http://www.pygresql.org/>`_ module. When reading from databases, the first argument in ``Table`` should be the database type (one of ``sqlite``, ``mysql``, and ``postgres``). For SQLite databases, which are stored in a file, reading in a table is easy::
>>> t = atpy.Table('sqlite', 'mydatabase.db')
If more than one table is present in the file, the table name can be specified::
>>> t = atpy.Table('sqlite', 'mydatabase.db', table='observations')
For MySQL databases, standard MySQL parameters can be specified. These include ``user``, ``passwd``, ``db`` (the database name), ``host``, and ``port``. For PostgreSQL databases, standard PostgreSQL parameters can be specified. These include ``user``, ``password``, ``database``, and ``host``.
For example, to read a table called ``velocities`` from a MySQL database called ``measurements``, with a user ``monty`` and password ``spam``, one would use::
>>> t = atpy.Table('mysql', user='monty', passwd='spam',
db='measurements', table='velocities')
To read in all the tables in a database, simply use the ``TableSet`` class, e.g.::
>>> t = atpy.TableSet('sqlite', 'mydatabase.db')
or
>>> t = atpy.TableSet('mysql', user='monty', passwd='spam',
db='measurements')
It is possible to retrieve only a subset of a table, or the result of any standard SQL query, using the ``query`` argument. For example, the following will retrieve all entries where the ``quality`` variable is positive::
>>> t = atpy.Table('mysql', user='monty', passwd='spam',
db='measurements', table='velocities',
query='SELECT * FROM velocities WHERE quality > 0;' )
Any valid SQL command should work, including commands used to merge different tables.
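
For example, a sketch of a query joining two tables (the table and column
names here are hypothetical)::

    >>> t = atpy.Table('sqlite', 'mydatabase.db',
                       query='SELECT v.velocity, o.seeing '
                             'FROM velocities v JOIN observations o '
                             'ON v.obs_id = o.id;')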
Writing tables or table sets to databases is simple, and is done through the ``write`` method. As before, database parameters may need to be specified, e.g.::
>>> t.write('sqlite', 'mydatabase.db')
or
>>> t.write('mysql', user='monty', passwd='spam',
db='measurements')
.. note::
As for file formats, the ``verbose`` argument can be specified to
control whether warning messages are shown when reading (the default is
``verbose=True``), and the ``overwrite`` argument can be used when
writing to overwrite a file (the default is ``overwrite=False``).
Full API for advanced users
---------------------------
.. note ::
The following functions should not be called directly - the arguments should be passed to ``Table()/Table.read()``,
``Table.write()``, ``TableSet()/TableSet.read()``, and
``TableSet.write()`` respectively.
.. autofunction:: atpy.sqltable.read
.. autofunction:: atpy.sqltable.write
.. autofunction:: atpy.sqltable.read_set
.. autofunction:: atpy.sqltable.write_set
.. _format_fits:
===========
FITS tables
===========
.. note::
    The Flexible Image Transport System (FITS) format is a widely used file
    format in astronomy, used to store, transmit, and manipulate images and
    tables. FITS files contain one or more header-data units (HDUs), which
    can be either images or tables in ASCII or binary format. Tables can
    contain meta-data, stored in the header.
Overview
--------
FITS tables are supported thanks to the `pyfits <http://www.stsci.edu/resources/software_hardware/pyfits>`_ module. Reading FITS tables is straightforward::
>>> t = atpy.Table('table.fits')
If more than one table is present in the file, the HDU can be specified::
>>> t = atpy.Table('table.fits', hdu=2)
To read in all HDUs in a file, use the ``TableSet`` class::
>>> t = atpy.TableSet('table.fits')
Compressed FITS files can be read easily::
>>> t = atpy.Table('table.fits.gz')
In the event that ATpy does not recognize a FITS table (for example if the file extension is obscure), the type can be explicitly given::
>>> t = atpy.Table('table', type='fits')
.. note::
As for all file formats, the ``verbose`` argument can be specified to
control whether warning messages are shown when reading (the default is
``verbose=True``), and the ``overwrite`` argument can be used when
writing to overwrite a file (the default is ``overwrite=False``).
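
For example, to silence warnings when reading and to allow an existing file
to be overwritten when writing (a minimal sketch)::

    >>> t = atpy.Table('table.fits', verbose=False)
    >>> t.write('copy.fits', overwrite=True)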
Full API for advanced users
---------------------------
.. note ::
The following functions should not be called directly - the arguments should be passed to ``Table()/Table.read()``,
``Table.write()``, ``TableSet()/TableSet.read()``, and
``TableSet.write()`` respectively.
.. autofunction:: atpy.fitstable.read
.. autofunction:: atpy.fitstable.write
.. autofunction:: atpy.fitstable.read_set
.. autofunction:: atpy.fitstable.write_set
.. _maskingandnull:
=======================
Masking and null values
=======================
It is often useful to be able to define missing or invalid values in a table. There are currently two ways to do this in ATpy: :ref:`null` and :ref:`masking`. The preferred way is to use masking, but this requires at least NumPy 1.4.1 in most cases, and the latest svn version of NumPy for SQL database input/output. Therefore, for version 0.9.4 of ATpy, the default is to use the null value method. To opt in to using masked arrays, specify the ``masked=True`` argument when creating a ``Table`` instance::
>>> t = Table('example.fits.gz', masked=True)
In the future, once NumPy 1.5.0 is released, we will switch over to using masked arrays by default, and will slowly phase out the null value method.
If you want to set the default for masking to be on or off for a whole script, this can be done using the ``set_masked_default`` function::
import atpy
atpy.set_masked_default(True)
If you want to set the masking default at the user level, create a file named ``~/.atpyrc`` in your home directory, containing::
[general]
masked_default:yes
The ``set_masked_default`` function overrides the ``.atpyrc`` file, and the ``masked=`` argument in Table overrides both the ``set_masked_default`` function and the ``.atpyrc`` file.
.. _null:
Null values
===========
The basic idea behind this method is to specify a special value in each column that will signify missing or invalid data. To specify the Null value for a column, use the ``null`` argument in ``add_column``::
>>> t.add_column('time', time, null=-999.)
Following this, if the table is written out to a file or database, this null value will be stored.
This method is generally unreliable, especially for floating point values, and does not allow users to easily distinguish between invalid and missing values.
.. _masking:
Masking
=======
NumPy supports masked arrays, where specific elements of an array can be properly masked by using a *mask* - a boolean array. There are several advantages to using this:
* The mask is unrelated to the value in the cell: any cell can be masked, not
  just all cells with a specific value.
* It is possible to distinguish between invalid (e.g. NaN) and missing values.
* Values can easily be unmasked (although when writing to a file/database, the
  'old' values are lost for masked elements).
* NumPy provides masked versions of many functions, for example ``sum``,
  ``mean``, or ``median``, which means that it is easy to correctly compute
  statistics on masked arrays, ignoring the masked values (see the sketch
  below).
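
As an illustration, NumPy's masked routines skip masked elements automatically
(a minimal sketch, assuming a table ``t`` created with ``masked=True`` whose
masked column is accessible as ``t['time']``)::

    >>> import numpy as np
    >>> np.ma.mean(t['time'])  # masked values are excluded from the mean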
To specify the mask of a column, use the ``mask`` argument in ``add_column``. To do the equivalent of the example in :ref:`null`, use::
>>> t.add_column('time', time, mask=time==-999.)
When writing out to certain file/database formats, a masked value has to be given a specific value - this is called a *fill* value. To set the fill value, simply use the ``fill`` argument when adding data to a column::
>>> t.add_column('time', time, mask=time==-999., fill=-999.)
In the above example, if the table is written out to an IPAC table, the value of -999. will be used for masked values.
.. note::
    When implementing this in ATpy, we discovered a few bugs in the masked
    structured array implementation of NumPy, which have now been fixed.
    Therefore, we recommend using the latest svn version of NumPy if you
    want to use masked arrays.
============
Introduction
============
ATpy is a high-level Python package providing a way to manipulate tables of
astronomical data in a uniform way. The two main features of ATpy are:
* It provides a Table class that contains data stored in a NumPy structured
array, along with meta-data to describe the columns, and methods to
manipulate the table (e.g. adding/removing/renaming columns, selecting rows,
changing values).
* It provides built-in support for reading and writing to several common
  file/database formats, including FITS, VO, and IPAC tables, and SQLite,
  MySQL and PostgreSQL databases, with a very simple API (see the sketch
  below).
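
For example (a minimal sketch; the filename is hypothetical)::

    >>> import atpy
    >>> t = atpy.Table('observations.fits')
    >>> t.write('observations.hdf5')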
In addition, ATpy provides a TableSet class that can be used to contain multiple tables, and supports reading and writing to file/database formats that support this (FITS, VO, and SQL databases).
Finally, ATpy provides support for user-written read/write functions for file/database formats not supported by default. We encourage users to send us custom read/write functions for commonly used formats, and we would be happy to integrate them into the main distribution.
.. _format_hdf5:
===========
HDF5 tables
===========
.. note::
The Hierarchical Data Format (HDF) is a format that can be used to
store, transmit, and manipulate datasets (n-dimensional arrays or
tables). Datasets can be collected into groups, which can be
collected into larger groups. Datasets and groups can contain
meta-data, in the form of attributes.
HDF5 tables are supported thanks to the `h5py <http://code.google.com/p/h5py/>`_ module. Reading HDF5 tables is straightforward::
>>> t = atpy.Table('table.hdf5')
If more than one table is present in the file, ATpy will give a list of available tables, identified by a path. The specific table to read can then be specified with the ``table=`` argument::
>>> t = atpy.Table('table.hdf5', table='Measurements')
In the case where a table is inside a group, or a hierarchy of groups, the table name may be a full path inside the file::
>>> t = atpy.Table('table.hdf5', table='Group1/Measurements')
To read in all tables in an HDF5 file, use the ``TableSet`` class::
>>> t = atpy.TableSet('table.hdf5')
When writing out an HDF5 table, the default is to write the table uncompressed, but it is possible to turn on compression using the ``compression`` argument::
>>> t.write('table.hdf5', compression=True)
To write the table to a specific group within the file, use the ``group`` argument::
>>> t.write('table.hdf5', group='Group4')
Finally, it is possible to append tables to existing files, using the ``append`` argument. For example, the following two commands write out two tables to the same existing file::
>>> t1.write('existing_table.hdf', append=True)
>>> t2.write('existing_table.hdf', append=True)
In the event that ATpy does not recognize an HDF5 table (for example if the file extension is obscure), the type can be explicitly given::
>>> t = atpy.Table('table', type='hdf5')
.. note::
As for all file formats, the ``verbose`` argument can be specified to
control whether warning messages are shown when reading (the default is
``verbose=True``), and the ``overwrite`` argument can be used when
writing to overwrite a file (the default is ``overwrite=False``).
Full API for advanced users
---------------------------
.. note ::
The following functions should not be called directly - the arguments should be passed to ``Table()/Table.read()``,
``Table.write()``, ``TableSet()/TableSet.read()``, and
``TableSet.write()`` respectively.
.. autofunction:: atpy.hdf5table.read
.. autofunction:: atpy.hdf5table.write
.. autofunction:: atpy.hdf5table.read_set
.. autofunction:: atpy.hdf5table.write_set
.. _format_online:
==============
Online queries
==============
It is possible to query online databases and automatically return the results
as a ``Table`` instance. There are several mechanisms for accessing online
catalogs:
Virtual Observatory
-------------------
An interface to the virtual observatory is provided via the `vo
<https://trac6.assembla.com/astrolib>`_ module. To list the catalogs
available, use the ``list_catalogs()`` function from ``atpy.vo_conesearch``::
>>> from atpy.vo_conesearch import list_catalogs
>>> list_catalogs()
USNO-A2
USNO-B1
USNO NOMAD
USNO ACT
A specific catalog can then be queried with a conesearch by specifying a
catalog, and the coordinates and radius (in degrees) to search::
>>> t = atpy.Table(catalog='USNO-B1', ra=233.112, dec=23.432, radius=0.3, type='vo_conesearch')
How long this query takes will depend on the speed of your network, the load
on the server being queried, and the number of rows in the result. For
advanced users, it is also possible to query catalogs not listed by
``list_catalogs()`` - for more details, see the :ref:`fullapi`.
IRSA Query
----------
In addition to supporting Virtual Observatory queries, ATpy supports queries
to the `NASA/IPAC Infrared Science Archive (IRSA)
<http://irsa.ipac.caltech.edu/>`_. The interface is similar to that of the
VO. To list the available catalogs, use the ``list_catalogs()`` function from
``atpy.irsa_service``::
>>> from atpy.irsa_service import list_catalogs
>>> list_catalogs()
fp_psc 2MASS All-Sky Point Source Catalog (PSC)
fp_xsc 2MASS All-Sky Extended Source Catalog (XSC)
lga_v2 The 2MASS Large Galaxy Atlas
fp_scan_dat 2MASS All-Sky Survey Scan Info
... ...
The first column is the catalog code used in the query. A specific catalog can then be queried by specifying a
query type, a catalog, and additional arguments as required. The different kinds of search are:
* ``Cone``: This is a cone search. Requires ``objstr``, a string containing
either coordinates or an object name (see `here
<http://irsa.ipac.caltech.edu/search_help.html>`_ for more information),
and ``radius``, with units given by ``units`` (``'arcsec'`` by
default). For example::
>>> t = atpy.Table('Cone', 'fp_psc', objstr='m13', \
radius=100., type='irsa')
* ``Box``: This is a box search. Requires ``objstr``, a string containing
  either coordinates or an object name (see `here
  <http://irsa.ipac.caltech.edu/search_help.html>`_ for more
  information), and ``size``, with units given by ``units`` (``'arcsec'``
  by default). For example::
>>> t = atpy.Table('Box', 'fp_psc', objstr='T Tau', \
size=200., units='deg', type='irsa')
* ``Polygon``: This is a polygon search. Requires ``polygon``, which should be
a list of tuples of (ra, dec) in decimal degrees::
>>> t = atpy.Table('polygon','fp_psc', \
polygon=[(11.0, 45.0), (12.0, 45.0), (11.5, 46.)], \
type='irsa')
As for the VO query, how long these queries take will depend on the speed of
your network, the load on the IRSA server, and the number of rows in the
result.
.. _fullapi:
Full API for advanced users
---------------------------
.. note ::
The following functions should not be called directly - the arguments
should be passed to ``Table()/Table.read()`` using either
``type=vo_conesearch`` or ``type=irsa``.
.. autofunction:: atpy.vo_conesearch.read
.. autofunction:: atpy.irsa_service.read
from math import sqrt, isnan
class AUDUSD_return(object):
    """Streaming tracker of rolling AUDUSD return statistics.

    Class-level variables hold running aggregates shared across all ticks.
    """
    # Number of returns currently in the rolling window (capped at 5)
    num = 0
    # Running sum of the returns in the window
    run_sum = 0
    # Running sum of squared deviations from the window average
    run_squared_sum = 0
    # Running sum of the window standard deviations
    run_sum_of_std = 0
    # Price of the previous tick (-1 means no tick seen yet)
    last_price = -1
# Init all the necessary variables when instantiating the class
    def __init__(self, tick_time, avg_price):
        # Store the tick timestamp on the instance
        self.tick_time = tick_time
        # Return relative to the previous tick (NaN for the very first tick)
        if AUDUSD_return.last_price == -1:
            hist_return = float('NaN')
        else:
            hist_return = (avg_price - AUDUSD_return.last_price) / AUDUSD_return.last_price
        self.hist_return = hist_return
        if isnan(hist_return):
            # First tick: (re)start the running sum
            AUDUSD_return.run_sum = 0
        else:
            # Grow the window until it holds 5 returns
            if AUDUSD_return.num < 5:
                AUDUSD_return.num += 1
            AUDUSD_return.run_sum += hist_return
        AUDUSD_return.last_price = avg_price
    def add_to_running_squared_sum(self, avg):
        # Accumulate the squared deviation from the window average
        if not isnan(self.hist_return):
            AUDUSD_return.run_squared_sum += (self.hist_return - avg)**2

    def get_avg(self, pop_value):
        # Drop the oldest return from the window, then return the new average
        if not isnan(self.hist_return):
            AUDUSD_return.run_sum -= pop_value
            avg = AUDUSD_return.run_sum / AUDUSD_return.num
            self.avg_return = avg
            return avg

    def get_std(self):
        # Standard deviation over the window; resets the squared sum for reuse
        if not isnan(self.hist_return):
            std = sqrt(AUDUSD_return.run_squared_sum / AUDUSD_return.num)
            self.std_return = std
            AUDUSD_return.run_sum_of_std += std
            AUDUSD_return.run_squared_sum = 0
            return std

    def get_avg_std(self, pop_value):
        # Rolling average of the window standard deviations
        if not isnan(self.hist_return):
            AUDUSD_return.run_sum_of_std -= pop_value
            avg_std = AUDUSD_return.run_sum_of_std / AUDUSD_return.num
            self.avg_of_std_return = avg_std
            return avg_std
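
# Usage sketch (hypothetical prices, not from a live feed): instantiate one
# object per tick; class-level state tracks the rolling return statistics.
if __name__ == "__main__":
    prices = [0.6500, 0.6510, 0.6495, 0.6502, 0.6508, 0.6511]
    for i, price in enumerate(prices):
        tick = AUDUSD_return(i, price)
        print(tick.tick_time, tick.hist_return)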
# AV-98
AV-98 is an experimental client for the
[Gemini protocol](https://gemini.circumlunar.space). It is derived from the
[gopher client VF-1](https://github.com/solderpunk/VF-1) by the same author.
AV-98 is "experimental" in the sense that it may occasionally extend or deviate
from the official Gemini specification for the purposes of, well,
experimentation. Despite this, it is expected to be stable enough for regular
daily use.
## Dependencies
AV-98 has no "strict dependencies", i.e. it will run and work without anything
else beyond the Python standard library. However, it will "opportunistically
import" a few other libraries if they are available to offer an improved
experience.
* The [ansiwrap library](https://pypi.org/project/ansiwrap/) may result in
neater display of text which makes use of ANSI escape codes to control colour.
* The [cryptography library](https://pypi.org/project/cryptography/) will
provide a better and slightly more secure experience when using the default
TOFU certificate validation mode and is highly recommended.
## Features
* TOFU or CA server certificate validation
* Extensive client certificate support if an `openssl` binary is available
* Ability to specify external handler programs for different MIME types
* Gopher proxy support (e.g. for use with
[Agena](https://tildegit.org/solderpunk/agena))
* Advanced navigation tools like `tour` and `mark` (as per VF-1)
* Bookmarks
* IPv6 support
* Supports any character encoding recognised by Python
## Lightning introduction
You use the `go` command to visit a URL, e.g. `go gemini.circumlunar.space`.
Links in Gemini documents are assigned numerical indices. Just type an index to
follow that link.
If a Gemini document is too long to fit on your screen, use the `less` command
to pipe it to the `less` pager.
Use the `help` command to learn about additional commands.
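
Here's a sketch of a short session (the prompt shown is AV-98's default, and
the destination is just an example):

```
AV-98> go gemini.circumlunar.space
AV-98> 1
AV-98> back
AV-98> help
```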
import argparse
import cmd
import cgi
import codecs
import collections
import datetime
import fnmatch
import getpass
import glob
import hashlib
import io
import mimetypes
import os
import os.path
import random
import shlex
import shutil
import socket
import sqlite3
import ssl
from ssl import CertificateError
import subprocess
import sys
import tempfile
import time
import urllib.parse
import uuid
import webbrowser
try:
import ansiwrap as textwrap
except ModuleNotFoundError:
import textwrap
try:
from cryptography import x509
from cryptography.hazmat.backends import default_backend
_HAS_CRYPTOGRAPHY = True
_BACKEND = default_backend()
except ModuleNotFoundError:
_HAS_CRYPTOGRAPHY = False
_VERSION = "1.0.1"
_MAX_REDIRECTS = 5
# Command abbreviations
_ABBREVS = {
"a": "add",
"b": "back",
"bb": "blackbox",
"bm": "bookmarks",
"book": "bookmarks",
"f": "fold",
"fo": "forward",
"g": "go",
"h": "history",
"hist": "history",
"l": "less",
"n": "next",
"p": "previous",
"prev": "previous",
"q": "quit",
"r": "reload",
"s": "save",
"se": "search",
"/": "search",
"t": "tour",
"u": "up",
}
_MIME_HANDLERS = {
"application/pdf": "xpdf %s",
"audio/mpeg": "mpg123 %s",
"audio/ogg": "ogg123 %s",
"image/*": "feh %s",
"text/html": "lynx -dump -force_html %s",
"text/plain": "cat %s",
"text/gemini": "cat %s",
}
# monkey-patch Gemini support in urllib.parse
# see https://github.com/python/cpython/blob/master/Lib/urllib/parse.py
urllib.parse.uses_relative.append("gemini")
urllib.parse.uses_netloc.append("gemini")
def fix_ipv6_url(url):
if not url.count(":") > 2: # Best way to detect them?
return url
# If there's a pair of []s in there, it's probably fine as is.
if "[" in url and "]" in url:
return url
# Easiest case is a raw address, no schema, no path.
# Just wrap it in square brackets and whack a slash on the end
if "/" not in url:
return "[" + url + "]/"
# Now the trickier cases...
if "://" in url:
schema, schemaless = url.split("://")
else:
schema, schemaless = None, url
if "/" in schemaless:
netloc, rest = schemaless.split("/",1)
schemaless = "[" + netloc + "]" + "/" + rest
if schema:
return schema + "://" + schemaless
return schemaless
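
# Example (sketch): fix_ipv6_url("2001:db8::1/docs") -> "[2001:db8::1]/docs"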
standard_ports = {
"gemini": 1965,
"gopher": 70,
}
class GeminiItem():
def __init__(self, url, name=""):
if "://" not in url:
url = "gemini://" + url
self.url = fix_ipv6_url(url)
self.name = name
parsed = urllib.parse.urlparse(self.url)
self.scheme = parsed.scheme
self.host = parsed.hostname
self.port = parsed.port or standard_ports.get(self.scheme, 0)
self.path = parsed.path
def root(self):
return GeminiItem(self._derive_url("/"))
def up(self):
pathbits = list(os.path.split(self.path.rstrip('/')))
# Don't try to go higher than root
if len(pathbits) == 1:
return self
# Get rid of bottom component
pathbits.pop()
new_path = os.path.join(*pathbits)
return GeminiItem(self._derive_url(new_path))
def query(self, query):
query = urllib.parse.quote(query)
return GeminiItem(self._derive_url(query=query))
def _derive_url(self, path="", query=""):
"""
A thin wrapper around urlunparse which avoids inserting standard ports
into URLs just to keep things clean.
"""
return urllib.parse.urlunparse((self.scheme,
self.host if self.port == standard_ports[self.scheme] else self.host + ":" + str(self.port),
path or self.path, "", query, ""))
def absolutise_url(self, relative_url):
"""
Convert a relative URL to an absolute URL by using the URL of this
GeminiItem as a base.
"""
return urllib.parse.urljoin(self.url, relative_url)
def to_map_line(self, name=None):
if name or self.name:
return "=> {} {}\n".format(self.url, name or self.name)
else:
return "=> {}\n".format(self.url)
@classmethod
def from_map_line(cls, line, origin_gi):
assert line.startswith("=>")
assert line[2:].strip()
bits = line[2:].strip().split(maxsplit=1)
bits[0] = origin_gi.absolutise_url(bits[0])
return cls(*bits)
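
# Example (sketch): GeminiItem normalises URLs and resolves relative links.
#   gi = GeminiItem("gemini.circumlunar.space/docs/")
#   gi.url                        -> "gemini://gemini.circumlunar.space/docs/"
#   gi.absolutise_url("spec.gmi") -> "gemini://gemini.circumlunar.space/docs/spec.gmi"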
CRLF = '\r\n'
# Cheap and cheerful URL detector
def looks_like_url(word):
return "." in word and word.startswith("gemini://")
# GeminiClient Decorators
def needs_gi(inner):
def outer(self, *args, **kwargs):
if not self.gi:
print("You need to 'go' somewhere, first")
return None
else:
return inner(self, *args, **kwargs)
outer.__doc__ = inner.__doc__
return outer
def restricted(inner):
def outer(self, *args, **kwargs):
if self.restricted:
print("Sorry, this command is not available in restricted mode!")
return None
else:
return inner(self, *args, **kwargs)
outer.__doc__ = inner.__doc__
return outer
class GeminiClient(cmd.Cmd):
def __init__(self, restricted=False):
cmd.Cmd.__init__(self)
# Set umask so that nothing we create can be read by anybody else.
# The certificate cache and TOFU database contain "browser history"
        # type sensitive information.
os.umask(0o077)
# Find config directory
## Look for something pre-existing
for confdir in ("~/.av98/", "~/.config/av98/"):
confdir = os.path.expanduser(confdir)
if os.path.exists(confdir):
self.config_dir = confdir
break
## Otherwise, make one in .config if it exists
else:
if os.path.exists(os.path.expanduser("~/.config/")):
self.config_dir = os.path.expanduser("~/.config/av98/")
else:
self.config_dir = os.path.expanduser("~/.av98/")
print("Creating config directory {}".format(self.config_dir))
os.makedirs(self.config_dir)
self.no_cert_prompt = "\x1b[38;5;76m" + "AV-98" + "\x1b[38;5;255m" + "> " + "\x1b[0m"
self.cert_prompt = "\x1b[38;5;202m" + "AV-98" + "\x1b[38;5;255m" + "+cert> " + "\x1b[0m"
self.prompt = self.no_cert_prompt
self.gi = None
self.history = []
self.hist_index = 0
self.idx_filename = ""
self.index = []
self.index_index = -1
self.lookup = self.index
self.marks = {}
self.page_index = 0
self.permanent_redirects = {}
self.previous_redirectors = set()
self.restricted = restricted
self.tmp_filename = ""
self.visited_hosts = set()
self.waypoints = []
self.client_certs = {
"active": None
}
self.active_cert_domains = []
self.active_is_transient = False
self.transient_certs_created = []
self.options = {
"debug" : False,
"ipv6" : True,
"timeout" : 10,
"width" : 80,
"auto_follow_redirects" : True,
"gopher_proxy" : None,
"tls_mode" : "tofu",
}
self.log = {
"start_time": time.time(),
"requests": 0,
"ipv4_requests": 0,
"ipv6_requests": 0,
"bytes_recvd": 0,
"ipv4_bytes_recvd": 0,
"ipv6_bytes_recvd": 0,
"dns_failures": 0,
"refused_connections": 0,
"reset_connections": 0,
"timeouts": 0,
}
self._connect_to_tofu_db()
def _connect_to_tofu_db(self):
db_path = os.path.join(self.config_dir, "tofu.db")
self.db_conn = sqlite3.connect(db_path)
self.db_cur = self.db_conn.cursor()
self.db_cur.execute("""CREATE TABLE IF NOT EXISTS cert_cache
(hostname text, address text, fingerprint text,
first_seen date, last_seen date, count integer)""")
def _go_to_gi(self, gi, update_hist=True, handle=True):
"""This method might be considered "the heart of AV-98".
Everything involved in fetching a gemini resource happens here:
sending the request over the network, parsing the response if
        it's a menu, storing the response in a temporary file, choosing
and calling a handler program, and updating the history."""
# Don't try to speak to servers running other protocols
if gi.scheme in ("http", "https"):
webbrowser.open_new_tab(gi.url)
return
elif gi.scheme == "gopher" and not self.options.get("gopher_proxy", None):
print("""AV-98 does not speak Gopher natively.
However, you can use `set gopher_proxy hostname:port` to tell it about a
Gopher-to-Gemini proxy (such as a running Agena instance), in which case
you'll be able to transparently follow links to Gopherspace!""")
return
elif gi.scheme not in ("gemini", "gopher"):
print("Sorry, no support for {} links.".format(gi.scheme))
return
# Obey permanent redirects
if gi.url in self.permanent_redirects:
new_gi = GeminiItem(self.permanent_redirects[gi.url], name=gi.name)
self._go_to_gi(new_gi)
return
# Be careful with client certificates!
# Are we crossing a domain boundary?
if self.active_cert_domains and gi.host not in self.active_cert_domains:
if self.active_is_transient:
print("Permanently delete currently active transient certificate?")
resp = input("Y/N? ")
if resp.strip().lower() in ("y", "yes"):
print("Destroying certificate.")
self._deactivate_client_cert()
else:
print("Staying here.")
else:
print("PRIVACY ALERT: Deactivate client cert before connecting to a new domain?")
resp = input("Y/N? ")
if resp.strip().lower() in ("n", "no"):
print("Keeping certificate active for {}".format(gi.host))
else:
print("Deactivating certificate.")
self._deactivate_client_cert()
# Suggest reactivating previous certs
if not self.client_certs["active"] and gi.host in self.client_certs:
print("PRIVACY ALERT: Reactivate previously used client cert for {}?".format(gi.host))
resp = input("Y/N? ")
if resp.strip().lower() in ("y", "yes"):
self._activate_client_cert(*self.client_certs[gi.host])
else:
print("Remaining unidentified.")
self.client_certs.pop(gi.host)
# Do everything which touches the network in one block,
# so we only need to catch exceptions once
try:
# Is this a local file?
if not gi.host:
address, f = None, open(gi.path, "rb")
else:
address, f = self._send_request(gi)
# Spec dictates <META> should not exceed 1024 bytes,
# so maximum valid header length is 1027 bytes.
header = f.readline(1027)
header = header.decode("UTF-8")
if not header or header[-1] != '\n':
raise RuntimeError("Received invalid header from server!")
header = header.strip()
self._debug("Response header: %s." % header)
# Catch network errors which may happen on initial connection
except Exception as err:
# Print an error message
if isinstance(err, socket.gaierror):
self.log["dns_failures"] += 1
print("ERROR: DNS error!")
elif isinstance(err, ConnectionRefusedError):
self.log["refused_connections"] += 1
print("ERROR: Connection refused!")
elif isinstance(err, ConnectionResetError):
self.log["reset_connections"] += 1
print("ERROR: Connection reset!")
elif isinstance(err, (TimeoutError, socket.timeout)):
self.log["timeouts"] += 1
print("""ERROR: Connection timed out!
Slow internet connection? Use 'set timeout' to be more patient.""")
else:
print("ERROR: " + str(err))
return
# Validate header
        # <META> may legitimately be empty, leaving nothing after <STATUS>
        try:
            status, meta = header.split(maxsplit=1)
        except ValueError:
            status, meta = header, ""
if len(meta) > 1024 or len(status) != 2 or not status.isnumeric():
print("ERROR: Received invalid header from server!")
f.close()
return
# Update redirect loop/maze escaping state
if not status.startswith("3"):
self.previous_redirectors = set()
# Handle non-SUCCESS headers, which don't have a response body
# Inputs
if status.startswith("1"):
print(meta)
if status == "11":
user_input = getpass.getpass("> ")
else:
user_input = input("> ")
self._go_to_gi(gi.query(user_input))
return
# Redirects
elif status.startswith("3"):
new_gi = GeminiItem(gi.absolutise_url(meta))
if new_gi.url in self.previous_redirectors:
print("Error: caught in redirect loop!")
return
elif len(self.previous_redirectors) == _MAX_REDIRECTS:
print("Error: refusing to follow more than %d consecutive redirects!" % _MAX_REDIRECTS)
return
# Never follow cross-domain redirects without asking
elif new_gi.host != gi.host:
follow = input("Follow cross-domain redirect to %s? (y/n) " % new_gi.url)
# Never follow cross-protocol redirects without asking
elif new_gi.scheme != gi.scheme:
follow = input("Follow cross-protocol redirect to %s? (y/n) " % new_gi.url)
# Don't follow *any* redirect without asking if auto-follow is off
elif not self.options["auto_follow_redirects"]:
follow = input("Follow redirect to %s? (y/n) " % new_gi.url)
# Otherwise, follow away
else:
follow = "yes"
if follow.strip().lower() not in ("y", "yes"):
return
self._debug("Following redirect to %s." % new_gi.url)
self._debug("This is consecutive redirect number %d." % len(self.previous_redirectors))
self.previous_redirectors.add(gi.url)
if status == "31":
# Permanent redirect
self.permanent_redirects[gi.url] = new_gi.url
self._go_to_gi(new_gi)
return
# Errors
elif status.startswith("4") or status.startswith("5"):
print("Error: %s" % meta)
return
# Client cert
elif status.startswith("6"):
# Don't do client cert stuff in restricted mode, as in principle
# it could be used to fill up the disk by creating a whole lot of
# certificates
if self.restricted:
print("The server is requesting a client certificate.")
print("These are not supported in restricted mode, sorry.")
return
# Transient certs are a special case
if status == "61":
print("The server is asking to start a transient client certificate session.")
print("What do you want to do?")
print("1. Start a transient session.")
print("2. Refuse.")
choice = input("> ").strip()
if choice.strip() == "1":
self._generate_transient_cert_cert()
self._go_to_gi(gi, update_hist, handle)
return
else:
return
# Present different messages for different 6x statuses, but
# handle them the same.
if status in ("64", "65"):
print("The server rejected your certificate because it is either expired or not yet valid.")
elif status == "63":
print("The server did not accept your certificate.")
print("You may need to e.g. coordinate with the admin to get your certificate fingerprint whitelisted.")
else:
print("The site {} is requesting a client certificate.".format(gi.host))
print("This will allow the site to recognise you across requests.")
print("What do you want to do?")
print("1. Give up.")
print("2. Generate new certificate and retry the request.")
print("3. Load previously generated certificate from file.")
print("4. Load certificate from file and retry the request.")
choice = input("> ").strip()
if choice == "2":
self._generate_persistent_client_cert()
self._go_to_gi(gi, update_hist, handle)
elif choice == "3":
self._choose_client_cert()
self._go_to_gi(gi, update_hist, handle)
elif choice == "4":
self._load_client_cert()
self._go_to_gi(gi, update_hist, handle)
else:
print("Giving up.")
return
# Invalid status
elif not status.startswith("2"):
print("ERROR: Server returned undefined status code %s!" % status)
return
# If we're here, this must be a success and there's a response body
assert status.startswith("2")
# Can we terminate a transient client session?
if status == "21":
# Make sure we're actually in such a session
if self.active_is_transient:
self._deactivate_client_cert()
print("INFO: Server terminated transient client certificate session.")
else:
# Huh, that's weird
self._debug("Server issues a 21 but we're not in transient session?")
mime = meta
if mime == "":
mime = "text/gemini; charset=utf-8"
mime, mime_options = cgi.parse_header(mime)
if "charset" in mime_options:
try:
codecs.lookup(mime_options["charset"])
except LookupError:
print("Header declared unknown encoding %s" % value)
return
# Read the response body over the network
body = f.read()
# Save the result in a temporary file
## Delete old file
if self.tmp_filename and os.path.exists(self.tmp_filename):
os.unlink(self.tmp_filename)
## Set file mode
if mime.startswith("text/"):
mode = "w"
encoding = mime_options.get("charset", "UTF-8")
try:
body = body.decode(encoding)
except UnicodeError:
print("Could not decode response body using %s encoding declared in header!" % encoding)
return
else:
mode = "wb"
encoding = None
## Write
tmpf = tempfile.NamedTemporaryFile(mode, encoding=encoding, delete=False)
size = tmpf.write(body)
tmpf.close()
self.tmp_filename = tmpf.name
self._debug("Wrote %d byte response to %s." % (size, self.tmp_filename))
# Pass file to handler, unless we were asked not to
if handle:
if mime == "text/gemini":
self._handle_index(body, gi)
else:
cmd_str = self._get_handler_cmd(mime)
try:
subprocess.call(shlex.split(cmd_str % tmpf.name))
except FileNotFoundError:
print("Handler program %s not found!" % shlex.split(cmd_str)[0])
print("You can use the ! command to specify another handler program or pipeline.")
# Update state
self.gi = gi
self.mime = mime
self._log_visit(gi, address, size)
if update_hist:
self._update_history(gi)
def _send_request(self, gi):
"""Send a selector to a given host and port.
Returns the resolved address and binary file with the reply."""
if gi.scheme == "gemini":
# For Gemini requests, connect to the host and port specified in the URL
host, port = gi.host, gi.port
elif gi.scheme == "gopher":
# For Gopher requests, use the configured proxy
host, port = self.options["gopher_proxy"].rsplit(":", 1)
self._debug("Using gopher proxy: " + self.options["gopher_proxy"])
# Do DNS resolution
addresses = self._get_addresses(host, port)
# Prepare TLS context
protocol = ssl.PROTOCOL_TLS if sys.version_info.minor >=6 else ssl.PROTOCOL_TLSv1_2
context = ssl.SSLContext(protocol)
# Use CAs or TOFU
if self.options["tls_mode"] == "ca":
context.verify_mode = ssl.CERT_REQUIRED
context.check_hostname = True
context.load_default_certs()
else:
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
# Impose minimum TLS version
## In 3.7 and above, this is easy...
if sys.version_info.minor >= 7:
context.minimum_version = ssl.TLSVersion.TLSv1_2
## Otherwise, it seems very hard...
## The below is less strict than it ought to be, but trying to disable
## TLS v1.1 here using ssl.OP_NO_TLSv1_1 produces unexpected failures
## with recent versions of OpenSSL. What a mess...
else:
context.options |= ssl.OP_NO_SSLv3
context.options |= ssl.OP_NO_SSLv2
# Try to enforce sensible ciphers
try:
context.set_ciphers("AESGCM:AESGCM+ECDHE:AESGCM+DHE:CHACHA20+ECDHE:CHACHA20+DHE:!DSS:!SHA1:!MD5:@STRENGTH")
except ssl.SSLError:
# Rely on the server to only support sensible things, I guess...
pass
# Load client certificate if needed
if self.client_certs["active"]:
certfile, keyfile = self.client_certs["active"]
context.load_cert_chain(certfile, keyfile)
# Connect to remote host by any address possible
err = None
for address in addresses:
self._debug("Connecting to: " + str(address[4]))
s = socket.socket(address[0], address[1])
s.settimeout(self.options["timeout"])
s = context.wrap_socket(s, server_hostname = gi.host)
try:
s.connect(address[4])
break
except OSError as e:
err = e
else:
# If we couldn't connect to *any* of the addresses, just
# bubble up the exception from the last attempt and deny
# knowledge of earlier failures.
raise err
if sys.version_info.minor >=5:
self._debug("Established {} connection.".format(s.version()))
self._debug("Cipher is: {}.".format(s.cipher()))
# Do TOFU
if self.options["tls_mode"] == "tofu":
cert = s.getpeercert(binary_form=True)
self._validate_cert(address[4][0], host, cert)
# Remember that we showed the current cert to this domain...
if self.client_certs["active"]:
self.active_cert_domains.append(gi.host)
self.client_certs[gi.host] = self.client_certs["active"]
# Send request and wrap response in a file descriptor
self._debug("Sending %s<CRLF>" % gi.url)
s.sendall((gi.url + CRLF).encode("UTF-8"))
return address, s.makefile(mode = "rb")
def _get_addresses(self, host, port):
# DNS lookup - will get IPv4 and IPv6 records if IPv6 is enabled
if ":" in host:
# This is likely a literal IPv6 address, so we can *only* ask for
# IPv6 addresses or getaddrinfo will complain
family_mask = socket.AF_INET6
elif socket.has_ipv6 and self.options["ipv6"]:
# Accept either IPv4 or IPv6 addresses
family_mask = 0
else:
# IPv4 only
family_mask = socket.AF_INET
addresses = socket.getaddrinfo(host, port, family=family_mask,
type=socket.SOCK_STREAM)
# Sort addresses so IPv6 ones come first
addresses.sort(key=lambda add: add[0] == socket.AF_INET6, reverse=True)
return addresses
def _validate_cert(self, address, host, cert):
"""
Validate a TLS certificate in TOFU mode.
If the cryptography module is installed:
- Check the certificate Common Name or SAN matches `host`
- Check the certificate's not valid before date is in the past
- Check the certificate's not valid after date is in the future
Whether the cryptography module is installed or not, check the
certificate's fingerprint against the TOFU database to see if we've
previously encountered a different certificate for this IP address and
hostname.
"""
now = datetime.datetime.utcnow()
if _HAS_CRYPTOGRAPHY:
# Using the cryptography module we can get detailed access
# to the properties of even self-signed certs, unlike in
# the standard ssl library...
c = x509.load_der_x509_certificate(cert, _BACKEND)
# Check certificate validity dates
if c.not_valid_before >= now:
raise CertificateError("Certificate not valid until: {}!".format(c.not_valid_before))
elif c.not_valid_after <= now:
raise CertificateError("Certificate expired as of: {})!".format(c.not_valid_after))
# Check certificate hostnames
names = []
common_name = c.subject.get_attributes_for_oid(x509.oid.NameOID.COMMON_NAME)
if common_name:
names.append(common_name[0].value)
try:
names.extend([alt.value for alt in c.extensions.get_extension_for_oid(x509.oid.ExtensionOID.SUBJECT_ALTERNATIVE_NAME).value])
except x509.ExtensionNotFound:
pass
names = set(names)
for name in names:
try:
ssl._dnsname_match(name, host)
break
except CertificateError:
continue
else:
# If we didn't break out, none of the names were valid
raise CertificateError("Hostname does not match certificate common name or any alternative names.")
sha = hashlib.sha256()
sha.update(cert)
fingerprint = sha.hexdigest()
# Have we been here before?
self.db_cur.execute("""SELECT fingerprint, first_seen, last_seen, count
FROM cert_cache
WHERE hostname=? AND address=?""", (host, address))
cached_certs = self.db_cur.fetchall()
# If so, check for a match
if cached_certs:
max_count = 0
most_frequent_cert = None
for cached_fingerprint, first, last, count in cached_certs:
if count > max_count:
max_count = count
most_frequent_cert = cached_fingerprint
if fingerprint == cached_fingerprint:
# Matched!
self._debug("TOFU: Accepting previously seen ({} times) certificate {}".format(count, fingerprint))
self.db_cur.execute("""UPDATE cert_cache
SET last_seen=?, count=?
WHERE hostname=? AND address=? AND fingerprint=?""",
(now, count+1, host, address, fingerprint))
self.db_conn.commit()
break
else:
if _HAS_CRYPTOGRAPHY:
# Load the most frequently seen certificate to see if it has
# expired
certdir = os.path.join(self.config_dir, "cert_cache")
with open(os.path.join(certdir, most_frequent_cert+".crt"), "rb") as fp:
previous_cert = fp.read()
previous_cert = x509.load_der_x509_certificate(previous_cert, _BACKEND)
previous_ttl = previous_cert.not_valid_after - now
self._debug("TOFU: Unrecognised certificate {}! Raising the alarm...".format(fingerprint))
print("****************************************")
print("[SECURITY WARNING] Unrecognised certificate!")
print("The certificate presented for {} ({}) has never been seen before.".format(host, address))
print("This MIGHT be a Man-in-the-Middle attack.")
print("A different certificate has previously been seen {} times.".format(max_count))
if _HAS_CRYPTOGRAPHY:
if previous_ttl < datetime.timedelta():
print("That certificate has expired, which reduces suspicion somewhat.")
else:
print("That certificate is still valid for: {}".format(previous_ttl))
print("****************************************")
print("Attempt to verify the new certificate fingerprint out-of-band:")
print(fingerprint)
choice = input("Accept this new certificate? Y/N ").strip().lower()
if choice in ("y", "yes"):
self.db_cur.execute("""INSERT INTO cert_cache
VALUES (?, ?, ?, ?, ?, ?)""",
(host, address, fingerprint, now, now, 1))
self.db_conn.commit()
with open(os.path.join(certdir, fingerprint+".crt"), "wb") as fp:
fp.write(cert)
else:
raise Exception("TOFU Failure!")
# If not, cache this cert
else:
self._debug("TOFU: Blindly trusting first ever certificate for this host!")
self.db_cur.execute("""INSERT INTO cert_cache
VALUES (?, ?, ?, ?, ?, ?)""",
(host, address, fingerprint, now, now, 1))
self.db_conn.commit()
certdir = os.path.join(self.config_dir, "cert_cache")
if not os.path.exists(certdir):
os.makedirs(certdir)
with open(os.path.join(certdir, fingerprint+".crt"), "wb") as fp:
fp.write(cert)
def _get_handler_cmd(self, mimetype):
# Now look for a handler for this mimetype
# Consider exact matches before wildcard matches
exact_matches = []
wildcard_matches = []
for handled_mime, cmd_str in _MIME_HANDLERS.items():
if "*" in handled_mime:
wildcard_matches.append((handled_mime, cmd_str))
else:
exact_matches.append((handled_mime, cmd_str))
for handled_mime, cmd_str in exact_matches + wildcard_matches:
if fnmatch.fnmatch(mimetype, handled_mime):
break
else:
# Use "xdg-open" as a last resort.
cmd_str = "xdg-open %s"
self._debug("Using handler: %s" % cmd_str)
return cmd_str
def _handle_index(self, body, menu_gi, display=True):
self.index = []
preformatted = False
if self.idx_filename:
os.unlink(self.idx_filename)
tmpf = tempfile.NamedTemporaryFile("w", encoding="UTF-8", delete=False)
self.idx_filename = tmpf.name
for line in body.splitlines():
if line.startswith("```"):
preformatted = not preformatted
elif preformatted:
tmpf.write(line + "\n")
elif line.startswith("=>"):
try:
gi = GeminiItem.from_map_line(line, menu_gi)
self.index.append(gi)
tmpf.write(self._format_geminiitem(len(self.index), gi) + "\n")
except:
self._debug("Skipping possible link: %s" % line)
elif line.startswith("* "):
line = line[1:].lstrip("\t ")
tmpf.write(textwrap.fill(line, self.options["width"],
initial_indent = "• ", subsequent_indent=" ") + "\n")
elif line.startswith(">"):
line = line[1:].lstrip("\t ")
tmpf.write(textwrap.fill(line, self.options["width"],
initial_indent = "> ", subsequent_indent="> ") + "\n")
elif line.startswith("###"):
line = line[3:].lstrip("\t ")
tmpf.write("\x1b[4m" + line + "\x1b[0m""\n")
elif line.startswith("##"):
line = line[2:].lstrip("\t ")
tmpf.write("\x1b[1m" + line + "\x1b[0m""\n")
elif line.startswith("#"):
line = line[1:].lstrip("\t ")
tmpf.write("\x1b[1m\x1b[4m" + line + "\x1b[0m""\n")
else:
tmpf.write(textwrap.fill(line, self.options["width"]) + "\n")
tmpf.close()
self.lookup = self.index
self.page_index = 0
self.index_index = -1
if display:
cmd_str = _MIME_HANDLERS["text/plain"]
subprocess.call(shlex.split(cmd_str % self.idx_filename))
def _format_geminiitem(self, index, gi, url=False):
line = "[%d] %s" % (index, gi.name or gi.url)
if gi.name and url:
line += " (%s)" % gi.url
return line
def _show_lookup(self, offset=0, end=None, url=False):
for n, gi in enumerate(self.lookup[offset:end]):
print(self._format_geminiitem(n+offset+1, gi, url))
def _update_history(self, gi):
# Don't duplicate
if self.history and self.history[self.hist_index] == gi:
return
self.history = self.history[0:self.hist_index+1]
self.history.append(gi)
self.hist_index = len(self.history) - 1
def _log_visit(self, gi, address, size):
if not address:
return
self.log["requests"] += 1
self.log["bytes_recvd"] += size
self.visited_hosts.add(address)
if address[0] == socket.AF_INET:
self.log["ipv4_requests"] += 1
self.log["ipv4_bytes_recvd"] += size
elif address[0] == socket.AF_INET6:
self.log["ipv6_requests"] += 1
self.log["ipv6_bytes_recvd"] += size
def _get_active_tmpfile(self):
if self.mime == "text/gemini":
return self.idx_filename
else:
return self.tmp_filename
def _debug(self, debug_text):
if not self.options["debug"]:
return
debug_text = "\x1b[0;32m[DEBUG] " + debug_text + "\x1b[0m"
print(debug_text)
def _load_client_cert(self):
"""
Interactively load a TLS client certificate from the filesystem in PEM
format.
"""
print("Loading client certificate file, in PEM format (blank line to cancel)")
certfile = input("Certfile path: ").strip()
if not certfile:
print("Aborting.")
return
elif not os.path.exists(certfile):
print("Certificate file {} does not exist.".format(certfile))
return
print("Loading private key file, in PEM format (blank line to cancel)")
keyfile = input("Keyfile path: ").strip()
if not keyfile:
print("Aborting.")
return
elif not os.path.exists(keyfile):
print("Private key file {} does not exist.".format(keyfile))
return
self._activate_client_cert(certfile, keyfile)
def _generate_transient_cert_cert(self):
"""
Use `openssl` command to generate a new transient client certificate
with 24 hours of validity.
"""
certdir = os.path.join(self.config_dir, "transient_certs")
name = str(uuid.uuid4())
self._generate_client_cert(certdir, name, transient=True)
self.active_is_transient = True
self.transient_certs_created.append(name)
def _generate_persistent_client_cert(self):
"""
Interactively use `openssl` command to generate a new persistent client
certificate with one year of validity.
"""
print("What do you want to name this new certificate?")
print("Answering `mycert` will create `~/.av98/certs/mycert.crt` and `~/.av98/certs/mycert.key`")
name = input()
if not name.strip():
print("Aborting.")
return
certdir = os.path.join(self.config_dir, "client_certs")
self._generate_client_cert(certdir, name)
def _generate_client_cert(self, certdir, basename, transient=False):
"""
Use `openssl` binary to generate a client certificate (which may be
transient or persistent) and save the certificate and private key to the
specified directory with the specified basename.
"""
if not os.path.exists(certdir):
os.makedirs(certdir)
certfile = os.path.join(certdir, basename+".crt")
keyfile = os.path.join(certdir, basename+".key")
cmd = "openssl req -x509 -newkey rsa:2048 -days {} -nodes -keyout {} -out {}".format(1 if transient else 365, keyfile, certfile)
if transient:
cmd += " -subj '/CN={}'".format(basename)
os.system(cmd)
self._activate_client_cert(certfile, keyfile)
def _choose_client_cert(self):
"""
Interactively select a previously generated client certificate and
activate it.
"""
        certdir = os.path.join(self.config_dir, "client_certs")
        certs = glob.glob(os.path.join(certdir, "*.crt"))
        # Map menu indices to (certfile, keyfile) pairs
        choices = {}
        for n, cert in enumerate(certs):
            choices[str(n+1)] = (cert, os.path.splitext(cert)[0] + ".key")
            print("{}. {}".format(n+1, os.path.splitext(os.path.basename(cert))[0]))
        choice = input("> ").strip()
        if choice in choices:
            certfile, keyfile = choices[choice]
self._activate_client_cert(certfile, keyfile)
else:
print("What?")
def _activate_client_cert(self, certfile, keyfile):
self.client_certs["active"] = (certfile, keyfile)
self.active_cert_domains = []
self.prompt = self.cert_prompt
self._debug("Using ID {} / {}.".format(*self.client_certs["active"]))
def _deactivate_client_cert(self):
if self.active_is_transient:
for filename in self.client_certs["active"]:
os.remove(filename)
for domain in self.active_cert_domains:
self.client_certs.pop(domain)
self.client_certs["active"] = None
self.active_cert_domains = []
self.prompt = self.no_cert_prompt
self.active_is_transient = False
# Cmd implementation follows
def default(self, line):
if line.strip() == "EOF":
return self.onecmd("quit")
elif line.strip() == "..":
return self.do_up()
elif line.startswith("/"):
return self.do_search(line[1:])
# Expand abbreviated commands
first_word = line.split()[0].strip()
if first_word in _ABBREVS:
full_cmd = _ABBREVS[first_word]
expanded = line.replace(first_word, full_cmd, 1)
return self.onecmd(expanded)
# Try to parse numerical index for lookup table
try:
n = int(line.strip())
except ValueError:
print("What?")
return
try:
gi = self.lookup[n-1]
except IndexError:
print ("Index too high!")
return
self.index_index = n
self._go_to_gi(gi)
### Settings
@restricted
def do_set(self, line):
"""View or set various options."""
if not line.strip():
# Show all current settings
for option in sorted(self.options.keys()):
print("%s %s" % (option, self.options[option]))
elif len(line.split()) == 1:
# Show current value of one specific setting
option = line.strip()
if option in self.options:
print("%s %s" % (option, self.options[option]))
else:
print("Unrecognised option %s" % option)
else:
# Set value of one specific setting
option, value = line.split(" ", 1)
if option not in self.options:
print("Unrecognised option %s" % option)
return
# Validate / convert values
if option == "gopher_proxy":
if ":" not in value:
value += ":1965"
else:
host, port = value.rsplit(":",1)
if not port.isnumeric():
print("Invalid proxy port %s" % port)
return
elif option == "tls_mode":
if value.lower() not in ("ca", "tofu", "insecure"):
print("""TLS mode must be "ca", "tofu" or "insecure"!""")
return
elif value.isnumeric():
value = int(value)
elif value.lower() == "false":
value = False
elif value.lower() == "true":
value = True
else:
try:
value = float(value)
except ValueError:
pass
self.options[option] = value
@restricted
def do_cert(self, line):
"""Manage client certificates"""
print("Managing client certificates")
if self.client_certs["active"]:
print("Active certificate: {}".format(self.client_certs["active"][0]))
print("1. Deactivate client certificate.")
print("2. Generate new certificate.")
print("3. Load previously generated certificate.")
print("4. Load externally created client certificate from file.")
print("Enter blank line to exit certificate manager.")
choice = input("> ").strip()
if choice == "1":
print("Deactivating client certificate.")
self._deactivate_client_cert()
elif choice == "2":
self._generate_persistent_client_cert()
elif choice == "3":
self._choose_client_cert()
elif choice == "4":
self._load_client_cert()
else:
print("Aborting.")
@restricted
def do_handler(self, line):
"""View or set handler commands for different MIME types."""
if not line.strip():
# Show all current handlers
for mime in sorted(_MIME_HANDLERS.keys()):
print("%s %s" % (mime, _MIME_HANDLERS[mime]))
elif len(line.split()) == 1:
mime = line.strip()
if mime in _MIME_HANDLERS:
print("%s %s" % (mime, _MIME_HANDLERS[mime]))
else:
print("No handler set for MIME type %s" % mime)
else:
mime, handler = line.split(" ", 1)
_MIME_HANDLERS[mime] = handler
if "%s" not in handler:
print("Are you sure you don't want to pass the filename to the handler?")
def do_abbrevs(self, *args):
"""Print all AV-98 command abbreviations."""
header = "Command Abbreviations:"
self.stdout.write("\n{}\n".format(str(header)))
if self.ruler:
self.stdout.write("{}\n".format(str(self.ruler * len(header))))
for k, v in _ABBREVS.items():
self.stdout.write("{:<7} {}\n".format(k, v))
self.stdout.write("\n")
### Stuff for getting around
def do_go(self, line):
"""Go to a gemini URL or marked item."""
line = line.strip()
if not line:
print("Go where?")
# First, check for possible marks
elif line in self.marks:
gi = self.marks[line]
self._go_to_gi(gi)
# or a local file
elif os.path.exists(os.path.expanduser(line)):
gi = GeminiItem(None, None, os.path.expanduser(line),
"1", line, False)
self._go_to_gi(gi)
# If this isn't a mark, treat it as a URL
else:
self._go_to_gi(GeminiItem(line))
@needs_gi
def do_reload(self, *args):
"""Reload the current URL."""
self._go_to_gi(self.gi)
@needs_gi
def do_up(self, *args):
"""Go up one directory in the path."""
self._go_to_gi(self.gi.up())
def do_back(self, *args):
"""Go back to the previous gemini item."""
if not self.history or self.hist_index == 0:
return
self.hist_index -= 1
gi = self.history[self.hist_index]
self._go_to_gi(gi, update_hist=False)
def do_forward(self, *args):
"""Go forward to the next gemini item."""
if not self.history or self.hist_index == len(self.history) - 1:
return
self.hist_index += 1
gi = self.history[self.hist_index]
self._go_to_gi(gi, update_hist=False)
def do_next(self, *args):
"""Go to next item after current in index."""
return self.onecmd(str(self.index_index+1))
def do_previous(self, *args):
"""Go to previous item before current in index."""
self.lookup = self.index
return self.onecmd(str(self.index_index-1))
@needs_gi
def do_root(self, *args):
"""Go to root selector of the server hosting current item."""
self._go_to_gi(self.gi.root())
def do_tour(self, line):
"""Add index items as waypoints on a tour, which is basically a FIFO
queue of gemini items.
Items can be added with `tour 1 2 3 4` or ranges like `tour 1-4`.
All items in current menu can be added with `tour *`.
Current tour can be listed with `tour ls` and scrubbed with `tour clear`."""
line = line.strip()
if not line:
# Fly to next waypoint on tour
if not self.waypoints:
print("End of tour.")
else:
gi = self.waypoints.pop(0)
self._go_to_gi(gi)
elif line == "ls":
old_lookup = self.lookup
self.lookup = self.waypoints
self._show_lookup()
self.lookup = old_lookup
elif line == "clear":
self.waypoints = []
elif line == "*":
self.waypoints.extend(self.lookup)
elif looks_like_url(line):
self.waypoints.append(GeminiItem(line))
else:
for index in line.split():
try:
pair = index.split('-')
if len(pair) == 1:
# Just a single index
n = int(index)
gi = self.lookup[n-1]
self.waypoints.append(gi)
elif len(pair) == 2:
# Two endpoints for a range of indices
for n in range(int(pair[0]), int(pair[1]) + 1):
gi = self.lookup[n-1]
self.waypoints.append(gi)
else:
# Syntax error
print("Invalid use of range syntax %s, skipping" % index)
except ValueError:
print("Non-numeric index %s, skipping." % index)
except IndexError:
print("Invalid index %d, skipping." % n)
@needs_gi
def do_mark(self, line):
"""Mark the current item with a single letter. This letter can then
be passed to the 'go' command to return to the current item later.
Think of it like marks in vi: 'mark a'='ma' and 'go a'=''a'."""
line = line.strip()
if not line:
for mark, gi in self.marks.items():
print("[%s] %s (%s)" % (mark, gi.name, gi.url))
elif line.isalpha() and len(line) == 1:
self.marks[line] = self.gi
else:
print("Invalid mark, must be one letter")
def do_version(self, line):
"""Display version information."""
print("AV-98 " + _VERSION)
### Stuff that modifies the lookup table
def do_ls(self, line):
"""List contents of current index.
Use 'ls -l' to see URLs."""
self.lookup = self.index
self._show_lookup(url = "-l" in line)
self.page_index = 0
def do_gus(self, line):
"""Submit a search query to the GUS search engine."""
gus = GeminiItem("gemini://gus.guru/search")
self._go_to_gi(gus.query(line))
def do_history(self, *args):
"""Display history."""
self.lookup = self.history
self._show_lookup(url=True)
self.page_index = 0
def do_search(self, searchterm):
"""Search index (case insensitive)."""
results = [
gi for gi in self.lookup if searchterm.lower() in gi.name.lower()]
if results:
self.lookup = results
self._show_lookup()
self.page_index = 0
else:
print("No results found.")
def emptyline(self):
"""Page through index ten lines at a time."""
i = self.page_index
if i > len(self.lookup):
return
self._show_lookup(offset=i, end=i+10)
self.page_index += 10
### Stuff that does something to most recently viewed item
@needs_gi
def do_cat(self, *args):
"""Run most recently visited item through "cat" command."""
subprocess.call(shlex.split("cat %s" % self._get_active_tmpfile()))
@needs_gi
def do_less(self, *args):
"""Run most recently visited item through "less" command."""
cmd_str = self._get_handler_cmd(self.mime)
cmd_str = cmd_str % self._get_active_tmpfile()
subprocess.call("%s | less -R" % cmd_str, shell=True)
@needs_gi
def do_fold(self, *args):
"""Run most recently visited item through "fold" command."""
cmd_str = self._get_handler_cmd(self.mime)
cmd_str = cmd_str % self._get_active_tmpfile()
subprocess.call("%s | fold -w 70 -s" % cmd_str, shell=True)
@restricted
@needs_gi
def do_shell(self, line):
"""'cat' most recently visited item through a shell pipeline."""
subprocess.call(("cat %s |" % self._get_active_tmpfile()) + line, shell=True)
@restricted
@needs_gi
def do_save(self, line):
"""Save an item to the filesystem.
'save n filename' saves menu item n to the specified filename.
'save filename' saves the last viewed item to the specified filename.
'save n' saves menu item n to an automagic filename."""
args = line.strip().split()
# First things first, figure out what our arguments are
if len(args) == 0:
# No arguments given at all
# Save current item, if there is one, to a file whose name is
# inferred from the gemini path
if not self.tmp_filename:
print("You need to visit an item first!")
return
else:
index = None
filename = None
elif len(args) == 1:
# One argument given
# If it's numeric, treat it as an index, and infer the filename
try:
index = int(args[0])
filename = None
# If it's not numeric, treat it as a filename and
# save the current item
except ValueError:
index = None
filename = os.path.expanduser(args[0])
elif len(args) == 2:
# Two arguments given
# Treat first as an index and second as filename
index, filename = args
try:
index = int(index)
except ValueError:
print("First argument is not a valid item index!")
return
filename = os.path.expanduser(filename)
else:
print("You must provide an index, a filename, or both.")
return
# Next, fetch the item to save, if it's not the current one.
if index:
last_gi = self.gi
try:
gi = self.lookup[index-1]
self._go_to_gi(gi, update_hist = False, handle = False)
except IndexError:
print ("Index too high!")
self.gi = last_gi
return
else:
gi = self.gi
# Derive filename from current GI's path, if one hasn't been set
if not filename:
filename = os.path.basename(gi.path)
# Check for filename collisions and actually do the save if safe
if os.path.exists(filename):
print("File %s already exists!" % filename)
else:
# Don't use _get_active_tmpfile() here, because we want to save the
# "source code" of menus, not the rendered view - this way AV-98
# can navigate to it later.
shutil.copyfile(self.tmp_filename, filename)
print("Saved to %s" % filename)
# Restore gi if necessary
        if index is not None:
self._go_to_gi(last_gi, handle=False)
@needs_gi
def do_url(self, *args):
"""Print URL of most recently visited item."""
print(self.gi.url)
### Bookmarking stuff
@restricted
@needs_gi
def do_add(self, line):
"""Add the current URL to the bookmarks menu.
Optionally, specify the new name for the bookmark."""
with open(os.path.join(self.config_dir, "bookmarks.gmi"), "a") as fp:
fp.write(self.gi.to_map_line(line))
def do_bookmarks(self, line):
"""Show or access the bookmarks menu.
'bookmarks' shows all bookmarks.
'bookmarks n' navigates immediately to item n in the bookmark menu.
Bookmarks are stored using the 'add' command."""
bm_file = os.path.join(self.config_dir, "bookmarks.gmi")
if not os.path.exists(bm_file):
print("You need to 'add' some bookmarks, first!")
return
args = line.strip()
if len(args.split()) > 1 or (args and not args.isnumeric()):
print("bookmarks command takes a single integer argument!")
return
with open(bm_file, "r") as fp:
body = fp.read()
gi = GeminiItem("localhost/" + bm_file)
self._handle_index(body, gi, display = not args)
if args:
# Use argument as a numeric index
self.default(line)
### Help
def do_help(self, arg):
"""ALARM! Recursion detected! ALARM! Prepare to eject!"""
if arg == "!":
print("! is an alias for 'shell'")
elif arg == "?":
print("? is an alias for 'help'")
else:
cmd.Cmd.do_help(self, arg)
### Flight recorder
def do_blackbox(self, *args):
"""Display contents of flight recorder, showing statistics for the
current gemini browsing session."""
lines = []
# Compute flight time
now = time.time()
delta = now - self.log["start_time"]
        hours, remainder = divmod(delta, 3600)
minutes, seconds = divmod(remainder, 60)
# Count hosts
ipv4_hosts = len([host for host in self.visited_hosts if host[0] == socket.AF_INET])
ipv6_hosts = len([host for host in self.visited_hosts if host[0] == socket.AF_INET6])
# Assemble lines
lines.append(("Patrol duration", "%02d:%02d:%02d" % (hours, minutes, seconds)))
lines.append(("Requests sent:", self.log["requests"]))
lines.append((" IPv4 requests:", self.log["ipv4_requests"]))
lines.append((" IPv6 requests:", self.log["ipv6_requests"]))
lines.append(("Bytes received:", self.log["bytes_recvd"]))
lines.append((" IPv4 bytes:", self.log["ipv4_bytes_recvd"]))
lines.append((" IPv6 bytes:", self.log["ipv6_bytes_recvd"]))
lines.append(("Unique hosts visited:", len(self.visited_hosts)))
lines.append((" IPv4 hosts:", ipv4_hosts))
lines.append((" IPv6 hosts:", ipv6_hosts))
lines.append(("DNS failures:", self.log["dns_failures"]))
lines.append(("Timeouts:", self.log["timeouts"]))
lines.append(("Refused connections:", self.log["refused_connections"]))
lines.append(("Reset connections:", self.log["reset_connections"]))
# Print
for key, value in lines:
print(key.ljust(24) + str(value).rjust(8))
### The end!
def do_quit(self, *args):
"""Exit AV-98."""
# Close TOFU DB
self.db_conn.commit()
self.db_conn.close()
# Clean up after ourself
if self.tmp_filename and os.path.exists(self.tmp_filename):
os.unlink(self.tmp_filename)
if self.idx_filename and os.path.exists(self.idx_filename):
os.unlink(self.idx_filename)
for cert in self.transient_certs_created:
for ext in (".crt", ".key"):
certfile = os.path.join(self.config_dir, "transient_certs", cert+ext)
if os.path.exists(certfile):
os.remove(certfile)
print()
print("Thank you for flying AV-98!")
sys.exit()
do_exit = do_quit
# Main function
def main():
# Parse args
parser = argparse.ArgumentParser(description='A command line gemini client.')
parser.add_argument('--bookmarks', action='store_true',
help='start with your list of bookmarks')
parser.add_argument('--tls-cert', metavar='FILE', help='TLS client certificate file')
parser.add_argument('--tls-key', metavar='FILE', help='TLS client certificate private key file')
parser.add_argument('--restricted', action="store_true", help='Disallow shell, add, and save commands')
parser.add_argument('--version', action='store_true',
help='display version information and quit')
parser.add_argument('url', metavar='URL', nargs='*',
help='start with this URL')
args = parser.parse_args()
# Handle --version
if args.version:
print("AV-98 " + _VERSION)
sys.exit()
# Instantiate client
gc = GeminiClient(args.restricted)
# Process config file
rcfile = os.path.join(gc.config_dir, "av98rc")
if os.path.exists(rcfile):
print("Using config %s" % rcfile)
with open(rcfile, "r") as fp:
for line in fp:
line = line.strip()
if ((args.bookmarks or args.url) and
any((line.startswith(x) for x in ("go", "g", "tour", "t")))
):
if args.bookmarks:
print("Skipping rc command \"%s\" due to --bookmarks option." % line)
else:
print("Skipping rc command \"%s\" due to provided URLs." % line)
continue
gc.cmdqueue.append(line)
# Say hi
print("Welcome to AV-98!")
if args.restricted:
print("Restricted mode engaged!")
print("Enjoy your patrol through Geminispace...")
# Act on args
if args.tls_cert:
# If tls_key is None, python will attempt to load the key from tls_cert.
gc._activate_client_cert(args.tls_cert, args.tls_key)
if args.bookmarks:
gc.cmdqueue.append("bookmarks")
elif args.url:
if len(args.url) == 1:
gc.cmdqueue.append("go %s" % args.url[0])
else:
for url in args.url:
if not url.startswith("gemini://"):
url = "gemini://" + url
gc.cmdqueue.append("tour %s" % url)
gc.cmdqueue.append("tour")
# Endless interpret loop
while True:
try:
gc.cmdloop()
except KeyboardInterrupt:
print("")
if __name__ == '__main__':
main() | AV-98 | /AV-98-1.0.1.tar.gz/AV-98-1.0.1/av98.py | av98.py |
PyAVISO
=======
aviso
-----
A small library to download data from AVISO. It uses AVISO's DAP server to
optimize network use, downloading only the selected subset. For that, you
need to register a username and password at AVISO; it's free. The data is
downloaded in blocks, so it doesn't overload AVISO's server and runs safely
on unstable networks.
Fast aviso howto
----------------
Run this in the shell, outside Python, to install the library:
pip install aviso
Now try this, with proper username and password, to download some data into a NetCDF file:
AVISO_download -R -5/310/15/350 -D 1999-01-01/1999-06-01 -u user_at_AVISO_DAP -p password_at_AVISO_DAP --map='msla' --timestep=30 -o aviso.nc
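To do the same from inside Python, here is a minimal sketch (the module path and cfg keys follow this package's aviso.py; the username, password and filename are placeholders):
    from datetime import datetime
    from aviso.aviso import AVISO_fetch
    AVISO_fetch({'username': 'user_at_AVISO_DAP',
                 'password': 'password_at_AVISO_DAP',
                 'map': 'msla', 'type': 'upd',
                 'datadir': './', 'filename': 'aviso.nc',
                 'limits': {'LatIni': -5, 'LatFin': 15,
                            'LonIni': 310, 'LonFin': 350,
                            'd_ini': datetime(1999, 1, 1),
                            'd_fin': datetime(1999, 6, 1)}})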
Documentation
-------------
The full documentation is at http://pyaviso.readthedocs.org
ATTENTION
---------
This is not an official package, so please do not complain to AVISO if you have any trouble with it.
| AVISO | /AVISO-0.9.2.tar.gz/AVISO-0.9.2/README.rst | README.rst |
import os
import os.path
from datetime import datetime, timedelta
import time
import re
import logging
import logging.handlers
import numpy
import numpy as np
from numpy import ma
try:
import netCDF4
from netCDF4 import date2num
except:
import pupynere
from pydap.client import open_url
try:
from rings import okuboweiss
except:
pass
def products(ncfile):
""" Calculate some products from the downloaded data
        Think about it: should I filter the data here or during the
        analysis? Probably over there, but if I do it here I'll need to
        include some metadata, like the filter type and half-power
        cutoff frequency.
"""
assert os.path.isfile(ncfile)
nc = netCDF4.Dataset(ncfile, 'a', format='NETCDF4')
W = nc.createVariable('W', 'f4', ('time', 'latitude', 'longitude'),
fill_value=netCDF4.default_fillvals['f4'])
zeta = nc.createVariable('zeta', 'f4', ('time', 'latitude', 'longitude'),
fill_value=netCDF4.default_fillvals['f4'])
for tn in range(nc.variables['time'].size):
data = {'Lat': nc.variables['Lat'][:],
'Lon': nc.variables['Lon'][:],
'u': nc.variables['u'][tn],
'v': nc.variables['v'][tn],
}
products = okuboweiss.okuboweiss(data)
W[tn] = products['W']
zeta[tn] = products['zeta']
nc.close()
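# Usage sketch (assuming 'aviso.nc' was produced by AVISO_fetch and contains
# 'u', 'v', 'Lat' and 'Lon'):
#   products('aviso.nc')  # appends 'W' and 'zeta' variables to the file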
def mask_shallow(ncfile, zlimit=-150, zfile=None):
""" Mask all variables in vars @ gridpoints shallower than mindepth
Use http://opendap.ccst.inpe.br/Misc/etopo2/ETOPO2v2c_f4.nc
In the future move it out of here, into a different support
command line
"""
if zfile is None:
zfile = "http://opendap.ccst.inpe.br/Misc/etopo2/ETOPO2v2c_f4.nc"
nc = netCDF4.Dataset(ncfile, 'a')
Lat = nc.variables['latitude'][:]
Lon = nc.variables['longitude'][:]
#Lon,Lat = numpy.meshgrid(self.data['lon'],self.data['lat'])
#from fluid.common.common import get_bathymery
#z = get_bathymery(Lat, Lon, etopo_file=zfile)
# ========================================================================
# Not cute, but works for now.
ncz = netCDF4.Dataset(zfile, 'r')
# -180:180
lon_z = ncz.variables['x'][:]
lat_z = ncz.variables['y'][:]
# If input 0:360
ind = Lon > 180
Lon[ind] = Lon[ind] - 360
I = Lat.size
J = Lon.size
z = np.empty((I, J))
for i in range(I):
for j in range(J):
x_ind = (np.absolute(Lon[j] - lon_z)).argmin()
y_ind = (np.absolute(Lat[i] - lat_z)).argmin()
z[i, j] = ncz.variables['z'][y_ind, x_ind]
# ========================================================================
ind = z > zlimit
for v in nc.variables.keys():
if nc.variables[v].dimensions == (u'time', u'latitude', u'longitude'):
if nc.variables[v].shape[1:] != ind.shape:
return
I, J = np.nonzero(ind)
for i, j in zip(I, J):
nc.variables[v][:,i,j] = nc.variables[v]._FillValue
nc.sync()
nc.close()
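# Usage sketch (hypothetical file): mask gridpoints shallower than 150 m,
# using the default ETOPO2 bathymetry:
#   mask_shallow('aviso.nc', zlimit=-150)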
def mask(self):
""" Improve it. Make it more flexible
"""
#
print "Masking data shallower then: %s" % self.metadata['mask_shallow']
bath_mask = self.data['z']>self.metadata['mask_shallow']
#
Lon,Lat = numpy.meshgrid(self.data['lon'],self.data['lat'])
equator_mask = (Lat>-2.5) & (Lat<2.5)
#
for i in range(len(self.data['datetime'])):
self.data['u'][i,:,:]=ma.masked_array(self.data['u'][i,:,:].data,mask=(self.data['u'][i,:,:].mask) | bath_mask | equator_mask)
self.data['v'][i,:,:]=ma.masked_array(self.data['v'][i,:,:].data,mask=(self.data['v'][i,:,:].mask) | bath_mask | equator_mask)
self.data['h'][i,:,:]=ma.masked_array(self.data['h'][i,:,:].data,mask=(self.data['h'][i,:,:].mask) | bath_mask)
def eke(cutperiod=360, dt=7, verbose=False):
"""
    Include the possibility to work with a different dataset, like anomaly or ref.
    ATTENTION: need to move the username and password out of here.
"""
from maud import window_1Dmean
l = cutperiod*24 # From days to hours. Aviso time is on hours.
#self.metadata['urlbase'] = "http://%s:%[email protected]/thredds/dodsC" % (self.metadata['username'], self.metadata['password'])
url_uv = "http://aviso-users:[email protected]/thredds/dodsC/dataset-duacs-dt-upd-global-merged-madt-uv-daily"
dataset = open_url(url_uv)
T, I, J = dataset.Grid_0001.shape
eke = ma.masked_all((I,J))
I,J = numpy.nonzero(ma.masked_values(dataset.Grid_0001.Grid_0001[-300::60,:,:], dataset.Grid_0001.attributes['_FillValue']).max(axis=0))
t = ma.array(dataset.time[::dt])
if verbose:
from progressbar import ProgressBar
pbar = ProgressBar(maxval=I.shape[0]).start()
n=-1
for i, j in zip(I,J):
if verbose:
n+=1
pbar.update(n)
doit = True
while doit:
try:
u = ma.masked_values(dataset.Grid_0001.Grid_0001[::dt,i,j], dataset.Grid_0001.attributes['_FillValue'])*1e-2
v = ma.masked_values(dataset.Grid_0002.Grid_0002[::dt,i,j], dataset.Grid_0002.attributes['_FillValue'])*1e-2
u_prime = u-window_1Dmean(u, l=l, t=t, axis=0)
v_prime = v-window_1Dmean(v, l=l, t=t, axis=0)
eke[i,j] = (u_prime**2+v_prime**2).mean()/2.
doit=False
except:
print "I had some trouble. I'll wait a litte bit and try again"
time.sleep(10)
if verbose:
pbar.finish()
return eke | AVISO | /AVISO-0.9.2.tar.gz/AVISO-0.9.2/aviso/extra.py | extra.py |
__author__ = ["Guilherme Castelao <[email protected]>", "Roberto De Almeida <[email protected]>"]
import os
import itertools
from datetime import datetime, timedelta
from UserDict import IterableUserDict
try:
import cPickle as pickle
except ImportError:
import pickle
from numpy import ma
import dap.client
def topex_time_table(dt_days,dt_seconds,dt_microseconds,base_date=None):
"""
"""
if base_date is None:
base_date=datetime(year=1950,month=01,day=01,hour=0,minute=0,second=0)
t=[]
for d, s, ms in itertools.izip(dt_days.compressed(),dt_seconds.compressed(),dt_microseconds.compressed()):
dt=timedelta(days=int(d),seconds=int(s),microseconds=int(ms))
t.append(base_date+dt)
t=ma.masked_equal(t,-1)
return t
def topex_track_table(ndata,tracks,cycles):
"""
"""
track_list=[]
cycle_list=[]
for track, n, cycle in itertools.izip(tracks.compressed(),ndata.compressed(),cycles.compressed()):
for i in range(n):
track_list.append(track)
cycle_list.append(cycle)
track_list=ma.masked_equal(track_list,-1)
cycle_list=ma.masked_equal(cycle_list,-1)
return cycle_list,track_list
def read_file(filename,vars=['CorSSH','MSS','Bathy']):
"""
"""
import dap.client
try:
dataset = dap.client.open(filename)
except:
return
cycles=ma.masked_equal(dataset['Cycles'][:,0],dataset['Cycles']._FillValue)
cycles.set_fill_value(dataset['Cycles']._FillValue)
tracks=ma.masked_equal(dataset['Tracks'][:],dataset['Tracks']._FillValue)
tracks.set_fill_value(dataset['Tracks']._FillValue)
ndata=ma.masked_equal(dataset['NbPoints'][:],dataset['NbPoints']._FillValue)
ndata.set_fill_value(dataset['NbPoints']._FillValue)
[cycle_list,track_list]=topex_track_table(ndata,tracks,cycles)
# Time related
TimeDay=ma.masked_equal(dataset['TimeDay'][:,0],dataset['TimeDay']._FillValue)
TimeDay.set_fill_value(dataset['TimeDay']._FillValue)
TimeSec=ma.masked_equal(dataset['TimeSec'][:,0],dataset['TimeSec']._FillValue)
TimeSec.set_fill_value(dataset['TimeSec']._FillValue)
TimeMicroSec=ma.masked_equal(dataset['TimeMicroSec'][:,0],dataset['TimeMicroSec']._FillValue)
TimeMicroSec.set_fill_value(dataset['TimeMicroSec']._FillValue)
# Improve and include the check of the BeginDates
time_list=topex_time_table(TimeDay,TimeSec,TimeMicroSec)
# Position related
lat=ma.masked_equal(dataset['Latitudes'][:],dataset['Latitudes']._FillValue)*dataset['Latitudes'].scale_factor
lat.set_fill_value(dataset['Latitudes']._FillValue)
lon=ma.masked_equal(dataset['Longitudes'][:],dataset['Longitudes']._FillValue)*dataset['Longitudes'].scale_factor
lon.set_fill_value(dataset['Longitudes']._FillValue)
#
#data={'Cycles':cycles,'Tracks':tracks,'NbPoints':ndata,'Tracks Table':tracks_list,'TimeDay':TimeDay,'TimeSec':TimeSec,'TimeMicroSec':TimeMicroSec,'Time Table':time_list,'CorSSH':CorSSH,'Latitudes':lat,'Longitudes':lon,'MSS':MSS}
#data={'Cycles':cycle_list,'Tracks':track_list,'TimeDay':TimeDay,'TimeSec':TimeSec,'TimeMicroSec':TimeMicroSec,'Datetime':time_list,'CorSSH':CorSSH,'Latitudes':lat,'Longitudes':lon,'MSS':MSS}
data={'Cycles':cycle_list,'Tracks':track_list,'TimeDay':TimeDay,'TimeSec':TimeSec,'TimeMicroSec':TimeMicroSec,'Datetime':time_list,'Latitude':lat,'Longitude':lon}
#
for var in vars:
tmp=ma.masked_equal(dataset[var][:,0],dataset[var]._FillValue)*dataset[var].scale_factor
tmp.set_fill_value(dataset[var]._FillValue)
data[var]=tmp
return data
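# Usage sketch (hypothetical filename, following the CorSSH naming used in
# load_from_aviso below):
#   data = read_file('CorSSH_Ref_j1_Cycle021.nc', vars=['CorSSH', 'MSS'])
#   data['CorSSH']  # masked array of corrected SSH along the tracks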
def make_SSHA(data):
"""
"""
for c in data:
for t in data[c]:
data[c][t]['SSHA'] = data[c][t]['CorSSH']-data[c][t]['MSS']
return
def filter(data,var,limits):
"""
    ATTENTION: change it to a cond argument instead of limits, to give
    complete freedom over the conditions, e.g. choosing >= instead of >,
    or only a lower limit. Work in progress.
"""
index=(data[var].data>=limits[0])&(data[var].data<=limits[1])
data_out={}
for key in data:
data_out[key]=data[key][index]
return data_out
def load_TP_dataset(files,filtercond=None,data=None):
"""
"""
if data is None:
data={}
elif type(data)!=dict:
print "data should be a dictionary, and it's %s" % type(data)
return
if type(files)==str:
fileslist = [files]
else:
fileslist = files
i=0
for file in fileslist:
print "File: %s" % file
try:
data_in = read_file(file)
if filtercond is not None:
for var in filtercond:
data_in=filter(data_in,var,filtercond[var])
#
for c in set(data_in['Cycles']):
print "Doing cycle: %s" % c
if c not in data:
data[c]={}
index_c = (data_in['Cycles'].data==c)
for tck in set(data_in['Tracks'][index_c]):
#print "Doing track: %s" % tck
#if tck not in data_out[c].keys():
# data_out[c][tck]={}
index_tck = index_c & (data_in['Tracks'].data==tck)
# Change it for a generic all keys
data[c][tck]={'Datetime':data_in['Datetime'][index_tck],'Latitude':data_in['Latitude'][index_tck],'Longitude':data_in['Longitude'][index_tck],'CorSSH':data_in['CorSSH'][index_tck],'MSS':data_in['MSS'][index_tck],'Bathy':data_in['Bathy'][index_tck]}
if i<=25:
i+=1
else:
i=0
save_dataset(data,'load_TP_dataset.tmp')
except:
pass
#
return data
def load_from_path(path,filtercond=None):
"""
    Improve it to accept a URL too, for the case of a path on a DODS server.
    Maybe a regex too, to restrict to nc files? Maybe a pattern of names.
"""
import os
filenames=os.listdir(path)
filenames.sort()
files=[os.path.join(path,filename) for filename in filenames]
data=load_TP_dataset(files,filtercond)
return data
def load_from_aviso(urlbase='ftp://ftp.cls.fr/pub/oceano/AVISO/SSH/monomission/dt/corssh/ref/j1/',filtercond=None):
""" Load the data from aviso
"""
import urllib
import re
import urlparse
import StringIO
import gzip
import pupynere
import tempfile
f = urllib.urlopen(urlbase)
content = f.read()
filesnames = re.findall('CorSSH_Ref_\w{2}_Cycle\d{1,3}\.nc\.gz',content)
data = {}
for filename in filesnames[:3]:
f = urllib.urlopen(urlparse.urljoin(urlbase,filename))
#content = f.read()
#f=StringIO.StringIO(content)
f=StringIO.StringIO(f.read())
zf = gzip.GzipFile(fileobj=f)
f=open('tmp.nc','w')
f.write(zf.read())
f.close()
        data=load_TP_dataset(['tmp.nc'],filtercond=filtercond,data=data)
#x=NetCDFFile(tmp)
#ncf=tempfile.mkstemp(text=zf.read())
#unzf=tempfile.TemporaryFile()
#unzf.write(zf.read())
#unzf.seek(0)
#ncf=pupy.NetCDFFile(fileobj=unzf)
#print ncf.attributes
#print ncf.variables.keys()
return data
def save_dataset(data,filename):
"""
"""
import pickle
output = open(filename,'wb')
pickle.dump(data, output)
output.close()
return
def load_dataset(filename):
"""
"""
import pickle
pkl_file = open(filename, 'rb')
data = pickle.load(pkl_file)
pkl_file.close()
return data
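# Usage sketch (hypothetical filename):
#   save_dataset(data, 'tp_dataset.pkl')
#   data = load_dataset('tp_dataset.pkl')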
def join_cycles(data):
"""Join all cycles, so that from data[c][t][vars] return data[t][vars] with all cycles
"""
import numpy
vars=data[data.keys()[0]][data[data.keys()[0]].keys()[0]].keys()
data_out={}
for t in invert_keys(data):
data_out[t]={}
for var in vars:
data_out[t][var] = ma.masked_array([])
for c in data:
for t in data[c]:
for var in data[c][t]:
data_out[t][var]=numpy.ma.concatenate((data_out[t][var],data[c][t][var]))
return data_out
def invert_keys(data):
""" Invert the hirerachy of the first 2 level keys in the dictionary.
This is usable to group the data in tracks instead of cycles, like
data[tracks][cycles] = invert_keys(data[cycles][tracks])
"""
data_out={}
for c in data:
for t in data[c]:
if t not in data_out:
data_out[t]={}
if c not in data_out[t]:
data_out[t][c]={}
data_out[t][c] = data[c][t]
return data_out
##############################################################################
#### Extras
##############################################################################
def make_L(data,direction='S',z=None,):
""" Define the along track distance from one reference
direction define the cardinal direction priority (N,S,W or E).
S means that the reference will be the southern most point
z define the bathymetry, if defined, the closest point to that
bathymetry will be the reference. In case of cross this bathymetry
more than once, the direction criteria is used to distinguish.
"""
from fluid.common.distance import distance
all_cycles_data = join_cycles(data)
    if z is None:
import rpy
#for t in topex.invert_keys(data):
for t in all_cycles_data:
rpy.set_default_mode(rpy.NO_CONVERSION)
linear_model = rpy.r.lm(rpy.r("y ~ x"), data = rpy.r.data_frame(x=all_cycles_data[t]['Longitude'], y=all_cycles_data[t]['Latitude']))
rpy.set_default_mode(rpy.BASIC_CONVERSION)
coef=rpy.r.coef(linear_model)
if direction=='S':
lat0=all_cycles_data[t]['Latitude'].min()-1
lon0 = (lat0-coef['(Intercept)'])/coef['x']
L_correction = distance(all_cycles_data[t]['Latitude'],all_cycles_data[t]['Longitude'],lat0,lon0).min()
for c in invert_keys(data)[t]:
data[c][t]['L'] = distance(data[c][t]['Latitude'],data[c][t]['Longitude'],lat0,lon0)- L_correction
    # This bathymetric method was only copied from an old code. It should at
    # least be changed, if not removed. It is used when z is given, as
    # described in the docstring above.
    else:
import rpy
for t in all_cycles_data:
# First define the near coast values.
idSouth=numpy.argmin(all_cycles_data[t]['Latitude'])
L_tmp = distance(all_cycles_data[t]['Latitude'],all_cycles_data[t]['Longitude'],all_cycles_data[t]['Latitude'][idSouth],all_cycles_data[t]['Longitude'][idSouth])
idNearCoast = L_tmp.data<400e3
if min(all_cycles_data[t]['Bathy'][idNearCoast]) > -z:
idNearCoast = L_tmp.data<600e3
# Then calculate the distance to a reference
rpy.set_default_mode(rpy.NO_CONVERSION)
linear_model = rpy.r.lm(rpy.r("y ~ x"), data = rpy.r.data_frame(x=all_cycles_data[t]['Longitude'], y=all_cycles_data[t]['Latitude']))
rpy.set_default_mode(rpy.BASIC_CONVERSION)
coef=rpy.r.coef(linear_model)
lat0 = all_cycles_data[t]['Latitude'].min()-1
lon0 = (lat0-coef['(Intercept)'])/coef['x']
#L = distance(,lon,lat0,lon0)
#
#id0 = numpy.argmin(numpy.absolute(all_cycles_data[t]['Bathy'][idNearCoast]))
idref=numpy.argmin(numpy.absolute(all_cycles_data[t]['Bathy'][idNearCoast]+z))
#L_correction = distance(all_cycles_data[t]['Latitude'][idNearCoast][idref],all_cycles_data[t]['Longitude'][idNearCoast][idref],all_cycles_data[t]['Latitude'][idNearCoast][idref],all_cycles_data[t]['Longitude'][idNearCoast][idref])
L_correction = distance(all_cycles_data[t]['Latitude'][idNearCoast][idref],all_cycles_data[t]['Longitude'][idNearCoast][idref],lat0,lon0)
            for c in invert_keys(data)[t]:
#data[c][t]['L'] = distance(data[c][t]['Latitude'],data[c][t]['Longitude'],all_cycles_data[t]['Latitude'][idNearCoast][id0],all_cycles_data[t]['Longitude'][idNearCoast][id0]) - L_correction
data[c][t]['L'] = distance(data[c][t]['Latitude'],data[c][t]['Longitude'],lat0,lon0) - L_correction
#
return
##############################################################################
####
##############################################################################
##############################################################################
class TOPEX(IterableUserDict):
"""
"""
def __init__(self,tracks=None,nbpoints=None,Cycles=None):
"""
"""
self.data={1:{11:[1,2,3],12:[1.1,1.2,1.3],'teste':['orange','apple','pear']},2:[10,20,30],3:[100,200,300]}
return
def __getitem__(self, key):
print "Chave tipo %s" % type(key).__name__
if isinstance(key, basestring):
            if key=='join':
                print "key is join"
                # TODO: incomplete; this should join all cycles, like the
                # module-level join_cycles() function above
                data_out={}
                for k in self.data:
                    if k not in data_out:
                        pass
if isinstance(key, slice):
print "It's a slice"
if (key.start==0) & (key.stop>max(self.data.keys())+1) & (key.step==None):
return self.data
print "I'm not ready for that. Only full return, like x[:]"
return
if key not in self.data:
print "%s is not a valid key" % key
return
print "key: %s" % key
return self.data[key] | AVISO | /AVISO-0.9.2.tar.gz/AVISO-0.9.2/aviso/topex.py | topex.py |
import os
import os.path
from datetime import datetime, timedelta
import time
import re
import logging
import logging.handlers
import numpy
import numpy as np
from numpy import ma
try:
import netCDF4
from netCDF4 import date2num
except:
import pupynere
from pydap.client import open_url
# Time to refactor and change some things. One class to download, and another
# on top of that to handle the file as if it were a masked array (MA).
class AVISO_fetch(object):
""" Class to fetch maps from AVISO
- Deal with the file to save the data
- Download
- Download LatLon
- Adjust -180/180 <-> 0/360
- First to second limit
- Download time
- Parse data input into time index
- Define the size of the blocks (number of time snapshots per download)
- Download data in blocks
        - Think about how to save the data in a generic way. This class should worry only about downloading and saving; another class should be created on top of this one to offer the downloaded data as a MA.
"""
def __init__(self, cfg):
"""
"""
self.cfg = cfg
self.set_logger()
self.logger.info("Initializing AVISO_fetch class")
self.logger.debug("cfg: %s" % cfg)
if ('username' not in self.cfg) | ('password' not in self.cfg):
self.logger.error("Aviso DAP server requires a registered username and password. I'll abort.")
return
if 'urlbase' not in self.cfg:
self.cfg['urlbase'] = \
"http://%s:%[email protected]/thredds/dodsC" % \
(self.cfg['username'], self.cfg['password'])
self.logger.debug("urlbase: %s" % self.cfg['urlbase'])
#if type(self.cfg['map']) == str:
# self.cfg['map'] = [self.cfg['map']]
if 'force_download' not in self.cfg:
self.cfg['force_download'] = False
if (self.cfg['datadir'] != '') and \
(not os.path.isdir(self.cfg['datadir'])):
print "There is no data directory: %s" % self.cfg['datadir']
return
self.file = os.path.join(self.cfg['datadir'], self.cfg['filename'])
try:
self.nc = netCDF4.Dataset(self.file,'w', format='NETCDF4')
except:
self.nc = pupynere.netcdf_file(self.file,'w')
# ----------
self.nc.created_datetime = datetime.now().isoformat()
self.nc.metadata_map = self.cfg['map']
self.nc.metadata_type = self.cfg['type']
self.nc.metadata_urlbase = self.cfg['urlbase']
self.nc.metadata_force_download = str(self.cfg['force_download'])
# ----------
self.download_data()
self.nc.close()
def set_logger(self):
"""
"""
# Creating another log level
logging.VERBOSE = logging.DEBUG - 1
logging.addLevelName(logging.VERBOSE, 'VERBOSE')
#create logger
logger = logging.getLogger("AVISO fetch")
logger.setLevel(logging.VERBOSE)
#create console handler and set level to debug
ch = logging.StreamHandler()
ch.setLevel(logging.WARN)
#create formatter
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
#add formatter to ch
ch.setFormatter(formatter)
#add ch to logger
logger.addHandler(ch)
if 'logfile' in self.cfg:
#create a rotate file handler
fh = logging.handlers.RotatingFileHandler(
self.cfg['logfile'],
mode='a', maxBytes=1000000, backupCount=10)
fh.setLevel(logging.DEBUG)
#create formatter
formatter = logging.Formatter("%(asctime)s - %(name)s - %(levelname)s - %(message)s")
#add formatter to ch
fh.setFormatter(formatter)
#add ch to logger
logger.addHandler(fh)
#"application" code
#logger.debug("debug message")
#logger.info("info message")
#logger.warn("warn message")
#logger.error("error message")
#logger.critical("critical message")
self.logger = logger
def set_source_filename(self, map):
"""
"""
assert (map in ['madt', 'msla'])
self.cfg['source_filename'] = "dataset-duacs-dt-global-allsat-%s" % (map,)
self.logger.debug("source_filename: %s" % self.cfg['source_filename'])
def set_dataset(self, map, var):
"""
"""
assert (var in ['h', 'u', 'v'])
self.logger.debug("Setting the dataset on the DAP server")
self.set_source_filename(map)
if var == 'h':
url = "%s/%s-h" % (self.cfg['urlbase'], self.cfg['source_filename'])
elif var in ['u', 'v']:
url = "%s/%s-uv" % (self.cfg['urlbase'], self.cfg['source_filename'])
ntries = 40
for i in range(ntries):
self.logger.info("Connecting with the URL: %s" % url)
try:
dataset = open_url(url)
self.logger.debug("Dataset on the DAP server is ready")
return dataset
except:
waitingtime = 30+i*20
self.logger.warn("Failed to open_url. I'll try again in %s" %
waitingtime)
time.sleep(waitingtime)
def download_time(self, dataset):
"""
"""
self.logger.debug("Downloading time")
t = dataset['time'][:]
if 't_ini' not in self.cfg['limits']:
if 'd_ini' in self.cfg['limits']:
assert type(self.cfg['limits']['d_ini']) == datetime, \
"limits:d_ini must be a datetime"
d = date2num(self.cfg['limits']['d_ini'],
dataset['time'].attributes['units'])
self.cfg['limits']['t_ini'] = np.nonzero(t>=d)[0][0]
else:
self.cfg['limits']['t_ini'] = 0
self.logger.debug("Setting t_ini: %s" % self.cfg['limits']['t_ini'])
if 't_step' not in self.cfg['limits']:
self.cfg['limits']['t_step'] = 1
self.logger.debug("Setting t_step: %s" % self.cfg['limits']['t_step'])
if 't_fin' not in self.cfg['limits']:
if 'd_fin' in self.cfg['limits']:
assert type(self.cfg['limits']['d_fin']) == datetime, \
"limits:d_ini must be a datetime"
d = date2num(self.cfg['limits']['d_fin'],
dataset['time'].attributes['units'])
self.cfg['limits']['t_fin'] = np.nonzero(t>d)[0][0]
else:
self.cfg['limits']['t_fin'] = dataset['time'].shape[0]
self.logger.debug("Setting t_fin: %s" % self.cfg['limits']['t_fin'])
t_ini = self.cfg['limits']['t_ini']
t_fin = self.cfg['limits']['t_fin']
t_step = self.cfg['limits']['t_step']
# ----
data={}
#
#from coards import from_udunits
#t0=datetime(1950,1,1)
#if (re.match('^hours since \d{4}-\d{2}-\d{2}$',dataset_h['time'].attributes['units'])):
#if (re.match('^hours since 1950-01-01',self.dataset['h']['time'].attributes['units'])):
# t = self.dataset['h']['time'][t_ini:t_fin:t_step].tolist()
# data['datetime'] = numpy.array([t0+timedelta(hours=h) for h in t])
#else:
# self.logger.error("Problems interpreting the time")
t = dataset['time'][t_ini:t_fin:t_step].tolist()
self.nc.createDimension('time', len(range(t_ini,t_fin,t_step)))
nct = self.nc.createVariable('time', 'f8', ('time', ))
nct[:] = t
nct.units = dataset['time'].attributes['units']
def download_LonLat(self, dataset):
""" Download the Lon x Lat coordinates
"""
self.logger.debug("Downloading LonLat")
data = {}
limits = self.cfg['limits']
Lat = dataset['lat'][:].astype('f')
Lon = dataset['lon'][:].astype('f')
# If data is requested as -180/180, convert to 0/360,
# which is the pattern in AVISO
if limits['LonIni'] < 0: limits['LonIni']+=360
if limits['LonFin'] < 0: limits['LonFin']+=360
Latlimits = numpy.arange(Lat.shape[0])[(Lat[:]>=limits["LatIni"]) & (Lat[:]<=limits["LatFin"])]
Latlimits = [Latlimits[0],Latlimits[-1]]
lat = Lat[Latlimits[0]:Latlimits[-1]+1]
if limits['LonFin'] > limits['LonIni']:
Lonlimits = numpy.arange(Lon.shape[0])[(Lon[:]>=limits["LonIni"]) & (Lon[:]<=limits["LonFin"])]
Lonlimits=[Lonlimits[0],Lonlimits[-1]]
lon = Lon[Lonlimits[0]:Lonlimits[-1]+1]
else:
Lonlimits = [np.nonzero(Lon>=limits['LonIni'])[0][0],
np.nonzero(Lon<=limits['LonFin'])[0][-1]]
lon = np.append(Lon[Lonlimits[0]:],Lon[:Lonlimits[1]+1])
self.cfg['limits']['Latlimits'] = Latlimits
self.cfg['limits']['Lonlimits'] = Lonlimits
Lon, Lat = numpy.meshgrid( lon, lat )
self.slice_size = lon.shape[0]*lat.shape[0]
# ========
self.nc.createDimension('latitude', lat.shape[0])
self.nc.createDimension('longitude', lon.shape[0])
nclat = self.nc.createVariable('latitude', 'f4',
('latitude',))
nclon = self.nc.createVariable('longitude', 'f4',
('longitude',))
ncLat = self.nc.createVariable('Lat', 'f4',
('latitude', 'longitude'))
ncLon = self.nc.createVariable('Lon', 'f4',
('latitude', 'longitude'))
nclat[:] = lat
nclon[:] = lon
ncLat[:] = Lat
ncLon[:] = Lon
def download_data(self):
"""
Not a cute way, but works for now.
"""
if self.cfg['map'] == 'madt+msla':
dataset = self.set_dataset('madt', 'h')
self.download_time(dataset)
self.download_LonLat(dataset)
self.download_var('ssh', dataset['adt']['adt'], dataset['adt'].attributes)
dataset = self.set_dataset('madt', 'u')
self.download_var('u', dataset['u']['u'], dataset['u'].attributes)
self.download_var('v', dataset['v']['v'], dataset['v'].attributes)
dataset = self.set_dataset('msla', 'h')
self.download_var('sla', dataset['sla']['sla'], dataset['sla'].attributes)
dataset = self.set_dataset('msla', 'u')
self.download_var('u_anom', dataset['u']['u'], dataset['u'].attributes)
self.download_var('v_anom', dataset['v']['v'], dataset['v'].attributes)
return
dataset = self.set_dataset(self.cfg['map'], 'h')
self.download_time(dataset)
self.download_LonLat(dataset)
if self.cfg['map'] == 'madt':
self.download_var('h', dataset['adt']['adt'], dataset['adt'].attributes)
elif self.cfg['map'] == 'msla':
self.download_var('sla', dataset['sla']['sla'], dataset['sla'].attributes)
dataset = self.set_dataset(self.cfg['map'], 'u')
self.download_var('u', dataset['u']['u'], dataset['u'].attributes)
self.download_var('v', dataset['v']['v'], dataset['v'].attributes)
def download_var(self, v, dataset, attr):
        # Will download in blocks of at most 4e7 values,
        # capped at 100 time steps per block.
dblocks = min(100, max(1, int(4e7/self.slice_size)))
self.logger.debug("Will download %s in blocks of %s" % \
(v, dblocks))
ti = numpy.arange(self.cfg['limits']['t_ini'],
self.cfg['limits']['t_fin'],
self.cfg['limits']['t_step'])
blocks = ti[::dblocks]
if self.cfg['limits']['t_fin'] not in blocks:
blocks = numpy.append(blocks, self.cfg['limits']['t_fin'])
#------
ntries = 40
self.logger.info("Getting %s" % v)
#data['h'] = ma.masked_all((len(ti),Lonlimits[-1]-Lonlimits[0], Latlimits[-1]-Latlimits[0]), dtype=numpy.float64)
#dataset.type.typecode
data = self.nc.createVariable(v, 'i2',
('time', 'latitude', 'longitude'),
fill_value=netCDF4.default_fillvals['i2'])
#data.missing_value = missing_value
units = attr['units']
if units == 'cm':
factor = 1e-2
units = 'm'
elif units == 'cm/s':
factor = 1e-2
units = 'm/s'
else:
factor = None
data.units = units
# Work on these limits. Must have a better way to handle it
Lonlimits = self.cfg['limits']['Lonlimits']
Latlimits = self.cfg['limits']['Latlimits']
for b1, b2 in zip(blocks[:-1], blocks[1:]):
self.logger.debug("From %s to %s of %s" % (b1, b2, blocks[-1]))
ind = numpy.nonzero((ti>=b1) & (ti<b2))
for i in range(ntries):
self.logger.debug("Try n: %s" % i)
try:
if Lonlimits[1] > Lonlimits[0]:
tmp = dataset[b1:b2:self.cfg['limits']['t_step'],
Latlimits[0]:Latlimits[-1]+1,
Lonlimits[0]:Lonlimits[-1]+1]
else:
tmp1 = dataset[b1:b2:self.cfg['limits']['t_step'],
Latlimits[0]:Latlimits[-1]+1,
Lonlimits[0]: ]
tmp2 = dataset[b1:b2:self.cfg['limits']['t_step'],
Latlimits[0]:Latlimits[-1]+1,
:Lonlimits[-1]+1]
tmp = np.append(tmp1, tmp2, axis=2)
ind_valid = tmp != attr['_FillValue']
if factor is not None:
tmp[ind_valid] = factor * tmp[ind_valid]
tmp[~ind_valid] = data._FillValue
#data[ind] = tmp.swapaxes(1,2).astype('f')
data[ind] = tmp
break
except:
waitingtime = 30+i*20
self.logger.warn(
"Failed to download. I'll try again in %ss" % \
waitingtime)
time.sleep(waitingtime)
#data['h'] = 1e-2*data['h'].swapaxes(1,2)
for a in attr:
if a not in ['units', '_FillValue']:
setattr(data, a, attr[a])
class Aviso_map(object):
""" Class to get the maps of h and uv from Aviso
        ATTENTION: should improve it. Should include a check on whether the
        file has the same time and area coverage.
"""
def __init__(self, metadata=None, auto=True):
"""
"""
self.metadata = metadata or {}
self.set_default()
if self.metadata['map'] == 'madt+msla':
print "Map: madt+msla"
print "First getting madt"
m = self.metadata.copy()
m['map'] = 'madt'
data_madt = Aviso_map(metadata=m)
print "Now getting msla"
m = self.metadata.copy()
m['map'] = 'msla'
data_msla = Aviso_map(metadata=m)
if (data_madt['Lat']!=data_msla['Lat']).any() | \
(data_madt['Lon']!=data_msla['Lon']).any() | \
(data_madt['datetime']!=data_msla['datetime']).any():
print("PROBLEMS!!!!! data_madt and data_msla are not equivalent!")
return
else:
self.data = data_madt.data
self.data['ssh'] = self.data['h']
del(self.data['h'])
self.data['eta'] = data_msla.data['h']
self.data['u_anom'] = data_msla.data['u']
self.data['v_anom'] = data_msla.data['v']
elif (self.metadata['map'] == 'madt') or (self.metadata['map'] == 'msla'):
self.set_source_filename()
if auto==True:
self.get_data()
def __getitem__(self, key):
"""
"""
        if isinstance(self.data[key], ma.MaskedArray):
return self.data[key]
elif hasattr(self.data[key], 'missing_value'):
return ma.masked_values(self.data[key][:], getattr(self.data[key], 'missing_value'))
return ma.array(self.data[key])
def __setitem__(self, key, value):
"""
"""
self.data[key] = value
def keys(self):
return self.data.keys()
def set_default(self):
"""
"""
if ('username' not in self.metadata) | ('password' not in self.metadata):
print("Aviso DAP server requires a registered username and password, sorry.")
return
if 'type' not in self.metadata:
self.metadata['type'] = "upd" # upd, ref
if 'map' not in self.metadata:
self.metadata['map'] = "madt" # madt, msla
if 'limits' not in self.metadata:
self.metadata['limits'] = {'LatIni': 0, 'LatFin': 15, 'LonIni': 296, 'LonFin': 317}
if 'datadir' not in self.metadata:
self.metadata['datadir'] = "./" #"../data"
if 'urlbase' not in self.metadata:
            # Double check this. I believe now it is required to have a password,
            # so I should remove the option with no user/pass
if ('username' in self.metadata) & ('password' in self.metadata):
self.metadata['urlbase'] = "http://%s:%[email protected]/thredds/dodsC" % (self.metadata['username'], self.metadata['password'])
else:
self.metadata['urlbase'] = "http://opendap.aviso.oceanobs.com/thredds/dodsC"
if 'force_download' not in self.metadata:
self.metadata['force_download'] = False
return
def set_source_filename(self):
"""
"""
self.metadata['source_filename'] = "dataset-duacs-dt-%s-global-merged-%s" % (self.metadata['type'],self.metadata['map'])
return
def get_data(self):
"""
"""
self.file = os.path.join(self.metadata['datadir'],self.metadata['source_filename']+".nc")
#self.nc = pupynere.netcdf_file(file, 'w')
#self.download()
if self.metadata['force_download'] == True:
self.download()
else:
            if os.path.isfile(self.file):
print "I already downloaded that. I'm just reloading"
else:
print "I need to download it. This might take a while. (%s)" % datetime.now()
self.download()
        if not os.path.isdir(self.metadata['datadir']):
            print(" There is no data directory: %s" % self.metadata['datadir'])
        # Placeholder: saving the downloaded data with pickle is not
        # implemented yet, so this block does nothing.
        try:
            pass
        except:
            print "Couldn't save the data on pickle"
return
def download(self):
"""
Migrate it to use np.lib.arrayterator.Arrayterator
"""
url_h = "%s/%s-h-daily" % (self.metadata['urlbase'], self.metadata['source_filename'])
dataset_h = open_url(url_h)
url_uv = "%s/%s-uv-daily" % (self.metadata['urlbase'], self.metadata['source_filename'])
dataset_uv = open_url(url_uv)
# ----
if 't_ini' not in self.metadata['limits']:
self.metadata['limits']['t_ini'] = 0
if 't_fin' not in self.metadata['limits']:
self.metadata['limits']['t_fin'] = dataset_h['time'].shape[0]
        if 't_step' not in self.metadata['limits']:
            self.metadata['limits']['t_step'] = 1
        else:
            print "Attention!! t_step set to: %s" % self.metadata['limits']['t_step']
t_ini = self.metadata['limits']['t_ini']
t_fin = self.metadata['limits']['t_fin']
t_step = self.metadata['limits']['t_step']
# ----
data={}
#
#from coards import from_udunits
t0=datetime(1950,1,1)
#if (re.match('^hours since \d{4}-\d{2}-\d{2}$',dataset_h['time'].attributes['units'])):
if (re.match('^hours since 1950-01-01',dataset_h['time'].attributes['units'])):
data['datetime']=numpy.array([t0+timedelta(hours=h) for h in dataset_h['time'][t_ini:t_fin:t_step].tolist()])
else:
print "Problems interpreting the time"
return
#time = self.nc.createVariable('time', 'i', ('time',))
#time[:] = dataset_h['time'][t_ini:t_fin:t_step]
#time.units = dataset_h['time'].attributes['units']
#data['time'] = time
#
limits=self.metadata['limits']
Lat=dataset_h['NbLatitudes']
Lon=dataset_h['NbLongitudes']
Latlimits=numpy.arange(Lat.shape[0])[(Lat[:]>=limits["LatIni"]) & (Lat[:]<=limits["LatFin"])]
Latlimits=[Latlimits[0],Latlimits[-1]]
Lonlimits=numpy.arange(Lon.shape[0])[(Lon[:]>=limits["LonIni"]) & (Lon[:]<=limits["LonFin"])]
Lonlimits=[Lonlimits[0],Lonlimits[-1]]
data['Lon'], data['Lat'] = numpy.meshgrid( (Lon[Lonlimits[0]:Lonlimits[-1]]), (Lat[Latlimits[0]:Latlimits[-1]]) )
#------
self.data = data
#Arrayterator = numpy.lib.arrayterator.Arrayterator
#dataset = dataset_h['Grid_0001']['Grid_0001']
#ssh = Arrayterator(dataset)[t_ini:t_fin:t_step]
#blocks = 1e4
file = os.path.join(self.metadata['datadir'],self.metadata['source_filename']+".nc")
nc = pupynere.netcdf_file(file,'w')
nc.createDimension('time', len(range(t_ini,t_fin,t_step)))
nc.createDimension('lon', (Lonlimits[-1]-Lonlimits[0]))
nc.createDimension('lat', (Latlimits[-1]-Latlimits[0]))
dblocks = max(1,int(1e5/((Lonlimits[-1]-Lonlimits[0])*(Latlimits[-1]-Latlimits[0]))))
ti = numpy.arange(t_ini, t_fin, t_step)
blocks = ti[::dblocks]
if ti[-1] not in blocks:
blocks = numpy.append(blocks,t_fin)
ntries = 40
#------
        for v, dataset, missing_value in zip(
                ['h', 'u', 'v'],
                [dataset_h['Grid_0001']['Grid_0001'],
                 dataset_uv['Grid_0001']['Grid_0001'],
                 dataset_uv['Grid_0002']['Grid_0002']],
                [dataset_h['Grid_0001']._FillValue,
                 dataset_uv['Grid_0001']._FillValue,
                 dataset_uv['Grid_0002']._FillValue]):
print "Getting %s" % v
#data['h'] = ma.masked_all((len(ti),Lonlimits[-1]-Lonlimits[0], Latlimits[-1]-Latlimits[0]), dtype=numpy.float64)
self.data[v] = nc.createVariable(v, 'f4', ('time', 'lat', 'lon'))
self.data[v].missing_value = missing_value
for b1, b2 in zip(blocks[:-1], blocks[1:]):
print "From %s to %s of %s" % (b1, b2, blocks[-1])
ind = numpy.nonzero((ti>=b1) & (ti<b2))
for i in range(ntries):
print "Try n: %s" % i
try:
self.data[v][ind] = dataset[b1:b2:t_step, Lonlimits[0]:Lonlimits[-1],Latlimits[0]:Latlimits[-1]].swapaxes(1,2).astype('f')
break
except:
waitingtime = 30+i*20
print "Failed to download. I'll try again in %ss" % waitingtime
time.sleep(waitingtime)
#data['h'] = 1e-2*data['h'].swapaxes(1,2)
#class MAFromDAP(object):
# """
# """
# def __init__(self, dataset, var):
# self.dataset = dataset
# self.var = var
# def find_missingvalue(self):
# """ Extract the missing value from the dataset
# """
# self.missing_value = getattr(self.dataset, '_FillValue')
# def __getitem__(self, key):
# if type(key) == slice:
#
#x = MAFromDAP(dataset_h['Grid_0001'], 'Grid_0001')
#print dir(x)
#print x[::10]
# ==================================================================
# ==== Part of the script that I used for the NBCR paper
# ==== I should include this in the future as an option
# ==== Straight on HDF5. Maybe just netcdf4, which is hdf5.
#import tables
#
#filename = "%s.h5f" % (config['data']['filename'])
#h5f = tables.openFile(os.path.join(config['data']['datadir'],filename), 'w')
#
#filters = tables.Filters(complevel=5, complib='zlib')
#atom = tables.Float64Atom()
#
##if 'aviso' not in h5f.root:
#gaviso = h5f.createGroup(h5f.root, "aviso", "AVISO data")
#
##h5f.root.eddies._v_attrs.data_version = config['data']['data_version']
#h5f.root.aviso._v_attrs.created = datetime.now().isoformat()
#
#h5f.root.aviso._v_attrs.metadata_data_map = metadata['data']['map']
#h5f.root.aviso._v_attrs.metadata_data_type = metadata['data']['type']
#h5f.root.aviso._v_attrs.metadata_data_urlbase = metadata['data']['urlbase']
#h5f.root.aviso._v_attrs.metadata_data_force_download = metadata['data']['force_download']
#
#
#d0 = min(data['datetime'])
#h5f.createCArray(h5f.root.aviso, 'time', tables.Float64Atom(), (nt,), filters=filters)
#h5f.root.aviso.time[:] = ma.masked_array([(d-d0).days+(d-d0).seconds/24./60/60 for d in data['datetime']])
#h5f.root.aviso.time._v_attrs.units = 'days since %s' % datetime.strftime(d0,"%Y-%m-%d %H:%M:%S")
#h5f.root.aviso.time._v_attrs.missing_value = data['datetime'].fill_value
#
#
#h5f.createCArray(h5f.root.aviso, 'Lat', tables.Float64Atom(), (ni,nj), filters=filters)
#h5f.root.aviso.Lat[:] = data['Lat']
#h5f.root.aviso.Lat._v_attrs.units = 'degrees_north'
#h5f.root.aviso.Lat._v_attrs.missing_value = data['Lat'].fill_value
#
#h5f.createCArray(h5f.root.aviso, 'Lon', tables.Float64Atom(), (ni,nj), filters=filters)
#h5f.root.aviso.Lon[:] = data['Lon']
#h5f.root.aviso.Lon._v_attrs.units = 'degrees_east'
#h5f.root.aviso.Lon._v_attrs.missing_value = data['Lon'].fill_value
#
#h5f.flush()
#
#try:
# h5f.createCArray(h5f.root.aviso, 'depth', tables.Float64Atom(), (ni,nj), filters=filters)
# h5f.root.aviso.depth[:] = data['z']
# h5f.root.aviso.depth._v_attrs.units = 'm'
# h5f.root.aviso.depth._v_attrs.missing_value = data['z'].fill_value
#except:
# print "Couldn't save depth"
#
#try:
# h5f.createCArray(h5f.root.aviso, 'ssh', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.ssh[:] = data['ssh']
# h5f.root.aviso.ssh._v_attrs.units = 'm'
# h5f.root.aviso.ssh._v_attrs.missing_value = data['ssh'].fill_value
#
# h5f.createCArray(h5f.root.aviso, 'u', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.u[:] = data['u']
# h5f.root.aviso.u._v_attrs.units = 'm'
# h5f.root.aviso.u._v_attrs.missing_value = data['u'].fill_value
# h5f.createCArray(h5f.root.aviso, 'v', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.v[:] = data['v']
# h5f.root.aviso.v._v_attrs.units = 'm'
# h5f.root.aviso.v._v_attrs.missing_value = data['v'].fill_value
#finally:
# h5f.flush()
#
#try:
# h5f.createCArray(h5f.root.aviso, 'eta', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.eta[:] = data['eta']
# h5f.root.aviso.eta._v_attrs.units = 'm'
# h5f.root.aviso.eta._v_attrs.missing_value = data['eta'].fill_value
#
# h5f.createCArray(h5f.root.aviso, 'u_anom', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.u_anom[:] = data['u_anom']
# h5f.root.aviso.u_anom._v_attrs.units = 'm'
# h5f.root.aviso.u_anom._v_attrs.missing_value = data['u_anom'].fill_value
# h5f.createCArray(h5f.root.aviso, 'v_anom', tables.Float64Atom(), (nt,ni,nj), filters=filters)
# h5f.root.aviso.v_anom[:] = data['v_anom']
# h5f.root.aviso.v_anom._v_attrs.units = 'm'
# h5f.root.aviso.v_anom._v_attrs.missing_value = data['v_anom'].fill_value
#finally:
# h5f.flush()
#
#h5f.close()
#
## ============================================================================
#logger.info("Calculating products")
#products = okuboweiss.OkuboWeiss(data, metadata['okuboweiss'], logname = metadata['log']['logname'])
#
#products = prepare_masked_array(products)
## ------------------------------------------------------------------------
#logger.info("Saving products")
#h5f = tables.openFile(os.path.join(config['data']['datadir'],filename), 'r+')
#
#
#if 'products' not in h5f.root:
# gproducts = h5f.createGroup(h5f.root, "products", "Products")
#
#h5f.createCArray(h5f.root.products, 'W', tables.Float64Atom(), (nt,ni,nj), filters=filters)
#h5f.root.products.W[:] = products['W']
#h5f.root.products.W._v_attrs.units = 's^-1'
#h5f.root.products.W._v_attrs.missing_value = products['W'].fill_value
#
#h5f.createCArray(h5f.root.products, 'zeta', tables.Float64Atom(), (nt,ni,nj), filters=filters)
#h5f.root.products.zeta[:] = products['zeta']
#h5f.root.products.zeta._v_attrs.units = 's^-1'
#h5f.root.products.zeta._v_attrs.missing_value = products['zeta'].fill_value
#
## I don't need to define my W0 at this point.
##h5f.createCArray(h5f.root.products, 'W0', tables.Float64Atom(), (1,), filters=filters)
##h5f.root.products.W0[:] = data['W0']
##h5f.root.products.W0._v_attrs.units = 's^-1'
##h5f.root.products.W0._v_attrs.missing_value = 1e20
#
##h5f.root.products._v_attrs.metadata_okuboweiss_smooth_W0 = metadata['okuboweiss']['W0']
##h5f.root.products._v_attrs.metadata_okuboweiss_smooth_scale = metadata['okuboweiss']['smooth']['scale']
##h5f.root.products._v_attrs.metadata_okuboweiss_smooth_method = metadata['okuboweiss']['smooth']['method']
#
##h5f.root.eddies._v_attrs.data_version = config['data']['data_version']
#h5f.root.products._v_attrs.created = datetime.now().isoformat()
#
#
#h5f.flush()
#h5f.close()
| AVISO | /AVISO-0.9.2.tar.gz/AVISO-0.9.2/aviso/aviso.py | aviso.py |
from argparse import ArgumentParser
from json import loads, dumps
from os import path
from sys import exit
import subprocess
from typing import Dict, List, Optional
try:
from avoscript.lexer import Lexer
from avoscript.lexer.default import ENV, ENV_CONSTS, LevelIndex
from avoscript import version, AVOSCRIPT, PKGS
from avoscript.parser import imp_parser
from avoscript.lexer.types import Signal
except ImportError:
from src.avoscript.lexer import Lexer
from src.avoscript.lexer.default import ENV, ENV_CONSTS, LevelIndex
from src.avoscript import version, AVOSCRIPT, PKGS
from src.avoscript.parser import imp_parser
from src.avoscript.lexer.types import Signal
from colorama import Fore, init
init(autoreset=True)
parser = ArgumentParser(
"avoscript",
description=f"{Fore.LIGHTRED_EX}AVOScript{Fore.RESET} {Fore.LIGHTCYAN_EX}{version}{Fore.RESET} interpreter",
)
# --== Flags ==-- #
flags = parser.add_argument_group("Flags")
flags.add_argument(
"-i", "--interactive",
help="start interactive mode",
action="store_true"
)
flags.add_argument(
"-v", "--version",
help="show avoscript version",
action="store_true"
)
flags.add_argument(
"-V", "--verbose",
help="enables verbose mode",
action="store_true",
)
# --== With args ==-- #
flags.add_argument(
"-s", "--script",
dest="script",
metavar="<src>",
help="execute script"
)
flags.add_argument(
"-f", "--file",
dest="file",
metavar="<file>",
help="execute file script"
)
package_manager = parser.add_argument_group("Package Manager")
package_manager.add_argument(
    "-nf", "--no-fetch",
    dest="no_fetch",
    help="disable fetching package data",
    # store_true (default False) so that passing the flag actually disables
    # fetching; store_false inverted the documented behavior
    action="store_true"
)
package_manager.add_argument(
"--upd",
action="store_true",
help="update packages data"
)
package_manager.add_argument(
"--upload",
action="store_true",
help="upload current project"
)
package_manager.add_argument(
"add",
nargs='*',
help="install package"
)
parser.set_defaults(
script="",
file="",
add=None,
)
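# Typical invocations (illustrative, based on the options defined above):
#   avoscript -f script.avo          # run a source file
#   avoscript -s 'echo("hello")'     # run an inline script
#   avoscript -i                     # start interactive mode
#   avoscript add <package>          # install a package
#   avoscript --upd                  # refresh the package index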
def git_clone(
url: str,
directory: str,
target_dir: Optional[str] = None
):
"""Clones repo
:param url: repo url
:param directory: current working directory
:param target_dir: dir to clone
"""
if target_dir is not None:
subprocess.run(
f'git clone --depth 1 --no-tags -q {url} {target_dir}',
shell=True, cwd=directory, stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
)
else:
subprocess.run(
f'git clone --depth 1 --no-tags -q {url}',
shell=True, cwd=directory, stdout=subprocess.DEVNULL,
stderr=subprocess.DEVNULL,
)
def fetch_pkgs(no_fetch: bool, out_file: str = 'pkgs.json') -> List[Dict[str, str]]:
"""Fetches packages data
    :param no_fetch: when True, skip fetching and use the cached pkgs.json
:param out_file: output file
:return: list of packages
"""
if not no_fetch:
print(f"{Fore.LIGHTMAGENTA_EX}Fetch packages ...{Fore.RESET}")
if not path.exists(path.join(AVOSCRIPT, 'avoscript')):
print(f"{Fore.LIGHTMAGENTA_EX}Cloning repo ...{Fore.RESET}")
git_clone('https://github.com/ethosa/avoscript.git', AVOSCRIPT)
else:
subprocess.run(
'cd avoscript && git init -q && git remote add origin https://github.com/ethosa/avoscript.git && '
'git fetch -q origin master --depth 1 --no-tags && git checkout -q origin/master -- pkgs.json && '
f'git show origin/master:pkgs.json > {out_file}',
cwd=AVOSCRIPT, shell=True
)
try:
out = None
with open(path.join(AVOSCRIPT, 'avoscript', out_file), 'r', encoding='utf-8') as f:
out = f.read()
if out is not None:
return loads(out)
return []
except FileNotFoundError:
print(f"{Fore.LIGHTRED_EX}Need to fetch!{Fore.RESET}")
return []
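# Shape of the fetched data, as written by the --upload flow below (records
# may omit optional keys):
#   [{"name": "some package", "description": "...", "github_url": "https://..."}]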
def install_package(name: str, data: List[Dict[str, str]]):
"""Install package
:param name: package name
:param data: package data
"""
print(f"Install {Fore.LIGHTMAGENTA_EX}{name}{Fore.RESET} package ...")
installed = False
_name = name.replace('-', ' ')
for i in data:
if 'name' in i and i['name'] == _name:
if 'github_url' in i:
print(f"Found {Fore.LIGHTMAGENTA_EX}Github URL{Fore.RESET}, cloning ...")
i['name'] = i['name'].replace(' ', '_')
git_clone(i['github_url'], PKGS, i['name'])
installed = True
print(
f"{Fore.LIGHTGREEN_EX}Successfully installed{Fore.RESET} "
f"{Fore.LIGHTCYAN_EX}{name}{Fore.RESET} "
f"{Fore.LIGHTGREEN_EX}package{Fore.RESET} "
)
break
else:
print(
f"{Fore.LIGHTYELLOW_EX}[WARNING]:{Fore.RESET} package "
f"{Fore.LIGHTMAGENTA_EX}{name}{Fore.RESET} hasn't github_url"
)
if not installed:
print(
f"{Fore.LIGHTRED_EX}[ERROR]:{Fore.RESET} package "
f"{Fore.LIGHTMAGENTA_EX}{name}{Fore.RESET} is not exists"
)
def main():
args = parser.parse_args()
signal = Signal()
signal.NEED_FREE = False
signal.VERBOSE = args.verbose # -V/--verbose
env = [{}]
consts = [{}]
lvl = LevelIndex()
lvl.inc()
# -v/--version flag
if args.version:
print(f"{Fore.LIGHTRED_EX}AVOScript{Fore.RESET} {Fore.LIGHTCYAN_EX}{version}{Fore.RESET}")
# --upload flag
elif args.upload:
print(f"{Fore.LIGHTYELLOW_EX}Working via Github CLI (gh){Fore.RESET}")
package_name = input(f"{Fore.LIGHTCYAN_EX}name of package: {Fore.RESET}")
package_description = input(f"{Fore.LIGHTCYAN_EX}package description: {Fore.RESET}")
github_url = input(f"{Fore.LIGHTCYAN_EX}project Github URL: {Fore.RESET}")
if not package_name:
print(f"{Fore.LIGHTRED_EX}[ERROR]:{Fore.RESET} package name is empty")
return
fetch_pkgs(False)
subprocess.run(
f'cd avoscript && git pull -q --no-tags && git checkout -b {package_name}',
cwd=AVOSCRIPT, shell=True,
stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL
)
data = []
with open(path.join(AVOSCRIPT, 'avoscript', 'pkgs.json'), 'r') as f:
data = loads(f.read())
data.append({
'name': package_name.replace('-', ' '),
'description': package_description,
'github_url': github_url
})
with open(path.join(AVOSCRIPT, 'avoscript', 'pkgs.json'), 'w') as f:
f.write(dumps(data, indent=2))
subprocess.run(
f'cd avoscript && '
f'git add pkgs.json && git commit -q -m "add `{package_name}` package" && '
f'gh pr create -t "Add `{package_name}` package" -B master -b "{package_description}" -l "new package" && '
f'git switch master && git branch -D {package_name}',
cwd=AVOSCRIPT, shell=True
)
print(f"{Fore.GREEN}PR was created{Fore.RESET}")
# --upd flag
elif args.upd:
fetch_pkgs(False)
# -i/--interactive flag
elif args.interactive:
print(
f"Welcome to {Fore.LIGHTRED_EX}AVOScript{Fore.RESET} "
f"{Fore.LIGHTCYAN_EX}{version}{Fore.RESET} interactive mode."
)
print(
f"Write {Fore.LIGHTRED_EX}exit{Fore.RESET} to shutdown interactive mode."
)
print(f"{Fore.LIGHTGREEN_EX}>>>{Fore.RESET} ", end="")
source = input()
while source != 'exit':
signal.NO_CREATE_LEVEL = True
imp_parser(Lexer.lex(source)).value.eval(env, consts, lvl, {}, signal)
print(f"{Fore.LIGHTGREEN_EX}>>>{Fore.RESET} ", end="")
source = input()
print(f"Exited via {Fore.LIGHTRED_EX}exit{Fore.RESET} command")
exit(0)
# -s/--script flag
elif args.script:
imp_parser(Lexer.lex(args.script)).value.eval(env, consts, lvl, {}, signal)
# -f/--file flag
elif args.file:
imp_parser(Lexer.lex_file(args.file)).value.eval(env, consts, lvl, {}, signal)
# add pos arg
elif args.add:
data = fetch_pkgs(args.no_fetch) # -nf/--no-fetch flag
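        # args.add also captures the literal subcommand word ('add') as its
        # first element, so the real package names start at index 1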
for i in args.add[1:]:
install_package(i, data)
if __name__ == '__main__':
    main()
| AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/avos.py | avos.py |
from codecs import decode
from typing import Tuple, List, Any
from re import findall, compile, UNICODE, VERBOSE
from colorama import Fore
from equality import AnyBase
from ..lexer import Lexer, default
from .. import parser
from ..ast import statements
def has_variable(name: str, env, consts) -> Tuple[bool, int, bool]:
"""
Finds variable or constant in environments
:param name: var/const name
:param env: variable environment
:param consts: constants environment
:return: (contains, level index, is constant)
"""
for level in range(len(env) - 1, -1, -1):
if name in env[level]:
return True, level, False
elif name in consts[level]:
return True, level, True
return False, 0, False
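# A minimal illustration (not part of the original module) of scope
# resolution: the lookup walks from the innermost level outwards.
if __name__ == '__main__':
    example_env = [{'x': 1}, {}]       # level 0 is the outermost scope
    example_consts = [{}, {'PI': 3}]
    print(has_variable('x', example_env, example_consts))   # (True, 0, False)
    print(has_variable('PI', example_env, example_consts))  # (True, 1, True)
    print(has_variable('y', example_env, example_consts))   # (False, 0, False)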
# --== AST ==-- #
class ASTExpr(AnyBase):
def __repr__(self) -> str:
return "AST expression"
def eval(self, env, consts, lvl, modules, signal):
raise RuntimeError('nothing to eval')
class NullAST(ASTExpr):
def __repr__(self) -> str:
return "NullAST()"
def eval(self, env, consts, lvl, modules, signal):
return None
class IntAST(ASTExpr):
def __init__(self, i: int):
self.i = i
def __repr__(self) -> str:
return f"IntAST({Fore.LIGHTYELLOW_EX}{self.i}{Fore.RESET})"
def eval(self, env, consts, lvl, modules, signal):
return self.i
class FloatAST(ASTExpr):
def __init__(self, f: float):
self.f = f
def __repr__(self) -> str:
return f"FloatAST({Fore.LIGHTYELLOW_EX}{self.f}{Fore.RESET})"
def eval(self, env, consts, lvl, modules, signal):
return self.f
class BoolAST(ASTExpr):
def __init__(self, b: bool):
self.b = b
def __repr__(self) -> str:
return f"BoolAST({Fore.YELLOW}{self.b}{Fore.RESET})"
def eval(self, env, consts, lvl, modules, signal):
return self.b
class StringAST(ASTExpr):
VARIABLE = r'\$[a-zA-Z][a-zA-Z0-9_]*'
EXPRESSION = r'\$\{[\S\s]*\}'
ESCAPE_SEQUENCE_RE = compile(
r'(\\U........|\\u....|\\x..|\\[0-7]{1,3}|\\N\{[^}]+\}|\\[\\\'"abfnrtv])', UNICODE | VERBOSE
)
def __init__(self, s: str):
self.s = s
def __repr__(self) -> str:
return f'StringAST({Fore.LIGHTGREEN_EX}"{self.s}"{Fore.RESET})'
def decode(self, s: str = None) -> str:
def decode_match(match):
return decode(match.group(0), 'unicode-escape')
if s is None:
return StringAST.ESCAPE_SEQUENCE_RE.sub(decode_match, self.s)
return StringAST.ESCAPE_SEQUENCE_RE.sub(decode_match, s)
def eval(self, env, consts, lvl, modules, signal):
result = self.s
for m in findall(StringAST.VARIABLE, result):
result = result.replace(m, str(VarAST(m[1:]).eval(env, consts, lvl, modules, signal)))
for m in findall(StringAST.EXPRESSION, result):
result = result.replace(
m,
str(parser.expr()(Lexer.lex(m[2:-1]), 0).value.eval(env, consts, lvl, modules, signal))
)
return self.decode(result)
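# Usage sketch (not part of the original module): `decode` expands literal
# escape sequences kept in the raw source text; eval() additionally
# interpolates $variables and ${expressions} before decoding.
if __name__ == '__main__':
    print(StringAST('col1\\tcol2').decode())  # prints "col1", a real tab, "col2"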
class ArrayAST(ASTExpr):
def __init__(self, arr: List[Any]):
        # arr is None when the literal is empty ([]), so guard against it
        self.arr = [i.value[0] for i in arr] if arr else []
def __repr__(self) -> str:
return f"ArrayAST({self.arr})"
def eval(self, env, consts, lvl, modules, signal):
return [i.eval(env, consts, lvl, modules, signal) for i in self.arr]
class GeneratorAST(ASTExpr):
def __init__(self, val, var, obj, condition):
self.val = val
self.var = var
self.obj = obj
self.condition = condition
def __repr__(self) -> str:
return f"GeneratorAST({self.val}, {self.var}, {self.obj}, {self.condition})"
def eval(self, env, consts, lvl, modules, signal):
result = []
env.append({})
consts.append({})
lvl.inc()
for i in self.obj.eval(env, consts, lvl, modules, signal):
env[lvl][self.var] = i
if self.condition is not None:
if self.condition.eval(env, consts, lvl, modules, signal):
result.append(self.val.eval(env, consts, lvl, modules, signal))
else:
result.append(self.val.eval(env, consts, lvl, modules, signal))
lvl.dec()
consts.pop()
env.pop()
return result
class VarAST(ASTExpr):
def __init__(self, var_name):
self.var_name = var_name
def __repr__(self) -> str:
return f"VarAST({self.var_name})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = has_variable(self.var_name, env, consts)
if has_var:
if is_const:
return consts[level][self.var_name]
else:
return env[level][self.var_name]
signal.ERROR = f'{self.var_name} was used before assign'
class ModuleCallAST(ASTExpr):
def __init__(self, module_name, module_obj):
self.name = module_name
self.obj = module_obj
def __repr__(self) -> str:
return f"ModuleCallAST({self.name}, {self.obj})"
def eval(self, env, consts, lvl, modules, signal):
in_built_in = (self.name, self.obj) in default.BUILTIN
if self.name in modules:
if self.obj not in env[modules[self.name]]:
signal.ERROR = f"unknown module object {self.obj}"
return
elif in_built_in:
return None
return env[modules[self.name]][self.obj]
elif in_built_in:
return None
signal.ERROR = f"unknown module {self.obj}"
class ClassPropAST(ASTExpr):
def __init__(self, name, prop, is_super):
self.name = name
self.prop = prop
self.is_super = is_super
def __repr__(self) -> str:
return f"ClassPropAST({self.name}, {self.prop})"
def eval(self, env, consts, lvl, modules, signal):
if signal.IN_CLASS and signal.CURRENT_CLASS and self.name == 'this':
self.name = signal.CURRENT_CLASS
has_var, level, is_const = has_variable(self.name, env, consts)
if has_var:
obj = env[level][self.name]
result = None
if self.is_super and obj['parent'] is not None:
obj = obj['parent']
if self.prop in obj['env']:
result = obj['env'][self.prop]
if self.prop in obj['consts_env']:
result = obj['consts_env'][self.prop]
while obj['parent'] and result is None:
obj = obj['parent']
if self.prop in obj['env']:
result = obj['env'][self.prop]
break
                if self.prop in obj['consts_env']:
                    result = obj['consts_env'][self.prop]
break
if result is not None:
if obj['prefix'] == 'abstract':
print(f'[WARNING]: {self.prop} is abstract property')
return result
signal.ERROR = f"unknown property {self.prop} of {self.name}"
else:
signal.ERROR = f"unknown class {self.name}"
class ArgumentAST(ASTExpr):
"""serves FuncStmt/CallStmt arguments"""
def __init__(self, name, value):
self.name = name
self.value = value
def __repr__(self) -> str:
return f"ArgumentAST({self.name}, {self.value})"
def eval(self, env, consts, lvl, modules, signal):
return self.name, self.value
class BraceAST(ASTExpr):
"""serves array[index]"""
def __init__(self, obj, v):
self.obj = obj
self.v = v
def __repr__(self) -> str:
return f"BraceAST({self.v})"
def eval(self, env, consts, lvl, modules, signal):
result = None
if isinstance(self.obj, str):
result = VarAST(self.obj).eval(env, consts, lvl, modules, signal)
elif isinstance(self.obj, (ArrayAST, StringAST, statements.CallStmt, ModuleCallAST, ClassPropAST)):
result = self.obj.eval(env, consts, lvl, modules, signal)
if result is not None:
for i in self.v:
result = result[i.eval(env, consts, lvl, modules, signal)]
if result is not None:
return result
signal.ERROR = f"{self.obj.eval(env, consts, lvl, modules, signal)} isn't indexed"
class BinOpAST(ASTExpr):
def __init__(self, op, left, r):
self.op = op
self.left = left
self.r = r
def __repr__(self) -> str:
return f"BinOpAST({self.op}, {self.left}, {self.r})"
def eval(self, env, consts, lvl, modules, signal):
r_val = self.r.eval(env, consts, lvl, modules, signal)
l_val = self.left.eval(env, consts, lvl, modules, signal)
match self.op:
case '*':
return l_val * r_val
case '/':
return l_val / r_val
case '-':
return l_val - r_val
case '+':
return l_val + r_val
case '%':
return l_val % r_val
case _:
signal.ERROR = f'unknown operation {self.op}'
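# Usage sketch (not part of the original module): evaluating a small tree,
# 2 + 3 * 4. Literal nodes ignore the scope arguments, so empty stacks and a
# None signal are enough on this happy path.
if __name__ == '__main__':
    tree = BinOpAST('+', IntAST(2), BinOpAST('*', IntAST(3), IntAST(4)))
    print(tree.eval([], [], 0, {}, None))  # 14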
class UnaryOpAST(ASTExpr):
def __init__(self, op, expr):
self.op = op
self.expr = expr
def __repr__(self) -> str:
return f"UnaryOpAST({self.op}, {self.expr})"
def eval(self, env, consts, lvl, modules, signal):
match self.op:
case '++':
binop = BinOpAST('+', self.expr, IntAST(1))
if isinstance(self.expr, VarAST):
assign_stmt = statements.AssignStmt(self.expr.var_name, binop)
assign_stmt.eval(env, consts, lvl, modules, signal)
return self.expr.eval(env, consts, lvl, modules, signal)
return binop.eval(env, consts, lvl, modules, signal)
case '--':
binop = BinOpAST('-', self.expr, IntAST(1))
if isinstance(self.expr, VarAST):
assign_stmt = statements.AssignStmt(self.expr.var_name, binop)
assign_stmt.eval(env, consts, lvl, modules, signal)
return self.expr.eval(env, consts, lvl, modules, signal)
return binop.eval(env, consts, lvl, modules, signal)
case '-':
return -(self.expr.eval(env, consts, lvl, modules, signal))
case _:
signal.ERROR = f"unknown unary operation: {self.op}"
class TernaryOpAST(ASTExpr):
def __init__(self, first, op1, second, op2, third):
self.first = first
self.second = second
self.third = third
self.op1 = op1
self.op2 = op2
def __repr__(self) -> str:
return f"TernaryOpAST({self.first}, {self.op1}, {self.second}, {self.op2}, {self.third})"
def eval(self, env, consts, lvl, modules, signal):
if self.op1 == '?' and self.op2 == ':':
if self.first.eval(env, consts, lvl, modules, signal):
return self.second.eval(env, consts, lvl, modules, signal)
return self.third.eval(env, consts, lvl, modules, signal)
elif self.op1 == 'if' and self.op2 == 'else':
if self.second.eval(env, consts, lvl, modules, signal):
return self.first.eval(env, consts, lvl, modules, signal)
return self.third.eval(env, consts, lvl, modules, signal)
signal.ERROR = f"unknown ternary operator {self.op1}, {self.op2}"
# --== Binary operations ==-- #
class BinOpExpr(AnyBase):
def eval(self, env, consts, lvl, modules, signal):
raise RuntimeError('unknown binary operation')
class RelativeOp(BinOpExpr):
def __init__(self, op, left, r):
self.op = op
self.left = left
self.r = r
def __repr__(self) -> str:
return f"RelOp({self.left}, {self.op}, {self.r})"
def eval(self, env, consts, lvl, modules, signal):
r_val = self.r.eval(env, consts, lvl, modules, signal)
l_val = self.left.eval(env, consts, lvl, modules, signal)
match self.op:
case '==':
return l_val == r_val
case '!=':
return l_val != r_val
case '>':
return l_val > r_val
case '<':
return l_val < r_val
case '>=':
return l_val >= r_val
case '<=':
return l_val <= r_val
case _:
signal.ERROR = f'unknown operation {self.op}'
class AndOp(BinOpExpr):
def __init__(self, left, r):
self.left = left
self.r = r
def __repr__(self) -> str:
return f"AndOp({self.left}, {self.r})"
def eval(self, env, consts, lvl, modules, signal):
return (
self.left.eval(env, consts, lvl, modules, signal) and
self.r.eval(env, consts, lvl, modules, signal)
)
class OrOp(BinOpExpr):
def __init__(self, left, r):
self.left = left
self.r = r
def __repr__(self) -> str:
return f"OrOp({self.left}, {self.r})"
def eval(self, env, consts, lvl, modules, signal):
return (
self.left.eval(env, consts, lvl, modules, signal) or
self.r.eval(env, consts, lvl, modules, signal)
)
class InOp(BinOpExpr):
def __init__(self, left, r):
self.left = left
self.r = r
def __repr__(self) -> str:
return f"InOp({self.left}, {self.r})"
def eval(self, env, consts, lvl, modules, signal):
return (
self.left.eval(env, consts, lvl, modules, signal) in
self.r.eval(env, consts, lvl, modules, signal)
)
class NotOp(BinOpExpr):
def __init__(self, expr):
self.expr = expr
def eval(self, env, consts, lvl, modules, signal):
        return not self.expr.eval(env, consts, lvl, modules, signal)
| AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/ast/expressions.py | expressions.py |
import math
import os
from os import path
import traceback
from typing import Union
from copy import deepcopy
from os.path import exists, isfile
from equality import AnyBase
from colorama import Fore
from ..lexer import Lexer, default
from .. import parser
from ..ast import expressions
from .. import STD, PKGS
class Stmt(AnyBase):
def eval(self, env, consts, lvl, modules, signal):
raise RuntimeError("unknown statement")
class StmtList(Stmt):
def __init__(self, statements):
self.statements = [i.value for i in statements]
def __repr__(self) -> str:
return f"StmtList({', '.join([repr(i) for i in self.statements])})"
def __iter__(self):
for stmt in self.statements:
yield stmt
def eval(self, env, consts, lvl, modules, signal):
if not default.BUILTIN_BUILD:
default.BUILTIN_BUILD = True
for attr in dir(math):
a = getattr(math, attr)
if callable(a) and not attr.startswith('_'):
default.BUILTIN[('math', attr)] = a
in_main = False
in_module = signal.IN_MODULE
if in_module:
signal.IN_MODULE = False
if not signal.IN_MAIN:
signal.IN_MAIN = True
in_main = True
if not signal.NO_CREATE_LEVEL:
lvl.inc()
env.append({})
consts.append({})
# Arguments (if in function)
if signal.IN_FUNCTION and signal.ARGUMENTS:
for n, v in signal.ARGUMENTS.items():
env[lvl][n.name] = v.value.eval(env, consts, lvl, modules, signal)
signal.ARGUMENTS = None
if signal.IN_FUNCTION and signal.KW_ARGUMENTS:
for v in signal.KW_ARGUMENTS:
env[lvl][v.name] = v.value.eval(env, consts, lvl, modules, signal)
signal.KW_ARGUMENTS = None
# Statements
result = None
for stmt in self.statements:
if signal.VERBOSE:
print(f'{Fore.CYAN}[STATEMENT]{Fore.RESET}: {stmt}')
try:
result = stmt.eval(env, consts, lvl, modules, signal)
except Exception as e:
traceback.print_exc()
signal.ERROR = e
if signal.ERROR is not None:
if not signal.IN_TRY:
print(f'RuntimeError: {signal.ERROR} in module "{signal.CURRENT_MODULE}"')
exit(0)
break
if (signal.BREAK or signal.CONTINUE) and signal.IN_CYCLE:
break
if signal.RETURN and signal.IN_FUNCTION:
break
if not signal.NO_CREATE_LEVEL and lvl not in modules.values() and not in_module:
lvl.dec()
env.pop()
consts.pop()
if signal.IN_MAIN and in_main:
signal.IN_MAIN = False
if isinstance(self.statements[-1], EOFStmt):
self.statements[-1].eval(env, consts, lvl, modules, signal)
return
return result
class AssignStmt(Stmt):
def __init__(
self,
name: str,
a_expr: Union[Stmt, 'expressions.ASTExpr', 'expressions.BinOpExpr'],
is_const: bool = False,
is_assign: bool = False,
assign_op: str = '='
):
self.name = name
self.a_expr = a_expr
self.is_const = is_const
self.is_assign = is_assign
self.assign_op = assign_op
def __repr__(self) -> str:
return f"AssignStmt({self.name}, {self.a_expr})"
def __assign_operation(self, val, signal):
if not self.is_assign:
name = self.name
if isinstance(name, str):
name = expressions.VarAST(name)
match self.assign_op:
case '*=':
val = expressions.BinOpAST('*', name, val)
case '/=':
val = expressions.BinOpAST('/', name, val)
case '+=':
val = expressions.BinOpAST('+', name, val)
case '-=':
val = expressions.BinOpAST('-', name, val)
case '=':
pass
case _:
signal.ERROR = f"unknown operator {self.assign_op}"
return val
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
val = self.__assign_operation(self.a_expr, signal)
if self.is_assign:
# Assign var/const
if self.assign_op != '=':
signal.ERROR = f"{self.name} isn't assigned"
return
if has_var and level == lvl:
signal.ERROR = f"{self.name} is assigned"
return
if self.is_const:
consts[lvl][self.name] = val.eval(env, consts, lvl, modules, signal)
else:
env[lvl][self.name] = val.eval(env, consts, lvl, modules, signal)
elif has_var:
# Reassign
if is_const:
signal.ERROR = f'cannot change constant {self.name}'
return
env[level][self.name] = val.eval(env, consts, lvl, modules, signal)
elif isinstance(self.name, expressions.BraceAST):
result = None
obj = self.name
if isinstance(obj.obj, str):
result = expressions.VarAST(obj.obj).eval(env, consts, lvl, modules, signal)
elif isinstance(
obj.obj,
(expressions.ArrayAST, expressions.StringAST, CallStmt,
expressions.ModuleCallAST, expressions.ClassPropAST)
):
result = obj.obj.eval(env, consts, lvl, modules, signal)
if result is not None:
for i in obj.v[:-1]:
i = i.eval(env, consts, lvl, modules, signal)
result = result[i.eval(env, consts, lvl, modules, signal)]
if result is not None:
i = obj.v[-1].eval(env, consts, lvl, modules, signal)
if i == len(result):
result.append(val.eval(env, consts, lvl, modules, signal))
else:
result[i] = val.eval(env, consts, lvl, modules, signal)
elif isinstance(self.name, expressions.ModuleCallAST):
module = self.name
if module.name not in modules:
signal.ERROR = f"unknown module {module.name}"
return
if module.obj in env[modules[module.name]]:
env[modules[module.name]][module.obj] = val.eval(env, consts, lvl, modules, signal)
elif module.obj in consts[modules[module.name]]:
                signal.ERROR = f'{module.name}.{module.obj} is constant'
else:
signal.ERROR = f"unknown module property {module.obj}"
return
elif isinstance(self.name, expressions.ClassPropAST):
obj = self.name
if signal.IN_CLASS and signal.CURRENT_CLASS and obj.name == 'this':
obj.name = signal.CURRENT_CLASS
has_var, level, is_const = expressions.has_variable(obj.name, env, consts)
if has_var and not is_const:
var = env[level][obj.name]
if obj.is_super and var['parent'] is not None:
var = var['parent']
if obj.prop in var['env']:
var['env'][obj.prop] = val.eval(env, consts, lvl, modules, signal)
return
if obj.prop in var['consts_env']:
var['consts_env'][obj.prop] = val.eval(env, consts, lvl, modules, signal)
return
while var['parent']:
var = var['parent']
if obj.prop in var['env']:
var['env'][obj.prop] = val.eval(env, consts, lvl, modules, signal)
return
if obj.prop in var['consts_env']:
var['consts_env'][obj.prop] = val.eval(env, consts, lvl, modules, signal)
return
env[level][obj.name]['env'] = val.eval(env, consts, lvl, modules, signal)
else:
signal.ERROR = f"unknown class {obj.name}"
else:
signal.ERROR = f"{self.name} isn't assigned"
class AssignClassStmt(Stmt):
def __init__(self, name, body, inherit, prefix, interfaces):
self.name = name
self.body = body
self.inherit = inherit
self.prefix = prefix
self.interfaces = interfaces
def __repr__(self) -> str:
return f"AssignClassStmt({self.prefix + ' ' if self.prefix else ''}{self.name}, {self.inherit}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
if not has_var:
signal.NO_CREATE_LEVEL = True
env.append({})
consts.append({})
lvl.inc()
self.body.eval(env, consts, lvl, modules, signal)
if self.inherit:
has_var, level, is_const = expressions.has_variable(self.inherit, env, consts)
if has_var:
self.inherit = env[level][self.inherit]
else:
signal.ERROR = f"unknown inherit class {self.inherit}"
return
must_have_data = []
# implemented interfaces
for interface in self.interfaces:
h, l, c = expressions.has_variable(interface, env, consts)
if h:
interface = env[l][interface]
must_have_data += [i for i in interface['env'].keys() if i not in must_have_data]
must_have_data += [i for i in interface['consts_env'].keys() if i not in must_have_data]
else:
signal.ERROR = f"unknown interface {interface} of class {self.name}"
return
env[lvl - 1][self.name] = {
'parent': self.inherit,
'env': deepcopy(env[lvl]),
'consts_env': deepcopy(consts[lvl]),
'name': self.name,
'prefix': self.prefix,
'must_have_data': must_have_data
}
parent = self.inherit
if parent:
# what should be implemented?
prefix = parent['prefix']
must_have_data += [i for i in parent['must_have_data'] if i not in must_have_data]
if prefix == 'abstract':
must_have_data += [i for i in parent['env'].keys() if i not in must_have_data]
must_have_data += [i for i in parent['consts_env'].keys() if i not in must_have_data]
while parent['parent']:
parent = parent['parent']
must_have_data += [i for i in parent['must_have_data'] if i not in must_have_data]
if prefix == 'abstract':
must_have_data += [i for i in parent['env'].keys() if i not in must_have_data]
must_have_data += [i for i in parent['consts_env'].keys() if i not in must_have_data]
# what is implemented
for data in must_have_data:
obj = env[lvl - 1][self.name]
prefix = obj['prefix']
if (data in obj['env'] or data in obj['consts_env']) and prefix != 'abstract':
must_have_data.remove(data)
continue
while obj['parent']:
obj = obj['parent']
prefix = obj['prefix']
if (data in obj['env'] or data in obj['consts_env']) and prefix != 'abstract':
must_have_data.remove(data)
break
lvl.dec()
env.pop()
consts.pop()
signal.NO_CREATE_LEVEL = False
if len(must_have_data) > 0:
print(f"[WARNING]: {', '.join(must_have_data)} isn't implemented in {self.name}")
else:
print(has_var, level, env[level][self.name])
signal.ERROR = f"class {self.name} is assigned"
class InterfaceStmt(Stmt):
def __init__(self, name, body):
self.name = name
self.body = body
def __repr__(self) -> str:
return f"InterfaceStmt({self.name}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
if not has_var:
signal.NO_CREATE_LEVEL = True
env.append({})
consts.append({})
lvl.inc()
self.body.eval(env, consts, lvl, modules, signal)
env[lvl - 1][self.name] = {
'env': deepcopy(env[lvl]),
'consts_env': deepcopy(consts[lvl]),
'name': self.name,
'parent': None,
'prefix': None,
}
lvl.dec()
env.pop()
consts.pop()
signal.NO_CREATE_LEVEL = False
else:
signal.ERROR = f"{self.name} is assigned"
class EnumStmt(Stmt):
def __init__(self, name, body):
self.start = 0
self.name = name
self.body = body
def __repr__(self) -> str:
return f"EnumStmt({self.name}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
if not has_var:
signal.NO_CREATE_LEVEL = True
signal.ENUM_COUNTER = 0
env.append({})
consts.append({})
lvl.inc()
for stmt in self.body:
stmt.eval(env, consts, lvl, modules, signal)
env[lvl - 1][self.name] = {
'env': deepcopy(env[lvl]),
'consts_env': deepcopy(consts[lvl]),
'name': self.name,
'parent': None,
'prefix': None,
}
lvl.dec()
env.pop()
consts.pop()
signal.ENUM_COUNTER = None
signal.NO_CREATE_LEVEL = False
else:
signal.ERROR = f"{self.name} is assigned"
class EnumLetStmt(Stmt):
def __init__(self, name, value):
self.name = name
self.value = value
def __repr__(self) -> str:
return f"EnumLetStmt({self.name}, {self.value})"
def eval(self, env, consts, lvl, modules, signal):
if signal.ENUM_COUNTER is not None:
if self.name in consts[lvl]:
signal.ERROR = f"{self.name} is assigned"
return
if self.value is None:
consts[lvl][self.name] = signal.ENUM_COUNTER
signal.ENUM_COUNTER += 1
else:
val = self.value.eval(env, consts, lvl, modules, signal)
consts[lvl][self.name] = val
if isinstance(self.value, expressions.IntAST):
signal.ENUM_COUNTER = val
else:
signal.ERROR = f"enum lets should be in enum statement"
class InitClassStmt(Stmt):
def __init__(self, args, body):
self.args = args
self.body = body
def __repr__(self) -> str:
return f"InitClassStmt({self.args})"
def eval(self, env, consts, lvl, modules, signal):
if None in env[lvl]:
signal.ERROR = "this class equals init function"
return
env[lvl][None] = (self.args, self.body)
class IfStmt(Stmt):
def __init__(self, condition, body, elif_array, else_body):
self.condition = condition
self.body = body
self.elif_array = elif_array
self.else_body = else_body
def __repr__(self) -> str:
return f"IfStmt({self.condition}, {self.body}, {self.else_body})"
def eval(self, env, consts, lvl, modules, signal):
condition = self.condition.eval(env, consts, lvl, modules, signal)
else_statement = True
if condition:
self.body.eval(env, consts, lvl, modules, signal)
else:
for i in self.elif_array:
(((_, condition), _), stmt_list), _ = i
if condition.eval(env, consts, lvl, modules, signal):
stmt_list.eval(env, consts, lvl, modules, signal)
else_statement = False
break
if self.else_body and else_statement:
self.else_body.eval(env, consts, lvl, modules, signal)
class SwitchCaseStmt(Stmt):
def __init__(self, var, cases):
self.var = var
self.cases = cases
def __repr__(self) -> str:
return f"SwitchCaseStmt({self.var}, {self.cases})"
def eval(self, env, consts, lvl, modules, signal):
var = self.var.eval(env, consts, lvl, modules, signal)
result = None
for c in self.cases:
if isinstance(c, CaseStmt):
if c.condition:
val = c.condition.eval(env, consts, lvl, modules, signal)
if val == var:
result = c.body.eval(env, consts, lvl, modules, signal)
break
elif isinstance(val, (tuple, list)) and var in val:
result = c.body.eval(env, consts, lvl, modules, signal)
break
else:
result = c.body.eval(env, consts, lvl, modules, signal)
break
return result
class CaseStmt(Stmt):
def __init__(self, condition, body):
self.condition = condition
self.body = body
def __repr__(self) -> str:
return f"CaseStmt({self.condition}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
pass
class WhileStmt(Stmt):
def __init__(self, condition, body):
self.condition = condition
self.body = body
def __repr__(self) -> str:
return f"WhileStmt({self.condition}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
condition = self.condition.eval(env, consts, lvl, modules, signal)
while condition:
            # mark the cycle before evaluating the body so that nested
            # break/continue statements are honored on the first iteration
            signal.IN_CYCLE = True
            self.body.eval(env, consts, lvl, modules, signal)
            if signal.IN_CYCLE:
if signal.CONTINUE:
signal.CONTINUE = False
continue
elif signal.BREAK:
break
if signal.RETURN and signal.IN_FUNCTION:
break
condition = self.condition.eval(env, consts, lvl, modules, signal)
signal.IN_CYCLE = False
signal.BREAK = False
signal.CONTINUE = False
class ForStmt(Stmt):
def __init__(self, var, cond, action, body):
self.var = var # VarAST or AssignStmt
self.cond = cond # BinOpExpr or VarAst/ArrayAST/CallStmt
self.action = action # AssignStmt(is_assign=False) or StmtList
self.body = body # StmtList or None
def __repr__(self) -> str:
return f"ForStmt({self.var}, {self.cond}, {self.action}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
env.append({})
consts.append({})
lvl.inc()
if self.body: # for i = 0; i < 10; ++i; {}
self.var.eval(env, consts, lvl, modules, signal)
condition = self.cond.eval(env, consts, lvl, modules, signal)
while condition:
                signal.IN_CYCLE = True
                self.body.eval(env, consts, lvl, modules, signal)
if signal.IN_CYCLE:
if signal.CONTINUE:
signal.CONTINUE = False
continue
if signal.BREAK:
break
if signal.IN_FUNCTION and signal.RETURN:
break
self.action.eval(env, consts, lvl, modules, signal)
condition = self.cond.eval(env, consts, lvl, modules, signal)
else: # for i in arr {}
            for i in self.cond.eval(env, consts, lvl, modules, signal):
                env[lvl][self.var] = i
                signal.IN_CYCLE = True
                self.action.eval(env, consts, lvl, modules, signal)
                # honor break/continue raised inside the loop body
                if signal.CONTINUE:
                    signal.CONTINUE = False
                    continue
                if signal.BREAK:
                    break
lvl.dec()
env.pop()
consts.pop()
signal.IN_CYCLE = False
signal.BREAK = False
signal.CONTINUE = False
class BreakStmt(Stmt):
def __repr__(self) -> str:
return "BreakStmt"
def eval(self, env, consts, lvl, modules, signal):
signal.BREAK = True
class ContinueStmt(Stmt):
def __repr__(self) -> str:
return "ContinueStmt"
def eval(self, env, consts, lvl, modules, signal):
signal.CONTINUE = True
class TryCatchStmt(Stmt):
def __init__(self, try_body, e_name, catch_body):
self.try_body = try_body
self.e_name = e_name
self.catch_body = catch_body
def __repr__(self) -> str:
return f"TryCatchStmt({self.try_body}, {self.e_name}, {self.catch_body})"
def eval(self, env, consts, lvl, modules, signal):
signal.IN_TRY = True
self.try_body.eval(env, consts, lvl, modules, signal)
signal.IN_TRY = False
if signal.ERROR is not None:
signal.NO_CREATE_LEVEL = True
env.append({})
consts.append({})
lvl.inc()
env[lvl][self.e_name] = signal.ERROR
signal.ERROR = None
            self.catch_body.eval(env, consts, lvl, modules, signal)
            lvl.dec()
            env.pop()
            consts.pop()
            # restore normal level creation for the statements that follow
            signal.NO_CREATE_LEVEL = False
class EchoStmt(Stmt):
def __init__(self, data):
self.data = data
def __repr__(self) -> str:
return f"EchoStmt({self.data})"
def eval(self, env, consts, lvl, modules, signal):
if isinstance(self.data, (Stmt, expressions.ASTExpr, expressions.BinOpExpr)):
val = self.data.eval(env, consts, lvl, modules, signal)
if isinstance(val, tuple) and len(val) == 4:
print(f"class {val[3]}")
else:
print(val)
elif isinstance(self.data, (list, tuple)):
for i in self.data:
val = i.eval(env, consts, lvl, modules, signal)
if isinstance(val, tuple) and len(val) == 4:
print(f"class {val[3]}", end=" ")
else:
print(val, end=" ")
print()
else:
print(self.data)
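# Usage sketch (not part of the original module): echoing a literal prints it;
# None stands in for the signal, which is unused on this path.
if __name__ == '__main__':
    EchoStmt(expressions.StringAST('hello')).eval([{}], [{}], 0, {}, None)  # prints: hello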
class ReadStmt(Stmt):
def __init__(self, text):
self.text = text
def __repr__(self) -> str:
return f"ReadStmt({self.text})"
def eval(self, env, consts, lvl, modules, signal):
if isinstance(self.text, expressions.ASTExpr):
return input(self.text.eval(env, consts, lvl, modules, signal))
elif isinstance(self.text, str):
return self.text
class FuncStmt(Stmt):
def __init__(self, name, args, body):
self.name = name
self.args = args
self.body = body
self.decorators = None
def __repr__(self) -> str:
return f"FuncStmt({self.name}, {self.args}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
if has_var and not is_const and level == lvl:
signal.ERROR = f"Function {self.name} is exists"
return
env[lvl][self.name] = (self.args, self.body, self.decorators, self.name)
class DecoratorStmt(Stmt):
def __init__(self, names, function):
self.names = names
self.function = function
def __repr__(self) -> str:
return f"DecoratorStmt({self.names}, {self.function})"
def eval(self, env, consts, lvl, modules, signal):
for name in self.names:
obj = None
if isinstance(name, str):
obj = expressions.VarAST(name).eval(env, consts, lvl, modules, signal)
elif isinstance(name, (expressions.ClassPropAST, expressions.ModuleCallAST)):
obj = name.eval(env, consts, lvl, modules, signal)
elif isinstance(name, CallStmt):
obj = (name.name, name.args)
if obj is not None:
if isinstance(obj, tuple):
if self.function.decorators is None:
self.function.decorators = [{
'name': name,
'func_name': self.function.name
}]
else:
self.function.decorators.append({
'name': name,
'func_name': self.function.name
})
else:
signal.ERROR = f"{name} is not function"
return
else:
signal.ERROR = f"{name} is not assigned"
return
self.function.eval(env, consts, lvl, modules, signal)
class LambdaStmt(Stmt):
def __init__(self, args, body):
self.args = args
self.body = body
def __repr__(self) -> str:
return f"LambdaStmt({self.args}, {self.body})"
def eval(self, env, consts, lvl, modules, signal):
return self.args, self.body, None, None
class CallStmt(Stmt):
def __init__(self, name, args):
self.name = name
self.args = args
def __repr__(self) -> str:
return f"CallStmt({self.name}, {self.args})"
def eval(self, env, consts, lvl, modules, signal):
has_var, level, is_const = expressions.has_variable(self.name, env, consts)
f = None
init_obj = None
current_class = signal.CURRENT_CLASS
in_class = signal.IN_CLASS
if has_var and not is_const:
f = env[level][self.name]
if isinstance(f, dict): # class
if not signal.IN_CLASS:
signal.CURRENT_CLASS = self.name
signal.IN_CLASS = True
init_obj = f
if None in f['env']:
f = f['env'][None]
else:
f = ([], StmtList([]))
elif isinstance(self.name, expressions.ModuleCallAST):
f = self.name.eval(env, consts, lvl, modules, signal)
elif isinstance(self.name, expressions.ClassPropAST):
if not signal.IN_CLASS and self.name.name != 'this':
signal.CURRENT_CLASS = self.name.name
signal.IN_CLASS = True
f = self.name.eval(env, consts, lvl, modules, signal)
if f is not None:
if len(f) == 4 and f[2] is not None:
index = 0
for i in f[2][::-1]:
if i['func_name'] == f[3] and f[3] != self.name:
break
if index > 0:
func = f
if isinstance(func, tuple):
func = LambdaStmt(f[0], f[1])
else:
signal.ERROR = f"is not function"
else:
func = expressions.VarAST(i['func_name'])
if isinstance(i['name'], CallStmt):
i['name'].args.insert(0, expressions.ArgumentAST(None, func))
f = i['name'].eval(env, consts, lvl, modules, signal)
i['name'].args.pop(0)
elif isinstance(i['name'], str):
call = CallStmt(
i['name'], [expressions.ArgumentAST(None, func)]
)
f = call.eval(env, consts, lvl, modules, signal)
index += 1
args = [i for i in self.args if i.name is None]
fargs = [i for i in f[0] if i.value is None]
kwargs = [i for i in self.args if i.name is not None]
fkwargs = [i for i in f[0] if i.value is not None]
if len(args) != len(fargs):
                signal.ERROR = (
                    f"function {self.name} expected {len(fargs)} arguments, but got {len(args)}"
                )
return
signal.ARGUMENTS = {n: v for n, v in zip(fargs, args)}
signal.KW_ARGUMENTS = fkwargs + kwargs
if not signal.IN_FUNCTION:
signal.IN_FUNCTION = True
f[1].eval(env, consts, lvl, modules, signal)
signal.IN_FUNCTION = False
else:
f[1].eval(env, consts, lvl, modules, signal)
if init_obj: # initialized class
val = deepcopy(init_obj)
signal.RETURN_VALUE = val
signal.RETURN = False
returned = signal.RETURN_VALUE
signal.RETURN_VALUE = None
signal.IN_CLASS = False
signal.CURRENT_CLASS = None
if in_class:
signal.IN_CLASS = True
if current_class is not None:
signal.CURRENT_CLASS = current_class
return returned
else:
args = [
i.value.eval(env, consts, lvl, modules, signal)
for i in self.args if i.name is None
]
kwargs = {
i.name: i.value.eval(env, consts, lvl, modules, signal)
for i in self.args if i.name is not None
}
if isinstance(self.name, str):
if self.name in default.BUILTIN:
returned = default.BUILTIN[self.name](*args, **kwargs)
return returned
elif isinstance(self.name, expressions.ModuleCallAST):
val = (self.name.name, self.name.obj)
if val in default.BUILTIN:
returned = default.BUILTIN[val](*args, **kwargs)
return returned
signal.ERROR = f"function {self.name} isn't available"
class ReturnStmt(Stmt):
def __init__(self, val):
self.val = val
def __repr__(self) -> str:
return f"ReturnStmt({self.val})"
def eval(self, env, consts, lvl, modules, signal):
signal.RETURN = True
signal.RETURN_VALUE = self.val.eval(env, consts, lvl, modules, signal)
class ImportStmt(Stmt):
PATHS = [
path.curdir,
STD,
PKGS
]
def __init__(self, module_name, objects, from_import):
self.module_name = module_name
self.objects = objects
self.from_import = from_import
def __repr__(self) -> str:
return f"ImportStmt({self.module_name}, {self.objects}, {self.from_import})"
@staticmethod
def _find(file) -> bool:
return exists(file) and isfile(file)
def eval(self, env, consts, lvl, modules, signal):
current_module = signal.CURRENT_MODULE
current_dir = os.getcwd()
        if self.module_name is not None:
            module_name = None
            for i in self.PATHS:
                # Find module.avo or module/init.avo
                candidate = path.join(i, self.module_name + '.avo')
                if self._find(candidate):
                    module_name = candidate
                    break
                candidate = path.join(i, self.module_name, 'init.avo')
                if self._find(candidate):
                    module_name = candidate
                    break
            # module_name stays None only when no candidate file was found
            if module_name is None:
                signal.ERROR = f"{self.module_name} doesn't exist"
                return
# Change current dir to module dir
os.chdir(path.dirname(path.abspath(module_name)))
statements = parser.stmt_list()(Lexer.lex_file(module_name), 0)
if statements:
signal.CURRENT_MODULE = module_name
signal.IN_MODULE = True
modules[self.module_name] = lvl+1
statements.value.eval(env, consts, lvl, modules, signal)
if self.from_import:
environment = [i for i in env[modules[self.module_name]].keys()]
constants = [i for i in consts[modules[self.module_name]].keys()]
for k in environment + constants:
if k not in self.objects:
del env[modules[self.module_name]][k]
else:
while len(self.objects) > 0:
self.module_name = self.objects.pop(0)
self.eval(env, consts, lvl, modules, signal)
signal.CURRENT_MODULE = current_module
os.chdir(current_dir)
class EOFStmt(Stmt):
def __repr__(self) -> str:
return "EOFStmt()"
def eval(self, env, consts, lvl, modules, signal):
if signal.IN_MAIN or not signal.NEED_FREE:
return
env.clear()
consts.clear()
modules.clear()
lvl.i = 0
        signal.refresh()
| AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/ast/statements.py | statements.py |
from enum import Enum
from typing import Tuple, NewType
import sys
class TokenType(Enum):
EOF = 0
NEW_LINE = 1
SPACE = 2
RESERVED = 3
OPERATOR = 4
INT = 5
FLOAT = 6
BOOL = 7
STRING = 8
ID = 9
class Type(Enum):
INT = 0
FLOAT = 1
BOOL = 2
STRING = 3
ARRAY = 4
class Signal:
def __init__(self):
self.IN_CYCLE = False
self.IN_FOR_CYCLE = False
self.IN_FUNCTION = False
self.IN_CLASS = False
self.IN_TRY = False
self.IN_MAIN = False
self.IN_MODULE = False
self.BREAK = False
self.CONTINUE = False
self.RETURN = False
self.NO_CREATE_LEVEL = False
self.CREATE_BACK_LEVEL = False
self.BACK_LEVEL = None
self.RETURN_VALUE = None
self.ARGUMENTS = None
self.KW_ARGUMENTS = None
self.CURRENT_CLASS = None
self.ERROR = None
self.CURRENT_MODULE = 'main'
self.ENUM_COUNTER = 0
# no refresh
self.NEED_FREE = True
self.VERBOSE = False
def refresh(self):
self.IN_CYCLE = False
self.IN_FOR_CYCLE = False
self.IN_FUNCTION = False
self.IN_CLASS = False
self.IN_TRY = False
self.IN_MAIN = False
self.IN_MODULE = False
self.BREAK = False
self.CONTINUE = False
self.RETURN = False
self.NO_CREATE_LEVEL = False
self.CREATE_BACK_LEVEL = False
self.BACK_LEVEL = None
self.RETURN_VALUE = None
self.ARGUMENTS = None
self.KW_ARGUMENTS = None
self.CURRENT_CLASS = None
self.ERROR = None
self.CURRENT_MODULE = 'main'
self.ENUM_COUNTER = 0
class StdString:
def __init__(self):
self.out = ""
def write(self, v):
self.out += v
def __enter__(self):
sys.stdout = self
def __exit__(self, exc_type, exc_val, exc_tb):
sys.stdout = sys.__stdout__
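# A small usage sketch (not part of the original module): StdString is a
# context manager that captures everything written to stdout.
if __name__ == '__main__':
    buffer = StdString()
    with buffer:
        print('captured')
    print(repr(buffer.out))  # 'captured\n'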
class LevelIndex:
def __init__(self):
self.i = -1
def __index__(self):
return self.i
def __add__(self, other: int) -> int:
return self.i + other
def __sub__(self, other: int) -> int:
return self.i - other
def __repr__(self) -> str:
return str(self.i)
def inc(self):
self.i += 1
def dec(self):
self.i -= 1
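# Usage sketch (not part of the original module): __index__ lets a LevelIndex
# be used directly as a list index for the scope stacks.
if __name__ == '__main__':
    level = LevelIndex()
    level.inc()              # now points at level 0
    scopes = [{'x': 1}]
    print(scopes[level])     # {'x': 1}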
Token = NewType('Token', Tuple[str, TokenType])
| AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/lexer/types.py | types.py |
from typing import List
from re import compile
from .types import TokenType, Token
class Lexer:
TOKEN_EXPRESSIONS = [
(r'"[^\n"]*"', TokenType.STRING),
(r'\'[^\n\']*\'', TokenType.STRING),
(r'\#\[[\s\S]+\]\#', None),
(r'\#[^\n]+', None),
(r'\n', TokenType.NEW_LINE),
(r'\s', TokenType.SPACE),
(r'\b(if|elif|else|switch|case|while|for|break|continue)\b', TokenType.RESERVED),
(r'\b(echo|read|var|let|func|return|import|from)\b', TokenType.RESERVED),
(r'\b(class|init|super|this|abstract|interface|of|enum)\b', TokenType.RESERVED),
(r'\b(try|catch|null|with)\b', TokenType.RESERVED),
(r'[\(\)\{\}\[\];,]', TokenType.RESERVED),
(r'(\bin\b|\bor\b|\band\b|&&|\|\||\+\=|\-\=|\*\=|\/\=|\+\+|\-\-)', TokenType.OPERATOR),
(r'(=>|->)', TokenType.OPERATOR),
(r'>=', TokenType.OPERATOR),
(r'<=', TokenType.OPERATOR),
(r'==', TokenType.OPERATOR),
(r'::', TokenType.OPERATOR),
(r'!=', TokenType.OPERATOR),
(r'\`.+\`', TokenType.ID),
(r'\-?[0-9]+\.[0-9]+', TokenType.FLOAT),
(r'\-?[0-9]+', TokenType.INT),
(r'[\+\-\/\*\=<>~!@$%^&:\.\?]', TokenType.OPERATOR),
(r'\b(true|on|enable|false|off|disable)\b', TokenType.BOOL),
(r'[a-zA-Z_][a-zA-Z0-9_]*', TokenType.ID),
]
SYMBOL = 1
LINE = 1
@staticmethod
def lex(src: str) -> List[Token]:
"""
Splits source string to tokens
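
        Example (illustrative):
            Lexer.lex('var x = 1')
            # [('var', TokenType.RESERVED), ('x', TokenType.ID),
            #  ('=', TokenType.OPERATOR), ('1', TokenType.INT),
            #  (None, TokenType.EOF)]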
:param src: source string
:return: list of tokens
"""
res: List[Token] = []
i = 0
while i < len(src):
match = None
for pattern, token_type in Lexer.TOKEN_EXPRESSIONS:
regex = compile(pattern)
match = regex.match(src, i)
if match:
if token_type is not None:
if token_type == TokenType.NEW_LINE:
Lexer.SYMBOL = 1
Lexer.LINE += 1
break
text = match.group(0)
Lexer.SYMBOL += len(text)
if token_type != TokenType.SPACE:
res.append((text, token_type))
break
if match:
i = match.end(0)
else:
print(f"error at {i} char ({src[i]}), at {Lexer.LINE} line at {Lexer.SYMBOL} symbol")
exit(-1)
Lexer.SYMBOL = 1
Lexer.LINE = 1
return res + [(None, TokenType.EOF)]
@staticmethod
def lex_file(path_to_file: str) -> List[Token]:
"""
Read and splits source file to tokens
:param path_to_file: path to file
:return: list of tokens
"""
src: str
with open(path_to_file, 'r', encoding='utf-8') as f:
src = f.read()
        return Lexer.lex(src)
| AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/lexer/__init__.py | __init__.py |
from typing import List, Callable
from ..lexer import Token
from ..lexer.result import Result
class Combinator:
def __call__(self, tokens: List[Token], i: int) -> Result:
pass
def __add__(self, other) -> 'Combinator':
return Concat(self, other)
def __mul__(self, other) -> 'Combinator':
return Exp(self, other)
def __or__(self, other) -> 'Combinator':
return Alt(self, other)
def __xor__(self, other) -> 'Combinator':
return Process(self, other)
class Reserved(Combinator):
def __init__(self, value, token: Token):
self.value = value
self.token = token
def __call__(self, tokens: List[Token], i: int) -> Result:
if (
i < len(tokens) and
tokens[i][0] == self.value and
tokens[i][1] == self.token
):
return Result(tokens[i][0], i+1)
def __repr__(self) -> str:
return f"Reserved({self.value}, {self.token})"
class Tag(Combinator):
def __init__(self, token: Token):
self.token = token
def __call__(self, tokens, i) -> Result:
if i < len(tokens) and tokens[i][1] == self.token:
return Result(tokens[i][0], i+1)
def __repr__(self) -> str:
return f"Tag({self.token})"
class Concat(Combinator):
def __init__(self, left: Combinator, r: Combinator):
self.left = left
self.r = r
def __call__(self, tokens: List[Token], i: int) -> Result:
l_res = self.left(tokens, i)
if l_res:
r_res = self.r(tokens, l_res.pos)
if r_res:
res = (l_res.value, r_res.value)
return Result(res, r_res.pos)
def __repr__(self) -> str:
return f"Concat({self.left}, {self.r})"
class Alt(Combinator):
def __init__(self, left: Combinator, r: Combinator):
self.left = left
self.r = r
def __call__(self, tokens: List[Token], i: int) -> Result:
l_res = self.left(tokens, i)
if l_res:
return l_res
else:
return self.r(tokens, i)
def __repr__(self) -> str:
return f"Alt({self.left}, {self.r})"
class Opt(Combinator):
def __init__(self, c: Combinator):
self.c = c
def __call__(self, tokens: List[Token], i: int) -> Result:
res = self.c(tokens, i)
if res:
return res
return Result(None, i)
def __repr__(self) -> str:
return f"Opt({self.c})"
class Rep(Combinator):
def __init__(self, c: Combinator):
self.c = c
def __call__(self, tokens: List[Token], i: int) -> Result:
res = self.c(tokens, i)
result = []
while res:
result.append(res)
i = res.pos
res = self.c(tokens, i)
return Result(result, i)
def __repr__(self) -> str:
return f"Rep({self.c})"
class Process(Combinator):
def __init__(self, c: Combinator, f: Callable):
self.c = c
self.f = f
def __call__(self, tokens: List[Token], i: int) -> Result:
res = self.c(tokens, i)
if res:
res.value = self.f(res.value)
return res
def __repr__(self) -> str:
return f"Process({self.c}, {self.f})"
class Exp(Combinator):
def __init__(self, c: Combinator, sep: Combinator):
self.c = c
self.sep = sep
def __call__(self, tokens: List[Token], i: int) -> Result:
res = self.c(tokens, i)
def process_next(result):
sep_func, r = result
return sep_func(res.value, r)
if self.sep:
next_c = self.sep + self.c ^ process_next
else:
next_c = self.c ^ process_next
next_res = res
while next_res:
next_res = next_c(tokens, res.pos)
if next_res:
res = next_res
return res
def __repr__(self) -> str:
return f"Exp({self.c}, {self.sep})"
class Lazy(Combinator):
def __init__(self, c_func: Callable):
self.c = None
self.c_func = c_func
def __call__(self, tokens: List[Token], i: int) -> Result:
if not self.c:
self.c = self.c_func()
return self.c(tokens, i)
def __repr__(self) -> str:
return f"Lazy({self.c}, {self.c_func})"
class Phrase(Combinator):
def __init__(self, c: Combinator):
self.c = c
def __call__(self, tokens: List[Token], i: int) -> Result:
res = self.c(tokens, i)
        if res and res.pos == len(tokens):
            return res
        if res:
            raise RuntimeError(f"error at {res.pos} token: {res.value}")
        # res is None when no combinator matched at all; raising directly
        # avoids the AttributeError that res.pos would produce
        raise RuntimeError("parsing error: no combinator matched the input")
def __repr__(self) -> str:
return f"Phrase({self.c})" | AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/parser/combinator.py | combinator.py |
from functools import reduce
from pprint import pprint
from .combinator import *
from ..ast import expressions, statements
from ..lexer.types import Token, TokenType
def keyword(kw: str) -> Reserved:
return Reserved(kw, TokenType.RESERVED)
def operator(kw: str) -> Reserved:
return Reserved(kw, TokenType.OPERATOR)
def process_boolean(op):
match op:
case 'on' | 'true' | 'enable':
return True
case 'off' | 'false' | 'disable':
return False
case _:
raise RuntimeError(f'unknown boolean value: {op}')
id_tag = Tag(TokenType.ID)
num = Tag(TokenType.INT) ^ (lambda x: int(x))
float_num = Tag(TokenType.FLOAT) ^ (lambda x: float(x))
boolean = Tag(TokenType.BOOL) ^ process_boolean
null = keyword('null') ^ (lambda x: expressions.NullAST())
string = Tag(TokenType.STRING) ^ (lambda x: expressions.StringAST(x[1:-1]))
a_expr_precedence_levels = [
['*', '/'],
['+', '-', '%'],
]
relational_operators = ['==', '!=', '>=', '<=', '<', '>']
unary_operators = ['--', '++', '-']
assign_operators = ['+=', '-=', '*=', '/=', '=']
b_expr_precedence_levels = [
['and', '&&'],
['or', '||'],
['in']
]
def array_expr():
"""Array expression
[x, y, z, ...]
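
    Illustrative mapping (assumed): the source `[1, 2]` parses to
    ArrayAST([IntAST(1), IntAST(2)]); an empty literal `[]` parses to
    ArrayAST([]).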
"""
def process(p):
(_, data), _ = p
return expressions.ArrayAST(data)
return keyword('[') + Opt(Rep(Lazy(expression) + Opt(keyword(',')))) + keyword(']') ^ process
def array_generator_expr():
"""Array generator expression
[i for i in object if i > x]
"""
def process(p):
((((((_, var), _), val), _), obj), condition), _ = p
if condition is not None:
_, condition = condition
return expressions.GeneratorAST(var, val, obj, condition)
return (
keyword('[') + expression() + keyword('for') + id_tag + operator('in') + expression() +
Opt(keyword('if') + Exp(b_expr(), None)) + keyword(']')
) ^ process
def if_else_expr():
"""Ternary operator
condition ? x : y
x if condition else y
"""
def process(p):
(((body, op1), condition), op2), else_body = p
return expressions.TernaryOpAST(body, op1, condition, op2, else_body)
return (
Lazy(expr) + Alt(keyword('if'), operator('?')) + Lazy(expr) +
Alt(keyword('else'), operator(':')) + Lazy(expression)
) ^ process
def lambda_stmt():
"""Lambda statement
(a, b, c) => {...}
"""
def process(p):
(((((_, args), _), _), _), stmts), _ = p
arguments = []
for arg in args:
if arg.value[0][1] is None:
arguments.append(expressions.ArgumentAST(arg.value[0][0], None))
else:
arguments.append(expressions.ArgumentAST(arg.value[0][0], arg.value[0][1][1]))
return statements.LambdaStmt(arguments, stmts)
return (
keyword('(') + Rep(id_tag + Opt(operator('=') + Lazy(expression)) + Opt(keyword(','))) +
keyword(')') + operator('=>') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def module_obj_expr():
"""Module object expression
x.y
"""
def process(p):
(module, _), var = p
return expressions.ModuleCallAST(module, var)
return id_tag + operator('.') + id_tag ^ process
def id_or_module():
return class_property_expr() | module_obj_expr() | id_tag
def a_expr_value():
return (
Lazy(call_stmt) |
Lazy(array_generator_expr) |
Lazy(array_expr) |
Lazy(read_stmt) |
brace_expr() |
module_obj_expr() |
Lazy(class_property_expr) |
(num ^ (lambda x: expressions.IntAST(x))) |
(float_num ^ (lambda x: expressions.FloatAST(x))) |
(id_tag ^ (lambda x: expressions.VarAST(x))) |
(boolean ^ (lambda x: expressions.BoolAST(x))) |
string |
null
)
def process_group(p):
(_, r), _ = p
return r
def a_expr_group():
return keyword('(') + Lazy(a_expr) + keyword(')') ^ process_group
def a_expr_term():
return (
a_expr_value() |
a_expr_group() |
unary_op_stmt()
)
def process_binop(op):
return lambda l, r: expressions.BinOpAST(op, l, r)
def any_op_in_list(ops):
op_parsers = [operator(op) for op in ops]
return reduce(lambda l, r: l | r, op_parsers)
def precedence(val_parser, levels, combine):
def op_parser(level):
return any_op_in_list(level) ^ combine
p = val_parser * op_parser(levels[0])
for lvl in levels[1:]:
p = p * op_parser(lvl)
return p
def a_expr():
return precedence(a_expr_term(), a_expr_precedence_levels, process_binop)
# --== Boolean conditions ==-- #
def process_relop(p):
(l, op), r = p
return expressions.RelativeOp(op, l, r)
def b_expr_relop():
"""Relation operation expression
x >= y
x == y
x != y
etc
"""
return (
a_expr() + any_op_in_list(relational_operators) + a_expr()
) ^ process_relop
def b_expr_not():
"""Not expression
not x
!x
"""
return (Alt(keyword('not'), operator('!')) + Lazy(b_expr_term)) ^ (lambda p: expressions.NotOp(p[1]))
def b_expr_group():
return (keyword('(') + Lazy(b_expr) + keyword(')')) ^ process_group
def b_expr_term():
return b_expr_group() | b_expr_not() | b_expr_relop() | (boolean ^ (lambda x: expressions.BoolAST(x)))
def process_logic(op):
match op:
case 'and' | '&&':
return lambda l, r: expressions.AndOp(l, r)
case 'or' | '||':
return lambda l, r: expressions.OrOp(l, r)
case 'in':
return lambda l, r: expressions.InOp(l, r)
case _:
raise RuntimeError(f'unknown logic operator: {op}')
def b_expr():
return precedence(b_expr_term(), b_expr_precedence_levels, process_logic)
def brace_expr():
"""Brace expression
x[y][z]
"""
def process(p):
(((obj, _), v), _), v_arr = p
arr = []
for i in v_arr:
(_, i), _ = i.value
arr.append(i)
return expressions.BraceAST(obj, [v] + arr)
return (
(Lazy(array_expr) | Lazy(call_stmt) | string | id_or_module()) + keyword('[') +
Lazy(expr) + keyword(']') + Rep(keyword('[') + Lazy(expr) + keyword(']'))
) ^ process
def expr():
return (
b_expr() |
a_expr()
)
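# Illustrative behavior (a sketch, not executable at this point in the file
# because the grammar is wired lazily to functions defined further down):
# parsing the source `1 + 2 * 3` with expr() yields a tree equivalent to
#   BinOpAST('+', IntAST(1), BinOpAST('*', IntAST(2), IntAST(3)))
# since '*' sits on an earlier (tighter) level of a_expr_precedence_levels
# than '+'.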
def class_property_expr():
"""Class property statement
x::y
"""
def process(p):
((is_super, name), _), var = p
if is_super is not None:
is_super = True
return expressions.ClassPropAST(name, var, is_super)
return Opt(keyword('super')) + Alt(id_tag, keyword('this')) + operator('::') + id_tag ^ process
def expression():
return lambda_stmt() | if_else_expr() | Lazy(switch_case_stmt) | expr()
# --== statements ==-- #
def assign_stmt():
"""Assign statement
var x = y
"""
def process(p):
((_, name), _), e = p
return statements.AssignStmt(name, e, False, True)
return (
keyword('var') + id_tag + operator('=') + expression()
) ^ process
def assign_const_stmt():
"""Assign constant statement
let x = y
"""
def process(p):
((_, name), _), e = p
return statements.AssignStmt(name, e, True, True)
return (keyword('let') + id_tag + operator('=') + expression()) ^ process
def reassign_stmt():
"""Reassign statement
x = y
"""
def process(p):
(name, op), e = p
return statements.AssignStmt(name, e, False, False, op)
return ((brace_expr() | id_or_module()) + any_op_in_list(assign_operators) + expression()) ^ process
def unary_op_stmt():
"""Unary operator statement
x++
++x
"""
def process(p):
sym, name = p
if sym not in unary_operators:
name, sym = sym, name
return expressions.UnaryOpAST(sym, expressions.VarAST(name))
return Alt(
any_op_in_list(unary_operators) + id_or_module(), id_or_module() + any_op_in_list(unary_operators)
) ^ process
def stmt_list():
def process(rep):
return statements.StmtList(rep)
return Rep(Lazy(stmt) + Opt(keyword(';')) ^ (lambda x: x[0])) ^ process
def block_stmt():
def process(rep):
(_, l), _ = rep
return l
return keyword('{') + Opt(Lazy(stmt_list)) + keyword('}') ^ process
def if_stmt():
"""if-elif-else statement
if condition1 {
body1
} elif condition2 {
body2
} elif condition3 {
body3
} else {
body4
}
"""
def process(p):
(((((_, condition), _), body), _), elif_array), false_p = p
if false_p:
(_, false_body), _ = false_p
else:
false_body = None
if elif_array:
elif_array = [i.value for i in elif_array]
return statements.IfStmt(condition, body, elif_array, false_body)
result = keyword('if') + Exp(b_expr(), None) + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
result += Opt(
Rep(
keyword('elif') + Exp(b_expr(), None) + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
)
)
result += Opt(
keyword('else') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
)
return result ^ process
def while_stmt():
"""While statement
while condition {
body
}
"""
def process(p):
(((_, condition), _), body), _ = p
return statements.WhileStmt(condition, body)
result = keyword('while') + Exp(b_expr(), None) + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
return result ^ process
def break_stmt():
"""Break statement
break
"""
return keyword('break') ^ (lambda x: statements.BreakStmt())
def continue_stmt():
"""Continue statement
continue
"""
return keyword('continue') ^ (lambda x: statements.ContinueStmt())
def echo_stmt():
"""Echo statement
echo(x, y, z, ...)
"""
def process(p):
(_, data), _ = p
return statements.EchoStmt(data)
return (
keyword('echo') + keyword('(') +
Opt(
Rep(
expression() + Opt(keyword(',')) ^ (lambda x: x[0])
) ^ (lambda x: [i.value for i in x])
) + keyword(')') ^ process
)
def read_stmt():
"""Read statement
x = read(...)
"""
def process(p):
_, text = p
return statements.ReadStmt(text)
return keyword('read') + expression() ^ process
def func_stmt():
"""Function assign statement
func name(args) {
body
}
"""
def process(p):
((((((_, func_name), _), args), _), _), stmts), _ = p
arguments = []
for arg in args:
if arg.value[0][1] is None:
arguments.append(expressions.ArgumentAST(arg.value[0][0], None))
else:
arguments.append(expressions.ArgumentAST(arg.value[0][0], arg.value[0][1][1]))
return statements.FuncStmt(func_name, arguments, stmts)
return (
keyword('func') + id_tag + keyword('(') +
Rep(id_tag + Opt(operator('=') + Lazy(expression)) + Opt(keyword(','))) +
keyword(')') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def interface_func_stmt():
"""Interface function statement
func name()
"""
def process(p):
(((_, func_name), _), args), _ = p
arguments = []
for arg in args:
if arg.value[0][1] is None:
arguments.append(expressions.ArgumentAST(arg.value[0][0], None))
else:
arguments.append(expressions.ArgumentAST(arg.value[0][0], arg.value[0][1][1]))
return statements.FuncStmt(func_name, arguments, statements.StmtList([]))
return (
keyword('func') + id_tag + keyword('(') +
Rep(id_tag + Opt(operator('=') + Lazy(expression)) + Opt(keyword(','))) +
keyword(')')
) ^ process
def call_stmt():
"""Call statement
func_name(args)
func_name(args) with {
lambda body
}
"""
def process(p):
(((func_name, _), args), _), lambda_body = p
arguments = []
for arg in args:
if arg.value[0][0] is None:
arguments.append(expressions.ArgumentAST(None, arg.value[0][1]))
else:
arguments.append(expressions.ArgumentAST(arg.value[0][0][0], arg.value[0][1]))
if lambda_body:
arguments.append(expressions.ArgumentAST(None, statements.LambdaStmt([], lambda_body[0][1])))
return statements.CallStmt(func_name, arguments)
return (
id_or_module() + keyword('(') +
Rep(Opt(id_tag + operator('=')) + Lazy(expression) + Opt(keyword(','))) +
keyword(')') + Opt(keyword('with') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}'))
) ^ process
def decorator_stmt():
"""Decorator statement
@decorator
func main() {
...
}
"""
def process(p):
((_, name), name_list), function = p
names = [name]
if name_list is not None:
names += [i.value[1] for i in name_list]
return statements.DecoratorStmt(names, function)
return (
operator('@') + Alt(Lazy(call_stmt), id_or_module()) +
Opt(Rep(operator('@') + Alt(Lazy(call_stmt), id_or_module()))) +
func_stmt()
) ^ process
def return_stmt():
"""Return statement
return x
"""
def process(p):
_, return_value = p
return statements.ReturnStmt(return_value)
return keyword('return') + Opt(expression()) ^ process
def for_stmt():
"""For statement
for var x = y; condition; x++ {
body
}
"""
def process(p):
(((((((_, var), _), cond), _), action), _), body), _ = p
return statements.ForStmt(var, cond, action, body)
return (
keyword('for') + Lazy(assign_stmt) + keyword(';') +
Exp(b_expr(), None) + keyword(';') +
(Lazy(reassign_stmt) | Lazy(unary_op_stmt)) + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def foreach_stmt():
"""Foreach statement
for i in object {
body
}
"""
def process(p):
(((((_, var), _), val), _), body), _ = p
return statements.ForStmt(var, val, body, None)
return (
keyword('for') + id_tag + operator('in') + expression() +
keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def import_stmt():
"""Import statement
import module
import a1, a2, a3, ...
from module_name import a, b, c, ...
"""
def process(p):
((x, module), _), objects = p
objects = [i.value[0] for i in objects]
from_import = False
if isinstance(x, tuple):
objects = [module] + objects
(_, module), _ = x
from_import = True
return statements.ImportStmt(module, objects, from_import)
return Alt(
keyword('import') + id_tag + Opt(keyword(',')) + Rep(id_tag + Opt(keyword(','))),
keyword('from') + id_tag + keyword('import') + id_tag + Opt(keyword(',')) + Rep(id_tag + Opt(keyword(',')))
) ^ process
def switch_case_stmt():
"""Switch-case-else statement
switch object {
case x {body1}
case [y, z, w] {body2}
else {body3}
}
"""
def process(p):
((((_, var), _), cases), else_body), _ = p
cases_list = []
for c in cases:
(((_, cond), _), body), _ = c.value
cases_list.append(statements.CaseStmt(cond, body))
if else_body:
((_, _), else_body), _ = else_body
cases_list.append(statements.CaseStmt(None, else_body))
return statements.SwitchCaseStmt(var, cases_list)
return (
keyword('switch') + expression() + keyword('{') +
Rep(
keyword('case') + expression() + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) + Opt(
keyword('else') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) + keyword('}')
) ^ process
def assign_class_stmt():
"""Class assign statement
[abstract] class MyClass {
body
}
"""
def process(p):
((((((prefix, _), name), inherit), interfaces), _), body), _ = p
if inherit:
_, inherit = inherit
if interfaces:
(_, interface), interfaces = interfaces
interfaces = [i.value for i in interfaces] + [interface]
else:
interfaces = []
return statements.AssignClassStmt(name, body, inherit, prefix, interfaces)
return (
Opt(keyword('abstract')) + keyword('class') + id_tag + Opt(operator(':') + id_tag) +
Opt(keyword('of') + id_tag + Rep(id_tag)) +
keyword('{') + Opt(Lazy(class_body)) + keyword('}')
) ^ process
def assign_interface_stmt():
"""Interface assign statement
interface Name {
body
}
"""
def process(p):
(((_, name), _), body), _ = p
return statements.InterfaceStmt(name, body)
return (
keyword('interface') + id_tag + keyword('{') + Opt(Lazy(interface_body)) + keyword('}')
) ^ process
def class_body():
"""Class body"""
def process(p):
return statements.StmtList(p)
return Rep(
Lazy(class_body_stmt) + Opt(keyword(';')) ^ (lambda x: x[0])
) ^ process
def interface_body():
"""Interface body"""
def process(p):
return statements.StmtList(p)
return Rep(
Lazy(interface_body_stmt) + Opt(keyword(';')) ^ (lambda x: x[0])
) ^ process
def init_class_stmt():
"""Assign class init func
init(args) {
body
}
"""
def process(p):
(((_, args), _), body), _ = p
arguments = []
if args:
(_, args), _ = args
for arg in args:
if arg.value[0][1] is None:
arguments.append(expressions.ArgumentAST(arg.value[0][0], None))
else:
arguments.append(expressions.ArgumentAST(arg.value[0][0], arg.value[0][1][1]))
return statements.InitClassStmt(arguments, body)
return (
keyword('init') + Opt(keyword('(') + Rep(
id_tag + Opt(operator('=') + Lazy(expression)) + Opt(keyword(','))
) + keyword(')')) +
keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def class_body_stmt():
return (
init_class_stmt() |
decorator_stmt() |
func_stmt() |
assign_stmt() |
assign_const_stmt() |
assign_class_stmt()
)
def interface_body_stmt():
return (
interface_func_stmt() |
assign_stmt() |
assign_const_stmt()
)
def try_catch_stmt():
"""Try-catch statement
try {
error code
} catch e {
catch error
}
"""
def process(p):
((((((_, try_body), _), _), e_name), _), catch_body), _ = p
return statements.TryCatchStmt(try_body, e_name, catch_body)
return (
keyword('try') + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}') +
keyword('catch') + id_tag + keyword('{') + Opt(Lazy(stmt_list)) + keyword('}')
) ^ process
def assign_enum_stmt():
"""Enum statement
enum CatColor {
BROWN
WHITE
BLACK = "Black"
}"""
def process(p):
(((_, name), _), lets), _ = p
if lets is not None:
body = [i.value[0] for i in lets]
else:
body = []
return statements.EnumStmt(name, body)
return (
keyword('enum') + id_tag + keyword('{') + Opt(Rep(enum_body() + Opt(keyword(',')))) + keyword('}')
) ^ process
def enum_body():
def process(p):
if isinstance(p, tuple):
return statements.EnumLetStmt(p[0][0], p[1])
return statements.EnumLetStmt(p, None)
return Alt(id_tag + operator('=') + expression(), id_tag) ^ process
def stmt():
return (
assign_class_stmt() |
assign_interface_stmt() |
assign_enum_stmt() |
decorator_stmt() |
func_stmt() |
call_stmt() |
for_stmt() |
try_catch_stmt() |
echo_stmt() |
foreach_stmt() |
assign_stmt() |
assign_const_stmt() |
reassign_stmt() |
if_stmt() |
while_stmt() |
unary_op_stmt() |
break_stmt() |
continue_stmt() |
block_stmt() |
return_stmt() |
import_stmt() |
expression() |
(Tag(TokenType.EOF) ^ (lambda x: statements.EOFStmt()))
)
def parser() -> Phrase:
return Phrase(stmt_list())
def imp_parser(tokens: List[Token]):
return parser()(tokens, 0) | AVOScript | /AVOScript-0.11.5.tar.gz/AVOScript-0.11.5/src/avoscript/parser/__init__.py | __init__.py |
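# Usage sketch (illustrative, not from the package): imp_parser consumes the
# token list produced by the AVOScript lexer; `tokenize` below is a stand-in
# name for whatever lexer call produces List[Token].
#
#   tokens = tokenize('var x = 1 + 2 echo(x)')   # hypothetical lexer call
#   result = imp_parser(tokens)                  # Result wrapping the parsed StmtList
#   ast = result.value if result else None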
from __future__ import print_function, division
import numpy as np
import cv2
import pyaudio
import wave
import threading
import time
import subprocess
import os
class VideoRecorder():
"Video class based on openCV"
def __init__(self, name="temp_video.avi", fourcc="MJPG", sizex=640, sizey=480, camindex=0, fps=30):
self.open = True
self.device_index = camindex
self.fps = fps # fps should be the minimum constant rate at which the camera can
self.fourcc = fourcc # capture images (with no decrease in speed over time; testing is required)
self.frameSize = (sizex, sizey) # video formats and sizes also depend and vary according to the camera used
self.video_filename = name
self.video_cap = cv2.VideoCapture(self.device_index)
self.video_writer = cv2.VideoWriter_fourcc(*self.fourcc)
self.video_out = cv2.VideoWriter(self.video_filename, self.video_writer, self.fps, self.frameSize)
self.frame_counts = 1
self.start_time = time.time()
def record(self):
"Video starts being recorded"
# counter = 1
timer_start = time.time()
timer_current = 0
while self.open:
ret, video_frame = self.video_cap.read()
if ret:
self.video_out.write(video_frame)
# print(str(counter) + " " + str(self.frame_counts) + " frames written " + str(timer_current))
self.frame_counts += 1
# counter += 1
# timer_current = time.time() - timer_start
time.sleep(1/self.fps)
# gray = cv2.cvtColor(video_frame, cv2.COLOR_BGR2GRAY)
# cv2.imshow('video_frame', gray)
# cv2.waitKey(1)
else:
break
def stop(self):
"Finishes the video recording therefore the thread too"
if self.open:
self.open=False
self.video_out.release()
self.video_cap.release()
cv2.destroyAllWindows()
def start(self):
"Launches the video recording function using a thread"
video_thread = threading.Thread(target=self.record)
video_thread.start()
class AudioRecorder():
"Audio class based on pyAudio and Wave"
def __init__(self, filename="temp_audio.wav", rate=44100, fpb=1024, channels=2):
self.open = True
self.rate = rate
self.frames_per_buffer = fpb
self.channels = channels
self.format = pyaudio.paInt16
self.audio_filename = filename
self.audio = pyaudio.PyAudio()
self.stream = self.audio.open(format=self.format,
channels=self.channels,
rate=self.rate,
input=True,
frames_per_buffer = self.frames_per_buffer)
self.audio_frames = []
def record(self):
"Audio starts being recorded"
self.stream.start_stream()
        while self.open:
            data = self.stream.read(self.frames_per_buffer)
            self.audio_frames.append(data)
def stop(self):
"Finishes the audio recording therefore the thread too"
if self.open:
self.open = False
self.stream.stop_stream()
self.stream.close()
self.audio.terminate()
waveFile = wave.open(self.audio_filename, 'wb')
waveFile.setnchannels(self.channels)
waveFile.setsampwidth(self.audio.get_sample_size(self.format))
waveFile.setframerate(self.rate)
waveFile.writeframes(b''.join(self.audio_frames))
waveFile.close()
def start(self):
"Launches the audio recording function using a thread"
audio_thread = threading.Thread(target=self.record)
audio_thread.start()
def start_AVrecording(filename="test"):
global video_thread
global audio_thread
video_thread = VideoRecorder()
audio_thread = AudioRecorder()
audio_thread.start()
video_thread.start()
return filename
def start_video_recording(filename="test"):
global video_thread
video_thread = VideoRecorder()
video_thread.start()
return filename
def start_audio_recording(filename="test"):
global audio_thread
audio_thread = AudioRecorder()
audio_thread.start()
return filename
def stop_AVrecording(filename="test"):
audio_thread.stop()
frame_counts = video_thread.frame_counts
elapsed_time = time.time() - video_thread.start_time
recorded_fps = frame_counts / elapsed_time
print("total frames " + str(frame_counts))
print("elapsed time " + str(elapsed_time))
print("recorded fps " + str(recorded_fps))
video_thread.stop()
# Makes sure the threads have finished
while threading.active_count() > 1:
time.sleep(1)
# Merging audio and video signal
    if abs(recorded_fps - 6) >= 0.01:    # if the measured fps differs from the expected rate (hard-coded 6 fps), re-encode
print("Re-encoding")
cmd = "ffmpeg -r " + str(recorded_fps) + " -i temp_video.avi -pix_fmt yuv420p -r 6 temp_video2.avi"
subprocess.call(cmd, shell=True)
print("Muxing")
cmd = "ffmpeg -ac 2 -channel_layout stereo -i temp_audio.wav -i temp_video2.avi -pix_fmt yuv420p " + filename + ".avi"
subprocess.call(cmd, shell=True)
else:
print("Normal recording\nMuxing")
cmd = "ffmpeg -ac 2 -channel_layout stereo -i temp_audio.wav -i temp_video.avi -pix_fmt yuv420p " + filename + ".avi"
subprocess.call(cmd, shell=True)
print("..")
def file_manager(filename="test"):
"Required and wanted processing of final files"
local_path = os.getcwd()
if os.path.exists(str(local_path) + "/temp_audio.wav"):
os.remove(str(local_path) + "/temp_audio.wav")
if os.path.exists(str(local_path) + "/temp_video.avi"):
os.remove(str(local_path) + "/temp_video.avi")
if os.path.exists(str(local_path) + "/temp_video2.avi"):
os.remove(str(local_path) + "/temp_video2.avi")
# if os.path.exists(str(local_path) + "/" + filename + ".avi"):
# os.remove(str(local_path) + "/" + filename + ".avi")
if __name__ == '__main__':
start_AVrecording()
time.sleep(5)
stop_AVrecording()
file_manager() | AVrecordeR | /AVrecordeR-1.0.zip/AVrecordeR-1.0/AVrecordeR.py | AVrecordeR.py |
import h5py
def createFileMap(file, path, depth, info, timeCheck, map_file):
# Path is the current path in the h5 file
if not path:
#Print Header
line = 'Path,Field,DataType,Size,Dim1,Dim2,AttrDepth,TimeInfo,Comment\n'
map_file.write(line)
# Loop over top level groups (directories)
for name, item in file.items():
if isinstance(item,h5py.Group):
path = '/' + name
# Reset info dictionary
info['depth'] = depth
info['dtype'] = None
info['comment'] = None
info['time_info'] = None
info['size'] = None
info['dim1'] = None
info['dim2'] = None
# Pass the current path and repeat
createFileMap(file, path, depth, info, timeCheck, map_file)
else:
return
else:
# Set the current path
top = path
# Keep track of how 'deep' we are in the file
depth += 1
# Loop over next level groups (directories)
for name, item in file[top].items():
# If the item is a group, check for attributes
if isinstance(item,h5py.Group):
path = top + '/' + name
# Reset info dictionary
info['depth'] = depth
info['dtype'] = None
info['comment'] = None
info['time_info'] = None
info['size'] = None
info['dim1'] = None
info['dim2'] = None
if item.attrs.__contains__('exception'):
info['depth'] = depth
if item.attrs.__contains__('comment'):
info['comment'] = item.attrs['comment']
if item.attrs.__contains__('acqStamp'):
info['time_info'] = 'acqStamp'
# Pass the current path and repeat
createFileMap(file, path, depth, info, timeCheck, map_file)
# If the item is a dataset, check for attributes
elif isinstance(item,h5py.Dataset):
# Reset info dictionary
info['dtype'] = None
info['size'] = None
info['dim1'] = None
info['dim2'] = None
if item.attrs.__contains__('exception'):
info['depth'] = 0
if item.attrs.__contains__('comment'):
info['comment'] = item.attrs['comment']
if path in timeCheck.timing_dictionary:
if path+'/'+timeCheck.timing_dictionary[path] in file:
info['time_info'] = timeCheck.timing_dictionary[path]
elif item.attrs.__contains__('acqStamp'):
info['time_info'] = 'acqStamp'
# check that the dataset has a known type (fails for arrays of bools)
try:
info['dtype'] = item.dtype
except:
print(name + ' has bad type')
# Check that the dataset has a reasonable size and shape
info['size'] = int(item.size)
if item.size and item.shape:
shape = item.shape
info['dim1'] = shape[0]
if len(shape) == 2:
info['dim2'] = shape[1]
else:
info['dim2'] = 0
# Write all of the information gathered to the map file
line = path+','
line = line+name+','
line = line+str(info['dtype'])+','
line = line+str(info['size'])+','
line = line+str(info['dim1'])+','
line = line+str(info['dim2'])+','
line = line+str(info['depth'])+','
line = line+str(info['time_info'])+','
if info['comment']:
line = line+str(info['comment'].decode())+'\n'
else:
line = line+str(info['comment'])+'\n'
map_file.write(line) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/createFileMap.py | createFileMap.py |
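# Usage sketch (illustrative): createFileMap walks one AWAKE event file and
# writes a CSV row per dataset. The event-file path reuses one appearing
# elsewhere in this package; TimingCheck is the helper class defined alongside.
#
#   from TimingCheck import TimingCheck
#   f = h5py.File('/user/awakeop/event_data/2017/06/02/1496354911335000000_40_25.h5', 'r')
#   with open('file_map.csv', 'w') as map_file:
#       createFileMap(f, '', 0, {}, TimingCheck(), map_file)
#   f.close()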
import numpy as np
from getFileMap import getFileMap
import h5py
from TimingCheck import TimingCheck
def returnGoodData(dataPath,file_list):
''' Clean up dataPath '''
if dataPath[-1] == '/':
dataPath = dataPath[:-1]
''' Extract File Info '''
eventTimeStamps = []
eventRunNumbers = []
eventEvtNumbers = []
for file in file_list:
dir_split = file.split('/')
ext_split = dir_split[-1].split('.')
ts, rn, en = ext_split[0].split('_')
eventTimeStamps.append(int(ts))
eventRunNumbers.append(int(rn))
eventEvtNumbers.append(int(en))
''' Collect Field Info '''
field_info = {}
for run in np.unique(eventRunNumbers):
file_map = getFileMap(run)
if dataPath in file_map:
field_info[str(run)] = file_map[dataPath]
else:
print('Warning: Data Path ' + dataPath + ' not found in Run Number ' + str(run))
''' Setup data arrays '''
nFiles = len(file_list)
propPath = field_info[str(eventRunNumbers[0])][0]
field = field_info[str(eventRunNumbers[0])][1]
dtype = field_info[str(eventRunNumbers[0])][2]
dim1 = int(field_info[str(eventRunNumbers[0])][4])
dim2 = int(field_info[str(eventRunNumbers[0])][5])
if dim2 < 2:
data_array = np.zeros((nFiles,dim1),dtype=dtype)
elif dim2 > 1:
data_array = np.zeros((nFiles,dim1,dim2),dtype=dtype)
bool_array = np.zeros(nFiles,dtype=bool)
warn_array = nFiles*['']
time_array = np.empty(nFiles)
time_array.fill(np.nan)
''' Check that data dimensions do not change between runs (that's a big no-no) '''
if len(np.unique(eventRunNumbers)) > 1:
for run in np.unique(eventRunNumbers):
            d1 = int(field_info[str(run)][4])
            d2 = int(field_info[str(run)][5])
if d1 != dim1 or d2 != dim2:
print('Warning: Field ' + dataPath + ' has inconsistent dimensions')
return data_array, bool_array, warn_array, time_array
''' Loop over files and extract data '''
timeChecker = TimingCheck()
for idx,f in enumerate(file_list):
        evbFile = h5py.File(f, 'r')
runNum = eventRunNumbers[idx]
attr = int(field_info[str(runNum)][6])
time = field_info[str(runNum)][7]
''' First, check if data is present '''
if attr == 0:
acqBool = evbFile[dataPath].attrs['exception']
if acqBool:
warn_array[idx] = 'No data present for event'
continue
elif attr > 0:
acqBool = evbFile[propPath].attrs['exception']
if acqBool:
warn_array[idx] = 'No data present for event'
continue
''' Next, retrieve data '''
try:
#print(dim2)
if dim2 < 2:
#data_array[idx,:] = evbFile[dataPath][0]
data_array[idx,:] = evbFile[dataPath].value
bool_array[idx] = True
elif dim2 > 1:
data_array[idx,:,:] = evbFile[dataPath].value
bool_array[idx] = True
except:
warn_array[idx] = 'Could not retrieve data'
''' Finally, check timing '''
if time == 'None':
warn_array[idx] = 'No timing information present'
else:
if attr == 0:
timeBool, timeDelta = timeChecker.checkTime(evbFile,propPath,'/'+field)
time_array[idx] = timeDelta
                if not timeBool and np.isnan(timeDelta):
warn_array[idx] = 'No timing information present'
bool_array[idx] = False
elif not timeBool:
warn_array[idx] = 'Out-of-time data'
bool_array[idx] = False
elif attr > 0:
timeBool, timeDelta = timeChecker.checkTime(evbFile,propPath)
time_array[idx] = timeDelta
                if not timeBool and np.isnan(timeDelta):
warn_array[idx] = 'No timing information present'
bool_array[idx] = False
elif not timeBool:
warn_array[idx] = 'Out-of-time data'
bool_array[idx] = False
return data_array, bool_array, warn_array, time_array | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/returnGoodData.py | returnGoodData.py |
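# Usage sketch (illustrative; the dataPath field name is an assumption). File
# names must follow the <timestamp>_<run>_<event>.h5 convention parsed above.
#
#   import glob
#   files = sorted(glob.glob('/user/awakeop/event_data/2017/06/02/*.h5'))
#   data, ok, warn, dt = returnGoodData(
#       '/AwakeEventData/TT41.BCTF.412340/Acquisition/totalIntensityPreferred', files)
#   good = data[ok]   # keep only rows that were present, readable and in time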
import datetime
import pytz
import numpy as np
''' Class for Checking Time offsets '''
class TimingCheck(object):
def __init__(self):
self.timing_dictionary = self.get_timingDict()
self.GVA = pytz.timezone('Europe/Zurich')
self.UTC = pytz.timezone('UTC')
def checkTime(self,h5File,group,field=''):
if group in self.timing_dictionary:
time_bool, time_delta = getattr(self,self.timing_dictionary[group])(h5File,group)
return time_bool, time_delta
elif h5File[group+field].attrs.__contains__('acqStamp'):
time_bool, time_delta = self.acqStamp(h5File,group+field)
return time_bool, time_delta
else:
print('No timing information for group ' + group)
return False, np.nan
''' List of devices and their timestamp fields '''
def get_timingDict(self):
timing_dictionary = {}
timing_dictionary['/AwakeEventData/BOVWA.01TT41.CAM1/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.02TT41.CAM2/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.03TT41.CAM3/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.04TT41.CAM4/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.05TT41.CAM5/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.01TCC4.CAM9/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.02TCC4.CAM10/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.03TCC4.CAM11/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/BOVWA.04TCC4.CAM12/ExtractionImage'] = 'imageTimeStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE27.CH01/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE27.CH02/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE28.CH01/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE29.CH01/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE30.CH01/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/SR.SCOPE30.CH02/Acquisition'] = 'triggerStamp'
timing_dictionary['/AwakeEventData/TCC4.AWAKE-SCOPE-CTR.CH1/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TCC4.AWAKE-SCOPE-CTR.CH2/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TCC4.AWAKE-SCOPE-CTR.CH3/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TCC4.AWAKE-SCOPE-CTR.CH4/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TSG40.AWAKE-LASER-CORRELATOR/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TSG41.AWAKE-CTRHET-VDI/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TSG41.AWAKE-CTRHET-WR8/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TT41.AWAKE-PLASMA-SPECTRO-DOWN/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TT41.AWAKE-PLASMA-SPECTRO-UP/FileRead'] = 'fileTime'
timing_dictionary['/AwakeEventData/TT41.BCTF.412340/Acquisition'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412350/Acquisition'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412350/Image'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412353/Acquisition'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412353/Image'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412426/Acquisition'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412426/Image'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412442/Acquisition'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412442/Image'] = 'acqTime'
timing_dictionary['/AwakeEventData/TT41.BTV.412350.STREAK/StreakImage'] = 'streakImageTime'
timing_dictionary['/AwakeEventData/XMPP-STREAK/StreakImage'] = 'streakImageTime'
return timing_dictionary
''' Timestamp check for PXI cameras '''
def imageTimeStamp(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
except:
print('Could not extract cycle time from file')
return False, np.nan
try:
image_time = h5file[group+'/'+'imageTimeStamp'][0]
except:
print('Could not extract image time from file')
return False, np.nan
time_delta = (image_time - cycle_time)/1e9
if time_delta > 0 and time_delta < 10:
return True, time_delta
else:
return False, time_delta
''' Timestamp check for OASIS scopes '''
def triggerStamp(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
except:
print('Could not extract cycle time from file')
return False, np.nan
try:
trigger_time = h5file[group+'/'+'triggerStamp'][0]
except:
print('Could not extract trigger time from file')
return False, np.nan
time_delta = (trigger_time - cycle_time)/1e9
if time_delta > 5 and time_delta < 10:
return True, time_delta
else:
return False, time_delta
''' Timestamp check for FileReader '''
def fileTime(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
except:
print('Could not extract cycle time from file')
return False, np.nan
try:
file_time = h5file[group+'/'+'fileTime'][0]
except:
print('Could not extract file time from file')
return False, np.nan
time_delta = (file_time - cycle_time)/1e9
if time_delta > -10 and time_delta < 11:
return True, time_delta
else:
return False, time_delta
''' Timestamp check for BI devices '''
def acqTime(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
except:
print('Could not extract cycle time from file')
return False, np.nan
try:
acq_t = h5file[group+'/'+'acqTime'][0].decode()
dtLOC = datetime.datetime.strptime(acq_t,'%Y/%m/%d %H:%M:%S.%f')
dtUTC = self.UTC.localize(dtLOC, is_dst=None)
acq_time = 1e9*dtUTC.timestamp()
except:
print('Could not extract acquisition time from file')
return False, np.nan
time_delta = (acq_time - cycle_time)/1e9
if time_delta > 5 and time_delta < 10:
return True, time_delta
else:
return False, time_delta
''' Timestamp check for Streak Cameras '''
def streakImageTime(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
#print(cycle_time)
cyUTC = datetime.datetime.fromtimestamp(cycle_time/1e9, pytz.timezone('UTC'))
cyLOC = cyUTC.astimezone(self.GVA)
date_str = str(cyLOC.year)+'/'+'{0:02d}'.format(cyLOC.month)+'/'+'{0:02d}'.format(cyLOC.day)
except:
print('Could not extract cycle time from file')
return False, np.nan
if group == '/AwakeEventData/XMPP-STREAK/StreakImage':
#print('hi')
try:
img_t = h5file[group+'/'+'streakImageTime'][0].decode()
#print(img_t)
# HMS,PMF = img_t.split()
# H,M,S = HMS.split(':')
# PM,F = PMF.split('.')
# if PM == 'pm':
# h = int(H)
# h += 12
# H = str(h)
# if PM == 'am' and H == '12':
# H = '00'
#
# t_str = date_str+' '+H+':'+M+':'+S+'.'+F
t_str = date_str+' '+img_t
dtLOC = datetime.datetime.strptime(t_str,'%Y/%m/%d %H:%M:%S.%f')
dtGVA = self.GVA.localize(dtLOC, is_dst=None)
image_time = 1e9*dtGVA.timestamp()
except:
print('Could not extract image time from file')
return False, np.nan
elif group == '/AwakeEventData/TT41.BTV.412350.STREAK/StreakImage':
try:
img_t = h5file[group+'/'+'streakImageTime'][0].decode()
t_str = date_str+' '+img_t
dtLOC = datetime.datetime.strptime(t_str,'%Y/%m/%d %H:%M:%S.%f')
dtGVA = self.GVA.localize(dtLOC, is_dst=None)
image_time = 1e9*dtGVA.timestamp()
except:
print('Could not extract image time from file')
return False, np.nan
time_delta = (image_time - cycle_time)/1e9
if time_delta > 0 and time_delta < 10:
return True, time_delta
else:
return False, time_delta
''' Timestamp check for all other devices '''
def acqStamp(self,h5file,group):
try:
cycle_time = h5file['/AwakeEventInfo/timestamp'].value
except:
print('Could not extract cycle time from file')
return False, np.nan
try:
stamp_time = h5file[group].attrs['acqStamp']
except:
print('Could not extract image time from file')
return False, np.nan
time_delta = (stamp_time - cycle_time)/1e9
if time_delta > 0 and time_delta < 11:
return True, time_delta
else:
return False, time_delta | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/TimingCheck.py | TimingCheck.py |
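# Standalone demo (illustrative; values are made up) of the local-time ->
# epoch-nanosecond conversion the checks above rely on:
#
#   import datetime, pytz
#   GVA = pytz.timezone('Europe/Zurich')
#   cycle = GVA.localize(datetime.datetime(2017, 6, 2, 15, 30, 0), is_dst=None)
#   image = GVA.localize(datetime.datetime(2017, 6, 2, 15, 30, 4), is_dst=None)
#   time_delta = (1e9*image.timestamp() - 1e9*cycle.timestamp())/1e9   # 4.0 s
#   in_time = 0 < time_delta < 10   # the acceptance window used for camera images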
def returnCamProps(runNumber):
if runNumber <= 23:
# cameraList = ['/AwakeEventData/BOVWA.01TT41.CAM1/ExtractionImage/imageRawData',
# '/AwakeEventData/BOVWA.02TT41.CAM2/ExtractionImage/imageRawData',
# '/AwakeEventData/BOVWA.03TT41.CAM3/ExtractionImage/imageRawData',
# '/AwakeEventData/BOVWA.04TT41.CAM4/ExtractionImage/imageRawData',
# '/AwakeEventData/BOVWA.05TT41.CAM5/ExtractionImage/imageRawData',
# '/AwakeEventData/TT41.BTV.412350/Image/imageSet',
# '/AwakeEventData/TT41.BTV.412353/Image/imageSet',
# '/AwakeEventData/TT41.BTV.412426/Image/imageSet',
# '/AwakeEventData/TT41.BTV.412442/Image/imageSet',
# '/AwakeEventData/XMPP-STREAK/StreakImage/streakImageData']
# cameraWidth = [1280,
# 1282,
# 1920,
# 1280,
# 1626,
# 385,
# 385,
# 391,
# 385,
# 672]
# cameraHeight = [1024,
# 1026,
# 1200,
# 960,
# 1236,
# 285,
# 285,
# 280,
# 385,
# 512]
cameraList = ['/AwakeEventData/TT41.BTV.412350/Image/imageSet',
'/AwakeEventData/TT41.BTV.412353/Image/imageSet',
'/AwakeEventData/TT41.BTV.412426/Image/imageSet',
'/AwakeEventData/TT41.BTV.412442/Image/imageSet',
'/AwakeEventData/XMPP-STREAK/StreakImage/streakImageData']
cameraWidth = [385,
385,
391,
385,
672]
cameraHeight = [285,
285,
280,
285,
512]
dictCamProps = {'cameraList': cameraList,
'cameraWidth': cameraWidth,
'cameraHeight': cameraHeight}
return dictCamProps | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/returnCamProps.py | returnCamProps.py |
from returnCamProps import returnCamProps
def returnBranchData(file_maps,branches,InputParsed):
# Check to see if the images should be formatted into a 2D array
formatImages = False
if 'formatImages' in InputParsed.Flags:
formatImVal = InputParsed.Flags['formatImages']
if formatImVal:
formatImages = True
# Loop over k = runNumber
dictBranchData = {}
for k in file_maps.keys():
file_map = file_maps[k]
allBranches = file_map['MapBranch']
# Get camera properties for runNumber k
if formatImages:
cam_props = returnCamProps(int(k))
Branches = []
DataType = []
DataSize = []
DataDim1 = []
DataDim2 = []
# Loop over branches
for b in branches:
try: # check if branch exists for given run number
indB = allBranches.index(b)
dType = file_map['DataType'][indB]
dSize = file_map['DataSize'][indB]
dDim1 = file_map['DataDim1'][indB]
dDim2 = file_map['DataDim2'][indB]
# Change data dimensions for images
if formatImages:
if b in cam_props['cameraList']:
indC = cam_props['cameraList'].index(b)
dDim1 = cam_props['cameraHeight'][indC]
dDim2 = cam_props['cameraWidth'][indC]
                        if dDim1*dDim2 != dSize:
                            raise IOError('Camera size does not match array size')
# Add data for each branch
Branches.append(b)
DataType.append(dType)
DataSize.append(dSize)
DataDim1.append(dDim1)
DataDim2.append(dDim2)
except:
print('Branch '+b+' does not exist for run number '+k)
#print('Here')
# Add dictionary for each run number
dictRunNMap = {'Branches': Branches,
'DataType': DataType,
'DataSize': DataSize,
'DataDim1': DataDim1,
'DataDim2': DataDim2}
dictBranchData[k] = dictRunNMap
    if len(file_maps.keys()) > 1:
        # Build a "master map" for the ntuple: for each branch, take its
        # metadata from the first run number that contains it
        Branches, DataType, DataSize, DataDim1, DataDim2 = [], [], [], [], []
        for b in branches:
            for k in dictBranchData:
                runMap = dictBranchData[k]
                if b in runMap['Branches']:
                    ind = runMap['Branches'].index(b)
                    Branches.append(b)
                    DataType.append(runMap['DataType'][ind])
                    DataSize.append(runMap['DataSize'][ind])
                    DataDim1.append(runMap['DataDim1'][ind])
                    DataDim2.append(runMap['DataDim2'][ind])
                    break
        dictBranchData['master'] = {'Branches': Branches,
                                    'DataType': DataType,
                                    'DataSize': DataSize,
                                    'DataDim1': DataDim1,
                                    'DataDim2': DataDim2}
    else:
        dictBranchData['master'] = dictRunNMap
return dictBranchData | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/returnBranchData.py | returnBranchData.py |
import os
import numpy as np
def returnPlasmaDensity(eventTimeStamps):
map_path = os.environ['AAT']+'ntupling/map_files/'
den_file = 'rb_data.csv'
den_info = open(map_path+den_file,'r')
timeStamp = []
USDensity = []
DSDensity = []
DensGrads = []
US_Valves = []
DS_Valves = []
USBoolean = []
DSBoolean = []
USWarning = []
DSWarning = []
for idx,line in enumerate(den_info):
if idx == 0:
continue
lineSplit = line.split(',')
ts = int(lineSplit[0].lstrip())
up = float(lineSplit[1].lstrip())
dn = float(lineSplit[2].lstrip())
gr = float(lineSplit[3].lstrip())
uv = int(lineSplit[4].lstrip())
dv = int(lineSplit[5].lstrip())
ub = lineSplit[6].lstrip()
db = lineSplit[7].lstrip()
uw = lineSplit[8].lstrip()
dw = lineSplit[9].lstrip()
timeStamp.append(ts)
USDensity.append(up)
DSDensity.append(dn)
DensGrads.append(gr)
US_Valves.append(uv)
DS_Valves.append(dv)
if ub == 'True':
USBoolean.append(True)
elif ub == 'False':
USBoolean.append(False)
if db == 'True':
DSBoolean.append(True)
elif db == 'False':
DSBoolean.append(False)
USWarning.append(uw)
        DSWarning.append(dw.rstrip('\n'))
den_info.close()
US_densities = []
DS_densities = []
Gradients = []
US_valve = []
DS_valve = []
US_dataBool = []
DS_dataBool = []
US_warning = []
DS_warning = []
for eTS in eventTimeStamps:
try:
ind = timeStamp.index(eTS)
US_densities.append(USDensity[ind])
DS_densities.append(DSDensity[ind])
Gradients.append(DensGrads[ind])
US_valve.append(US_Valves[ind])
DS_valve.append(DS_Valves[ind])
US_dataBool.append(USBoolean[ind])
DS_dataBool.append(DSBoolean[ind])
US_warning.append(USWarning[ind])
DS_warning.append(DSWarning[ind])
except:
print('Warning: Time Stamp Out of Range')
US_densities.append(np.nan)
DS_densities.append(np.nan)
Gradients.append(np.nan)
US_valve.append(np.nan)
DS_valve.append(np.nan)
US_dataBool.append(False)
DS_dataBool.append(False)
US_warning.append('')
DS_warning.append('')
return (US_densities, DS_densities, Gradients, US_valve, DS_valve,
US_dataBool, DS_dataBool, US_warning, DS_warning) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/returnPlasmaDensity.py | returnPlasmaDensity.py |
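# Usage sketch (illustrative): the timestamps are the integer nanosecond
# prefixes of the event-builder file names, e.g. 1496354911335000000_40_25.h5.
#
#   (us_den, ds_den, grad, us_valve, ds_valve,
#    us_ok, ds_ok, us_warn, ds_warn) = returnPlasmaDensity([1496354911335000000])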
import glob
import datetime as dt
from pytz import timezone
import numpy as np
import time
import os
#
# special time function: returns the list of event files whose epoch timestamp lies within the cut bounds
#
#header = '/user/awakeop/event_data/'
def fileListfromTime(header,kwords,*args):
# hopefully specified epoch timestamp
header=kwords['searchDir'].value[0]
#header=header[0]
if header[-1] !='/':
header+='/'
try:
t_low=kwords['/AwakeEventInfo/timestamp'].lower_bound.tocompare/1e9
#print('{0:09f}'.format(t_low))
except:
t_low=0
try:
t_high=kwords['/AwakeEventInfo/timestamp'].upper_bound.tocompare/1e9
#print('{0:09f}'.format(t_high))
except:
        t_high=1e10  # hard-coded far-future default when no upper bound is given
#print(t_low,t_high)
# next, create epoch timestamp
start = t_low
end = t_high
dt_low=dt.date.timetuple(dt.date.fromtimestamp(t_low))
dt_high=dt.date.timetuple(dt.date.fromtimestamp(t_high))
# next, create the list of dates to search
years = np.arange(dt_low[0],dt_high[0]+1)
months = np.arange(dt_low[1],dt_high[1]+1)
if months.size==1:
days=[np.arange(dt_low[2],dt_high[2]+1)]
else:
days=[]
for k in range(months.size):
if k==months.size-1:
days+=[np.arange(1,dt_high[2]+1)]
if k==0:
days+=[np.arange(dt_low[2],31+1)]
if k<months.size-1 and k>0:
days+=[np.arange(1,31+1)]
file_list = []
for y in range(years.size):
for m in range(months.size):
for d in days[m]:
date_path = header + str(years[y]) + '/' + '{0:02d}'.format(months[m]) + '/' + '{0:02d}'.format(d) + '/'
#print(y)
#print(date_path)
files = glob.glob(date_path + '*.h5')
files.sort(key=os.path.getmtime)
for f in files:
h5_file = os.path.basename(f)#.rsplit('/',1)[1]
ns_timeStamp = float(h5_file.split('_',1)[0])/1e9
if ns_timeStamp > start and ns_timeStamp < end:
file_list.append(f)
return file_list
#
# List of Standard Keywords that are always allowed
# please only edit when you are sure what you are doing
#
standardKeywords = {
    'searchDir': fileListfromTime,
    '/AwakeEventInfo/timestamp': (lambda x=None, *args, **kwargs: np.array([True])),
    # further cuts can be registered here, e.g. 'laser_on': laserFKT, 'RbValveDown': rbDownFKT, ...
}
standardFlags=['targetDir','ntupling','nTupling'] | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/cutParserDefines.py | cutParserDefines.py |
import numpy as np
import h5py
from returnFileData import returnFileData
from returnFileMap import returnFileMap
from returnBranchData import returnBranchData
from returnCamProps import returnCamProps
def createNtuples (file_list,InputParsed):
if not len(file_list):
print('Error:Empty file list')
return
# Get information on the files to be ntupled
dictFileData = returnFileData(file_list)
runNums = np.unique(dictFileData['FileRunN'])
file_maps = {}
cam_props = {}
for i in runNums:
file_map = returnFileMap(i)
file_maps[str(i)] = file_map
cam_prop = returnCamProps(i)
cam_props[str(i)] = cam_prop
# Get information on the branches to be ntupled
devices = InputParsed.argDevices
branches = [d[7:] for d in devices]
# Add cut branches
for k in InputParsed.args.keys():
if k not in branches:
branches.append(k)
dictBranchData = returnBranchData(file_maps,branches,InputParsed)
# How many files are there?
nFiles = len(file_list)
fNum = range(nFiles)
nBranch = len(branches)
# Now we a ready to ntuple
ntupleDir = InputParsed.Flags['ntupleDir'][0]
ntupleFile = InputParsed.Flags['ntupleFile'][0]
nTup = h5py.File(ntupleDir+ntupleFile, "w")
info_qual = [True] * nFiles
# Add file data to ntuple
Dataset = nTup.create_dataset('/file_info/FileName', data=str(dictFileData['FileName']))
Dataset = nTup.create_dataset('/file_info/FileName.quality', data=info_qual)
Dataset = nTup.create_dataset('/file_info/FileTime', data=dictFileData['FileTime'])
Dataset = nTup.create_dataset('/file_info/FileTime.quality', data=info_qual)
Dataset = nTup.create_dataset('/file_info/RunNumber', data=dictFileData['FileRunN'])
Dataset = nTup.create_dataset('/file_info/RunNumber.quality', data=info_qual)
Dataset = nTup.create_dataset('/file_info/EvtNumber', data=dictFileData['FileEvtN'])
Dataset = nTup.create_dataset('/file_info/EvtNumber.quality', data=info_qual)
# Add branch data to ntuple
BranchData = dictBranchData['master']
count = 0
dec = 0
for branch in branches:
#print(branch)
# Get branch info from 'master'
indB = BranchData['Branches'].index(branch)
dType = BranchData['DataType'][indB]
dDim1 = BranchData['DataDim1'][indB]
dDim2 = BranchData['DataDim2'][indB]
# Allocate data arrays based on branch info
# from 'master' branch info
if dType == 'str' and dDim1 == 1:
my_data = []
elif dType == 'str' and dDim1 != 1:
my_data = []
else:
if dDim2 == 0:
my_data = np.zeros((dDim1,nFiles),dType)
else:
my_data = np.zeros((dDim1,dDim2,nFiles),dType)
# Set data quality to false for this branch
my_qual = [False] * nFiles
# Now loop over files and fill in data
for f,i in zip(file_list,fNum):
count+=1
if np.floor(100*count/(nBranch*nFiles)) > dec:
print(str(dec)+'%')
dec+=10
#print(f)
# Load AEB file
h5_file = h5py.File(f,'r')
runNum = dictFileData['FileRunN'][i]
# Data size may change between runs . . .
nBranchData = dictBranchData[str(runNum)]
indnB = nBranchData['Branches'].index(branch)
dnType = BranchData['DataType'][indnB]
dnDim1 = BranchData['DataDim1'][indnB]
dnDim2 = BranchData['DataDim2'][indnB]
            if dnType != dType:
                print('Data type cannot change between runs. Branch excluded from nTuple')
                continue
if dnType == 'str' and dnDim1 == 1:
try:
my_str = h5_file[branch][0].decode()
my_data.append(my_str)
my_qual[i] = True
except:
my_data.append('')
my_qual[i] = False
elif dnType == 'str' and dnDim1 != 1:
try:
sub_data = []
for n_str in h5_file[branch][0]:
my_str = n_str.decode()
sub_data.append(my_str)
my_data.append(sub_data)
my_qual[i] = True
except:
my_data.append(['']*dnDim1)
my_qual[i] = False
elif dnType != 'str':
if dnDim2 == 0:
try:
my_data[:,i] = h5_file[branch].value
my_qual[i] = True
except:
my_qual[i] = False
else:
try:
tmp_dat = h5_file[branch][:]
if branch in cam_props[str(runNum)]['cameraList']:
#print(branch)
indC = cam_props[str(runNum)]['cameraList'].index(branch)
height = cam_props[str(runNum)]['cameraHeight'][indC]
width = cam_props[str(runNum)]['cameraWidth'][indC]
#print('Height = ' + str(height) + ', Width = ' + str(width))
dat = tmp_dat.reshape((height,width))
#print(dat.shape)
my_data[:,:,i] = dat
else:
my_data[:,:,i] = tmp_dat
my_qual[i] = True
except:
my_qual[i] = False
if dType == 'str':
Dataset = nTup.create_dataset(branch, data=str(my_data))
Dataset = nTup.create_dataset(branch+'.quality', data=my_qual)
else:
Dataset = nTup.create_dataset(branch, data=my_data)
Dataset = nTup.create_dataset(branch+'.quality', data=my_qual)
nTup.close() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/createNtuple.py | createNtuple.py |
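# Usage sketch (illustrative; assumes the cut file also registers the ntupleDir
# and ntupleFile flags that createNtuples reads from InputParsed.Flags):
#
#   from cutParser import inputParser
#   parsed = inputParser('my_cuts.txt')    # parses the cuts and builds parsed.flist
#   good_files, _ = parsed()               # files whose events pass all cuts
#   createNtuples(good_files, parsed)      # writes Flags['ntupleDir'] + Flags['ntupleFile']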
"""
File parser; has to stay Python 2.7 compatible.
Open question: is UTC the right timezone for the time conversion in _awakeTypeconversion()?
"""
'''
Imports
'''
import h5py
import numpy as np
import copy
import cutParserDefines
"""
Comparison class
"""
class comparator:
def __init__(self,string,cmptype,tocomp=None):
self.compare=None
self.cmptype=cmptype
self.tocompare=tocomp
self.string=string
self.setcompare()
def setcompare(self):
if self.string=='<=' or self.string =='=<':
self.compare=lambda x: self.cmptype.__le__(x,self.tocompare)
return
if self.string=='>=' or self.string =='=>':
self.compare=lambda x: self.cmptype.__ge__(x,self.tocompare)
return
if self.string=='>':
self.compare=lambda x: self.cmptype.__gt__(x,self.tocompare)
return
if self.string=='<':
self.compare=lambda x: self.cmptype.__lt__(x,self.tocompare)
return
if self.string=='==' or self.string== '=':
self.compare=lambda x: self.cmptype.__eq__(x,self.tocompare)
return
def __call__(self,other,casttype=None):
if other == None:
return np.array([False])
if self.compare is not None and self.tocompare is not None:
try:
return self.compare(np.nan_to_num(other.astype(self.tocompare.dtype)))
except:
return np.array([False])
def _awakeTypeconversion(x):
#
# type conversion is simple:
# a) float
# b) if not a) time/date
# c) if not b) try bool
# d) if not c) then use as string
#
import copy
import re
import calendar
import time
import datetime
import pytz
local = pytz.timezone('Europe/Zurich')
regExpr=re.compile('\s*[tTfF][rRaA][uUlL][sSeE]([eE])?\s*')
reg=re.compile('\[([0-9]+\s?,?)+|(nan\s?,?)*|(inf\s?,?)*\]')
buff=copy.deepcopy(x)
try:
x=np.array(buff.strip('[').strip(']').split(','),dtype=np.dtype('f'))
return x, type(x)
except:
try:
# in seconds, becomes ns and then datatype is numpy.int64
# beware: datatype conversion is done in compare
# is timezone correct? -> check with spencer
# -3600 is a workaround
#x=np.array(((calendar.timegm(time.strptime(x.replace('/',':').replace('-',':').strip(),"%d:%m:%Y %H:%M:%S"))-3600)*1e9),dtype='f')
x=x.replace('/',':').replace('-',':').strip()
dtUTC = datetime.datetime.strptime(x,'%d:%m:%Y %H:%M:%S')
dtLOC = local.localize(dtUTC, is_dst=None)
x = 1e9*dtLOC.timestamp()
return x, type(x)
except:
m=regExpr.match(buff)
if m:
x=np.array(bool(buff),dtype='b')
return x,type(x)
else:
m=reg.match(buff)
if m.group()!='':
x=np.nan_to_num(np.fromstring(buff.strip('[').strip(']'),dtype=np.dtype('f'),sep=','))#,dtype=np.dtype('f')
return x,type(x)
else:
x=np.array([buff])
return x,type(x)
"""
Things that need to be converted: time -> timestamp
Allow following operators:
<, <=, >, >=, ==
"""
class inputObject:
import numpy as np
class cmpObjects:
cmpList=['<=','=<','<','==','>=','=>','>','='] #lazy solution to put '=' at the end
def __init__(self):
pass
def __iter__(self):
return iter(self.cmpList)
cmplist=cmpObjects()
def __init__(self,line):
def assignBound(string,val):
self.isinterval=True
if string=='<=' or string =='<' or string=='=<':
self.upper_bound=comparator(string,type(val),val)
if string=='=>' or string =='>' or string=='>=':
self.lower_bound=comparator(string,type(val),val)
elif string=='=' or string=='==':
self.eqval=comparator(string,type(val),val)
self.lower_bound=self.eqval
self.upper_bound=self.eqval
self.isinterval=False
def lineattr(line,string,attr='name',string2=None):
if type(line) is not type(str()):
return None
if len(line.split(string))>1:
val,mtype=_awakeTypeconversion(line.split(string)[1].strip())
setattr(self,attr,line.split(string)[0].strip())
assignBound(string,val)
return val
            # if no separator string is matched, the line is apparently meant to name a device;
            # autoset is True
buff=[k for k in self.cmplist]
buffBool=False
for k in buff:
if line.find(k)!=-1:
buffBool=True
if buffBool==False:
setattr(self,attr,'DEVICE:'+line.strip())
assignBound('==',True)
self.isdevice=True
self.value='DEVICE:'+line.strip()
return None
return None
self.name=None
self.eqval=None
self.lower_bound=None
self.upper_bound=None
self.isinterval=True
self.value=None
self.isdevice=False
for k in self.cmplist:
buff=lineattr(line,k,'name')
if buff is not None: #if we remove the break statement order doesnt matter in cmplist
self.value=buff
break
def __call__(self,other):
buff=np.zeros(int(other.size),dtype=other.dtype)
if type(other) == type(np.array([])):
buff=other
else:
other.read_direct(buff)
if self.isinterval is False:
return self.eqval(buff,other.dtype).all()
if self.lower_bound is None:
return self.upper_bound(buff,other.dtype)
if self.upper_bound is None:
return self.lower_bound(buff,other.dtype).all()
return (self.lower_bound(buff,other.dtype).all() and self.upper_bound(buff,other.dtype)).all()
def __str__(self):
try:
return str(self.name+' ' +self.val)
except:
return str(self.name)
def append(self,other,name=None):
        if other.eqval is not None:  # ??? uncertain
self.eqval=other.eqval
if other.lower_bound is not None:
self.lower_bound=other.lower_bound
self.isinterval=True
if other.upper_bound is not None:
self.upper_bound=other.upper_bound
self.isinterval=True
if name is not None:
self.name=other.name
self.value=other.value
return self
class specialInputObject(inputObject):
def __init__(self,x,fkt=None):
inputObject.__init__(self,x)
self.f=fkt
def __call__(self,*args,**kwargs):
if len(args)>0:
if isinstance(args[0],h5py.File):
return inputObject.__call__(self,self.f(args[0],kwargs))
#except:
if self.f is not None:
return self.f(self.value,*args,**kwargs)
"""
/bla/blub <= sets upper bound
/bla/blub >= sets lower bound
/bla/blub == sets equality
last operator set is used for __call__ method
"""
#
# setFileList defines keyword that creates a filelist
#
class inputParser:
def __init__(self,x,*args,setFileList='searchDir',setStandardKeyword=cutParserDefines.standardKeywords,setStandardFlags=cutParserDefines.standardFlags,**kwargs):
self.specialKeywords={}
self.flist=None
self.Flags={}
specialNames={}
for l,k in setStandardKeyword.items():
specialNames[l]=k
for l,k in enumerate(args):
if type(k)==type(tuple()):
specialNames[k[0]]=k[1]
for l,k in kwargs.items():
specialNames[l]=k
self.path=str(x) #python 2.7 kompabilität
argDict={}
content=[]
self.argDevices=[]
with open(self.path,'r') as f:
content=f.read().splitlines()
for buff in content:
            buff=buff.split('#')[0].strip() # strip comments
            if buff=='': # skip empty lines and comment-only lines
continue
# split now for <=,==,=>,<,>
# mybuff=specialInputObject(buff)
mybuff=inputObject(buff)
if mybuff.name in specialNames.keys():
if mybuff.name in self.specialKeywords.keys():
self.specialKeywords[mybuff.name]=self.specialKeywords[mybuff.name].append(mybuff,mybuff.name)
else:
#print(mybuff.name)
self.specialKeywords[mybuff.name]=specialInputObject(buff)
(self.specialKeywords[mybuff.name]).f=specialNames[mybuff.name]
#continue
elif mybuff.name in setStandardFlags:
self.Flags[mybuff.name]=mybuff.value
#continue
else:
if mybuff.name in argDict.keys() and mybuff.isdevice is not True:
argDict[mybuff.name]=argDict[mybuff.name].append(mybuff)
elif mybuff.isdevice is not True:
argDict[mybuff.name]=mybuff
elif mybuff.isdevice is True:
self.argDevices+=[mybuff.name]
self.args=argDict
if setFileList is not None and setFileList in self.specialKeywords.keys():
self.flist=self.specialKeywords[setFileList](self.specialKeywords)
            del self.specialKeywords[setFileList]  # delete the setFileList entry
def __call__(self,h5file=None,*args,**kwargs):
if h5file is None and self.flist is not None:
return self.__call__(self.flist)
if type(h5file)==type(list()):
buff=[]
for k in range(0,len(h5file)):
buff+=[self.__call__(h5file[k])]
h5file=(buff,h5file)
buff=np.array(h5file[1])[np.where(buff)].tolist()
return buff,h5file
try:
f=h5file
if type(h5file) is type(str()):
f=h5py.File(h5file,'r')
for k in self.args.keys():
if k not in f:
f.close()
return False
if not self.args[k](f[k]):
f.close()
return False
for l,k in self.specialKeywords.items():
if not k(*((f,),args),**kwargs):
f.close()
return False
f.close()
return True
except OSError:
#raise IOError('Input file not found, please provide a string or a filehandle')
return False
def __repr__(self):
return repr(self.args) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/cutParser.py | cutParser.py |
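# Cut-file syntax accepted by inputParser (reconstructed from inputObject above;
# the paths and values are illustrative). One condition per line, '#' starts a
# comment, dates become epoch nanoseconds, and a bare dataset path with no
# operator marks a device to be carried into the ntuple:
#
#   searchDir = /user/awakeop/event_data/
#   /AwakeEventInfo/timestamp >= 02-06-2017 00:00:00
#   /AwakeEventInfo/timestamp <= 03-06-2017 00:00:00
#   /AwakeEventData/TT41.BCTF.412340/Acquisition/totalIntensityPreferred > 1e11
#   /AwakeEventData/XMPP-STREAK/StreakImage/streakImageData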
import os
import numpy as np
def returnAlignmentData(eventTimeStamps):
map_path = os.environ['AAT']+'ntupling/map_files/'
align_file = 'LaserAlignmentData.csv'
align_info = open(map_path+align_file,'r')
EvtsTimeStamp = []
Cam3TimeStamp = []
Cam3XPos = []
Cam3YPos = []
Cam3RPos = []
Cam5TimeStamp = []
Cam5XPos = []
Cam5YPos = []
Cam5RPos = []
Cam3TimeBool = []
Cam5TimeBool = []
for idx,line in enumerate(align_info):
if idx == 0:
continue
lineSplit = line.split(',')
evtts_str = lineSplit[0].lstrip()
cam3ts_str = lineSplit[1].lstrip()
cam3x = float(lineSplit[2].lstrip())
cam3y = float(lineSplit[3].lstrip())
cam3r = float(lineSplit[4].lstrip())
cam5ts_str = lineSplit[5].lstrip()
cam5x = float(lineSplit[6].lstrip())
cam5y = float(lineSplit[7].lstrip())
cam5r = float(lineSplit[8].lstrip())
cam3b_str = lineSplit[9].lstrip()
        cam5b_str = lineSplit[10].strip()  # strip the trailing newline on the last column
if evtts_str == 'nan':
EvtsTimeStamp.append(np.nan)
else:
EvtsTimeStamp.append(int(evtts_str))
if cam3ts_str == 'nan':
Cam3TimeStamp.append(np.nan)
else:
Cam3TimeStamp.append(int(cam3ts_str))
Cam3XPos.append(cam3x)
Cam3YPos.append(cam3y)
Cam3RPos.append(cam3r)
if cam5ts_str == 'nan':
Cam5TimeStamp.append(np.nan)
else:
            Cam5TimeStamp.append(int(cam5ts_str))
Cam5XPos.append(cam5x)
Cam5YPos.append(cam5y)
Cam5RPos.append(cam5r)
if cam3b_str == 'True':
Cam3TimeBool.append(True)
elif cam3b_str == 'False':
Cam3TimeBool.append(False)
if cam5b_str == 'True':
Cam5TimeBool.append(True)
elif cam5b_str == 'False':
Cam5TimeBool.append(False)
align_info.close()
Cam3TimeStamps = []
Cam3XPositions = []
Cam3YPositions = []
Cam3RPositions = []
Cam3DataStatus = []
Cam5TimeStamps = []
Cam5XPositions = []
Cam5YPositions = []
Cam5RPositions = []
Cam5DataStatus = []
for eTS in eventTimeStamps:
try:
ind = EvtsTimeStamp.index(eTS)
Cam3TimeStamps.append(Cam3TimeStamp[ind])
Cam3XPositions.append(Cam3XPos[ind])
Cam3YPositions.append(Cam3YPos[ind])
Cam3RPositions.append(Cam3RPos[ind])
if (np.isnan(Cam3XPos[ind]) or np.isnan(Cam3YPos[ind]) or
np.isnan(Cam3RPos[ind]) or not Cam3TimeBool[ind]):
Cam3DataStatus.append(False)
else:
Cam3DataStatus.append(True)
Cam5TimeStamps.append(Cam5TimeStamp[ind])
Cam5XPositions.append(Cam5XPos[ind])
Cam5YPositions.append(Cam5YPos[ind])
Cam5RPositions.append(Cam5RPos[ind])
if (np.isnan(Cam5XPos[ind]) or np.isnan(Cam5YPos[ind]) or
np.isnan(Cam5RPos[ind]) or not Cam5TimeBool[ind]):
Cam5DataStatus.append(False)
else:
Cam5DataStatus.append(True)
except:
print('Warning: Timestamp not in list of events')
Cam3TimeStamps.append(np.nan)
Cam3XPositions.append(np.nan)
Cam3YPositions.append(np.nan)
Cam3RPositions.append(np.nan)
Cam3DataStatus.append(False)
Cam5TimeStamps.append(np.nan)
Cam5XPositions.append(np.nan)
Cam5YPositions.append(np.nan)
Cam5RPositions.append(np.nan)
Cam5DataStatus.append(False)
return (Cam3XPositions, Cam3YPositions, Cam3RPositions, Cam3TimeStamps, Cam3DataStatus,
Cam5XPositions, Cam5YPositions, Cam5RPositions, Cam5TimeStamps, Cam5DataStatus) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/ntupling/returnAlignmentData.py | returnAlignmentData.py |
import numpy as np
import pyjapc
import os
import sys
os.environ['AAT'] = '/user/awakeop/AWAKE_ANALYSIS_TOOLS/'
#os.environ['AAT'] = '/afs/cern.ch/user/s/sgess/AWAKE_ANALYSIS_TOOLS/'
sys.path.append(os.environ['AAT']+'analyses/')
import frame_analysis as fa
''' Class for AWAKE Camera Properties '''
class AwakeCamera():
def __init__(self,device,name,system,mode,japc):
self.device = device
self.name = name
self.system = system
self.mode = mode
self.japc = japc
self.run_ana = True
self.fit_gauss = False
self.median_filter = False
self.fillCamHandles()
self.initCam()
def fillCamHandles(self):
if self.system == 'PXI':
self.settings_prop = 'PublishedSettings'
self.exposure_field = 'exposureTime'
self.delay_field = 'delayTime'
self.height_str = 'height'
self.width_str = 'width'
self.pixel_str = 'pixelSize'
self.x_ax_str = ''
self.y_ax_str = ''
self.timestamp_str = 'imageTimeStamp'
if self.mode == 'EXT':
self.image_prop = 'ExtractionImage'
self.image_str = 'imageRawData'
self.timingSelector = 'SPS.USER.AWAKE1'
elif self.mode == 'LASER':
self.image_prop = 'CameraImage'
self.image_str = 'image'
self.timingSelector = 'SPS.USER.ALL'
else:
                print('Unknown camera mode: ' + self.mode)
return
self.acq_string = self.device + '/' + self.image_prop
self.sys_string = self.device + '/' + self.settings_prop
elif self.system == 'BTV':
self.image_prop = 'Image'
self.settings_prop = ''
self.image_str = 'imageSet'
self.height_str = 'nbPtsInSet1'
self.width_str = 'nbPtsInSet2'
self.pixel_str = ''
self.x_ax_str = 'imagePositionSet1'
self.y_ax_str = 'imagePositionSet2'
self.timestamp_str = ''
if self.mode == 'EXT':
self.timingSelector = 'SPS.USER.AWAKE1'
elif self.mode == 'LASER':
self.device = self.device + '.LASER'
self.timingSelector = 'SPS.USER.ALL'
else:
                print('Unknown camera mode: ' + self.mode)
return
self.acq_string = self.device + '/' + self.image_prop
else:
            print('Unknown camera system: ' + self.system)
return
def initCam(self):
if self.system == 'PXI':
self.initPXI()
self.getSystem()
elif self.system == 'BTV':
self.initBTV()
else:
            print('Unknown camera system: ' + self.system)
return
if self.run_ana:
self.analyze()
def initPXI(self):
camData = self.async_get(self.acq_string)
if self.mode == 'EXT':
self.px_sz = camData[self.pixel_str]
elif self.mode == 'LASER':
self.px_sz = 5.0*camData[self.pixel_str]
else:
            print('Unknown camera mode: ' + self.mode)
return
self.image = camData[self.image_str]
self.width = camData[self.width_str]
self.height = camData[self.height_str]
x_ax = self.px_sz*np.arange(self.width)
self.x_ax = x_ax - np.mean(x_ax)
y_ax = self.px_sz*np.arange(self.height)
self.y_ax = y_ax - np.mean(y_ax)
self.roi = [self.x_ax[0],self.x_ax[-1],self.y_ax[0],self.y_ax[-1]]
def initBTV(self):
camData = self.async_get(self.acq_string)
im_vec = camData[self.image_str]
self.width = camData[self.width_str]
self.height = camData[self.height_str]
self.image = np.reshape(im_vec,(self.width,self.height))
self.x_ax = camData[self.x_ax_str]
self.y_ax = camData[self.y_ax_str]
self.px_sz = self.x_ax[1] - self.x_ax[0]
self.roi = [self.x_ax[0],self.x_ax[-1],self.y_ax[0],self.y_ax[-1]]
def getSystem(self):
sysData = self.async_get(self.sys_string)
self.exp_time = sysData[self.exposure_field]
self.del_time = sysData[self.delay_field]
def updateImage(self):
if self.system == 'PXI':
self.image = self.async_get(self.acq_string+'#'+self.image_str)
elif self.system == 'BTV':
im_vec = self.async_get(self.acq_string+'#'+self.image_str)
self.image = np.reshape(im_vec,(self.width,self.height))
else:
print('GTFOH')
return
if self.run_ana:
self.analyze()
def analyze(self):
self.frame_ana = fa.FrameAna(self.image,self.x_ax,self.y_ax,self.roi)
self.frame_ana.fit_gauss = self.fit_gauss
self.frame_ana.median_filter = self.median_filter
self.frame_ana.analyze_frame()
def async_get(self,param):
return self.japc.getParam(param,timingSelectorOverride=self.timingSelector)
def subCallback(self,name,image):
if self.system == 'PXI':
self.image = image
elif self.system == 'BTV':
self.image = np.reshape(image,(self.width,self.height))
else:
print('GTFOH')
return
if self.run_ana:
self.analyze()
def start_sub(self):
self.japc.subscribeParam(self.acq_string+'#'+self.image_str,self.subCallback)
self.sub_state = True
self.japc.startSubscriptions()
def start_ExtSub(self,extFunc):
self.japc.subscribeParam(self.acq_string+'#'+self.image_str,extFunc)
self.sub_state = True
self.japc.startSubscriptions()
def stop_sub(self):
#print('out dis bitch')
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
self.sub_state = False | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/japc_support/AwkCmra.py | AwkCmra.py |
import sys
sys.path.append('/user/rieger/')
from awakeIMClass import *
import numpy as np
import scipy as sp
import matplotlib as mpl
import pickle
import time
import os
slth = 3
filt = 'yes' #'yes' or 'no'
fstrength = 10
slc = int(512/slth)
def gaus(x,a,x0,sigma,c):
return a*np.exp(-(x-x0)**2/(2*sigma**2))+c
def bunchAnalysis(vec, slth, x,timeVal):
from scipy import ndimage
#time = get_SCtimelabel() #get correct timevector
slc = int(512/slth) #compute how many stripes will be analyzed, SLice Count
time = np.arange(0,timeVal[-1],timeVal[-1]/slc) #generate a vector for correct plotting dependent on slc
if filt is 'yes':
vec = ndimage.median_filter(vec,fstrength)
print('Image is filtered')
allgvalues = getGvalues(vec,x,slth) #compute all parameters for each stripe
amplitudes = allgvalues[:,0] #in counts/intensity
centroids = allgvalues[:,1] #in millimeters
sigmas = allgvalues[:,2] #in millimeters, *1000 for micrometers
integrals = allgvalues[:,0]*sigmas*np.sqrt(2*np.pi)
print('End of one Bunch analysis...................................................')
return amplitudes, centroids, sigmas, integrals, time
def get_SCtimelabel():
import h5py
file1 = h5py.File('/user/awakeop/event_data/2017/06/02/1496354911335000000_40_25.h5','r')
timelabel = list(file1['AwakeEventData']['TT41.BTV.412350.STREAK']['StreakImage']['streakImageTimeValues'])
timelabel[0] = -8.296966560000001 #correct the first value
timelabel[:] = [i+8.296966560000001 for i in timelabel] #shift all values s.t. we start at time t=0
return timelabel #time vector for the plots
def getDate(stamp):
from datetime import datetime
dt = datetime.fromtimestamp(stamp//1000000000)
date = dt.strftime('%d-%m-%d %H:%M:%S')
return date
def getGvalues(streakimage,x,slth):
from scipy.optimize import curve_fit
slth = slth #SLice THickness
slc = int(512/slth) #SLice Count
sections = np.arange(0,512,slth) #sections determined by slth
allgvalues = np.zeros((slc,4)) #Gauss values for each stripe will be saved here
c = 0
#print('x[0] is '+str(x[0])+' and x[-1] is '+str(x[-1]))
for i in sections: #compute the following for all stripes
if i+slth <=512: #check if we run out of image to compute stuff on
buffero = streakimage[i:i+slth,:] #selecting the values of the stripe with thickness slth
line = np.sum(buffero,0) #summing all values into a line
#print(line)
maximum = np.mean(line)*3 #computing guessing values for the gaus fit
x0 = x[345]
sigma = 1/4*np.abs(x[0]-x[-1]) #was at *1/10
c0 = np.mean(line)*0.99
#print(maximum,x0,sigma,c0)
try:
gvalues,error = curve_fit(gaus,x, line, p0=[maximum,x0,sigma,c0])
except: #fitting was not possible
gvalues = [0,0,0,0] #setting some value
print('No fitting possible, fit number '+str(c))
gvalues[2] = np.abs(gvalues[2])
allgvalues[c] = gvalues
#print(gvalues)
c = c+1
else:
break
return allgvalues #allgvalues has all the parameters of the fitted gaussians per stripe
def ImagePlot(plotax,fig,fixedaxes,japc,vec,something,SliceThickness, YesNoFilter, Filterstrength):
#TT41.BTV.412350.STREAK,XMPP-STREAK
print('ImagePlot executed............................................................................')
import time
time.sleep(1)
timestamp=japc.getParam('BOVWA.01TT41.CAM1/ExtractionImage#imageTimeStamp')
timerange=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeRange')
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData').reshape(512,672)
global slth
global filt
global strength
global slc
slth = int(SliceThickness)
filt = YesNoFilter
fstrength = int(Filterstrength)
slc = int(512/slth)
print('slth = '+str(slth))
if filt is 'yes':
filtertext = 'ndimage.median_filter('+str(fstrength)+') used'
else:
filtertext = ' '
'''
im prinzip hast du jetzt:
bild: vec (512-> timeachse, 672->space achse)
zeitachse: timeVal (beachte 0. wert ist usually schachsinn)
x-achse: fixedaxes
plotaxes (pyplot achse in die man ganz normal plotten kann)
kannst beliebige berechnungen machen (wie in diesen beispielen gemacht)
'''
#print(np.shape(vec))
if something is None:
something = 1.1*np.max(vec[:,300:400].sum()/100/512)
plotax.clear()
xmin = 250 #250
xmax = 422 #422 from 672
vec = vec[:,xmin:xmax]
plotax.imshow(np.fliplr(vec.T),extent=[timeVal[1],timeVal[-1],fixedaxes[0][xmin],fixedaxes[0][xmax]],vmin=400,vmax=np.mean(vec)*1.9,aspect='auto',cmap='jet')
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
'''
ax2.plot(time,np.abs(sigmas),'k.')
ax2.set_ylabel('Sigma',color='r')
ax2.set_xlim(timeVal[-1],timeVal[1])
'''
#BOVWA.01TT41.CAM1/ExtractionImage/imageTimeStamp
date = 'On '+getDate(timestamp)+', '
text = ', timescale: '+timerange+', '+filtertext
plotax.set_title(date+str(japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTime'))+text)
return
def SigmaAndAmplitude(plotax,fig,fixedaxes,japc,vec,something3):
plotax.clear()
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
time = np.linspace(timeVal[0],timeVal[-1],timeVal[-1]/slc)
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
plobj1=plotax.plot(time,sigmas*1000,'b.-',label='Sigma along the bunch')
plotax.set_ylim(200,900)
plotax.set_xlim(time[0],time[-1])
plotax.set_ylabel('Sigma [micrometers]', color='b')
plotax.yaxis.tick_left()
#plotax.set_title('Sigma along the bunch')
plotax.yaxis.set_label_position("left")
plotax.legend()
return
def amplitude(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
#print(amplitudes)
plobj1=plotax.plot(time,amplitudes,'b.-', label='Amplitude')
plotax.set_xlim(time[0],time[-1])
plotax.set_ylabel('Amplitude',color='b')
#plotax.set_title('Amplitude along the bunch')
plotax.legend()
return
def centroid(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
#print(centroids)
plobj1=plotax.plot(time,centroids,'b.-', label='Centroid')
plotax.set_xlim(time[0],time[-1])
plotax.set_ylim(-0.5,0.5)
plotax.set_ylabel('centroid [mm]',color='b')
#plotax.set_title('Location of the centroid')
plotax.legend()
return
def integrals(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
e=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeRange')
unit = e[-2]+e[-1]
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
plobj1=plotax.plot(time,integrals,'b.-',label='counts/slice')
plotax.set_xlim(time[0],time[-1])
plotax.set_xlabel('time ['+unit+']')
plotax.set_ylabel('counts/slice',color='b')
#plotax.set_title('Sum of counts per slice')
plotax.yaxis.set_label_position("left")
plotax.yaxis.tick_left()
plotax.legend()
print('Last function got called...............................................................')
return
if __name__=='__main__':
app = QApplication(sys.argv)
aw = AwakeWindow(["TT41.BCTF.412340/Acquisition#totalIntensityPreferred"],ImagePlot,SigmaAndAmplitude,amplitude,centroid,integrals,fixedaxes=(np.linspace(-8.7,8.7,672),),selector="SPS.USER.AWAKE1",name='Felipe Image',ImagePlot={'something':None,'SliceThickness':slth,'YesNoFilter':'yes','Filterstrength':10},SigmaAndAmplitude={'something3':2},amplitude={'something':None,'something2':2},centroid={'something':None,'something2':2},integrals={'something':None,'something2':2},reverse=True)
progname='felipePlots'
aw.setWindowTitle("%s" % progname)
aw.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
aw.show()
app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/felipeBI.py | felipeBI.py |
from __future__ import unicode_literals
"""
Spencer tries to make a thing
"""
''' Get all the things '''
import sys
import time
import os
import matplotlib as mpl
mpl.use('Qt5Agg')
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.backends.backend_qt5agg import NavigationToolbar2QT as NavigationToolbar
from matplotlib.figure import Figure
#import matplotlib.colors as colors
import matplotlib.dates as mdates
#import matplotlib.pyplot as plt
#from matplotlib import cm
import numpy as np
import pyjapc
import datetime
from PyQt5.QtWidgets import (QWidget, QLabel, QLineEdit, QComboBox, QCheckBox, QMessageBox, QGroupBox, QFormLayout, QTabWidget,
QTextEdit, QGridLayout, QVBoxLayout, QHBoxLayout, QApplication, QPushButton, QSizePolicy, QStatusBar, QRadioButton)
import PyQt5.QtCore as QtCore
from PyQt5.QtGui import (QIcon, QDoubleValidator, QIntValidator)
class Canvas(FigureCanvas):
def __init__(self, parent=None, width=5, height=4, dpi=100):
fig = Figure(figsize=(width, height), dpi=dpi, frameon=False, tight_layout=True)
self.axes = fig.add_subplot(111)
self.compute_initial_figure()
FigureCanvas.__init__(self, fig)
self.setParent(parent)
FigureCanvas.setSizePolicy(self, QSizePolicy.Expanding, QSizePolicy.Expanding)
FigureCanvas.updateGeometry(self)
def compute_initial_figure(self):
pass
class Plotter(Canvas):
def __init__(self, *args, **kwargs):
Canvas.__init__(self)
def compute_initial_figure(self):
t_ax = [1,2,3,4,5]
data = [1,2,3,4,5]
self.axes.plot(t_ax,data)
def update_figure(self):
self.axes.cla()
#print(self.xData)
#print(self.yData)
self.axes.plot(self.xData,self.yData,'o',color=self.color)
self.axes.set_xlabel(self.xLabel)
self.axes.set_ylabel(self.yLabel)
if self.time:
self.axes.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M'))
#self.axes.xaxis.set_major_locator(mdates.MinuteLocator(interval=5))
self.draw()
''' This is where my code starts '''
class Example(QWidget):
''' Init Self '''
def __init__(self):
super().__init__()
self.BCTF_dev = 'TT41.BCTF.412340/Acquisition#totalIntensityPreferred'
self.BCTF_name = 'Bunch Intensity'
self.BCTF_ns = True
self.LASER_dev = 'TSG41.AWAKE-LASER-DATA'
self.LASER_inds = range(15,19)
self.LASER_ns = True
self.TwoScreen_dev = 'TSG41.AWAKE-TWO-SCREEN-MEAS'
self.TwoScreen_inds = range(5,9)
self.TwoScreen_ns = False
self.CTR_dev = 'TSG41.AWAKE-CTR-WR4-WR8-MIXERS'
self.CTR_inds = range(1,8)
self.CTR_ns = True
self.StreakFFT_dev = 'TSG41.AWAKE-XMPP-FFTFREQ'
self.StreakFFT_inds = range(1,5)
self.StreakFFT_ns = False
self.LaserEng_dev = 'EMETER04/Acq#value'
self.LaserEng_name = 'Laser Energy'
self.LaserEng_ns = True
self.SPSdelay_dev = 'Sps2AwakeSynchro/ProtonDelayNs#delay'
self.SPSdelay_name = 'Laser-SPS Delay'
self.SPSdelay_ns = True
self.sps0time = 1111688.5
self.Density_dev = 'TSG41.AWAKE-DENSITY-DATA'
self.Density_inds = range(1,4)
self.Density_ns = True
self.getVal= '/ValueAcquisition#floatValue'
self.getName= '/NameAcquisition#nameValue'
self.bufferLength = 100
self.index = 0
self.initJAPC()
self.initUI()
''' JAPC initialization '''
def initJAPC(self):
self.japc = pyjapc.PyJapc("SPS.USER.AWAKE1")
''' Initialize GUI '''
def initUI(self):
plot_list = self.get_list_of_plots()
self.time_array = []
self.data_array = np.zeros((self.bufferLength,len(plot_list)-1))
self.main_widget = QWidget(self)
self.Plots = []
self.xCombos = []
self.yCombos = []
self.toolBars = []
self.zoomButt = []
self.homeButt = []
# Create Layout
grid = QGridLayout()
colors = ['b','r','g','m','c','k']
for i in range(6):
self.Plots.append(Plotter(self.main_widget, width=5, height=4, dpi=100))
self.Plots[i].color = colors[i]
xcb = QComboBox(self)
self.xCombos.append(xcb)
self.xCombos[i].addItems(plot_list)
self.xCombos[i].currentIndexChanged.connect(lambda index, caller=xcb: self.x_select(index,caller))
self.xCombos[i].setFixedWidth(100)
ycb = QComboBox(self)
self.yCombos.append(ycb)
self.yCombos[i].addItems(plot_list)
self.yCombos[i].setCurrentIndex(i+1)
self.yCombos[i].currentIndexChanged.connect(lambda index, caller=ycb: self.y_select(index,caller))
self.yCombos[i].setFixedWidth(100)
zpb = QPushButton('Zoom')
hpb = QPushButton('Home')
self.toolBars.append(NavigationToolbar(self.Plots[i], self))
self.toolBars[i].hide()
self.zoomButt.append(zpb)
self.zoomButt[i].setStyleSheet("background-color:#00f2ff")
self.zoomButt[i].clicked.connect(lambda state, caller=zpb: self.zoom(state, caller))
self.homeButt.append(hpb)
self.homeButt[i].setStyleSheet("background-color:#00f2ff")
self.homeButt[i].clicked.connect(lambda state, caller=hpb: self.home(state, caller))
x_label = QLabel('X-Axis:')
y_label = QLabel('Y-Axis:')
v_box = QVBoxLayout()
v_box.addWidget(x_label)
v_box.addWidget(self.xCombos[i])
v_box.addWidget(y_label)
v_box.addWidget(self.yCombos[i])
v_box.addWidget(self.zoomButt[i])
v_box.addWidget(self.homeButt[i])
v_box.addStretch()
h_box = QHBoxLayout()
h_box.addLayout(v_box)
h_box.addWidget(self.Plots[i])
h_box.addStretch()
grid.addLayout(h_box,np.floor_divide(i,2),np.mod(i,2))
self.setLayout(grid)
self.setGeometry(1600, 300, 1000, 1000)
# Make a window
self.setWindowTitle('Plots')
self.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
# Start the show
self.show()
self.start_subs()
''' Create list of things we plot '''
def get_list_of_plots(self):
temp = self.japc.getParam(self.StreakFFT_dev+self.getName)
streak_fft_names = temp[self.StreakFFT_inds]
temp = self.japc.getParam(self.TwoScreen_dev+self.getName)
two_screen_names = temp[self.TwoScreen_inds]
temp = self.japc.getParam(self.CTR_dev+self.getName)
ctr_dev_names = temp[self.CTR_inds]
temp = self.japc.getParam(self.LASER_dev+self.getName)
laser_dev_names = temp[self.LASER_inds]
temp = self.japc.getParam(self.Density_dev+self.getName)
dens_dev_names = temp[self.Density_inds]
plot_list = []
plot_list.append('Time')
plot_list.append(self.BCTF_name)
plot_list.append(self.LaserEng_name)
plot_list.extend(streak_fft_names)
plot_list.extend(two_screen_names)
plot_list.extend(ctr_dev_names)
plot_list.extend(laser_dev_names)
plot_list.extend(dens_dev_names)
plot_list.append(self.SPSdelay_name)
return plot_list
''' Zoom and Home '''
def zoom(self, state, caller):
pInd = self.zoomButt.index(caller)
self.toolBars[pInd].zoom()
def home(self, state, caller):
pInd = self.homeButt.index(caller)
self.toolBars[pInd].home()
''' Update plot selection '''
def x_select(self,index,caller):
pInd = self.xCombos.index(caller)
self.Update(pInd)
''' Update plot selection '''
def y_select(self,index,caller):
pInd = self.yCombos.index(caller)
self.Update(pInd)
''' Start Subs '''
def start_subs(self):
self.japc.subscribeParam(self.BCTF_dev,self.proc_plot_data,getHeader=True,unixtime=True)
self.japc.startSubscriptions()
''' What to do when you have the data '''
def proc_plot_data(self, name, paramValue, header):
time.sleep(7)
laserEng = self.japc.getParam(self.LaserEng_dev)
spsDelay = self.japc.getParam(self.SPSdelay_dev) - self.sps0time
laserAlign = self.japc.getParam(self.LASER_dev+self.getVal)[self.LASER_inds]
twoScreen = self.japc.getParam(self.TwoScreen_dev+self.getVal)[self.TwoScreen_inds]
CTR = self.japc.getParam(self.CTR_dev+self.getVal)[self.CTR_inds]
StreakFFT = self.japc.getParam(self.StreakFFT_dev+self.getVal)[self.StreakFFT_inds]
Density = self.japc.getParam(self.Density_dev+self.getVal)[self.Density_inds]
if self.index == (self.bufferLength-1):
self.data_array = np.roll(self.data_array,-1,axis=0)
del(self.time_array[0])
self.time_array.append(datetime.datetime.fromtimestamp(header['acqStamp']))
self.data_array[self.index,0] = paramValue
self.data_array[self.index,1] = laserEng
self.data_array[self.index,2:6] = StreakFFT
self.data_array[self.index,6:10] = twoScreen
self.data_array[self.index,10:17] = CTR
self.data_array[self.index,17:21] = laserAlign
self.data_array[self.index,21:24] = Density
self.data_array[self.index,24] = spsDelay
else:
self.time_array.append(datetime.datetime.fromtimestamp(header['acqStamp']))
self.data_array[self.index,0] = paramValue
self.data_array[self.index,1] = laserEng
self.data_array[self.index,2:6] = StreakFFT
self.data_array[self.index,6:10] = twoScreen
self.data_array[self.index,10:17] = CTR
self.data_array[self.index,17:21] = laserAlign
self.data_array[self.index,21:24] = Density
self.data_array[self.index,24] = spsDelay
# print('BCTF = '+str(self.data_array[self.index,0]))
# print('LasEng = '+str(self.data_array[self.index,1]))
# print('StkFFT = '+str(self.data_array[self.index,2:6]))
# print('2Screen = '+str(self.data_array[self.index,6:10]))
# print('CTR = '+str(self.data_array[self.index,10:17]))
# print('Laser = '+str(self.data_array[self.index,17:21]))
self.index += 1
self.Update(0)
self.Update(1)
self.Update(2)
self.Update(3)
self.Update(4)
self.Update(5)
''' Get Plot Stuff '''
def Update(self,ind):
if self.index > 1:
xInd = self.xCombos[ind].currentIndex()
yInd = self.yCombos[ind].currentIndex()
if xInd == 0:
self.Plots[ind].xData = self.time_array
self.Plots[ind].yData = self.data_array[0:self.index,yInd-1]
self.Plots[ind].time = True
else:
self.Plots[ind].xData = self.data_array[0:self.index,xInd-1]
self.Plots[ind].yData = self.data_array[0:self.index,yInd-1]
self.Plots[ind].time = False
self.Plots[ind].xLabel = self.xCombos[ind].currentText()
self.Plots[ind].yLabel = self.yCombos[ind].currentText()
self.Plots[ind].update_figure()
''' Stop Subs '''
def stop_subs(self):
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
''' Clear Subs '''
def clear_subs(self):
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
''' GTFO '''
def closeEvent(self, event):
self.clear_subs()
QWidget.closeEvent(self, event)
''' Start the GUI '''
if __name__ == '__main__':
app = QApplication(sys.argv)
ex = Example()
sys.exit(app.exec_()) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/Plotter_gui.py | Plotter_gui.py |
# -*- coding: utf-8 -*-
"""
Created on Sat May 27 08:51:34 2017
@author: Spencer&Karl
"""
import sys
import os.path as osp
import functools
import matplotlib as mpl
# Make sure that we are using QT5
mpl.use('Qt5Agg')
#mpl.use('Agg')
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.figure import Figure
import numpy as np
import pyjapc
from PyQt5.QtWidgets import (QWidget, QLabel, QLineEdit, QComboBox, QCheckBox,
QTextEdit, QGridLayout, QApplication, QPushButton, QSizePolicy, QMessageBox, QGroupBox, QFormLayout, QVBoxLayout, QHBoxLayout, QStatusBar)
from PyQt5.QtCore import QTimer
from PyQt5.QtGui import (QIcon, QDoubleValidator, QRegExpValidator)
import PyQt5.QtGui as QtGui
from PyQt5 import QtCore, QtWidgets
import queue as qe
import pickle
long=int
class AWAKException(Exception):
pass
"""
Spencers Canvas class renamed
"""
class awakeCanvas(FigureCanvas):
def __init__(self, parent=None, width=5, height=4, dpi=100,reverse=False,*args):
#self.fig = plt.figure(figsize=(width, height), dpi=dpi)
self.fig=Figure(figsize=(width, height), dpi=dpi,tight_layout=True)
self.nSubPlots=len([k for k in args if type(k) is type(lambda x:1)])
if reverse:
self.gs=mpl.gridspec.GridSpec(self.nSubPlots,1)
else:
self.gs=mpl.gridspec.GridSpec(1,self.nSubPlots)
#self.axes = [self.fig.add_subplot(int("1"+str(l+1)+"1")) for l in range(0,nSubPlots)] #maybe change later, dont like the subplot call
self.axes = {l:self.fig.add_subplot(self.gs[l]) for l in range(0,self.nSubPlots)} #maybe change later, dont like the subplot call
#for k in self.axes:
# k.hold(False)# We want the axes cleared every time plot() is called
#print(self.axes)
"""
Figure Canvas initialisation
"""
FigureCanvas.__init__(self, self.fig)
self.setParent(parent)
FigureCanvas.setSizePolicy(self, QSizePolicy.Expanding, QSizePolicy.Expanding)
FigureCanvas.updateGeometry(self)
class awakeScreen(awakeCanvas):
def __init__(self,*args, parent=None, width=5, height=4, dpi=100,**kwargs):
if 'reverse' in kwargs.keys():
reverse=kwargs['reverse']
print('Using reverse Image conf!:',reverse)
else:
reverse=False
super(awakeScreen,self).__init__(parent,width,height,dpi,reverse,*args)
self.f=[k for k in args if type(k) is type(lambda x:1)]
self.fnames=[k.__name__ for k in self.f]
self.fargs={}
for k in kwargs.keys():
if k in self.fnames:
self.fargs[k]=kwargs[k]
def __iter__(self):
return self.f.__iter__()
def __call__(self,*args):
#self._cleanAxes()
callDraw=[]
for k,name,ax in zip(self.f,self.fnames,self.axes.keys()):
if name in self.fargs.keys():
try:
# debugging check if all functions are called
#print('function',k,'axes',self.axes[ax],'args',args,'function args',self.fargs[name])
# works fine...
callDraw.append(k(self.axes[ax],*args,**self.fargs[name]))
except:
pass
else:
try:
# debugging check if all functions are called
#print('function',k,'axes',self.axes[ax],'args',args,'function args',self.fargs[name])
# works fine...
callDraw.append(k(self.axes[ax],*args))
except:
pass
if np.any(callDraw):
self.draw()
def _cleanAxes(self):
print(self.fig.axes)
print(self.axes)
for k in self.fig.axes:
if k not in self.axes:
self.fig.delaxes(k)
print(self.fig.axes)
print(self.axes)
def updateFunction(self,fun,replace=False,**kwargs):
if not replace:
if fun.__name__ in self.fnames:
i=0
while fun.__name__ != self.f[i].__name__:
i+=1
if i>10:
print('awakeJAPC|updateFunction: You sure that the functions exists?')
break
self.f[i]=fun
for k in kwargs.keys:
self.fargs[fun.__name__][k]=kwargs[k]
#self.fargs[fun.__name__]=kwargs
return True
if replace:
if not isinstance(fun):
fun=[fun]
if len(fun)>self.axes:
print('Did not update as there are too many functions!')
return False
self.f=fun
self.fnames=[k.__name__ for k in self.f]
self.fargs={}
for k in kwargs.keys():
if k in self.fnames:
self.fargs[k]=kwargs[k]
return True
def draw(self,*args,**kwargs):
super(awakeCanvas,self).draw()
class awakeJAPC(awakeScreen):
def __init__(self,subscriptions,*args,axes=None,selector="SPS.USER.AWAKE1",name='MyAwakeImage',autostart=False,**kwargs):
super(awakeJAPC,self).__init__(*args,**kwargs)#self.screen=awakeScreen(*args,**kwargs)
self.selector=selector
self.subs=awakeSubscriptions(subscriptions,selector,self._workQueue)
self.name=str(name)
self.fixedaxes=axes
self.inSubs=False# lazy solution
if autostart:
self.start()
def start(self,verbose=True):
self.subs.start()
def stop(self,verbose=True):
self.subs.stop()
def _workQueue(self,*args,**kwargs):
tpl=[self.fig,self.fixedaxes,self.subs.japc]
for k in range(len(self.subs)):
tpl.append(None)
while not self.subs.empty():
name,val=self.subs.get()
tpl[self.subs[name]+3]=val
tpl=tuple(tpl)
#print('Subscriptions recieved! Calling the functions!')
if self.inSubs:
super(awakeJAPC,self).__call__(*tpl)
super(awakeJAPC,self).draw()
while not self.subs.all_tasks_done:# k in range(0,len(tpl)-3):
self.subs.task_done()
# lazy solution
if self.inSubs is False:
self.inSubs=not self.inSubs
def __del__(self):
self.stop()
class awakeSubscriptions(qe.Queue):
waitAll=True
_fullNames={}
def __init__(self,x,selector="SPS.USER.AWAKE1",wkQueue=None):
self.selector=selector
self.japc = pyjapc.PyJapc(selector)
if wkQueue is not None:
self._workQueue=wkQueue
#self.japc.setSelector(selector)
self._setSubs(x)
def _setSubs(self,x):
if isinstance(x,str):
super(awakeSubscriptions,self).__init__(1)
self.subs={x:0}
self.subNum={0:x}
return
try:
super(awakeSubscriptions,self).__init__(len(x))
self.subs={str(x[k]):k for k in range(0,len(x))}
self.subNum={k:str(x[k]) for k in range(0,len(x))}
except:
raise AWAKException('For Subscriptions please provide string construtables iterable or single string!')
def _fillQueue(self,name,val):
if not self.waitAll:
self.put((name,val))
else:
if not (name in self._fullNames.keys()):
self.put((name,val))
self._fullNames[name]=True
if self.full():
self._workQueue()
self._fullNames={}
if not self.empty(): #jem,and hat vergessen queue zu leeren
while not self.empty():
self.get()
self.task_done()
while not self.all_tasks_done: #some1 forgot to place a task done...
self.task_done()
def start(self,verbose=True):
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
for k in list(self.subs.keys()):
print(k)
self.japc.subscribeParam(k,self._fillQueue)
self.japc.startSubscriptions()
if verbose:
print('STARTED ALL SUBSCRIPTIONS!')
def stop(self,verbose=True):
self.japc.stopSubscriptions()
if verbose:
print('STOPPED ALL SUBSCRIPTIONS!')
def __getitem__(self,k):
if k in self.subs.keys():
return self.subs[k]
if k in self.subNum.keys():
return self.subNum[k]
raise AWAKException('Requested object not in Subscritpions')
def setSubscriptions(self,x,selector=None):
self.stop(False)
if selector is not None:
self.japc.setSelector(selector)
self._setSubs(x)
self.start(False)
def __repr__(self):
print(self.subs)
def __iter__(self):
return self.subs.__iter__()
def __len__(self):
return len(self.subNum)
def _workQueue(self,*args,**kwargs):
pass
def __del__(self):
self.stop()
self.japc.clearSubscriptions()
class awakeLoop:
SubCalls=0
MaxCall=5
ListCallNumber=0
savePickle=True
firstCall=True
def __init__(self,subs,wkQue,scanList=None,checkBeam=lambda x: x.japc.getParam("TT41.BCTF.412340/Acquisition#totalIntensityPreferred")<0.1,selector="SPS.USER.AWAKE1",savestr='awakeLoop.pickle',finfun=None,name=None):
print('init called')
self.subs=awakeSubscriptions(subs,selector,self.__call__)
self.callFKT=wkQue
self.ScanList=scanList
if scanList is not None:
self.result=[[] for k in range(0,len(scanList))]
self.file_string=savestr
self.checkBeam=checkBeam
self.finfun=finfun
if name is None:
self.name=self.callFKT.__name__
else:
self.name=name
def reset(self):
self.SubCalls=0
self.MaxCall=5
self.ListCallNumber=0
self.firstCall=True
self.savePickle=True
def __call__(self,*args,**kwargs):
print('awakeLoop called!\nListCallNumber:{0:2.0f}'.format(self.ListCallNumber))
# daten sind in self.subs als queue, sind ungeordnet aber immer als parmeter (name,val)
if self.firstCall:
self.firstCall= not self.firstCall
self.result=[[] for k in range(0,len(self.ScanList))]
return
if self.checkBeam(self.subs):
print('NO BEAM!')
return
#pass
# call order: name, val
try:
self.result[self.ListCallNumber].append(self.callFKT(self.subs.japc,self.ScanList[self.ListCallNumber],self.subs))
except:
print('EXCEPTION IN LOOPCALL!')
self.result[self.ListCallNumber].append({'result':None})
self.SubCalls+=1
if self.SubCalls%self.MaxCall ==0:
self.SubCalls=0
self.ListCallNumber+=1
if self.ListCallNumber==len(self.ScanList):
self.subs.stop()
print("Finished with scan!")
self.reset()
if self.savePickle:
try:
print("Saving scan results!")
pickle.dump(self.result,open(self.file_string,'wb+'))
except:
print("Saving failed! (disk quota?)")
if self.finfun is not None:
print('Calling Finfun!')
self.finfun(self.subs.japc,self.result)
def print(self):
print('printing self!')
print(self.SubCalls,self.MaxCall,self.ListCallNumber,self.savePickle,self.firstCall)
#print(self.subs)
print(self.callFKT)
print(self.ScanList)
def stop(self):
self.subs.stop()
def start(self):
self.result=[[] for k in range(0,len(self.ScanList))]
self.subs.start()
class awakeLaserBox:
def __init__(self,ns1,ns05,ns02,ns01,element='MPPAWAKE:FASTTRIG-1:STREAKTUBE-FINEDELAY',name='XMPP'):
#pixel definitions of laserposition
self.tRange={'1 ns':ns1,'500 ps':ns05,'200 ps':ns02,'100 ps':ns01}
self.elem=element
self.name=name
import pytimber
import time
self.ldb=pytimber.LoggingDB()
self.timeVal=self.ldb.get(self.elem,time.strftime('%Y-%m-%d %H:%M:%S'))
def __call__(self,x=None):
import time
try:
self.timeVal=self.ldb.get(self.elem,time.strftime('%Y-%m-%d %H:%M:%S'))
except: #hardgecoded
if x is not None:
self.timeVal=self.tRange[x][0]
return self.timeVal
else:
return None
#raise AWAKException('__call__ in awakeLaserBox: Please secify tranges via constructor!')
return self.timeVal[self.elem][1][0]
def __getitem__(self,k):
try :
rVal=self.tRange[k]
except:
return None,None
return rVal
class windowClass(QtWidgets.QMainWindow):
def __init__(self,ev=None,parent=None):
QtWidgets.QMainWindow.__init__(self)
self.main_widget = QtWidgets.QWidget(self)
self.setAttribute(QtCore.Qt.WA_DeleteOnClose)
self.setCentralWidget(self.main_widget)
if ev is not None:
self.endF=ev
else:
self.endF=None
def fileQuit():
pass
def closeEvent(self,ce):
if self.endF is not None:
self.endF(self)
class AwakeWindow(QtWidgets.QMainWindow):
def __init__(self,subscriptions,*args,fixedaxes=None,selector="SPS.USER.AWAKE1",name='MyAwakeImage',autostart=True,**kwargs):
self.childWindows={}
QtWidgets.QMainWindow.__init__(self)
self.main_widget = QtWidgets.QWidget(self)
self.awakeJAPC_instance=awakeJAPC(subscriptions,*args,axes=fixedaxes,selector=selector,name=name,autostart=autostart,parent=self.main_widget,**kwargs)
self.setAttribute(QtCore.Qt.WA_DeleteOnClose)
self.setWindowTitle("%s"%self.awakeJAPC_instance.name)
self.file_menu = QtWidgets.QMenu('&File', self)
self.file_menu.addAction('&Quit', self.fileQuit,QtCore.Qt.CTRL + QtCore.Qt.Key_Q)
self.menuBar().addMenu(self.file_menu)
self.help_menu = QtWidgets.QMenu('&Help', self)
self.menuBar().addSeparator()
self.menuBar().addMenu(self.help_menu)
self.help_menu.addAction('&About', self.about)
"""
start& stop
"""
self.start_menu = QtWidgets.QMenu('&Subscriptions Control', self)
self.menuBar().addSeparator()
self.menuBar().addMenu(self.start_menu)
self.start_menu.addAction('&Start Subscriptions', self.start)
self.start_menu.addAction('&Stop Subscriptions', self.stop)
'''
Add Menu for parameters of plotfunctions and add the points
'''
self.option_menu = QtWidgets.QMenu('&Parameter Control', self)
self.menuBar().addSeparator()
self.menuBar().addMenu(self.option_menu)
self.wrapperFKT={}
self.wrapperSETFKT={}
self.prmNAMES={}
for k in kwargs.keys():
if type(kwargs[k]) is not type({}):
continue
else:
for l in kwargs[k]:
# internal naming
buff=str(k)+'/'+str(l)
# save names of parameters, a call to spawnChildWindow and a corresponding setfunction
self.prmNAMES[buff]=buff
self.wrapperFKT[buff]=lambda :self.spawnChildWindow(self.prmNAMES[buff])#'StdPlotMPPT/maxVal'])
self.wrapperSETFKT[buff]=lambda :self.SetPrmToFunction(self.prmNAMES[buff])
self.prm_menus=[]
for k in self.prmNAMES.keys():
#print(k)
prmIdx='&Set parameter: '+self.prmNAMES[k]
#self.prm_menus.append(QtWidgets.QMenu(prmIdx, self))
#self.menuBar().addSeparator()
#self.menuBar().addMenu(self.prm_menus[-1])
#action = self.prm_menus[-1].addAction(prmIdx)
#print(self.wrapperFKT[str(k)])
#action.triggered.connect(self.wrapperFKT['StdPlotMPPT/maxVal'])
self.option_menu.addAction(prmIdx,functools.partial(self.spawnChildWindow,self.prmNAMES[k]))
#self.option_menu.addAction('bla1',self.wrapperFKT['StdPlotMPPT/maxVal'])
"""
Update functions
self.start_menu = QtWidgets.QMenu('&Subscriptions Control', self)
self.menuBar().addSeparator()
self.menuBar().addMenu(self.start_menu)
"""
BoxLayout = QtWidgets.QVBoxLayout(self.main_widget)
BoxLayout.addWidget(self.awakeJAPC_instance)
self.main_widget.setFocus()
self.setCentralWidget(self.main_widget)
self.statusBar().showMessage("Here speaks God, give Gandalf your ring!!", -1)
def start(self):
self.awakeJAPC_instance.start()
def wrapperChildWindow(self,x):
self.spawnChildWindow(x)
def wrapperSetPrmToFunction(self,x):
'''
Hier gehört der ganze conversion kram hin!
'''
text=self.childWindows[x].inpVal.text()
print(text)
text=text.strip()
if text[0] == '[' or text[0]=='(':
if text[-1]==']' or text[-1]==')':
text=np.array(text)
else:
print('NOT ACCEPTED INPUT!')
return
if text.size==1:
text=float(text)
else:
text=float(text)
arg=text#float(self.childWindows[x].inpVal.text())
self.SetPrmToFunction(x,arg)
def SetPrmToFunction(self,x):
'''
Hier gehört der ganze conversion kram hin!
'''
text=self.childWindows[x].inpVal.text()
print('--------------')
print(text)
text=text.strip()
print(text)
'''tests to accept inputs,'''
if text[0] == '[' or text[0]=='(':
if text[-1]==']' or text[-1]==')':
text=text.strip('[').strip('(').strip(']').strip(']')
text=np.fromstring(text,sep=',')
print(text)
else:
text=text
print('text input!')
if text.size==1:
text=float(text[0])
else:
try:
text=float(text)
except:
text=text
print('text input!')
arg=text#float(self.childWindows[x].inpVal.text())
#arg=float(self.childWindows[x].inpVal.text())
# functions are named like functionname/parameter
fname=x.split('/')[0]
farg=x.split('/')[1]
print(self.awakeJAPC_instance.fargs[fname][farg])
print(arg)
self.awakeJAPC_instance.fargs[fname][farg]=arg
def spawnChildWindow(self,name):
# Create a group box for camera properties
'''
self.childWindows[name]=windowClass(self.killChild)
'''
self.childWindows[name]=awakeScreen(parent=None)
child=self.childWindows[name]
child.name=name
child.btnStart = QPushButton('Send', child)
child.btnStart.setStyleSheet("background-color:#63f29a")
child.btnStart.clicked[bool].connect(functools.partial(self.SetPrmToFunction,name))
prop_box = QGroupBox('Properties')
#child.inpVal = QLineEdit(child)
#child.inpVal.setValidator(QDoubleValidator(-100000,100000,4))
child.inpVal = QLineEdit(self)
child.regExp=QtCore.QRegExp('^.*$')# accept everything
child.inpVal.setValidator(QRegExpValidator(child.regExp))
prop_form = QFormLayout()
print(name)
prop_form.addRow('New Value for '+str(name)+':',child.inpVal)
prop_box.setLayout(prop_form)
# Create a plotting window
#child.main_widget = QWidget(self)
# Create Layout
child.vbox=QVBoxLayout()
child.hbox = QHBoxLayout()
child.vbox.addWidget(prop_box)
child.vbox.addWidget(child.btnStart)
#child.hbox.addStretch(1)
child.hbox.addLayout(child.vbox)
child.setLayout(child.hbox)
child.show()
def killChild(self,node):
for k in self.childWindows.keys():
if node.name==str(k):
delcand=self.childWindows[k]
del(delcand)
def stop(self):
self.awakeJAPC_instance.stop()
def fileQuit(self):
self.awakeJAPC_instance.stop()
self.close()
def closeEvent(self, ce):
self.fileQuit()
def about(self):
QtWidgets.QMessageBox.about(self, "About","""A single Awake window, able to show something nice""")
#laserboxMPP=awakeLaserBox((4.3,[270,290]),(9.2,[310,330]),(4.3,[415,440]),(8.35,[225,335]))
laserboxMPP=awakeLaserBox((4.3,[280,300]),(9.2,[300,320]),(4.3,[415,440]),(8.35,[205,325]))
laserboxBI=awakeLaserBox((5.3,[210,260]),None,None,None,name='BI.TT41.STREAK') | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/awakeIMClass.py | awakeIMClass.py |
import matplotlib.pyplot as plt
from matplotlib import cm
import numpy as np
def view_streak(data_dict,cam,scale,cax,fft_sum=[]):
if len(scale) == 1:
scale_1 = scale
scale_2 = scale
elif len(scale) == 2:
scale_1 = scale[0]
scale_2 = scale[1]
else:
print('ERROR: Bad scale')
return
# get axis
x_ax = data_dict['x_ax']
y_ax = data_dict['y_ax']
f_ax = data_dict['streak_data']['f_ax']
# get image, band, and lineout
img = data_dict['img']
band = data_dict['streak_data']['band']
line = data_dict['streak_data']['line']
# get ffts
fftb = data_dict['streak_data']['fftb']
fftl = data_dict['streak_data']['fftl']
# scale and offset profiles so it fits in frame
noOffD_b = band - min(band)
noOffD_l = line - min(line)
noOff_bMaxD = max(noOffD_b)
noOff_lMaxD = max(noOffD_l)
x_min = min(x_ax)
x_max = max(x_ax)
y_min = min(y_ax)
y_max = max(y_ax)
Dscale_b = scale_1*(x_min/noOff_bMaxD)*noOffD_b
Dscale_l = -scale_2*(x_min/noOff_lMaxD)*noOffD_l
plot_b = Dscale_b + x_min
plot_l = -Dscale_l + x_max
# get band and center indices
xc = data_dict['streak_data']['xc']
xl = data_dict['streak_data']['xl']
xh = data_dict['streak_data']['xh']
# Now make some pretty plots
fig = plt.figure(1)
ax = fig.add_subplot(121)
#plt.pcolormesh(x_ax,y_ax,img,cmap=cm.plasma,clim=cax)
ax.imshow(img,extent=[x_min,x_max,y_min,y_max],cmap=cm.inferno,clim=cax,aspect='auto')
fig.colorbar
ax.plot([xl, xl],[y_min, y_max],'r--',[xh, xh],[y_min, y_max],'r--',[xc, xc],[y_min, y_max],'w--',linewidth=3)
ax.plot(np.flipud(plot_b),y_ax,'r',plot_l,y_ax,'w',linewidth=3)
ax.set_ylabel('Time [ps]')
ax.set_xlabel('X [mm]')
ax.set_title(cam)
ax = fig.add_subplot(222)
ax.semilogy(f_ax,fftb,'r',linewidth=2,label='FFT of Band')
ax.set_xlim([0,500])
#ax.set_xlabel('Spectrum [GHz]')
ax.legend()
ax.set_title('FFTs')
ax = fig.add_subplot(224)
if len(fft_sum) > 1:
ax.semilogy(f_ax,fft_sum,'k',linewidth=2,label='Running FFT Avg.')
ax.set_xlim([0,500])
ax.set_xlabel('Spectrum [GHz]')
ax.legend()
else:
ax.semilogy(f_ax,fftl,'k',linewidth=2,label='FFT of Lineout')
ax.set_xlim([0,500])
ax.set_xlabel('Spectrum [GHz]')
ax.legend()
#ax.set_title('FFT of Lineout')
#plt.show()
return fig | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/view_streak.py | view_streak.py |
from __future__ import unicode_literals
"""
Spencer tries to make a thing
Karl copied it, because Karl's QVBoxLayout doesn't like him, but Spencer's behaves well... good boy
"""
''' Get all the things '''
import sys
import time
import os
import matplotlib as mpl
# Make sure that we are using QT5
mpl.use('Qt5Agg')
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.figure import Figure
from matplotlib import cm
import numpy as np
import pyjapc
import awakeIMClass
import scipy.special as special
from PyQt5.QtWidgets import (QWidget, QLabel, QLineEdit, QComboBox, QCheckBox, QMessageBox, QGroupBox, QFormLayout,
QTextEdit, QGridLayout, QVBoxLayout, QHBoxLayout, QApplication, QPushButton, QSizePolicy, QStatusBar)
import PyQt5.QtCore as QtCore
from PyQt5.QtGui import (QIcon, QDoubleValidator, QRegExpValidator)
from PyQt5 import QtCore, QtWidgets
''' This is where my code starts '''
class awakeScanGui(QWidget):
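    '''Qt widget for running the predefined awakeLoop streak-camera scans:
    pick a scan from the combo box, set its range, step and number of shots
    per point, then start/stop the acquisition. The awakeScreen attached to
    the selected scan (its finfun) is embedded on the right for live results.'''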
''' Init Self '''
def __init__(self,*args):
super().__init__()
self.initScans(*args)
self.initUI()
''' Initialize GUI '''
def initUI(self):
#self.connect(self,QtCore.Signal('triggered()'),self.closeEvent)
#self.file_menu = QtWidgets.QMenu('&File', self)
#self.file_menu.addAction('&Quit', self.closeEvent,QtCore.Qt.CTRL + QtCore.Qt.Key_Q)
#self.menuBar().addMenu(self.file_menu)
# Create a combobox for selecting camera
self.selectScan = QLabel('Select Scan:')
self.scanList = QComboBox(self)
self.scanList.addItems(list(self.scanFunDict.keys()))
self.scanList.currentIndexChanged.connect(self.selectScanFun)
self.btnStart = QPushButton('Start', self)
self.btnStart.setStyleSheet("background-color:#63f29a")
self.btnStart.clicked[bool].connect(self.doStart)
self.btnStop = QPushButton('Stop', self)
self.btnStop.setStyleSheet("background-color:#63f29a")
self.btnStop.clicked[bool].connect(self.doStop)
# Create a group box for camera properties
prop_box = QGroupBox('Camera Properties')
self.inpNShot = QLineEdit(self)
self.inpNShot.setValidator(QDoubleValidator(-100000,100000,4))
self.inpAnfang = QLineEdit(self)
self.inpAnfang.setValidator(QDoubleValidator(-100000,100000,4))
self.inpEnde = QLineEdit(self)
self.inpEnde.setValidator(QDoubleValidator(-100000,100000,4))
self.inpStep = QLineEdit(self)
self.inpStep.setValidator(QDoubleValidator(-100000,100000,4))
self.inpSavestr = QLineEdit(self)
self.regExp=QtCore.QRegExp('^.*$')# accept everything
self.inpSavestr.setValidator(QRegExpValidator(self.regExp))
#self.inpSavestr.setValidator(QRegExpValidator('^ .*$'))
prop_form = QFormLayout()
prop_form.addRow('Number of Shots (empty=5):',self.inpNShot)
prop_form.addRow('Start Value:',self.inpAnfang)
prop_form.addRow('End Value:',self.inpEnde)
prop_form.addRow('Step Value:',self.inpStep)
prop_form.addRow('Savestring (empty = no save):',self.inpSavestr)
prop_box.setLayout(prop_form)
# Create a plotting window
self.main_widget = QWidget(self)
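        # lpass is a no-op placeholder plot function; the screen built with it
        # is swapped below for the finfun awakeScreen of the selected scan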
def lpass(x,*args):
pass
self.screen=awakeIMClass.awakeScreen(lpass,parent=self.main_widget)
# Create Status Bar
self.statusBar = QStatusBar(self)
self.statusBar.setSizeGripEnabled(False)
# Create Layout
self.vbox = QVBoxLayout()
self.vbox.addWidget(self.selectScan)
self.vbox.addWidget(self.scanList)
self.vbox.addWidget(self.btnStart)
self.vbox.addWidget(self.btnStop)
self.vbox.addWidget(prop_box)
self.vbox.addStretch(1)
self.vbox.addWidget(self.statusBar)
self.hbox = QHBoxLayout()
self.hbox.addLayout(self.vbox)
self.hbox.addStretch()
self.hbox.addWidget(self.screen, QtCore.Qt.AlignRight)
self.setLayout(self.hbox)
self.setGeometry(1600, 300, 900, 500)
self.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
self.setWindowTitle('Streak scan selector')
self.show()
self.statusBar.showMessage('Here speaks God, give Gandalf your Ring!')
print('Init screen')
self.hbox.removeWidget(self.screen)
self.screen=self.activeScan.finfun
self.screen.setParent(self.main_widget)
self.hbox.addWidget(self.screen, QtCore.Qt.AlignRight)
def closeEvent(self, event):
self.doStop()
#QCoreApplication.quit()
#QWidget.closeEvent(self, event)
def doStart(self):
self.doStop()
buffMax=self.inpNShot.text()
savestr=self.inpSavestr.text()
if buffMax !='':
try:
                self.activeScan.MaxCall=float(buffMax)
except:
self.statusBar.showMessage('Error: Number of shots could not be converted to float')
raise ValueError('No float constructable for N shot!')
try:
            start=float(self.inpAnfang.text())
            end=float(self.inpEnde.text())
            step=float(self.inpStep.text())
except:
            self.statusBar.showMessage('Error: one or more of start,end,step could not be converted to float!')
            raise ValueError('One or more of start,end,step could not be converted to float!')
self.activeScan.ScanList=np.arange(start,end,step)
print(self.activeScan.ScanList)
if savestr != '':
self.activeScan.file_string=savestr
self.activeScan.savePickle=True
else:
self.activeScan.savePickle=False
self.statusBar.showMessage('Acquiring!')
self.activeScan.start()
return
def doStop(self):
self.activeScan.stop()
        self.statusBar.showMessage('Stopped the acquisition!',5000)  # timeout in ms
return
def about(self):
QtWidgets.QMessageBox.about(self, "About","""A single Awake window, able to show something nice""")
def initScans(self,*args):
self.activeScan=None
self.scanFunDict={}
zeroname=''
i=0
for k in args:
if isinstance(k,list):
for l in k:
                    if not isinstance(l,awakeIMClass.awakeLoop):
                        raise IOError('please provide awakeLoop instance')
#self.scanFunDict[l.name]=l
if i==0:
i+=1
zeroname=l.name
                # do not store the awakeLoop itself, only the arguments needed to rebuild it
subList=[]
for m in l.subs.subNum.keys():
subList+=[l.subs.subNum[m]]
self.scanFunDict[l.name]={'subs':subList,'finfun':l.finfun,'scanList':l.ScanList,'wkQue':l.callFKT,'checkBeam':l.checkBeam,'selector':l.subs.selector,'savestr':l.file_string,'name':l.name}
if i==1:
i+=1
self.activeScan=awakeIMClass.awakeLoop(**self.scanFunDict[zeroname])
else:
                # store only the constructor arguments for the awakeLoop
#self.scanFunDict[k.name]=k
subList=[]
for m in k.subs.subNum.keys():
subList+=[k.subs.subNum[m]]
self.scanFunDict[k.name]={'subs':subList,'finfun':k.finfun,'scanList':k.ScanList,'wkQue':k.callFKT,'checkBeam':k.checkBeam,'selector':k.subs.selector,'savestr':k.file_string,'name':k.name}
if i==0:
i+=1
print(k.name)
self.activeScan=awakeIMClass.awakeLoop(**self.scanFunDict[k.name])
if not isinstance(args[0],list):
# not working atm!
print('Passing atm')#self.activeScan=args[0]
else:
# not working atm!
print('Passing list atm')
#self.activeScan=args[0][0]
def selectScanFun(self,l):
#self.activeScan=self.scanFunDict[self.scanList.currentText()]
#del(self.activeScan)
self.activeScan=None
print(self.scanFunDict[self.scanList.currentText()])
kwargs=self.scanFunDict[self.scanList.currentText()]
self.activeScan=awakeIMClass.awakeLoop(**kwargs)
print('Now self.activeScan')
self.activeScan.print()
if self.activeScan.finfun is not None and isinstance(self.activeScan.finfun,awakeIMClass.awakeScreen):
self.hbox.removeWidget(self.screen)
self.screen=self.activeScan.finfun
self.screen.setParent(self.main_widget)
self.hbox.addWidget(self.screen, QtCore.Qt.AlignRight)
def TimeBeamScanXMPP(japc,scanListElem,subs):
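    '''Per-shot worker for the timing scan: pops the bunch intensity from the
    subscription queue, reads back the current pdcOutDelay of VTUAwake3FT1
    (presumably the streak trigger delay) and then sets it to the scan value;
    the streak image timestamp is kept so shots can be matched offline.'''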
import time
time.sleep(0.5)
BeamInten=subs.get()[1]
posREAD=japc.getParam('VTUAwake3FT1/Mode#pdcOutDelay')
japc.setParam('VTUAwake3FT1/Mode#pdcOutDelay',scanListElem)
    streakTime=japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')
    return {'posRead':posREAD,'posSet':scanListElem,'StreakImageTime':streakTime,'BeamIntensity':BeamInten}
def finfunT(ax,japc,result,*args):
ax.plot([1,2,3,4,5])
return True
TimeScanFinalXMPP=awakeIMClass.awakeScreen(finfunT)
TimeScanXMPP_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",TimeBeamScanXMPP,None,finfun=TimeScanFinalXMPP)
def SlitScanXMPP(japc,scanListElem,subs):
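    '''Per-shot worker for the slit scan: moves the vertical mirror
    MPP-TSG41-MIRROR1-V to the scan position, grabs the XMPP streak image,
    subtracts a background estimated from the first 25 rows, fits a round
    2-D Gaussian to the spot and integrates the counts in a fixed ROI.'''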
import time
    import scipy as sp
    import scipy.optimize  # 'import scipy' alone is not guaranteed to load sp.optimize
time.sleep(0.5)
print('SlitScanXMPP called!')
BeamInten=subs.get()[1]
posREAD=japc.getParam("MPP-TSG41-MIRROR1-V/Position#position")
japc.setParam("MPP-TSG41-MIRROR1-V/MoveTo#position",scanListElem)
#posREAD=scanListElem
posSET=scanListElem
print(scanListElem)
imData=japc.getParam("XMPP-STREAK/StreakImage#streakImageData").reshape(512,672)
imData=imData-np.mean(imData[0:25,:])
startGuess=np.array([4e3,16,345,260,50])
def gaussFIT(prm,x,y):
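        # prm = [amplitude, common sigma, x-centre, y-centre, offset];
        # returns the flattened residuals expected by scipy.optimize.least_squares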
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x[0]-prm[2])**2 + (x[1]-prm[3])**2)/(2*prm[1]**2)) + prm[4]) -y).ravel()
x,y=np.meshgrid(np.arange(250,450),np.arange(245,267))
INTEN=(imData[240:275,280:420]).sum()#(imData[240:275,280:420]-np.mean(imData[240:275,0:140])).sum()
    streakTime=japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')
    optimres=sp.optimize.least_squares(gaussFIT,startGuess,args=([x,y],imData[245:267,250:450]))
    return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':streakTime,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'ImageData':imData}
def SlitScanFinXMPP(ax,japc,result,*args):
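    '''Final plot for the slit scan: normalises the ROI counts of each shot by
    the bunch intensity, groups repeated mirror positions into mean and
    standard error, fits a double-error-function transmission model and quotes
    the optimum slit position as the midpoint of the two fitted edges.'''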
import scipy.optimize as sp_o
def modelFit(x,prm1,prm2,prm3,prm4):
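        # transmission of a Gaussian of width prm2 through a slit with edges
        # at prm3 and prm4: a difference of two error functions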
return prm1*prm2*(special.erf((prm3-x)/prm2)-special.erf((prm4-x)/prm2) )
listR=[]
for k in result:
listR+=k
posread=np.array([l['posRead'] for l in listR])
beamInten=np.array([l['measBeamIntensity']/l['BeamIntensity'] for l in listR])
beamInten=beamInten/np.max(beamInten)
yFitVal=np.array([l['fitParam'].x[3]-257 for l in listR])
#
# Fit the double error function model
#
argSorted=np.argsort(posread)
scatterX_inten=posread[argSorted]
beamInten=beamInten[argSorted]
xvals_inten=np.array([scatterX_inten[k] for k in range(0,len(scatterX_inten)-1) if ((scatterX_inten[k-1]<scatterX_inten[k] or k==0) and scatterX_inten[k+1]==scatterX_inten[k])])
shiftvals_inten=[0]
shiftvals_inten+=[k+1 for k in range(0,len(scatterX_inten)-1) if (scatterX_inten[k]<scatterX_inten[k+1] or k==len(scatterX_inten)-1)]
shiftvals_inten+=[len(scatterX_inten)]
yvals_inten=[np.array(beamInten[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean=np.array([np.mean(k) for k in yvals_inten if k.size>1])
scale=np.max(inten_mean)
inten_std=np.array([np.sqrt(np.var(k)) for k in yvals_inten if k.size>1])
print(xvals_inten,inten_mean,scale,inten_std)
print(beamInten)
popt,pcov=sp_o.curve_fit(modelFit,xvals_inten,inten_mean,[1,50,7275,7150],sigma=inten_std)
xfit=np.linspace(xvals_inten[0],xvals_inten[-1],200)
print(popt)
print(pcov)
#
# plot
#
ax.clear()
ax.scatter(posread,beamInten,c='r',label='Data')
ax.errorbar(xvals_inten,inten_mean,inten_std,label='Mean + standard error',c='b',linestyle='None',marker='x')
#ax.plot(xfit,modelFit(xfit,*popt),c='k',label='PRM: {0:1.2e},{1:1.2e},{2:1.3e},{3:1.3e}\nCOV: {4:1.2e},{5:1.2e},{6:1.3e},{7:1.3e}'.format(popt[0],popt[1],popt[2],popt[3],np.sqrt(pcov[0,0],np.sqrt(pcov[1,1]),np.sqrt(pcov[2,2]),np.sqrt(pcov[3,3]))))
ax.plot(xfit,modelFit(xfit,popt[0],popt[1],popt[2],popt[3]),c='k',label='Fit, optimum parameter={0:1.3e}'.format(np.abs(popt[2]+popt[3])/2))
ax.legend()
#ax.scatter(posread,yFitVal,c='b')
ax.set_ylim(0.9*np.min(inten_mean),1.1*np.max(inten_mean))
return True
SlitScanFinalXMPP_L=awakeIMClass.awakeScreen(SlitScanFinXMPP)
SlitScanXMPP_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",SlitScanXMPP,None,finfun=SlitScanFinalXMPP_L)
def FocusScanXMPP(japc,scanListElem,subs):
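    '''Per-shot worker for the XMPP focus scan: reads back the position of the
    translator MPP-TSG41-TRANSL1-BI, grabs the streak image, then moves the
    translator to the scan value. The spot is fitted with a 2-D Gaussian and
    with a 1-D Gaussian of its horizontal projection; the vertical mirror is
    steered automatically if the spot drifts (see below).'''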
import time
    import scipy as sp
    import scipy.optimize  # 'import scipy' alone is not guaranteed to load sp.optimize
print('Called FocusScanXMPP!')
time.sleep(0.5)
'''
old translator!, new one is MPP-TSG41-TRANSL1-BI!
'''
#posREAD=japc.getParam("MPP-TSG41-TRANSL1/Position#position")
posREAD=japc.getParam("MPP-TSG41-TRANSL1-BI/Position#position")
#posREAD=scanListElem
imData=japc.getParam("XMPP-STREAK/StreakImage#streakImageData").reshape(512,672)
    streakTime=japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')
#japc.setParam("MPP-TSG41-TRANSL1/MoveTo#position",scanListElem)
japc.setParam("MPP-TSG41-TRANSL1-BI/MoveTo#position",scanListElem)
posSET=scanListElem
BeamInten=subs.get()[1]
startGuess=np.array([4e3,16,345,255,50])
def gaussFIT(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x[0]-prm[2])**2 + (x[1]-prm[3])**2)/(2*prm[1]**2)) + prm[4]) -y).ravel()
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x-prm[2])**2)/(2*prm[1]**2)) + prm[3]) -y).ravel()
print('Before inten')
INTEN=np.sum(imData[:,200:450]-np.mean(imData[:,:200]))
x,y=np.meshgrid(np.arange(100,500),np.arange(245,270))
optimres=sp.optimize.least_squares(gaussFIT,startGuess,args=([x,y],imData[245:270,100:500]))
print('fits')
optimres_1d=sp.optimize.least_squares(gaussFIT1D,[INTEN,15,330,10],args=(np.arange(100,500),imData[245:270,100:500].sum(0)/25))
''' Automatic steering '''
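    # if the fitted vertical centre (optimres.x[3]) leaves the pixel band
    # [240, 272], nudge the vertical mirror by +/- delMR steps to recentre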
delMR=10
posMR=japc.getParam("MPP-TSG41-MIRROR1-V/Position#position")
if optimres.x[3] <= 240:
japc.setParam("MPP-TSG41-MIRROR1-V/MoveTo#position",posMR+delMR)
chMR=posMR
if optimres.x[3] >= 272:
japc.setParam("MPP-TSG41-MIRROR1-V/MoveTo#position",posMR-delMR)
    return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':streakTime,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'MirrorSteering':posMR,'ImageData':imData,'optimres_1d':optimres_1d}
    #return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':streakTime,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'MirrorSteering':1,'ImageData':imData}
def FocusScanFinXMPP(ax,japc,result,*args):
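    '''Final plot for the focus scan: collects the fitted spot sizes (2-D fit
    and 1-D projection fit), groups repeated translator positions into mean
    and standard deviation, and fits an error-weighted parabola to each; the
    minimum of the parabola marks the focus position.'''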
print('1')
listR=[]
for k in result:
listR+=k
#print(listR)
posread=np.array([l['posRead'] for l in listR])
print('2')
    beamSize=np.array([np.abs(l['fitParam'].x[1]) for l in listR])
    print('3')
    beamSize_1d=np.array([np.abs(l['optimres_1d'].x[1]) for l in listR])
print(beamSize,beamSize_1d)
#
# cut
#
#argsBig=np.where(beamSize<40)
#posread=posread[argsBig]
#posread=posread[np.where(beamSize[argsBig]>10)]
#beamSize=beamSize[argsBig]
#beamSize=beamSize[np.where(beamSize[argsBig]>10)]
beamInten=beamSize
print(beamInten)
argSorted=np.argsort(posread)
scatterX_inten=posread[argSorted]
beamInten=beamInten[argSorted]
beamInten_1d=beamSize_1d[argSorted]
xvals_inten=np.array([scatterX_inten[k] for k in range(0,len(scatterX_inten)-1) if ((scatterX_inten[k-1]<scatterX_inten[k] or k==0) and scatterX_inten[k+1]==scatterX_inten[k])])
shiftvals_inten=[0]
shiftvals_inten+=[k+1 for k in range(0,len(scatterX_inten)-1) if (scatterX_inten[k]<scatterX_inten[k+1] or k==len(scatterX_inten)-1)]
shiftvals_inten+=[len(scatterX_inten)]
print('4')
yvals_inten=[np.array(beamInten[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean=np.array([np.mean(k) for k in yvals_inten if k.size>1])
inten_std=np.array([np.sqrt(np.var(k)) for k in yvals_inten if k.size>1])
yvals_inten1d=[np.array(beamInten_1d[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean1d=np.array([np.mean(k) for k in yvals_inten1d if k.size>1])
inten_std1d=np.array([np.sqrt(np.var(k)) for k in yvals_inten1d if k.size>1])
print(inten_mean,inten_mean1d)
poly=np.polyfit(xvals_inten,inten_mean,2,w=1/inten_std)
poly1d=np.polyfit(xvals_inten,inten_mean1d,2,w=1/inten_std1d)
polyX=np.linspace(xvals_inten[0],xvals_inten[-1],200)
polyY=lambda x,prm: prm[0]*x**2+ prm[1]*x**1+ prm[2]
#
# plot
#
#print(poly,xvals_inten,inten_mean,inten_std)
#print(posread,beamSize)
ax.clear()
ax.plot(np.linspace(xvals_inten[0],xvals_inten[-1],200),polyY(polyX,poly),label='Full beam, Quadratic fit:{0:1.3e}*x^2+{1:1.3e}*x+{2:1.3e}'.format(poly[0],poly[1],poly[2]))
ax.plot(np.linspace(xvals_inten[0],xvals_inten[-1],200),polyY(polyX,poly1d),label='Projected beam, Quadratic fit:{0:1.3e}*x^2+{1:1.3e}*x+{2:1.3e}'.format(poly1d[0],poly1d[1],poly1d[2]))
ax.errorbar(xvals_inten,inten_mean,inten_std,linestyle='None',label='Mean data')
ax.errorbar(xvals_inten,inten_mean1d,inten_std1d,linestyle='None',label='Mean data, projection',c='r')
#ax.plot(xvals_inten,inten_mean,linestyle='None',label='Mean data')
ax.scatter(posread,beamSize,label='Raw data')
ax.scatter(posread,beamSize_1d,label='Raw data, projection')
ax.set_ylim(0,75)
ax.legend()
return True
FocusScanFinal=awakeIMClass.awakeScreen(FocusScanFinXMPP)
FocusScanXMPP_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",FocusScanXMPP,None,finfun=FocusScanFinal)
def FocusPartScanXMPP(japc,scanListElem,subs):
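    '''Variant of FocusScanXMPP that fits only a 1-D Gaussian to the
    horizontal projection of a narrow slice of the streak image; no automatic
    mirror steering is done here.'''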
import time
    import scipy as sp
    import scipy.optimize  # 'import scipy' alone is not guaranteed to load sp.optimize
print('Called FocusScanXMPP!')
time.sleep(0.5)
'''
old translator!, new one is MPP-TSG41-TRANSL1-BI!
'''
#posREAD=japc.getParam("MPP-TSG41-TRANSL1/Position#position")
posREAD=japc.getParam("MPP-TSG41-TRANSL1-BI/Position#position")
#posREAD=scanListElem
imData=japc.getParam("XMPP-STREAK/StreakImage#streakImageData").reshape(512,672)
    streakTime=japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')
#japc.setParam("MPP-TSG41-TRANSL1/MoveTo#position",scanListElem)
japc.setParam("MPP-TSG41-TRANSL1-BI/MoveTo#position",scanListElem)
posSET=scanListElem
BeamInten=subs.get()[1]
def gaussFIT(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x-prm[2])**2)/(2*prm[1]**2)) + prm[3]) -y).ravel()
INTEN=np.sum(imData[250:260,200:450]-np.mean(imData[:,:250]))
    startGuess=np.array([INTEN,15,330,380])
x=np.arange(200,450)
optimres=sp.optimize.least_squares(gaussFIT,startGuess,args=(x,imData[245:270,200:450].sum(0)/25))
    return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':streakTime,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'ImageData':imData}
    #return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':streakTime,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'MirrorSteering':1,'ImageData':imData}
def FocusPartScanFinXMPP(ax,japc,result,*args):
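    '''Final plot for the partial focus scan: same parabola analysis as
    FocusScanFinXMPP, but on the 1-D projection fits only.'''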
listR=[]
for k in result:
listR+=k
    posread=np.array([l['posRead'] for l in listR])
    beamSize=np.array([l['fitParam'].x[1] for l in listR])
print(beamSize)
#
# cut
#
#argsBig=np.where(beamSize<40)
#posread=posread[argsBig]
#posread=posread[np.where(beamSize[argsBig]>10)]
#beamSize=beamSize[argsBig]
#beamSize=beamSize[np.where(beamSize[argsBig]>10)]
beamInten=beamSize
print(beamInten)
argSorted=np.argsort(posread)
scatterX_inten=posread[argSorted]
beamInten=beamInten[argSorted]
xvals_inten=np.array([scatterX_inten[k] for k in range(0,len(scatterX_inten)-1) if ((scatterX_inten[k-1]<scatterX_inten[k] or k==0) and scatterX_inten[k+1]==scatterX_inten[k])])
shiftvals_inten=[0]
shiftvals_inten+=[k+1 for k in range(0,len(scatterX_inten)-1) if (scatterX_inten[k]<scatterX_inten[k+1] or k==len(scatterX_inten)-1)]
shiftvals_inten+=[len(scatterX_inten)]
yvals_inten=[np.array(beamInten[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean=np.array([np.mean(k) for k in yvals_inten if k.size>1])
inten_std=np.array([np.sqrt(np.var(k)) for k in yvals_inten if k.size>1])
print(inten_mean)
poly=np.polyfit(xvals_inten,inten_mean,2,w=1/inten_std)
polyX=np.linspace(xvals_inten[0],xvals_inten[-1],200)
polyY=lambda x,prm: prm[0]*x**2+ prm[1]*x**1+ prm[2]
#
# plot
#
#print(poly,xvals_inten,inten_mean,inten_std)
#print(posread,beamSize)
ax.clear()
ax.plot(np.linspace(xvals_inten[0],xvals_inten[-1],200),polyY(polyX,poly),label='Quadratic fit:{0:1.3e}*x^2+{1:1.3e}*x+{2:1.3e}'.format(poly[0],poly[1],poly[2]))
ax.errorbar(xvals_inten,inten_mean,inten_std,linestyle='None',label='Mean data')
#ax.plot(xvals_inten,inten_mean,linestyle='None',label='Mean data')
ax.scatter(posread,beamSize,label='Raw data')
ax.set_ylim(0,75)
ax.legend()
return True
FocusPartScanFinal=awakeIMClass.awakeScreen(FocusPartScanFinXMPP)
FocusPartScanXMPP_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",FocusPartScanXMPP,None,finfun=FocusPartScanFinal)
def FocusScanBI(japc,scanListElem,subs):
import time
import scipy as sp
time.sleep(0.5)
print('FocusScanBI called!')
BeamInten=subs.get()[1]
posREAD=japc.getParam("BTV.TT41.412350_FOCUS/Position#position")
japc.setParam("BTV.TT41.412350_FOCUS/MoveTo#position",scanListElem)
#posREAD=scanListElem
posSET=scanListElem
print(scanListElem)
imData=japc.getParam("TT41.BTV.412350.STREAK/StreakImage#streakImageData").reshape(512,672)
imData=imData-np.mean(imData[:,50:100])
def gaussFIT(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x-prm[2])**2)/(2*prm[1]**2)) + prm[3]) -y).ravel()
x=np.arange(250,450)
INTEN=(imData[:,280:420]).sum()#(imData[240:275,280:420]-np.mean(imData[240:275,0:140])).sum()
startGuess=np.array([INTEN,16,345,10])
time=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTime')
optimres=sp.optimize.least_squares(gaussFIT,startGuess,args=(x,imData[:,250:450].sum(0)))
return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':time,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'ImageData':imData}
def FocusScanFinBI( ax,japc,result,*args):
listR=[]
for k in result:
listR+=k
posread=np.array([l['posRead'] for l in [k for k in listR]])
beamSize=np.array([l['fitParam'].x[1] for l in [k for k in listR]])
print(beamSize)
#
# cut
#
#argsBig=np.where(beamSize<40)
#posread=posread[argsBig]
#posread=posread[np.where(beamSize[argsBig]>10)]
#beamSize=beamSize[argsBig]
#beamSize=beamSize[np.where(beamSize[argsBig]>10)]
beamInten=beamSize
print(beamInten)
argSorted=np.argsort(posread)
scatterX_inten=posread[argSorted]
beamInten=beamInten[argSorted]
xvals_inten=np.array([scatterX_inten[k] for k in range(0,len(scatterX_inten)-1) if ((scatterX_inten[k-1]<scatterX_inten[k] or k==0) and scatterX_inten[k+1]==scatterX_inten[k])])
shiftvals_inten=[0]
shiftvals_inten+=[k+1 for k in range(0,len(scatterX_inten)-1) if (scatterX_inten[k]<scatterX_inten[k+1] or k==len(scatterX_inten)-1)]
shiftvals_inten+=[len(scatterX_inten)]
yvals_inten=[np.array(beamInten[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean=np.array([np.mean(k) for k in yvals_inten if k.size>1])
inten_std=np.array([np.sqrt(np.var(k)) for k in yvals_inten if k.size>1])
print(inten_mean)
poly=np.polyfit(xvals_inten,inten_mean,2,w=1/inten_std)
polyX=np.linspace(xvals_inten[0],xvals_inten[-1],200)
polyY=lambda x,prm: prm[0]*x**2+ prm[1]*x**1+ prm[2]
#
# plot
#
#print(poly,xvals_inten,inten_mean,inten_std)
#print(posread,beamSize)
ax.clear()
ax.plot(np.linspace(xvals_inten[0],xvals_inten[-1],200),polyY(polyX,poly),label='Quadratic fit:{0:1.3e}*x^2+{1:1.3e}*x+{2:1.3e}'.format(poly[0],poly[1],poly[2]))
ax.errorbar(xvals_inten,inten_mean,inten_std,linestyle='None',label='Mean data')
#ax.plot(xvals_inten,inten_mean,linestyle='None',label='Mean data')
ax.scatter(posread,beamSize,label='Raw data')
ax.set_ylim(0,75)
ax.legend()
return True
FocusScanFinalBI=awakeIMClass.awakeScreen(FocusScanFinBI)
FocusScanBI_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",FocusScanBI,None,finfun=FocusScanFinalBI)
def StreakScanBI(japc,scanListElem,subs):
import time
import scipy as sp
time.sleep(0.5)
print('StreakScanBI called!')
BeamInten=subs.get()[1]
posREAD=japc.getParam("BTV.TT41.412350_STREAK_V/Position#position")
japc.setParam("BTV.TT41.412350_STREAK_V/MoveTo#position",scanListElem)
#posREAD=scanListElem
posSET=scanListElem
print(scanListElem)
imData=japc.getParam("TT41.BTV.412350.STREAK/StreakImage#streakImageData").reshape(512,672)
imData=imData-np.mean(imData[:,20:100])
def gaussFIT(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - ((x-prm[2])**2)/(2*prm[1]**2)) + prm[3]) -y).ravel()
x=np.arange(250,450)
INTEN=(imData[:,280:420]).sum()#(imData[240:275,280:420]-np.mean(imData[240:275,0:140])).sum()
startGuess=np.array([INTEN,16,345,10])
time=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTime')
optimres=sp.optimize.least_squares(gaussFIT,startGuess,args=(x,imData[240:270,250:450].sum(0)))
return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':time,'BeamIntensity':BeamInten,'fitParam':optimres,'measBeamIntensity':INTEN,'ImageData':imData}
def StreakScanFinBI( ax,japc,result,*args):
import scipy.optimize as sp_o
import scipy.special as special
def modelFit(x,prm1,prm2,prm3,prm4):
    return prm1*prm2*(special.erf((prm3-x)/prm2)-special.erf((prm4-x)/prm2))
listR=[]
for k in result:
listR+=k
posread=np.array([l['posRead'] for l in listR])
beamInten=np.array([l['measBeamIntensity']/l['BeamIntensity'] for l in listR])
beamInten=beamInten/np.max(beamInten)
yFitVal=np.array([l['fitParam'].x[0] for l in listR])
#
# Fit the double error function model
#
argSorted=np.argsort(posread)
scatterX_inten=posread[argSorted]
beamInten=beamInten[argSorted]
xvals_inten=np.array([scatterX_inten[k] for k in range(0,len(scatterX_inten)-1) if ((scatterX_inten[k-1]<scatterX_inten[k] or k==0) and scatterX_inten[k+1]==scatterX_inten[k])])
shiftvals_inten=[0]
shiftvals_inten+=[k+1 for k in range(0,len(scatterX_inten)-1) if (scatterX_inten[k]<scatterX_inten[k+1] or k==len(scatterX_inten)-1)]
shiftvals_inten+=[len(scatterX_inten)]
yvals_inten=[np.array(beamInten[shiftvals_inten[k]:shiftvals_inten[k+1]]) for k in range(0,len(shiftvals_inten)-1)]
inten_mean=np.array([np.mean(k) for k in yvals_inten if k.size>1])
scale=np.max(inten_mean)
inten_std=np.array([np.sqrt(np.var(k)) for k in yvals_inten if k.size>1])
print(xvals_inten,inten_mean,scale,inten_std)
print(beamInten)
#popt,pcov=sp_o.curve_fit(modelFit,xvals_inten,inten_mean,[1,50,7275,7150],sigma=inten_std)
#xfit=np.linspace(xvals_inten[0],xvals_inten[-1],200)
#print(popt)
#print(pcov)
#
# plot
#
ax.clear()
ax.scatter(posread,beamInten,c='r',label='Data')
ax.errorbar(xvals_inten,inten_mean,inten_std,label='Mean + standard error',c='b',linestyle='None',marker='x')
#ax.plot(xfit,modelFit(xfit,*popt),c='k',label='PRM: {0:1.2e},{1:1.2e},{2:1.3e},{3:1.3e}\nCOV: {4:1.2e},{5:1.2e},{6:1.3e},{7:1.3e}'.format(popt[0],popt[1],popt[2],popt[3],np.sqrt(pcov[0,0],np.sqrt(pcov[1,1]),np.sqrt(pcov[2,2]),np.sqrt(pcov[3,3]))))
#ax.plot(xfit,modelFit(xfit,popt[0],popt[1],popt[2],popt[3]),c='k',label='Fit, optimum parameter={0:1.3e}'.format(np.abs(popt[2]+popt[3])/2))
ax.legend()
#ax.scatter(posread,yFitVal,c='b')
ax.set_ylim(0.9*np.min(inten_mean),1.1*np.max(inten_mean))
return True
StreakIntenScanFinalBI=awakeIMClass.awakeScreen(StreakScanFinBI)
StreakIntenScanBI_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",StreakScanBI,None,finfun=StreakIntenScanFinalBI)
def StreakSlitScanXMPP(japc,scanListElem,subs):
import time
import scipy as sp
time.sleep(0.5)
BeamInten=subs.get()[1]
posREAD=japc.getParam("MPP-TSG41-MIRROR1-V/Position#position")
japc.setParam("MPP-TSG41-MIRROR1-V/MoveTo#position",scanListElem)
time=japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')
posSET=scanListElem
imData=japc.getParam("XMPP-STREAK/StreakImage#streakImageData").reshape(512,672)
#prfB=imData[:,200:450].sum(0)/512
#def gaussFIT1D(prm,x,y):
# return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
INTEN=np.sum(imData[:,200:450]-np.mean(imData[:,:150]))
#startGuess=np.array([5e2,30,320,0])
#x=np.arange(0,250)+200
#optimresB=sp.optimize.least_squares(gaussFIT1D,startGuess,args=(x,prfB))
return {'posRead':posREAD,'posSet':posSET,'StreakImageTime':time,'BeamIntensity':BeamInten,'measBeamIntensity':INTEN}
def StreakSlitScanFinXMPP( ax,japc,result,*args):
listR=[]
for k in result:
listR+=k
posread=np.array([l['posRead'] for l in listR])
beamInten=np.array([l['measBeamIntensity']/l['BeamIntensity'] for l in listR])
#
# plot
#
ax.scatter(posread,beamInten/np.max(beamInten),c='r')
#ax.scatter(posread,yFitVal,c='b')
ax.set_ylim(-0.1,1.1)
return True
StreakSlitScanFinalXMPP_L=awakeIMClass.awakeScreen(StreakSlitScanFinXMPP)
StreakSlitScanXMPP_L=awakeIMClass.awakeLoop("TT41.BCTF.412340/Acquisition#totalIntensityPreferred",StreakSlitScanXMPP,None,finfun=StreakSlitScanFinalXMPP_L)
''' Start the GUI '''
if __name__ == '__main__':
app = QApplication(sys.argv)
#ex = awakeScanGui(TimeScanXMPP_L,SlitScanXMPP_L,FocusScanXMPP_L,StreakSlitScanXMPP_L)
ex = awakeScanGui(SlitScanXMPP_L,FocusScanXMPP_L,FocusScanBI_L,StreakIntenScanBI_L)
sys.exit(app.exec_())
#app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/awakeScanGUI.py | awakeScanGUI.py |
import sys
#sys.path.append('/user/rieger/')
from awakeIMClass import *
import numpy as np
import scipy as sp
import matplotlib as mpl
import pickle
import time
import os
'''
Laser zeroDict, pixelvalue and finedelay!
(timeVal and finedelay)
'''
LaserZeroValDict={'1 ns':(480,1.68),'500 ps':(341,6.75),'200 ps':(10,1.85),'100 ps':(10,5.90)} # access via LaserZeroValDict[japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')] for the current time-range value
'''
Defining the plots for the GUI
'''
def XMPP_beamImage(plotax,fig,fixedaxes,japc,vec,awkLBox,maxVal,PixelLengthProfileFit):
import time
# fixedaxes beinhaltet x axis calibration value
time.sleep(1)#TT41.BTV.412350.STREAK,XMPP-STREAK
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData')
vec=vec.reshape(512,672)
if maxVal <=0:
# maxVal = 1.1*np.max(vec[:,300:400].sum()/100/512)
maxVal=1.5*np.mean(vec[:,315:365])+0.1*np.max(vec[:,315:365])
plotax.clear()
plotax.imshow(np.fliplr(vec.T),extent=[timeVal[-1],timeVal[1],fixedaxes[0][0],fixedaxes[0][-1]],vmin=400,vmax=maxVal,aspect='auto',cmap='Blues')
plotax.set_ylabel('Space (mm)')
plotax.set_xlabel('Time (ps)')
PixelLengthProfileFit=int(PixelLengthProfileFit)
currentFineDelay=awkLBox() #get finedelay setting
fineDelay,pxTpl=awkLBox[japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')]#acess timerange values
if fineDelay is not None:
psShift=(fineDelay-currentFineDelay)*1000
lowerlim=(512-pxTpl[0])/512*(timeVal[-1]-timeVal[1]) + psShift
upperlim=(512-pxTpl[1])/512*(timeVal[-1]-timeVal[1]) + psShift
plotax.plot((upperlim,upperlim),(fixedaxes[0][150],fixedaxes[0][-150]),c='y',linestyle='dotted',linewidth=4)
plotax.plot((lowerlim,lowerlim),(fixedaxes[0][150],fixedaxes[0][-150]),c='y',linestyle='dotted',linewidth=4,label='LASER BOX')
'''
thickness plot
'''
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
try:
startVals= np.arange(10,490,PixelLengthProfileFit)
endVals= np.arange(10+PixelLengthProfileFit,500,PixelLengthProfileFit)
startGuess=[3,1/4*np.abs(fixedaxes[0][0]-fixedaxes[0][-1]),fixedaxes[0][345],400]
import scipy as sp
slices=[(vec.T[:,l:k].sum(1)/(np.abs(l-k)))/((vec.T[:,l:k].sum()/(np.abs(l-k)))) for l,k in zip(startVals,endVals)]
fits=[sp.optimize.least_squares(gaussFIT1D,startGuess,args=(fixedaxes[0],k)) for k in slices]
parentFig=plotax.get_figure()
if len(parentFig.axes)>3:
ax2=parentFig.axes[3]
ax2.clear()
else:
ax2=plotax.twinx()
ax2.scatter(timeVal[endVals-5],[np.abs(k.x[1]) for k in fits],label='Spatial fits (mm)',s=30,marker='d',c='r')
ax2.set_ylim(0,np.minimum(np.max([np.abs(k.x[1]) for k in fits])*1.1,5))
except:
print('no spatial fit!')
plotax.set_xlim(timeVal[-1],timeVal[1])
plotax.set_title(str(japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')))
def XMPP_ProfilePlot(plotax,fig,fixedaxes,japc,vec,prfLaser,MarkerLaserStageSetValmm,textPos,StagePositionZeroValmm):
laserSetVal=MarkerLaserStageSetValmm
import scipy.constants as spc
import time
time.sleep(1)
plotax.clear()
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData')-400
currentPos=japc.getParam('AIRTR01/Acq#numericPosition')
'''
Unshifted laser value!
'''
delayZeroPos=StagePositionZeroValmm #mm
delayZeroPs=delayZeroPos*1e-3/spc.c/1e-12 #ps
ZeroPxVal,ZeroFineDelay=LaserZeroValDict[japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')]
'''Calc difference'''
psShiftDelay=2*(currentPos-delayZeroPos)*1e-3/spc.c/1e-12 # in ps, factor 2 because of the double pass
print(psShiftDelay)
# laserSetVal in ps, aber translator ist in mm
setMMpos=laserSetVal#spc.c/1e12/1e3*laserSetVal+delayZeroPos
if laserSetVal != -1 and laserSetVal != currentPos:
# set to the zero value!
japc.setParam('AIRTR01/Setting#positionIn',setMMpos)
if laserSetVal ==-1:
japc.setParam('AIRTR01/Setting#positionIn',delayZeroPos)
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
vecP=vec.reshape(512,672)[:,prfLaser[0]:prfLaser[1]].sum(1)/(prfLaser[1]-prfLaser[0])
vecP=vecP/np.max(vecP)
timeVal=np.append(timeVal[1],timeVal[1:])
plobj1=plotax.plot(np.flipud(timeVal),np.flipud(vecP),c='r',linewidth=2,label='temporal Profile')
try:
parentFig=plotax.get_figure()
if len(parentFig.axes)>3:
ax2=parentFig.axes[3]
ax2.clear()
else:
ax2=plotax.twiny()
vecP2=vec.reshape(512,672).sum(0)/(512)
plobj2=ax2.plot(fixedaxes[0],vecP2/np.max(vecP2),label='Spatial Profile')
except:
print('no standard')
try:
import scipy as sp
startGuess=[(np.max(vecP)-np.min(vecP))/2,1/100*(timeVal[-1]-timeVal[0]),timeVal[255],10]
optimres=sp.optimize.least_squares(gaussFIT1D,startGuess,args=(np.flipud(timeVal),np.flipud(vecP)))
print('Finished fit')
'''Calc TimeWindow Shift'''
import pytimber
ldb=pytimber.LoggingDB()
FineDelayStreak=ldb.get('MPPAWAKE:FASTTRIG-1:STREAKTUBE-FINEDELAY',time.strftime('%Y-%m-%d %H:%M:%S'))['MPPAWAKE:FASTTRIG-1:STREAKTUBE-FINEDELAY'][1][0]
print('Finished getting ldb finedelay value:{0:1.2f}'.format(FineDelayStreak))
FineDelay=FineDelayStreak-ZeroFineDelay # set shift
relShift=optimres.x[2]-ZeroPxVal #relative shift measured by laser
totalShift=FineDelay-(FineDelay+relShift)+psShiftDelay
print('trying to plot')
plotax.text(textPos[0],textPos[1],'StageCurrentPosition is {4:3.2f}mm\nStageZeroPosition is {3:3.2f}mm\nMeasured delay shift is:{0:3.0f}ps, set is {1:1.2f}ps\nmarker laser stage shift is:{2:3.0f}ps'.format(totalShift,FineDelay,psShiftDelay,StagePositionZeroValmm,currentPos),bbox=dict(facecolor='red', alpha=0.5))
'''Plot'''
plobj3=plotax.plot(np.flipud(timeVal),np.flipud(gaussFIT1D(optimres.x,timeVal,0)),c='g',linestyle='dotted',linewidth=1.5,label='Gauss fit: sigma={0:1.2f}ps, pos in image is {1:3.0f}ps'.format(np.abs(optimres.x[1]),optimres.x[2]))
legendAll=[l.get_label() for l in plobj1+plobj2+plobj3]
plotax.legend(plobj1+plobj2+plobj3,legendAll)
except:
print('no fitplot')
#plotax.set_ylim(np.min(vec),1.05*np.max(vec))
plotax.set_ylim(0,1.05)
#plotax.set_title('StageZeroPosition is:{0:3.2f}'.format(StagePositionZeroValmm))
'''
Starting the GUI application
'''
app = QApplication(sys.argv)
aw = AwakeWindow(["TT41.BCTF.412340/Acquisition#totalIntensityPreferred"],XMPP_beamImage,XMPP_ProfilePlot,fixedaxes=(np.linspace(-8.7,8.7,672),),selector="SPS.USER.AWAKE1",name='AwakeLaserBox Image',XMPP_beamImage={'awkLBox':laserboxMPP,'maxVal':-1,'PixelLengthProfileFit':10},XMPP_ProfilePlot={'MarkerLaserStageSetValmm':-1,'prfLaser':[0,100],'textPos':[0.1,0.2],'StagePositionZeroValmm':72.6},reverse=True)
progname='AwakeSTREAK'
aw.setWindowTitle("%s" % progname)
aw.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
aw.show()
app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/fabianPlots.py | fabianPlots.py |
import sys
#sys.path.append('/user/awakeop/AWAKE_ANALYSIS_TOOLS/plotting_tools/')
from awakeIMClass import *
import numpy as np
import scipy as sp
import matplotlib as mpl
import pickle
import time
import os
'''
Defining the plots for the GUI
'''
def BI_beamImage(plotax,fig,fixedaxes,japc,vec,awkLBox,maxVal,PixelLengthProfileFit,ProfileParam):
# fixedaxes beinhaltet x axis calibration value
try:
time.sleep(1)#TT41.BTV.412350.STREAK,TT41.BTV.412350.STREAK
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData')
vec=vec.reshape(512,672)
except:
print('no data available')
if ProfileParam[0]>ProfileParam[1]:
ProfileParam=(ProfileParam[1],ProfileParam[0])
if maxVal <=0:
# maxVal = 1.1*np.max(vec[:,300:400].sum()/100/512)
maxVal=1.5*np.mean(vec[:,(int(ProfileParam[0]/672*336)+336):(int(ProfileParam[1]/672*336)+336)])+0.1*np.max(vec[:,(int(ProfileParam[0]/672*336)+336):(int(ProfileParam[1]/672*336)+336)])
plotax.clear()
print('before plot')
plotax.imshow(np.fliplr(vec.T),extent=[timeVal[-1],timeVal[1],fixedaxes[0][150],fixedaxes[0][-150]],vmin=175,vmax=maxVal,aspect='auto',cmap='Blues')
print('behind plot')
plotax.plot((timeVal[-1],timeVal[1]),(ProfileParam[0],ProfileParam[0]),c='k',linestyle='dotted',linewidth=2)
plotax.plot((timeVal[-1],timeVal[1]),(ProfileParam[1],ProfileParam[1]),c='k',linestyle='dotted',linewidth=2)
plotax.set_ylabel('Space (px)')
plotax.set_xlabel('Time (ps)')
PixelLengthProfileFit=int(PixelLengthProfileFit)
'''
thickness plot
'''
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
try:
startVals= np.arange(10,490,PixelLengthProfileFit)
endVals= np.arange(10+PixelLengthProfileFit,500,PixelLengthProfileFit)
startGuess=[3,1/4*np.abs(fixedaxes[0][0]-fixedaxes[0][-1]),fixedaxes[0][345],400]
import scipy as sp
slices=[(vec.T[:,l:k].sum(1)/(np.abs(l-k)))/((vec.T[:,l:k].sum()/(np.abs(l-k)))) for l,k in zip(startVals,endVals)]
fits=[sp.optimize.least_squares(gaussFIT1D,startGuess,args=(fixedaxes[0],k)) for k in slices]
parentFig=plotax.get_figure()
if len(parentFig.axes)>3:
ax2=parentFig.axes[3]
ax2.clear()
else:
ax2=plotax.twinx()
ax2.scatter(timeVal[endVals-5],[np.abs(k.x[1]) for k in fits],label='Spatial fits (mm)',s=30,marker='d',c='r')
ax2.set_ylim(0,np.minimum(np.max([np.abs(k.x[1]) for k in fits])*1.1,5))
except:
print('no spatial fit!')
plotax.set_xlim(timeVal[-1],timeVal[1])
plotax.set_title(str(japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTime')))
def BI_ProfilePlot(plotax,fig,fixedaxes,japc,vec,ProfileParam):
plotax.clear()
try:
time.sleep(0.5)
timeVal=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('TT41.BTV.412350.STREAK/StreakImage#streakImageData')-175
except:
print('no data recieved!')
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
if ProfileParam[0]>ProfileParam[1]:
ProfileParam=(ProfileParam[1],ProfileParam[0])
vecP=vec.reshape(512,672)[:,ProfileParam[0]:ProfileParam[1]].mean(1)
vecP=vecP/np.max(vecP)
timeVal=np.append(timeVal[1]-timeVal[2],timeVal[1:])
plobj1=plotax.plot(np.flipud(timeVal),np.flipud(vecP),c='r',linewidth=2,label='temporal Profile')
try:
parentFig=plotax.get_figure()
if len(parentFig.axes)>4:
ax2=parentFig.axes[4]
ax2.clear()
else:
ax2=plotax.twiny()
import scipy as sp
vecP2=(vec.reshape(512,672)).mean(0)
plobj2=ax2.plot(fixedaxes[0],vecP2/np.max(vecP2),label='Spatial Profile')
startGuess=[(np.max(vecP2)-np.min(vecP2))/2,2/3*(fixedaxes[0][-1]-fixedaxes[0][0]),fixedaxes[0][335],0]
optimres=sp.optimize.least_squares(gaussFIT1D,startGuess,args=(fixedaxes[0],vecP2/np.max(vecP2)))
plotobj4=ax2.plot(fixedaxes[0],gaussFIT1D(optimres.x,fixedaxes[0],0),c='k',linestyle='dotted',linewidth=1.5,label='Gauss fit exp(-x**2/(2*sigma**2)): sigma={0:1.2f}'.format(np.abs(optimres.x[1])))
except:
print('no standard')
try:
import scipy as sp
startGuess=[(np.max(vecP)-np.min(vecP))/2,2/3*(timeVal[-1]-timeVal[0]),timeVal[255],175]
optimres=sp.optimize.least_squares(gaussFIT1D,startGuess,args=(np.flipud(timeVal),np.flipud(vecP)))
plobj3=plotax.plot(np.flipud(timeVal),np.flipud(gaussFIT1D(optimres.x,timeVal,0)),c='g',linestyle='dotted',linewidth=1.5,label='Gauss fit: sigma={0:1.2f}'.format(np.abs(optimres.x[1])))
legendAll=[l.get_label() for l in plobj1+plobj2+plobj3+plotobj4]
plotax.legend(plobj1+plobj2+plobj3+plotobj4,legendAll)
except:
print('no fitplot')
#plotax.set_ylim(np.min(vec),1.05*np.max(vec))
plotax.set_ylim(0,1.05)
'''
Starting the GUI application
'''
app = QApplication(sys.argv)
aw = AwakeWindow(["TT41.BCTF.412340/Acquisition#totalIntensityPreferred"],BI_beamImage,BI_ProfilePlot,fixedaxes=(np.arange(0,672),),selector="SPS.USER.AWAKE1",name='BI Streak Image',BI_beamImage={'awkLBox':laserboxMPP,'maxVal':-1,'PixelLengthProfileFit':10,'ProfileParam':(300,370)},BI_ProfilePlot={'ProfileParam':(300,370)},reverse=True)
progname='AwakeSTREAK'
aw.setWindowTitle("%s" % progname)
aw.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
aw.show()
app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/awakerunPlotsStreakBI.py | awakerunPlotsStreakBI.py |
from __future__ import unicode_literals
"""
Spencer tries to make a thing
"""
''' Get all the things '''
import sys
import time
import os
import matplotlib as mpl
mpl.use('Qt5Agg')
from matplotlib.backends.backend_qt5agg import FigureCanvasQTAgg as FigureCanvas
from matplotlib.figure import Figure
import matplotlib.colors as colors
import matplotlib.dates as mdates
import matplotlib.pyplot as plt
#from matplotlib import cm
import numpy as np
import pyjapc
import datetime
from PyQt5.QtWidgets import (QWidget, QLabel, QLineEdit, QComboBox, QCheckBox, QMessageBox, QGroupBox, QFormLayout, QTabWidget,
QTextEdit, QGridLayout, QVBoxLayout, QHBoxLayout, QApplication, QPushButton, QSizePolicy, QStatusBar, QRadioButton)
import PyQt5.QtCore as QtCore
from PyQt5.QtGui import (QIcon, QDoubleValidator, QIntValidator)
class Canvas(FigureCanvas):
def __init__(self, parent=None, width=5, height=4, dpi=100):
fig = Figure(figsize=(width, height), dpi=dpi, frameon=True, tight_layout=True)
fig.patch.set_facecolor('aliceblue')
self.axes = fig.add_subplot(111)
self.compute_initial_figure()
FigureCanvas.__init__(self, fig)
self.setParent(parent)
FigureCanvas.setSizePolicy(self, QSizePolicy.Expanding, QSizePolicy.Expanding)
FigureCanvas.updateGeometry(self)
def compute_initial_figure(self):
pass
class SpaceView(Canvas):
def __init__(self, *args, **kwargs):
Canvas.__init__(self)
def compute_initial_figure(self):
t_ax = [1,2,3,4,5]
data = [1,2,3,4,5]
self.axes.plot(t_ax,data)
def update_figure(self,x1,y1,x2,y2):
self.axes.cla()
self.axes.plot(x1,y1,'bo')
earth= plt.Circle((x1,y1), self.rL, color = 'skyblue')
self.axes.add_patch(earth)
self.axes.plot(x2,y2,'ro')
moon= plt.Circle((x2,y2), self.rP, color = 'lightsalmon')
self.axes.add_patch(moon)
self.axes.set_title(self.title, fontsize = 16)
self.axes.set_xlabel('x, mm')
self.axes.set_ylabel('y, mm')
self.axes.legend(self.legend,bbox_to_anchor = (1.05, 1), loc = 2, borderaxespad = 0.)
#self.axes.legend(bbox_to_anchor = (1.05, 1), loc = 2, borderaxespad = 0.)
self.axes.set_xlim([-self.axlim,self.axlim])
self.axes.set_ylim([-self.axlim,self.axlim])
self.axes.grid('on')
# axarr[0,0].yaxis.label.set_color(textcolor)
# axarr[0,0].tick_params(axis = 'y', colors = textcolor)
# axarr[0,0].xaxis.label.set_color(textcolor)
# axarr[0,0].tick_params(axis = 'x', colors = textcolor)
# axarr[0,0].set_axis_bgcolor(bgfigcolor)
self.axes.set_aspect('equal', adjustable='box')
self.draw()
class TimeView(Canvas):
def __init__(self, *args, **kwargs):
Canvas.__init__(self)
def compute_initial_figure(self):
t_ax = [datetime.datetime.now()]
data = [0]
self.axes.plot(t_ax,data,'bo')
def update_figure(self,timeBuffer,dataBuffer1,dataBuffer2):
self.axes.cla()
self.axes.plot(np.array(timeBuffer),np.array(dataBuffer1),'bo', markersize = 4)
self.axes.plot(timeBuffer,dataBuffer2,'ro', markersize =4)
self.axes.set_xlabel('time')
self.axes.xaxis.set_major_formatter(mdates.DateFormatter('%H:%M'))
self.axes.xaxis.set_major_locator(mdates.MinuteLocator(interval=5))
self.axes.set_ylabel(self.yLabel)
self.axes.set_ylim([-self.axlim,self.axlim])
# self.axes.set_xticklabels(rotation = 45)
#self.axes.legend(self.legend)
self.axes.grid('on')
# axarr[1,0].set_axis_bgcolor(bgfigcolor)
# axarr[1,0].yaxis.label.set_color(textcolor)
# axarr[1,0].tick_params(axis = 'y', colors = textcolor)
# axarr[1,0].xaxis.label.set_color(textcolor)
# axarr[1,0].tick_params(axis = 'x', colors = textcolor)
self.draw()
''' This is where my code starts '''
class Example(QWidget):
''' Init Self '''
def __init__(self):
super().__init__()
self.GUI = 'TSG41.AWAKE-LASER-DATA/ValueAcquisition#floatValue'
self.timeBuffer = []
self.xVLC3Buffer = []
self.yVLC3Buffer = []
self.xVLC5Buffer = []
self.yVLC5Buffer = []
self.x352Buffer = []
self.y352Buffer = []
self.x425Buffer = []
self.y425Buffer = []
self.axlim = 3
self.rlEx = 0.7 #fwhm laser at the waist
self.rlEn = 1 #fwhm laser at the waist
self.rp = 0.47 #fwhm p+ if sigma is 200um
self.bufferLength = 40
self.legend = ['Laser','p+']
self.initJAPC()
self.initUI()
''' JAPC initialization '''
def initJAPC(self):
self.japc = pyjapc.PyJapc("SPS.USER.AWAKE1")
''' Initialize GUI '''
def initUI(self):
self.main_widget = QWidget(self)
# Create a plotting window
self.Entrance = SpaceView(self.main_widget, width=5, height=4, dpi=100)
self.Entrance.title = 'Entrance'
self.Entrance.axlim = self.axlim
self.Entrance.rL = self.rlEn
self.Entrance.rP = self.rp
self.Entrance.legend = self.legend
# Create a plotting window
self.Exit = SpaceView(self.main_widget, width=5, height=4, dpi=100)
self.Exit.title = 'Exit'
self.Exit.axlim = self.axlim
self.Exit.rL = self.rlEx
self.Exit.rP = self.rp
self.Exit.legend = self.legend
# Create a plotting window
self.UpX = TimeView(self.main_widget, width=5, height=4, dpi=100)
self.UpX.title = 'UpXTime'
self.UpX.yLabel = 'x, mm'
self.UpX.axlim = self.axlim
self.UpX.legend = self.legend
# Create a plotting window
self.UpY = TimeView(self.main_widget, width=5, height=4, dpi=100)
self.UpY.title = 'UpYTime'
self.UpY.yLabel = 'y, mm'
self.UpY.axlim = self.axlim
self.UpY.legend = self.legend
# Create a plotting window
self.DwX = TimeView(self.main_widget, width=5, height=4, dpi=100)
self.DwX.title = 'DwXTime'
self.DwX.yLabel = 'x, mm'
self.DwX.axlim = self.axlim
self.DwX.legend = self.legend
# Create a plotting window
self.DwY = TimeView(self.main_widget, width=5, height=4, dpi=100)
self.DwY.title = 'DwYTime'
self.DwY.yLabel = 'y, mm'
self.DwY.axlim = self.axlim
self.DwY.legend = self.legend
# Create Layout
grid = QGridLayout()
grid.addWidget(self.Entrance, 0, 0)
grid.addWidget(self.Exit, 0, 1)
grid.addWidget(self.UpX, 1, 0)
grid.addWidget(self.DwX, 1, 1)
grid.addWidget(self.UpY, 2, 0)
grid.addWidget(self.DwY, 2, 1)
self.setLayout(grid)
self.setGeometry(1600, 300, 900, 1000)
# Make a window
self.setWindowTitle('Laser Plots')
self.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
# Start the show
self.show()
self.start_subs()
''' Start Subs '''
def start_subs(self):
self.japc.subscribeParam(self.GUI,self.proc_gui_data)
self.japc.startSubscriptions()
''' What to do when you have the data '''
def proc_gui_data(self, name, paramValue):
timeStamp = paramValue[0]/1e9
self.timeBuffer.append(datetime.datetime.fromtimestamp(timeStamp))
self.xVLC3Buffer.append(paramValue[7])
self.yVLC3Buffer.append(paramValue[8])
self.xVLC5Buffer.append(paramValue[9])
self.yVLC5Buffer.append(paramValue[10])
self.x352Buffer.append(paramValue[11])
self.y352Buffer.append(paramValue[12])
self.x425Buffer.append(paramValue[13])
self.y425Buffer.append(paramValue[14])
if len(self.timeBuffer)>self.bufferLength:
del(self.timeBuffer[0])
del(self.xVLC3Buffer[0])
del(self.yVLC3Buffer[0])
del(self.xVLC5Buffer[0])
del(self.yVLC5Buffer[0])
del(self.x352Buffer[0])
del(self.y352Buffer[0])
del(self.x425Buffer[0])
del(self.y425Buffer[0])
self.Entrance.update_figure(paramValue[7],paramValue[8],paramValue[11],paramValue[12])
self.Exit.update_figure(paramValue[9],paramValue[10],paramValue[13],paramValue[14])
self.UpX.update_figure(self.timeBuffer,self.xVLC3Buffer,self.x352Buffer)
self.UpY.update_figure(self.timeBuffer,self.yVLC3Buffer,self.y352Buffer)
self.DwX.update_figure(self.timeBuffer,self.xVLC5Buffer,self.x425Buffer)
self.DwY.update_figure(self.timeBuffer,self.yVLC5Buffer,self.y425Buffer)
''' Stop Subs '''
def stop_subs(self):
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
''' Clear Subs '''
def clear_subs(self):
self.japc.stopSubscriptions()
self.japc.clearSubscriptions()
''' GTFO '''
def closeEvent(self, event):
self.clear_subs()
QWidget.closeEvent(self, event)
''' Start the GUI '''
if __name__ == '__main__':
app = QApplication(sys.argv)
ex = Example()
sys.exit(app.exec_()) | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/Laser_gui.py | Laser_gui.py |
import sys
sys.path.append('/user/rieger/')
from awakeIMClass import *
import numpy as np
import scipy as sp
import matplotlib as mpl
import pickle
import time
import os
slth = 3
filt = 'yes' #'yes' or 'no'
fstrength = 10
slc = int(512/slth)
def gaus(x,a,x0,sigma,c):
return a*np.exp(-(x-x0)**2/(2*sigma**2))+c
def bunchAnalysis(vec, slth, x,timeVal):
from scipy import ndimage
#time = get_SCtimelabel() #get correct timevector
slc = int(512/slth) #compute how many stripes will be analyzed, SLice Count
time = np.arange(0,timeVal[-1],timeVal[-1]/slc) #generate a vector for correct plotting dependent on slc
if filt is 'yes':
vec = ndimage.median_filter(vec,fstrength)
print('Image is filtered')
allgvalues = getGvalues(vec,x,slth) #compute all parameters for each stripe
amplitudes = allgvalues[:,0] #in counts/intensity
centroids = allgvalues[:,1] #in millimeters
sigmas = allgvalues[:,2] #in millimeters, *1000 for micrometers
integrals = allgvalues[:,0]*sigmas*np.sqrt(2*np.pi)
print('End of one Bunch analysis...................................................')
return amplitudes, centroids, sigmas, integrals, time
def get_SCtimelabel():
import h5py
file1 = h5py.File('/user/awakeop/event_data/2017/06/02/1496354911335000000_40_25.h5','r')
timelabel = list(file1['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageTimeValues'])
timelabel[0] = -8.296966560000001 #correct the first value
timelabel[:] = [i+8.296966560000001 for i in timelabel] #shift all values s.t. we start at time t=0
return timelabel #time vector for the plots
def getDate(stamp):
from datetime import datetime
dt = datetime.fromtimestamp(stamp//1000000000)
date = dt.strftime('%d-%m-%d %H:%M:%S')
return date
def getGvalues(streakimage,x,slth):
from scipy.optimize import curve_fit
slth = slth #SLice THickness
slc = int(512/slth) #SLice Count
sections = np.arange(0,512,slth) #sections determined by slth
allgvalues = np.zeros((slc,4)) #Gauss values for each stripe will be saved here
c = 0
#print('x[0] is '+str(x[0])+' and x[-1] is '+str(x[-1]))
for i in sections: #compute the following for all stripes
if i+slth <=512: #check if we run out of image to compute stuff on
buffero = streakimage[i:i+slth,:] #selecting the values of the stripe with thickness slth
line = np.sum(buffero,0) #summing all values into a line
#print(line)
maximum = np.mean(line)*3 #computing guessing values for the gaus fit
x0 = x[345]
sigma = 1/4*np.abs(x[0]-x[-1]) #was at *1/10
c0 = np.mean(line)*0.99
#print(maximum,x0,sigma,c0)
try:
gvalues,error = curve_fit(gaus,x, line, p0=[maximum,x0,sigma,c0])
except: #fitting was not possible
gvalues = [0,0,0,0] #setting some value
print('No fitting possible, fit number '+str(c))
gvalues[2] = np.abs(gvalues[2])
allgvalues[c] = gvalues
#print(gvalues)
c = c+1
else:
break
return allgvalues #allgvalues has all the parameters of the fitted gaussians per stripe
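# --- Illustrative sketch (hypothetical helper, not called on import) ---
# getGvalues() fits one Gaussian per slth-row stripe; quick self-check on a
# synthetic 512x672 image holding a 1.0 mm wide bunch centred at x = 0.
def _demo_getGvalues():
    x = np.linspace(-8.7, 8.7, 672)
    img = 400 + 100*np.exp(-x**2/(2*1.0**2))*np.ones((512, 1))  # constant along time
    vals = getGvalues(img, x, 10)
    print('mean fitted sigma = {0:1.2f} mm (expected ~1.0)'.format(np.mean(np.abs(vals[:, 2]))))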
def ImagePlot(plotax,fig,fixedaxes,japc,vec,something,SliceThickness, YesNoFilter, Filterstrength):
#TT41.BTV.412350.STREAK,XMPP-STREAK
print('ImagePlot executed............................................................................')
import time
time.sleep(1)
timestamp=japc.getParam('BOVWA.01TT41.CAM1/ExtractionImage#imageTimeStamp')
timerange=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData').reshape(512,672)
global slth
global filt
global strength
global slc
slth = int(SliceThickness)
filt = YesNoFilter
fstrength = int(Filterstrength)
slc = int(512/slth)
print('slth = '+str(slth))
if filt == 'yes':
filtertext = 'ndimage.median_filter('+str(fstrength)+') used'
else:
filtertext = ' '
'''
in principle you now have:
image: vec (512 -> time axis, 672 -> space axis)
time axis: timeVal (note: the 0th value is usually nonsense)
x axis: fixedaxes
plotax (a pyplot axis you can plot into as usual)
you can do arbitrary calculations here (as done in these examples)
'''
#print(np.shape(vec))
if something is None:
something = 1.1*np.max(vec[:,300:400].sum()/100/512)
plotax.clear()
xmin = 250 #250
xmax = 422 #422 from 672
vec = vec[:,xmin:xmax]
plotax.imshow(np.fliplr(vec.T),extent=[timeVal[1],timeVal[-1],fixedaxes[0][xmin],fixedaxes[0][xmax]],vmin=400,vmax=np.mean(vec)*1.9,aspect='auto',cmap='jet')
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
'''
ax2.plot(time,np.abs(sigmas),'k.')
ax2.set_ylabel('Sigma',color='r')
ax2.set_xlim(timeVal[-1],timeVal[1])
'''
#BOVWA.01TT41.CAM1/ExtractionImage/imageTimeStamp
date = 'On '+getDate(timestamp)+', '
text = ', timescale: '+timerange+', '+filtertext
plotax.set_title(date+str(japc.getParam('XMPP-STREAK/StreakImage#streakImageTime'))+text)
return
def SigmaAndAmplitude(plotax,fig,fixedaxes,japc,vec,something3):
plotax.clear()
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
plobj1=plotax.plot(time,sigmas*1000,'b.-',label='Sigma along the bunch')
plotax.set_ylim(200,900)
plotax.set_xlim(time[0],time[-1])
plotax.set_ylabel('Sigma [micrometers]', color='b')
plotax.yaxis.tick_left()
#plotax.set_title('Sigma along the bunch')
plotax.yaxis.set_label_position("left")
plotax.legend()
return
def amplitude(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
#print(amplitudes)
plobj1=plotax.plot(time,amplitudes,'b.-', label='Amplitude')
plotax.set_xlim(time[0],time[-1])
plotax.set_ylabel('Amplitude',color='b')
#plotax.set_title('Amplitude along the bunch')
plotax.legend()
return
def centroid(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
#print(centroids)
plobj1=plotax.plot(time,centroids,'b.-', label='Centroid')
plotax.set_xlim(time[0],time[-1])
plotax.set_ylim(-0.5,0.5)
plotax.set_ylabel('centroid [mm]',color='b')
#plotax.set_title('Location of the centroid')
plotax.legend()
return
def integrals(plotax,fig,fixedaxes,japc,vec,something,something2):
plotax.clear()
e=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')
unit = e[-2]+e[-1]
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData').reshape(512,672)
import h5py
from scipy.optimize import curve_fit
import scipy
amplitudes, centroids, sigmas, integrals, time = bunchAnalysis(vec, slth, fixedaxes[0],timeVal)
plobj1=plotax.plot(time,integrals,'b.-',label='counts/slice')
plotax.set_xlim(time[0],time[-1])
plotax.set_xlabel('time ['+unit+']')
plotax.set_ylabel('counts/slice',color='b')
#plotax.set_title('Sum of counts per slice')
plotax.yaxis.set_label_position("left")
plotax.yaxis.tick_left()
plotax.legend()
print('Last function got called...............................................................')
return
if __name__=='__main__':
app = QApplication(sys.argv)
aw = AwakeWindow(["TT41.BCTF.412340/Acquisition#totalIntensityPreferred"],ImagePlot,SigmaAndAmplitude,amplitude,centroid,integrals,fixedaxes=(np.linspace(-8.7,8.7,672),),selector="SPS.USER.AWAKE1",name='Felipe Image',ImagePlot={'something':None,'SliceThickness':slth,'YesNoFilter':'yes','Filterstrength':10},SigmaAndAmplitude={'something3':2},amplitude={'something':None,'something2':2},centroid={'something':None,'something2':2},integrals={'something':None,'something2':2},reverse=True)
progname='felipePlots'
aw.setWindowTitle("%s" % progname)
aw.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
aw.show()
app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/felipePlots.py | felipePlots.py |
import sys
#sys.path.append('/user/awakeop/AWAKE_ANALYSIS_TOOLS/plotting_tools/')
from awakeIMClass import *
import numpy as np
import scipy as sp
import matplotlib as mpl
import pickle
import time
import os
import scipy.special as sps
import scipy.optimize as spo
'''
defining some functions (not really :)) needed for the variance estimation
'''
def zeta(x):
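    # correction factor xi(theta) for Rician-distributed magnitudes (cf. the
    # Koay fixed-point noise-estimation scheme); 0.39269908169872414 = pi/8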
return 2 + x**2 - 0.39269908169872414 * np.exp(-x**2/2) *( (2+x**2)*sps.iv(0,1/4*x**2) + x**2*sps.iv(1,1/4*x**2) )**2
def fpoint(x,mean,var):
buff= zeta(x)*(1+mean**2/var) -2
if buff <0:
return x
else:
return x - np.sqrt(buff)
# 0.42920367320510344*estVar ist varianz der rici distribution, verwende das um zu cutten wenn estimated signal =0
def estimateCut(mean,var,nmin=1,nmax=100):
skipZero=0
if nmin==0:
skipZero=1
estSig=np.zeros(mean[nmin:nmax].shape)
estNoise=np.zeros(mean[nmin:nmax].shape)
for k in range(0,estSig.shape[0]):
try:
# varianze ist die unabhängige variable, varainz bestimmt den max/min wert für buff
buff=spo.brentq(fpoint,0,20,args=(mean[k+nmin],var[k+nmin]))
estNoise[k]=var[k+nmin]/zeta(buff)
estSig[k]=np.maximum(0,mean[k+nmin]**2 + (1-2/zeta(buff))*var[k+nmin])
except:
estNoise[k]=var[k+nmin]*2 # noise fuer signal=0 from wiki
estSig[k]=0
rice_mean=np.sqrt(estNoise)*np.sqrt(np.pi/2)*sps.assoc_laguerre(-estSig/estNoise/2,0.5,0)
# estimated singla, estimated noise, mean_of_dist, variance of dist
return estSig,estNoise,rice_mean, 0.42920367320510344*estNoise
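# --- Illustrative sketch (hypothetical helper, never invoked by the GUI) ---
# Sanity check of estimateCut() on synthetic Rician magnitudes: per bin, draw
# |s + n1 + i*n2| with Gaussian noise of width sigmaN and verify that
# sqrt(estSig) roughly recovers the injected amplitude where the SNR is decent.
def _demo_estimateCut(nShots=2000, sigmaN=1.0):
    rng = np.random.RandomState(0)
    trueSig = np.linspace(0, 5, 50)  # injected amplitude per frequency bin
    samples = np.sqrt((trueSig + sigmaN*rng.randn(nShots, 50))**2
                      + (sigmaN*rng.randn(nShots, 50))**2)
    estSig, estNoise, _, _ = estimateCut(samples.mean(0), samples.var(0), nmin=0, nmax=50)
    print(np.round(np.sqrt(estSig[-5:]), 2), '~', np.round(trueSig[-5:], 2))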
'''
Defining the plots for the GUI
'''
def XMPP_beamImage(plotax,fig,fixedaxes,japc,vec,awkLBox,maxVal,PixelLengthProfileFit,ProfileParam):
# fixedaxes beinhaltet x axis calibration value
#TT41.BTV.412350.STREAK,XMPP-STREAK
try:
time.sleep(1)
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData')
vec=vec.reshape(512,672)
except:
print('Failed to retrieve data')
if ProfileParam[0]>ProfileParam[1]:
ProfileParam=(ProfileParam[1],ProfileParam[0])
if maxVal <=0:
# maxVal = 1.1*np.max(vec[:,300:400].sum()/100/512)
maxVal=1.5*np.mean(vec[:,(int(ProfileParam[0]/8.75*336)+336):(int(ProfileParam[1]/8.75*336)+336)])+0.1*np.max(vec[:,(int(ProfileParam[0]/8.75*336)+336):(int(ProfileParam[1]/8.75*336)+336)])
plotax.clear()
plotax.imshow(np.fliplr(vec.T),extent=[timeVal[-1],timeVal[1],fixedaxes[0][150],fixedaxes[0][-150]],vmin=400,vmax=maxVal,aspect='auto',cmap='Blues')
#plotax.plot((1000,0),(-1.75,-1.75),c='k',linestyle='dotted',linewidth=2)
#print('Before first plot')
plotax.plot((timeVal[-1],timeVal[1]),(ProfileParam[0],ProfileParam[0]),c='k',linestyle='dotted',linewidth=2)
#print('Before second plot')
plotax.plot((timeVal[-1],timeVal[1]),(ProfileParam[1],ProfileParam[1]),c='k',linestyle='dotted',linewidth=2)
plotax.set_ylabel('Space (mm)')
plotax.set_xlabel('Time (ps)')
PixelLengthProfileFit=int(PixelLengthProfileFit)
currentFineDelay=awkLBox() #get finedelay setting
fineDelay,pxTpl=awkLBox[japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeRange')]#acess timerange values
if fineDelay is not None:
psShift=(fineDelay-currentFineDelay)*1000
lowerlim=(512-pxTpl[0])/512*(timeVal[-1]-timeVal[1]) + psShift
upperlim=(512-pxTpl[1])/512*(timeVal[-1]-timeVal[1]) + psShift
plotax.plot((upperlim,upperlim),(fixedaxes[0][150],fixedaxes[0][-150]),c='y',linestyle='dotted',linewidth=4)
plotax.plot((lowerlim,lowerlim),(fixedaxes[0][150],fixedaxes[0][-150]),c='y',linestyle='dotted',linewidth=4,label='LASER BOX')
'''
thickness plot
'''
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
try:
startVals= np.arange(10,490,PixelLengthProfileFit)
endVals= np.arange(10+PixelLengthProfileFit,500,PixelLengthProfileFit)
startGuess=[3,1/4*np.abs(fixedaxes[0][0]-fixedaxes[0][-1]),fixedaxes[0][345],400]
import scipy as sp
slices=[(vec.T[:,l:k].sum(1)/(np.abs(l-k)))/((vec.T[:,l:k].sum()/(np.abs(l-k)))) for l,k in zip(startVals,endVals)]
fits=[sp.optimize.least_squares(gaussFIT1D,startGuess,args=(fixedaxes[0],k)) for k in slices]
parentFig=plotax.get_figure()
if len(parentFig.axes)>3:
ax2=parentFig.axes[3]
ax2.clear()
else:
ax2=plotax.twinx()
ax2.scatter(timeVal[endVals-5],[np.abs(k.x[1]) for k in fits],label='Spatial fits (mm)',s=30,marker='d',c='r')
ax2.set_ylim(0,np.minimum(np.max([np.abs(k.x[1]) for k in fits])*1.1,5))
except:
print('no spatial fit!')
plotax.set_xlim(timeVal[-1],timeVal[1])
plotax.set_title(str(japc.getParam('XMPP-STREAK/StreakImage#streakImageTime')))
def XMPP_PlotFFT(plotax,fig,fixedaxes,japc,vec,historyList,profileParam,historyBKG):
time.sleep(0.5)
if profileParam[0]>profileParam[1]:
profileParam=(profileParam[1],profileParam[0])
plotax.clear()
''' streak image data'''
try:
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec,header=japc.getParam('XMPP-STREAK/StreakImage#streakImageData',getHeader=True,unixtime=True)
vec=vec.reshape(512,672)-400 # to get roughly rid of offset
except:
print('Failed to retrieve data')
''' laser emeter to distinguish from bkg shots'''
LaserPower=japc.getParam('EMETER04/Acq#value') #in J
''' reshape data and normalise, use hanning window and no roll, hanning window mainly to get rid of ringing into the noise and decoherence of low frequency components'''
profile=(vec[:,(int(profileParam[0]/8.75*336)+336):(int(profileParam[1]/8.75*336)+336)]).mean(1) #profile
profile=(profile-np.mean(profile))*np.hanning(512) # subtract mean to get rid of offset ringing
FFT_PRF=np.abs(np.fft.fft(profile))/(np.abs(np.fft.fft(profile))[100:256]).sum() # normalise by a band far away from any measurable frequency
F=1/(timeVal[-1]-timeVal[1])*1e3
FFT_TIME=np.arange(0,512*F,F)
plobj1=plotax.plot(FFT_TIME,FFT_PRF,linewidth=2.5,label='current FFT')
axmax=np.minimum(320,np.maximum(320,40*F))
xticks=np.arange(0,axmax,20)
plotax.xaxis.set_ticks(xticks)
plotax.set_xlim(0,axmax)
plotax.set_ylim(0,np.max(FFT_PRF[3:40])*1.45)
plotax.set_xlabel('Frequency (GHz)')
''' check if historyBKG or historyList has to be appended'''
data=FFT_PRF
if LaserPower < 0.005:
historyBKG.append(FFT_PRF)
#data=np.array(historyBKG).mean(0)
if len(historyBKG)>15:
del(historyBKG[0])
else:
historyList.append(FFT_PRF)
#data=np.array(historyList).mean(0)
if len(historyList)>20:
del(historyList[0])
try:
parentFig=plotax.get_figure()
if len(parentFig.axes)>4:
ax2=parentFig.axes[4]
ax2.clear()
else:
ax2=plotax.twinx()
try:
''' estimate a decision line'''
if len(historyBKG)>1:
mean_bkg=np.array(historyBKG).mean(0)
var_bkg=np.array(historyBKG).var(0)
else:
raise Exception('historyBKG is not long enough')
#mean_bkg=np.array(historyList).mean(0)
#var_bkg=np.array(historyList).mean(0)/2 # dummy to not get nan
''' some start and end bins to make it easier'''
Aout1,Aout2,meanCut,varianceCut=estimateCut(mean_bkg,var_bkg,binS,binE) #binS and binE from outside
''' fit variance as variance estimation is !extremely! slow converging, >1k samples for halfway acceptables errors (like 50%), general problem'''
fitpoly=sp.polyfit(FFT_TIME[binS:binE],np.sqrt(varianceCut),1) #order 1 polynom is ok
sigma=sp.polyval(fitpoly,FFT_TIME[binS:binE])
detFreq=0
maxArg=0
buffarg=np.argmax(FFT_PRF[binS:binE]-(meanCut+pref*sigma)) # bin with the largest excess over the decision line
if (FFT_PRF[binS+buffarg].flatten()-(meanCut[buffarg]+pref*sigma[buffarg]))>0:
detFreq=FFT_TIME[buffarg+binS]
maxArg=binS+buffarg
print('noFail_profile_fit_and_detection_true')
plotax.plot(FFT_TIME[binS:binE],pref*sigma+meanCut,c='k')
except:
print('no distinction plot')
''' plot history'''
historyData=np.zeros(FFT_TIME.shape)
for k in historyList:
historyData=historyData+k
historyData=historyData/np.maximum(len(historyList),1)
plobj2=ax2.plot(FFT_TIME,historyData,label='FFT History (20 shots)',c='r',linestyle='dotted')
plobj3=ax2.plot((0,1),(0,1),linestyle='None',label='profileParam is independent of other Plots!\n USE PLOT BELOW TO ADJUST XMPP_PlotFFT/profileParam accordingly!!')
ax2.set_xlim(0,axmax)
ax2.set_ylim(0,np.max(data[3:40])*1.25)
ax2.xaxis.set_ticks(xticks)
''' plot selection criterion'''
legendAll=[l.get_label() for l in plobj1+plobj2+plobj3]
plotax.legend(plobj1+plobj2+plobj3,legendAll)
except:
detFreq=0
print('no fft history')
my_gui_vals = japc.getParam('TSG41.AWAKE-XMPP-FFTFREQ/ValueAcquisition#floatValue')
FFT_MAX_IND = np.argmax(FFT_PRF[3:40]) + 3
FFT_MAX_VAL = np.max(FFT_PRF[3:40])
FFT_MAX_FRQ = FFT_TIME[FFT_MAX_IND]
if 'detFreq' in locals():
AVG_MAX_IND = maxArg
AVG_MAX_VAL = np.max(data[3:40])
AVG_MAX_FRQ = detFreq#FFT_TIME[AVG_MAX_IND]
else:
AVG_MAX_VAL = my_gui_vals[4]
AVG_MAX_FRQ = my_gui_vals[3]
my_gui_vals[0] = header['acqStamp']
my_gui_vals[1] = FFT_MAX_FRQ
my_gui_vals[2] = FFT_MAX_VAL
my_gui_vals[3] = AVG_MAX_FRQ
my_gui_vals[4] = AVG_MAX_VAL
japc.setParam('TSG41.AWAKE-XMPP-FFTFREQ/ValueSettings#floatValue',my_gui_vals)
''' profile plot function'''
def XMPP_ProfilePlot(plotax,fig,fixedaxes,japc,vec,profileParam):
plotax.clear()
try:
time.sleep(0.5)
timeVal=japc.getParam('XMPP-STREAK/StreakImage#streakImageTimeValues')
vec=japc.getParam('XMPP-STREAK/StreakImage#streakImageData')
vec=vec.reshape(512,672)-400
except:
print('Failed to retrieve data')
def gaussFIT1D(prm,x,y):
return ((prm[0]/np.sqrt(2*prm[1]**2)*np.exp( - (x-prm[2])**2 /(2*prm[1]**2)) + prm[3]) -y).ravel()
if profileParam[0]>profileParam[1]:
profileParam=(profileParam[1],profileParam[0])
vecP=vec[:,int(profileParam[0]/8.75*336)+336:int(profileParam[1]/8.75*336)+336].sum(1)/(int(profileParam[1]/8.75*336)-int(profileParam[0]/8.75*336))
vecP=vecP/np.max(vecP)
timeVal=np.append(timeVal[1],timeVal[1:])
plobj1=plotax.plot(np.flipud(timeVal),np.flipud(vecP),c='r',linewidth=2,label='temporal Profile')
try:
parentFig=plotax.get_figure()
if len(parentFig.axes)>5:
ax2=parentFig.axes[5]
ax2.clear()
else:
ax2=plotax.twiny()
ax2.clear()
import scipy as sp
vecP2=vec.reshape(512,672).sum(0)/(512)
plobj2=ax2.plot(fixedaxes[0],vecP2/np.max(vecP2),label='Spatial Profile')
startGuess=[(np.max(vecP2)-np.min(vecP2))/2,2/3*(fixedaxes[0][-1]-fixedaxes[0][0]),fixedaxes[0][335],400]
optimres=sp.optimize.least_squares(gaussFIT1D,startGuess,args=(fixedaxes[0],vecP2/np.max(vecP2)))
plotobj4=ax2.plot(fixedaxes[0],gaussFIT1D(optimres.x,fixedaxes[0],0),c='k',linestyle='dotted',linewidth=1.5,label='Gauss fit exp(-x**2/(2*sigma**2)): sigma={0:1.2f}'.format(np.abs(optimres.x[1])))
''' plot lines to show profile selected'''
ax2.plot((profileParam[0],profileParam[0]),(0,1.5),c='k',linestyle='dotted')
ax2.plot((profileParam[1],profileParam[1]),(0,1.5),c='k',linestyle='dotted')
except:
print('no standard')
try:
import scipy as sp
startGuess=[(np.max(vecP)-np.min(vecP))/2,2/3*(timeVal[-1]-timeVal[0]),timeVal[255],400]
optimres=spo.least_squares(gaussFIT1D,startGuess,args=(np.flipud(timeVal),np.flipud(vecP)))
plobj3=plotax.plot(np.flipud(timeVal),np.flipud(gaussFIT1D(optimres.x,timeVal,0)),c='g',linestyle='dotted',linewidth=1.5,label='Gauss fit: sigma={0:1.2f}'.format(np.abs(optimres.x[1])))
legendAll=[l.get_label() for l in plobj1+plobj2+plobj3+plotobj4]
plotax.legend(plobj1+plobj2+plobj3+plotobj4,legendAll)
except:
print('no fitplot')
#plotax.set_ylim(np.min(vec),1.05*np.max(vec))
plotax.set_ylim(0,1.05)
'''
Starting the GUI application
'''
''' parameters for freq estimation and detection'''
binS=5
binE=45
total_p=0.01
p_single=1-(1-total_p)**(1/(binE-binS))
''' sigma * sqrt(-2log(1-F)) : prefactor sqrt() - rayleigh distribution'''
pref=np.sqrt(-2*np.log(1-(1-p_single)))
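# i.e. pref = sqrt(-2*ln(p_single)): the Rayleigh quantile factor, so a pure-noise
# bin exceeds meanCut + pref*sigma with per-bin probability p_single (~total_p over all bins)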
app = QApplication(sys.argv)
aw = AwakeWindow(["TT41.BCTF.412340/Acquisition#totalIntensityPreferred"],XMPP_beamImage,XMPP_PlotFFT,XMPP_ProfilePlot,fixedaxes=(np.linspace(-8.75,8.75,672),),selector="SPS.USER.AWAKE1",name='AwakeLaserBox Image',XMPP_beamImage={'awkLBox':laserboxMPP,'maxVal':-1,'PixelLengthProfileFit':10,'ProfileParam':(-1.5,1.5)},XMPP_PlotFFT={'historyList':[],'profileParam':(-1,1),'historyBKG':[]},XMPP_ProfilePlot={'profileParam':(-2,2)},reverse=True)
progname='XMPP AwakeSTREAK'
aw.setWindowTitle("%s" % progname)
aw.setWindowIcon(QIcon(os.path.join(os.path.dirname(__file__),'awakeicon1_FkV_icon.ico')))
aw.show()
app.exec_() | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/plotting_tools/awakerunPlotsStreak.py | awakerunPlotsStreak.py |
import numpy as np
import scipy.signal as sig
from scipy.optimize import curve_fit
from numpy.fft import fft
# Centroid and weighted RMS
def profile_moments(prof_data,prof_axis):
cent = prof_data.dot(prof_axis)/prof_data.sum()
rms = np.sqrt(prof_data.dot((prof_axis-cent)**2)/prof_data.sum())
return cent, rms
# Define Gaussian
def gaussian(x, amp, cen, wid, off):
return amp * np.exp(-(x-cen)**2 /(2*wid**2)) + off
# Fit Gaussian
def gaussFit(ax,data,guess):
try:
result,pcov = curve_fit(gaussian,ax,data,guess)
result[2] = abs(result[2])
fit = gaussian(ax, *result)
except:
result = [0,0,0,0]
fit = np.zeros(np.shape(ax))
return fit, result
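# --- Illustrative sketch (hypothetical helper, not part of the analysis chain) ---
# Minimal use of gaussFit() on a synthetic noisy Gaussian; the guess layout is
# [amplitude, centre, width, offset], matching gaussian() above.
def _demo_gaussFit():
    ax = np.linspace(-5, 5, 200)
    data = gaussian(ax, 2.0, 0.5, 1.2, 0.1) + 0.05*np.random.randn(200)
    fit, result = gaussFit(ax, data, [1.5, 0.0, 1.0, 0.0])
    print('amp, cen, wid, off =', np.round(result, 2))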
# Streak analysis
def streak_ana(data_dict,width):
# Extract axes centroid and image
x_ax = data_dict['x_ax']
t_ax = data_dict['y_ax']
xc = data_dict['mean_x']
img = data_dict['img']
# Create lineout and bands for FFT
xl = xc - width*data_dict['sig_x']
xh = xc + width*data_dict['sig_x']
x_ind = (x_ax > xl) & (x_ax < xh)
c_ind = np.argmin(min(abs(x_ax-xc)))
band = img[:,x_ind].mean(1)
line = img[:,c_ind]
outb = img[:,~x_ind].mean(1)
# Create FFT axis
nsamp = len(t_ax)
s_max = round(nsamp/2)
dt = t_ax[1]-t_ax[0]
f_max = 1/dt
t_max = t_ax[-1]-t_ax[0]
f_min = 1/t_max
full_ax = np.linspace(0,f_max,nsamp)
f_ax = 1000*full_ax[0:s_max]
# FFT the data
fb = fft(band)
fl = fft(line)
fo = fft(outb)
# Get the imaginary part of each bin (crude phase proxy; np.angle would give the true phase)
pb = np.imag(fb)
pl = np.imag(fl)
po = np.imag(fo)
# Get the absolute value of the FFT
fftb = abs(fb[0:s_max])
fftl = abs(fl[0:s_max])
ffto = abs(fo[0:s_max])
# Get the normalized FFT
nfftb = abs(fb[0:s_max])/sum(abs(fb[0:s_max]))
nfftl = abs(fl[0:s_max])/sum(abs(fl[0:s_max]))
nffto = abs(fo[0:s_max])/sum(abs(fo[0:s_max]))
# Store the data
streak_dict = {}
streak_dict['band'] = band
streak_dict['line'] = line
streak_dict['outB'] = outb
streak_dict['f_ax'] = f_ax
streak_dict['fftb'] = fftb
streak_dict['fftl'] = fftl
streak_dict['ffto'] = ffto
streak_dict['nfftb'] = nfftb
streak_dict['nfftl'] = nfftl
streak_dict['nffto'] = nffto
streak_dict['pb'] = pb
streak_dict['pl'] = pl
streak_dict['po'] = po
streak_dict['xc'] = xc
streak_dict['xl'] = xl
streak_dict['xh'] = xh
return streak_dict
# Filter and analyze image
def analyze_frame(frame,x_ax,y_ax,roi=None,bg_frame=[],do={}):
# Find pixel sum of image before applying manipulations
sum_no_filt = frame.sum()
# Set ROI to full image if none specified
if not roi:
    roi = {}
    roi['x_min'] = x_ax[0]
    roi['x_max'] = x_ax[-1]
    roi['y_min'] = y_ax[0]
    roi['y_max'] = y_ax[-1]
# Subtract background image
if bg_frame:
frame = frame - bg_frame
# Cast image as float and apply median filter
im_float = frame.astype(float)
im_filt = sig.medfilt2d(im_float)
# Extract ROI and relevant portions of axes
x_ind = (x_ax >= roi['x_min']) & (x_ax <= roi['x_max'])
y_ind = (y_ax >= roi['y_min']) & (y_ax <= roi['y_max'])
im_roi = im_filt[np.ix_(y_ind,x_ind)]
x_roi = x_ax[x_ind]
y_roi = y_ax[y_ind]
# Find pixel sum and projections of ROI'd image
pix_sum = im_roi.sum()
proj_roi_x = im_roi.mean(0)
proj_roi_y = im_roi.mean(1)
# Find centroid and RMS
xBar,xRMS = profile_moments(proj_roi_x,x_roi)
yBar,yRMS = profile_moments(proj_roi_y,y_roi)
xMax = max(proj_roi_x)
xMin = min(proj_roi_x)
yMax = max(proj_roi_y)
yMin = min(proj_roi_y)
# Perform Gaussian fits
guessX = [xMax-xMin,xBar,xRMS,xMin]
guessY = [yMax-yMin,yBar,yRMS,yMin]
fitX,resX = gaussFit(x_roi,proj_roi_x,guessX)
fitY,resY = gaussFit(y_roi,proj_roi_y,guessY)
# store data
data_dict = {}
data_dict['img'] = im_roi
data_dict['sum'] = pix_sum
data_dict['x_ax'] = x_roi
data_dict['y_ax'] = y_roi
data_dict['proj_x'] = proj_roi_x
data_dict['proj_y'] = proj_roi_y
data_dict['amp_x'] = resX[0]
data_dict['amp_y'] = resY[0]
data_dict['mean_x'] = resX[1]
data_dict['mean_y'] = resY[1]
data_dict['sig_x'] = resX[2]
data_dict['sig_y'] = resY[2]
data_dict['off_x'] = resX[3]
data_dict['off_y'] = resY[3]
data_dict['fit_x'] = fitX
data_dict['fit_y'] = fitY
data_dict['sum_no_filt'] = sum_no_filt
# device specific analyses
if do:
# Streak image analysis
if 'streak' in do.keys():
streak_data = streak_ana(data_dict,do['streak'])
data_dict['streak_data'] = streak_data
return data_dict | AWAKE-ANALYSIS-TOOLS | /AWAKE_ANALYSIS_TOOLS-0.0.2-py3-none-any.whl/analyses/analyze_frame.py | analyze_frame.py |
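# --- Illustrative sketch (hypothetical helper, not executed on import) ---
# End-to-end use of analyze_frame() plus its streak FFT branch on a synthetic
# streak frame: Gaussian in x, 100 GHz intensity beating along the time axis
# (t in ps, so f_ax below comes out in GHz).
def _demo_analyze_frame():
    x_ax = np.linspace(-8.75, 8.75, 672)
    t_ax = np.linspace(0.0, 210.0, 512)
    xx, tt = np.meshgrid(x_ax, t_ax)
    frame = 400 + 200*np.exp(-xx**2/(2*1.5**2))*(1 + 0.3*np.cos(2*np.pi*0.1*tt))
    out = analyze_frame(frame, x_ax, t_ax, do={'streak': 2.0})
    sd = out['streak_data']
    fpk = sd['f_ax'][np.argmax(sd['nfftb'][3:]) + 3]  # skip DC bins
    print('sigma_x = {0:1.2f} mm, dominant frequency = {1:3.0f} GHz'.format(out['sig_x'], fpk))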
import numpy as np
import scipy.signal as sig
from scipy.optimize import curve_fit
from numpy.fft import fft
''' Class for analyzing images '''
class FrameAna(object):
def __init__(self,frame=[],x_ax=[],y_ax=[],roi=[]):
self.init_frame = frame
self.init_xax = x_ax
self.init_yax = y_ax
self.frame = frame
self.x_ax = x_ax
self.y_ax = y_ax
self.roi = roi
self.bg_frame = np.array([])
self.bg_sub = False
self.median_filter = False
self.fit_gauss = False
self.streak_width = 2.5
self.marker_lim = -3.5
self.nEmbed = 4096
self.frame_analyzed = False
''' Main image analysis method '''
def analyze_frame(self):
# Find pixel sum of image before applying manipulations
self.sum_all = self.frame.sum()
# Cast image as float and apply median filter
self.frame = self.frame.astype(float)
# Subtract background image
if self.bg_sub:
if self.bg_frame.any():
if np.size(self.bg_frame) != np.size(self.frame):
print('Warning: Background frame size does not match image frame size. Background not subtracted')
else:
self.frame = self.frame - self.bg_frame
# Apply median filter
if self.median_filter:
self.frame = sig.medfilt2d(self.frame)
# Extract ROI and relevant portions of axes
self.x_ind = (self.x_ax >= self.roi[0]) & (self.x_ax <= self.roi[1])
self.y_ind = (self.y_ax >= self.roi[2]) & (self.y_ax <= self.roi[3])
self.frame = np.flipud(self.frame)
self.frame = self.frame[np.ix_(self.y_ind,self.x_ind)]
self.x_ax = self.x_ax[self.x_ind]
self.y_ax = self.y_ax[self.y_ind]
# Get frame min and max
self.min = np.amin(self.frame)
self.max = np.amax(self.frame)
# Find pixel sum and projections of ROI'd image
self.sum_proc = self.frame.sum()
self.proj_x = self.frame.mean(0)
self.proj_y = self.frame.mean(1)
# Find centroid and RMS
self.xBar,self.xRMS = self.profile_moments('x')
self.yBar,self.yRMS = self.profile_moments('y')
self.xMax = np.max(self.proj_x)
self.xMin = np.min(self.proj_x)
self.yMax = np.max(self.proj_y)
self.yMin = np.min(self.proj_y)
# Perform Gaussian fits
if self.fit_gauss:
self.gaussFit('x')
self.gaussFit('y')
self.frame_analyzed = True
self.frame = np.flipud(self.frame)
''' Streak Analysis Method '''
def streak_ana(self):
if not self.frame_analyzed:
print('Must run analyze_frame first.')
return
# Extract axes centroid and image
self.t_ax = np.linspace(self.y_ax[0],self.y_ax[-1],len(self.y_ax))
#self.t_ax = self.y_ax
# Create lineout and bands for FFT
if self.fit_gauss:
xc = self.mean_x
xs = self.sig_x
else:
xc = self.xBar
xs = self.xRMS
# xl is lower band, xh, is the upper band, and xc is the center line
xl = xc - xs*self.streak_width
xh = xc + xs*self.streak_width
x_ind = (self.x_ax > xl) & (self.x_ax < xh)
        c_ind = np.argmin(np.abs(self.x_ax-xc)) # fixed: argmin of a scalar min() always returned 0
# Orig profs
oBand = self.frame[:,x_ind].mean(1)
oLine = self.frame[:,c_ind]
oOutb = self.frame[:,~x_ind].mean(1)
# New interpolation
band = np.interp(self.t_ax,self.y_ax,oBand)
line = np.interp(self.t_ax,self.y_ax,oLine)
outb = np.interp(self.t_ax,self.y_ax,oOutb)
hann_band = np.hanning(len(band))*band
hann_line = np.hanning(len(line))*line
hann_outb = np.hanning(len(outb))*outb
# Create FFT axis
nsamp = len(self.t_ax)
s_max = round(nsamp/2)
dt = self.t_ax[1]-self.t_ax[0]
#dt = np.mean(np.diff(self.t_ax))
f_max = 1/dt
full_ax = np.linspace(0,f_max,nsamp)
f_ax = 1000*full_ax[0:s_max]
# Create FFT embed
arb = self.nEmbed
embed = np.zeros(arb)
embed[(round(arb/2)-round(nsamp/2)):(round(arb/2)+round(nsamp/2))] = hann_band
pad_ax = np.linspace(0,f_max,arb)
f_pad = 1000*pad_ax[0:round(arb/2)]
# FFT the data
fb = fft(band)
fl = fft(line)
fo = fft(outb)
hb = fft(hann_band)
hl = fft(hann_line)
ho = fft(hann_outb)
he = fft(embed)
# Get the absolute value of the FFT
fftb = abs(fb[0:s_max])
fftl = abs(fl[0:s_max])
ffto = abs(fo[0:s_max])
hftb = abs(hb[0:s_max])
hftl = abs(hl[0:s_max])
hfto = abs(ho[0:s_max])
hfte = abs(he[0:round(arb/2)])
# Store the results
self.inner_band = band
self.center_line = line
self.outer_band = outb
self.hann_band = hann_band
self.hann_line = hann_line
self.hann_outb = hann_outb
self.f_ax = f_ax
self.band_fft = fftb
self.line_fft = fftl
self.outer_fft = ffto
self.band_hft = hftb
self.line_hft = hftl
self.outer_hft = hfto
self.embed_hft = hfte
self.f_pad = f_pad
self.band_ref = [xl, xc, xh]
''' Streak Analysis Method '''
def marker_ana(self):
if not self.frame_analyzed:
print('Must run analyze_frame first.')
return
new_frame = np.array(self.init_frame.astype(float))
self.m_ind = (self.init_xax < self.marker_lim)
marker_area = new_frame[np.ix_(self.y_ind,self.m_ind)]
mark_area = sig.medfilt2d(marker_area)
self.m_ax = np.linspace(self.y_ax[0],self.y_ax[-1],len(self.y_ax))
self.proj_m = np.interp(self.m_ax,self.y_ax,mark_area.mean(1))
self.mark_ind = np.argmax(self.proj_m)
self.mark_val = self.y_ax[self.mark_ind]
self.gaussFit('m')
''' Generate COGs. Default Function '''
def profile_moments(self,axis):
if axis == 'x':
no_zero = self.proj_x - min(self.proj_x)
cent = no_zero.dot(self.x_ax)/no_zero.sum()
rms = np.sqrt(no_zero.dot((self.x_ax-cent)**2)/no_zero.sum())
elif axis == 'y':
no_zero = self.proj_y - min(self.proj_y)
cent = no_zero.dot(self.y_ax)/no_zero.sum()
rms = np.sqrt(no_zero.dot((self.y_ax-cent)**2)/no_zero.sum())
return cent, rms
''' Define Gaussian Shape '''
def gaussian(self, x, amp, cen, wid, off):
return amp * np.exp(-(x-cen)**2 /(2*wid**2)) + off
''' Fit Gaussian. Not called by default '''
def gaussFit(self,axis):
if axis == 'x':
guess = [self.xMax-self.xMin,self.xBar,self.xRMS,self.xMin]
#print(guess)
try:
result,pcov = curve_fit(self.gaussian,self.x_ax,self.proj_x,guess)
result[2] = abs(result[2])
fit = self.gaussian(self.x_ax, *result)
except:
print('Failed to fit in '+axis+'-direction')
result = [0,0,0,0]
fit = np.zeros(np.shape(self.x_ax))
self.amp_x = result[0]
self.mean_x = result[1]
self.sig_x = result[2]
self.off_x = result[3]
self.fit_x = fit
elif axis == 'y':
guess = [self.yMax-self.yMin,self.yBar,self.yRMS,self.yMin]
#print(guess)
try:
result,pcov = curve_fit(self.gaussian,self.y_ax,self.proj_y,guess)
result[2] = abs(result[2])
fit = self.gaussian(self.y_ax, *result)
except:
print('Failed to fit in '+axis+'-direction')
result = [0,0,0,0]
fit = np.zeros(np.shape(self.y_ax))
self.amp_y = result[0]
self.mean_y = result[1]
self.sig_y = result[2]
self.off_y = result[3]
self.fit_y = fit
elif axis == 'm':
guess = [np.max(self.proj_m)-np.min(self.proj_m),self.mark_val,10,np.min(self.proj_m)]
#print(guess)
try:
result,pcov = curve_fit(self.gaussian,self.y_ax,self.proj_m,guess)
result[2] = abs(result[2])
fit = self.gaussian(self.y_ax, *result)
except:
print('Failed to fit in '+axis+'-direction')
result = [0,0,0,0]
fit = np.zeros(np.shape(self.y_ax))
self.amp_m = result[0]
self.mean_m = result[1]
self.sig_m = result[2]
self.off_m = result[3]
            self.fit_m = fit
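
def _demo_frame_ana():
    # Hedged usage sketch: the frame is synthetic and the axis ranges / ROI
    # are illustrative assumptions, not values from the real instrument.
    frame = np.random.poisson(5.0, (512, 672)).astype(float)
    x_ax = np.linspace(-8.0, 8.0, 672)   # mm (assumed)
    y_ax = np.linspace(0.0, 210.0, 512)  # ps (assumed)
    ana = FrameAna(frame, x_ax, y_ax, roi=[-2.0, 2.0, 10.0, 200.0])
    ana.fit_gauss = True
    ana.analyze_frame()
    ana.streak_ana()
    print('x centroid %.3f mm, x rms %.3f mm' % (ana.xBar, ana.xRMS))

# --- end of analyses/frame_analysis.py (AWAKE-ANALYSIS-TOOLS) ---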
import numpy as np
import scipy as sp
import scipy.optimize  # `import scipy as sp` alone does not load the submodules
import scipy.special   # needed for sp.special.iv / assoc_laguerre below
from copy import deepcopy
import pathlib
import re
import os
import sys
import datetime as dt
import copy
import scipy.constants as spc
import h5py as h5
import matplotlib.pyplot as plt
import matplotlib.colors as plcol
import datetime
import awakeBones
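# NOTE (assumption): selectIMG() further below also uses a local helper module
# `cutParser` (cutParser.inputParser), which is assumed to be importable here.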
def meanlist(x):
r=deepcopy(x[0])
for y in x[1:]:
r +=y
return r/len(x)
def Gauss2D(x, prm): # the *1e4 is there to bring the start guesses onto the same scale
return prm[3]*1e4/(2*np.pi*prm[0]**2) * np.exp(- ((x[0] - prm[1])**2 + (x[1] - prm[2])**2)/(2*prm[0]**2)) + prm[4] #/(2*np.pi*prm[0]**2)
def gaussModel(x,coord,y):
return (Gauss2D(coord,x) -y).ravel()
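# NOTE: the pdf-style Gauss1D below is shadowed by the residual-style Gauss1D
# defined right after it; only the second definition is actually in use.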
def Gauss1D(x,mu=0,sigma=1,Area=1):
return Area/(np.sqrt(2*np.pi)*sigma)*np.exp(- (x-mu)**2/(2*sigma**2))
def GaussModel1D(x,coord,y):
return x[1]*np.exp(- (coord)**2/(2*x[0]**2))-y #/(np.sqrt(2*np.pi)*x[0])
def Gauss1D(x,coord,y):
if(len(x)<4):
x4=0
else:
x4=x[3]
return x[1]*np.exp(- (coord-x[2])**2/(2*x[0]**2))+x4-y #/(np.sqrt(2*np.pi)*x[0])
def Gauss1D_n(x,coord,y,n=1):
if(len(x)<4):
x4=0
else:
x4=x[3]
z=0
for k in range(0,n):
z=z+Gauss1D(x[k*3+0:k*3+3:1],coord,0)
return z-y+x[-1]
def heavyside(x,a):
return 0.5 * (np.sign(x-a) + 1)
"""
Berechne fuer jede datei die 2D FFT und schaetze fuer jeden datenpunkt innerhalb
eines bestimmten bereichs (<500GHz und )
durch:
http://ac.els-cdn.com/S109078070600019X/1-s2.0-S109078070600019X-main.pdf?_tid=6864cfc0-f44d-11e6-8538-00000aacb361&acdnat=1487252578_962b188f2fcb7d8af0850ce8a43f843e
"""
# np.pi/8 =0.39269908169872414
def zeta(x):
return 2 + x**2 - 0.39269908169872414 * np.exp(-x**2/2) *( (2+x**2)*sp.special.iv(0,1/4*x**2) + x**2*sp.special.iv(1,1/4*x**2) )**2
def fpoint(x,mean,var):
buff= zeta(x)*(1+mean**2/var) -2
if buff <0:
return x
else:
return x - np.sqrt(buff)
# 0.42920367320510344*estNoise is the variance of the Rice distribution; use it to cut when the estimated signal = 0
def estimateCut(mean,var,nmin=1,nmax=100):
skipZero=0
if nmin==0:
skipZero=1
estSig=np.zeros(mean[nmin:nmax].shape)
estNoise=np.zeros(mean[nmin:nmax].shape)
for k in range(0,estSig.shape[0]):
"""if skipZero==1 and k==0:
continue
"""
try:
buff=sp.optimize.brentq(fpoint,0,20,args=(mean[k+nmin],var[k+nmin]))
estNoise[k]=var[k+nmin]/zeta(buff)
estSig[k]=np.maximum(0,mean[k+nmin]**2 + (1-2/zeta(buff))*var[k+nmin])
except:
            estNoise[k]=var[k+nmin]*2 # noise for signal=0, from the Wikipedia article on the Rice distribution
estSig[k]=0
rice_mean=np.sqrt(estNoise)*np.sqrt(np.pi/2)*sp.special.assoc_laguerre(-estSig/estNoise/2,0.5,0)
return estSig,estNoise,rice_mean, 0.42920367320510344*estNoise
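
def _demo_estimateCut():
    # Hedged demo: estimate Rician signal/noise from sample statistics.
    # The mean/variance values are synthetic, purely for illustration.
    m = np.array([0.8, 1.5, 3.0, 0.6])
    v = np.array([0.5, 0.6, 0.7, 0.5])
    sig, noise, rice_mean, rice_var = estimateCut(m, v, nmin=0, nmax=4)
    print(sig, noise, rice_mean, rice_var)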
def estimateCut2d(mean,var,nmin=1,nmax=100,ymin=0,ymax=50):
skipZero=0
if nmin==0:
skipZero=1
estSig=np.zeros(mean[nmin:nmax,ymin:ymax].shape)
estNoise=np.zeros(mean[nmin:nmax,ymin:ymax].shape)
for k in range(0,estSig.shape[0]):
for l in range(0,estSig.shape[1]):
if skipZero==1 and k==0:
continue
try:
buff=sp.optimize.brentq(fpoint,0,20,args=(mean[k+nmin,l+ymin],var[k+nmin,l+ymin]))
estNoise[k,l]=var[k+nmin,l+ymin]/zeta(buff)
estSig[k,l]=np.maximum(0,mean[k+nmin,l+ymin]**2 + (1-2/zeta(buff))*var[k+nmin,l+ymin])
except:
estNoise[k,l]=1
estSig[k,l]=0
rice_mean=np.sqrt(estNoise)*np.sqrt(np.pi/2)*sp.special.assoc_laguerre(-estSig/estNoise/2,0.5,0)
return estSig,estNoise,np.nan_to_num(rice_mean), 0.42920367320510344*np.nan_to_num(estNoise)
"""
Useful functions needed
liest aus .txt datei enstprechend die werte aus (RB valve on/off laser power)
"""
def ldir(sFolder,fpattern=None):
buff=searchdir(sFolder)
return buff(filepattern=fpattern)
def getValandDate(x):
x=pathlib.Path(x)
timevals=[]
objvals=[]
with open(str(x)) as f:
i=0
for line in f:
            if i<2: # skip the first 2 lines (hardcoded)
i +=1
continue
objvals.append(float( (line.split('\t')[-1]).split()[0].replace(',','.') ))
            timevals.append( float(dt.datetime.strptime(line.split('\t')[0].split(',')[0],'%Y-%m-%d %H:%M:%S').timestamp())+3600) # +3600 because the UTC timestamp is 1 h later than Zurich local time
return np.array(timevals),np.array(objvals)
"""
Laser und Rubidium metadatenklasse
"""
# is sorted!
class NOSCDATA:
def __init__(self,RbUp,RbDown,LaserPower):
self.tRbUp,self.RbUp=getValandDate(RbUp)
self.tRbDown,self.RbDown=getValandDate(RbDown)
self.tLaser,self.LaserPow=getValandDate(LaserPower)
# takes an object: NOSCDATA
def findMatchandvalue(time,y):
LaserOn=False
LaserPower=0
try:
buff= np.where( np.abs(time-y.tLaser) < 19)[0]
except:
buff=np.array([])
if buff.any(): # non empty
LaserOn=True
        LaserPower=y.LaserPow[buff] # this returns an array -> corrected in the return statement
RbvalveUp=False
    # take all times that lie before the current time
    # and pick the most recent value
try:
buff= np.where( (time-y.tRbUp) > 0)[0] [-1]
except:
buff=None
if buff:
RbvalveUp=not y.RbUp[buff]
RbvalveDown=False
    # take all times that lie before the current time
    # and pick the most recent value
try:
buff= np.where( (time-y.tRbDown) > 0)[0] [-1]
except:
buff=None
if buff:
RbvalveDown=not y.RbDown[buff]
return LaserOn,LaserPower,RbvalveDown,RbvalveUp
def getMetadata(x): #x has to be eventFile or path/str
if x==None: #empty image
return None,None,None,None
if type(x)==type(str()) or type(x)==type(pathlib.Path()):
x=eventFile(x)
f=h5.File(str(x.path),'r')
try:
charComment=list(f['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageInfo'])[0].decode()
except:
return None,None,None,None
#shutter
buff=charComment.find('shutter')
shutter=charComment[buff:buff+20].split('"')[1]
#slit
buff=charComment.find('slit width')
slit=float(charComment[buff:buff+19].split('"')[1])
#MCP
mcp=np.array(f['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageMcpGain'])[0]
#Timerange
trange=list(f['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageTimeRange'])[0].decode()
if trange.split()[1]=='ps':
trange=float(trange.split()[0])/1e3
else:
trange=float(trange.split()[0])
f.close()
return mcp,shutter,slit,trange
def _getBeamInten(x):
if x==None:
return 0
f=h5.File(str(x),'r')
rval=list(f['AwakeEventData']['TT41.BCTF.412340']['Acquisition']['totalIntensityPreferred'])[0]
f.close()
return rval
def imagemean(x,x_axis,xmin=None,xmax=None):
x_axis=x_axis.reshape(1,x_axis.shape[0])
    # takes a numpy array and computes the x mean and variance for every y value
if xmin is not None:
xmin=np.where(x_axis>xmin)[-1][0]
else:
xmin=0
if xmax is not None:
xmax=np.where(x_axis<xmax)[-1][-1]
else:
xmax=x_axis.size-1
xax=x_axis[0,xmin:xmax]#.reshape(1,xmax-xmin)
x=x[:,xmin:xmax]/x[:,xmin:xmax].sum(1).reshape(x.shape[0],1)
mw=(x*xax).sum(1)
var=np.zeros((x.shape[0],))
for k in range(0,x.shape[0]):
        buff=np.nonzero(x[k,:])[0] # plain index array (nested-tuple indexing was removed in newer numpy)
        var[k]=(((xax[buff]-mw[k])**2)*x[k,buff]).sum()
return mw,var
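
def _demo_imagemean():
    # Hedged demo of imagemean() on a synthetic image (axis range assumed).
    img = np.random.rand(4, 672)
    xax = np.linspace(-8.0, 8.0, 672)
    mw, var = imagemean(img, xax, xmin=-2.0, xmax=2.0)
    print(mw, var)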
def selectIMG(x,meanImg=0,y=['.'],externalFiles=awakeBones.nonEventData,imStats=False,*args,**kwargs):
noMODInput=x
fParserNI=cutParser.inputParser(noMODInput,setStandardKeyword={})
fListTotal=filelist()
for k in y:
fListTotal+=ldir(k)
NI_bkg=[]
for k in fListTotal:
if fParserNI(k):
NI_bkg.append(k)
RBUPFiles,RBDOWNFILES,LASERPOWER=externalFiles
nonSCMData=NOSCDATA(RBUPFiles,RBDOWNFILES,LASERPOWER)
NI_bkg=[scMetadata(k) for k in NI_bkg]
NI_Metadata=metadatalist([envMetadata(x,nonSCMData) for x in NI_bkg])
if len(kwargs)>0:
NI_Final=NI_Metadata.search(**kwargs)
else:
NI_Final=NI_Metadata.search(LaserOn=False,RbValveDown=True,RbValveUp=True,shutter='open')
NI_Imagelist=imagelist(NI_Final)
NI_Imagelist.subtractBKG(meanImg)
if imStats:
        meanImg,var=NI_Imagelist.mean() # fixed: was eventImageListBKG, an undefined name
else:
meanImg=None
var=None
return NI_Imagelist,meanImg,var
"""
EventData to a streak camera image class
These classes create metadata infomration for sc images
"""
class eventFile:
def __init__(self,x=None):
if x is not None:
self.path=pathlib.Path(x)
self.time=int(np.floor(float(self.path.name.split('_')[0])/1e9))
self.timeUTC=datetime.datetime.utcfromtimestamp(self.time)
self.timeSTR=datetime.datetime.strftime(self.timeUTC,'%Y/%m/%d %H-%M-%S')
self.Number=int(np.floor(float(self.path.name.split('_')[2].split('.')[0])))
else:
self.path=None
self.time=None
self.Number=None
self.BeamIntensity=_getBeamInten(self.path)
def __eq__(self,other):
if isinstance(other,eventFile):
return self.time==other.time
return False
class scMetadata(eventFile):
def __init__(self,x=None):
super().__init__(x)
# missing: mode
self.mcp,self.shutter,self.slit,self.trange=getMetadata(x)
#
# metadata class that contains everything
# (also contains the laser/Rb values)
#
# gives horrible error when not used properly!
class envMetadata(scMetadata):
def __init__(self,x,y=None):
if type(x)==type(str()) or type(x)==type(pathlib.Path()):
super().__init__(x)
if y is not None:
self.LaserOn,self.LaserPower,self.RbValveDown,self.RbValveUp=findMatchandvalue(self.time,y)
else:
#copy scMetadata
self.mcp=x.mcp
self.shutter=x.shutter
self.slit=x.slit
self.trange=x.trange
self.path=x.path
self.time=x.time
self.timeUTC=x.timeUTC
self.timeSTR=x.timeSTR
self.Number=x.Number
self.BeamIntensity=x.BeamIntensity
if y is not None:
self.LaserOn,self.LaserPower,self.RbValveDown,self.RbValveUp=findMatchandvalue(self.time,y)
else:
self.LaserOn=x.LaserOn
self.LaserPower=x.LaserPower
self.RbValveDown=x.RbValveDown
self.RbValveUp=x.RbValveUp
#
# scImage class includes either all metadata orr only scmetadata, has additionally image properties
#
class scImage(envMetadata,scMetadata):
def __init__(self,x=None):
if isinstance(x,scImage):
self.copy_constructor(x)
return
if type(x)==type(str()) or type(x) == type(pathlib.Path()):
envMetadata.__init__(self,x)
if isinstance(x,scMetadata) and isinstance(x,envMetadata):
#print('blub')
super().__init__(x)
        elif isinstance(x,scMetadata): # should this become: elif isinstance(x,envMetadata)?
#print('bla')
super().__init__(x.path)
#print('blaub')
        # missing: imheight/width
if type(x)==type(None):
self.image=None
self.t=None
self.x=None
else:
f=h5.File(str(self.path),'r')
try:
self.image=np.array(f['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageData']).reshape(512,672)
self.t=np.array(f['AwakeEventData']['XMPP-STREAK']['StreakImage']['streakImageTimeValues'])
                # sadly needed workaround
self.t[0]=self.t[1]-(np.abs(self.t[1]-self.t[2]))
except:
self.image=None
self.t=None
self.x=(np.linspace(0,4.04,672)-2.02)*4.35 #hardcoded atm
f.close()
def copy_constructor(self,other):
        super(scImage,self).__init__(other) # works because super tries the base classes in order (fixed: was self.super(...))
self.image=other.image
self.t=other.t
self.x=other.x
"""
Special function for reading non eventbuilder files (i.e. .img vendor files)
and returns a eventbuilder streak class
"""
def scread(filename,mode="ieee-le") :
# select mode
if mode=="ieee-le":
mode="<"
else :
mode=">"
    # open the file
try :
f=open(filename,'rb')
except OSError as err :
print("OS error:".format(err))
else :
data_type=np.dtype(np.uint16)
data_type.newbyteorder(mode)
img_type=np.dtype(np.uint32)
img_type.newbyteorder(mode)
char_im=np.frombuffer(f.read(2),data_type)
comm_length=np.frombuffer(f.read(2),data_type)
im_width=np.int(np.frombuffer(f.read(2),data_type)[0])
im_height=np.int(np.frombuffer(f.read(2),data_type)[0])
x_offs=np.frombuffer(f.read(2),data_type)
y_offs=np.frombuffer(f.read(2),data_type)
bit_depth=np.frombuffer(f.read(2),data_type)
if bit_depth==0 :
bit_depth=1
else :
bit_depth=int((2**(bit_depth+2))/8)
        f.seek(25*2,1) # the next 50 bytes are useless, so skip them; whence=1 because the position is relative
comment=np.frombuffer(f.read(comm_length[0]),np.dtype(np.uint8)).view('c') # uint8=char -> view('c')
img_type=np.dtype('uint'+str(int(8*bit_depth)))
imgsize=im_width*im_height
        # data is stored im_width first (typically 656), repeated im_height times (typically 508)
y=np.frombuffer(f.read(imgsize*bit_depth),img_type)
mat_data=np.asmatrix(np.ones([im_height,im_width],np.dtype(np.int)))
for k in range(im_height):
mat_data[k,:]=np.asmatrix( y[(k*im_width):((k+1)*im_width)] )
f.close()
        comment=str(comment.tobytes()) # tostring() is deprecated/removed in newer numpy
# create important data
# slit:
buff=comment.find('Slit Width')
slitWidth=float(comment[buff:buff+19].split('"')[1])
# data matrix
mat_data=np.array(mat_data)
# mcp
buff=comment.find('MCP Gain')
MCP=float(comment[buff:buff+18].split(',')[0].split('"')[1])
        #timescale always in ns
buff=comment.find('Time Range')
        # handle the mode here because it is easier
mode=comment[buff:-1].find('Mode')
mode=comment[buff+mode:buff+mode+20].split('"')[1]
        # continue with the timescale
buff=comment[buff+8:buff+19].split('"')[1]
timescale=float(buff.split(' ')[0])
if buff.split(' ')[1]=='ps':
timescale=timescale/1e3
#shutter
buff=comment.find('Shutter')
shutter=False
if comment[buff+5:buff+13].split('"')[1]=='Open':
shutter=True
        # turn this into an eventbuilder class here
buff=scImage()
#members=[attr for attr in dir(scImage()) if not callable(attr) and not attr.startswith("__") ]
buff.path=pathlib.Path(filename)
buff.image=mat_data
buff.mcp=MCP
buff.Number=None
buff.trange=timescale
buff.slit=slitWidth
buff.shutter=shutter
return buff
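
# Hedged usage of scread() (the .img path below is an assumption):
#   shot = scread('/data/streak/shot_0001.img')
#   print(shot.trange, shot.mcp, shot.slit, shot.image.shape)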
"""
Finding files and getting fileslists, regexpression base, together with subclasses for streak images
"""
class cd:
"""Context manager for changing the current working directory"""
def __init__(self, newPath):
self.newPath = (pathlib.Path(newPath)).expanduser()
#self.newPath = newPath
def __enter__(self): #damit man die klasse mit with benutzen kann
self.savedPath = pathlib.Path.cwd()
os.chdir(str(self.newPath))
def __exit__(self, etype, value, traceback):
os.chdir(str(self.savedPath))
class filelist:
def __init__(self,x=[]):
#print(x)
if isinstance(x,filelist):
self.flist=x.flist
elif hasattr(x,'__getitem__'):#type(x).__name__=='list':
self.flist=x
else:
self.flist=[x]
def __eq__(self,other):
if isinstance(other,filelist):
return self.flist==other.flist
return False
def __getitem__(self,num):
return self.flist[num]
def match(self,regExpr=None):
rlist=list()
if regExpr is not None:
for i in self.flist:
if(re.search(re.compile(regExpr),str(i))):
rlist.append(i)
return filelist(rlist)
def __repr__(self):
return str(self.flist)
def __len__(self):
return len(self.flist)
def __iter__(self):
return iter(self.flist)
def __add__(self,other):
if type(other) != type(self):
return self
return filelist(self.flist + other.flist)
class searchdir:
def __init__(self, readDir='.', filepattern='.*'):
self.fileExt=filepattern
self.rDir=readDir
def __call__(self,readDir=None,filepattern=None):
if readDir!=None and not isinstance(readDir,filelist):
self.rDir=readDir
filesImage=filelist([])
if filepattern is not None:
self.fileExt=filepattern
if isinstance(readDir,filelist):
for fiter in readDir.flist:
if(re.search(self.fileExt,fiter)):
filesImage.flist.append(os.path.join(str(self.rDir),fiter))
return filesImage
#self.rDir=self.pathlib.Path(self.rDir)
with cd(self.rDir):
files=os.listdir()
            # search for the files
for fiter in files:
if(re.search(self.fileExt,fiter)):
filesImage.flist.append(os.path.join(str(self.rDir),fiter))
return filesImage
class metadatalist(filelist):
def __init__(self,x=[]):
super().__init__(x)
def append(self,x):
self.flist.append(x)
def search(self,**kwargs):
        # check for **kwargs (e.g. mcp=30) and return only the entries whose attributes match
buff=[]
for k in self.flist:
isvalid=[getattr(k,str(name))==val for name,val in kwargs.items()]
if all(isvalid):
buff.append(k)
return metadatalist(buff)
class imagelist(metadatalist):
def __init__(self,x=[]):
super().__init__(x)
if isinstance(x,metadatalist) and not isinstance(x,imagelist):
self.flist=[scImage(k) for k in x]
"""
if isinstance(x,imagelist):
self.flist=x.flist
if isinstance(x,list):
self.flist=x
"""
if isinstance(x,scImage):
self.flist=[x]
def subtractBKG(self,x):
if isinstance(x,np.ndarray):
for l,k in enumerate(self.flist):
self.flist[l].image=k.image-x
elif type(x).__name__=='scImage':
for l,k in enumerate(self.flist):
try:
self.flist[l].image=k.image-x.image
except:
pass
return self
    def mean(self): # computes the mean image, really only useful for
        # background subtraction, but worth keeping around
try:
buff=np.zeros(self.flist[0].image.shape)
buffvar=np.zeros(self.flist[0].image.shape)
rVal=self.flist[0]
for k in self.flist:
buff += k.image
rVal.image=buff/len(self.flist)
for k in self.flist:
buffvar += (k.image-rVal.image)**2
buffvar=buffvar/(len(self.flist)-1)
            return rVal,buffvar # returns an scImage and the variance matrix
except:
return scImage()
class roi: # acts on numpy arrays, at least 2-D
def __init__(self,x=None,*args):
if x is None:
self.xs=0
self.xe=None
self.ys=0
self.ye=None
return
        # at the moment: everything is given via *args
self.xs=x
buff=[]
for k in args:
buff.append(k)
self.xe=buff[0]
self.ys=buff[1]
self.ye=buff[2]
#if type(x)== type(list()):
# for k,l in zip(x,)
def shape(self):
if self.xe==None or self.ye==None:
return (0,0)
return (self.xe-self.xs,self.ye-self.ys)
def __call__(self,x):
buff=x[self.ys:self.ye,self.xs:self.xe]
return buff,buff.shape
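
def _demo_roi():
    # Hedged demo: roi(x_start, x_end, y_start, y_end) crops a 2-D array.
    r = roi(100, 300, 50, 450)
    cropped, shape = r(np.zeros((512, 672)))
    print(shape)  # -> (400, 200)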
class scManipulation(imagelist):
def __init__(self,x,fftroi=None,varianz=None):
super().__init__(x)
self.cut=False
if type(fftroi) == type(roi()):
self.roi=fftroi
else:
self.roi=roi(fftroi)
self.var=varianz
def cutNoise(self,Nsigma=2,var=None,enforce=False):
if self.cut==True and enforce==False:
return self
if var is not None:
self.var=var
if self.var is None:
return self
for k in self.flist:
buff=np.where(k.image-Nsigma*np.sqrt(self.var) <0)
#k.image=np.maximum(0,k.image-Nsigma*np.sqrt(self.var)) #scImage
k.image[buff]=0
return self
def lin_interp(self,x=None,roi=None):
if roi is not None:
self.roi=roi
buff=[]
if x is None:
for k in self.flist:
self.roi(k.image)
return
# nehme an ich habe also schon einen lineout bekommen
prm=np.polyfit(np.linspace(0,np.max(x.shape)-1,np.max(x.shape)),x,1)
#return x-(x[-1]-x[0])/(np.max(x.shape)-1)*np.linspace(0,np.max(x.shape)-1,np.max(x.shape)) - x[0]
return x-prm[0]*np.linspace(0,np.max(x.shape)-1,np.max(x.shape))-prm[1]
def profile(self,dim=1,roi=None,mean=True,window=None):
if roi is not None:
self.roi=roi
buff=[]
win=np.ones(self.flist[0].image.shape[0])
if window is not None:
win=win*window(win.shape[0])
for k in self.flist:
im,sh=self.roi(k.image)
if mean:
buff.append(im.sum(dim)/sh[1])
else:
buff.append(im.sum(dim))
if window is not None:
buff[-1]=buff[-1]*win
return buff
def roiMean(self,roi=None):
if roi is not None:
self.roi=roi
ebuff=0
        # assume that every image has the same number of pixels
if self.roi.xe is None:
            ebuff=self.flist[0].image.shape[1]-1
            xax=np.arange(self.roi.xs,ebuff+1) # fixed: .T on a plain int raised AttributeError
        else:
            xax=np.arange(self.roi.xs,self.roi.xe)
m=[]
v=[]
for k in self.flist:
m1,v1=imagemean(k.image,k.x,k.x[self.roi.xs],k.x[:self.roi.xe][-1])
m.append(m1)
v.append(v1)
return m,v
class scFFT(scManipulation):
def __init__(self,x,fftroi=None):
super().__init__(x,fftroi)
"""
if type(fftroi) == type(roi()):
self.roi=fftroi
else:
self.roi=roi(fftroi)
"""
def __call__(self):
pass
def fft1d(self,norm=False,linInterp=False,windowFKT=None):
retVal=[]
retValFFT=[]
bbool=False
for k in self.flist:
try:
buffroi,shaperoi=self.roi(k.image)
except:
buffroi,shaperoi=self.roi(k)
buffroi=buffroi.sum(1)/shaperoi[1]
nrg=buffroi.sum()
if windowFKT is not None:
buffroi=windowFKT(buffroi.shape[0])*buffroi
if linInterp:
buffroi=self.lin_interp(buffroi)
if norm:
retVal.append(np.abs(np.fft.fft(buffroi))/buffroi.size/nrg) #
else:
retVal.append(np.abs(np.fft.fft(buffroi))/buffroi.size) #
if bbool==False:
try:
df=1/(k.t[:self.roi.ye][-1]-k.t[self.roi.ys])*1e3 #tim
retValFFT=np.linspace(0,(shaperoi[0]-1)*df,shaperoi[0])
except:
df=1/(k.shape[0]) #tim
retValFFT=np.linspace(0,(k.shape[0]-1)*df,k.shape[0])
bbool=True
return retVal,retValFFT
def fftline(self,norm=False):
retVal=[]
retValFFT=[]
bbool=False
for k in self.flist:
try:
buffroi,shaperoi=self.roi(k.image)
except:
buffroi,shaperoi=self.roi(k)
buff=np.zeros(shaperoi)
normsum=(np.abs(np.fft.fft(buffroi.sum(1)))/shaperoi[0]/shaperoi[1])
for l in range(0,shaperoi[1]):
if norm:
buff[:,l]=np.abs(np.fft.fft(buffroi[:,l]))/shaperoi[0]/normsum[l]#/buffroi.sum() #
else:
buff[:,l]=np.abs(np.fft.fft(buffroi[:,l]))/shaperoi[0]/normsum[l] #
retVal.append(buff)
if bbool==False:
df=1/(k.t[:self.roi.ye][-1]-k.t[self.roi.ys])*1e3 #tim
retValFFT=np.linspace(0,(shaperoi[0]-1)*df,shaperoi[0])
bbool=True
return retVal,retValFFT
def fft2d(self,norm=False,windowX=None,windowY=None):
retVal=[]
retValFFTY=[]
retValFFTX=[]
buffroi,shaperoi=self.roi(self.flist[0].image)
window=np.ones(shaperoi)
if windowX is not None:
buffX=windowX(shaperoi[1]).reshape(1,shaperoi[1])
window=window*buffX
if windowY is not None:
buffY=windowY(shaperoi[0]).reshape(shaperoi[0],1)
window=window*buffY
bbool=False
for k in self.flist:
buffroi,shaperoi=self.roi(k.image)
nrg=buffroi.sum()
if norm:
retVal.append(np.abs(np.fft.fft2(buffroi*window))/buffroi.size/nrg)
else:
retVal.append(np.abs(np.fft.fft2(buffroi*window))/buffroi.size)
if bbool==False:
                df=1/(k.t[:self.roi.ye][-1]-k.t[self.roi.ys])*1e3 # time in 1/ps <- assumption, hence the *1e3
retValFFTY=np.linspace(0,(shaperoi[0]-1)*df,shaperoi[0])
dx=1/(k.x[:self.roi.ye][-1]-k.x[self.roi.ys])*spc.c*1e3/2/np.pi/1e9 #space
retValFFTX=np.linspace(0,(shaperoi[1]-1)*dx,shaperoi[1])
bbool=True
return retVal,retValFFTX,retValFFTY
def projectFFT(self,x,anyroi=None):
if anyroi==None:
anyroi=roi()
#buff=np.zeros((x[0].shape[0],))
#buff=np.zeros(anyroi.shape())
buff=[]
for k in x:
mybuff,roisize=anyroi(k)
buff += [mybuff.sum(1)/roisize[1]]
#rbuff=np.zeros(buff[0].shape)
#for k in buff:
# rbuff +=k
return buff
def zpadInterp1d(self,nZPad=5,norm=False,subtractMean=False,windowFKT=None):
# 1d interpolation via zeropadding, subtract mean to get rid
retVal=[]
retValFFT=[]
bbool=False
for k in self.flist:
try:
buffroi,shaperoi=self.roi(k.image)
except:
buffroi,shaperoi=self.roi(k)
buffroi=buffroi.sum(1)/shaperoi[1]
nrg=buffroi.sum()
if windowFKT is not None:
buffroi=windowFKT(buffroi.shape[0])*buffroi
lenbuff=buffroi.shape[0]
buffmean=np.mean(buffroi)
if subtractMean:
buffroi=buffroi-buffmean # subtract mean to get rid of ringing effects
buffroi=np.pad(buffroi,(0, (nZPad-1)*lenbuff ),'constant',constant_values=0)
if norm:
retVal.append(np.abs(np.fft.fft(buffroi))/lenbuff/nrg) #
else:
retVal.append(np.abs(np.fft.fft(buffroi))/buffroi.size) #
if bbool==False:
try:
df=1/(k.t[:self.roi.ye][-1]-k.t[self.roi.ys])*1e3 #tim
retValFFT=np.append(np.linspace(0,(shaperoi[0]-1)*df,buffroi.size-nZPad+1,endpoint=True),np.linspace((shaperoi[0]-1)*df+df/nZPad,(shaperoi[0]-1)*df+df*(nZPad-1)/nZPad,nZPad-1,endpoint=False))
except:
df=1/(k.shape[0]) #tim
retValFFT=np.append(np.linspace(0,(k.shape[0]-1)*df,buffroi.size-nZPad+1,endpoint=True),np.linspace((k.shape[0]-1)*df+df/nZPad,(k.shape[0]-1)*df+df*(nZPad-1)/nZPad,nZPad-1,endpoint=False))
bbool=True
return retVal,retValFFT
def zpadInterp2d(self,nZPad=3,norm=True,subtractMean=False,windowX=None,windowY=None):
# 2d interpolation via zeropadding, subtract mean to get rid
retVal=[]
retValFFTY=[]
retValFFTX=[]
bbool=False
buffroi,shaperoi=self.roi(self.flist[0].image)
lenbuffY=buffroi.shape[0]
lenbuffX=buffroi.shape[1]
window=np.ones(shaperoi)
if windowX is not None:
buffX=windowX(shaperoi[1]).reshape(1,shaperoi[1])
window=window*buffX
if windowY is not None:
buffY=windowY(shaperoi[0]).reshape(shaperoi[0],1)
window=window*buffY
for k in self.flist:
try:
buffroi,shaperoi=self.roi(k.image)
except:
buffroi,shaperoi=self.roi(k)
nrg=buffroi.sum()
if (windowX is not None) or (windowY is not None):
buffroi=window*buffroi
if subtractMean:
buffmean=np.mean(buffroi)
buffroi=buffroi-buffmean # subtract mean to get rid of ringing effects
buffroi=np.pad(buffroi,[(0, (nZPad-1)*lenbuffY),(0,(nZPad-1)*lenbuffX)] ,'constant',constant_values=0)
if norm:
retVal.append(np.abs(np.fft.fft2(buffroi))/buffroi.size/nrg) #
"""if k.Number==3002:
plt.figure()
plt.imshow(np.abs(np.fft.fft(buffroi)),vmin=0,vmax=1e-7)
"""
else:
retVal.append(np.abs(np.fft.fft2(buffroi))/buffroi.size) #
if bbool==False:
try:
df=1/(k.t[:self.roi.ye][-1]-k.t[self.roi.ys])*1e3 #tim
retValFFTY=np.append(np.linspace(0,(shaperoi[0]-1)*df,buffroi.shape[0]-nZPad+1,endpoint=True),np.linspace((shaperoi[0]-1)*df+df/nZPad,(shaperoi[0]-1)*df+df*(nZPad-1)/nZPad,nZPad-1,endpoint=False))
dx=1/(k.x[:self.roi.ye][-1]-k.x[self.roi.ys])*spc.c*1e3/2/np.pi/1e9 #space
                    retValFFTX=np.append(np.linspace(0,(shaperoi[1]-1)*dx,buffroi.shape[1]-nZPad+1,endpoint=True),np.linspace((shaperoi[1]-1)*dx+dx/nZPad,(shaperoi[1]-1)*dx+dx*(nZPad-1)/nZPad,nZPad-1,endpoint=False)) # fixed: first segment used df instead of dx
except:
df=1/(k.shape[0]) #tim
retValFFTY=np.append(np.linspace(0,(k.shape[0]-1)*df,buffroi.shape[0]-nZPad+1,endpoint=True),np.linspace((k.shape[0]-1)*df+df/nZPad,(k.shape[0]-1)*df+df*(nZPad-1)/nZPad,nZPad-1,endpoint=False))
dx=1/(k.x[:self.roi.ye][-1]-k.x[self.roi.ys])*spc.c*1e3/2/np.pi/1e9 #space
                    retValFFTX=np.append(np.linspace(0,(k.shape[1]-1)*dx,buffroi.shape[1]-nZPad+1,endpoint=True),np.linspace((k.shape[1]-1)*dx+dx/nZPad,(k.shape[1]-1)*dx+dx*(nZPad-1)/nZPad,nZPad-1,endpoint=False)) # fixed: the assignment operator was missing
bbool=True
return retVal,retValFFTY, retValFFTX
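
def _demo_scFFT():
    # Hedged demo: 1-D FFT of an ROI-averaged profile for a raw array; with no
    # time axis attached, fft1d() falls back to a pixel-frequency axis.
    frames = [np.random.rand(512, 672)]
    f = scFFT(frames, fftroi=roi(200, 400, 0, 512))
    spectra, freq_axis = f.fft1d(windowFKT=np.hanning)
    print(freq_axis.shape, spectra[0].shape)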
"""
Plotting class for streak class
takes as inpout an scImage object and then can do different plots for it
Class is inefficient, but thats ok
"""
class plotSC:
def __init__(self,x,tickfsize=20,setmax=275,prf_roi=roi(225,425,0,None)):
if type(x).__name__ != 'scImage':
            x=scImage(x) # normally this now only works with pass
self.image=x
self.tickfsize=tickfsize
self.roi=prf_roi
self.linewidth=1.5
self.max=setmax
self.min=np.min(self.image.image)
#def definePlot(self,)
def streakOverview(self,fig=1,profile=True,prf_roi=None,statistics=False,stat_mean=True,ax=None,cmap='Blues',lsize=20,**kwargs):
if prf_roi is not None:
self.roi=prf_roi
if ax is not None:
a=ax
else:
plt.figure(fig).clf()
a=plt.figure(fig).gca()
fig=plt.gcf()
fig.set_size_inches(18.5, 10.5)
buff=a.imshow(self.image.image,vmin=self.min,vmax=self.max,extent=[self.image.x[0],self.image.x[-1],self.image.t[-1],self.image.t[0]],aspect='auto',cmap=plt.get_cmap(cmap))
if ax is None:
plt.colorbar(buff)
a.set_xlabel('Beam transverse dimension (mm)',fontsize=lsize+2)
a.set_ylabel('Time (ps)',fontsize=lsize+2)
a_extra=None
if ax is None:
plt.gcf().suptitle(str(self.image.Number)+'/'+datetime.datetime.utcfromtimestamp(self.image.time).strftime('%h %d: %H-%M-%S')+'/Laser power: '+str(self.image.LaserPower)+'\nRb valve open up/downstream: ' +str(self.image.RbValveUp)+'/'+str(self.image.RbValveDown),fontsize=24)
mean,varianz=imagemean(self.image.image,self.image.x,self.image.x[self.roi.xs],self.image.x[:self.roi.xe][-1])
if stat_mean:
a.plot(mean,self.image.t,c='r',linewidth=self.linewidth,linestyle='dotted')
if statistics:
a.plot(np.sqrt(varianz)+mean,self.image.t,c='m',linewidth=self.linewidth,linestyle='dotted')
a.plot(-np.sqrt(varianz)+mean,self.image.t,c='m',linewidth=self.linewidth,linestyle='dotted')
if profile:
t=self.image.t
myprof=self.image.image[:,self.roi.xs:self.roi.xe].sum(1)/(self.roi.xe-self.roi.xs)
a.plot((self.image.x[self.roi.xs],self.image.x[self.roi.xs]),(t[self.roi.ys],t[:self.roi.ye][-1]),linewidth=self.linewidth,linestyle='--',color='k')
a.plot((self.image.x[self.roi.xe],self.image.x[self.roi.xe]),(t[self.roi.ys],t[:self.roi.ye][-1]),linewidth=self.linewidth,linestyle='--',color='k')
a.plot((self.image.x[self.roi.xs],self.image.x[self.roi.xe]),(t[self.roi.ys],t[self.roi.ys]),linewidth=self.linewidth,linestyle='--',color='k')
a.plot((self.image.x[self.roi.xs],self.image.x[self.roi.xe]),(t[:self.roi.ye][-1],t[:self.roi.ye][-1]),linewidth=self.linewidth,linestyle='--',color='k')
            # now plot a lineout; it has no labels
a_extra=plt.gcf().add_axes(a.get_position(),frameon=False)
a_extra.xaxis.tick_top()
#ax0_extra.xaxis.set_ticklabels([])
a_extra.plot(myprof,np.flipud(t),c='g',linewidth=self.linewidth*1.5,linestyle='dotted')
a_extra.yaxis.set_ticklabels([])
a_extra.set_yticks(ticks=[])
a_extra.set_ylim(t[0],t[-1])
a_extra.set_xlim(np.min(myprof),4*np.max(myprof))
a.tick_params(axis='both',labelsize=lsize)
a.set_xlim(self.image.x[0],self.image.x[-1])
a.set_ylim(self.image.t[-1],self.image.t[0])
return a,a_extra
def streakStatistic(self,fig=2,ax=None,fsize=20):
if ax is not None:
a=ax
else:
plt.figure(fig).clf()
a=plt.figure(fig).gca()
fig=plt.gcf()
fig.set_size_inches(18.5, 10.5)
mean,varianz=imagemean(self.image.image,self.image.x,self.image.x[self.roi.xs],self.image.x[:self.roi.xe][-1])
a.plot(self.image.t,mean,c='r',label='Mean value',linewidth=self.linewidth*1.0)
if ax is None:
plt.gcf().suptitle(str(self.image.Number)+'/'+datetime.datetime.utcfromtimestamp(self.image.time).strftime('%h %d: %H-%M-%S')+'/Laser power: '+str(self.image.LaserPower)+'\nRb valve open up/downstream: ' +str(self.image.RbValveUp)+'/'+str(self.image.RbValveDown),fontsize=22)
a.set_ylabel('Mean (mm)',fontsize=fsize)
plt.legend(loc='upper left',fontsize=fsize-2)
a_extra=a.twinx()
a_extra.plot(self.image.t,np.sqrt(varianz),c='m',linewidth=self.linewidth*1.0,label='std deviation')
a_extra.set_ylabel('std deviation (mm)',fontsize=fsize)
#a_extra.yaxis.set_ticklabels([])
plt.legend(loc='upper right',fontsize=fsize-2)
a.tick_params(axis='both',labelsize=fsize)
a_extra.tick_params(axis='both',labelsize=fsize)
a.set_xlabel('Time (ps)',fontsize=fsize)
a.set_ylim(-.8,-.2)
a_extra.set_ylim(0.4,0.9)
return self
def streakProfile(self,fig=3,a=None,fsize=20,n_roi=None,startGuess=np.array([0.7,100,-0.4,0]),dim=0,ax=None,**kwargs):
if ax is not None:
a=ax
else:
plt.figure(fig).clf()
a=plt.figure(fig).gca()
fig=plt.gcf()
fig.set_size_inches(16.5, 9.5)
roiL=n_roi
buff=scManipulation(self.image)
if n_roi is None:
roiL=[self.roi]
optimres=[]
lObj=[]
if ax is None:
plt.gcf().suptitle(str(self.image.Number)+'/'+datetime.datetime.utcfromtimestamp(self.image.time).strftime('%h %d: %H-%M-%S')+'/Laser power: '+str(self.image.LaserPower)+'\nRb valve open up/downstream: ' +str(self.image.RbValveUp)+'/'+str(self.image.RbValveDown),fontsize=fsize+2)
for k in roiL:
x=self.image.x[k.xs:k.xe]
buff2=buff.profile(dim,roi=k)
            optimres.append(sp.optimize.least_squares(Gauss1D,startGuess,args=(x,buff2[0]),verbose=0)) # fixed: was scDefines.Gauss1D, an unimported module
            lObj+=a.plot(x,Gauss1D(optimres[-1].x,x,0),linewidth=self.linewidth)
lObj+=a.plot(x,buff2[0],linewidth=self.linewidth)
#a.legend(lObj,['roi:'+str(l+1),'roi'+str(l+1)+' fit: '+'{0:0.2f}'.format(optimres[-1].x[0])+'and {0:0.2f}'.format(optimres[-1].x[2])])
#l=l+1
b=[]
for k in range(0,len(optimres)):
b+=['roi:'+str(k+1),'roi'+str(k+1)+' fit: '+'{0:0.2f}'.format(optimres[k].x[0])+' and {0:0.2f}'.format(optimres[k].x[2])]
#a.legend(lObj,[['roi:'+str(k+1),'roi'+str(k+1)+' fit: '+'{0:0.2f}'.format(optimres[k].x[0])+'and {0:0.2f}'.format(optimres[k].x[2])] for k in range(0,len(optimres))])
a.legend(lObj,b)
"""
for k,l in zip(optimres,range(0,len(optimres))):
a.plot(x,Gauss1D(k.x,x,0),x,,label='roi:'+str(l+1),linewidth=self.linewidth)
"""
#plt.legend(loc='upper right',fontsize=fsize-2)
a.tick_params(axis='both',labelsize=fsize)
if dim==0:
a.set_xlabel('Space (mm)',fontsize=fsize)
else:
a.set_xlabel('Time (ps)',fontsize=fsize)
return self,optimres,a,lObj,b
def __call__(self,figN,**kwargs): #standard plot image, lineout, FFT
fig=plt.figure(figN)
fig.set_size_inches(18.5,10)
fig.clf()
fig.suptitle(datetime.datetime.utcfromtimestamp(self.image.time).strftime('%h %d: %H-%M-%S')+' / Number:' + str(self.image.Number) + ' Laser Power:' +str(self.image.LaserPower) + ' Rb Valve Up/Down:' + str(self.image.RbValveDown) + '/' + str(self.image.RbValveUp),fontsize=18)
self.ax=[fig.add_subplot(2,2,n+1) for n in range(0,4)]
ax=self.ax
self.streakOverview(ax=ax[0],profile=True,lsize=12,prf_roi=self.roi,statistics=True,**kwargs)
self.streakStatistic(ax=ax[1],fsize=14)
self.streakProfile(ax=ax[2],fsize=14,**kwargs)
        return self

# --- end of analyses/awakeMeat.py (AWAKE-ANALYSIS-TOOLS) ---
import datetime
import json
try:
from urllib.request import urlopen
except:
from urllib2 import urlopen
name = 'AWRS'
version = '2019-08-14T1152Z'
def METAR(
identifier = 'EGPF',
URL = 'https://avwx.rest/api/metar/'
):
try:
file_URL = urlopen(URL + identifier)
data_string = file_URL.read().decode('utf-8')
data_JSON = json.loads(data_string)
report = {}
report['raw'] = data_JSON
report['METAR'] = data_JSON['raw']
except:
return None
report['identifier'] = identifier
report['dewpoint'] = int(data_JSON['dewpoint']['value'])
report['QNH'] = data_JSON['altimeter']['value']
report['temperature'] = int(data_JSON['temperature']['value'])
report['visibility'] = int(data_JSON['visibility']['value'])
try:
report['wind_direction'] = int(data_JSON['wind_direction']['value'])
except:
report['wind_direction'] = None
report['wind_speed'] = int(data_JSON['wind_speed']['value'])
# datetime
now = datetime.datetime.utcnow()
tmp = datetime.datetime.strptime(
data_JSON['time']['repr'],
'%d%H%MZ'
)
report['datetime'] = datetime.datetime(
now.year,
now.month,
tmp.day,
tmp.hour,
tmp.minute
)
report['time_UTC'] = report['datetime'].strftime('%Y-%m-%dT%H%MZ')
# rain
if 'RA' in report['METAR']:
report['rain'] = True
else:
report['rain'] = False
return report
def TAF(
identifier = 'EGPF',
URL = 'https://avwx.rest/api/taf/'
):
try:
file_URL = urlopen(URL + identifier)
data_string = file_URL.read().decode('utf-8')
data_JSON = json.loads(data_string)
report = {}
report['raw'] = data_JSON
report['TAF'] = data_JSON['raw']
except:
return None
report['identifier'] = identifier
# list of tuples of start and stop TAF datetimes for rain predictions
report['rain_TAF_datetimes'] = [
(forecast['start_time']['repr'], forecast['end_time']['repr'])\
for forecast in data_JSON['forecast'] if 'RA' in forecast['raw']
]
# list of tuples of start and stop datetimes for rain predictions
report['rain_datetimes'] = []
now = datetime.datetime.utcnow()
for datetimes in report['rain_TAF_datetimes']:
start_datetime = TAF_datetime_to_datetime_object(
datetime_string = datetimes[0],
datetime_for_year_and_month = now
)
stop_datetime = TAF_datetime_to_datetime_object(
datetime_string = datetimes[1],
datetime_for_year_and_month = now
)
report['rain_datetimes'].append((start_datetime, stop_datetime))
# list of human-readable time windows in style %Y-%m-%dT%H%MZ
report['rain_human_readable_datetimes'] = []
for datetimes in report['rain_datetimes']:
report['rain_human_readable_datetimes'].append(
datetimes[0].strftime('%Y-%m-%dT%H%MZ') +\
'--' +\
datetimes[1].strftime('%Y-%m-%dT%H%MZ')
)
return report
def TAF_datetime_to_datetime_object(
datetime_string = None,
datetime_for_year_and_month = None # e.g. datetime.datetime.utcnow()
):
"""
Preprocess datetimes to change hours from 24 to 00, incrementing the date
as necessary.
"""
if datetime_string.endswith('24'):
datetime_string = datetime_string[:-2] + '00'
tmp = datetime.datetime.strptime(datetime_string, '%d%H') +\
datetime.timedelta(days = 1)
else:
tmp = datetime.datetime.strptime(datetime_string, '%d%H')
return datetime.datetime(
datetime_for_year_and_month.year,
datetime_for_year_and_month.month,
tmp.day,
tmp.hour,
tmp.minute
)
def rain_datetimes(
identifier = 'EGPF'
):
report = TAF(identifier = identifier)
return report['rain_datetimes']
def rain_human_readable_datetimes(
identifier = 'EGPF',
return_list = True,
return_string = False
):
report = TAF(identifier = identifier)
if return_list:
return report['rain_human_readable_datetimes']
if return_string:
return ', '.join(report['rain_human_readable_datetimes'])
def rain_soon(
identifier = 'EGPF',
minutes = 30
):
test_time = datetime.datetime.utcnow() +\
datetime.timedelta(minutes = minutes)
report = TAF(identifier = identifier)
if report['rain_datetimes']:
for datetimes in report['rain_datetimes']:
if datetimes[0] <= test_time <= datetimes[1]:
return True
    return False
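
# Hedged usage (needs network access; avwx.rest may nowadays require an API key):
#   report = METAR('EGPF')
#   if report is not None:
#       print(report['time_UTC'], report['temperature'], report['rain'])
#   print(rain_soon('EGPF', minutes=60))

# --- end of AWRS.py (AWRS) ---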
import base64
import json
import logging
import boto3
from botocore.exceptions import ClientError
API = 'secretsmanager'
REGION = 'eu-west-1'
logger = logging.getLogger(__name__)
logger.setLevel(logging.DEBUG)
def _get_client():
# Create a Secrets Manager client
session = boto3.session.Session()#profile_name='bns')
client = session.client(
service_name=API,
region_name=REGION,
)
return client
def _secret_fetch(key):
client = _get_client()
try:
get_secret_value_response = client.get_secret_value(
SecretId=key
)
except ClientError as e:
if e.response['Error']['Code'] == 'DecryptionFailureException':
# Secrets Manager can't decrypt the protected secret text using the provided KMS key.
# Deal with the exception here, and/or rethrow at your discretion.
raise e
elif e.response['Error']['Code'] == 'InternalServiceErrorException':
# An error occurred on the server side.
# Deal with the exception here, and/or rethrow at your discretion.
raise e
elif e.response['Error']['Code'] == 'InvalidParameterException':
# You provided an invalid value for a parameter.
# Deal with the exception here, and/or rethrow at your discretion.
raise e
elif e.response['Error']['Code'] == 'InvalidRequestException':
# You provided a parameter value that is not valid for the current state of the resource.
# Deal with the exception here, and/or rethrow at your discretion.
raise e
        elif e.response['Error']['Code'] == 'ResourceNotFoundException':
            # We can't find the resource that you asked for.
            # Deal with the exception here, and/or rethrow at your discretion.
            raise e
        else:
            # Unrecognised error code: re-raise rather than silently returning None.
            raise e
else:
# Decrypts secret using the associated KMS CMK.
# Depending on whether the secret is a string or binary, one of these fields will be populated.
if 'SecretString' in get_secret_value_response:
return json.loads(get_secret_value_response['SecretString'])
else:
return base64.b64decode(get_secret_value_response['SecretBinary'])
def get_application_secret(secret_key):
if not secret_key:
raise ValueError('Needs a secrets key. Please provide a key')
return _secret_fetch(secret_key)
def get_database_secret():
secret_name = "prod/mysql"
    return _secret_fetch(secret_name)
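
# Hedged usage (secret names and AWS credentials/permissions are assumptions):
#   db_creds = get_database_secret()                  # fetches "prod/mysql"
#   api_key = get_application_secret('prod/api-key')  # any JSON secret by name

# --- end of aws/boto3.py (AWS-Client-Wrappers) ---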
from typing import Optional
import boto3
import botocore
from aws.base import AWSClientBase
AWSS3_PUBLIC_URL = 'https://s3-{}.amazonaws.com/{}/{}'
class S3Client(AWSClientBase):
def __init__(self):
super().__init__()
self.client = self.session.client('s3')
self.llclient = boto3.client('s3', config=botocore.client.Config(signature_version=botocore.UNSIGNED))
def list_dirs(self, bucket, prefix=''):
if prefix != '':
prefix = prefix + '/'
response = self.llclient.list_objects(
Bucket=bucket,
Prefix=prefix,
Delimiter='/'
)
folders = [i.get('Prefix') for i in response.get('CommonPrefixes')]
return folders
def list_objects(self, bucket, prefix=''):
return self.client.list_objects(Bucket=bucket, Prefix=prefix).get('Contents', None)
def make_public_url(self, bucket_name, key):
"""
Gets the public URL from the bucket name and object key
:param bucket_name: S3 bucket name
:param key: object key
:return:
"""
bucket_location = self.client.get_bucket_location(Bucket=bucket_name)
path = AWSS3_PUBLIC_URL.format(bucket_location['LocationConstraint'], bucket_name, key)
return path
def presigned_post(self, bucket, key, acl=None):
if acl is None:
acl = [{'acl': 'public-read'}]
return self.client.generate_presigned_post(bucket, key, Conditions=acl)
def delete(self, bucket, key) -> bool:
"""
Handles the delete operation for bucket objects.
:param bucket: S3 bucket name
:param key: Object key within the bucket
:return: operation status
:rtype bool
"""
        status = self.client.delete_object(Bucket=bucket, Key=key) # fixed: boto3 expects capitalised Bucket/Key
return status.get('DeleteMarker')
def upload(self, path, image, bucket, target_path=None, geturl=False, public=False) -> Optional[str]:
"""
Handles the upload procedure from local filesystem to S3 bucket. This API splits up
large files automatically. Parts are then uploaded in parallel.
:param path: The path where the file is located
:param image: The filename and extension
:param bucket: The bucket name that we wish to post to
:param target_path: The directory within the bucket. I.e <bucket>/images/
:param geturl: Whether to return the public URL of the object uploaded
:return: The S3 public access URL for the image once uploaded
:rtype str
"""
if not target_path:
target_path = ''
source = '{}{}'.format(path, image)
        if target_path and not target_path.endswith('/'): # guard against an empty prefix (IndexError)
            target_path = target_path + '/'
s3target = '{}{}'.format(target_path, image)
args = {}
if public:
args = {'ACL': 'public-read'}
self.client.upload_file(source, bucket, s3target, ExtraArgs=args)
if geturl:
            return self.make_public_url(bucket, s3target) # fixed: was self.__make_public_url, which does not exist
        return True
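
# Hedged usage (bucket name and paths are assumptions):
#   s3 = S3Client()
#   print(s3.list_dirs('my-bucket'))
#   url = s3.upload('/tmp/', 'image.png', 'my-bucket',
#                   target_path='images', geturl=True, public=True)

# --- end of aws/s3.py (AWS-Client-Wrappers) ---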
import math
import matplotlib.pyplot as plt
import numpy as np
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
plt.show()
def replace_stats_with_data(self, sample=True):
"""Function to calculate mean and standard deviation from the data set
Args:
Sample or not
Returns:
float: the mean value
float: the stdev value
"""
# TODO: The read_data_file() from the Generaldistribution class can read in a data
# file. Because the Gaussiandistribution class inherits from the Generaldistribution class,
# you don't need to re-write this method. However, the method
# doesn't update the mean or standard deviation of
# a distribution. Hence you are going to write a method that calculates mean and stdev
# updates the mean attribute
# updates the standard deviation attribute
#
# Hint: You can use the calculate_mean() and calculate_stdev() methods
# defined previously.
self.calculate_mean()
self.calculate_stdev(sample)
return self.mean, self.stdev
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / math.sqrt(2*math.pi*(self.stdev**2))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
# mu , sigma = self.replace_stats_with_data(True)
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
        axes[1].set_ylabel('Density') # fixed: axes[0] was labelled twice
plt.show()
return x, y
def create_gaussian_file(self, mu, sigma, size_file):
s = np.random.normal(mu, sigma, size_file)
f = open("numbers_gaussian.txt", "w+")
i = 0
for i in range(size_file):
line_number = "{} \n".format(str(int(s[i])))
f.write(line_number)
f.close()
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-IFG-distributions | /AWS_IFG_distributions-1.0.tar.gz/AWS_IFG_distributions-1.0/AWS_IFG_distributions/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
import numpy as np
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) the total number of trials
TODO: Fill out all TODOs in the functions below
"""
# A binomial distribution is defined by two variables:
# the probability of getting a positive outcome
# the number of trials
# If you know these two values, you can calculate the mean and the standard deviation
#
# For example, if you flip a fair coin 25 times, p = 0.5 and n = 25
# You can then calculate the mean and standard deviation with the following formula:
# mean = p * n
# standard deviation = sqrt(n * p * (1 - p))
#
def __init__(self, prob=.5, size=20):
Distribution.__init__(self, 1.0 * prob * size, 1.0 * math.sqrt(size * prob * (1 - prob)))
self.p = prob
self.n = size
# TODO: store the probability of the distribution in an instance variable p
# TODO: store the size of the distribution in an instance variable n
# TODO: Now that you know p and n, you can calculate the mean and standard deviation
# Use the calculate_mean() and calculate_stdev() methods to calculate the
# distribution mean and standard deviation
#
# Then use the init function from the Distribution class to initialize the
# mean and the standard deviation of the distribution
#
# Hint: You need to define the calculate_mean() and calculate_stdev() methods
# farther down in the code starting in line 55.
# The init function can get access to these methods via the self
# variable.
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
# TODO: calculate the mean of the Binomial distribution. Store the mean
# via the self variable and also return the new mean value
self.mean = 1.0 * self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = 1.0 * math.sqrt((1.0 * self.n) * self.p * (1 - self.p))
return self.stdev
# TODO: calculate the standard deviation of the Binomial distribution. Store
# the result in the self standard deviation attribute. Return the value
# of the standard deviation.
    def replace_stats_with_data(self): # plain method (the @property decorator made it inconsistent with the Gaussian API)
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
# TODO: The read_data_file() from the Generaldistribution class can read in a data
# file. Because the Binomaildistribution class inherits from the Generaldistribution class,
# you don't need to re-write this method. However, the method
# doesn't update the mean or standard deviation of
# a distribution. Hence you are going to write a method that calculates n, p, mean and
# standard deviation from a data set and then updates the n, p, mean and stdev attributes.
# Assume that the data is a list of zeros and ones like [0 1 0 1 1 0 1].
#
# Write code that:
# updates the n attribute of the binomial distribution
# updates the p value of the binomial distribution by calculating the
# number of positive trials divided by the total trials
# updates the mean attribute
# updates the standard deviation attribute
#
# Hint: You can use the calculate_mean() and calculate_stdev() methods
# defined previously.
data_list = self.data
self.n = len(data_list)
self.p = sum(data_list)/ self.n # (mean - stdev)/mean
self.calculate_mean()
self.calculate_stdev()
return self.p, self.n
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
# TODO: Use the matplotlib package to plot a bar chart of the data
# The x-axis should have the value zero or one
# The y-axis should have the count of results for each case
#
# For example, say you have a coin where heads = 1 and tails = 0.
# If you flipped a coin 35 times, and the coin landed on
# heads 20 times and tails 15 times, the bar chart would have two bars:
# 0 on the x-axis and 15 on the y-axis
# 1 on the x-axis and 20 on the y-axis
# Make sure to label the chart with a title, x-axis label and y-axis label
# plt.hist(self.data)
x = [0, 1]
height = [(1 - self.p) * self.n, self.p * self.n]
# plt.bar(x, height, width=0.8, bottom=None, *, align='center', data=None, **kwargs)
plt.bar(x, height)
plt.title('Distribution of Outcomes')
plt.ylabel('Number')
plt.xlabel('Outcome')
plt.show()
def pdf(self, k):
"""Probability density function calculator for the binomial distribution.
Args:
k (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
# TODO: Calculate the probability density function for a binomial distribution
# For a binomial distribution with n trials and probability p,
# the probability density function calculates the likelihood of getting
# k positive outcomes.
#
# For example, if you flip a coin n = 60 times, with p = .5,
# what's the likelihood that the coin lands on heads 40 out of 60 times?
return (math.factorial(self.n) / (math.factorial(k) * math.factorial(self.n - k))) * (self.p ** k) * (
(1 - self.p) ** (self.n - k))
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
# TODO: Use a bar chart to plot the probability density function from
# k = 0 to k = n
# Hint: You'll need to use the pdf() method defined above to calculate the
# density function for every value of k.
# Be sure to label the bar chart with a title, x label and y label
# This method should also return the x and y values used to make the chart
# The x and y values should be stored in separate lists
x = []
y = []
k = 0
n = self.n
# calculate the x values to visualize
for k in range(n + 1):
x.append(k)
y.append(self.pdf(k))
# make the plot
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def create_binomial_file (self, number_trials, p, array_size):
# s = np.random.randint(2, size=size_file)
s = np.random.binomial(number_trials, p, array_size)
f = open("numbers_binomial.txt", "w+")
i = 0
for i in range(array_size):
line_number = "{} \n".format(str(s[i]))
f.write(line_number)
f.close()
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
        # Adding two binomial distributions with equal p keeps p the same and
        # sums the n values; mean and standard deviation are then recomputed
        # for the combined distribution.
        result = Binomial()
        result.p = self.p
        result.n = self.n + other.n
        result.calculate_mean()
        result.calculate_stdev()
        return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
            string: characteristics of the Binomial
"""
        # Report the distribution in the form:
        # mean 5, standard deviation 4.5, p .8, n 20
message = "mean {}, standard deviation {}, p {}, n {}".format(self.mean, self.stdev, self.p, self.n)
return message | AWS-IFG-distributions | /AWS_IFG_distributions-1.0.tar.gz/AWS_IFG_distributions-1.0/AWS_IFG_distributions/Binomialdistribution.py | Binomialdistribution.py |
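# A minimal usage sketch (assumes this module's top-level imports of math,
# numpy as np and matplotlib.pyplot as plt, which the methods above rely on):
if __name__ == "__main__":
    binomial = Binomial()  # constructor defaults are defined earlier in this module
    binomial.p = 0.4
    binomial.n = 20
    binomial.calculate_mean()
    binomial.calculate_stdev()
    print(binomial.pdf(5))       # probability of exactly 5 successes in 20 trials
    print(binomial + binomial)   # combined distribution: p stays 0.4, n doubles to 40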
import boto3
import json
import inspect
FUNCTION_CONTEXT = None
HANDLER = None
INVOKE_LOCAL = None
class Link_Invocation(object):
    """Decorator that routes a Lambda event directly to ``target`` when the
    event's keys match the target's signature, and otherwise falls back to
    the wrapped handler's own return value."""
    def get_args_from(self, target):
        # getfullargspec replaces getargspec, which was deprecated and
        # removed in Python 3.11
        spec = inspect.getfullargspec(target)
        return spec.args, spec.args[:-len(spec.defaults) if spec.defaults else None]
    def get_optional_args_from(self, target):
        spec = inspect.getfullargspec(target)
        return spec.args[-len(spec.defaults):] if spec.defaults else []
def __init__(self,
target):
self.target = target
self.all_arguments, self.required_arguments = self.get_args_from(target)
def legal_event(self, event):
contains_required_args = set(self.required_arguments) <= set(event)
no_unrecognized_args = set(event) <= set(self.all_arguments)
return contains_required_args and no_unrecognized_args
def handler_decorator(self, handler):
def wrapped_handler(event, context):
global FUNCTION_CONTEXT
FUNCTION_CONTEXT = context
handler_result = handler(event, context)
if self.legal_event(event):
return self.target(**event)
else:
return handler_result
return wrapped_handler
def __call__(self, handler):
return self.handler_decorator(handler)
class DummyContext():
    """Stand-in for the Lambda context object, used for local testing."""
def __init__(self, name="DummyContext", invoke_local=True, **kwargs):
self.function_name = name
self.invoke_local = invoke_local
for kw in kwargs:
setattr(self, kw, kwargs[kw])
def Invoke(event):
    """Re-invoke the current function with ``event``: directly when running
    locally, otherwise asynchronously through the Lambda API."""
global INVOKE_LOCAL
global HANDLER
if INVOKE_LOCAL:
HANDLER(event, FUNCTION_CONTEXT)
else:
resp = boto3.client('lambda').invoke(FunctionName=FUNCTION_CONTEXT.function_name,
InvocationType="Event",
Payload=json.dumps(event).encode('utf-8'))
if not 200 <= resp['StatusCode'] < 300:
raise Exception("Unable to invoke function %s" % FUNCTION_CONTEXT.function_name)
def Initialize_Invoker(context, handler):
    """Record the context and handler so Invoke() can re-enter the function."""
global FUNCTION_CONTEXT
global HANDLER
global INVOKE_LOCAL
FUNCTION_CONTEXT = context
HANDLER = handler
try:
INVOKE_LOCAL = context.invoke_local
except Exception:
pass
if __name__ == "__main__":
def wrapper(test):
print(test)
@Link_Invocation(wrapper)
def test_handler(event, context):
Initialize_Invoker(context, test_handler)
event = {"test": "It's working!"}
context = DummyContext("Invoke")
test_handler(event, context) | AWS-Invoke | /AWS_Invoke-0.0.2-py3-none-any.whl/Invoke/Invoke.py | Invoke.py |
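    # Remote invocation sketch (assumptions: a deployed Lambda function named
    # "Invoke" and AWS credentials available to boto3). With invoke_local set
    # to False, Invoke() would go through the Lambda API instead of re-running
    # the local handler:
    # Initialize_Invoker(DummyContext("Invoke", invoke_local=False), test_handler)
    # Invoke({"test": "It's working remotely!"})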
# AWS MFA Helper
Automate obtaining MFA (STS) credentials
This utility will request STS credentials from AWS, then update ~/.aws/credentials with the new credentials. STS credentials are stored in a profile suffixed with -mfa.
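For example, with a source profile named `my-aws-profile`, the utility creates or updates a profile like the following in ~/.aws/credentials (values shown as placeholders):

```
[my-aws-profile-mfa]
aws_access_key_id = ...
aws_secret_access_key = ...
aws_session_token = ...
aws_security_token = ...
```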
If you have [TOTP Generator](https://github.com/jjfalling/TOTP-Generator) installed this utility will attempt to automate the TOTP code generation.
You will need to update your AWS configuration (~/.aws/config) with the following settings:
```
[profile my-aws-profile]
helper_mfa_serial = (your MFA ARN)
helper_totp_service_name = (Optional: service name in TOTP Generator)
```
If you need to change the STS/MFA credentials suffix in the aws config file, create ~/.aws_mfa_helper.cfg with the following:
```
[DEFAULT]
mfa_creds_suffix = your-new-suffix
```
Run with the --help flag for more information. | AWS-MFA-Helper | /AWS%20MFA%20Helper-1.1.0.tar.gz/AWS MFA Helper-1.1.0/README.md | README.md |
"""Utility to help with generating MFA (STS) tokens and update AWS creds config."""
# ****************************************************************************
# * Keyring TOTP Generator *
# * *
# * Copyright (C) 2018 by Jeremy Falling except where noted. *
# * *
# * This program is free software: you can redistribute it and/or modify *
# * it under the terms of the GNU General Public License as published by *
# * the Free Software Foundation, either version 3 of the License, or *
# * (at your option) any later version. *
# * *
# * This program is distributed in the hope that it will be useful, *
# * but WITHOUT ANY WARRANTY; without even the implied warranty of *
# * MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the *
# * GNU General Public License for more details. *
# * *
# * You should have received a copy of the GNU General Public License *
# * along with this program. If not, see <http://www.gnu.org/licenses/>. *
# ****************************************************************************
import argparse
import configparser
import logging
import signal
import sys
from boto.sts import STSConnection
from os.path import expanduser, join
import aws_mfa_helper
# optional requirement
try:
from totp_generator.core_utils import KeyringTotpGenerator
except ImportError:
pass
# backwards compatibility for py2
try:
input = raw_input
except NameError:
pass
PROGNAME = aws_mfa_helper.__progname__
VERSION = aws_mfa_helper.__version__
MFA_SERIAL_CONFIG_KEY = 'helper_mfa_serial'
TOTP_SERVICE_CONFIG_KEY = 'helper_totp_service_name'
MFA_CREDS_SUFFIX = '-mfa'
logging.basicConfig(level=logging.INFO)
logging.getLogger()
logger = logging.getLogger()
def signal_handler(signal, frame):
"""Catch interrupts and exit without a stack trace."""
print('\nExiting...\n')
sys.exit(0)
def show_version():
"""Show version info and exit."""
print('{name} version {ver}\n'.format(name=PROGNAME, ver=VERSION))
sys.exit(0)
def profile_selection(aws_creds):
"""
User selection of aws profile.
:param aws_creds: user creds dict
:return: name of aws profile
"""
choices = list()
i = 0
for key, key_val in aws_creds.items():
        if not key.endswith(MFA_CREDS_SUFFIX) and key != 'DEFAULT':
i += 1
choices.append(key)
print('{i}: {k}'.format(i=i, k=key))
    while True:
        user_in = input('\nPick AWS profile: ')
        try:
            # valid choices are 1..i, mapping to indices 0..i-1
            sel = int(user_in) - 1
            if 0 <= sel < i:
                break
        except ValueError:
            pass
        print("Your selection is not valid. Try again.")
    return choices[sel]
def duration_selection():
"""
User selection of token duration.
:return: int between 900 and 129600
"""
    while True:
        user_in = input('\nEnter token lifetime in seconds (900-129600) [86400]: ')
        if not user_in:
            return 86400
        try:
            duration = int(user_in)
            # STS session tokens may last between 900 and 129600 seconds
            if 900 <= duration <= 129600:
                return duration
        except ValueError:
            pass
        print("Your selection is not valid. Try again.")
def mfa_entry(aws_conf, profile):
"""
Try to generate TOTP code if totp_generator is installed. Otherwise prompt user for code.
:param aws_conf: AWS config configparser.ConfigParser object
:param profile: Name of AWS config profile
:return: Six digit MFA code as string
"""
if 'totp_generator.core_utils' in sys.modules:
try:
# totp_generator is installed.
full_profile = 'profile {n}'.format(n=profile)
try:
totp_service = aws_conf[full_profile][TOTP_SERVICE_CONFIG_KEY]
except KeyError:
raise KeyError('{k} was not found in AWS config. Cannot auto-generate TOTP code'.format(
k=TOTP_SERVICE_CONFIG_KEY))
code = KeyringTotpGenerator().get_totp_code(totp_service)
logger.debug('Got TOTP code from totp_generator')
return code
except Exception as err:
logger.debug('Failed to get TOTP code from totp_generator: {e}'.format(e=err))
while True:
user_in = input('\nEnter MFA code: ')
try:
int(user_in)
break
except ValueError:
pass
print("Your selection is not valid. Try again.")
return user_in
def get_mfa_device(config, profile):
"""
    Get the MFA device ARN for a profile, erroring out if it is missing.

    :param config: AWS config configparser.ConfigParser object
    :param profile: Name of AWS config profile
    :return: MFA device ARN as a string
    """
    full_profile = 'profile {n}'.format(n=profile)
    try:
        return config[full_profile][MFA_SERIAL_CONFIG_KEY]
    except KeyError:
        print('ERROR: you must add {c} to your AWS conf profile with the ARN of your MFA device! Example: \
\n[profile {p}]\n\
{c} = iam::ACCOUNT-NUMBER-WITHOUT-HYPHENS:mfa/MFA-DEVICE-ID\n'.format(
            c=MFA_SERIAL_CONFIG_KEY, p=profile))
        exit(1)
def get_sts_creds(profile, duration, device_id, mfa_code):
"""
Get STS creds from AWS.
:param profile: AWS creds profile name
:param duration: Token lifetime
:param device_id: MFA device ARN
:param mfa_code: MFA TOTP code
:return:
"""
sts_connection = STSConnection(profile_name=profile)
sts_creds = sts_connection.get_session_token(
duration=duration,
mfa_serial_number="{device_id}".format(device_id=device_id),
mfa_token=mfa_code
)
return sts_creds
def update_aws_creds(aws_creds, profile, sts_creds):
"""
Update STS profile with STS creds.
:param aws_creds: AWS creds dict
:param profile: Name of AWS config profile (without MFA suffix)
:param sts_creds: AWS creds boto.sts.credentials.Credentials object
:return: configparser.ConfigParser object
"""
sts_profile = '{p}{s}'.format(p=profile, s=MFA_CREDS_SUFFIX)
if sts_profile not in aws_creds:
aws_creds[sts_profile] = dict()
aws_creds[sts_profile]['aws_access_key_id'] = sts_creds.access_key
aws_creds[sts_profile]['aws_secret_access_key'] = sts_creds.secret_key
# support both session and security keys as various utilities require one or the other
aws_creds[sts_profile]['aws_session_token'] = sts_creds.session_token
aws_creds[sts_profile]['aws_security_token'] = sts_creds.session_token
return aws_creds
def read_aws_file(filepath):
"""
Read AWS config file.
    :param filepath: Full path to AWS creds file
:return: configparser.ConfigParser
"""
config = configparser.ConfigParser()
config.read(filepath)
return config
def save_aws_creds(aws_creds_file, config):
"""
Write AWS config to file.
:param aws_creds_file: AWS creds file path.
    :param config: configparser.ConfigParser object to write
:return: None
"""
with open(aws_creds_file, 'w') as configfile:
config.write(configfile)
return
def load_helper_config(home):
"""
Load aws helper config.
:param home: path to user home
:return: None
"""
file_path = join(home, '.aws_mfa_helper.cfg')
    try:
        config = read_aws_file(file_path)
        logger.debug('Loaded helper config.')
        # sectionless options land in configparser's DEFAULT section; the
        # original config.get('mfa_creds_suffix') treated the option name as
        # a section, so the override was silently swallowed by the bare except
        suffix = config['DEFAULT'].get('mfa_creds_suffix')
        if suffix:
            logger.debug('Setting MFA_CREDS_SUFFIX to {s}'.format(s=suffix))
            global MFA_CREDS_SUFFIX
            MFA_CREDS_SUFFIX = suffix
    except Exception:
        pass
return
def main():
signal.signal(signal.SIGINT, signal_handler)
parser = argparse.ArgumentParser(description='AWS MFA Helper\n\n' +
'Reads AWS config and automates obtaining and updating AWS creds with' +
' STS tokens',
formatter_class=argparse.RawTextHelpFormatter)
parser.add_argument('-d', '--debug', action='store_true', help='enable debug logging')
parser.add_argument('-v', '--version', action='store_true', help='show version and exit')
args = parser.parse_args()
# handle flags
if args.debug:
logging.getLogger().setLevel(logging.DEBUG)
if args.version:
show_version()
user_home = expanduser("~")
aws_creds_file = join(user_home, '.aws', 'credentials')
aws_config_file = join(user_home, '.aws', 'config')
load_helper_config(user_home)
if 'totp_generator.core_utils' in sys.modules:
logger.debug('totp_generatior is installed. Will attempt to automate TOTP generation.')
print('AWS MFA Helper\n')
aws_creds = read_aws_file(aws_creds_file)
aws_conf = read_aws_file(aws_config_file)
profile = profile_selection(aws_creds)
device_id = get_mfa_device(aws_conf, profile)
duration = duration_selection()
mfa_code = mfa_entry(aws_conf, profile)
sts_creds = get_sts_creds(profile, duration, device_id, mfa_code)
aws_creds = update_aws_creds(aws_creds, profile, sts_creds)
save_aws_creds(aws_creds_file, aws_creds)
print('\nUpdated AWS profile {p} with STS credentials. Credentials expire at {d}'.format(p=profile,
d=sts_creds.expiration))
if __name__ == '__main__':
main() | AWS-MFA-Helper | /AWS%20MFA%20Helper-1.1.0.tar.gz/AWS MFA Helper-1.1.0/aws_mfa_helper/cli.py | cli.py |
import re, copy
from pymysql import connect, cursors
class Adaptor(object):
"""Adaptor contain MySQL cursor object.
"""
def __init__(self, **params):
self._mySQLConn = connect(
host=params.get("host"),
user=params.get("user"),
password=params.get("password"),
db=params.get("db"),
port=params.get("port", 3306),
charset='utf8mb4',
cursorclass=cursors.DictCursor
)
self._mySQLCursor = self._mySQLConn.cursor()
def __del__(self):
if self._mySQLConn.open:
self.disconnect()
@property
def mySQLCursor(self):
"""MySQL Server cursor object.
"""
return self._mySQLCursor
def disconnect(self):
self._mySQLConn.close()
def rollback(self):
self._mySQLConn.rollback()
def commit(self):
self._mySQLConn.commit()
class Mage2Connector(object):
"""Magento 2 connection with functions.
"""
GETPRODUCTIDBYSKUSQL = """SELECT distinct entity_id FROM catalog_product_entity WHERE sku = %s;"""
GETROWIDBYENTITYIDSQL = """SELECT distinct row_id FROM catalog_product_entity WHERE entity_id = %s;"""
ENTITYMETADATASQL = """
SELECT eet.entity_type_id, eas.attribute_set_id
FROM eav_entity_type eet, eav_attribute_set eas
WHERE eet.entity_type_id = eas.entity_type_id
AND eet.entity_type_code = %s
AND eas.attribute_set_name = %s;"""
ATTRIBUTEMETADATASQL = """
SELECT DISTINCT t1.attribute_id, t2.entity_type_id, t1.backend_type, t1.frontend_input
FROM eav_attribute t1, eav_entity_type t2
WHERE t1.entity_type_id = t2.entity_type_id
AND t1.attribute_code = %s
AND t2.entity_type_code = %s;"""
ISENTITYEXITSQL = """SELECT count(*) as count FROM {entityTypeCode}_entity WHERE entity_id = %s;"""
ISATTRIBUTEVALUEEXITSQL = """
SELECT count(*) as count
FROM {entityTypeCode}_entity_{dataType}
WHERE attribute_id = %s
AND store_id = %s
AND {key} = %s;"""
REPLACEATTRIBUTEVALUESQL = """REPLACE INTO {entityTypeCode}_entity_{dataType} ({cols}) values ({vls});"""
UPDATEENTITYUPDATEDATSQL = """UPDATE {entityTypeCode}_entity SET updated_at = UTC_TIMESTAMP() WHERE entity_id = %s;"""
GETOPTIONIDSQL = """
SELECT t2.option_id
FROM eav_attribute_option t1, eav_attribute_option_value t2
WHERE t1.option_id = t2.option_id
AND t1.attribute_id = %s
AND t2.value = %s
AND t2.store_id = %s;"""
INSERTCATALOGPRODUCTENTITYEESQL = """
INSERT INTO catalog_product_entity
(entity_id, created_in, updated_in, attribute_set_id, type_id, sku, has_options, required_options, created_at, updated_at)
VALUES(0, 1, 2147483647, %s, %s, %s, 0, 0, UTC_TIMESTAMP(), UTC_TIMESTAMP());"""
INSERTCATALOGPRODUCTENTITYSQL = """
INSERT INTO catalog_product_entity
(attribute_set_id, type_id, sku, has_options, required_options, created_at, updated_at)
VALUES(%s, %s, %s, 0, 0, UTC_TIMESTAMP(), UTC_TIMESTAMP());"""
UPDATECATALOGPRODUCTSQL = """
UPDATE catalog_product_entity
SET attribute_set_id = %s,
type_id = %s,
updated_at = UTC_TIMESTAMP()
WHERE {key} = %s;"""
INSERTEAVATTRIBUTEOPTIONSQL = """INSERT INTO eav_attribute_option (attribute_id) VALUES (%s);"""
OPTIONVALUEEXISTSQL = """
SELECT COUNT(*) as cnt FROM eav_attribute_option_value
WHERE option_id = %s
AND store_id = %s;"""
INSERTOPTIONVALUESQL = """INSERT INTO eav_attribute_option_value (option_id, store_id, value) VALUES (%s, %s, %s);"""
UPDATEOPTIONVALUESQL = """UPDATE eav_attribute_option_value SET value = %s WHERE option_id = %s AND store_id = %s;"""
GETCUSTOMOPTIONIDBYTITLESQL = """
SELECT a.option_id
FROM catalog_product_option a
INNER JOIN catalog_product_option_title b ON a.option_id = b.option_id
WHERE a.product_id = %s AND b.title = %s;"""
REPLACECUSTOMOPTIONSQL = """REPLACE INTO catalog_product_option ({optCols}) VALUES ({optVals});"""
INSERTCUSTOMOPTIONTITLESQL = """
INSERT INTO catalog_product_option_title
(option_id, store_id, title)
VALUES (%s,%s,%s) on DUPLICATE KEY UPDATE title = %s;"""
INSERTCUSTOMOPTIONPRICESQL = """
INSERT INTO catalog_product_option_price
(option_id, store_id, price, price_type)
VALUES (%s,%s,%s,%s) on DUPLICATE KEY UPDATE price = %s, price_type = %s;"""
GETCUSTOMOPTIONTYPEIDBYTITLESQL = """
SELECT a.option_type_id
FROM catalog_product_option_type_value a
INNER JOIN catalog_product_option_type_title b on a.option_type_id = b.option_type_id
WHERE a.option_id = %s
AND b.title = %s;"""
REPLACECUSTOMOPTIONTYPEVALUESQL = """REPLACE INTO catalog_product_option_type_value ({optValCols}) VALUES ({optValVals});"""
INSERTCUSTOMOPTIONTYPETITLESQL = """
INSERT INTO catalog_product_option_type_title
(option_type_id, store_id, title)
VALUES (%s,%s,%s) on DUPLICATE KEY UPDATE title = %s;"""
INSERTCUSTOMOPTIONTYPEPRICESQL = """
INSERT INTO catalog_product_option_type_price
(option_type_id, store_id, price, price_type)
VALUES (%s,%s,%s,%s) on DUPLICATE KEY UPDATE price = %s, price_type = %s;"""
DELETECUSTOMOPTIONSQL = """
DELETE FROM catalog_product_option WHERE product_id = %s"""
UPDATEPRODUCTHASOPTIONSSQL = """
UPDATE catalog_product_entity SET has_options = 1 WHERE entity_id = %s;"""
INSERTPRODUCTIMAGEGALLERYSQL = """
INSERT INTO catalog_product_entity_media_gallery
(attribute_id,value,media_type) VALUES (%s,%s,%s);"""
INSERTPRODUCTIMAGEGALLERYEXTSQL = """
INSERT INTO catalog_product_entity_media_gallery_ext
(value_id,media_source,file) VALUES (%s,%s,%s);"""
INSERTPRODUCTIMAGEGALLERYVALUESQL = """
INSERT INTO catalog_product_entity_media_gallery_value ({cols}) VALUES ({vals});"""
INSERTMEDIAVALUETOENTITYSQL = """
INSERT IGNORE INTO catalog_product_entity_media_gallery_value_to_entity ({cols}) VALUES ({vals});"""
INSERTPRODUCTIMAGESQL = """
INSERT INTO catalog_product_entity_varchar ({cols}) VALUES ({vals}) ON DUPLICATE KEY UPDATE value = %s;"""
SELECTLINKTYPEIDSQL = """
SELECT link_type_id FROM catalog_product_link_type WHERE code = %s;"""
SELECTLINKATTSQL = """
SELECT
t0.link_type_id,
t1.product_link_attribute_id
FROM
catalog_product_link_type t0,
catalog_product_link_attribute t1
WHERE t0.link_type_id = t1.link_type_id
AND t1.product_link_attribute_code = "position"
AND t0.code = %s;"""
INSERTCATALOGPRODUCTLINKSQL = """
INSERT IGNORE INTO catalog_product_link (product_id,linked_product_id,link_type_id) VALUES (%s,%s,%s);"""
INSERTCATALOGPRODUCTLINKATTRIBUTEINT = """
INSERT IGNORE INTO catalog_product_link_attribute_int (product_link_attribute_id,link_id,value) VALUES (%s,%s,%s);"""
DELETEPRODUCTLINKSQL = """
DELETE FROM catalog_product_link WHERE product_id = %s and link_type_id = %s"""
DELETEPRODUCTIMAGEGALLERYSQL = """
DELETE a
FROM catalog_product_entity_media_gallery a
INNER JOIN catalog_product_entity_media_gallery_value b ON a.value_id = b.value_id
WHERE b.entity_id = %s"""
DELTEEPRODUCTIMAGEEXTSQL = """
DELETE FROM catalog_product_entity_media_gallery_ext WHERE file IN (%s)"""
DELETEPRODUCTIMAGEGALLERYEXTSQL = """
DELETE a
FROM catalog_product_entity_media_gallery_ext a
INNER JOIN catalog_product_entity_media_gallery_value b ON a.value_id = b.value_id
WHERE b.entity_id = %s"""
GETPRODUCTOUTOFSTOCKQTYSQL = """
SELECT min_qty
FROM cataloginventory_stock_item
WHERE product_id = %s AND stock_id = %s AND use_config_min_qty = 0;"""
SETSTOCKSTATUSQL = """
INSERT INTO cataloginventory_stock_status
(product_id,website_id,stock_id,qty,stock_status)
VALUES (%s,%s,%s,%s,%s)
ON DUPLICATE KEY UPDATE
qty = %s,
stock_status = %s;"""
SETSTOCKITEMSQL = """
INSERT INTO cataloginventory_stock_item
(product_id,stock_id,qty,is_in_stock,website_id)
VALUES (%s,%s,%s,%s,%s)
ON DUPLICATE KEY UPDATE
qty = %s,
is_in_stock = %s;"""
SETPRODUCTCATEGORYSQL = """
INSERT INTO catalog_category_product
(category_id, product_id, position)
VALUES (%s,%s,%s)
ON DUPLICATE KEY UPDATE
category_id = %s,
product_id = %s,
position = %s;"""
UPDATECATEGORYCHILDRENCOUNTSQL = """
UPDATE catalog_category_entity
SET children_count = children_count + 1
where entity_id = %s;"""
UPDATECATEGORYCHILDRENCOUNTEESQL = """
UPDATE catalog_category_entity
SET children_count = children_count + 1
where row_id = %s;"""
GETMAXCATEGORYIDSQL = """
SELECT max(entity_id) as max_category_id FROM catalog_category_entity;"""
# for Magento 2 CE
GETCATEGORYIDBYATTRIBUTEVALUEANDPATHSQL = """
SELECT a.entity_id
FROM catalog_category_entity a
INNER JOIN catalog_category_entity_varchar b ON a.entity_id = b.entity_id
INNER JOIN eav_attribute c ON b.attribute_id = c.attribute_id AND c.attribute_code = 'name' and c.entity_type_id = 3
WHERE a.level = %s and a.parent_id = %s and b.value = %s;"""
# for Magento 2 EE
GETCATEGORYIDBYATTRIBUTEVALUEANDPATHEESQL = """
SELECT a.row_id
FROM catalog_category_entity a
INNER JOIN catalog_category_entity_varchar b ON a.row_id = b.row_id
INNER JOIN eav_attribute c ON b.attribute_id = c.attribute_id AND c.attribute_code = 'name' and c.entity_type_id = 3
WHERE a.level = %s and a.parent_id = %s and b.value = %s;"""
# for Magento 2 CE
INSERTCATALOGCATEGORYENTITYSQL = """
INSERT INTO catalog_category_entity
(attribute_set_id, parent_id, created_at, updated_at, path, level, children_count, position)
VALUES (%s, %s, now(), now(), %s, %s, %s, %s);"""
# for Magento 2 EE
INSERTCATALOGCATEGORYENTITYEESQL = """
INSERT INTO catalog_category_entity
(entity_id, created_in, updated_in, attribute_set_id, parent_id, created_at, updated_at, path, level, children_count,position)
VALUES (%s, 1, 2147483647, %s, %s, now(), now(), %s, %s, %s,%s);"""
EXPORTPRODUCTSCOUNTSQL = """
SELECT count(*) AS total
FROM catalog_product_entity a
INNER JOIN eav_attribute_set b ON a.attribute_set_id = b.attribute_set_id
WHERE updated_at >= '{updated_at}' AND b.attribute_set_name LIKE '{attribute_set_name}'
"""
# for Magento 2 CE
EXPORTMEDIAIMAGESSQL = """
SELECT
t0.sku,
CONCAT('{base_url}', t1.value) as 'value',
'{image_type}' as 'type'
FROM
catalog_product_entity t0,
catalog_product_entity_varchar t1,
eav_attribute t2
WHERE t0.entity_id = t1.entity_id
AND t1.attribute_id = t2.attribute_id
AND t2.attribute_code = '{attribute_code}'
AND t0.updated_at >= '{updated_at}'
"""
# for Magento 2 EE
EXPORTMEDIAIMAGESEESQL = """
SELECT
t0.sku,
CONCAT('{base_url}', t1.value) as 'value',
'{image_type}' as 'type'
FROM
catalog_product_entity t0,
catalog_product_entity_varchar t1,
eav_attribute t2
WHERE t0.row_id = t1.row_id
AND t1.attribute_id = t2.attribute_id
AND t2.attribute_code = '{attribute_code}'
AND t0.updated_at >= '{updated_at}'
"""
# for Magento 2 CE
EXPORTMEDIAGALLERYSQL = """
SELECT
t0.sku,
CONCAT('{base_url}', t1.value) as 'value',
t2.store_id,
t2.position,
t2.label,
'mage2' as 'media_source',
'media_gallery' as 'type'
FROM
catalog_product_entity t0,
catalog_product_entity_media_gallery t1,
catalog_product_entity_media_gallery_value t2,
catalog_product_entity_media_gallery_value_to_entity t3
WHERE t0.entity_id = t3.entity_id
AND t1.value_id = t2.value_id
AND t1.value_id = t3.value_id
AND t0.updated_at >= '{updated_at}'
"""
# for Magento 2 EE
EXPORTMEDIAGALLERYEESQL = """
SELECT
t0.sku,
CONCAT('{base_url}', t1.value) as 'value',
t2.store_id,
t2.position,
t2.label,
'mage2' as 'media_source',
'media_gallery' as 'type'
FROM
catalog_product_entity t0,
catalog_product_entity_media_gallery t1,
catalog_product_entity_media_gallery_value t2,
catalog_product_entity_media_gallery_value_to_entity t3
WHERE t0.row_id = t3.row_id
AND t1.value_id = t2.value_id
AND t1.value_id = t3.value_id
AND t0.updated_at >= '{updated_at}'
"""
# for Magento 2 CE/EE
GETCONFIGSIMPLEPRODUCTSSQL = """
SELECT
t4.parent_id,
t5.attribute_id,
t4.product_id as child_id,
t3.value,
t0.sku
FROM
catalog_product_entity t0,
catalog_product_entity_int t1,
eav_attribute_option t2,
eav_attribute_option_value t3,
catalog_product_super_link t4,
catalog_product_super_attribute t5
WHERE t0.{id} = t1.{id}
AND t1.attribute_id = t2.attribute_id AND t1.value = t2.option_id
AND t2.option_id = t3.option_id AND t3.store_id = {store_id}
AND t4.parent_id = t5.product_id
AND t0.entity_id = t4.product_id
AND t5.attribute_id = t1.attribute_id AND t1.store_id = {store_id}
AND t5.product_id = {product_id} AND t5.attribute_id IN ({attribute_ids})
"""
REPLACECATALOGPRODUCTRELATIONSQL = """
REPLACE INTO catalog_product_relation
(parent_id, child_id)
VALUES ({parent_id}, {child_id})
"""
REPLACECATALOGPRODUCTSUPERLINKSQL = """
REPLACE INTO catalog_product_super_link
(product_id, parent_id)
VALUES ({product_id}, {parent_id})
"""
REPLACECATALOGPRODUCTSUPERATTRIBUTESQL = """
INSERT IGNORE INTO catalog_product_super_attribute
(product_id, attribute_id)
VALUES ({product_id}, {attribute_id})
"""
UPDATEPRODUCTVISIBILITYSQL = """
UPDATE catalog_product_entity_int SET value = {value}
WHERE {id} = {product_id} AND
attribute_id in (SELECT attribute_id FROM eav_attribute WHERE entity_type_id = 4 AND attribute_code = 'visibility')
"""
GETORDERSSQL = """
SELECT
sales_order.entity_id AS id,
sales_order.increment_id AS m_order_inc_id,
sales_order.created_at AS m_order_date,
sales_order.updated_at AS m_order_update_date,
sales_order.status AS m_order_status,
customer_group.customer_group_code AS m_customer_group,
sales_order.store_id AS m_store_id,
sales_order.customer_id AS m_customer_id,
'' AS shipment_carrier,
IFNULL(sales_order.shipping_method,"") AS shipment_method,
IFNULL(bill_to.firstname,'') AS billto_firstname,
IFNULL(bill_to.lastname,'') AS billto_lastname,
IFNULL(bill_to.email,'') AS billto_email,
IFNULL(bill_to.company,'') AS billto_companyname,
IFNULL(bill_to.street,'') AS billto_address,
IFNULL(bill_to.city,'') AS billto_city,
IFNULL(bill_to_region.code,'') AS billto_region,
IFNULL(bill_to.country_id,'') AS billto_country,
IFNULL(bill_to.postcode,'') AS billto_postcode,
IFNULL(bill_to.telephone,'') AS billto_telephone,
IFNULL(ship_to.firstname,'') AS shipto_firstname,
IFNULL(ship_to.lastname,'') AS shipto_lastname,
IFNULL(ship_to.company,'') AS shipto_companyname,
IFNULL(ship_to.street,'') AS shipto_address,
IFNULL(ship_to.city,'') AS shipto_city,
IFNULL(ship_to_region.code,'') AS shipto_region,
IFNULL(ship_to.country_id,'') AS shipto_country,
IFNULL(ship_to.postcode,'') AS shipto_postcode,
IFNULL(ship_to.telephone,'') AS shipto_telephone,
IFNULL(sales_order.total_qty_ordered,0) AS total_qty,
IFNULL(sales_order.subtotal,0) AS sub_total,
IFNULL(sales_order.discount_amount,0) AS discount_amt,
IFNULL(sales_order.shipping_amount,0) AS shipping_amt,
IFNULL(sales_order.tax_amount,0) AS tax_amt,
'0' AS giftcard_amt,
'0' AS storecredit_amt,
sales_order.grand_total AS grand_total,
sales_order.coupon_code AS coupon_code,
IFNULL(sales_order.shipping_tax_amount,0) AS shipping_tax_amt,
'checkmo' AS payment_method
FROM
sales_order
LEFT JOIN sales_order_address bill_to on (sales_order.entity_id = bill_to.parent_id and bill_to.address_type = 'billing')
LEFT JOIN sales_order_address ship_to on (sales_order.entity_id = ship_to.parent_id and ship_to.address_type = 'shipping')
LEFT JOIN directory_country_region bill_to_region on (bill_to.region_id = bill_to_region.region_id and bill_to.country_id = bill_to_region.country_id)
LEFT JOIN directory_country_region ship_to_region on (ship_to.region_id = ship_to_region.region_id and ship_to.country_id = ship_to_region.country_id)
LEFT JOIN customer_entity customer on sales_order.customer_id = customer.entity_id
LEFT JOIN customer_group customer_group on customer.group_id = customer_group.customer_group_id
WHERE sales_order.updated_at > '{updated_at}'
ORDER BY sales_order.entity_id
"""
GETORDERITEMSSQL = """
SELECT
sales_order_item.item_id AS id,
sales_order_item.order_id AS m_order_id,
sales_order_item.sku AS sku,
sales_order_item.name AS name,
'' AS uom,
sales_order_item.original_price AS original_price,
sales_order_item.price AS price,
sales_order_item.discount_amount AS discount_amt,
sales_order_item.tax_amount AS tax_amt,
sales_order_item.qty_ordered AS qty,
sales_order_item.row_total AS sub_total
FROM
sales_order_item
WHERE parent_item_id is null
AND order_id = '{order_id}'
"""
def __init__(self, setting=None, logger=None):
self.setting = setting
self.logger = logger
self._adaptor = None
def __del__(self):
if self._adaptor is not None:
self.adaptor.__del__()
self.logger.info("Close Mage2 DB connection")
def connect(self):
"""Initiate the connect with MySQL server.
"""
params = {
"host": self.setting["MAGE2DBSERVER"],
"user": self.setting["MAGE2DBUSERNAME"],
"password": self.setting["MAGE2DBPASSWORD"],
"db": self.setting["MAGE2DB"],
"port": self.setting["MAGE2DBPORT"]
}
self.logger.info("Open Mage2 DB connection")
return Adaptor(**params)
@property
def adaptor(self):
self._adaptor = self.connect() if self._adaptor is None else self._adaptor
return self._adaptor
def getEntityMetaData(self, entityTypeCode='catalog_product', attributeSet='Default'):
self.adaptor.mySQLCursor.execute(self.ENTITYMETADATASQL, [entityTypeCode, attributeSet])
entityMetadata = self.adaptor.mySQLCursor.fetchone()
if entityMetadata is not None:
return entityMetadata
else:
log = "attribute_set/entity_type_code: {0}/{1} not existed".format(attributeSet, entityTypeCode)
raise Exception(log)
def getAttributeMetadata(self, attributeCode, entityTypeCode):
self.adaptor.mySQLCursor.execute(self.ATTRIBUTEMETADATASQL, [attributeCode, entityTypeCode])
attributeMetadata = self.adaptor.mySQLCursor.fetchone()
if attributeMetadata is None:
log = "Entity Type/Attribute Code: {0}/{1} does not exist".format(entityTypeCode, attributeCode)
raise Exception(log)
dataType = attributeMetadata['backend_type']
return (dataType, attributeMetadata)
def isEntityExit(self, entityTypeCode, entityId):
sql = self.ISENTITYEXITSQL.format(entityTypeCode=entityTypeCode)
self.adaptor.mySQLCursor.execute(sql, [entityId])
exist = self.adaptor.mySQLCursor.fetchone()
return exist['count']
def isAttributeValueExit(self, entityTypeCode, dataType, attributeId, storeId, entityId):
key = 'row_id' if self.setting['VERSION'] == "EE" else 'entity_id'
sql = self.ISATTRIBUTEVALUEEXITSQL.format(entityTypeCode=entityTypeCode, dataType=dataType, key=key)
self.adaptor.mySQLCursor.execute(sql, [attributeId, storeId, entityId])
exist = self.adaptor.mySQLCursor.fetchone()
return exist['count']
def replaceAttributeValue(self, entityTypeCode, dataType, entityId, attributeId, value, storeId=0):
if entityTypeCode == 'catalog_product' or entityTypeCode == 'catalog_category':
cols = "entity_id, attribute_id, store_id, value"
if self.setting['VERSION'] == "EE":
cols = "row_id, attribute_id, store_id, value"
vls = "%s, %s, {0}, %s".format(storeId)
param = [entityId, attributeId, value]
else:
cols = "entity_id, attribute_id, value"
vls = "%s, %s, %s"
param = [entityId, attributeId, value]
sql = self.REPLACEATTRIBUTEVALUESQL.format(entityTypeCode=entityTypeCode, dataType=dataType, cols=cols, vls=vls)
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 0")
self.adaptor.mySQLCursor.execute(sql, param)
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 1")
def updateEntityUpdatedAt(self, entityTypeCode, entityId):
sql = self.UPDATEENTITYUPDATEDATSQL.format(entityTypeCode=entityTypeCode)
self.adaptor.mySQLCursor.execute(sql, [entityId])
def setAttributeOptionValues(self, attributeId, options, entityTypeCode="catalog_product", adminStoreId=0, updateExistingOption=False):
optionId = self.getOptionId(attributeId, options[adminStoreId], adminStoreId)
if optionId is None:
self.adaptor.mySQLCursor.execute(self.INSERTEAVATTRIBUTEOPTIONSQL, [attributeId])
optionId = self.adaptor.mySQLCursor.lastrowid
for (storeId, optionValue) in options.items():
self.adaptor.mySQLCursor.execute(self.OPTIONVALUEEXISTSQL, [optionId, storeId])
exist = self.adaptor.mySQLCursor.fetchone()
            if not exist or exist['cnt'] == 0:
                self.adaptor.mySQLCursor.execute(self.INSERTOPTIONVALUESQL, [optionId, storeId, optionValue])
            elif exist['cnt'] > 0 and updateExistingOption:
                self.adaptor.mySQLCursor.execute(self.UPDATEOPTIONVALUESQL, [optionValue, optionId, storeId])
return optionId
def setMultiSelectOptionIds(self, attributeId, values, entityTypeCode="catalog_product", adminStoreId=0, delimiter="|"):
values = values.strip('"').strip("'").strip("\n").strip()
listValues = [v.strip() for v in values.split(delimiter)]
listOptionIds = []
for value in listValues:
options = {0: value}
optionId = self.setAttributeOptionValues(attributeId, options, entityTypeCode=entityTypeCode, adminStoreId=adminStoreId)
listOptionIds.append(str(optionId))
optionIds = ",".join(listOptionIds) if len(listOptionIds) > 0 else None
return optionIds
def getOptionId(self, attributeId, value, adminStoreId=0):
self.adaptor.mySQLCursor.execute(self.GETOPTIONIDSQL, [attributeId, value, adminStoreId])
res = self.adaptor.mySQLCursor.fetchone()
optionId = None
if res is not None:
optionId = res['option_id']
return optionId
def getMultiSelectOptionIds(self, attributeId, values, adminStoreId=0, delimiter="|"):
        if values is None:
            return None
values = values.strip('"').strip("'").strip("\n").strip()
listValues = [v.strip() for v in values.split(delimiter)]
listOptionIds = []
for value in listValues:
optionId = self.getOptionId(attributeId, value, adminStoreId=adminStoreId)
listOptionIds.append(str(optionId))
optionIds = ",".join(listOptionIds) if len(listOptionIds) > 0 else None
return optionIds
def getProductIdBySku(self, sku):
self.adaptor.mySQLCursor.execute(self.GETPRODUCTIDBYSKUSQL, [sku])
entity = self.adaptor.mySQLCursor.fetchone()
if entity is not None:
entityId = int(entity["entity_id"])
else:
entityId = 0
return entityId
def getRowIdByEntityId(self, entityId):
self.adaptor.mySQLCursor.execute(self.GETROWIDBYENTITYIDSQL, [entityId])
entity = self.adaptor.mySQLCursor.fetchone()
if entity is not None:
rowId = int(entity["row_id"])
else:
rowId = 0
return rowId
def insertCatalogProductEntity(self, sku, attributeSet='Default', typeId='simple'):
entityMetadata = self.getEntityMetaData('catalog_product', attributeSet)
if entityMetadata == None:
return 0
if self.setting['VERSION'] == 'EE':
self.adaptor.mySQLCursor.execute("""SET FOREIGN_KEY_CHECKS = 0;""")
self.adaptor.mySQLCursor.execute(self.INSERTCATALOGPRODUCTENTITYEESQL, (entityMetadata['attribute_set_id'], typeId, sku))
productId = self.adaptor.mySQLCursor.lastrowid
self.adaptor.mySQLCursor.execute("""UPDATE catalog_product_entity SET entity_id = row_id WHERE row_id = %s;""", (productId,))
self.adaptor.mySQLCursor.execute("""INSERT INTO sequence_product (sequence_value) VALUES (%s);""", (productId,))
self.adaptor.mySQLCursor.execute("""SET FOREIGN_KEY_CHECKS = 1;""")
else:
self.adaptor.mySQLCursor.execute(self.INSERTCATALOGPRODUCTENTITYSQL, (entityMetadata['attribute_set_id'], typeId, sku))
productId = self.adaptor.mySQLCursor.lastrowid
return productId
def updateCatalogProductEntity(self, productId, attributeSet='Default', typeId='simple'):
entityMetadata = self.getEntityMetaData('catalog_product', attributeSet)
if entityMetadata == None:
return 0
key = 'row_id' if self.setting['VERSION'] == "EE" else 'entity_id'
sql = self.UPDATECATALOGPRODUCTSQL.format(key=key)
self.adaptor.mySQLCursor.execute(sql, [entityMetadata['attribute_set_id'], typeId, productId])
def syncEntityData(self, entityId, data, entityTypeCode='catalog_product', storeId=0, adminStoreId=0):
doNotUpdateOptionAttributes = ['status','visibility','tax_class_id']
for attributeCode, value in data.items():
(dataType, attributeMetadata) = self.getAttributeMetadata(attributeCode, entityTypeCode)
if attributeMetadata['frontend_input'] == 'select' and attributeCode not in doNotUpdateOptionAttributes:
optionId = self.getOptionId(attributeMetadata['attribute_id'], value, adminStoreId=adminStoreId)
if optionId is None:
options = {0: value}
optionId = self.setAttributeOptionValues(attributeMetadata['attribute_id'], options, adminStoreId=adminStoreId)
value = optionId
elif attributeMetadata['frontend_input'] == 'multiselect':
optionIds = self.getMultiSelectOptionIds(attributeMetadata['attribute_id'], value, adminStoreId=adminStoreId)
if optionIds is None:
optionIds = self.setMultiSelectOptionIds(attributeMetadata['attribute_id'], value, adminStoreId=adminStoreId)
value = optionIds
# ignore the static datatype.
if dataType != "static":
exist = self.isAttributeValueExit(entityTypeCode, dataType, attributeMetadata['attribute_id'], adminStoreId, entityId)
storeId = adminStoreId if exist == 0 else storeId
self.replaceAttributeValue(entityTypeCode, dataType, entityId, attributeMetadata['attribute_id'], value, storeId=storeId)
def _getWebsiteIdByStoreId(self, storeId):
self.adaptor.mySQLCursor.execute("SELECT website_id FROM store WHERE store_id = %s",[storeId])
res = self.adaptor.mySQLCursor.fetchone()
websiteId = 0
if res is not None:
websiteId = res['website_id']
return websiteId
    def assignWebsite(self, productId, storeId):
websiteId = self._getWebsiteIdByStoreId(storeId)
if websiteId == 0:
websiteId = 1
self.adaptor.mySQLCursor.execute("INSERT IGNORE INTO catalog_product_website (product_id, website_id) VALUES (%s, %s)",[productId,websiteId])
def syncProduct(self, sku, attributeSet, data, typeId, storeId):
try:
productId = self.getProductIdBySku(sku)
if productId == 0:
productId = self.insertCatalogProductEntity(sku, attributeSet, typeId)
else:
self.updateCatalogProductEntity(productId, attributeSet, typeId)
self.syncEntityData(productId, data, entityTypeCode='catalog_product', storeId=storeId)
            self.assignWebsite(productId, storeId)
self.adaptor.commit()
return productId
except Exception:
self.adaptor.rollback()
raise
def getCustomOption(self, productId, title):
self.adaptor.mySQLCursor.execute(self.GETCUSTOMOPTIONIDBYTITLESQL, [productId, title])
entity = self.adaptor.mySQLCursor.fetchone()
if entity is not None:
optionId = int(entity["option_id"])
else:
optionId = 0
return optionId
def setCustomOption(self, productId, option):
title = option.pop("title")
storeId = option.pop("store_id", 0)
optionId = self.getCustomOption(productId, title)
optCols = ["option_id", "product_id", "type", "is_require", "sku", "max_characters", "file_extension", "image_size_x", "image_size_y", "sort_order"]
optVals = ["%s","%s","%s","%s","%s","%s","%s","%s","%s","%s"]
values = [
optionId,
productId,
option.pop("type"),
option.pop("is_require"),
option.pop("option_sku", None),
option.pop("max_characters", 0),
option.pop("file_extension", None),
option.pop("image_size_x", 0),
option.pop("image_size_y", 0),
option.pop("sort_order", 1)
]
if optionId == 0:
if storeId == 0:
                # drop option_id from the insert so MySQL auto-increments a new one
                optCols.pop(0)
                optVals.pop(0)
                values.pop(0)
# Replace custom option.
self.adaptor.mySQLCursor.execute(
self.REPLACECUSTOMOPTIONSQL.format(optCols=",".join(optCols), optVals=",".join(optVals)),
values
)
optionId = self.adaptor.mySQLCursor.lastrowid
else:
# There is no option for store# 0.
raise Exception("You have to input the option title({}) with store({}) first.".format(title, storeId))
else:
# Replace custom option.
self.adaptor.mySQLCursor.execute(
self.REPLACECUSTOMOPTIONSQL.format(optCols=",".join(optCols), optVals=",".join(optVals)),
values
)
# Insert custom option title.
if storeId == 0:
optionTitle = title
else:
optionTitle = option.pop("title_alt")
self.adaptor.mySQLCursor.execute(
self.INSERTCUSTOMOPTIONTITLESQL,
[optionId, storeId, optionTitle, optionTitle]
)
optionTitleId = self.adaptor.mySQLCursor.lastrowid
# insert custom option price.
if "option_price" in option.keys():
            optionPrice = option.pop("option_price")
optionPriceType = option.pop("option_price_type")
self.adaptor.mySQLCursor.execute(
self.INSERTCUSTOMOPTIONPRICESQL,
[
optionId,
storeId,
optionPrice,
optionPriceType,
optionPrice,
optionPriceType
]
)
optionPriceId = self.adaptor.mySQLCursor.lastrowid
return optionId
def getCustomOptionValue(self, optionId, valueTitle):
self.adaptor.mySQLCursor.execute(self.GETCUSTOMOPTIONTYPEIDBYTITLESQL, [optionId, valueTitle])
entity = self.adaptor.mySQLCursor.fetchone()
if entity is not None:
optionTypeId = int(entity["option_type_id"])
else:
optionTypeId = 0
return optionTypeId
def setCustomOptionValue(self, optionId, optionValue):
valueTitle = optionValue.pop("option_value_title")
storeId = optionValue.pop("store_id", 0)
optionTypeId = self.getCustomOptionValue(optionId, valueTitle)
optValCols = ["option_type_id", "option_id", "sku", "sort_order",]
optValVals = ["%s","%s","%s","%s"]
values = [
optionTypeId,
optionId,
optionValue.pop("option_value_sku", None),
optionValue.pop("option_value_sort_order", 1)
]
if optionTypeId == 0:
if storeId == 0:
                # drop option_type_id from the insert so MySQL auto-increments a new one
                optValCols.pop(0)
                optValVals.pop(0)
                values.pop(0)
# Replace custom option.
self.adaptor.mySQLCursor.execute(
self.REPLACECUSTOMOPTIONTYPEVALUESQL.format(optValCols=",".join(optValCols), optValVals=",".join(optValVals)),
values
)
optionTypeId = self.adaptor.mySQLCursor.lastrowid
else:
# There is no option value for store# 0.
raise Exception("You have to input the option type title({}) with store({}) first.".format(valueTitle, storeId))
else:
# Replace custom option.
self.adaptor.mySQLCursor.execute(
self.REPLACECUSTOMOPTIONTYPEVALUESQL.format(optValCols=",".join(optValCols), optValVals=",".join(optValVals)),
values
)
# Insert custom option typle title.
if storeId == 0:
optionTypeTitle = valueTitle
else:
optionTypeTitle = optionValue.pop("option_value_title_alt")
self.adaptor.mySQLCursor.execute(
self.INSERTCUSTOMOPTIONTYPETITLESQL,
[optionTypeId, storeId, optionTypeTitle, optionTypeTitle]
)
optionTypeTitleId = self.adaptor.mySQLCursor.lastrowid
# insert custom option type price.
if "option_value_price" in optionValue.keys():
            optionTypePrice = optionValue.pop("option_value_price")
optionTypePriceType = optionValue.pop("option_value_price_type")
self.adaptor.mySQLCursor.execute(
self.INSERTCUSTOMOPTIONTYPEPRICESQL,
[
optionTypeId,
storeId,
optionTypePrice,
optionTypePriceType,
optionTypePrice,
optionTypePriceType
]
)
optionTypePriceId = self.adaptor.mySQLCursor.lastrowid
return optionTypeId
def setCustomOptions(self, sku, data):
productId = self.getProductIdBySku(sku)
if self.setting["VERSION"] == "EE":
productId = self.getRowIdByEntityId(productId)
if data:
self.adaptor.mySQLCursor.execute(self.DELETECUSTOMOPTIONSQL,[productId])
for option in data:
optionId = self.setCustomOption(productId, option)
if len(option["option_values"]) > 0:
for optionValue in option["option_values"]:
optionTypeId = self.setCustomOptionValue(optionId, optionValue)
self.adaptor.mySQLCursor.execute(self.UPDATEPRODUCTHASOPTIONSSQL, [productId])
return productId
def setImageGallery(self, sku, data):
productId = self.getProductIdBySku(sku)
if self.setting["VERSION"] == "EE":
productId = self.getRowIdByEntityId(productId)
attributes = dict((k, v) for (k, v) in data.items() if type(v) is not list)
galleryAttributeCode = list(filter(lambda k: (type(data[k]) is list), data.keys()))[0]
(dataType, galleryAttributeMetadata) = self.getAttributeMetadata(galleryAttributeCode, 'catalog_product')
imageValues = [d['value'] for d in data['media_gallery'] if 'value' in d]
self.adaptor.mySQLCursor.execute(self.DELETEPRODUCTIMAGEGALLERYEXTSQL,[productId])
self.adaptor.mySQLCursor.execute(self.DELETEPRODUCTIMAGEGALLERYSQL,[productId])
for imageValue in imageValues:
self.adaptor.mySQLCursor.execute(self.DELTEEPRODUCTIMAGEEXTSQL,[imageValue])
for image in data[galleryAttributeCode]:
value = image.pop("value")
storeId = image.pop("store_id", 0)
label = image.pop("label", "")
position = image.pop("position", 1)
mediaType = image.pop("media_type",'image')
mediaSource = image.pop("media_source",'S3')
self.adaptor.mySQLCursor.execute(
self.INSERTPRODUCTIMAGEGALLERYSQL,
[galleryAttributeMetadata["attribute_id"], value, mediaType]
)
valueId = self.adaptor.mySQLCursor.lastrowid
self.adaptor.mySQLCursor.execute(
self.INSERTPRODUCTIMAGEGALLERYEXTSQL,
[valueId, mediaSource, value]
)
cols = ["value_id", "store_id", "row_id", "label", "position"] if self.setting["VERSION"] == "EE" \
else ["value_id", "store_id", "entity_id", "label", "position"]
vals = ["%s", "%s", "%s", "%s", "%s"]
params = [valueId, storeId, productId, label, position]
sql = self.INSERTPRODUCTIMAGEGALLERYVALUESQL.format(cols=",".join(cols), vals=",".join(vals))
self.adaptor.mySQLCursor.execute(sql, params)
cols = ["value_id", "row_id"] if self.setting["VERSION"] == "EE" else ["value_id", "entity_id"]
vals = ["%s", "%s"]
params = [valueId, productId]
sql = self.INSERTMEDIAVALUETOENTITYSQL.format(cols=",".join(cols), vals=",".join(vals))
self.adaptor.mySQLCursor.execute(sql, params)
attCodes = list(filter(lambda k: (attributes[k] == value), attributes.keys()))
if len(attCodes) > 0:
for attCode in attCodes:
# assign the attribute.
(dataType, attributeMetadata) = self.getAttributeMetadata(attCode, 'catalog_product')
cols = ["attribute_id", "store_id", "row_id", "value"] if self.setting["VERSION"] == "EE" \
else ["attribute_id", "store_id", "entity_id", "value"]
vals = ["%s", "%s", "%s", "%s"]
params = [attributeMetadata["attribute_id"], storeId, productId, value, value]
sql = self.INSERTPRODUCTIMAGESQL.format(cols=",".join(cols), vals=",".join(vals))
self.adaptor.mySQLCursor.execute(sql, params)
return productId
def getLinkAttributes(self, code):
self.adaptor.mySQLCursor.execute(self.SELECTLINKATTSQL, [code])
row = self.adaptor.mySQLCursor.fetchone()
if row is not None:
linkTypeId = int(row["link_type_id"])
productLinkAttributeId = int(row["product_link_attribute_id"])
else:
raise Exception("cannot find link_type_id for {0}.".format(code))
return (linkTypeId, productLinkAttributeId)
def setLinks(self, sku, data):
productId = self.getProductIdBySku(sku)
if self.setting["VERSION"] == "EE":
productId = self.getRowIdByEntityId(productId)
if productId == 0:
return productId
for code, links in data.items():
(linkTypeId, productLinkAttributeId) = self.getLinkAttributes(code)
if links:
self.adaptor.mySQLCursor.execute(self.DELETEPRODUCTLINKSQL,[productId,linkTypeId])
for link in links:
linkedProductId = self.getProductIdBySku(link["linked_sku"])
if self.setting["VERSION"] == "EE":
linkedProductId = self.getRowIdByEntityId(linkedProductId)
if linkedProductId == 0:
self.logger.warning("sku/link: {0}/{1} failed. Linked product is not found.".format(sku,link))
continue
self.adaptor.mySQLCursor.execute(self.INSERTCATALOGPRODUCTLINKSQL, [productId, linkedProductId, linkTypeId])
linkId = self.adaptor.mySQLCursor.lastrowid
self.adaptor.mySQLCursor.execute(self.INSERTCATALOGPRODUCTLINKATTRIBUTEINT, [productLinkAttributeId, linkId, link["position"]])
return productId
def _getProductOutOfStockQty(self, productId, websiteId=0, minQty=0):
outOfStockQty = minQty
self.adaptor.mySQLCursor.execute(self.GETPRODUCTOUTOFSTOCKQTYSQL, [productId, 1])
res = self.adaptor.mySQLCursor.fetchone()
if res is not None and len(res) > 0:
outOfStockQty = res['min_qty']
return outOfStockQty
def setInventory(self, sku, data):
productId = self.getProductIdBySku(sku)
metaData = dict((k, list(set(map(lambda d: d[k], data)))) for k in ['store_id'])
for storeId in metaData["store_id"]:
websiteId = self._getWebsiteIdByStoreId(storeId)
qty = sum(int(item["qty"]) for item in list(filter(lambda t: (t["store_id"]==storeId), data)))
outOfStockQty = self._getProductOutOfStockQty(productId, websiteId=websiteId)
if qty > int(outOfStockQty):
isInStock = 1
stockStatus = 1
else:
isInStock = 0
stockStatus = 0
self.adaptor.mySQLCursor.execute(self.SETSTOCKSTATUSQL, [productId, websiteId, 1, qty, stockStatus, qty, stockStatus])
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 0")
self.adaptor.mySQLCursor.execute(self.SETSTOCKITEMSQL, [productId, 1, qty, isInStock, websiteId, qty, isInStock])
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 1")
return productId
def setProductCategory(self, productId, categoryId, position=0):
if productId is None or categoryId is None:
pass
else:
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 0")
self.adaptor.mySQLCursor.execute(self.SETPRODUCTCATEGORYSQL, [categoryId, productId, position, categoryId, productId, position])
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 1")
if self.setting["VERSION"] == "EE":
self.adaptor.mySQLCursor.execute(self.UPDATECATEGORYCHILDRENCOUNTEESQL, [categoryId])
else:
self.adaptor.mySQLCursor.execute(self.UPDATECATEGORYCHILDRENCOUNTSQL, [categoryId])
def getMaxCategoryId(self):
self.adaptor.mySQLCursor.execute(self.GETMAXCATEGORYIDSQL)
res = self.adaptor.mySQLCursor.fetchone()
maxCategoryId = int(res['max_category_id'])
return maxCategoryId
def insertCatalogCategoryEntity(self, currentPathIds, attributeSet='Default'):
entityMetadata = self.getEntityMetaData('catalog_category', attributeSet)
if entityMetadata == None:
return 0
entityId = self.getMaxCategoryId() + 1
parentId = currentPathIds[-1]
level = len(currentPathIds)
pathIds = currentPathIds[:]
pathIds.append(entityId)
path = "/".join([str(pathId) for pathId in pathIds])
childrenCount = 0
position = 0
if self.setting["VERSION"] == "EE":
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 0;")
self.adaptor.mySQLCursor.execute(
self.INSERTCATALOGCATEGORYENTITYEESQL,
[entityId, entityMetadata['attribute_set_id'], parentId, path, level, childrenCount, position]
)
categoryId = self.adaptor.mySQLCursor.lastrowid
self.adaptor.mySQLCursor.execute("""UPDATE catalog_category_entity SET entity_id = row_id WHERE row_id = %s;""", (categoryId,))
self.adaptor.mySQLCursor.execute("""INSERT INTO sequence_catalog_category (sequence_value) VALUES (%s);""", (categoryId,))
self.adaptor.mySQLCursor.execute("SET FOREIGN_KEY_CHECKS = 1;")
else:
self.adaptor.mySQLCursor.execute(
self.INSERTCATALOGCATEGORYENTITYSQL,
[entityMetadata['attribute_set_id'], parentId, path, level, childrenCount, position]
)
categoryId = self.adaptor.mySQLCursor.lastrowid
return categoryId
def _createCategory(self, currentPathIds, category, storeId):
categoryId = self.insertCatalogCategoryEntity(currentPathIds)
urlKey = re.sub('[^0-9a-zA-Z ]+', '', category).replace(" ","-").lower()
data = {
'name': category,
'url_key': urlKey,
'url_path': urlKey,
'is_active': '1',
'is_anchor': '1',
'include_in_menu': '0',
'custom_use_parent_settings': '0',
'custom_apply_to_products': '0',
'display_mode': 'PRODUCTS',
}
self.syncEntityData(categoryId, data, entityTypeCode='catalog_category', storeId=storeId)
return categoryId
def _getCategoryId(self, level, parentId, category):
categoryId = None
sql = self.GETCATEGORYIDBYATTRIBUTEVALUEANDPATHSQL
if self.setting["VERSION"] == "EE":
sql = self.GETCATEGORYIDBYATTRIBUTEVALUEANDPATHEESQL
self.adaptor.mySQLCursor.execute(sql, [level, parentId, category])
res = self.adaptor.mySQLCursor.fetchone()
if res is not None and len(res) > 0:
categoryId = res['row_id'] if self.setting["VERSION"] == "EE" else res['entity_id']
return categoryId
def setCategories(self, sku, data):
productId = self.getProductIdBySku(sku)
for row in data:
storeId = row.pop("store_id", 0)
delimeter = row.pop("delimeter", "/")
applyAllLevels = row.pop("apply_all_levels")
path = row.pop("path")
position = row.pop("position", 0)
if path is None or path.strip() == '':
raise Exception("Path is empty for {0}".format(sku))
elif productId == 0:
raise Exception("Product({0}) does not existed in Magento".format(sku))
else:
categories = path.split(delimeter)
try:
parentId = 1
currentPathIds = [1]
for idx in range(0,len(categories)):
currentPath = delimeter.join(categories[0:idx+1])
category = categories[idx]
level = idx + 1
categoryId = self._getCategoryId(level, parentId, category)
if categoryId is None:
categoryId = self._createCategory(currentPathIds, category, storeId)
if applyAllLevels == True:
if level == 1:
parentId = categoryId
currentPathIds.append(categoryId)
continue
else:
self.setProductCategory(productId, categoryId, position=position)
elif level == len(categories):
self.setProductCategory(productId, categoryId, position=position)
currentPathIds.append(categoryId)
parentId = categoryId
except Exception:
raise
return productId
def getVariants(self, productId, attributeIds, adminStoreId=0):
sql = self.GETCONFIGSIMPLEPRODUCTSSQL.format(
id="row_id" if self.setting["VERSION"] == "EE" else "entity_id",
store_id=adminStoreId,
product_id=productId,
attribute_ids=",".join([str(attributeId) for attributeId in attributeIds])
)
self.adaptor.mySQLCursor.execute(sql)
rows = self.adaptor.mySQLCursor.fetchall()
variants = []
if len(rows) > 0:
metaData = dict((k, list(set(map(lambda d: d[k], rows)))) for k in ['sku'])
for sku in metaData["sku"]:
attributes = list(filter(
lambda row: row["sku"] == sku,
rows
)
)
variants.append({
"sku": sku,
"attributes": dict(
(
attribute["attribute_id"],
attribute["value"]
) for attribute in attributes
)
}
)
return variants
def setVariants(self, sku, data):
# Validate the child product.
# data = {
# "store_id": "0",
# "variant_visibility": True,
# "variants": [
# {
# "variant_sku": "abc",
# "attributes": {"att_a": "abc", "att_x": "xyz"}
# }
# ]
# }
parentProductId = self.getProductIdBySku(sku)
        attributeIds = None
for product in data.get("variants"):
attributes = []
for attributeCode, value in product["attributes"].items():
(dataType, attributeMetadata) = self.getAttributeMetadata(attributeCode, "catalog_product")
# Check if the attribute is a "SELECT".
assert (attributeMetadata['frontend_input'] == 'select'), \
"The attribute({attribute_code}) is not a 'SELECT' type.".format(attribute_code=attributeCode)
# Check if the product has the attribute with the option value.
optionId = self.getOptionId(attributeMetadata['attribute_id'], value, adminStoreId=data.get("store_id", "0"))
if optionId is None:
options = {0: value}
self.setAttributeOptionValues(attributeMetadata['attribute_id'], options, adminStoreId=data.get("store_id", "0"))
attributes.append(
{"attribute_code": attributeCode, "value": value, "attribute_id": attributeMetadata['attribute_id']}
)
            if attributeIds is None:
attributeIds = [attribute["attribute_id"] for attribute in attributes]
else:
_attributeIds = [attribute["attribute_id"] for attribute in attributes]
assert (len(list(set(attributeIds) - set(_attributeIds))) == 0), \
"Previous attributes({previous_attribute_ids}) are not matched with attributes({attribute_ids}).".format(
previous_attribute_ids=",".join([str(attributeId) for attributeId in attributeIds]),
attribute_ids=",".join([str(attributeId) for attributeId in _attributeIds])
)
# Check if there is a product with the same attributes.
variants = list(filter(lambda product: product["attributes"] == dict(
(attribute["attribute_id"], attribute["value"]) for attribute in attributes
),
self.getVariants(parentProductId, attributeIds, adminStoreId=data.get("store_id", "0"))
))
# If there is a product matched with the attribute and value set and the sku is not matched,
# raise an exception.
if len(variants) != 0:
assert (variants[0]["sku"] == product["variant_sku"]), \
"There is already a product({sku}) matched with the attributes({attributes}) of the sku({variant_sku}).".format(
sku=variants[0]["sku"],
attributes=",".join(["{attribute_code}:{value}".format(attribute_code=attribute["attribute_code"],value=attribute["value"]) for attribute in attributes]),
variant_sku=product["variant_sku"]
)
# Insert or update the child product.
for product in data.get("variants"):
self.logger.info("Insert/Update the item({variant_sku}).".format(variant_sku=product["variant_sku"]))
for attributeId in attributeIds:
self.adaptor.mySQLCursor.execute(
self.REPLACECATALOGPRODUCTSUPERATTRIBUTESQL.format(
product_id=parentProductId,
attribute_id=attributeId
)
)
productId = self.getProductIdBySku(product["variant_sku"])
if self.setting["VERSION"] == "EE" and productId != 0:
productId = self.getRowIdByEntityId(productId)
self.adaptor.mySQLCursor.execute(
self.REPLACECATALOGPRODUCTRELATIONSQL.format(
parent_id=parentProductId, child_id=productId
)
)
self.adaptor.mySQLCursor.execute(
self.REPLACECATALOGPRODUCTSUPERLINKSQL.format(
product_id=productId, parent_id=parentProductId
)
)
self.adaptor.mySQLCursor.execute(
self.UPDATEPRODUCTVISIBILITYSQL.format(
id="row_id" if self.setting["VERSION"] == "EE" else "entity_id",
value=4 if data.get("variant_visibility", False) else 1,
product_id=productId
)
)
# Visibility for the parent product.
self.adaptor.mySQLCursor.execute(
self.UPDATEPRODUCTVISIBILITYSQL.format(
id="row_id" if self.setting["VERSION"] == "EE" else "entity_id",
value=4,
product_id=parentProductId
)
)
return parentProductId
def syncProductExtData(self, sku, dataType, data):
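# Example (hypothetical connector instance and payload variable):
#   productId = connector.syncProductExtData("abc", "inventory", inventoryData)
# dataType must be one of: customoption, imagegallery, links,
# inventory, categories, variants.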
try:
if dataType == "customoption":
productId = self.setCustomOptions(sku, data)
elif dataType == "imagegallery":
productId = self.setImageGallery(sku, data)
elif dataType == "links":
productId = self.setLinks(sku, data)
elif dataType == "inventory":
productId = self.setInventory(sku, data)
elif dataType == "categories":
productId = self.setCategories(sku, data)
elif dataType == "variants":
productId = self.setVariants(sku, data)
else:
raise Exception("Data Type({0}) is not supported.".format(dataType))
self.adaptor.commit()
return productId
except Exception:
self.adaptor.rollback()
raise
def getTotalProductsCount(self, cutdt, attributeSetName='%', sql=None):
sql = self.EXPORTPRODUCTSCOUNTSQL if sql is None else sql
sql = sql.format(
updated_at=cutdt,
attribute_set_name=attributeSetName
)
self.adaptor.mySQLCursor.execute(sql)
res = self.adaptor.mySQLCursor.fetchone()
return res['total']
def getImages(self, cutdt, offset=None, limit=None, sql=None):
sql = (
self.EXPORTMEDIAIMAGESEESQL if self.setting["VERSION"] == "EE" else self.EXPORTMEDIAIMAGESSQL
) if sql is None else sql
if offset is not None and limit is not None:
sql = sql + " LIMIT {0} OFFSET {1}".format(limit,offset)
elif limit is not None:
sql = sql + " LIMIT {0}".format(limit)
result = []
imageTypes = ["image", "small_image", "thumbnail"]
for imageType in imageTypes:
# Format a fresh copy per image type; reassigning sql would consume the
# placeholders on the first pass and re-run the same query afterwards.
formattedSql = sql.format(
base_url=self.setting["MEDIABASEURL"],
image_type=imageType,
attribute_code=imageType,
updated_at=cutdt
)
self.adaptor.mySQLCursor.execute(formattedSql)
res = self.adaptor.mySQLCursor.fetchall()
result.extend(res)
return result
def getGallery(self, cutdt, offset=None, limit=None, sql=None):
sql = (
self.EXPORTMEDIAGALLERYEESQL if self.setting["VERSION"] == "EE" else self.EXPORTMEDIAGALLERYSQL
) if sql is None else sql
if offset is not None and limit is not None:
sql = sql + " LIMIT {0} OFFSET {1}".format(limit,offset)
elif limit is not None:
sql = sql + " LIMIT {0}".format(limit)
sql = sql.format(
base_url=self.setting["MEDIABASEURL"],
updated_at=cutdt
)
self.adaptor.mySQLCursor.execute(sql)
res = self.adaptor.mySQLCursor.fetchall()
return res
def getOrders(self, cutdt):
sql = self.GETORDERSSQL.format(updated_at=cutdt)
self.adaptor.mySQLCursor.execute(sql)
rawOrders = self.adaptor.mySQLCursor.fetchall()
for order in rawOrders:
sql = self.GETORDERITEMSSQL.format(order_id=order['id'])
self.adaptor.mySQLCursor.execute(sql)
order['items'] = self.adaptor.mySQLCursor.fetchall()
return rawOrders | AWS-Mage2Connector | /AWS_Mage2Connector-0.0.2-py3-none-any.whl/aws_mage2connector/aws_mage2connector.py | aws_mage2connector.py |
import subprocess
import aws_manager.settings as settings
import distutils.spawn
def read_file_to_array(path_to_file):
"""
read a file into an array
:param str path_to_file: full path to array
:return:
"""
try:
file_content = open(path_to_file)
except IOError:
return False
content_array = file_content.readlines()
file_content.close()
return content_array
def is_pip_installed():
"""
Check if the pip is installed
:return True if installed, False otherwise
"""
return is_command_exists("pip")
def install_pip():
"""
install pip
:return:
"""
install = raw_input("%s needs to have pip install in order to run. Do you wish to install it now? (y/n)" %
settings.script_name)
if install is "y":
subprocess.call("sudo python %s" % settings.pip_installation, shell=True)
elif install is "n":
print "Cannot run without Pip installed"
exit()
else:
print "Cannot run without Pip installed"
install_pip()
init_and_run()
def uninstall_pip():
"""
uninstall pip
:return:
"""
print "This will uninstall both boto3 moudle and pip."
uninstall_boto3()
install = raw_input("%s needs to have pip install in order to run. Are you sure you wish to uninstall it? (y/n)" %
settings.script_name)
if install is "y":
subprocess.call("sudo pip uninstall pip", shell=True)
init_and_run()
def is_boto3_installed():
"""
Check if the boto3 is installed
:return True if installed, False otherwise
"""
return is_module_exists("boto3")
def install_boto3():
"""
install boto3
:return:
"""
install = raw_input("%s needs to have boto3 installed in order to run. Do you wish to install it now? (y/n)" %
settings.script_name)
if install is "y":
subprocess.call("sudo pip install boto3", shell=True)
elif install is "n":
print "Cannot run without boto3 installed"
exit()
else:
print "Cannot run without boto3 installed"
install_boto3()
init_and_run()
def uninstall_boto3():
"""
uninstall boto3
:return:
"""
install = raw_input("%s needs to have boto3 install in order to run. Are you sure you wish to uninstall it? (y/n)" %
settings.script_name)
if install is "y":
subprocess.call("sudo pip uninstall boto3", shell=True)
else:
init_and_run()
def is_command_exists(name):
"""
Check if a command exists
:param str name: the name of the command
:return: True if the command exists, False otherwise
"""
return distutils.spawn.find_executable(name) is not None
def is_module_exists(name):
"""
Check if a module exists
:param str name: the name of the module
:return:
"""
try:
__import__(name)
except ImportError:
return False
else:
return True
def init_and_run():
"""
Initialize the system, making sure all the dependencies and certificates are installed and run the script
"""
install_dependencies()
import aws_manager.menus as menus
import aws_manager.aws as aws
if not aws.is_valid_credentials_set():
menus.show_credential_setup_menu()
else:
menus.show_main_menu()
def install_dependencies():
"""
Install all the dependencies necessary to run
"""
if not is_pip_installed():
install_pip()
if not is_boto3_installed():
install_boto3() | AWS-Manager | /AWS-Manager-0.4.tar.gz/AWS-Manager-0.4/aws_manager/functions/functions.py | functions.py |
import aws_manager.aws as aws
import subprocess
import aws_manager.settings as settings
import aws_manager.functions as functions
_instances = None
""":rtype list instances of ec2"""
_region = None
def show_credential_setup_menu():
"""
Show the menu for adding credentials files
"""
subprocess.call("clear")
print "Let's setup some initial settings:";
aws_file_path = raw_input("What is the path to your AWS Access Key?\n");
aws.save_path_to_config(aws_file_path)
credentials = aws.load_from_config()
if credentials is None:
print "User credentials are invalid. Make sure you have the right path and permissions for the file"
exit()
show_main_menu()
def show_main_menu():
"""
Show the script main menu
"""
credentials = aws.get_current_credentials()
subprocess.call("clear")
print "Welcome to", settings.script_name
print "AWS User Key:", credentials["username"]
print ""
print "Please Choose an option from the menu:"
print "1. EC2 Instances Management"
print "2. Change AWS credentials settings"
print "3. Uninstall Pip & Boto3"
print "4. Exit"
try:
main_menu_options(int(raw_input()))()
except ValueError:
show_main_menu()
def main_menu_options(i):
"""
The main menu options process
:param int i: the index of the option
:return:
"""
return {
1: show_region_menu,
2: show_credential_setup_menu,
3: functions.uninstall_pip,
4: exit
}.get(i, show_main_menu)
def show_region_menu(is_no_instances=False):
"""
Show the region menu
:param boolean is_no_instances: default is False. call this method with True to show that there were no instances in
the selected region
"""
global _instances, _region
subprocess.call("clear")
if is_no_instances:
print "Sorry, no instances were found in this region\n"
regions = {
1: "US East (N. Virginia): us-east-1",
2: "US West (Oregon): us-west-2",
3: "US West (N. California): us-west-1",
4: "EU (Ireland): eu-west-1",
5: "EU (Frankfurt): eu-central-1",
6: "Asia Pacific (Singapore): ap-southeast-1",
7: "Asia Pacific (Sydney): ap-southeast-2",
8: "Asia Pacific (Tokyo): ap-northeast-1",
9: "South America (Sao Paulo): sa-east-1",
10: "Back"
}
print "Please choose the region of the EC2 instance you wish to connect to:"
for key, value in regions.items():
print "%d. %s" % (key, value)
try:
user_input = int(raw_input())
chosen_region = region_menu_options(user_input)
if isinstance(chosen_region, str):
print "Loading instances..."
_instances = aws.load_ec2_instances(chosen_region)
_region = regions.get(user_input)
if _instances is None:
show_region_menu(True)
else:
show_environments_menu()
else:
chosen_region()
except ValueError:
show_region_menu()
def region_menu_options(i):
"""
region menu options
:param int i: the index the user chosen
:return:
"""
return {
1: "us-east-1",
2: "us-west-2",
3: "us-west-1",
4: "eu-west-1",
5: "eu-central-1",
6: "ap-southeast-1",
7: "ap-southeast-2",
8: "ap-northeast-1",
9: "sa-east-1",
10: show_main_menu
}.get(i, show_region_menu)
def show_environments_menu():
"""
Show the environments menu, listing the environments found in the loaded EC2 instances
"""
global _instances
subprocess.call("clear")
print "Region: %s\n" % _region
print "Please choose the environment your instance is located in:"
environments = aws.get_environments_from_instances(_instances)
for i, environment in enumerate(environments):
print "%d. %s" % (i + 1, environment)
print "%d. Back" % (len(environments) + 1)
try:
chosen_index = int(raw_input())
chosen_environment = environments[chosen_index - 1]
show_applications_menu(chosen_environment)
except ValueError:
show_environments_menu()
except IndexError:
if chosen_index == len(environments) + 1:
show_region_menu()
else:
show_environments_menu()
def show_applications_menu(environment):
"""
Show the application menu
:param str environment: the name of the environment the user chose
"""
global _instances
subprocess.call("clear")
print "Region: %s\nEnvironment: %s\n" % (_region, environment)
print "Please choose the application your instance is part of:"
filtered_instance = aws.filter_instances_by_tag(_instances, "Environment", environment)
applications = aws.get_applications_from_instances(filtered_instance)
for i, application in enumerate(applications):
print "%d. %s" % (i + 1, application)
print "%d. Back" % (len(applications) + 1)
try:
chosen_index = int(raw_input())
chosen_application = applications[chosen_index - 1]
show_instances_menu(filtered_instance, environment, chosen_application)
except ValueError:
show_applications_menu(environment)
except IndexError:
if chosen_index == len(applications) + 1:
show_environments_menu()
else:
show_applications_menu(environment)
def show_instances_menu(instances, environment, application):
"""
Show the instance list
:param list instances: the list of instances from AWS
:param str environment: the environment we are showing
:param str application: the application the user chose
:return:
"""
global _instances
subprocess.call("clear")
print "Region: %s\nEnvironment: %s\nApplication: %s\n" % (_region, environment, application)
print "Please choose the instance you want to manage:"
filtered_instances = aws.filter_instances_by_tag(instances, "Application", application)
for i, instance in enumerate(filtered_instances):
print "%d. %s" % (i + 1, aws.convert_instance_to_menu_string(instance))
print "%d. Back" % (len(filtered_instances) + 1)
try:
chosen_index = int(raw_input())
show_instance_manager_menu(filtered_instances, chosen_index - 1, environment, application)
except ValueError:
show_instances_menu(_instances, environment, application)
except IndexError:
if chosen_index == len(filtered_instances) + 1:
show_applications_menu(environment)
else:
show_instances_menu(_instances, environment, application)
def show_instance_manager_menu(instances, index, environment, application):
"""
Menu for a specific instance
:param list instances: the EC2 AWS instances list
:param int index: the index of the current instance in the list
:param str environment: the environment the user chose
:param str application: the application the user chose
:return:
"""
global _instances
subprocess.call("clear")
instance = instances[index]
i = 1
print "Region: %s\nEnvironment: %s\nApplication: %s\n" % (_region, environment, application)
print "Instance: %s" % (aws.get_instance_tag(instance, "Name"))
print "Please choose what you want to do:"
print "%d. Connect" % i
i += 1
if aws.has_repository(instance):
print "%d. Pull git branch" % i
i += 1
print "%d. Back" % i
try:
chosen = int(raw_input())
except ValueError:
show_instance_manager_menu(instances, index, environment, application)
return
if chosen == i:
show_instances_menu(_instances, environment, application)
elif chosen == 1:
aws.connect_to_instance(instances, index)
show_instance_manager_menu(instances, index, environment, application)
elif chosen == 2:
show_git_pull_menu(instances, index, environment, application)
else:
show_instance_manager_menu(instances, index, environment, application)
def show_git_pull_menu(instances, index, environment, application):
"""
Show the git pull menu for the instance
:param list instances: the EC2 AWS instances list
:param int index: the index of the current instance in the list
:param str environment: the environment the user chose
:param str application: the application the user chose
"""
instance = instances[index]
default_branch = aws.get_instance_tag(instance, "Default Branch")
if default_branch is None:
default_branch = settings.default_git_branch
branch = raw_input("Please specify branch or press enter for default (default: %s)\n" % default_branch)
if branch == "":
aws.pull_git_branch(instances, index, default_branch)
else:
aws.pull_git_branch(instances, index, branch)
raw_input("Press enter to continue")
show_instance_manager_menu(instances, index, environment, application) | AWS-Manager | /AWS-Manager-0.4.tar.gz/AWS-Manager-0.4/aws_manager/menus/menus.py | menus.py |
import aws_manager.config_file as config_file
import aws_manager.functions as functions
_username = None
""":type str Aws user name"""
_aws_access_key = None
""":type str Aws access key """
_aws_access_key_secret = None
""":type str Aws access key secret"""
_file_path = None
""":type str file path """
def save_path_to_config(path):
"""
Save path to credential file into the config file
:param str path: to credential file
"""
path = path.replace("\\", "")
config_file.add_parameter("aws_credentials", "aws_credential_path", path)
def load_from_config():
"""
create an AwsCredentials instance from a credential file in the config
:return AwsCredentials: instance
"""
aws_credentials_path = get_stored_credentials_path()
if aws_credentials_path is None:
return None
credentials = get_current_credentials(aws_credentials_path)
return credentials
def get_current_credentials(file_path=None):
"""
Get the currently loaded aws credentials
:param str file_path: optional path to load the credentials from
:rtype: dict
:return dict with "key", "secret" and "username" keys
"""
global _aws_access_key, _aws_access_key_secret, _username
if file_path is not None:
load_from_file(file_path)
credentials = {"key": _aws_access_key, "secret": _aws_access_key_secret, "username": _username}
return credentials
def set_current_credentials(key, secret):
"""
Set the current credentials
:param str key: the aws access key
:param str secret: the aws access key secret
"""
global _aws_access_key, _aws_access_key_secret
_aws_access_key = key
_aws_access_key_secret = secret
def get_stored_credentials_path():
"""
Get the stored credentials path
:return str: the stored credential path
"""
return config_file.get_parameter("aws_credentials", "aws_credential_path")
def remove_stored_credential_path():
"""
Remove stored credential path
"""
return config_file.remove_parameter("aws_credentials", "aws_credential_path")
def is_valid_credentials_set():
"""
Check if there is a valid credential set
:return: True if there are credential set, False otherwise
"""
if load_from_config() is None:
return False
return True
def set_file_path(file_path):
"""
Set the file path
:param str file_path: the file path
:return:
"""
global _file_path
_file_path = file_path
def get_file_path():
"""
Get the file path
:return: the file path for the credentials
:rtype: str
"""
global _file_path
return _file_path
def set_username(username):
"""
Set the user name
:param str username: the user name
"""
global _username
_username = username
def get_username():
"""
Get the username
:return: the user name
:rtype: str
"""
global _username
return _username
def set_key(key):
"""
Set the AWS access key
:param str key: the AWS access key
"""
global _aws_access_key
_aws_access_key = key
def get_key():
"""
Get the AWS access key
:return: the AWS access key
:rtype: str
"""
global _aws_access_key
return _aws_access_key
def set_secret(secret):
"""
Set the AWS secret
:param str secret: the aws secret
"""
global _aws_access_key_secret
_aws_access_key_secret = secret
def get_secret():
"""
Get the aws secret
:return: the aws secret
:rtype: str
"""
global _aws_access_key_secret
return _aws_access_key_secret
def load_from_file(file_path):
"""
Creates a credential object from a file
:param str file_path: the path to credential file
"""
global _file_path, _username, _aws_access_key, _aws_access_key_secret
try:
_file_path = file_path.strip()
_file_path = _file_path.replace("\\", "")
credentials_file_content = functions.read_file_to_array(_file_path)
if not credentials_file_content:
raise IOError("Error opening credentials file")
credentials_file_content = credentials_file_content[1].split(",")
_username = credentials_file_content[0].replace('"', '')
_aws_access_key = credentials_file_content[1]
_aws_access_key_secret = credentials_file_content[2]
except IOError, e:
remove_stored_credential_path()
print e
exit()
def save_to_config():
"""
Save the credentials to the config file
"""
global _file_path
config_file.add_parameter("aws_credentials", "aws_credential_path", _file_path) | AWS-Manager | /AWS-Manager-0.4.tar.gz/AWS-Manager-0.4/aws_manager/aws/aws_credentials.py | aws_credentials.py |
import aws_credentials
import boto3
import collections
import aws_manager.config_file as config_file
from subprocess import PIPE, Popen
import subprocess
def get_default_region():
return "eu-central-1"
def load_ec2_instances(region):
"""
Load the EC2 instances a region
:param region:
:rtype: list
:return: a list of the instances in a region or None if there are no instances
"""
ec2 = _get_resource("ec2", region)
ec2_instances = ec2.instances.all()
counter = collections.Counter(ec2_instances)
ec2_size = sum(counter.itervalues())
if ec2_size == 0:
return None
return ec2_instances.filter(Filters=[{'Name': 'instance-state-name', 'Values': ['running']}])
def get_environments_from_instances(instances):
"""
Get all the environments available from instances lists
:param list instances: the list of instance
:rtype: list
:return: a list of the environments
"""
environments = []
for instance in instances:
tags = instance.tags
for tag in tags:
key = tag.get("Key")
if key == "Environment":
environment = tag.get("Value").strip()
if environment not in environments and environment is not "":
environments.append(environment)
return environments
def get_applications_from_instances(instances):
"""
Get all the application available from instances lists
:param list instances: the list of instance
:rtype: list
:return: a list of the applications
"""
applications = []
for instance in instances:
tags = instance.tags
for tag in tags:
key = tag.get("Key")
if key == "Application":
application = tag.get("Value").strip()
if application not in applications:
applications.append(application)
return applications
def get_nat_from_instances(instances):
"""
Get a NAT instance from the list of instances
:param instances: the list of instances
:return: the NAT instance from that list
"""
for instance in instances:
name = get_instance_tag(instance, "Name")
if name == "NAT":
return instance
return None
def convert_instance_to_menu_string(instance):
"""
Convert an instance object into a menu string
:param instance: the instance to prepare the string for
:rtype: str
:return: the string of the instance
"""
additional_list = []
string = ""
name = get_instance_tag(instance, "Name")
if instance.private_ip_address is not None and instance.private_ip_address != "":
additional_list.append("Private Ip: %s" % instance.private_ip_address)
if instance.public_ip_address is not None and instance.public_ip_address != "":
additional_list.append("Public Ip: %s" % instance.public_ip_address)
public_domain = get_instance_tag(instance, "Domain")
if public_domain is not None and public_domain != "":
public_domain = "Domain: %s" % public_domain
additional_list.append(public_domain)
for additional in additional_list:
if not string:
string = "(%s" % additional
else:
string = "%s, %s" % (string, additional)
if len(additional_list) == 0:
return name
else:
string = "%s)" % string
string = "%s %s" % (name, string)
return string
def get_instance_tag(instance, key):
"""
Get instance tag
:param boto3.ec2 instance: the instance to get the name for
:param str key: the key of the tag
:rtype: str
:return: the name of the instance or None if no name was defined
"""
tags = instance.tags
for tag in tags:
if tag.get("Key").strip() == key.strip():
return tag.get("Value").strip();
return None
def filter_instances_by_tag(instances, key, value):
"""
filter a list of instances according to a tag value
:param list instances: the list of instances to filter
:param str key: the key to filter the instance according to
:param str value: the value of the key
:rtype: list
:return: a filtered list of instances
"""
filtered_instance = []
for instance in instances:
tags = instance.tags
for tag in tags:
if tag.get("Key").strip() == key.strip() and tag.get("Value").strip() == value.strip():
filtered_instance.append(instance)
return filtered_instance
def connect_to_instance(instances, index, command=None):
"""
Connect to an ec2 instance.
This will connect directly to the ec2 or use the NAT instance as needed
It will launch an ssh session to the instance
:param instances: the list of instance from which you want to connect to the instance
:param index: the index of the instance you want to connect to
:param command: optional command to start the instance for and stop when finished
"""
instance = instances[index]
nat_instance = get_nat_from_instances(instances)
if nat_instance is None:
start_ssh_session(instance, "ubuntu", instance.key_name, command)
elif instance == nat_instance:
start_ssh_session(instance, "ec2-user", instance.key_name, command)
else:
start_nat_ssh_session(instance, "ubuntu", nat_instance, "ec2-user", instance.key_name, command)
def start_ssh_session(instance, user, key_name, command=None):
"""
Starts an ssh session to an instance
:param instance: the instance to start the session to
:param user: the user name to use to log into the instance
:param key_name: the name of the key file for login
:param command: optional command to start the instance for and stop when finished
"""
key_pair_path = load_key_pair(key_name)
ssh_command = "ssh -i '%s' %s@%s" % (
key_pair_path, user, instance.public_ip_address)
if command is not None:
ssh_command = '%s "%s"' % (ssh_command, command)
print "Connecting to %s instance and running command" % get_instance_tag(instance, "Name")
else:
print "Connecting to %s instance" % get_instance_tag(instance, "Name")
key_error = "Permission denied (publickey)"
call = Popen(ssh_command, shell=True, stderr=PIPE)
stdout, stderr = call.communicate()
if key_error in stderr:
config_file.remove_parameter("key_pairs", key_name)
raw_input("Error loading key, please make sure the key and user are correct. Click enter to continue")
def start_nat_ssh_session(instance, instance_user, nat_instance, nat_user, key_name, command=None):
"""
Starts an ssh session to an instance through NAT instance
:param instance: the instance to connect to
:param instance_user: the instance user to connect with
:param nat_instance: the nat instance to connect to
:param nat_user: the nat user to connect with
:param key_name: the key_pair name
"""
key_pair_path = load_key_pair(key_name)
ssh_command = "ssh -A -t -i '%s' %s@%s" % (key_pair_path, nat_user, nat_instance.public_ip_address)
tunnel_command = "ssh %s@%s" % (instance_user, instance.private_ip_address)
if command is not None:
ssh_command = "%s '%s \"%s\"'" % (ssh_command, tunnel_command, command)
print "Connecting to %s instance and running command" % get_instance_tag(instance, "Name")
else:
ssh_command = "%s %s" % (ssh_command, tunnel_command)
print "Connecting to %s instance" % get_instance_tag(instance, "Name")
key_error = "Permission denied (publickey)"
call = Popen(ssh_command, shell=True, stderr=PIPE)
stdout, stderr = call.communicate()
if key_error in stderr:
config_file.remove_parameter("key_pairs", key_name)
raw_input("Error loading key, please make sure the key and user are correct. Click enter to continue")
def pull_git_branch(instances, index, branch="development"):
"""
Pull a git branch of an instance
:param instances: the EC2 instances list containing the instance to pull the branch for
:param index: the index of the instance
:param branch: the name of the branch to pull, development is default
"""
instance = instances[index]
username = config_file.get_parameter("git", "username")
password = config_file.get_parameter("git", "password")
if username is None:
username = raw_input("Please enter username to use with your git account\n")
config_file.add_parameter("git", "username", username)
if password is None:
password = raw_input("Please enter password to use with your git account\n")
config_file.add_parameter("git", "password", password)
user_pass = "%s:%s" % (username, password)
remote_repository = get_instance_tag(instance, "Remote Repository")
remote_repository = "https://%s@%s" % (user_pass, remote_repository)
local_repository = get_instance_tag(instance, "Local Repository")
git_command_1 = "sudo git --git-dir=%s/.git --work-tree=%s/ checkout -b %s" % (
local_repository, local_repository, branch)
git_command_2 = "sudo git --git-dir=%s/.git --work-tree=%s/ checkout %s" % (
local_repository, local_repository, branch)
git_command_3 = "sudo git --git-dir=%s/.git --work-tree=%s/ pull --no-edit %s %s" % (
local_repository, local_repository, remote_repository, branch)
command = "%s; %s; %s" % (git_command_1, git_command_2, git_command_3)
connect_to_instance(instances, index, command)
def has_repository(instance):
"""
Check if an instance has repository defined and get receive pull requests
:param instance: the instance to check
:return: True if there is a repository for the instance, false otherwise
:rtype: boolean
"""
if get_instance_tag(instance, "Remote Repository") is not None and get_instance_tag(instance,
"Local Repository") is not None:
return True
return False
def load_key_pair(key_name):
"""
Check if a key pair path exists in the config file, prompting for one if it is missing
:param string key_name: the name of the key pair
:return: path to key pair
:rtype str
"""
key_pair_path = config_file.get_parameter("key_pairs", key_name)
if key_pair_path is None:
print "Please define path for Key-Pair named %s" % key_name
key_pair_path = raw_input().replace("\\", "").strip()
config_file.add_parameter("key_pairs", key_name, key_pair_path)
subprocess.call("sudo ssh-add '%s'" % key_pair_path, shell=True)
return key_pair_path
def _get_resource(name, region=None):
"""
Get the resource for a name
:param str name: the name of the resource
:param str region: optional region
:return:
"""
credentials = aws_credentials.get_current_credentials()
boto3.setup_default_session(aws_access_key_id=credentials["key"],
aws_secret_access_key=credentials["secret"],
region_name=region)
return boto3.resource(name); | AWS-Manager | /AWS-Manager-0.4.tar.gz/AWS-Manager-0.4/aws_manager/aws/aws.py | aws.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / (self.stdev * math.sqrt(2*math.pi))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
axes[1].set_ylabel('Density')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-Matope-Sithole-distributions | /AWS_Matope_Sithole_distributions-0.1.tar.gz/AWS_Matope_Sithole_distributions-0.1/AWS_Matope_Sithole_distributions/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) number of trials
TODO: Fill out all functions below
"""
def __init__(self, prob=.5, size=20):
self.n = size
self.p = prob
Distribution.__init__(self, self.calculate_mean(), self.calculate_stdev())
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
self.mean = self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = math.sqrt(self.n * self.p * (1 - self.p))
return self.stdev
def replace_stats_with_data(self):
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
self.n = len(self.data)
self.p = 1.0 * sum(self.data) / len(self.data)
self.mean = self.calculate_mean()
self.stdev = self.calculate_stdev()
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.bar(x = ['0', '1'], height = [(1 - self.p) * self.n, self.p * self.n])
plt.title('Bar Chart of Data')
plt.xlabel('outcome')
plt.ylabel('count')
def pdf(self, k):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
a = math.factorial(self.n) / (math.factorial(k) * (math.factorial(self.n - k)))
b = (self.p ** k) * (1 - self.p) ** (self.n - k)
return a * b
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
x = []
y = []
# calculate the x values to visualize
for i in range(self.n + 1):
x.append(i)
y.append(self.pdf(i))
# make the plots
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
result = Binomial()
result.n = self.n + other.n
result.p = self.p
result.calculate_mean()
result.calculate_stdev()
return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
string: characteristics of the Binomial
"""
return "mean {}, standard deviation {}, p {}, n {}".\
format(self.mean, self.stdev, self.p, self.n) | AWS-Matope-Sithole-distributions | /AWS_Matope_Sithole_distributions-0.1.tar.gz/AWS_Matope_Sithole_distributions-0.1/AWS_Matope_Sithole_distributions/Binomialdistribution.py | Binomialdistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / (self.stdev * math.sqrt(2*math.pi))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
axes[1].set_ylabel('Density')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-OOP-distributions | /AWS_OOP_distributions-0.1.tar.gz/AWS_OOP_distributions-0.1/distributions/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) number of trials
TODO: Fill out all functions below
"""
def __init__(self, prob=.5, size=20):
self.n = size
self.p = prob
Distribution.__init__(self, self.calculate_mean(), self.calculate_stdev())
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
self.mean = self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = math.sqrt(self.n * self.p * (1 - self.p))
return self.stdev
def replace_stats_with_data(self):
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
self.n = len(self.data)
self.p = 1.0 * sum(self.data) / len(self.data)
self.mean = self.calculate_mean()
self.stdev = self.calculate_stdev()
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.bar(x = ['0', '1'], height = [(1 - self.p) * self.n, self.p * self.n])
plt.title('Bar Chart of Data')
plt.xlabel('outcome')
plt.ylabel('count')
def pdf(self, k):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
a = math.factorial(self.n) / (math.factorial(k) * (math.factorial(self.n - k)))
b = (self.p ** k) * (1 - self.p) ** (self.n - k)
return a * b
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
x = []
y = []
# calculate the x values to visualize
for i in range(self.n + 1):
x.append(i)
y.append(self.pdf(i))
# make the plots
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
result = Binomial()
result.n = self.n + other.n
result.p = self.p
result.calculate_mean()
result.calculate_stdev()
return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
string: characteristics of the Binomial
"""
return "mean {}, standard deviation {}, p {}, n {}".\
format(self.mean, self.stdev, self.p, self.n) | AWS-OOP-distributions | /AWS_OOP_distributions-0.1.tar.gz/AWS_OOP_distributions-0.1/distributions/Binomialdistribution.py | Binomialdistribution.py |
# OpenAPI AWS API Gateway linter
[](https://codecov.io/gh/evilmint/aws-openapi-lint) [](https://github.com/evilmint/aws-openapi-lint)
AWS-OpenAPI-Lint is a simple OpenAPI 3 yaml / json spec linter designed for checking API Gateway integration.
## Rules
It contains rules for checking whether:
- you have an authorizer on OPTIONS
- authorizer is mentioned in `requestParameters` but is not present in `security`
- http verbs are consistent in the path and integration
- all used headers in path from all verbs are mentioned in CORS rules and vice-versa
- CORS rules allow all verbs mentioned in the path
- CORS rules are present
- amazon integration is present
- path parameters present in `requestParameters` are not used in the direct path parameters and vice-versa
## Roadmap
- [ ] Update README with rule names and behavior
- [X] Support json specs
- [X] Add optional rule for checking base url equality
- [ ] Add support for configuration yaml file
- [ ] Add possibility to disable rule checking on specific paths
- [ ] Add possibility to disable rules per path
- [ ] Ignore path-params if `http_proxy` integration type used
- [X] Add option to disable rules via CLI
- [X] Add warning threshold to return with status code 0 if limit not exceeded
- [X] Fix flake8 violations
- [X] Add a license
- [X] Publish to PyPI or alike
- [X] Configure properly up GitHub actions to run tests on push
## Installation
```
pip install aws-openapi-lint
```
## Usage
`$ aws-openapi-lint path/to/spec.yml`
```
usage: main.py [-h] [--treat-errors-as-warnings]
[--warning-threshold WARNING_THRESHOLD]
[--exclude-rules EXCLUDE_RULES]
lint_file
Lint OpenAPI specifications based on AWS API Gateway.
positional arguments:
lint_file Specify path to the openapi schema file.
optional arguments:
-h, --help show this help message and exit
--treat-errors-as-warnings
Treats errors as warnings (exit code will be 0 unless
warning threshold is specified
--warning-threshold WARNING_THRESHOLD
Warning threshold which when surpassed renders exit
code to become 1)
--exclude-rules EXCLUDE_RULES
Excluded rules separated by comma.
```
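The rules can also be run programmatically. A minimal sketch is shown below (module paths assumed from this package's layout; adjust to your installed version):

```python
from aws_openapi_lint.rules.rule_validator import RuleValidator
from aws_openapi_lint.rules.NoCORSPresentRule import NoCORSPresentRule

validator = RuleValidator('path/to/spec.yml')
validator.add_rule(NoCORSPresentRule())

for violation in validator.validate():
    print(violation.identifier, violation.message, violation.path)
```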
| AWS-OpenAPI-Lint | /AWS-OpenAPI-Lint-0.2.2.tar.gz/AWS-OpenAPI-Lint-0.2.2/README.md | README.md |
from .bcolors import bcolors
from .rules.CORSInconsistentHeadersRule import CORSInconsistentHeadersRule
from .rules.CORSNotEnoughVerbsRule import CORSNotEnoughVerbsRule
from .rules.ConflictingHttpVerbsRule import ConflictingHttpVerbsRule
from .rules.IntegrationBaseUriRule import IntegrationBaseUriRule
from .rules.MissingAmazonIntegrationRule import MissingAmazonIntegrationRule
from .rules.NoCORSPresentRule import NoCORSPresentRule
from .rules.PathParamNotMappedRule import PathParamNotMappedRule
from .rules.AuthorizerOnOptionsRule import AuthorizerOnOptionsRule
from .rules.AuthorizerReferencedButMissingRule import AuthorizerReferencedButMissingRule
from .rules.rule_validator import RuleValidator
import sys
import argparse
def print_violations(violations):
for violation in violations:
print(violation.identifier, violation.message, violation.path)
violation_string = "violations"
if len(violations) == 1:
violation_string = "violation"
print(bcolors.FAIL + "{} {} found.".format(len(violations), violation_string))
def print_no_violations():
print(bcolors.OKGREEN + "0 violations found. Well done 💚")
def parse_arguments():
parser = argparse.ArgumentParser(description='Lint OpenAPI specifications based on AWS API Gateway.')
parser.add_argument('lint_file', help='Specify path to the openapi schema file.')
parser.add_argument('--treat-errors-as-warnings', action='store_const', const=True, default=False,
help='Treats errors as warnings (exit code will be 0 unless warning threshold is specified')
parser.add_argument('--warning-threshold', default=-1, type=int, help='Warning threshold which when surpassed '
'renders exit code to become 1)')
parser.add_argument('--exclude-rules', default="", type=str, help='Excluded rules separated by comma.')
parser.add_argument('--check-base-uri', default="", type=str, help='Checks whether every integration\'s '
'path is equal to the base uri specified.')
return parser.parse_args()
def cli(args=None, input_format="yaml", program_name="aws-openapi-lint"):
if len(sys.argv) == 1:
print('File path not passed as command line argument.')
exit(1)
args = parse_arguments()
supported_rules = [
ConflictingHttpVerbsRule(),
MissingAmazonIntegrationRule(),
PathParamNotMappedRule(),
AuthorizerOnOptionsRule(),
AuthorizerReferencedButMissingRule(),
NoCORSPresentRule(),
CORSNotEnoughVerbsRule(),
CORSInconsistentHeadersRule()
]
exclude_rules = args.exclude_rules.split(",")
effective_rules = filter(lambda r: r.rule_name not in exclude_rules, supported_rules)
rule_validator = RuleValidator(args.lint_file)
for rule in effective_rules:
rule_validator.add_rule(rule)
if args.check_base_uri != "":
rule_validator.add_rule(IntegrationBaseUriRule(base_uri=args.check_base_uri))
violations = rule_validator.validate()
if len(violations) == 0:
print_no_violations()
else:
print_violations(violations)
if args.treat_errors_as_warnings:
if args.warning_threshold != -1 and len(violations) > args.warning_threshold:
print("Warning threshold exceeded: {}/{}".format(len(violations), args.warning_threshold))
exit(1)
else:
exit(0)
else:
exit(0 if len(violations) == 0 else 1) | AWS-OpenAPI-Lint | /AWS-OpenAPI-Lint-0.2.2.tar.gz/AWS-OpenAPI-Lint-0.2.2/aws_openapi_lint/__init__.py | __init__.py |
import re
def find_path_params(path):
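# e.g. find_path_params("/users/{user_id}/items/{item_id}") yields "user_id", "item_id"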
path_params = re.findall(r'(\{[a-zA-Z _\-0-9]+\})', path)
path_params = map(lambda x: x.replace('{', '').replace('}', ''), path_params)
return path_params
def contains_apigateway_integration(path_verb):
return 'x-amazon-apigateway-integration' in path_verb
def contains_request_parameters(path_verb):
return 'requestParameters' in path_verb['x-amazon-apigateway-integration']
def get_path_verbs(spec, path):
verbs = spec['paths'][path].keys()
return map(lambda x: x.lower(), verbs)
def get_apigateway_integration(spec, path, verb):
return spec['paths'][path][verb]['x-amazon-apigateway-integration']
def path_contains_verb(spec, path, verb):
return verb in spec['paths'][path]
def get_path_headers(spec, path):
verbs = spec['paths'][path].keys()
header_parameters = []
for verb in verbs:
if 'parameters' not in spec['paths'][path][verb]:
continue
parameters = filter(lambda p: p['in'] == 'header', spec['paths'][path][verb]['parameters'])
parameters = map(lambda p: p['name'], parameters)
header_parameters += parameters
return header_parameters
def integration_response_contains_parameters(spec, path, verb, response, parameters):
response_params = get_apigateway_integration(spec, path, verb)['responses'][response]['responseParameters']
return parameters in response_params
def get_integration_response_parameters(spec, path, verb, response):
return get_apigateway_integration(spec, path, verb)['responses'][response]['responseParameters']
def get_integration_verb(spec, path, verb):
return get_apigateway_integration(spec, path, verb)['httpMethod']
def authorizer_referenced_in_request_params(spec, path, verb) -> bool:
request_params = get_apigateway_integration(spec, path, verb)['requestParameters']
for request_param in request_params.values():
if request_param.startswith('context.authorizer'):
return True
return False
def has_security_components(spec, path, verb):
has_security = 'security' in spec['paths'][path][verb]
return has_security and len(spec['paths'][path][verb]['security']) > 0 | AWS-OpenAPI-Lint | /AWS-OpenAPI-Lint-0.2.2.tar.gz/AWS-OpenAPI-Lint-0.2.2/aws_openapi_lint/rules/rules_helper.py | rules_helper.py |
from .rule_validator import RuleViolation
from .rules_helper import get_path_verbs, get_apigateway_integration, path_contains_verb, \
get_integration_response_parameters
class CORSNotEnoughVerbsRule:
def __init__(self):
self.rule_name = 'options_cors_not_enough_verbs'
def validate(self, spec):
violations = []
for path in spec['paths']:
if not path_contains_verb(spec, path, 'options'):
violations.append(self.missing_options_verb_rule_violation(path))
continue
integration = get_apigateway_integration(spec, path, 'options')
path_verbs = get_path_verbs(spec, path)
for response in integration['responses']:
if 'responses' not in integration or response not in integration['responses'] or \
'responseParameters' not in integration['responses'][response]:
violations.append(self.missing_options_verb_rule_violation(path))
continue
response_params = get_integration_response_parameters(spec, path, 'options', response)
if 'method.response.header.Access-Control-Allow-Methods' not in response_params:
violations.append(self.missing_options_verb_rule_violation(path))
else:
allow_methods_value = response_params['method.response.header.Access-Control-Allow-Methods']
integration_verbs = map(lambda x: x.lower().strip(), allow_methods_value[1:-1].split(','))
verbs_difference = set(path_verbs).symmetric_difference(set(integration_verbs))
for verb in verbs_difference:
message = 'Extra HTTP verb {} included in path or request mapping.'.format(verb)
violations.append(RuleViolation('options_cors_not_enough_verbs',
message=message,
path=path))
return violations
def missing_options_verb_rule_violation(self, path):
return RuleViolation('options_cors_not_enough_verbs',
message='Missing OPTIONS verb',
path=path) | AWS-OpenAPI-Lint | /AWS-OpenAPI-Lint-0.2.2.tar.gz/AWS-OpenAPI-Lint-0.2.2/aws_openapi_lint/rules/CORSNotEnoughVerbsRule.py | CORSNotEnoughVerbsRule.py |
from .rule_validator import RuleViolation
from .rules_helper import get_apigateway_integration, get_path_headers, get_integration_response_parameters, \
get_path_verbs
class CORSInconsistentHeadersRule:
def __init__(self):
self.rule_name = 'options_cors_incosistent_headers'
def validate(self, spec):
violations = []
for path in spec['paths']:
if 'options' not in get_path_verbs(spec, path):
continue
integration = get_apigateway_integration(spec, path, 'options')
path_headers = get_path_headers(spec, path)
for response in integration['responses']:
if 'responses' not in integration or response not in integration['responses'] or \
'responseParameters' not in integration['responses'][response]:
continue
integration_response_params = get_integration_response_parameters(spec, path, 'options', response)
if 'method.response.header.Access-Control-Allow-Headers' in integration_response_params:
integration_headers = self.get_access_control_allow_headers(integration_response_params)
headers_difference = set(path_headers).symmetric_difference(set(integration_headers))
for header in headers_difference:
message = 'Extra Allow-Header "{}" included in parameters or responseParameters.'.format(header)
violations.append(RuleViolation('options_cors_incosistent_headers',
message=message,
path=path))
return violations
def get_access_control_allow_headers(self, integration_response_params):
allow_headers_value = integration_response_params['method.response.header.Access-Control-Allow-Headers']
split_headers = map(lambda x: x.strip(), allow_headers_value[1:-1].split(','))
split_headers = filter(lambda h: len(h.strip()) > 0, split_headers)
return split_headers | AWS-OpenAPI-Lint | /AWS-OpenAPI-Lint-0.2.2.tar.gz/AWS-OpenAPI-Lint-0.2.2/aws_openapi_lint/rules/CORSInconsistentHeadersRule.py | CORSInconsistentHeadersRule.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / (self.stdev * math.sqrt(2*math.pi))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
axes[1].set_ylabel('Density')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-PYPI-Practice | /AWS_PYPI_Practice-0.2.tar.gz/AWS_PYPI_Practice-0.2/distributions/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) number of trials
TODO: Fill out all functions below
"""
def __init__(self, prob=.5, size=20):
self.n = size
self.p = prob
Distribution.__init__(self, self.calculate_mean(), self.calculate_stdev())
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
self.mean = self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = math.sqrt(self.n * self.p * (1 - self.p))
return self.stdev
def replace_stats_with_data(self):
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
self.n = len(self.data)
self.p = 1.0 * sum(self.data) / len(self.data)
self.mean = self.calculate_mean()
self.stdev = self.calculate_stdev()
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.bar(x = ['0', '1'], height = [(1 - self.p) * self.n, self.p * self.n])
plt.title('Bar Chart of Data')
plt.xlabel('outcome')
plt.ylabel('count')
def pdf(self, k):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
a = math.factorial(self.n) / (math.factorial(k) * (math.factorial(self.n - k)))
b = (self.p ** k) * (1 - self.p) ** (self.n - k)
return a * b
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
x = []
y = []
# calculate the x values to visualize
for i in range(self.n + 1):
x.append(i)
y.append(self.pdf(i))
# make the plots
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
result = Binomial()
result.n = self.n + other.n
result.p = self.p
result.calculate_mean()
result.calculate_stdev()
return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
            string: characteristics of the Binomial
"""
return "mean {}, standard deviation {}, p {}, n {}".\
format(self.mean, self.stdev, self.p, self.n) | AWS-PYPI-Practice | /AWS_PYPI_Practice-0.2.tar.gz/AWS_PYPI_Practice-0.2/distributions/Binomialdistribution.py | Binomialdistribution.py |
# AWS Tags As A DataBase (AWS TaaDB) 🚀🚀
[](https://badge.fury.io/py/AWSTagsAsADatabase)



**NOTE: Please Don't Actually Use this as a Database!**
**Please Reference [An AWS Database Safari By Corey Quinn](https://www.lastweekinaws.com/blog/an-aws-database-safari/) for actual databases**
## About 🏎️🏎️
Corey Quinn describes how to use AWS's managed DNS offering (Route 53) as a database in [Route 53, Amazon Premier Database By Corey Quinn](https://www.lastweekinaws.com/blog/route-53-amazons-premier-database/) & [Twitter Thread](https://twitter.com/quinnypig/status/1120653859561459712?lang=en).
To continue the trend of misusing random AWS resources, the AWS Tags As A Database (**AWS TaaDB**) Python🐍🐍 library was created to use the AWS Tags feature as a key-value database.
In its current configuration it uses AWS EC2 instance Tags as the database, but nothing stops it from using any AWS resource that supports Tags.
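Under the hood this is plain EC2 tag manipulation. Below is a minimal sketch of the equivalent raw boto3 calls (the instance ID is a placeholder; the calls themselves are documented in the links under Resources):
```python
import boto3

# Placeholder instance ID - substitute an instance from your account
instance = boto3.resource("ec2", region_name="us-east-1").Instance("i-0123456789abcdef0")

# "Write": create_tags upserts a Key/Value pair on the instance
instance.create_tags(Tags=[{"Key": "color", "Value": "blue"}])

# "Read": the tags attribute is a list of {'Key': ..., 'Value': ...} dicts
print({t["Key"]: t["Value"] for t in (instance.tags or [])})

# "Delete": delete_tags removes the pair by key
instance.delete_tags(Tags=[{"Key": "color"}])
```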
## Installation 🚀🚀
```bash
pip install TagsAsADatabase
```
## Examples 🚀🚀
```python
# imports AWS Tags As A Database Library
from TagsAsADatabase import DatabaseClient
# create a database client (using AWS EC2 instance Tags as backend)
# pass in the resource id of an ec2 instance
# region_name defaults to us-east-1
dbClient = DatabaseClient(INSTANCE_ID, region_name=REGION_NAME)
# gets all the current Keys of the key-value database
# returns type List[str]
print(dbClient.getAllKeys())
# gets all the key-value pairs
# returns as type Dict[str, str]
print(dbClient.getKeyValuePairs())
# adds or updates the VALUE at KEY
dbClient.updateKeyValue(KEY, VALUE)
# deletes the key-value pair at KEY
dbClient.deleteKeyValue(KEY)
```
## Resources 🚀🚀
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ec2.html#instance
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ec2.html#EC2.Instance.tags
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ec2.html#EC2.Instance.create_tags
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ec2.html#EC2.Instance.delete_tags
- https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/ec2.html#EC2.Tag.reload | AWS-Tags-As-A-DataBase | /AWS%20Tags%20As%20A%20DataBase-0.0.1.tar.gz/AWS Tags As A DataBase-0.0.1/README.md | README.md |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / (self.stdev * math.sqrt(2*math.pi))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
        axes[1].set_ylabel('Density')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-distributions | /AWS_distributions-0.1.tar.gz/AWS_distributions-0.1/AWS_distributions/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) number of trials
TODO: Fill out all functions below
"""
def __init__(self, prob=.5, size=20):
self.n = size
self.p = prob
Distribution.__init__(self, self.calculate_mean(), self.calculate_stdev())
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
self.mean = self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = math.sqrt(self.n * self.p * (1 - self.p))
return self.stdev
def replace_stats_with_data(self):
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
self.n = len(self.data)
self.p = 1.0 * sum(self.data) / len(self.data)
self.mean = self.calculate_mean()
self.stdev = self.calculate_stdev()
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.bar(x = ['0', '1'], height = [(1 - self.p) * self.n, self.p * self.n])
plt.title('Bar Chart of Data')
plt.xlabel('outcome')
plt.ylabel('count')
def pdf(self, k):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
a = math.factorial(self.n) / (math.factorial(k) * (math.factorial(self.n - k)))
b = (self.p ** k) * (1 - self.p) ** (self.n - k)
return a * b
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
x = []
y = []
# calculate the x values to visualize
for i in range(self.n + 1):
x.append(i)
y.append(self.pdf(i))
# make the plots
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
result = Binomial()
result.n = self.n + other.n
result.p = self.p
result.calculate_mean()
result.calculate_stdev()
return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
            string: characteristics of the Binomial
"""
return "mean {}, standard deviation {}, p {}, n {}".\
format(self.mean, self.stdev, self.p, self.n) | AWS-distributions | /AWS_distributions-0.1.tar.gz/AWS_distributions-0.1/AWS_distributions/Binomialdistribution.py | Binomialdistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Gaussian(Distribution):
""" Gaussian distribution class for calculating and
visualizing a Gaussian distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats extracted from the data file
"""
def __init__(self, mu=0, sigma=1):
Distribution.__init__(self, mu, sigma)
def calculate_mean(self):
"""Function to calculate the mean of the data set.
Args:
None
Returns:
float: mean of the data set
"""
avg = 1.0 * sum(self.data) / len(self.data)
self.mean = avg
return self.mean
def calculate_stdev(self, sample=True):
"""Function to calculate the standard deviation of the data set.
Args:
sample (bool): whether the data represents a sample or population
Returns:
float: standard deviation of the data set
"""
if sample:
n = len(self.data) - 1
else:
n = len(self.data)
mean = self.calculate_mean()
sigma = 0
for d in self.data:
sigma += (d - mean) ** 2
sigma = math.sqrt(sigma / n)
self.stdev = sigma
return self.stdev
def plot_histogram(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.hist(self.data)
plt.title('Histogram of Data')
plt.xlabel('data')
plt.ylabel('count')
def pdf(self, x):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
return (1.0 / (self.stdev * math.sqrt(2*math.pi))) * math.exp(-0.5*((x - self.mean) / self.stdev) ** 2)
def plot_histogram_pdf(self, n_spaces = 50):
"""Function to plot the normalized histogram of the data and a plot of the
probability density function along the same range
Args:
n_spaces (int): number of data points
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
mu = self.mean
sigma = self.stdev
min_range = min(self.data)
max_range = max(self.data)
# calculates the interval between x values
interval = 1.0 * (max_range - min_range) / n_spaces
x = []
y = []
# calculate the x values to visualize
for i in range(n_spaces):
tmp = min_range + interval*i
x.append(tmp)
y.append(self.pdf(tmp))
# make the plots
fig, axes = plt.subplots(2,sharex=True)
fig.subplots_adjust(hspace=.5)
axes[0].hist(self.data, density=True)
axes[0].set_title('Normed Histogram of Data')
axes[0].set_ylabel('Density')
axes[1].plot(x, y)
axes[1].set_title('Normal Distribution for \n Sample Mean and Sample Standard Deviation')
        axes[1].set_ylabel('Density')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Gaussian distributions
Args:
other (Gaussian): Gaussian instance
Returns:
Gaussian: Gaussian distribution
"""
result = Gaussian()
result.mean = self.mean + other.mean
result.stdev = math.sqrt(self.stdev ** 2 + other.stdev ** 2)
return result
def __repr__(self):
"""Function to output the characteristics of the Gaussian instance
Args:
None
Returns:
string: characteristics of the Gaussian
"""
return "mean {}, standard deviation {}".format(self.mean, self.stdev) | AWS-pypi-exercise | /AWS_pypi_exercise-0.1.tar.gz/AWS_pypi_exercise-0.1/AWS_pypi_exercise/Gaussiandistribution.py | Gaussiandistribution.py |
import math
import matplotlib.pyplot as plt
from .Generaldistribution import Distribution
class Binomial(Distribution):
""" Binomial distribution class for calculating and
visualizing a Binomial distribution.
Attributes:
mean (float) representing the mean value of the distribution
stdev (float) representing the standard deviation of the distribution
data_list (list of floats) a list of floats to be extracted from the data file
p (float) representing the probability of an event occurring
n (int) number of trials
TODO: Fill out all functions below
"""
def __init__(self, prob=.5, size=20):
self.n = size
self.p = prob
Distribution.__init__(self, self.calculate_mean(), self.calculate_stdev())
def calculate_mean(self):
"""Function to calculate the mean from p and n
Args:
None
Returns:
float: mean of the data set
"""
self.mean = self.p * self.n
return self.mean
def calculate_stdev(self):
"""Function to calculate the standard deviation from p and n.
Args:
None
Returns:
float: standard deviation of the data set
"""
self.stdev = math.sqrt(self.n * self.p * (1 - self.p))
return self.stdev
def replace_stats_with_data(self):
"""Function to calculate p and n from the data set
Args:
None
Returns:
float: the p value
float: the n value
"""
self.n = len(self.data)
self.p = 1.0 * sum(self.data) / len(self.data)
self.mean = self.calculate_mean()
self.stdev = self.calculate_stdev()
def plot_bar(self):
"""Function to output a histogram of the instance variable data using
matplotlib pyplot library.
Args:
None
Returns:
None
"""
plt.bar(x = ['0', '1'], height = [(1 - self.p) * self.n, self.p * self.n])
plt.title('Bar Chart of Data')
plt.xlabel('outcome')
plt.ylabel('count')
def pdf(self, k):
"""Probability density function calculator for the gaussian distribution.
Args:
x (float): point for calculating the probability density function
Returns:
float: probability density function output
"""
a = math.factorial(self.n) / (math.factorial(k) * (math.factorial(self.n - k)))
b = (self.p ** k) * (1 - self.p) ** (self.n - k)
return a * b
def plot_bar_pdf(self):
"""Function to plot the pdf of the binomial distribution
Args:
None
Returns:
list: x values for the pdf plot
list: y values for the pdf plot
"""
x = []
y = []
# calculate the x values to visualize
for i in range(self.n + 1):
x.append(i)
y.append(self.pdf(i))
# make the plots
plt.bar(x, y)
plt.title('Distribution of Outcomes')
plt.ylabel('Probability')
plt.xlabel('Outcome')
plt.show()
return x, y
def __add__(self, other):
"""Function to add together two Binomial distributions with equal p
Args:
other (Binomial): Binomial instance
Returns:
Binomial: Binomial distribution
"""
try:
assert self.p == other.p, 'p values are not equal'
except AssertionError as error:
raise
result = Binomial()
result.n = self.n + other.n
result.p = self.p
result.calculate_mean()
result.calculate_stdev()
return result
def __repr__(self):
"""Function to output the characteristics of the Binomial instance
Args:
None
Returns:
            string: characteristics of the Binomial
"""
return "mean {}, standard deviation {}, p {}, n {}".\
format(self.mean, self.stdev, self.p, self.n) | AWS-pypi-exercise | /AWS_pypi_exercise-0.1.tar.gz/AWS_pypi_exercise-0.1/AWS_pypi_exercise/Binomialdistribution.py | Binomialdistribution.py |
import boto3
import logging
import os
import json
import sys
import fcntl
from datetime import datetime
class SQSLogHandler(logging.Handler):
def __init__(self, queue, level):
logging.Handler.__init__(self, level)
self.queue = queue
def emit(self, record):
group = 'default'
if hasattr(record, 'name'):
group = record.name.replace(" ", "_")
self.queue.send_message(MessageBody=json.dumps(record.__dict__), MessageGroupId=group)
class SQSLogger:
def __init__(self, queue_name="Logs.fifo"):
self.queue = boto3.resource("sqs").get_queue_by_name(QueueName=queue_name)
handler = SQSLogHandler(self.queue, logging.INFO)
self.logger = logging.getLogger("SQS")
self.logger.setLevel(10)
self.logger.addHandler(handler)
self.formatter = logging.Formatter('[%(name)s] %(asctime)s - %(levelname)s: %(message)s', '%Y-%m-%d %H:%M:%S')
def consume(self, name=None):
print("Press Enter to stop consuming.")
fl = fcntl.fcntl(sys.stdin.fileno(), fcntl.F_GETFL) # Some magic that lets us continue reading until
fcntl.fcntl(sys.stdin.fileno(), fcntl.F_SETFL, fl | os.O_NONBLOCK) # the user presses enter
while True:
resp = self.queue.receive_messages(MaxNumberOfMessages=1, AttributeNames=['MessageGroupId'])
if len(resp) > 0:
message = resp[0]
record = json.loads(resp[0].body)
groupId = message.attributes['MessageGroupId']
# TODO: use logging formatter (self.formatter) rather than % formatting)
                # Print and delete when no group filter is given, or when the group matches
                if name is None or groupId == name.replace(" ", "_"):
                    print("[%s] %s - %s: %s" % (groupId,
                                                datetime.utcfromtimestamp(record['created']).strftime(
                                                    '%Y-%m-%d %H:%M:%S'),
                                                record['levelname'], record['msg']))
                    message.delete()
try:
if sys.stdin.read():
sys.stdout.write("\r")
break
except IOError:
pass
except TypeError:
pass
def purge(self):
self.queue.purge() | AWSCloudLogger | /AWSCloudLogger-0.0.2-py3-none-any.whl/SQSLogger/SQSLogger.py | SQSLogger.py |
AWS Deployment Pre-Processor Project
====================================
The AWS Deployment Pre-Processor Project exists to facilitate the use of the
Amazon Web Service (AWS) CodeDeploy service. AWS CodeDeploy has a variety of
requirements that can become stumbling blocks for inexperienced users.
AWSDeploy removes most of these issues by automating the CodeDeploy setup process.
AWSDeploy is a pre-processor for AWS CodeDeploy; it is by no means a replacement.
Its features include:
1) Transform the contents of a source directory into an AWS Deployment Package
2) Convert line delimiters to a value that is appropriate for the destination
3) (re-)Write an application specification in AWS CodeDeploy format based upon
the directory mappings found in profile.ini and the contents of the source
directory.
4) Zip the prepared deployment package into an archive
5) Upload the archive to S3 storage on AWS
6) Trigger the initial AWS CodeDeployment
7) Report the results
File Requirements:
1) Code Deploy
  1.1 appspec.yml
2) AWSDeploy
  2.1 profile.ini (see the example sketch below)
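
A minimal profile.ini sketch follows. Section and option names match what
Profile.py reads; every value shown is a placeholder, not a shipped default::

    [Source]
    path            = C:\work\JavaStack
    tomcat_config   = conf
    tomcat_content  = webapps
    apache_config   = conf.d
    apache_content  = html
    deploy_hooks    = scripts

    [CodeDeploy]
    dst_path        = /home/ec2-user
    region          = us-east-1
    bucket_regex    = codedeploy.*
    profile         = default
    log_group_name  = my-log-group
    log_stream_name = my-log-stream

    [Runtime]
    working_dir     = C:\temp
    log_max_lines   = 100
    sleepInterval   = 10
    blocking        = true
    logging         = true
    rewriteAppSpec  = true
    verbose         = false

To run a deployment, pass the profile path to the CodeDeploy entry point
(a sketch based on the ``__main__`` block in CodeDeploy.py)::

    python com/danielcreager/CodeDeploy.py profile.ini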
| AWSDeploy | /AWSDeploy-0.0.97.tar.gz/AWSDeploy-0.0.97/README.rst | README.rst |
__all__ = ['Profile']
'''
Copyright 2016 Daniel Ross Creager
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
#------------------------------------------------------------------------------#
# Class: Profile.py
# Written: November 2016
# Author: Daniel R Creager
# Desc: Supports user, application, and AWS Account customization
#
# Input: profile.ini
#------------------------------------------------------------------------------#
'''
import ConfigParser, os, sys, time, re
class Profile(object):
'''
Profile for an AWS Code Deployment
'''
def __init__(self, configFilePath):
'''
Constructor
'''
self.profilePath = lambda arg: self.path + "\\" + arg
self.configFilePath = configFilePath
self.config = ConfigParser.ConfigParser()
self.config.read(self.configFilePath)
self.config.sections()
section = self.configSectionMap("Source")
if ((type(section) != type(None)) and (len(section) > 0)):
self.path = section['path']
self.file_name = time.strftime("DeploymentPackage.%Y%m%d.zip")
self.tomcat_config = section['tomcat_config'].replace('\\','/').replace('/','') #: conf
self.tomcat_content = section['tomcat_content'].replace('\\','/').replace('/','') #: webapps
self.apache_config = section['apache_config'].replace('\\','/').replace('/','') #: conf.d
self.apache_content = section['apache_content'].replace('\\','/').replace('/','') #: html
self.deploy_hooks = section['deploy_hooks'].replace('\\','/').replace('/','') #: scripts
section = self.configSectionMap("CodeDeploy")
if ((type(section) != type(None)) and (len(section) > 0)):
self.dst_path = section['dst_path']
self.region = section['region']
self.bucket_regex = section['bucket_regex']
self.profile = section['profile']
self.log_group_name = section['log_group_name']
self.log_stream_name = section['log_stream_name']
section = self.configSectionMap("Runtime")
if ((type(section) != type(None)) and (len(section) > 0)):
self.working_dir = section['working_dir']
self.log_max_lines = section['log_max_lines']
self.sleepInterval = self.config.getint("Runtime","sleepInterval") # Seconds
self.blocking = self.config.getboolean("Runtime","blocking")
self.logging = self.config.getboolean("Runtime","logging")
self.rewriteAppSpec = self.config.getboolean("Runtime","rewriteAppSpec")
self.verbose = self.config.getboolean("Runtime","verbose")
self.renameSec = self.configSectionMap("Rename")
'''
Validate the names of directories
'''
msg="Directory mapping(%s) is inconsistent."
if not os.path.exists(self.profilePath(self.tomcat_config)):
raise Warning(msg % ('tomcat_config'))
if not os.path.exists(self.profilePath(self.tomcat_content)):
raise Warning(msg % ('tomcat_content'))
if not os.path.exists(self.profilePath(self.apache_config)):
raise Warning(msg % ('apache_config'))
if not os.path.exists(self.profilePath(self.apache_content)):
raise Warning(msg % ('apache_content'))
if not os.path.exists(self.profilePath(self.deploy_hooks)):
raise Warning(msg % ('deploy_hooks'))
def configSectionMap(self,section):
results = {}
try:
for option in self.config.options(section):
try:
results[option] = self.config.get(section, option)
if results[option] == -1:
print("skip: %s" % option)
except:
print("exception on %s!" % option)
results[option] = None
except ConfigParser.NoSectionError as ex1:
if section != 'Rename':
print "Abnormal Termination because %s" % (ex1)
sys.exit(-1)
return results
def rename(self, arg):
if ((type(self.renameSec) != type(None)) and (len(self.renameSec) > 0)):
#
# Check each rename within the profile.ini
#
for itm in self.renameSec.items():
arg = re.sub(itm[0],itm[1],arg)
return arg
if __name__ == "__main__":
pfl = Profile(sys.argv[1]) | AWSDeploy | /AWSDeploy-0.0.97.tar.gz/AWSDeploy-0.0.97/com/danielcreager/Profile.py | Profile.py |
__all__ = ['AppSpecFactory', 'AWSToolbox']
'''
Copyright 2016 Daniel Ross Creager
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
Class: AppSpecFactory.py
Written: October 2016
Author: Daniel R Creager
Desc: Generate a CodeDeploy Application Specification (appspec.yml)
'''
from os import walk
from com.danielcreager.Profile import Profile
import boto3, botocore, datetime, re, os, sys, time, zipfile
class AppSpecFactory(object):
'''
Generate a CodeDeploy Application Specification (appspec.yml).
The application specifications produced by this class are based upon specific
implied meanings of directory names within the deployment package.
Directory (Default Setup)
/ = files to copy into the 'home' directory
script = scripts to execute during the 'afterInstall' phase
conf.d = Apache configuration files
html = Apache Static Content
conf = Tomcat configuration files
webapps = Tomcat applications
Methods
    instanceOf()
        Builds the specification from the configured profile:
        profile.path    = the directory which holds the deployment package
        profile.verbose = True  -> generate an individual copy operation for each file
                          False -> (default) generate fewer directory-level copy operations
persist(dst, specStr)
dst = The location in which to place the appspec.yml file
specStr = application specification string
'''
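    # Illustrative output sketch: for a package containing webapps/ content and a
    # scripts/setup.sh hook, persist() writes an appspec.yml shaped roughly like
    # the following (exact lines and field order depend on profile.ini, the source
    # tree, and Python 2 dict ordering; this is not verbatim output):
    #
    #   version: 0.0
    #   os: linux
    #   files:
    #     - source: ./webapps/
    #       destination: /usr/share/tomcat7/webapps/
    #   permissions:
    #     - object: /var/log/httpd
    #       owner: root
    #       group: apache
    #       mode: 775
    #       type:
    #         - directory
    #   hooks:
    #     AfterInstall:
    #       - location: scripts/setup.sh
    #         timeout: 180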
def __init__(self, profile):
self.current_time = lambda: long(round(time.time() * 1000))
self.currTimeStamp = lambda : self.getTimeStamp(self.current_time())
self.getTemplate = lambda: {'version': 0.0, 'os': 'linux', 'files': [], 'permissions': [], 'hooks': []}
self.getDay = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%d')
self.getDateStamp = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%Y-%m-%d')
self.getTimeStamp = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%H:%M:%S.%f')[0:12]+' '
self.profile = profile
self.indent = ' ';
self.stdPath = lambda arg: arg.replace("\\\\","\\").replace("\\","/")
self.absPath = lambda arg: self.stdPath(profile.path + "\\" +arg)
self.relPath = lambda arg: '.\\' + arg + '\\'
self.relDstPath = lambda arg: arg + "/"
def dump(self, specStr, f):
if specStr.has_key('version'):
f.write("version: %s\n" % (specStr['version']))
f.write("os: %s\n" % (specStr['os']))
if specStr.has_key('files'):
f.write("files:\n")
for itm in specStr['files']:
self.dumpNode(itm.items(), 1, f)
if specStr.has_key('permissions'):
f.write("permissions:\n")
for itm in specStr['permissions']:
self.dumpPermission(itm.items(), 1, f)
if specStr.has_key('hooks'):
# Write the AfterInstall Hook
f.write("hooks:\n AfterInstall:\n")
try:
if len(specStr['hooks']) > 0:
for itm in specStr['hooks']:
self.dumpNode(itm.items(), 2, f)
except TypeError as ex1:
print("AmazonWebSrvc::dump TypeError. "), ex1
def dumpNode(self, items, indCnt, f):
dlmtr = '-'
for itm in items:
if type(itm[1]) == type(dict()):
f.write("%s%s %s:\n%s%s: %s\n" % (indCnt*self.indent, dlmtr, itm[0], indCnt*3*self.indent,
itm[1].items()[0][0], itm[1].items()[0][1]))
else:
f.write("%s%s %s: %s\n" % (indCnt*self.indent, dlmtr, itm[0], itm[1]))
dlmtr = ' '
def dumpPermission(self, items, indCnt, f):
for i in [2,0,3,4,1]:
if i == 1: # Type
f.write("%s%s:\n%s- %s\n" %
(indCnt*2*self.indent, items[i][0], indCnt*3*self.indent, items[i][1]))
else:
f.write("%s%s %s: %s\n" %
(indCnt*self.indent, ('-' if i==2 else ' '), items[i][0], items[i][1]))
def getFile(self, src, dst):
return {'source': src.replace("\\","/"),'destination': dst.replace("\\","/")}
def getPermission(self, obj, owner=None, group=None, mode=None, ltype=None):
result = {'object': obj}
if owner != None:
result.update({'owner': owner})
if group != None:
result.update({'group': group})
if mode != None:
result.update({'mode': mode})
if ltype != None:
result.update({'type': ltype})
return result
def getHook(self, loc, timeout=None, runas=None):
result = {'location': loc}
if timeout != None:
result.update({'timeout': timeout})
if runas != None:
result.update({'runas': runas})
return result
def instanceOf(self):
tmplt = self.getTemplate();
for (dirpath, dirnames, filenames) in walk(self.profile.path):
#------------------------------------------------------------------------------#
# Prepare the Hooks Section with Scripts to be Executed
#------------------------------------------------------------------------------#
if self.stdPath(dirpath) == self.absPath(self.profile.deploy_hooks):
after=[]
for itm in filenames:
after = self.getHook(self.relDstPath(self.profile.deploy_hooks) + itm, 180)
if len(after) > 0:
tmplt['hooks'].append(after)
else:
del tmplt['hooks']
#------------------------------------------------------------------------------#
# Prepare the Files Section for Apache configuration files
#------------------------------------------------------------------------------#
elif self.stdPath(dirpath).startswith(self.absPath(self.profile.apache_config)):
subDirPath = self.stdPath(dirpath[len(self.absPath(self.profile.apache_config))+1:])
subDirPath += '\\' if len(subDirPath) > 0 else ''
if self.profile.verbose:
for itm in filenames:
#tmplt['files'].append(self.getFile("conf.d\\" + subDirPath + itm,
tmplt['files'].append(self.getFile(self.profile.apache_config + '\\' + subDirPath + itm,
'/usr/local/etc/httpd/conf.d/'+subDirPath))
else:
#tmplt['files'].append(self.getFile(".\conf.d\\",'/usr/local/etc/httpd/conf.d/'))
tmplt['files'].append(self.getFile(self.relPath(self.profile.apache_config),
'/usr/local/etc/httpd/conf.d/'))
#------------------------------------------------------------------------------#
# Prepare the Files Section for Tomcat configuration files
#------------------------------------------------------------------------------#
elif self.stdPath(dirpath).startswith(self.absPath(self.profile.tomcat_config)):
subDirPath = self.stdPath(dirpath[len(self.absPath(self.profile.tomcat_config))+1:])
subDirPath += '\\' if len(subDirPath) > 0 else ''
if self.profile.verbose:
for itm in filenames:
#tmplt['files'].append(self.getFile("conf\\" + subDirPath + itm,'/usr/share/tomcat7/conf/'+subDirPath))
tmplt['files'].append(self.getFile(self.profile.tomcat_config + '\\' + subDirPath + itm,
'/usr/share/tomcat7/conf/'+subDirPath))
else:
#tmplt['files'].append(self.getFile(".\conf\\",'/usr/share/tomcat7/conf/'))
tmplt['files'].append(self.getFile(self.relPath(self.profile.tomcat_config),
'/usr/share/tomcat7/conf/'))
#------------------------------------------------------------------------------#
# Prepare the Files Section for Apache static content files
#------------------------------------------------------------------------------#
elif self.stdPath(dirpath).startswith(self.absPath(self.profile.apache_content)):
subDirPath = self.stdPath(dirpath[len(self.absPath(self.profile.apache_content))+1:])
subDirPath += '\\' if len(subDirPath) > 0 else ''
if self.profile.verbose:
for itm in filenames:
#tmplt['files'].append(self.getFile("conf\\" + subDirPath + itm,'/usr/share/tomcat7/conf/'+subDirPath))
tmplt['files'].append(self.getFile(self.profile.apache_content + '\\' + subDirPath + itm,
'/var/www/html/' + subDirPath))
else:
#tmplt['files'].append(self.getFile(".\conf\\",'/usr/share/tomcat7/conf/'))
tmplt['files'].append(self.getFile(self.relPath(self.profile.apache_content),
'/var/www/html/'))
#------------------------------------------------------------------------------#
# Prepare the Files Section for Tomcat Applications
#------------------------------------------------------------------------------#
elif self.stdPath(dirpath).startswith(self.absPath(self.profile.tomcat_content)):
subDirPath = self.stdPath(dirpath[len(self.absPath(self.profile.tomcat_content))+1:])
subDirPath += '\\' if len(subDirPath) > 0 else ''
if self.profile.verbose:
for itm in filenames:
#tmplt['files'].append(self.getFile("conf\\" + subDirPath + itm,'/usr/share/tomcat7/conf/'+subDirPath))
tmplt['files'].append(self.getFile(self.profile.tomcat_content + '\\' + subDirPath + itm,
'/usr/share/tomcat7/webapps/' + subDirPath))
else:
#tmplt['files'].append(self.getFile(".\conf\\",'/usr/share/tomcat7/conf/'))
tmplt['files'].append(self.getFile(self.relPath(self.profile.tomcat_content),
'/usr/share/tomcat7/webapps/'))
#------------------------------------------------------------------------------#
# Prepare the Files Section for root level files
#------------------------------------------------------------------------------#
else:
subDirPath = self.stdPath(dirpath[len(self.profile.path)+1:])
subDirPath += '\\' if len(subDirPath) > 0 else ''
if self.profile.verbose:
for itm in filenames:
                        if itm == 'profile.ini':  # Don't propagate the deploy configuration file
pass
else:
#tmplt['files'].append(self.getFile(subDirPath + itm, self.profile.dst_path + subDirPath))
tmplt['files'].append(self.getFile(subDirPath + itm,
self.profile.dst_path + '/' + subDirPath + itm))
else:
tmplt['files'].append(self.getFile('\\', self.profile.dst_path))
#------------------------------------------------------------------------------#
# Prepare the Permissions Section
#------------------------------------------------------------------------------#
tmplt['permissions'].append(self.getPermission('/var/log/httpd','root','apache',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/var/run','root','apache',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/var/www','root','apache',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/usr/local/etc/httpd','root','apache',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/var/log/tomcat7','root','tomcat',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/var/cache/tomcat7','root','tomcat',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/var/lib/tomcat7','root','tomcat',mode=775,ltype='directory'))
tmplt['permissions'].append(self.getPermission('/home/ec2-user','ec2-user','ec2-user',mode=775,ltype='directory'))
return tmplt
def persist(self, dst, specStr):
'''
Serialize the specStr out to the file system
'''
try:
with open(dst, 'w') as yamlFile:
#yaml.dump(specStr,yamlFile,default_flow_style=False)
self.dump(specStr, yamlFile)
yamlFile.close()
except IOError as ex1:
print '%s %s' % (self.currTimeStamp(),ex1)
class AWSToolBox(object):
'''
A collection of tools for working with the AWS Python SDK (boto3).
'''
def __init__(self,profile):
'''
Constructor
'''
self.currTimeStamp = lambda: self.getTimeStamp(self.current_time())
self.currDateStamp = lambda: self.getDateStamp(self.current_time())
self.current_time = lambda: long(round(time.time() * 1000))
self.getDay = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%d')
self.getDateStamp = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%Y-%m-%d')
self.getTimeStamp = lambda millis: datetime.datetime.fromtimestamp(millis / 1000.0).strftime('%H:%M:%S.%f')[0:12]+' '
self.profile = profile
self.deployment_time = 0L
self.lines = []
self.session = boto3.Session(profile_name=self.profile.profile)
self.cdClnt = self.session.client('codedeploy', self.profile.region)
self.s3Clnt = self.session.resource('s3', self.profile.region)
self.logClnt = self.session.client('logs', self.profile.region)
def condenseMsg(self, msg, indent):
'''
Condense long messages into a more compact linear format.
'''
wrk = msg.split()
newMsg = ''
# rearrange the whole lines
for i in range(0, len(wrk) / 10):
for j in range(i*10+0,i*10+10):
newMsg += (wrk[j] + " ")
newMsg += '\n' + indent
# add the last partial line
for k in range(j+1,j + len(wrk) % 10 + 1):
newMsg +=(wrk[k] + " ")
return newMsg
def convert_file(self, fileName, targetOS="lin"):
'''
Convert File from local line delimiters into Linux line delimiters.
'''
# read the file converting local new character to Unix newlines
with open(fileName, "U") as f:
self.lines = f.readlines()
#
# rewrite the file with the appropriate newline char at the end
# Note: This code assumes it is being run on a Windows OS
#
with open(fileName, "wb" if targetOS == "lin" else "w") as f:
for line in self.lines:
f.write(line)
return fileName
def create(self, srcPath, dstPath, fileName, targetOS="lin"):
'''
Creates a deployment package.
Features
- Recurse into subdirectories
- Convert line delimiters for the target Operating System
- Converts the deployment package into an archive (.zip)
'''
zfile = zipfile.ZipFile(dstPath + "/" + fileName, "w")
fileList = [os.path.join(dirpath, f) for dirpath,
dirnames,
files in os.walk(srcPath) for f in files]
for itm in fileList:
# insert a converted file with relative Path names
zfile.write(self.convert_file(itm,targetOS),
self.profile.rename(itm[len(srcPath):]),
zipfile.ZIP_DEFLATED)
zfile.close()
print("%s Created Deployment Package in AWS using %s"
% (self.currTimeStamp(), self.profile.working_dir + "/" + self.profile.file_name))
return
def deploy(self, s3Obj, profile, region, desc, targetOS="lin"):
'''
Create and run an S3 based deployment package.
'''
#
# Retrieve the Deployment Application and Group
#
resp = self.cdClnt.list_applications()
resp = self.cdClnt.list_deployment_groups(applicationName=resp['applications'][0])
try:
deployment_time = self.current_time()
print ("%s Requested deployment of %s from AWS S3(%s)." %
(self.getTimeStamp(deployment_time),s3Obj.key,s3Obj.bucket_name))
resp = self.cdClnt.create_deployment(
applicationName=resp['applicationName'],
deploymentGroupName=resp['deploymentGroups'][0],
revision={
'revisionType': 'S3',
's3Location': {
'bucket': s3Obj.bucket_name,
'key': s3Obj.key,
'bundleType': 'zip',
'version': s3Obj.version_id,
'eTag': s3Obj.e_tag
}
},
deploymentConfigName='CodeDeployDefault.OneAtATime',
description=desc,
ignoreApplicationStopFailures=False,
autoRollbackConfiguration={
'enabled': True,
'events': ['DEPLOYMENT_FAILURE']
},
updateOutdatedInstancesOnly=False
)
except botocore.exceptions.ClientError as ex1:
resp = ex1.response
if ex1.response['Error']['Code'] == 'DeploymentLimitExceededException':
print ('%s Specified deployment Group is currently busy! - Please try again later.\n'
% (self.currTimeStamp()))
else:
print ("%s %s" % (self.currTimeStamp(),ex1.response['Error']['Message']))
return resp
def download(self, profile, region, bucketRegEx, path, fileName):
'''
Download a deployment package from AWS.
'''
p = re.compile(bucketRegEx, re.IGNORECASE)
# locate the specified Bucket and upload into it
try:
for bucket in self.s3Clnt.buckets.all():
match = re.search(p,bucket.name,0)
if match:
self.s3Clnt.Bucket(bucket.name).download_file(fileName, path + "/" + fileName)
print "%s Downloaded AWS S3(%s) to %s" % (self.currTimeStamp(), bucket.name, path + "/" + fileName)
except botocore.exceptions.ClientError as ex1:
if ex1.response['Error']['Code'] == 'ExpiredToken':
print("%s Abnormal Termination! %s\n\t\tPlease run CCHelper and try again." %
(self.currTimeStamp(), ex1.response['Error']['Message']))
sys.exit()
else:
print ex1.response
def getLogEvents(self, grpName,strmName, maxLines, profile, region):
'''
Retrieve the log entries from CloudWatch.
'''
log = ''
#
# Retrieve the Custom Log entries
#
rsp = self.logClnt.get_log_events(
logGroupName=grpName,
logStreamName=strmName,
limit=maxLines,
startFromHead=False
)
#
# Format the Custom Log entries
#
if len(rsp['events']) > 0:
prevDay = rsp['events'][0]['timestamp']
log = "Date: %s\n" % (self.getDateStamp(prevDay))
for i in range(0,len(rsp['events'])):
today = rsp['events'][i]['timestamp']
if (self.getDay(today) != self.getDay(prevDay)):
log += "\nDate: %s\n" % (self.getDateStamp(today))
prevDay = today
log += ("%s %s\n" % (self.getTimeStamp(today),
rsp['events'][i]['message']))
return log
def upload(self, profile, region, bucketRegEx, path, fileName):
'''
Upload the new deployment package to AWS.
'''
s3Obj = None
data = open(path + "/" + fileName, 'rb')
p = re.compile(bucketRegEx, re.IGNORECASE)
# locate the specified Bucket and upload into it
try:
print "%s Commencing Upload to AWS ..." % (self.currTimeStamp())
for bucket in self.s3Clnt.buckets.all():
match = re.search(p,bucket.name,0)
if match:
s3Obj = self.s3Clnt.Bucket(bucket.name).put_object(Key=fileName, Body=data)
print "%s Uploaded %s to AWS S3(%s)" % (self.currTimeStamp(), fileName, bucket.name)
data.close()
if s3Obj == None:
print "%s Unable to locate an S3 bucket using %s. Uploading %s failed." % (self.currTimeStamp(), bucketRegEx, fileName)
except botocore.exceptions.ClientError as ex1:
if ex1.response['Error']['Code'] == 'ExpiredToken':
print("%s Abnormal Termination! %s\n\t\tPlease run CCHelper and try again." % (self.currTimeStamp(), ex1.response['Error']['Message']))
sys.exit()
else:
print ex1.response
return s3Obj
def waitForCompletion(self, rsp, profile, region):
result = False
try:
# Block execution until the deployment completes
print ("%s Waiting for completion ..." % (self.currTimeStamp()))
self.cdClnt.get_waiter('deployment_successful').wait(deploymentId=rsp['deploymentId'])
result = True
except botocore.exceptions.WaiterError as ex1:
print "%s The requested deployment failed!\n" % (self.currTimeStamp())
if (('deploymentInfo' in ex1.last_response) == True):
print "\t\tApplication:\t%s" % (ex1.last_response['deploymentInfo']['applicationName'])
print "\t\tVersion:\t%s" % (ex1.last_response['deploymentInfo']['revision']['s3Location']['version'])
print "\t\tBucket:\t\t%s" % (ex1.last_response['deploymentInfo']['revision']['s3Location']['bucket'])
print "\t\tObject:\t\t%s" % (ex1.last_response['deploymentInfo']['revision']['s3Location']['key'])
print "\t\teTag:\t\t%s\n" % (ex1.last_response['deploymentInfo']['revision']['s3Location']['eTag'])
if (('errorInformation' in ex1.last_response['deploymentInfo']) == True ):
print "\t\tError"
print ("\t\tCode:\t\t%s" %
(ex1.last_response['deploymentInfo']['errorInformation']['code']))
print ("\t\tMessage:\t%s\n" %
(self.condenseMsg(ex1.last_response['deploymentInfo']['errorInformation']['message'],'\t\t\t\t')))
if (('rollbackInfo' in ex1.last_response['deploymentInfo']) == True):
print "\t\tRollBack"
print ("\t\tMessage:\t%s\n" %
(self.condenseMsg(ex1.last_response['deploymentInfo']['rollbackInfo']['rollbackMessage'],'\t\t\t\t')))
return result
if __name__ == "__main__":
fact = AppSpecFactory(Profile(sys.argv[1]))
prod = 'C:\Users\Z8364A\Downloads\JavaStack\\appspec.yml'
devl = 'appspec.yml'
fact.persist(devl,fact.instanceOf()) | AWSDeploy | /AWSDeploy-0.0.97.tar.gz/AWSDeploy-0.0.97/com/danielcreager/AmazonWebSrvc.py | AmazonWebSrvc.py |
__all__ = ['AWSToolBox']
'''
Copyright 2016 Daniel Ross Creager
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.
@author: Daniel R Creager
Created on Nov 1, 2016
'''
import os, sys
from time import sleep
from com.danielcreager.Profile import Profile
from com.danielcreager.AmazonWebSrvc import AWSToolBox
from com.danielcreager.AmazonWebSrvc import AppSpecFactory
# type(s3Obj) == <class 'boto3.resources.factory.s3.Object'>
# type(resp) == <type 'dict'>
class ToolBox(object):
'''
A collection of tools for using AWS CodeDeploy in Python.
'''
def __init__(self,profile):
'''
Constructor
'''
self.profile = profile
self.tb = AWSToolBox(profile)
self.s3Obj = None
self.isSuccess = lambda rsp: (('Error' in rsp.keys()) == False)
self.isFailure = lambda rsp: (('Error' in rsp.keys()) == True )
# Download the deployment package
def downloadPkg(self):
        '''
        Download the deployment package from S3 into the working directory.
        '''
print "Downloading %s" % (self.profile.working_dir + self.profile.file_name)
self.tb.download(self.profile.profile, self.profile.region, self.profile.bucket_reqex,
self.profile.working_dir, self.profile.file_name)
def makePkg(self):
'''
Construct a deployment package.
'''
#------------------------------------------------------------------------------#
# OnAppSpecMissing: Dynamically generate one
#------------------------------------------------------------------------------#
filePath = self.profile.path + '/appspec.yml'
# if indicated delete the appspec.yml and regenerate every time.
if self.profile.rewriteAppSpec == True:
if os.path.exists(filePath):
print "%s Removing %s" % (self.tb.currTimeStamp(),filePath)
os.remove(filePath)
if os.path.exists(filePath) == False:
f = AppSpecFactory(self.profile)
f.persist(filePath,f.instanceOf())
print "%s Generated Application Spec: %s" % (self.tb.currTimeStamp(),filePath)
#------------------------------------------------------------------------------#
# Prepare the Deployment Package
#------------------------------------------------------------------------------#
self.tb.create(self.profile.path,self.profile.working_dir,self.profile.file_name)
#------------------------------------------------------------------------------#
# Transfer the Deployment Package to AWS
#------------------------------------------------------------------------------#
self.s3Obj = self.tb.upload(self.profile.profile, self.profile.region, self.profile.bucket_regex,
self.profile.working_dir, self.profile.file_name)
#------------------------------------------------------------------------------#
# Remove the Deployment Package locally
#------------------------------------------------------------------------------#
print "%s Removed %s" % (self.tb.currTimeStamp(),
self.profile.working_dir + "/" + self.profile.file_name)
os.remove(self.profile.working_dir + "/" + self.profile.file_name)
return self.s3Obj
def printHeader(self):
'''
Print a Runtime Header.
'''
print ("Command:\tdeploy\nParameters:\t%s\nPath:\t\t%s\nPackage:\t%s\nRegion:\t\t%s\n"
"Bucket:\t\t%s\nProfile:\t%s\nRun Date:\t%s\nRun Mode:\t%s\n\nRun Log"
% (self.profile.configFilePath, self.profile.path, self.profile.file_name, self.profile.region,
self.profile.bucket_regex, self.profile.profile, self.tb.currDateStamp(),
("Rewrite_AppSpec" if self.profile.rewriteAppSpec else "Use_Existing_AppSpec")
+ (" Wait_for_Completion" if self.profile.blocking else "")
+ (" Retrieve_App_Logs" if self.profile.logging else "")))
print("%s Initiated deployment request using %s."
% (self.tb.currTimeStamp(),self.profile.path))
def printLog(self):
'''
Print out the custom log entries.
'''
print "Log Entries"
        sleep(self.profile.sleepInterval)  # Time in seconds
print self.tb.getLogEvents(self.profile.log_group_name,self.profile.log_stream_name,
self.profile.log_max_lines,self.profile.profile, self.profile.region)
def runPkg(self):
'''
Run the deployment package.
'''
#------------------------------------------------------------------------------#
# Trigger Deployment of the new Package
#------------------------------------------------------------------------------#
resp = self.tb.deploy(self.s3Obj, self.profile.profile,
self.profile.region, 'Deploy Automation')
if self.isSuccess(resp) and self.profile.blocking:
if self.tb.waitForCompletion(resp, self.profile.profile, self.profile.region):
print("%s Deployment using %s completed successfully."
% (self.tb.currTimeStamp(), self.profile.file_name))
else:
if self.profile.logging:
self.printLog()
print("%s Deployment using %s terminated abnormally."
% (self.tb.currTimeStamp(),self.profile.file_name))
elif self.isSuccess(resp):
print("%s Deployment request for %s submitted successfully."
% (self.tb.currTimeStamp(), self.profile.file_name))
else:
if self.profile.logging:
                self.printLog()
print("%s Deployment request for %s terminated abnormally."
% (self.tb.currTimeStamp(),self.profile.file_name))
return resp
if __name__ == "__main__":
#------------------------------------------------------------------------------#
# Create and execute an application deployment in AWS CodeDeploy on Linux
#------------------------------------------------------------------------------#
tool = ToolBox(Profile(sys.argv[1]))
tool.printHeader();
s3Obj = tool.makePkg()
if s3Obj != None: # OnS3ObjMissing: Skip further processing
tool.runPkg() | AWSDeploy | /AWSDeploy-0.0.97.tar.gz/AWSDeploy-0.0.97/com/danielcreager/CodeDeploy.py | CodeDeploy.py |
# AWS Gateway Client
# Overview
# Installation
With pip
```
pip install apigateway_client
```
GitHub
```
git clone [email protected]:iamjohnnym/apigateway_client.git
```
# Usage
```
from apigateway_client import Client
api_client = Client(
api_gateway_id="gateway_id",
region="us-east-1",
stage='develop',
endpoint_key='pets',
account_id='account-id',
role='role_name',
role_session_name='unittest'
)
response = api_client.get(endpoint_meta='pet')
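
# Hedged variant: Client also accepts an api_key, in which case get() calls the
# endpoint with an "x-api-key" header instead of assuming an IAM role.
keyed_client = Client(
    api_gateway_id="gateway_id",
    region="us-east-1",
    stage="develop",
    endpoint_key="pets",
    api_key="your-api-key"
)
response = keyed_client.get(endpoint_meta="pet")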
``` | AWSGateway-Client | /AWSGateway-Client-0.1.tar.gz/AWSGateway-Client-0.1/README.rst | README.rst |
from awsrequests import AwsRequester
import requests
import boto3
class Client():
def __init__(self, api_gateway_id, region, stage, endpoint_key,
account_id=None, role=None, role_session_name=None, api_key=None):
        '''
        Configure an API Gateway endpoint client. Credentials come either from
        an assumed IAM role (account_id/role/role_session_name) or an api_key.
        '''
self.api_gateway_id = api_gateway_id
self.region = region
self.stage = stage
self.endpoint_key = endpoint_key
self.account_id = account_id
self.role = role
self.role_session_name = role_session_name
self.assume = None
if self.role:
self.role_arn = self.__set_role_arn()
self.api_key = api_key
def __build_url(self, endpoint_meta=None):
'''
Build apigateway URL based on arguments provided
'''
if not endpoint_meta:
return "https://{api_gateway_id}.execute-api.{region}.amazonaws.com/{stage}/{endpoint_key}".format(
api_gateway_id=self.api_gateway_id,
region=self.region,
stage=self.stage,
endpoint_key=self.endpoint_key
)
return "https://{api_gateway_id}.execute-api.{region}.amazonaws.com/{stage}/{endpoint_key}/{endpoint_meta}".format(
api_gateway_id=self.api_gateway_id,
region=self.region,
stage=self.stage,
endpoint_key=self.endpoint_key,
endpoint_meta=endpoint_meta
)
def __set_temp_creds(self, assume=None):
if not assume:
assume = self.assume
_temp_creds = dict(
aws_access_key_id=assume['Credentials']['AccessKeyId'],
aws_secret_access_key=assume['Credentials']['SecretAccessKey'],
aws_session_token=assume['Credentials']['SessionToken']
)
return _temp_creds
def __set_role_arn(self):
return "arn:aws:iam::{account_id}:role/{role}".format(
account_id=self.account_id,
role=self.role
)
def __create_client(self):
return boto3.client('sts')
def __assume_role(self):
sts = self.__create_client()
assume = sts.assume_role(
RoleArn=self.__set_role_arn(),
RoleSessionName=self.role_session_name
)
return self.__set_temp_creds(assume)
def get(self, region='us-east-1', endpoint_meta=None):
if not self.api_key:
temp_creds = self.__assume_role()
api_gateway = AwsRequester(
region,
*(temp_creds[key] for key in ['aws_access_key_id',
'aws_secret_access_key', 'aws_session_token'])
)
if endpoint_meta:
return api_gateway.get(self.__build_url(endpoint_meta))
return api_gateway.get(self.__build_url())
else:
headers = {"content-type": "application/json","x-api-key": self.api_key}
if endpoint_meta:
return requests.get(self.__build_url(endpoint_meta), headers=headers)
return requests.get(self.__build_url(), headers=headers) | AWSGateway-Client | /AWSGateway-Client-0.1.tar.gz/AWSGateway-Client-0.1/apigateway_client/__init__.py | __init__.py |
WAIT_TIME = 1
READY_SESSION_STATUS = "READY"
PROVISIONING_SESSION_STATUS = "PROVISIONING"
NOT_FOUND_SESSION_STATUS = "NOT_FOUND"
FAILED_SESSION_STATUS = "FAILED"
UNHEALTHY_SESSION_STATUS = [NOT_FOUND_SESSION_STATUS, FAILED_SESSION_STATUS]
ERROR_STATEMENT_STATUS = "ERROR"
CANCELLED_STATEMENT_STATUS = "CANCELLED"
AVAILABLE_STATEMENT_STATUS = "AVAILABLE"
FINAL_STATEMENT_STATUS = [ERROR_STATEMENT_STATUS, CANCELLED_STATEMENT_STATUS, AVAILABLE_STATEMENT_STATUS]
CELL_MAGICS = {"%%configure", "%%sql"}
HELP_TEXT = f'''
Available Magic Commands:
%%configure | Dictionary | A json-formatted dictionary consisting of all configuration parameters for a session. Each parameter can be specified here or through individual magics.
%profile | String | Specify a profile in your aws configuration to use as the credentials provider.
%iam_role | String | Specify an IAM role to execute your session with. | Default from ~/.aws/configure
%region | String | Specify the AWS region in which to initialize a session | Default from ~/.aws/configure
%max_capacity | Float | The number of AWS Glue data processing units (DPUs) that can be allocated.
%number_of_workers | int | The number of workers of a defined worker_type that are allocated when a job runs. worker_type must be set too.
%worker_type | String | Standard, G.1X, or G.2X. number_of_workers must be set too.
%endpoint | String | Define a custom glue endpoint url.
%new_session | Delete the current session and start a new session.
%list_sessions | Lists all currently running sessions by name and ID.
%session_id | String | Returns the session ID for the running session.
%status | Returns the status of the current Glue session including its duration, configuration and executing user / role.
%terminate_session | Terminates the current session, kills the cluster. User stops being charged.
%connections | List | Specify a comma separated list of connections to use in the session.
%additional_python_modules | List | Comma separated list of additional Python modules to include in your cluster (can be from Pypi or S3).
%extra_py_files | List | Comma separated list of additional Python files From S3.
%extra_jars | List | Comma separated list of additional Jars to include in the cluster.
''' | AWSGlueInteractiveSessionsKernel | /AWSGlueInteractiveSessionsKernel-0.4-py3-none-any.whl/aws_glue_interactive_sessions_kernel/glue_scala_kernel/constants.py | constants.py |
try:
from asyncio import Future
except ImportError:
class Future(object):
"""A class nothing will use."""
import json
import time
import traceback
from collections import defaultdict
from datetime import datetime
import boto3
import botocore
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
from dateutil.tz import tzlocal
from .constants import *
from hdijupyterutils.ipythondisplay import IpythonDisplay
from ipykernel.ipkernel import IPythonKernel
from IPython import get_ipython
from .KernelMagics import KernelMagics
class GlueKernel(IPythonKernel):
time_out = 90
session_id = None
glue_client = None
implementation = 'Scala Glue Session'
implementation_version = '1.0'
language = 'no-op'
language_version = '0.1'
language_info = {
'name': 'scala',
'mimetype': 'text/x-scala',
'codemirror_mode': 'text/x-scala',
'pygments_lexer': 'scala'
}
session_language = "scala"
ipython_display = None
def __init__(self, **kwargs):
super(GlueKernel, self).__init__(**kwargs)
self.glue_role_arn = None
self.profile = None
self.endpoint_url = None
self.region = None
self.default_arguments = {
"--session-language": "scala",
"--enable-glue-datacatalog": "true"
}
self.enable_glue_datacatalog = None
self.extra_py_files = None
self.extra_jars = None
self.additional_python_modules = None
self.connections = defaultdict()
self.security_config = None
# TODO: What is a good name for this?
self.session_name = "AssumeRoleSession"
self.max_capacity = None
self.number_of_workers = 5
self.worker_type = 'G.1X'
self.temp_dir = None
if not self.ipython_display:
self.ipython_display = IpythonDisplay()
self._register_magics()
def do_execute(self, code: str, silent: bool, store_history=True, user_expressions=None, allow_stdin=False):
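# Execution flow: run any magics first, then lazily create the Glue
# client/session if needed, submit the remaining code as a statement,
# and poll until it reaches a final status or the kernel times out.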
code = self._execute_magics(code, silent, store_history, user_expressions, allow_stdin)
statement_id = None
if not code:
return self._complete_cell()
# Create glue client and session
try:
if not self.glue_client:
# Attempt to retrieve default profile if a profile is not already set
if not self.get_profile() and botocore.session.Session().full_config['profiles'].get('default'):
self.set_profile('default')
self.glue_client = self.authenticate(glue_role_arn=self.get_glue_role_arn(), profile=self.get_profile())
if not self.session_id or self.get_current_session_status() in UNHEALTHY_SESSION_STATUS:
self.create_session()
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while creating session: {e} \n')
self._print_traceback(e)
return self._complete_cell()
try:
# Run statement
statement_id = self.glue_client.run_statement(SessionId=self.session_id, Code=code)["Id"]
start_time = time.time()
try:
while time.time() - start_time <= self.time_out:
statement = self.glue_client.get_statement(SessionId=self.session_id, Id=statement_id)["Statement"]
if statement["State"] in FINAL_STATEMENT_STATUS:
statement_output = statement["Output"]
status = statement["State"]
reply_content = {
"execution_count": statement["Id"],
'user_expressions': {},
"payload": []
}
if status == AVAILABLE_STATEMENT_STATUS:
if statement_output["Status"] == "ok":
reply_content["status"] = u'ok'
self._send_output(statement_output["Data"]["TextPlain"])
else:
reply_content["status"] = u'error'
reply_content.update({
u'traceback': statement_output["Traceback"],
u'ename': statement_output["ErrorName"],
u'evalue': statement_output["ErrorValue"],
})
self._send_output(f"{statement_output['ErrorName']}: {statement_output['ErrorValue']}")
elif status == ERROR_STATEMENT_STATUS:
self.ipython_display.send_error(statement_output)
elif status == CANCELLED_STATEMENT_STATUS:
self._send_output("This statement is cancelled")
return reply_content
time.sleep(WAIT_TIME)
self.ipython_display.send_error(f"Timeout occurred with statement (statement_id={statement_id})")
except KeyboardInterrupt:
self._send_output(
f"Execution Interrupted. Attempting to cancel the statement (statement_id={statement_id})"
)
self._cancel_statement(statement_id)
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while running statement: {e} \n')
self._print_traceback(e)
self._cancel_statement(statement_id)
return self._complete_cell()
def authenticate(self, glue_role_arn=None, profile=None):
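# Builds a Glue client whose credentials auto-refresh via STS, using either
# a named profile or ambient environment credentials.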
# either an IAM role for Glue must be provided or a profile must be set
if not glue_role_arn and not profile:
raise ValueError(f'Neither glue_role_arn nor profile were provided')
# region must be set
if not self.get_region():
raise ValueError(f'Region must be set.')
# Default to the regional Glue endpoint when a custom endpoint has not been set
if not self.get_endpoint_url():
self.set_endpoint_url(f"https://glue.{self.get_region()}.amazonaws.com")
if glue_role_arn:
self.set_glue_role_arn(glue_role_arn)
if profile:
return self._authenticate_with_profile()
else:
self._send_output(
f'Authenticating with environment variables and user-defined glue_role_arn: {glue_role_arn}')
self.sts_client = boto3.Session().client('sts')
session_credentials = RefreshableCredentials.create_from_metadata(
metadata=self._refresh(),
refresh_using=self._refresh,
method="sts-assume-role",
)
session = get_session()
session._credentials = session_credentials
session.set_config_variable("region", self.get_region())
autorefresh_session = boto3.Session(botocore_session=session)
return autorefresh_session.client("glue",
endpoint_url=self.get_endpoint_url())
def _authenticate_with_profile(self):
self._send_output(f'Authenticating with profile={self.get_profile()}')
if self.get_profile() not in boto3.session.Session().available_profiles:
raise ValueError(f'Profile {self.get_profile()} not defined in config')
custom_role_arn = self._retrieve_from_aws_config('glue_role_arn')
# Check if a glue_role_arn is defined in the profile and a custom glue_role_arn hasn't been defined
if not self.get_glue_role_arn() and custom_role_arn is not None:
self._send_output(f'glue_role_arn retrieved from profile: {custom_role_arn}')
self.set_glue_role_arn(custom_role_arn)
else:
if self.get_glue_role_arn() is not None:
self._send_output(f'glue_role_arn defined by user: {self.get_glue_role_arn()}')
else:
raise ValueError(f'glue_role_arn not present in profile and was not defined by user')
self.sts_client = boto3.Session(profile_name=self.get_profile()).client('sts', region_name=self.get_region())
session_credentials = RefreshableCredentials.create_from_metadata(
metadata=self._refresh(),
refresh_using=self._refresh,
method="sts-assume-role",
)
session = get_session()
session._credentials = session_credentials
session.set_config_variable("region", self.get_region())
autorefresh_session = boto3.Session(botocore_session=session)
return autorefresh_session.client("glue",
endpoint_url=self.get_endpoint_url())
def _retrieve_from_aws_config(self, key):
custom_profile_session = botocore.session.Session(profile=self.get_profile())
return custom_profile_session.full_config['profiles'][self.get_profile()].get(key)
def _get_configs_from_profile(self):
if not self.get_region():
config_region = self._retrieve_from_aws_config('region')
if config_region:
self.set_region(config_region)
if not self.get_glue_role_arn():
config_glue_role_arn = self._retrieve_from_aws_config('glue_role_arn')
if config_glue_role_arn:
self.set_glue_role_arn(config_glue_role_arn)
def configure(self, configs_json):
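# configs_json is the body of a %%configure cell and is expected to be a single JSON object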
updated_configs = dict()
try:
configs = json.loads("[" + configs_json + "]")[0]
if 'profile' in configs:
self.set_profile(configs.get('profile'))
updated_configs['profile'] = configs.get('profile')
if 'endpoint' in configs:
self.set_endpoint_url(configs.get('endpoint'))
updated_configs['endpoint'] = configs.get('endpoint')
if 'region' in configs:
self.set_region(configs.get('region'))
updated_configs['region'] = configs.get('region')
if 'iam_role' in configs:
self.set_glue_role_arn(configs.get('iam_role'))
updated_configs['iam_role'] = configs.get('iam_role')
if 'max_capacity' in configs:
self.set_max_capacity(configs.get('max_capacity'))
updated_configs['max_capacity'] = configs.get('max_capacity')
if 'number_of_workers' in configs:
self.set_number_of_workers(configs.get('number_of_workers'))
updated_configs['number_of_workers'] = configs.get('number_of_workers')
if 'worker_type' in configs:
self.set_worker_type(configs.get('worker_type'))
updated_configs['worker_type'] = configs.get('worker_type')
if 'extra_py_files' in configs:
self.set_extra_py_files(configs.get('extra_py_files'))
updated_configs['extra_py_files'] = configs.get('extra_py_files')
if 'additional_python_modules' in configs:
self.set_additional_python_modules(configs.get('additional_python_modules'))
updated_configs['additional_python_modules'] = configs.get('additional_python_modules')
if 'extra_jars' in configs:
self.set_extra_jars(configs.get('extra_jars'))
updated_configs['extra_jars'] = configs.get('extra_jars')
if 'connections' in configs:
self.set_connections(configs.get('connections'))
updated_configs['connections'] = configs.get('connections')
if 'enable_glue_datacatalog' in configs:
self.set_enable_glue_datacatalog()
updated_configs['enable_glue_datacatalog'] = configs.get('enable_glue_datacatalog')
if 'security_config' in configs:
self.set_security_config(configs.get('security_config'))
updated_configs['security_config'] = configs.get('security_config')
if 'temp_dir' in configs:
self.set_temp_dir(configs.get('temp_dir'))
updated_configs['temp_dir'] = configs.get('temp_dir')
except Exception as e:
self.ipython_display.send_error(f'The following exception was encountered while parsing the configurations provided: {e} \n')
self._print_traceback(e)
if not updated_configs:
self.ipython_display.send_error("No valid configuration values were provided.")
else:
self._send_output(f'The following configurations have been updated: {updated_configs}')
def do_shutdown(self, restart):
self.delete_session()
return self._do_shutdown(restart)
def _do_shutdown(self, restart):
return super(GlueKernel, self).do_shutdown(restart)
def set_profile(self, profile):
self.profile = profile
# Pull in new configs from profile
self._get_configs_from_profile()
def set_glue_role_arn(self, glue_role_arn):
self.glue_role_arn = glue_role_arn
def get_profile(self):
return self.profile
def get_glue_role_arn(self):
return self.glue_role_arn
def get_sessions(self):
return self.glue_client.list_sessions()
def get_session_id(self):
return self.session_id
def set_session_id(self, session_id):
self.session_id = session_id
def set_endpoint_url(self, endpoint_url):
self.endpoint_url = endpoint_url
def get_endpoint_url(self):
return self.endpoint_url
def set_region(self, region):
self.region = region
def get_region(self):
return self.region
def get_default_arguments(self):
if self.get_enable_glue_datacatalog() is not None:
self.default_arguments['--enable-glue-datacatalog'] = self.get_enable_glue_datacatalog()
if self.get_extra_py_files() is not None:
self.default_arguments['--extra-py-files'] = self.get_extra_py_files()
if self.get_extra_jars() is not None:
self.default_arguments['--extra-jars'] = self.get_extra_jars()
if self.get_additional_python_modules() is not None:
self.default_arguments['--additional-python-modules'] = self.get_additional_python_modules()
if self.get_temp_dir() is not None:
self.default_arguments['--TempDir'] = self.get_temp_dir()
if self.default_arguments:
self._send_output(f'Applying the following default arguments:')
for arg, val in self.default_arguments.items():
self._send_output(f'{arg} {val}')
return self.default_arguments
def get_enable_glue_datacatalog(self):
return self.enable_glue_datacatalog
def set_enable_glue_datacatalog(self):
self.enable_glue_datacatalog = 'true'
def get_extra_py_files(self):
return self.extra_py_files
def set_extra_py_files(self, extra_py_files):
self.extra_py_files = extra_py_files
def get_extra_jars(self):
return self.extra_jars
def set_extra_jars(self, extra_jars):
self.extra_jars = extra_jars
def get_additional_python_modules(self):
return self.additional_python_modules
def set_additional_python_modules(self, modules):
self.additional_python_modules = modules
def get_connections(self):
return self.connections
def set_connections(self, connections):
self.connections["Connections"] = list(connections.split(','))
def get_session_name(self):
return self.session_name
def get_max_capacity(self):
return self.max_capacity
def set_max_capacity(self, max_capacity):
self.max_capacity = float(max_capacity)
self.number_of_workers = None
self.worker_type = None
def get_number_of_workers(self):
return self.number_of_workers
def set_number_of_workers(self, number_of_workers):
self.number_of_workers = int(number_of_workers)
self.max_capacity = None
def get_worker_type(self):
return self.worker_type
def set_worker_type(self, worker_type):
self.worker_type = worker_type
self.max_capacity = None
def get_security_config(self):
return self.security_config
def set_security_config(self, security_config):
self.security_config = security_config
def get_temp_dir(self):
return self.temp_dir
def set_temp_dir(self, temp_dir):
self.temp_dir = temp_dir
def disconnect(self):
if self.get_session_id():
session_id = self.get_session_id()
self.set_session_id(None)
self._send_output(f'Disconnected from session {session_id}')
else:
self.ipython_display.send_error(f'Not currently connected to a session. \n')
def reconnect(self, session_id):
if self.get_session_id():
self.disconnect()
self._send_output(f'Trying to connect to {session_id}')
self.set_session_id(session_id)
# Verify that this session exists.
try:
# TODO: create glue client if it doesn't exist
self.glue_client.get_session(Id=self.session_id)
self._send_output(f'Connected to {session_id}')
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while connecting to session: {e} \n')
self._print_traceback(e)
def _refresh(self):
# Refresh credentials by requesting a new session token from STS
params = {
# "RoleArn": self.get_glue_role_arn(),
# "RoleSessionName": self.get_session_name(),
"DurationSeconds": 3600,
}
response = self.sts_client.get_session_token(**params).get("Credentials")
credentials = {
"access_key": response.get("AccessKeyId"),
"secret_key": response.get("SecretAccessKey"),
"token": response.get("SessionToken"),
"expiry_time": response.get("Expiration").isoformat(),
}
return credentials
def create_session(self):
self._send_output("Trying to create a Glue session for the kernel")
if self.get_max_capacity() and (self.get_number_of_workers() and self.get_worker_type()):
raise ValueError(f'Either max_capacity or worker_type and number_of_workers must be set, but not both.')
additional_args = self._get_additional_arguments()
self.session_id = self.glue_client.create_session(
Role=self.get_glue_role_arn(),
DefaultArguments=self.get_default_arguments(),
Connections=self.get_connections(),
Command={
"Name": "glueetl",
"PythonVersion": "3"
},
**additional_args)["Session"]["Id"]
self._send_output(f'Waiting for session {self.session_id} to get into ready status...')
is_ready = False
start_time = time.time()
while time.time() - start_time <= self.time_out and not is_ready:
if self.get_current_session_status() == READY_SESSION_STATUS:
is_ready = True
time.sleep(WAIT_TIME)
if not is_ready:
self.ipython_display.send_error(f"Session failed to reach ready status in {self.time_out}s")
else:
self._send_output(f"Session {self.session_id} has been created")
def _get_additional_arguments(self):
additional_args = {}
if self.get_max_capacity():
additional_args['MaxCapacity'] = self.get_max_capacity()
if self.get_number_of_workers():
additional_args['NumberOfWorkers'] = self.get_number_of_workers()
if self.get_worker_type():
additional_args['WorkerType'] = self.get_worker_type()
if self.get_security_config():
additional_args['SecurityConfiguration'] = self.get_security_config()
return additional_args
def delete_session(self):
if self.session_id:
try:
self._send_output(f'Terminating session: {self.session_id}')
# TODO: how do we delete session if our security token expires?
self.glue_client.delete_session(Id=self.session_id)
self.glue_client = None
self.session_id = None
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while terminating session {self.session_id}: {e} \n')
self._print_traceback(e)
def _cancel_statement(self, statement_id: str):
if not statement_id:
return
try:
self.glue_client.cancel_statement(SessionId=self.session_id, Id=statement_id)
start_time = time.time()
is_ready = False
while time.time() - start_time <= self.time_out and not is_ready:
status = self.glue_client.get_statement(SessionId=self.session_id, Id=statement_id)["Statement"]["State"]
if status == CANCELLED_STATEMENT_STATUS:
self._send_output(f"Statement {statement_id} has been cancelled")
is_ready = True
time.sleep(WAIT_TIME)
if not is_ready:
self.ipython_display.send_error(f"Failed to cancel the statement {statement_id}")
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while canceling statement {statement_id}: {e} \n')
self._print_traceback(e)
def get_current_session_status(self):
try:
return self.get_current_session()["Status"]
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session status \n')
def get_current_session_duration_in_seconds(self):
try:
time_in_seconds = datetime.now(tzlocal()) - self.get_current_session()["CreatedOn"]
return time_in_seconds.total_seconds()
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session duration \n')
def get_current_session_role(self):
try:
return self.get_current_session()["Role"]
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session role \n')
def get_current_session(self):
if self.session_id is None:
self.ipython_display.send_error(f'No current session.')
else:
try:
current_session = self.glue_client.get_session(Id=self.session_id)["Session"]
return NOT_FOUND_SESSION_STATUS if not current_session else current_session
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while retrieving session: {e} \n')
self._print_traceback(e)
def _send_output(self, output):
stream_content = {'name': 'stdout', 'text': f"{output}\n"}
self.send_response(self.iopub_socket, 'stream', stream_content)
def _do_execute(self, code, silent, store_history, user_expressions, allow_stdin):
res = self._execute_cell(code, silent, store_history, user_expressions, allow_stdin)
return res
def _execute_cell(self, code, silent, store_history=True, user_expressions=None, allow_stdin=False):
reply_content = self._execute_cell_for_user(code, silent, store_history, user_expressions, allow_stdin)
return reply_content
def _execute_cell_for_user(self, code, silent, store_history=True, user_expressions=None, allow_stdin=False):
result = super(GlueKernel, self).do_execute(code, silent, store_history, user_expressions, allow_stdin)
if isinstance(result, Future):
result = result.result()
return result
def _execute_magics(self, code, silent, store_history, user_expressions, allow_stdin):
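# Executes any magic lines at the top of the cell and returns the remaining
# code, or None if a cell magic consumed the entire cell.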
try:
magic_lines = 0
lines = code.splitlines()
for line in lines:
# If there is a cell magic, we simply treat all the remaining code as part of the cell magic
if any(line.startswith(cell_magic) for cell_magic in CELL_MAGICS):
code = '\n'.join(lines[magic_lines:])
self._do_execute(code, silent, store_history, user_expressions, allow_stdin)
return None
# If we encounter a line magic, we execute this line magic and continue
if line.startswith("%") or line.startswith("!"):
self._do_execute(line, silent, store_history, user_expressions, allow_stdin)
magic_lines += 1
# We ignore comments and empty lines
elif line.startswith("#") or not line:
magic_lines += 1
else:
break
code = '\n'.join(lines[magic_lines:])
return code
except Exception as e:
self.ipython_display.send_error(f'Exception encountered: {e} \n')
self._print_traceback(e)
return self._complete_cell()
def _complete_cell(self):
"""A method that runs a cell with no effect. Call this and return the value it
returns when there's some sort of error preventing the user's cell from executing; this
will register the cell from the Jupyter UI as being completed."""
return self._execute_cell("None", False, True, None, False)
def _register_magics(self):
ip = get_ipython()
magics = KernelMagics(ip, '', self)
ip.register_magics(magics)
def _print_traceback(self, e):
traceback.print_exception(type(e), e, e.__traceback__)
if __name__ == '__main__':
from ipykernel.kernelapp import IPKernelApp
IPKernelApp.launch_instance(kernel_class=GlueKernel) | AWSGlueInteractiveSessionsKernel | /AWSGlueInteractiveSessionsKernel-0.4-py3-none-any.whl/aws_glue_interactive_sessions_kernel/glue_scala_kernel/GlueKernel.py | GlueKernel.py |
from __future__ import print_function
from .constants import *
from IPython.core.magic import Magics, cell_magic, line_magic, magics_class
@magics_class
class KernelMagics(Magics):
def __init__(self, shell, data, kernel):
super(KernelMagics, self).__init__(shell)
self.data = data
self.kernel = kernel
@line_magic('iam_role')
def set_iam_role(self, glue_role_arn):
print(f'Current glue_role_arn is {self.kernel.get_glue_role_arn()}')
self.kernel.set_glue_role_arn(glue_role_arn)
print(f'IAM role has been set to {glue_role_arn}. Trying to re-authenticate.')
new_client = self.kernel.authenticate(glue_role_arn=glue_role_arn, profile=self.kernel.get_profile())
self.kernel.glue_client = new_client
self.kernel.create_session()
@line_magic('new_session')
def new_session(self, line=None):
self.kernel.delete_session()
print(f'Creating new session.')
new_client = self.kernel.authenticate(glue_role_arn=self.kernel.get_glue_role_arn(), profile=self.kernel.get_profile())
self.kernel.glue_client = new_client
self.kernel.create_session()
@line_magic('profile')
def set_profile(self, profile):
print(f'Previous profile: {self.kernel.get_profile()}')
print(f'Setting new profile to: {profile}')
self.kernel.set_profile(profile)
@line_magic('status')
def get_status(self, line=None):
status = self.kernel.get_current_session_status()
duration = self.kernel.get_current_session_duration_in_seconds()
role = self.kernel.get_current_session_role()
print(f'Status: {status}')
print(f'Duration: {duration} seconds')
print(f'Role: {role}')
@line_magic('list_sessions')
def list_sessions(self, line=None):
ids = self.kernel.get_sessions().get('Ids')
print(f'There are currently {len(ids)} active sessions:')
for id in ids:
print(id)
@line_magic('terminate_session')
def terminate_session(self, line=None):
self.kernel.delete_session()
print(f'Terminated session.')
@line_magic('session_id')
def get_session_id(self, line=None):
print(f'Current Session ID: {self.kernel.get_session_id()}')
@line_magic('enable_glue_datacatalog')
def set_enable_glue_datacatalog(self, line=None):
print("Enabling Glue DataCatalog")
self.kernel.set_enable_glue_datacatalog()
@line_magic('extra_py_files')
def set_extra_py_files(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_extra_py_files(line)
@line_magic('additional_python_modules')
def set_additional_python_modules(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_additional_python_modules(line)
@line_magic('extra_jars')
def set_extra_jars(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_extra_jars(line)
@line_magic('temp_dir')
def set_temp_dir(self, line=None):
print(f"Setting temporary directory to: {line}")
self.kernel.set_temp_dir(line)
@line_magic('connections')
def set_connections(self, line=None):
print("Adding the following:")
for connection in line.split(','):
print(connection)
self.kernel.set_connections(line)
@line_magic('endpoint')
def set_endpoint(self, line=None):
print(f'Previous endpoint: {self.kernel.get_endpoint_url()}')
print(f'Setting new endpoint to: {line}')
self.kernel.set_endpoint_url(line)
@line_magic('region')
def set_region(self, line=None):
print(f'Previous region: {self.kernel.get_region()}')
print(f'Setting new region to: {line}')
self.kernel.set_region(line)
@line_magic('max_capacity')
def set_max_capacity(self, line=None):
print(f'Previous max capacity: {self.kernel.get_max_capacity()}')
print(f'Setting new max capacity to: {float(line)}')
self.kernel.set_max_capacity(line)
@line_magic('number_of_workers')
def set_number_of_workers(self, line=None):
print(f'Previous number of workers: {self.kernel.get_number_of_workers()}')
print(f'Setting new number of workers to: {int(line)}')
self.kernel.set_number_of_workers(line)
@line_magic('worker_type')
def set_worker_type(self, line=None):
print(f'Previous worker type: {self.kernel.get_worker_type()}')
print(f'Setting new worker type to: {line}')
self.kernel.set_worker_type(line)
@line_magic('security_config')
def set_security_config(self, line=None):
print(f'Previous security_config: {self.kernel.get_security_config()}')
print(f'Setting new security_config to: {line}')
self.kernel.set_security_config(line)
@line_magic('disconnect')
def disconnect(self, line=None):
self.kernel.disconnect()
@line_magic('reconnect')
def reconnect(self, line=None):
self.kernel.reconnect(line)
@cell_magic('sql')
def run_sql(self, line=None, cell=None):
# Wrap the SQL in a triple-quoted Scala string so multi-line queries and embedded single quotes survive
if line == 'show':
code = f'spark.sql("""{cell.rstrip()}""").show()'
self.kernel.do_execute(code, False, True, None, False)
else:
code = f'spark.sql("""{cell.rstrip()}""")'
self.kernel.do_execute(code, False, True, None, False)
@cell_magic('configure')
def configure(self, line=None, cell=None):
self.kernel.configure(cell)
@line_magic('help')
def help(self, line=None):
print(HELP_TEXT) | AWSGlueInteractiveSessionsKernel | /AWSGlueInteractiveSessionsKernel-0.4-py3-none-any.whl/aws_glue_interactive_sessions_kernel/glue_scala_kernel/KernelMagics.py | KernelMagics.py |
WAIT_TIME = 1
READY_SESSION_STATUS = "READY"
PROVISIONING_SESSION_STATUS = "PROVISIONING"
NOT_FOUND_SESSION_STATUS = "NOT_FOUND"
FAILED_SESSION_STATUS = "FAILED"
UNHEALTHY_SESSION_STATUS = [NOT_FOUND_SESSION_STATUS, FAILED_SESSION_STATUS]
ERROR_STATEMENT_STATUS = "ERROR"
CANCELLED_STATEMENT_STATUS = "CANCELLED"
AVAILABLE_STATEMENT_STATUS = "AVAILABLE"
FINAL_STATEMENT_STATUS = [ERROR_STATEMENT_STATUS, CANCELLED_STATEMENT_STATUS, AVAILABLE_STATEMENT_STATUS]
CELL_MAGICS = {"%%configure", "%%sql"}
HELP_TEXT = f'''
Available Magic Commands:
%%configure | Dictionary | A json-formatted dictionary consisting of all configuration parameters for a session. Each parameter can be specified here or through individual magics.
%profile | String | Specify a profile in your aws configuration to use as the credentials provider.
%iam_role | String | Specify an IAM role to execute your session with. | Default from ~/.aws/configure
%region | String | Specify the AWS region in which to initialize a session | Default from ~/.aws/configure
%max_capacity | Float | The number of AWS Glue data processing units (DPUs) that can be allocated.
%number_of_workers | int | The number of workers of a defined worker_type that are allocated when a job runs. worker_type must be set too.
%worker_type | String | Standard, G.1X, or G.2X. number_of_workers must be set too.
%endpoint | String | Define a custom glue endpoint url.
%new_session | Delete the current session and start a new session.
%list_sessions | Lists all currently running sessions by name and ID.
%session_id | String | Returns the session ID for the running session.
%status | Returns the status of the current Glue session including its duration, configuration and executing user / role.
%terminate_session | Terminates the current session, kills the cluster. User stops being charged.
%connections | List | Specify a comma separated list of connections to use in the session.
%additional_python_modules | List | Comma separated list of additional Python modules to include in your cluster (can be from Pypi or S3).
%extra_py_files | List | Comma separated list of additional Python files From S3.
%extra_jars | List | Comma separated list of additional Jars to include in the cluster.
%temp_dir | String | Specify an S3 path for the --TempDir argument used by the session.
%security_config | String | Define a security configuration to be used with this session.
%enable_glue_datacatalog | Enables the Glue Data Catalog as the metastore for the session.
%disconnect | Disconnect the kernel from the current session without terminating it.
%reconnect | String | Reconnect the kernel to an existing session by its session ID.
%%sql | String | Run SQL code. The cell body is executed against the session's Spark SQL context.
%help | Print this help text.
'''
try:
from asyncio import Future
except ImportError:
class Future(object):
"""A class nothing will use."""
import json
import time
import traceback
from collections import defaultdict
from datetime import datetime
import boto3
import botocore
from botocore.credentials import RefreshableCredentials
from botocore.session import get_session
from dateutil.tz import tzlocal
from .constants import *
from hdijupyterutils.ipythondisplay import IpythonDisplay
from ipykernel.ipkernel import IPythonKernel
from IPython import get_ipython
from .KernelMagics import KernelMagics
class GlueKernel(IPythonKernel):
time_out = 90
session_id = None
glue_client = None
implementation = 'Python Glue Session'
implementation_version = '1.0'
language = 'no-op'
language_version = '0.1'
language_info = {
'name': 'Python_Glue_Session',
'mimetype': 'text/x-python',
'codemirror_mode': {'name': 'python', 'version': 3},
'pygments_lexer': 'python3',
'file_extension': '.py',
}
session_language = "python"
ipython_display = None
def __init__(self, **kwargs):
super(GlueKernel, self).__init__(**kwargs)
self.glue_role_arn = None
self.profile = None
self.endpoint_url = None
self.region = None
self.default_arguments = {
"--enable-glue-datacatalog": "true"
}
self.enable_glue_datacatalog = None
self.extra_py_files = None
self.extra_jars = None
self.additional_python_modules = None
self.connections = defaultdict()
self.security_config = None
# TODO: What is a good name for this?
self.session_name = "AssumeRoleSession"
self.max_capacity = None
self.number_of_workers = 5
self.worker_type = 'G.1X'
self.temp_dir = None
if not self.ipython_display:
self.ipython_display = IpythonDisplay()
self._register_magics()
def do_execute(self, code: str, silent: bool, store_history=True, user_expressions=None, allow_stdin=False):
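# Execution flow: run any magics first, then lazily create the Glue
# client/session if needed, submit the remaining code as a statement,
# and poll until it reaches a final status or the kernel times out.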
code = self._execute_magics(code, silent, store_history, user_expressions, allow_stdin)
statement_id = None
if not code:
return self._complete_cell()
# Create glue client and session
try:
if not self.glue_client:
# Attempt to retrieve default profile if a profile is not already set
if not self.get_profile() and botocore.session.Session().full_config['profiles'].get('default'):
self.set_profile('default')
self.glue_client = self.authenticate(glue_role_arn=self.get_glue_role_arn(), profile=self.get_profile())
if not self.session_id or self.get_current_session_status() in UNHEALTHY_SESSION_STATUS:
self.create_session()
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while creating session: {e} \n')
self._print_traceback(e)
return self._complete_cell()
try:
# Run statement
statement_id = self.glue_client.run_statement(SessionId=self.session_id, Code=code)["Id"]
start_time = time.time()
try:
while time.time() - start_time <= self.time_out:
statement = self.glue_client.get_statement(SessionId=self.session_id, Id=statement_id)["Statement"]
if statement["State"] in FINAL_STATEMENT_STATUS:
statement_output = statement["Output"]
status = statement["State"]
reply_content = {
"execution_count": statement["Id"],
'user_expressions': {},
"payload": []
}
if status == AVAILABLE_STATEMENT_STATUS:
if statement_output["Status"] == "ok":
reply_content["status"] = u'ok'
self._send_output(statement_output["Data"]["TextPlain"])
else:
reply_content["status"] = u'error'
reply_content.update({
u'traceback': statement_output["Traceback"],
u'ename': statement_output["ErrorName"],
u'evalue': statement_output["ErrorValue"],
})
self._send_output(f"{statement_output['ErrorName']}: {statement_output['ErrorValue']}")
elif status == ERROR_STATEMENT_STATUS:
self.ipython_display.send_error(statement_output)
elif status == CANCELLED_STATEMENT_STATUS:
self._send_output("This statement is cancelled")
return reply_content
time.sleep(WAIT_TIME)
self.ipython_display.send_error(f"Timeout occurred with statement (statement_id={statement_id})")
except KeyboardInterrupt:
self._send_output(
f"Execution Interrupted. Attempting to cancel the statement (statement_id={statement_id})"
)
self._cancel_statement(statement_id)
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while running statement: {e} \n')
self._print_traceback(e)
self._cancel_statement(statement_id)
return self._complete_cell()
def authenticate(self, glue_role_arn=None, profile=None):
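# Builds a Glue client whose credentials auto-refresh via STS, using either
# a named profile or ambient environment credentials.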
# either an IAM role for Glue must be provided or a profile must be set
if not glue_role_arn and not profile:
raise ValueError(f'Neither glue_role_arn nor profile were provided')
# region must be set
if not self.get_region():
raise ValueError(f'Region must be set.')
# Default to the regional Glue endpoint when a custom endpoint has not been set
if not self.get_endpoint_url():
self.set_endpoint_url(f"https://glue.{self.get_region()}.amazonaws.com")
if glue_role_arn:
self.set_glue_role_arn(glue_role_arn)
if profile:
return self._authenticate_with_profile()
else:
self._send_output(f'Authenticating with environment variables and user-defined glue_role_arn: {glue_role_arn}')
self.sts_client = boto3.Session().client('sts')
session_credentials = RefreshableCredentials.create_from_metadata(
metadata=self._refresh(),
refresh_using=self._refresh,
method="sts-assume-role",
)
session = get_session()
session._credentials = session_credentials
session.set_config_variable("region", self.get_region())
autorefresh_session = boto3.Session(botocore_session=session)
return autorefresh_session.client("glue",
endpoint_url=self.get_endpoint_url())
def _authenticate_with_profile(self):
self._send_output(f'Authenticating with profile={self.get_profile()}')
if self.get_profile() not in boto3.session.Session().available_profiles:
raise ValueError(f'Profile {self.get_profile()} not defined in config')
custom_role_arn = self._retrieve_from_aws_config('glue_role_arn')
# Check if a glue_role_arn is defined in the profile and a custom glue_role_arn hasn't been defined
if not self.get_glue_role_arn() and custom_role_arn is not None:
self._send_output(f'glue_role_arn retrieved from profile: {custom_role_arn}')
self.set_glue_role_arn(custom_role_arn)
else:
if self.get_glue_role_arn() is not None:
self._send_output(f'glue_role_arn defined by user: {self.get_glue_role_arn()}')
else:
raise ValueError(f'glue_role_arn not present in profile and was not defined by user')
self.sts_client = boto3.Session(profile_name=self.get_profile()).client('sts', region_name=self.get_region())
session_credentials = RefreshableCredentials.create_from_metadata(
metadata=self._refresh(),
refresh_using=self._refresh,
method="sts-assume-role",
)
session = get_session()
session._credentials = session_credentials
session.set_config_variable("region", self.get_region())
autorefresh_session = boto3.Session(botocore_session=session)
return autorefresh_session.client("glue",
endpoint_url=self.get_endpoint_url())
def _retrieve_from_aws_config(self, key):
custom_profile_session = botocore.session.Session(profile=self.get_profile())
return custom_profile_session.full_config['profiles'][self.get_profile()].get(key)
def _get_configs_from_profile(self):
# Only fall back to values from the AWS config profile when they are present and not already set
if not self.get_region():
config_region = self._retrieve_from_aws_config('region')
if config_region:
self.set_region(config_region)
if not self.get_glue_role_arn():
config_glue_role_arn = self._retrieve_from_aws_config('glue_role_arn')
if config_glue_role_arn:
self.set_glue_role_arn(config_glue_role_arn)
def configure(self, configs_json):
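# configs_json is the body of a %%configure cell and is expected to be a single JSON object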
updated_configs = dict()
try:
configs = json.loads("[" + configs_json + "]")[0]
if 'profile' in configs:
self.set_profile(configs.get('profile'))
updated_configs['profile'] = configs.get('profile')
if 'endpoint' in configs:
self.set_endpoint_url(configs.get('endpoint'))
updated_configs['endpoint'] = configs.get('endpoint')
if 'region' in configs:
self.set_region(configs.get('region'))
updated_configs['region'] = configs.get('region')
if 'iam_role' in configs:
self.set_glue_role_arn(configs.get('iam_role'))
updated_configs['iam_role'] = configs.get('iam_role')
if 'max_capacity' in configs:
self.set_max_capacity(configs.get('max_capacity'))
updated_configs['max_capacity'] = configs.get('max_capacity')
if 'number_of_workers' in configs:
self.set_number_of_workers(configs.get('number_of_workers'))
updated_configs['number_of_workers'] = configs.get('number_of_workers')
if 'worker_type' in configs:
self.set_worker_type(configs.get('worker_type'))
updated_configs['worker_type'] = configs.get('worker_type')
if 'extra_py_files' in configs:
self.set_extra_py_files(configs.get('extra_py_files'))
updated_configs['extra_py_files'] = configs.get('extra_py_files')
if 'additional_python_modules' in configs:
self.set_additional_python_modules(configs.get('additional_python_modules'))
updated_configs['additional_python_modules'] = configs.get('additional_python_modules')
if 'extra_jars' in configs:
self.set_extra_jars(configs.get('extra_jars'))
updated_configs['extra_jars'] = configs.get('extra_jars')
if 'connections' in configs:
self.set_connections(configs.get('connections'))
updated_configs['connections'] = configs.get('connections')
if 'enable_glue_datacatalog' in configs:
self.set_enable_glue_datacatalog()
updated_configs['enable_glue_datacatalog'] = configs.get('enable_glue_datacatalog')
if 'security_config' in configs:
self.set_security_config(configs.get('security_config'))
updated_configs['security_config'] = configs.get('security_config')
if 'temp_dir' in configs:
self.set_temp_dir(configs.get('temp_dir'))
updated_configs['temp_dir'] = configs.get('temp_dir')
except Exception as e:
self.ipython_display.send_error(f'The following exception was encountered while parsing the configurations provided: {e} \n')
self._print_traceback(e)
if not updated_configs:
self.ipython_display.send_error("No valid configuration values were provided.")
else:
self._send_output(f'The following configurations have been updated: {updated_configs}')
def do_shutdown(self, restart):
self.delete_session()
return self._do_shutdown(restart)
def _do_shutdown(self, restart):
return super(GlueKernel, self).do_shutdown(restart)
def set_profile(self, profile):
self.profile = profile
# Pull in new configs from profile
self._get_configs_from_profile()
def set_glue_role_arn(self, glue_role_arn):
self.glue_role_arn = glue_role_arn
def get_profile(self):
return self.profile
def get_glue_role_arn(self):
return self.glue_role_arn
def get_sessions(self):
return self.glue_client.list_sessions()
def get_session_id(self):
return self.session_id
def set_session_id(self, session_id):
self.session_id = session_id
def set_endpoint_url(self, endpoint_url):
self.endpoint_url = endpoint_url
def get_endpoint_url(self):
return self.endpoint_url
def set_region(self, region):
self.region = region
def get_region(self):
return self.region
def get_default_arguments(self):
if self.get_enable_glue_datacatalog() is not None:
self.default_arguments['--enable-glue-datacatalog'] = self.get_enable_glue_datacatalog()
if self.get_extra_py_files() is not None:
self.default_arguments['--extra-py-files'] = self.get_extra_py_files()
if self.get_extra_jars() is not None:
self.default_arguments['--extra-jars'] = self.get_extra_jars()
if self.get_additional_python_modules() is not None:
self.default_arguments['--additional-python-modules'] = self.get_additional_python_modules()
if self.get_temp_dir() is not None:
self.default_arguments['--TempDir'] = self.get_temp_dir()
if self.default_arguments:
self._send_output(f'Applying the following default arguments:')
for arg, val in self.default_arguments.items():
self._send_output(f'{arg} {val}')
return self.default_arguments
def get_enable_glue_datacatalog(self):
return self.enable_glue_datacatalog
def set_enable_glue_datacatalog(self):
self.enable_glue_datacatalog = 'true'
def get_extra_py_files(self):
return self.extra_py_files
def set_extra_py_files(self, extra_py_files):
self.extra_py_files = extra_py_files
def get_extra_jars(self):
return self.extra_jars
def set_extra_jars(self, extra_jars):
self.extra_jars = extra_jars
def get_additional_python_modules(self):
return self.additional_python_modules
def set_additional_python_modules(self, modules):
self.additional_python_modules = modules
def get_connections(self):
return self.connections
def set_connections(self, connections):
self.connections["Connections"] = list(connections.split(','))
def get_session_name(self):
return self.session_name
def get_max_capacity(self):
return self.max_capacity
def set_max_capacity(self, max_capacity):
self.max_capacity = float(max_capacity)
self.number_of_workers = None
self.worker_type = None
def get_number_of_workers(self):
return self.number_of_workers
def set_number_of_workers(self, number_of_workers):
self.number_of_workers = int(number_of_workers)
self.max_capacity = None
def get_worker_type(self):
return self.worker_type
def set_worker_type(self, worker_type):
self.worker_type = worker_type
self.max_capacity = None
def get_security_config(self):
return self.security_config
def set_security_config(self, security_config):
self.security_config = security_config
def get_temp_dir(self):
return self.temp_dir
def set_temp_dir(self, temp_dir):
self.temp_dir = temp_dir
def disconnect(self):
if self.get_session_id():
session_id = self.get_session_id()
self.set_session_id(None)
self._send_output(f'Disconnected from session {session_id}')
else:
self.ipython_display.send_error(f'Not currently connected to a session. \n')
def reconnect(self, session_id):
if self.get_session_id():
self.disconnect()
self._send_output(f'Trying to connect to {session_id}')
self.set_session_id(session_id)
# Verify that this session exists.
try:
# TODO: create glue client if it doesn't exist
self.glue_client.get_session(Id=self.session_id)
self._send_output(f'Connected to {session_id}')
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while connecting to session: {e} \n')
self._print_traceback(e)
def _refresh(self):
# Refresh credentials by requesting a new session token from STS
params = {
# "RoleArn": self.get_glue_role_arn(),
# "RoleSessionName": self.get_session_name(),
"DurationSeconds": 3600,
}
response = self.sts_client.get_session_token(**params).get("Credentials")
credentials = {
"access_key": response.get("AccessKeyId"),
"secret_key": response.get("SecretAccessKey"),
"token": response.get("SessionToken"),
"expiry_time": response.get("Expiration").isoformat(),
}
return credentials
def create_session(self):
self._send_output("Trying to create a Glue session for the kernel")
if self.get_max_capacity() and (self.get_number_of_workers() and self.get_worker_type()):
raise ValueError(f'Either max_capacity or worker_type and number_of_workers must be set, but not both.')
additional_args = self._get_additional_arguments()
self.session_id = self.glue_client.create_session(
Role=self.get_glue_role_arn(),
DefaultArguments=self.get_default_arguments(),
Connections=self.get_connections(),
Command={
"Name": "glueetl",
"PythonVersion": "3"
},
**additional_args)["Session"]["Id"]
self._send_output(f'Waiting for session {self.session_id} to get into ready status...')
is_ready = False
start_time = time.time()
while time.time() - start_time <= self.time_out and not is_ready:
if self.get_current_session_status() == READY_SESSION_STATUS:
is_ready = True
time.sleep(WAIT_TIME)
if not is_ready:
self.ipython_display.send_error(f"Session failed to reach ready status in {self.time_out}s")
else:
self._send_output(f"Session {self.session_id} has been created")
def _get_additional_arguments(self):
additional_args = {}
if self.get_max_capacity():
additional_args['MaxCapacity'] = self.get_max_capacity()
if self.get_number_of_workers():
additional_args['NumberOfWorkers'] = self.get_number_of_workers()
if self.get_worker_type():
additional_args['WorkerType'] = self.get_worker_type()
if self.get_security_config():
additional_args['SecurityConfiguration'] = self.get_security_config()
return additional_args
def delete_session(self):
if self.session_id:
try:
self._send_output(f'Terminating session: {self.session_id}')
# TODO: how do we delete session if our security token expires?
self.glue_client.delete_session(Id=self.session_id)
self.glue_client = None
self.session_id = None
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while terminating session {self.session_id}: {e} \n')
self._print_traceback(e)
def _cancel_statement(self, statement_id: str):
if not statement_id:
return
try:
self.glue_client.cancel_statement(SessionId=self.session_id, Id=statement_id)
start_time = time.time()
is_ready = False
while time.time() - start_time <= self.time_out and not is_ready:
status = self.glue_client.get_statement(SessionId=self.session_id, Id=statement_id)["Statement"]["State"]
if status == CANCELLED_STATEMENT_STATUS:
self._send_output(f"Statement {statement_id} has been cancelled")
is_ready = True
time.sleep(WAIT_TIME)
if not is_ready:
self.ipython_display.send_error(f"Failed to cancel the statement {statement_id}")
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while canceling statement {statement_id}: {e} \n')
self._print_traceback(e)
def get_current_session_status(self):
try:
return self.get_current_session()["Status"]
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session status \n')
def get_current_session_duration_in_seconds(self):
try:
time_in_seconds = datetime.now(tzlocal()) - self.get_current_session()["CreatedOn"]
return time_in_seconds.total_seconds()
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session duration \n')
def get_current_session_role(self):
try:
return self.get_current_session()["Role"]
except Exception as e:
self.ipython_display.send_error(f'Failed to retrieve session role \n')
def get_current_session(self):
if self.session_id is None:
self.ipython_display.send_error(f'No current session.')
else:
try:
current_session = self.glue_client.get_session(Id=self.session_id)["Session"]
return NOT_FOUND_SESSION_STATUS if not current_session else current_session
except Exception as e:
self.ipython_display.send_error(f'Exception encountered while retrieving session: {e} \n')
self._print_traceback(e)
def _send_output(self, output):
stream_content = {'name': 'stdout', 'text': f"{output}\n"}
self.send_response(self.iopub_socket, 'stream', stream_content)
def _do_execute(self, code, silent, store_history, user_expressions, allow_stdin):
res = self._execute_cell(code, silent, store_history, user_expressions, allow_stdin)
return res
def _execute_cell(self, code, silent, store_history=True, user_expressions=None, allow_stdin=False):
reply_content = self._execute_cell_for_user(code, silent, store_history, user_expressions, allow_stdin)
return reply_content
def _execute_cell_for_user(self, code, silent, store_history=True, user_expressions=None, allow_stdin=False):
result = super(GlueKernel, self).do_execute(code, silent, store_history, user_expressions, allow_stdin)
if isinstance(result, Future):
result = result.result()
return result
def _execute_magics(self, code, silent, store_history, user_expressions, allow_stdin):
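# Executes any magic lines at the top of the cell and returns the remaining
# code, or None if a cell magic consumed the entire cell.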
try:
magic_lines = 0
lines = code.splitlines()
for line in lines:
# If there is a cell magic, we simply treat all the remaining code as part of the cell magic
if any(line.startswith(cell_magic) for cell_magic in CELL_MAGICS):
code = '\n'.join(lines[magic_lines:])
self._do_execute(code, silent, store_history, user_expressions, allow_stdin)
return None
# If we encounter a line magic, we execute this line magic and continue
if line.startswith("%") or line.startswith("!"):
self._do_execute(line, silent, store_history, user_expressions, allow_stdin)
magic_lines += 1
# We ignore comments and empty lines
elif line.startswith("#") or not line:
magic_lines += 1
else:
break
code = '\n'.join(lines[magic_lines:])
return code
except Exception as e:
self.ipython_display.send_error(f'Exception encountered: {e} \n')
self._print_traceback(e)
return self._complete_cell()
def _complete_cell(self):
"""A method that runs a cell with no effect. Call this and return the value it
returns when there's some sort of error preventing the user's cell from executing; this
will register the cell from the Jupyter UI as being completed."""
return self._execute_cell("None", False, True, None, False)
def _register_magics(self):
ip = get_ipython()
magics = KernelMagics(ip, '', self)
ip.register_magics(magics)
def _print_traceback(self, e):
traceback.print_exception(type(e), e, e.__traceback__)
if __name__ == '__main__':
from ipykernel.kernelapp import IPKernelApp
IPKernelApp.launch_instance(kernel_class=GlueKernel) | AWSGlueInteractiveSessionsKernel | /AWSGlueInteractiveSessionsKernel-0.4-py3-none-any.whl/aws_glue_interactive_sessions_kernel/glue_python_kernel/GlueKernel.py | GlueKernel.py |
from __future__ import print_function
from .constants import *
from IPython.core.magic import Magics, cell_magic, line_magic, magics_class
@magics_class
class KernelMagics(Magics):
def __init__(self, shell, data, kernel):
super(KernelMagics, self).__init__(shell)
self.data = data
self.kernel = kernel
@line_magic('iam_role')
def set_iam_role(self, glue_role_arn):
print(f'Current glue_role_arn is {self.kernel.get_glue_role_arn()}')
self.kernel.set_glue_role_arn(glue_role_arn)
print(f'IAM role has been set to {glue_role_arn}. Trying to re-authenticate.')
new_client = self.kernel.authenticate(glue_role_arn=glue_role_arn, profile=self.kernel.get_profile())
self.kernel.glue_client = new_client
self.kernel.create_session()
@line_magic('new_session')
def new_session(self, line=None):
self.kernel.delete_session()
print(f'Creating new session.')
new_client = self.kernel.authenticate(glue_role_arn=self.kernel.get_glue_role_arn(), profile=self.kernel.get_profile())
self.kernel.glue_client = new_client
self.kernel.create_session()
@line_magic('profile')
def set_profile(self, profile):
print(f'Previous profile: {self.kernel.get_profile()}')
print(f'Setting new profile to: {profile}')
self.kernel.set_profile(profile)
@line_magic('status')
def get_status(self, line=None):
status = self.kernel.get_current_session_status()
duration = self.kernel.get_current_session_duration_in_seconds()
role = self.kernel.get_current_session_role()
print(f'Status: {status}')
print(f'Duration: {duration} seconds')
print(f'Role: {role}')
@line_magic('list_sessions')
def list_sessions(self, line=None):
ids = self.kernel.get_sessions().get('Ids')
print(f'There are currently {len(ids)} active sessions:')
for id in ids:
print(id)
@line_magic('terminate_session')
def terminate_session(self, line=None):
self.kernel.delete_session()
print(f'Terminated session.')
@line_magic('session_id')
def get_session_id(self, line=None):
print(f'Current Session ID: {self.kernel.get_session_id()}')
@line_magic('enable_glue_datacatalog')
def set_enable_glue_datacatalog(self, line=None):
print("Enabling Glue DataCatalog")
self.kernel.set_enable_glue_datacatalog()
@line_magic('extra_py_files')
def set_extra_py_files(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_extra_py_files(line)
@line_magic('additional_python_modules')
def set_additional_python_modules(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_additional_python_modules(line)
@line_magic('extra_jars')
def set_extra_jars(self, line=None):
print("Adding the following:")
for s3_path in line.split(','):
print(s3_path)
self.kernel.set_extra_jars(line)
@line_magic('temp_dir')
def set_temp_dir(self, line=None):
print(f"Setting temporary directory to: {line}")
self.kernel.set_temp_dir(line)
@line_magic('connections')
def set_connections(self, line=None):
print("Adding the following:")
for connection in line.split(','):
print(connection)
self.kernel.set_connections(line)
@line_magic('endpoint')
def set_endpoint(self, line=None):
print(f'Previous endpoint: {self.kernel.get_endpoint_url()}')
print(f'Setting new endpoint to: {line}')
self.kernel.set_endpoint_url(line)
@line_magic('region')
def set_region(self, line=None):
print(f'Previous region: {self.kernel.get_region()}')
print(f'Setting new region to: {line}')
self.kernel.set_region(line)
@line_magic('max_capacity')
def set_max_capacity(self, line=None):
print(f'Previous max capacity: {self.kernel.get_max_capacity()}')
print(f'Setting new max capacity to: {float(line)}')
self.kernel.set_max_capacity(line)
@line_magic('number_of_workers')
def set_number_of_workers(self, line=None):
print(f'Previous number of workers: {self.kernel.get_number_of_workers()}')
print(f'Setting new number of workers to: {int(line)}')
self.kernel.set_number_of_workers(line)
@line_magic('worker_type')
def set_worker_type(self, line=None):
print(f'Previous worker type: {self.kernel.get_worker_type()}')
print(f'Setting new worker type to: {line}')
self.kernel.set_worker_type(line)
@line_magic('security_config')
def set_security_config(self, line=None):
print(f'Previous security_config: {self.kernel.get_security_config()}')
print(f'Setting new security_config to: {line}')
self.kernel.set_security_config(line)
@line_magic('disconnect')
def disconnect(self, line=None):
self.kernel.disconnect()
@line_magic('reconnect')
def reconnect(self, line=None):
self.kernel.reconnect(line)
@cell_magic('sql')
def run_sql(self, line=None, cell=None):
# Wrap the SQL in a triple-quoted string so multi-line queries and embedded single quotes survive
if line == 'show':
code = f'spark.sql("""{cell.rstrip()}""").show()'
self.kernel.do_execute(code, False, True, None, False)
else:
code = f'spark.sql("""{cell.rstrip()}""")'
self.kernel.do_execute(code, False, True, None, False)
@cell_magic('configure')
def configure(self, line=None, cell=None):
self.kernel.configure(cell)
@line_magic('help')
def help(self, line=None):
print(HELP_TEXT) | AWSGlueInteractiveSessionsKernel | /AWSGlueInteractiveSessionsKernel-0.4-py3-none-any.whl/aws_glue_interactive_sessions_kernel/glue_python_kernel/KernelMagics.py | KernelMagics.py |
##########################################
AWS IoT Device Defender Agent SDK (Python)
##########################################
Example implementation of an AWS IoT Device Defender metrics collection agent,
and other Device Defender Python samples.
The provided sample agent can be used as a basis to implement a custom metrics collection agent.
*************
Prerequisites
*************
Minimum System Requirements
===========================
The following requirements are shared with the `AWS IoT Device SDK for Python <https://github.com/aws/aws-iot-device-sdk-python>`_:
- Python 3.5+ for X.509 certificate-based mutual authentication via port 8883 and MQTT over WebSocket protocol with AWS Signature Version 4 authentication
- Python 3.5+ for X.509 certificate-based mutual authentication via port 443
- OpenSSL version 1.0.1+ (TLS version 1.2) compiled with the Python executable for X.509 certificate-based mutual authentication
Connect your Device to AWS IoT
==============================
If you have never connected your device to AWS IoT before, please follow the
`Getting Started with AWS IoT <https://docs.aws.amazon.com/iot/latest/developerguide/iot-gs.html>`_
Guide. Make sure you note the location of your certificates, you will
need to provide the location of these to the Device Defender Sample
Agent.
****************************************
Notes on the sample agent implementation
****************************************
**client id**: The sample agent requires a client id, which it also uses as the thing name. This is only for the sake of making the sample easy to get started with. To customize this behavior, you can modify the way the agent generates the MQTT topic for publishing metrics reports so that it uses a value other than the client id as the thing-name portion of the topic.
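
As a rough sketch of such a customization (the `build_metrics_topic` helper and `thing_name` value are hypothetical; the reserved Device Defender topic format is `$aws/things/<thingName>/defender/metrics/<format>`):

.. code:: python

    def build_metrics_topic(thing_name, report_format="json"):
        # Reserved AWS IoT Device Defender topic for publishing metrics
        # reports; report_format is "json" or "cbor".
        return "$aws/things/{}/defender/metrics/{}".format(thing_name, report_format)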
**metric selection**: The sample agent attempts to gather all supported Device Defender metrics. Depending on your platform requirements and use case, you may wish to customize your agent to collect only a subset of the metrics.
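
For example, an agent trimmed down to just listening TCP ports could gather that single metric with `psutil` (the dependency the sample agent already uses); this sketch is illustrative and does not show the sample agent's own collector API:

.. code:: python

    import psutil

    def listening_tcp_ports():
        # Collect only the listening TCP ports metric, rather than every
        # supported Device Defender metric.
        ports = set()
        for conn in psutil.net_connections(kind="tcp"):
            if conn.status == psutil.CONN_LISTEN and conn.laddr:
                ports.add(conn.laddr.port)
        return sorted(ports)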
**********
Quickstart
**********
Installation
============
#. Clone the repository
.. code:: bash
git clone https://github.com/aws-samples/aws-iot-device-defender-agent-sdk-python.git
#. Install using pip
Pip is the easiest way to install the sample agent; it will take care of installing dependencies.
.. code:: bash
pip install /path/to/sample/package
Running the Sample Agent
========================
.. code:: bash
python agent.py --endpoint <your.custom.endpoint.amazonaws.com> --rootCA </path/to/rootca> --cert </path/to/cert> --key <path/to/key> --format json -i 300 -id <ThingName>
Command line options
--------------------
To see a summary of all command line options:
.. code:: bash
python agent.py --help
Test Metrics Collection Locally
-------------------------------
.. code:: bash
python collector.py -n 1 -s 1
Custom Metric Integration
=========================
The sample agent has a flag that allows it to publish custom metrics:
.. code:: bash
python agent.py --include-custom-metrics --endpoint <your.custom.endpoint.amazonaws.com> --rootCA </path/to/rootca> --cert </path/to/cert> --key <path/to/key> --format json -i 300 -id <ThingName>
This flag will tell the agent to publish the custom metric `cpu_usage`, a `number` (float) representing the current CPU usage as a percentage. How this looks in the generated report can be seen in the sample report below.
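
Collecting such a value is straightforward with `psutil`. A minimal sketch of how the metric could be gathered and shaped for the custom metrics section of a report (the surrounding report-assembly code is omitted, and the exact layout should be checked against the sample report below):

.. code:: python

    import psutil

    def cpu_usage_custom_metric():
        # Sample CPU utilization over a one-second window, as a percentage.
        usage = psutil.cpu_percent(interval=1)
        # "number" custom metrics are reported as a list of single-key
        # objects keyed by the metric name.
        return {"cpu_usage": [{"number": usage}]}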
We can run the command below to create the `cpu_usage` custom metric.
.. code:: bash
aws iot create-custom-metric --metric-name "cpu_usage" --metric-type "number" --client-request-token "access-test" --region us-east-1
After creating this custom metric, you will be able to create security profiles that use it.
.. code:: bash
aws iot create-security-profile \
--security-profile-name CpuUsageIssue \
--security-profile-description "High-Cpu-Usage" \
--behaviors "[{\"name\":\"great-than-75\",\"metric\":\"cpu_usage\",\"criteria\":{\"comparisonOperator\":\"greater-than\",\"value\":{\"count\":75},\"consecutiveDatapointsToAlarm\":5,\"consecutiveDatapointsToClear\":1}}]" \
--region us-east-1
******************************
AWS IoT Greengrass Integration
******************************
Overview
========
AWS IoT Device Defender can be used in conjunction with AWS Greengrass.
Integration follows the standard Greengrass lambda deployment model,
making it easy to add AWS IoT Device Defender security to your
Greengrass Core devices.
Prerequisites
=============
#. `Greengrass environment setup <https://docs.aws.amazon.com/greengrass/latest/developerguide/module1.html>`__
#. `Greengrass core configured and running <https://docs.aws.amazon.com/greengrass/latest/developerguide/module2.html>`__
#. Ensure you can successfully deploy and run a lambda on your core
Using Device Defender with Greengrass Core devices
==================================================
You can deploy a Device Defender agent to your Greengrass core in two ways:
#. Using the pre-built Greengrass Device Defender Connector (*recommended*)
#. Create a lambda package manually
Using Greengrass Connector
--------------------------
The Device Defender Greengrass Connector provides the most streamlined and automated means of deploying the Device Defender agent to your
Greengrass core, and is the recommended method of using Device Defender with Greengrass.
For detailed information about using Greengrass Connectors see `Getting Started with Greengrass Connectors <https://docs.aws.amazon.com/greengrass/latest/developerguide/connectors-console.html>`__
For information about configuring the Device Defender Connector see `Device Defender Connector Details <https://docs.aws.amazon.com/greengrass/latest/developerguide/device-defender-connector.html>`__
#. Create a local resource to allow your lambda to collect metrics from the Greengrass Core host
* Follow the instructions `here <https://docs.aws.amazon.com/greengrass/latest/developerguide/access-local-resources.html>`__
* Use the following parameters:
* **Resource Name:** ``Core_Proc``
* **Type:** ``Volume``
* **Source Path:** ``/proc``
* **Destination Path:** ``/host_proc`` (make sure the same value is configured for the PROCFS_PATH environment variable above)
* **Group owner file access permission:** "Automatically add OS group permissions of the Linux group that owns the resource"
* Associate the resource with your metrics lambda
#. From the detail page of your Greengrass Group, click "Connectors" in the left-hand menu
#. Click the "Add a Connector" button
#. In the "Select a connector" screen, select the "Device Defender" connector from the list, click "Next"
#. On the "Configure parameters" screen, select the resource you created in Step 1, in the "Resource for /proc" box
#. In the "Metrics reporting interval" box, enter 300, or larger if you wish to use a longer reporting interval
#. Click the "add" button
#. `Deploy your connector to your Greengrass Group <https://docs.aws.amazon.com/greengrass/latest/developerguide/configs-core.html>`__
Create Your Lambda Package Manually
-----------------------------------
For this portion, we will be following the general process outlined
`here <https://docs.aws.amazon.com/greengrass/latest/developerguide/create-lambda.html/>`__
**Note:** Due to platform-specific binary extensions in the psutil package, this process should be performed on the platform where you
plan to deploy your lambda.
#. Clone the AWS IoT Device Defender Python Samples Repository
.. code:: bash
git clone https://github.com/aws-samples/aws-iot-device-defender-agent-sdk-python.git
#. Create, and activate a virtual environment (optional, recommended)
.. code:: bash
pip install virtualenv
virtualenv metrics_lambda_environment
source metrics_lambda_environment/bin/activate
#. Install the AWS IoT Device Defender sample agent in the virtual
environment. Install from PyPI:
.. code:: bash
pip install AWSIoTDeviceDefenderAgentSDK
Or install from downloaded source:
.. code:: bash
cd aws-iot-device-defender-agent-sdk-python
#This must be run from the same directory as setup.py
pip install .
#. Create an empty directory to assemble your lambda; we will refer to
this as your "lambda directory"
.. code:: bash
mkdir metrics_lambda
cd metrics_lambda
#. Complete steps 1-4 from this
`guide <https://docs.aws.amazon.com/greengrass/latest/developerguide/create-lambda.html>`__
#. Unzip the Greengrass python sdk into your lambda directory
.. code:: bash
unzip ../aws_greengrass_core_sdk/sdk/python_sdk_1_1_0.zip
cp -R ../aws_greengrass_core_sdk/examples/HelloWorld/greengrass_common .
cp -R ../aws_greengrass_core_sdk/examples/HelloWorld/greengrasssdk .
cp -R ../aws_greengrass_core_sdk/examples/HelloWorld/greengrass_ipc_python_sdk .
#. Copy the AWSIoTDeviceDefenderAgentSDK module to the root level of
your lambda
.. code:: bash
cp -R ../aws-iot-device-defender-agent-sdk-python/AWSIoTDeviceDefenderAgentSDK .
#. Copy the Greengrass agent to the root level of your lambda directory
.. code:: bash
cp ../aws-iot-device-defender-agent-sdk-python/samples/greengrass/greengrass_core_metrics_agent/greengrass_defender_agent.py .
#. Copy the dependencies from your virtual environment or your system into the root level of your lambda directory
.. code:: bash
cp -R ../metrics_lambda_environment/lib/python2.7/site-packages/psutil .
cp -R ../metrics_lambda_environment/lib/python2.7/site-packages/cbor .
#. Create your lambda zipfile. *Note: you should run this command at
the root level of your lambda directory*
.. code:: bash
rm *.zip
zip -r greengrass_defender_metrics_lambda.zip *
Configure and deploy your Greengrass Lambda
-------------------------------------------
#. `Upload your lambda zip file <https://docs.aws.amazon.com/greengrass/latest/developerguide/package.html>`__
#. Select the Python 2.7 runtime, and enter ``greengrass_defender_agent.function_handler`` in the Handler field
#. `Configure your lambda as a long-lived lambda <https://docs.aws.amazon.com/greengrass/latest/developerguide/long-lived.html>`__
#. Configure the following environment variables:
* **SAMPLE_INTERVAL_SECONDS:** The metrics generation interval. The default is 300 seconds.
*Note: 5 minutes (300 seconds) is the shortest reporting interval supported by AWS IoT Device Defender*
* **PROCFS_PATH:** The destination path that you will configure for your **/proc** resource, as shown below (a code sketch that reads both variables follows this list)
#. `Configure a subscription from your lambda to the AWS IoT Cloud <https://docs.aws.amazon.com/greengrass/latest/developerguide/config_subs.html>`__
*Note: For AWS IoT Device Defender, a subscription from AWS IoT Cloud to your lambda is not required*
#. Create a local resource to allow your lambda to collect metrics from the Greengrass Core host
* Follow the instructions `here <https://docs.aws.amazon.com/greengrass/latest/developerguide/access-local-resources.html>`__
* Use the following parameters:
* **Resource Name:** ``Core_Proc``
* **Type:** ``Volume``
* **Source Path:** ``/proc``
* **Destination Path:** ``/host_proc`` (make sure the same value is configured for the PROCFS_PATH environment variable above)
* **Group owner file access permission:** "Automatically add OS group permissions of the Linux group that owns the resource"
* Associate the resource with your metrics lambda
#. `Deploy your lambda to your Greengrass Group <https://docs.aws.amazon.com/greengrass/latest/developerguide/configs-core.html>`__
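As referenced in the environment variable list above, here is a rough sketch of how a long-lived metrics lambda might read these settings at startup (`psutil.PROCFS_PATH` is a real psutil setting on Linux; the default values shown are illustrative):

.. code:: python

    import os

    import psutil

    # Reporting interval; AWS IoT Device Defender accepts a minimum of 300 seconds.
    sample_interval = int(os.environ.get("SAMPLE_INTERVAL_SECONDS", "300"))

    # Point psutil at the /proc volume mounted into the Greengrass container;
    # this must match the Destination Path of the Core_Proc resource.
    psutil.PROCFS_PATH = os.environ.get("PROCFS_PATH", "/host_proc")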
Troubleshooting
---------------
Reviewing AWS IoT Device Defender device metrics using AWS IoT Console
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
#. Temporarily modify your publish topic in your Greengrass lambda to
something such as metrics/test
#. Deploy the lambda
#. Add a subscription to the temporary topic in the "Test" section of
the AWS IoT console; shortly you should see the metrics your Greengrass Core
is emitting
**********************
Metrics Report Details
**********************
Overall Structure
=================
+----------------+--------------+------------+----------+---------------+--------------------------------------------------+
| Long Name | Short Name | Required | Type | Constraints | Notes |
+================+==============+============+==========+===============+==================================================+
| header | hed | Y | Object | | Complete block required for well-formed report |
+----------------+--------------+------------+----------+---------------+--------------------------------------------------+
| metrics | met | Y | Object | | Complete block required for well-formed report |
+----------------+--------------+------------+----------+---------------+--------------------------------------------------+
| custom_metrics | cmet         | N          | Object   |               | Optional; complete block required if present     |
+----------------+--------------+------------+----------+---------------+--------------------------------------------------+
Header Block
------------
+------------+------------+----------+---------+-------------+---------------------------------------------+
| Long Name  | Short Name | Required | Type    | Constraints | Notes                                       |
+============+============+==========+=========+=============+=============================================+
| report\_id | rid        | Y        | Integer |             | Monotonically increasing value, epoch       |
|            |            |          |         |             | timestamp recommended                       |
+------------+------------+----------+---------+-------------+---------------------------------------------+
| version    | v          | Y        | String  | Major.Minor | Minor increments with addition of field,    |
|            |            |          |         |             | major increments if metrics removed         |
+------------+------------+----------+---------+-------------+---------------------------------------------+
Metrics Block
-------------
TCP Connections
^^^^^^^^^^^^^^^
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| Long Name | Short Name | Parent Element | Required | Type | Constraints | Notes |
+============================+==============+============================+============+==========+===============+==================================+
| tcp\_connections | tc | metrics | N | Object | | |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| established\_connections   | ec           | tcp\_connections           | N          | Object   |               | ESTABLISHED TCP State            |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| connections | cs | established\_connections | N | List | | |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| remote\_addr               | rad          | connections                | Y          | String   | ip:port       | IP can be IPv6 or IPv4           |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| local\_port | lp | connections | N | Number | >0 | |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| local\_interface | li | connections | N | String | | interface name |
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
| total                      | t            | established\_connections   | N          | Number   | >= 0          | Number of established connections|
+----------------------------+--------------+----------------------------+------------+----------+---------------+----------------------------------+
Listening TCP Ports
^^^^^^^^^^^^^^^^^^^
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| Long Name | Short Name | Parent Element | Required | Type | Constraints | Notes |
+=========================+==============+=========================+============+==========+===============+===============================+
| listening\_tcp\_ports | tp | metrics | N | Object | | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| ports | pts | listening\_tcp\_ports | N | List | > 0 | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| port                    | pt           | ports                   | N          | Number   | > 0           | ports should be numbers > 0   |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| interface | if | ports | N | String | | Interface Name |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| total | t | listening\_tcp\_ports | N | Number | >= 0 | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
Listening UDP Ports
^^^^^^^^^^^^^^^^^^^
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| Long Name | Short Name | Parent Element | Required | Type | Constraints | Notes |
+=========================+==============+=========================+============+==========+===============+===============================+
| listening\_udp\_ports | up | metrics | N | Object | | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| ports | pts | listening\_udp\_ports | N | List | > 0 | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| port | pt | ports | N | Number | > 0 | ports should be numbers > 0 |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| interface | if | ports | N | String | | Interface Name |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
| total | t | listening\_udp\_ports | N | Number | >= 0 | |
+-------------------------+--------------+-------------------------+------------+----------+---------------+-------------------------------+
Network Stats
^^^^^^^^^^^^^
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| Long Name | Short Name | Parent Element | Required | Type | Constraints | Notes |
+==================+==============+==================+============+==========+======================+=========+
| network\_stats | ns | metrics | N | Object | | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| bytes\_in | bi | network\_stats | N | Number | Delta Metric, >= 0 | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| bytes\_out | bo | network\_stats | N | Number | Delta Metric, >= 0 | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| packets\_in | pi | network\_stats | N | Number | Delta Metric, >= 0 | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| packets\_out | po | network\_stats | N | Number | Delta Metric, >= 0 | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
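In the table above, `Delta Metric` means the reported value is the change since the previous report rather than a running total. A sketch of deriving such values with `psutil`:

.. code:: python

    import psutil

    _previous = psutil.net_io_counters()

    def network_stats_delta():
        """Network counters accumulated since the last call (sketch)."""
        global _previous
        current = psutil.net_io_counters()
        delta = {
            "bytes_in": current.bytes_recv - _previous.bytes_recv,
            "bytes_out": current.bytes_sent - _previous.bytes_sent,
            "packets_in": current.packets_recv - _previous.packets_recv,
            "packets_out": current.packets_sent - _previous.packets_sent,
        }
        _previous = current
        return delta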
Custom Metrics
^^^^^^^^^^^^^^
+------------------+--------------+------------------+------------+----------+----------------------+---------+
| Long Name | Short Name | Parent Element | Required | Type | Constraints | Notes |
+==================+==============+==================+============+==========+======================+=========+
| cpu_usage | cpu | custom_metrics | N | Number | | |
+------------------+--------------+------------------+------------+----------+----------------------+---------+
Sample Metrics Reports
======================
Long Field Names
----------------
.. code:: javascript
{
"header": {
"report_id": 1529963534,
"version": "1.0"
},
"metrics": {
"listening_tcp_ports": {
"ports": [
{
"interface": "eth0",
"port": 24800
},
{
"interface": "eth0",
"port": 22
},
{
"interface": "eth0",
"port": 53
}
],
"total": 3
},
"listening_udp_ports": {
"ports": [
{
"interface": "eth0",
"port": 5353
},
{
"interface": "eth0",
"port": 67
}
],
"total": 2
},
"network_stats": {
"bytes_in": 1157864729406,
"bytes_out": 1170821865,
"packets_in": 693092175031,
"packets_out": 738917180
},
"tcp_connections": {
"established_connections":{
"connections": [
{
"local_interface": "eth0",
"local_port": 80,
"remote_addr": "192.168.0.1:8000"
},
{
"local_interface": "eth0",
"local_port": 80,
"remote_addr": "192.168.0.1:8000"
}
],
"total": 2
}
}
},
"custom_metrics": {
"cpu_usage": [
{
"number": 26.1
}
]
}
}
Short Field Names
-----------------
.. code:: javascript
{
"hed": {
"rid": 1529963534,
"v": "1.0"
},
"met": {
"tp": {
"pts": [
{
"if": "eth0",
"pt": 24800
},
{
"if": "eth0",
"pt": 22
},
{
"if": "eth0",
"pt": 53
}
],
"t": 3
},
"up": {
"pts": [
{
"if": "eth0",
"pt": 5353
},
{
"if": "eth0",
"pt": 67
}
],
"t": 2
},
"ns": {
"bi": 1157864729406,
"bo": 1170821865,
"pi": 693092175031,
"po": 738917180
},
"tc": {
"ec":{
"cs": [
{
"li": "eth0",
"lp": 80,
"rad": "192.168.0.1:8000"
},
{
"li": "eth0",
"lp": 80,
"rad": "192.168.0.1:8000"
}
],
"t": 2
}
}
},
"cmet": {
"cpu": [
{
"number": 26.1
}
]
}
}
*****************
API Documentation
*****************
You can find the API documentation `here <https://aws-iot-device-defender-agent-sdk.readthedocs.io/en/latest/>`__
**********
References
**********
- `AWS Lambda: Creating a Deployment Package
(Python) <https://docs.aws.amazon.com/lambda/latest/dg/lambda-python-how-to-create-deployment-package.html>`__
- `Monitoring with AWS Greengrass
Logs <https://docs.aws.amazon.com/greengrass/latest/developerguide/greengrass-logs-overview.html>`__
- `Troubleshooting AWS Greengrass
Applications <https://docs.aws.amazon.com/greengrass/latest/developerguide/gg-troubleshooting.html>`__
- `Access Local Resources with Lambda
Functions <https://docs.aws.amazon.com/greengrass/latest/developerguide/access-local-resources.html>`__
*******
License
*******
This library is licensed under the Apache 2.0 License.
*******
Support
*******
If you have technical questions about the AWS IoT Device SDK, use the `AWS
IoT Forum <https://forums.aws.amazon.com/forum.jspa?forumID=210>`__.
For any other questions about AWS IoT, contact `AWS
Support <https://aws.amazon.com/contact-us>`__.
| AWSIoTDeviceDefenderAgentSDK | /AWSIoTDeviceDefenderAgentSDK-2.0.0.tar.gz/AWSIoTDeviceDefenderAgentSDK-2.0.0/README.rst | README.rst |
New Version Available
=============================
A new AWS IoT Device SDK is `now available <https://github.com/awslabs/aws-iot-device-sdk-python-v2>`__. It is a complete rework, built to improve reliability, performance, and security. We invite your feedback!
This SDK will no longer receive feature updates, but will receive security updates.
AWS IoT Device SDK for Python
=============================
The AWS IoT Device SDK for Python allows developers to write Python
script to use their devices to access the AWS IoT platform through `MQTT or
MQTT over the WebSocket
protocol <http://docs.aws.amazon.com/iot/latest/developerguide/protocols.html>`__.
By connecting their devices to AWS IoT, users can securely work with
the message broker, rules, and the device shadow (sometimes referred to as a thing shadow) provided by AWS IoT and
with other AWS services like AWS Lambda, Amazon Kinesis, Amazon S3, and more.
- Overview_
- Installation_
- `Use the SDK`_
- `Key Features`_
- Examples_
- `API Documentation`_
- License_
- Support_
--------------
.. _Overview:
Overview
~~~~~~~~
This document provides instructions for installing and configuring
the AWS IoT Device SDK for Python. It includes examples demonstrating the
use of the SDK APIs.
MQTT Connections
________________
The SDK is built on top of a modified `Paho MQTT Python client
library <https://eclipse.org/paho/clients/python/>`__. Developers can choose from two
types of connections to connect to AWS
IoT:
- MQTT (over TLS 1.2) with X.509 certificate-based mutual
authentication.
- MQTT over the WebSocket protocol with AWS Signature Version 4 authentication.
- MQTT (over TLS 1.2) with X.509 certificate-based mutual authentication with TLS ALPN extension.
For MQTT over TLS (port 8883 and port 443), a valid certificate and a private key are
required for authentication. For MQTT over the WebSocket protocol (port 443),
a valid AWS Identity and Access Management (IAM) access key ID and secret access key pair are required for
authentication.
Device Shadow
_____________
A device shadow, or thing shadow, is a JSON document that is used to
store and retrieve current state information for a thing (device, app,
and so on). A shadow can be created and maintained for each thing or device so that its state can be retrieved and set
regardless of whether the thing or device is connected to the Internet. The
SDK implements the protocol for applications to retrieve, update, and
delete shadow documents. The SDK allows operations on shadow documents
of single or multiple shadow instances in one MQTT connection. The SDK
also allows the use of the same connection for shadow operations and non-shadow, simple MQTT operations.
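For reference, a shadow document is plain JSON with ``desired`` and ``reported`` state sections. A minimal update payload (the property name here is a placeholder) can be built like this:

.. code-block:: python

    import json

    # Request that the device property "ledOn" be set; the device later
    # acknowledges by updating the "reported" section of the shadow.
    shadowUpdatePayload = json.dumps({
        "state": {
            "desired": {"ledOn": True}
        }
    })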
.. _Installation:
Installation
~~~~~~~~~~~~
Minimum Requirements
____________________
- Python 2.7+ or Python 3.3+ for X.509 certificate-based mutual authentication via port 8883
and MQTT over WebSocket protocol with AWS Signature Version 4 authentication
- Python 2.7.10+ or Python 3.5+ for X.509 certificate-based mutual authentication via port 443
- OpenSSL version 1.0.1+ (TLS version 1.2) compiled with the Python executable for
X.509 certificate-based mutual authentication
To check your version of OpenSSL, use the following command in a Python interpreter:
.. code-block:: python
>>> import ssl
>>> ssl.OPENSSL_VERSION
Install from pip
________________
.. code-block:: sh
pip install AWSIoTPythonSDK
Build from source
_________________
.. code-block:: sh
git clone https://github.com/aws/aws-iot-device-sdk-python.git
cd aws-iot-device-sdk-python
python setup.py install
Download the zip file
_____________________
The SDK zip file is available `here <https://s3.amazonaws.com/aws-iot-device-sdk-python/aws-iot-device-sdk-python-latest.zip>`__. Unzip the package and install the SDK like this:
.. code-block:: sh
python setup.py install
.. _Use_the_SDK:
Use the SDK
~~~~~~~~~~~
Collection of Metrics
_____________________
Beginning with Release v1.3.0 of the SDK, AWS collects usage metrics indicating which language and version of the SDK
is being used. This feature is enabled by default and allows us to prioritize our resources towards addressing issues
faster in SDKs that see the most use, and is an important data point. However, we do understand that not all customers would
want to report this data. In that case, the sending of usage metrics can be easily disabled by the user using the
corresponding API:
.. code-block:: python
# AWS IoT MQTT Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.enableMetricsCollection()
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.disableMetricsCollection()
# AWS IoT MQTT Shadow Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTShadowClient.enableMetricsCollection()
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTShadowClient.disableMetricsCollection()
Credentials
___________
The SDK supports two types of credentials that correspond to the two connection
types:
- X.509 certificate
For the certificate-based mutual authentication connection
type.
Download the `AWS IoT root
CA <https://docs.aws.amazon.com/iot/latest/developerguide/managing-device-certs.html#server-authentication>`__.
Use the AWS IoT console to create and download the certificate and private key. You must specify the location of these files
when you initialize the client.
- IAM credentials
For the Websocket with Signature Version 4 authentication type. You will need IAM credentials: an access key ID, a secret access
key, and an optional session token. You must also
download the `AWS IoT root
CA <https://docs.aws.amazon.com/iot/latest/developerguide/managing-device-certs.html#server-authentication>`__.
You can specify the IAM credentials by:
- Passing method parameters
The SDK will first call the following method to check if there is any input for a custom IAM
credentials configuration:
.. code-block:: python
# AWS IoT MQTT Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.configureIAMCredentials(obtainedAccessKeyID, obtainedSecretAccessKey, obtainedSessionToken)
# AWS IoT MQTT Shadow Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTShadowClient.configureIAMCredentials(obtainedAccessKeyID, obtainedSecretAccessKey, obtainedSessionToken)
Note: We do not recommend hard-coding credentials in a custom script. You can use `Amazon Cognito Identity
<https://aws.amazon.com/cognito/>`__ or another credential
provider.
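For instance, temporary credentials can be fetched at runtime with the AWS SDK for Python (boto3); a sketch, with the identity pool ID and region as placeholders:

.. code-block:: python

    import boto3

    cognito = boto3.client("cognito-identity", region_name="us-east-1")
    identityId = cognito.get_id(IdentityPoolId="YOUR_IDENTITY_POOL_ID")["IdentityId"]
    credentials = cognito.get_credentials_for_identity(IdentityId=identityId)["Credentials"]

    myMQTTClient.configureIAMCredentials(credentials["AccessKeyId"],
                                         credentials["SecretKey"],
                                         credentials["SessionToken"])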
- Exporting environment variables
If there is no custom configuration through method calls, the SDK
will then check these environment variables for credentials:
``AWS_ACCESS_KEY_ID``
The access key for your AWS account.
``AWS_SECRET_ACCESS_KEY``
The secret key for your AWS account.
``AWS_SESSION_TOKEN``
The session key for your AWS account. This is required only when
you are using temporary credentials. For more information, see
`here <http://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_temp.html>`__.
You can set your IAM credentials as environment variables by
using the preconfigured names. For Unix systems, you can do the
following:
.. code-block:: sh
export AWS_ACCESS_KEY_ID=<your aws access key id string>
export AWS_SECRET_ACCESS_KEY=<your aws secret access key string>
export AWS_SESSION_TOKEN=<your aws session token string>
For Windows, open ``Control Panel`` and choose ``System``. In
``Advanced system settings`` choose ``Environment Variables`` and
then configure the required environment variables.
- Configuring shared credentials file
If there are no such environment variables specified, the SDK
will check the **default** section of the shared
credentials file (in Unix, ``~/.aws/credentials`` and in Windows, ``%UserProfile%\.aws\credentials``) as follows:
.. code-block:: sh
[default]
aws_access_key_id=foo
aws_secret_access_key=bar
aws_session_token=baz
You can use the `AWS CLI <http://aws.amazon.com/cli/>`__ to configure the shared credentials file:
.. code-block:: sh
aws configure
AWSIoTMQTTClient
________________
This is the client class used for plain MQTT communication with AWS IoT.
You can initialize and configure the client like this:
.. code-block:: python
# Import SDK packages
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
# For certificate based connection
myMQTTClient = AWSIoTMQTTClient("myClientID")
# For Websocket connection
# myMQTTClient = AWSIoTMQTTClient("myClientID", useWebsocket=True)
# Configurations
# For TLS mutual authentication
myMQTTClient.configureEndpoint("YOUR.ENDPOINT", 8883)
# For Websocket
# myMQTTClient.configureEndpoint("YOUR.ENDPOINT", 443)
# For TLS mutual authentication with TLS ALPN extension
# myMQTTClient.configureEndpoint("YOUR.ENDPOINT", 443)
myMQTTClient.configureCredentials("YOUR/ROOT/CA/PATH", "PRIVATE/KEY/PATH", "CERTIFICATE/PATH")
# For Websocket, we only need to configure the root CA
# myMQTTClient.configureCredentials("YOUR/ROOT/CA/PATH")
myMQTTClient.configureOfflinePublishQueueing(-1) # Infinite offline Publish queueing
myMQTTClient.configureDrainingFrequency(2) # Draining: 2 Hz
myMQTTClient.configureConnectDisconnectTimeout(10) # 10 sec
myMQTTClient.configureMQTTOperationTimeout(5) # 5 sec
...
For basic MQTT operations, your script will look like this:
.. code-block:: python
...
myMQTTClient.connect()
myMQTTClient.publish("myTopic", "myPayload", 0)
myMQTTClient.subscribe("myTopic", 1, customCallback)
myMQTTClient.unsubscribe("myTopic")
myMQTTClient.disconnect()
...
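Here ``customCallback`` is a user-supplied message callback with the Paho-style signature used by this SDK:

.. code-block:: python

    def customCallback(client, userdata, message):
        # message.topic is the topic string; message.payload is the raw payload.
        print("Received %s on %s" % (message.payload, message.topic))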
AWSIoTShadowClient
__________________
This is the client class used for device shadow operations with AWS IoT.
You can initialize and configure the client like this:
.. code-block:: python
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTShadowClient
# For certificate based connection
myShadowClient = AWSIoTMQTTShadowClient("myClientID")
# For Websocket connection
# myShadowClient = AWSIoTMQTTShadowClient("myClientID", useWebsocket=True)
# Configurations
# For TLS mutual authentication
myShadowClient.configureEndpoint("YOUR.ENDPOINT", 8883)
# For Websocket
# myShadowClient.configureEndpoint("YOUR.ENDPOINT", 443)
# For TLS mutual authentication with TLS ALPN extension
# myShadowClient.configureEndpoint("YOUR.ENDPOINT", 443)
myShadowClient.configureCredentials("YOUR/ROOT/CA/PATH", "PRIVATE/KEY/PATH", "CERTIFICATE/PATH")
# For Websocket, we only need to configure the root CA
# myShadowClient.configureCredentials("YOUR/ROOT/CA/PATH")
myShadowClient.configureConnectDisconnectTimeout(10) # 10 sec
myShadowClient.configureMQTTOperationTimeout(5) # 5 sec
...
For shadow operations, your script will look like this:
.. code-block:: python
...
myShadowClient.connect()
# Create a device shadow instance using persistent subscription
myDeviceShadow = myShadowClient.createShadowHandlerWithName("Bot", True)
# Shadow operations
myDeviceShadow.shadowGet(customCallback, 5)
myDeviceShadow.shadowUpdate(myJSONPayload, customCallback, 5)
myDeviceShadow.shadowDelete(customCallback, 5)
myDeviceShadow.shadowRegisterDeltaCallback(customCallback)
myDeviceShadow.shadowUnregisterDeltaCallback()
...
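The shadow ``customCallback`` has a different signature: it receives the response payload, the response status, and the token of the request that triggered it:

.. code-block:: python

    def customCallback(payload, responseStatus, token):
        # payload is a JSON string; responseStatus is "accepted", "rejected",
        # "delta" or "timeout"; token identifies the originating request.
        print(responseStatus, payload)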
You can also retrieve the MQTTClient(MQTT connection) to perform plain
MQTT operations along with shadow operations:
.. code-block:: python
myMQTTClient = myShadowClient.getMQTTConnection()
myMQTTClient.publish("plainMQTTTopic", "Payload", 1)
AWSIoTMQTTThingJobsClient
_________________________
This is the client class used for jobs operations with AWS IoT. See docs here:
https://docs.aws.amazon.com/iot/latest/developerguide/iot-jobs.html
You can initialize and configure the client like this:
.. code-block:: python
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTThingJobsClient
# For certificate based connection
myJobsClient = AWSIoTMQTTThingJobsClient("myClientID", "myThingName")
# For Websocket connection
# myJobsClient = AWSIoTMQTTThingJobsClient("myClientID", "myThingName", useWebsocket=True)
# Configurations
# For TLS mutual authentication
myJobsClient.configureEndpoint("YOUR.ENDPOINT", 8883)
# For Websocket
# myJobsClient.configureEndpoint("YOUR.ENDPOINT", 443)
myJobsClient.configureCredentials("YOUR/ROOT/CA/PATH", "PRIVATE/KEY/PATH", "CERTIFICATE/PATH")
# For Websocket, we only need to configure the root CA
# myJobsClient.configureCredentials("YOUR/ROOT/CA/PATH")
myJobsClient.configureConnectDisconnectTimeout(10) # 10 sec
myJobsClient.configureMQTTOperationTimeout(5) # 5 sec
...
For job operations, your script will look like this:
.. code-block:: python
...
myJobsClient.connect()
# Create a subscription for the $notify-next topic
myJobsClient.createJobSubscription(notifyNextCallback, jobExecutionTopicType.JOB_NOTIFY_NEXT_TOPIC)
# Create a subscription for update-job-execution accepted response topic
myJobsClient.createJobSubscription(updateSuccessfulCallback, jobExecutionTopicType.JOB_UPDATE_TOPIC, jobExecutionTopicReplyType.JOB_ACCEPTED_REPLY_TYPE, '+')
# Send a message to start the next pending job (if any)
myJobsClient.sendJobsStartNext(statusDetailsDict)
# Send a message to update a successfully completed job
myJobsClient.sendJobsUpdate(jobId, jobExecutionStatus.JOB_EXECUTION_SUCCEEDED, statusDetailsDict)
...
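The jobs callbacks are ordinary MQTT message callbacks. A sketch of handling a ``$notify-next`` message, whose JSON payload carries an ``execution`` object when a job is pending:

.. code-block:: python

    import json

    def notifyNextCallback(client, userdata, message):
        payload = json.loads(message.payload.decode("utf-8"))
        execution = payload.get("execution")
        if execution:  # absent when there is no pending job
            print("Next pending job: %s" % execution["jobId"])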
You can also retrieve the MQTTClient(MQTT connection) to perform plain
MQTT operations along with jobs operations:
.. code-block:: python
myMQTTClient = myJobsClient.getMQTTConnection()
myMQTTClient.publish("plainMQTTTopic", "Payload", 1)
DiscoveryInfoProvider
_____________________
This is the client class for device discovery process with AWS IoT Greengrass.
You can initialize and configure the client like this:
.. code-block:: python
from AWSIoTPythonSDK.core.greengrass.discovery.providers import DiscoveryInfoProvider
discoveryInfoProvider = DiscoveryInfoProvider()
discoveryInfoProvider.configureEndpoint("YOUR.IOT.ENDPOINT")
discoveryInfoProvider.configureCredentials("YOUR/ROOT/CA/PATH", "CERTIFICATE/PATH", "PRIVATE/KEY/PATH")
discoveryInfoProvider.configureTimeout(10) # 10 sec
To perform the discovery process for a Greengrass Aware Device (GGAD) that belongs to a deployed group, your script
should look like this:
.. code-block:: python
discoveryInfo = discoveryInfoProvider.discover("myGGADThingName")
# I know nothing about the group/core I want to connect to. I want to iterate through all cores and find out.
coreList = discoveryInfo.getAllCores()
groupIdCAList = discoveryInfo.getAllCas() # list([(groupId, ca), ...])
# I know nothing about the group/core I want to connect to. I want to iterate through all groups and find out.
groupList = discoveryInfo.getAllGroups()
# I know exactly which group, which core and which connectivity info I need to connect.
connectivityInfo = discoveryInfo.toObjectAtGroupLevel()["YOUR_GROUP_ID"]
.getCoreConnectivityInfo("YOUR_CORE_THING_ARN")
.getConnectivityInfo("YOUR_CONNECTIVITY_ID")
# Connecting logic follows...
...
For more information about discovery information access at group/core/connectivity info set level, please refer to the
API documentation for ``AWSIoTPythonSDK.core.greengrass.discovery.models``,
`Greengrass Discovery documentation <http://docs.aws.amazon.com/greengrass/latest/developerguide/gg-discover-api.html>`__
or `Greengrass overall documentation <http://docs.aws.amazon.com/greengrass/latest/developerguide/what-is-gg.html>`__.
Synchronous APIs and Asynchronous APIs
______________________________________
Beginning with Release v1.2.0, the SDK provides asynchronous APIs and enforces synchronous API behaviors for MQTT operations,
which includes:
- connect/connectAsync
- disconnect/disconnectAsync
- publish/publishAsync
- subscribe/subscribeAsync
- unsubscribe/unsubscribeAsync
- Asynchronous APIs
Asynchronous APIs translate the invocation into an MQTT packet and forward it to the underlying connection to be sent out.
They return immediately once packets are out for delivery, regardless of whether the corresponding ACKs, if any, have
been received. Users can specify their own callbacks for ACK/message (server side PUBLISH) processing for each
individual request. These callbacks will be sequentially dispatched and invoked upon the arrival of ACK/message (server
side PUBLISH) packets.
- Synchronous APIs
Synchronous API behaviors are enforced by registering blocking ACK callbacks on top of the asynchronous APIs.
Synchronous APIs wait on their corresponding ACK packets, if any, before the invocation returns. For example,
a synchronous QoS1 publish call will wait until it gets its PUBACK back. A synchronous subscribe call will wait until
it gets its SUBACK back. Users can configure operation time out for synchronous APIs to stop the waiting.
Since callbacks are sequentially dispatched and invoked, calling synchronous APIs within callbacks will deadlock the
user application. If users need to perform MQTT operations
within callbacks, the asynchronous APIs should be used. For more details, please check out the provided sample at
``samples/basicPubSub/basicPubSub_APICallInCallback.py``
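As a brief illustration of the asynchronous flavor (the callback names are placeholders; for SUBACKs the ``data`` argument carries the granted QoS):

.. code-block:: python

    def ackCallback(mid, data=None):
        # Invoked on CONNACK/SUBACK/PUBACK with the corresponding packet id.
        print("Received ACK for packet id: %s" % mid)

    def messageCallback(client, userdata, message):
        print("Received %s on %s" % (message.payload, message.topic))

    myMQTTClient.connectAsync(ackCallback=ackCallback)
    myMQTTClient.subscribeAsync("myTopic", 1, ackCallback=ackCallback,
                                messageCallback=messageCallback)
    myMQTTClient.publishAsync("myTopic", "myPayload", 1, ackCallback=ackCallback)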
.. _Key_Features:
Key Features
~~~~~~~~~~~~
Progressive Reconnect Back Off
______________________________
When a non-client-side disconnect occurs, the SDK will reconnect automatically. The following APIs are provided for configuration:
.. code-block:: python
# AWS IoT MQTT Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.configureAutoReconnectBackoffTime(baseReconnectQuietTimeSecond, maxReconnectQuietTimeSecond, stableConnectionTimeSecond)
# AWS IoT MQTT Shadow Client
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTShadowClient.configureAutoReconnectBackoffTime(baseReconnectQuietTimeSecond, maxReconnectQuietTimeSecond, stableConnectionTimeSecond)
The auto-reconnect occurs with a progressive backoff, which follows this
mechanism for reconnect backoff time calculation:
t\ :sup:`current` = min(2\ :sup:`n` t\ :sup:`base`, t\ :sup:`max`)
where t\ :sup:`current` is the current reconnect backoff time, t\ :sup:`base` is the base
reconnect backoff time, t\ :sup:`max` is the maximum reconnect backoff time.
The reconnect backoff time will be doubled on disconnect and reconnect
attempt until it reaches the preconfigured maximum reconnect backoff
time. After the connection is stable for over the
``stableConnectionTime``, the reconnect backoff time will be reset to
the ``baseReconnectQuietTime``.
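Concretely, the quiet time before the n-th consecutive reconnect attempt can be computed as in this sketch:

.. code-block:: python

    def reconnectBackoff(n, base=1, maximum=32):
        """Quiet time in seconds before the n-th consecutive reconnect."""
        return min((2 ** n) * base, maximum)

    # With the default configuration this yields 1, 2, 4, 8, 16, 32, 32, ...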
If no ``configureAutoReconnectBackoffTime`` is called, the following
default configuration for backoff timing will be performed on initialization:
.. code-block:: python
baseReconnectQuietTimeSecond = 1
maxReconnectQuietTimeSecond = 32
stableConnectionTimeSecond = 20
Offline Requests Queueing with Draining
_______________________________________
If the client is temporarily offline and disconnected due to
network failure, publish/subscribe/unsubscribe requests will be added to an internal
queue until the number of queued-up requests reaches the size limit
of the queue. This functionality is available for plain MQTT operations
only; the shadow client handles time-sensitive data, so offline queueing is not supported for shadow requests.
The following API is provided for configuration:
.. code-block:: python
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.configureOfflinePublishQueueing(queueSize, dropBehavior)
After the queue is full, offline publish/subscribe/unsubscribe requests will be discarded or
replaced according to the configuration of the drop behavior:
.. code-block:: python
# Drop the oldest request in the queue
AWSIoTPythonSDK.MQTTLib.DROP_OLDEST = 0
# Drop the newest request in the queue
AWSIoTPythonSDK.MQTTLib.DROP_NEWEST = 1
Let's say we configure the size of offlinePublishQueue to 5 and we
have 7 incoming offline publish requests.
In a ``DROP_OLDEST`` configuration:
.. code-block:: python
myClient.configureOfflinePublishQueueing(5, AWSIoTPythonSDK.MQTTLib.DROP_OLDEST);
The internal queue should be like this when the queue is just full:
.. code-block:: sh
HEAD ['pub_req1', 'pub_req2', 'pub_req3', 'pub_req4', 'pub_req5']
When the 6th and the 7th publish requests are made offline, the internal
queue will be like this:
.. code-block:: sh
HEAD ['pub_req3', 'pub_req4', 'pub_req5', 'pub_req6', 'pub_req7']
Because the queue is already full, the oldest requests ``pub_req1`` and
``pub_req2`` are discarded.
In a ``DROP_NEWEST`` configuration:
.. code-block:: python
myClient.configureOfflinePublishQueueing(5, AWSIoTPythonSDK.MQTTLib.DROP_NEWEST);
The internal queue should be like this when the queue is just full:
.. code-block:: sh
HEAD ['pub_req1', 'pub_req2', 'pub_req3', 'pub_req4', 'pub_req5']
When the 6th and the 7th publish requests are made offline, the internal
queue will be like this:
.. code-block:: sh
HEAD ['pub_req1', 'pub_req2', 'pub_req3', 'pub_req4', 'pub_req5']
Because the queue is already full, the newest requests ``pub_req6`` and
``pub_req7`` are discarded.
When the client is back online, connected, and resubscribed to all topics
it has previously subscribed to, the draining starts. All requests
in the offline request queue will be resent at the configured draining
rate:
.. code-block:: python
AWSIoTPythonSDK.MQTTLib.AWSIoTMQTTClient.configureDrainingFrequency(frequencyInHz)
If no ``configureOfflinePublishQueueing`` or ``configureDrainingFrequency`` is
called, the following default configuration for offline request queueing
and draining will be performed on initialization:
.. code-block:: python
offlinePublishQueueSize = 20
dropBehavior = DROP_NEWEST
drainingFrequency = 2Hz
Before the draining process is complete, any new publish/subscribe/unsubscribe request
within this time period will be added to the queue. Therefore, the draining rate
should be higher than the normal request rate to avoid an endless
draining process after reconnect.
The disconnect event is detected based on PINGRESP MQTT
packet loss. Offline request queueing will not be triggered until the
disconnect event is detected. Configuring a shorter keep-alive
interval allows the client to detect disconnects more quickly. Any QoS0
publish, subscribe and unsubscribe requests issued after the network failure and before the
detection of the PINGRESP loss will be lost.
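For example, a shorter keep-alive interval can be passed to ``connect`` (60 seconds here is arbitrary; the SDK default is 600):

.. code-block:: python

    # A shorter keep-alive lets the client detect a dropped connection sooner,
    # at the cost of more frequent PINGREQ traffic.
    myMQTTClient.connect(keepAliveIntervalSecond=60)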
Persistent/Non-Persistent Subscription
______________________________________
Device shadow operations are built on top of the publish/subscribe model
for the MQTT protocol, which provides an asynchronous request/response workflow. Shadow operations (Get, Update, Delete) are
sent as requests to AWS IoT. The registered callback will
be executed after a response is returned. In order to receive
responses, the client must subscribe to the corresponding shadow
response topics. After the responses are received, the client might want
to unsubscribe from these response topics to avoid getting unrelated
responses for charges for other requests not issued by this client.
The SDK provides a persistent/non-persistent subscription selection on
the initialization of a device shadow. Developers can choose the type of subscription workflow they want to follow.
For a non-persistent subscription, you will need to create a device
shadow like this:
.. code-block:: python
nonPersistentSubShadow = myShadowClient.createShadowHandlerWithName("NonPersistentSubShadow", False)
In this case, the request to subscribe to accepted/rejected topics will be
sent on each shadow operation. After a response is returned,
accepted/rejected topics will be unsubscribed to avoid getting unrelated
responses.
For a persistent subscription, you will need to create a device shadow
like this:
.. code-block:: python
persistentSubShadow = myShadowClient.createShadowHandlerWithName("PersistentSubShadow", True)
In this case, the request to subscribe to the corresponding
accepted/rejected topics will be sent on the first shadow operation. For
example, on the first call of the shadowGet API, the following topics will
be subscribed to:
.. code-block:: sh
$aws/things/PersistentSubShadow/shadow/get/accepted
$aws/things/PersistentSubShadow/shadow/get/rejected
Because it is a persistent subscription, no unsubscribe requests will be
sent when a response is returned. The SDK client is always listening on
accepted/rejected topics.
In all SDK examples, persistent subscription is used because of its better performance.
SSL Ciphers Setup
______________________________________
If custom SSL Ciphers are required for the client, they can be set when configuring the client before
starting the connection.
To setup specific SSL Ciphers:
.. code-block:: python
myAWSIoTMQTTClient.configureCredentials(rootCAPath, privateKeyPath, certificatePath, Ciphers="AES128-SHA256")
.. _Examples:
Examples
~~~~~~~~
BasicPubSub
___________
This example demonstrates a simple MQTT publish/subscribe using AWS
IoT. It first subscribes to a topic and registers a callback to print
new messages and then publishes to the same topic in a loop.
New messages are printed upon receipt, indicating
the callback function has been called.
Instructions
************
Run the example like this:
.. code-block:: python
# Certificate based mutual authentication
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -w
# Customize client id and topic
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -t <topic>
# Customize the message
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -t <topic> -M <message>
# Customize the port number
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
# change the run mode to subscribe or publish only (see python basicPubSub.py -h for the available options)
python basicPubSub.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -m <mode>
Source
******
The example is available in ``samples/basicPubSub/``.
BasicPubSub with Amazon Cognito Session Token
_____________________________________________
This example demonstrates a simple MQTT publish/subscribe using an Amazon Cognito
Identity session token. It uses the AWS IoT Device SDK for
Python and the AWS SDK for Python (boto3). It first makes a request to
Amazon Cognito to retrieve the access key ID, the secret access key, and the session token for temporary
authentication. It then uses these credentials to connect to AWS
IoT and communicate data/messages using MQTT over Websocket, just like
the BasicPubSub example.
Instructions
************
To run the example, you will need your **Amazon Cognito identity pool ID** and allow **unauthenticated
identities** to connect. Make sure that the policy attached to the
unauthenticated role has permissions to access the required AWS IoT
APIs. For more information about Amazon Cognito, see
`here <https://console.aws.amazon.com/cognito/>`__.
Run the example like this:
.. code-block:: python
python basicPubSub_CognitoSTS.py -e <endpoint> -r <rootCAFilePath> -C <CognitoIdentityPoolID>
# Customize client id and topic
python basicPubSub_CognitoSTS.py -e <endpoint> -r <rootCAFilePath> -C <CognitoIdentityPoolID> -id <clientId> -t <topic>
Source
******
The example is available in ``samples/basicPubSub/``.
BasicPubSub Asynchronous version
________________________________
This example demonstrates a simple MQTT publish/subscribe with asynchronous APIs using AWS IoT.
It first registers general notification callbacks for CONNACK reception, disconnect reception and message arrival.
It then registers ACK callbacks for subscribe and publish requests to print out received ack packet ids.
It subscribes to a topic with no specific callback and then publishes to the same topic in a loop.
New messages are printed upon reception by the general message arrival callback, indicating
the callback function has been called.
New ack packet ids are printed upon reception of PUBACK and SUBACK through ACK callbacks registered with asynchronous
API calls, indicating that the client received ACKs for the corresponding asynchronous API calls.
Instructions
************
Run the example like this:
.. code-block:: python
# Certificate based mutual authentication
python basicPubSubAsync.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python basicPubSubAsync.py -e <endpoint> -r <rootCAFilePath> -w
# Customize client id and topic
python basicPubSubAsync.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -t <topic>
# Customize the port number
python basicPubSubAsync.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
Source
******
The example is available in ``samples/basicPubSub/``.
BasicPubSub with API invocation in callback
___________________________________________
This example demonstrates the usage of asynchronous APIs within callbacks. It first connects to AWS IoT and subscribes
to 2 topics with the corresponding message callbacks registered. One message callback contains client asynchronous API
invocation that republishes the received message from <topic> to <topic>/republish. The other message callback simply
prints out the received message. It then publishes messages to <topic> in an infinite loop. For every message received
from <topic>, it will be republished to <topic>/republish and be printed out as configured in the simple print-out
message callback.
New ack packet ids are printed upon reception of PUBACK and SUBACK through ACK callbacks registered with asynchronous
API calls, indicating that the client received ACKs for the corresponding asynchronous API calls.
Instructions
************
Run the example like this:
.. code-block:: python
# Certificate based mutual authentication
python basicPubSub_APICallInCallback.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python basicPubSub_APICallInCallback.py -e <endpoint> -r <rootCAFilePath> -w
# Customize client id and topic
python basicPubSub_APICallInCallback.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -t <topic>
# Customize the port number
python basicPubSub_APICallInCallback.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
Source
******
The example is available in ``samples/basicPubSub/``.
BasicShadow
___________
This example demonstrates the use of basic shadow operations
(update/delta). It has two scripts, ``basicShadowUpdater.py`` and
``basicShadowDeltaListener.py``. The example shows how a shadow update
request triggers delta events.
``basicShadowUpdater.py`` performs a shadow update in a loop to
continuously modify the desired state of the shadow by changing the
value of the integer attribute.
``basicShadowDeltaListener.py`` subscribes to the delta topic
of the same shadow and receives delta messages when there is a
difference between the desired and reported states.
Because only the desired state is being updated by basicShadowUpdater, a
series of delta messages that correspond to the shadow update requests should be received in basicShadowDeltaListener.
Instructions
************
Run the example like this:
First, start the basicShadowDeltaListener:
.. code-block:: python
# Certificate-based mutual authentication
python basicShadowDeltaListener.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python basicShadowDeltaListener.py -e <endpoint> -r <rootCAFilePath> -w
# Customize the port number
python basicShadowDeltaListener.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
Then, start the basicShadowUpdater:
.. code-block:: python
# Certificate-based mutual authentication
python basicShadowUpdater.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python basicShadowUpdater.py -e <endpoint> -r <rootCAFilePath> -w
# Customize the port number
python basicShadowUpdater.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
After the basicShadowUpdater starts sending shadow update requests, you
should be able to see corresponding delta messages in the
basicShadowDeltaListener output.
Source
******
The example is available in ``samples/basicShadow/``.
ThingShadowEcho
_______________
This example demonstrates how a device communicates with AWS IoT,
syncing data into the device shadow in the cloud and receiving commands
from another app. Whenever there is a new command from the app side to
change the desired state of the device, the device receives this
request and applies the change by publishing it as the reported state. By
registering a delta callback function, users will be able to see this
incoming message and notice the syncing of the state.
Instructions
************
Run the example like this:
.. code-block:: python
# Certificate based mutual authentication
python ThingShadowEcho.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath>
# MQTT over WebSocket
python ThingShadowEcho.py -e <endpoint> -r <rootCAFilePath> -w
# Customize client Id and thing name
python ThingShadowEcho.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -n <thingName>
# Customize the port number
python ThingShadowEcho.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -p <portNumber>
Now use the `AWS IoT console <https://console.aws.amazon.com/iot/>`__ or another MQTT
client to update the shadow's desired state only. You should be able to see that the reported state is updated to match
the changes you just made to the desired state.
Source
******
The example is available in ``samples/ThingShadowEcho/``.
JobsSample
__________
This example demonstrates how a device communicates with AWS IoT while
also taking advantage of AWS IoT Jobs functionality. It shows how to
subscribe to Jobs topics in order to receive Job documents on your
device. It also shows how to process those Jobs so that you can see in
the `AWS IoT console <https://console.aws.amazon.com/iot/>`__ which of your devices have received and processed
which Jobs. See the AWS IoT Device Management documentation `here <https://aws.amazon.com/documentation/iot-device-management/>`__
for more information on creating and deploying Jobs to your fleet of
devices to facilitate management tasks such as deploying software updates
and running diagnostics.
Instructions
************
First use the `AWS IoT console <https://console.aws.amazon.com/iot/>`__ to create and deploy Jobs to your fleet of devices.
Then run the example like this:
.. code-block:: python
# Certificate based mutual authentication
python jobsSample.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -n <thingName>
# MQTT over WebSocket
python jobsSample.py -e <endpoint> -r <rootCAFilePath> -w -n <thingName>
# Customize client Id and thing name
python jobsSample.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -id <clientId> -n <thingName>
# Customize the port number
python jobsSample.py -e <endpoint> -r <rootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -n <thingName> -p <portNumber>
Source
******
The example is available in ``samples/jobs/``.
BasicDiscovery
______________
This example demonstrates how to perform a discovery process from a Greengrass Aware Device (GGAD) to obtain the required
connectivity/identity information to connect to the Greengrass Core (GGC) deployed within the same group. It uses the
discovery information provider to invoke a discover call for a certain GGAD with its thing name. After it gets back a
success response, it picks up the first GGC and the first set of identity information (CA) for the first group, persists
it locally and iterates through all connectivity info sets for this GGC to establish an MQTT connection to the designated
GGC. It then publishes messages to the topic, which, on the GGC side, is configured to route the messages back to the
same GGAD. Therefore, it receives the published messages and invokes the corresponding message callbacks.
Note that in order to get the sample up and running correctly, you need:
1. Have a successfully deployed Greengrass group.
2. Use the certificate and private key that have been deployed with the group for the GGAD to perform the discovery process.
3. The subscription records for that deployed group should contain a route that routes messages from the targeted GGAD to itself via a dedicated MQTT topic.
4. The deployed GGAD thing name, the deployed GGAD certificate/private key and the dedicated MQTT topic should be used as the inputs for this sample.
Run the sample like this:
.. code-block:: python
python basicDiscovery.py -e <endpoint> -r <IoTRootCAFilePath> -c <certFilePath> -k <privateKeyFilePath> -n <GGADThingName> -t <RoutingTopic>
If the group, GGC, GGAD and group subscription/routes are set up correctly, you should be able to see the sample running
on your GGAD, receiving back the messages that it published to the GGC.
.. _API_Documentation:
API Documentation
~~~~~~~~~~~~~~~~~
You can find the API documentation for the SDK `here <https://s3.amazonaws.com/aws-iot-device-sdk-python-docs/index.html>`__.
.. _License:
License
~~~~~~~
This SDK is distributed under the `Apache License, Version
2.0 <http://www.apache.org/licenses/LICENSE-2.0>`__, see LICENSE.txt
and NOTICE.txt for more information.
.. _Support:
Support
~~~~~~~
If you have technical questions about the AWS IoT Device SDK, use the `AWS
IoT Forum <https://forums.aws.amazon.com/forum.jspa?forumID=210>`__.
For any other questions about AWS IoT, contact `AWS
Support <https://aws.amazon.com/contact-us>`__.
| AWSIoTPythonSDK | /AWSIoTPythonSDK-1.5.2.tar.gz/AWSIoTPythonSDK-1.5.2/README.rst | README.rst |
=========
CHANGELOG
=========
1.4.9
=====
* bugfix: Fixing possible race condition with timer in deviceShadow.
1.4.8
=====
* improvement: Added support for subscription acknowledgement callbacks while offline or resubscribing
1.4.7
=====
* improvement: Added connection establishment control through client socket factory option
1.4.6
=====
* bugfix: Use non-deprecated ssl API to specify ALPN when doing Greengrass discovery
1.4.5
=====
* improvement: Added validation to mTLS arguments in basicDiscovery
1.4.3
=====
* bugfix:Issue:`#150 <https://github.com/aws/aws-iot-device-sdk-python/issues/150>`__ Fix for ALPN in Python 3.7
1.4.2
=====
* bugfix: Websocket handshake supports Amazon Trust Store (ats) endpoints
* bugfix: Remove default port number in samples, which prevented WebSocket mode from using 443
* bugfix: jobsSample print statements compatible with Python 3.x
* improvement: Small fixes to IoT Jobs documentation
1.4.0
=====
* bugfix:Issue:`#136 <https://github.com/aws/aws-iot-device-sdk-python/issues/136>`__
* bugfix:Issue:`#124 <https://github.com/aws/aws-iot-device-sdk-python/issues/124>`__
* improvement:Expose the missing getpeercert() from SecuredWebsocket class
* improvement:Enforce sending host header in the outbound discovery request
* improvement:Ensure credentials non error are properly handled and communicated to application level when creating wss endpoint
* feature:Add support for ALPN, along with API docs, sample and updated README
* feature:Add support for IoT Jobs, along with API docs, sample and updated README
* feature:Add command line option to allow port number override
1.3.1
=====
* bugfix:Issue:`#67 <https://github.com/aws/aws-iot-device-sdk-python/issues/67>`__
* bugfix:Fixed a dead lock issue when client async API is called within the event callback
* bugfix:Updated README and API documentation to provide clear usage information on sync/async API and callbacks
* improvement:Added a new sample to show API usage within callbacks
1.3.0
=====
* bugfix:WebSocket handshake response timeout and error escalation
* bugfix:Prevent GG discovery from crashing if Metadata field is None
* bugfix:Fix the client object reusability issue
* bugfix:Prevent NPE due to shadow operation token not found in the pool
* improvement:Split the publish and subscribe operations in basicPubSub.py sample
* improvement:Updated default connection keep-alive interval to 600 seconds
* feature:AWSIoTMQTTClient:New API for username and password configuration
* feature:AWSIoTMQTTShadowClient:New API for username and password configuration
* feature:AWSIoTMQTTClient:New API for enabling/disabling metrics collection
* feature:AWSIoTMQTTShadowClient:New API for enabling/disabling metrics collection
1.2.0
=====
* improvement:AWSIoTMQTTClient:Improved synchronous API backend for ACK tracking
* feature:AWSIoTMQTTClient:New API for asynchronous API
* feature:AWSIoTMQTTClient:Expose general notification callbacks for online, offline and message arrival
* feature:AWSIoTMQTTShadowClient:Expose general notification callbacks for online, offline and message arrival
* feature:AWSIoTMQTTClient:Extend offline queueing to include offline subscribe/unsubscribe requests
* feature:DiscoveryInfoProvider:Support for Greengrass discovery
* bugfix:Pull request:`#50 <https://github.com/aws/aws-iot-device-sdk-python/pull/50>`__
* bugfix:Pull request:`#51 <https://github.com/aws/aws-iot-device-sdk-python/pull/51>`__
* bugfix:Issue:`#52 <https://github.com/aws/aws-iot-device-sdk-python/issues/52>`__
1.1.2
=====
* bugfix:Issue:`#28 <https://github.com/aws/aws-iot-device-sdk-python/issues/28>`__
* bugfix:Issue:`#29 <https://github.com/aws/aws-iot-device-sdk-python/issues/29>`__
* bugfix:Pull request:`#32 <https://github.com/aws/aws-iot-device-sdk-python/pull/32>`__
* improvement:Pull request:`#38 <https://github.com/aws/aws-iot-device-sdk-python/pull/38>`__
* bugfix:Pull request:`#45 <https://github.com/aws/aws-iot-device-sdk-python/pull/45>`__
* improvement:Pull request:`#46 <https://github.com/aws/aws-iot-device-sdk-python/pull/46>`__
1.1.1
=====
* bugfix:Issue:`#23 <https://github.com/aws/aws-iot-device-sdk-python/issues/23>`__
* bugfix:README documentation
1.1.0
=====
* feature:AWSIoTMQTTClient:last will configuration APIs
* bugfix:Pull request:`#12 <https://github.com/aws/aws-iot-device-sdk-python/pull/12>`__
* bugfix:Pull request:`#14 <https://github.com/aws/aws-iot-device-sdk-python/pull/14>`__
* Addressed issue:`#15 <https://github.com/aws/aws-iot-device-sdk-python/issues/15>`__
1.0.1
=====
* bugfix:Pull request:`#9 <https://github.com/aws/aws-iot-device-sdk-python/pull/9>`__
1.0.0
=====
* feature:AWSIoTMQTTClient:basic MQTT APIs
* feature:AWSIoTMQTTClient:auto-reconnection/resubscribe
* feature:AWSIoTMQTTClient:offline publish requests queueing and draining
* feature:AWSIoTMQTTShadowClient:basic Shadow APIs
import os
import sys
import time
import uuid
import json
import logging
import argparse
from AWSIoTPythonSDK.core.greengrass.discovery.providers import DiscoveryInfoProvider
from AWSIoTPythonSDK.core.protocol.connection.cores import ProgressiveBackOffCore
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
from AWSIoTPythonSDK.exception.AWSIoTExceptions import DiscoveryInvalidRequestException
AllowedActions = ['both', 'publish', 'subscribe']
# General message notification callback
def customOnMessage(message):
print('Received message on topic %s: %s\n' % (message.topic, message.payload))
MAX_DISCOVERY_RETRIES = 10
GROUP_CA_PATH = "./groupCA/"
# Read in command-line parameters
parser = argparse.ArgumentParser()
parser.add_argument("-e", "--endpoint", action="store", required=True, dest="host", help="Your AWS IoT custom endpoint")
parser.add_argument("-r", "--rootCA", action="store", required=True, dest="rootCAPath", help="Root CA file path")
parser.add_argument("-c", "--cert", action="store", dest="certificatePath", help="Certificate file path")
parser.add_argument("-k", "--key", action="store", dest="privateKeyPath", help="Private key file path")
parser.add_argument("-n", "--thingName", action="store", dest="thingName", default="Bot", help="Targeted thing name")
parser.add_argument("-t", "--topic", action="store", dest="topic", default="sdk/test/Python", help="Targeted topic")
parser.add_argument("-m", "--mode", action="store", dest="mode", default="both",
help="Operation modes: %s"%str(AllowedActions))
parser.add_argument("-M", "--message", action="store", dest="message", default="Hello World!",
help="Message to publish")
# --print_discover_resp_only is used for deployment testing. The test run will return 0 as long as the SDK is installed correctly.
parser.add_argument("-p", "--print_discover_resp_only", action="store_true", dest="print_only", default=False)
args = parser.parse_args()
host = args.host
rootCAPath = args.rootCAPath
certificatePath = args.certificatePath
privateKeyPath = args.privateKeyPath
clientId = args.thingName
thingName = args.thingName
topic = args.topic
print_only = args.print_only
if args.mode not in AllowedActions:
parser.error("Unknown --mode option %s. Must be one of %s" % (args.mode, str(AllowedActions)))
exit(2)
if not args.certificatePath or not args.privateKeyPath:
parser.error("Missing credentials for authentication, you must specify --cert and --key args.")
exit(2)
if not os.path.isfile(rootCAPath):
parser.error("Root CA path does not exist {}".format(rootCAPath))
exit(3)
if not os.path.isfile(certificatePath):
parser.error("No certificate found at {}".format(certificatePath))
exit(3)
if not os.path.isfile(privateKeyPath):
parser.error("No private key found at {}".format(privateKeyPath))
exit(3)
# Configure logging
logger = logging.getLogger("AWSIoTPythonSDK.core")
logger.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)
# Progressive back off core
backOffCore = ProgressiveBackOffCore()
# Discover GGCs
discoveryInfoProvider = DiscoveryInfoProvider()
discoveryInfoProvider.configureEndpoint(host)
discoveryInfoProvider.configureCredentials(rootCAPath, certificatePath, privateKeyPath)
discoveryInfoProvider.configureTimeout(10) # 10 sec
retryCount = MAX_DISCOVERY_RETRIES if not print_only else 1
discovered = False
groupCA = None
coreInfo = None
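# Retry discovery with progressive back-off until it succeeds, a non-retryable error occurs, or retries run out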
while retryCount != 0:
try:
discoveryInfo = discoveryInfoProvider.discover(thingName)
caList = discoveryInfo.getAllCas()
coreList = discoveryInfo.getAllCores()
# We only pick the first ca and core info
groupId, ca = caList[0]
coreInfo = coreList[0]
print("Discovered GGC: %s from Group: %s" % (coreInfo.coreThingArn, groupId))
print("Now we persist the connectivity/identity information...")
groupCA = GROUP_CA_PATH + groupId + "_CA_" + str(uuid.uuid4()) + ".crt"
if not os.path.exists(GROUP_CA_PATH):
os.makedirs(GROUP_CA_PATH)
        with open(groupCA, "w") as groupCAFile:
            groupCAFile.write(ca)
discovered = True
print("Now proceed to the connecting flow...")
break
except DiscoveryInvalidRequestException as e:
print("Invalid discovery request detected!")
print("Type: %s" % str(type(e)))
print("Error message: %s" % str(e))
print("Stopping...")
break
except BaseException as e:
print("Error in discovery!")
print("Type: %s" % str(type(e)))
print("Error message: %s" % str(e))
retryCount -= 1
print("\n%d/%d retries left\n" % (retryCount, MAX_DISCOVERY_RETRIES))
print("Backing off...\n")
backOffCore.backOff()
if not discovered:
    # With the print_discover_resp_only flag, we only want to check that the API gets called correctly.
if print_only:
sys.exit(0)
print("Discovery failed after %d retries. Exiting...\n" % (MAX_DISCOVERY_RETRIES))
sys.exit(-1)
# Iterate through all connection options for the core and use the first successful one
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId)
myAWSIoTMQTTClient.configureCredentials(groupCA, privateKeyPath, certificatePath)
myAWSIoTMQTTClient.onMessage = customOnMessage
connected = False
for connectivityInfo in coreInfo.connectivityInfoList:
currentHost = connectivityInfo.host
currentPort = connectivityInfo.port
print("Trying to connect to core at %s:%d" % (currentHost, currentPort))
myAWSIoTMQTTClient.configureEndpoint(currentHost, currentPort)
try:
myAWSIoTMQTTClient.connect()
connected = True
break
except BaseException as e:
print("Error in connect!")
print("Type: %s" % str(type(e)))
print("Error message: %s" % str(e))
if not connected:
print("Cannot connect to core %s. Exiting..." % coreInfo.coreThingArn)
sys.exit(-2)
# Successfully connected to the core
if args.mode == 'both' or args.mode == 'subscribe':
myAWSIoTMQTTClient.subscribe(topic, 0, None)
time.sleep(2)
loopCount = 0
while True:
if args.mode == 'both' or args.mode == 'publish':
message = {}
message['message'] = args.message
message['sequence'] = loopCount
messageJson = json.dumps(message)
myAWSIoTMQTTClient.publish(topic, messageJson, 0)
if args.mode == 'publish':
print('Published topic %s: %s\n' % (topic, messageJson))
loopCount += 1
    time.sleep(1)
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTThingJobsClient
from AWSIoTPythonSDK.core.jobs.thingJobManager import jobExecutionTopicType
from AWSIoTPythonSDK.core.jobs.thingJobManager import jobExecutionTopicReplyType
from AWSIoTPythonSDK.core.jobs.thingJobManager import jobExecutionStatus
import threading
import logging
import time
import datetime
import argparse
import json
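# Wraps an AWSIoTMQTTThingJobsClient: sets up the job-related MQTT subscriptions, starts pending
# job executions, reports their status back to the service, and keeps simple success/reject counters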
class JobsMessageProcessor(object):
def __init__(self, awsIoTMQTTThingJobsClient, clientToken):
#keep track of this to correlate request/responses
self.clientToken = clientToken
self.awsIoTMQTTThingJobsClient = awsIoTMQTTThingJobsClient
self.done = False
self.jobsStarted = 0
self.jobsSucceeded = 0
self.jobsRejected = 0
self._setupCallbacks(self.awsIoTMQTTThingJobsClient)
def _setupCallbacks(self, awsIoTMQTTThingJobsClient):
self.awsIoTMQTTThingJobsClient.createJobSubscription(self.newJobReceived, jobExecutionTopicType.JOB_NOTIFY_NEXT_TOPIC)
self.awsIoTMQTTThingJobsClient.createJobSubscription(self.startNextJobSuccessfullyInProgress, jobExecutionTopicType.JOB_START_NEXT_TOPIC, jobExecutionTopicReplyType.JOB_ACCEPTED_REPLY_TYPE)
self.awsIoTMQTTThingJobsClient.createJobSubscription(self.startNextRejected, jobExecutionTopicType.JOB_START_NEXT_TOPIC, jobExecutionTopicReplyType.JOB_REJECTED_REPLY_TYPE)
# '+' indicates a wildcard for jobId in the following subscriptions
self.awsIoTMQTTThingJobsClient.createJobSubscription(self.updateJobSuccessful, jobExecutionTopicType.JOB_UPDATE_TOPIC, jobExecutionTopicReplyType.JOB_ACCEPTED_REPLY_TYPE, '+')
self.awsIoTMQTTThingJobsClient.createJobSubscription(self.updateJobRejected, jobExecutionTopicType.JOB_UPDATE_TOPIC, jobExecutionTopicReplyType.JOB_REJECTED_REPLY_TYPE, '+')
    # Callback invoked when a start-next request is accepted and a job execution is returned
def startNextJobSuccessfullyInProgress(self, client, userdata, message):
payload = json.loads(message.payload.decode('utf-8'))
if 'execution' in payload:
self.jobsStarted += 1
execution = payload['execution']
self.executeJob(execution)
statusDetails = {'HandledBy': 'ClientToken: {}'.format(self.clientToken)}
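            # Send the update from a separate thread: making a synchronous, ACK-waiting SDK call
            # directly inside an MQTT message callback would block the thread that delivers the ACK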
threading.Thread(target = self.awsIoTMQTTThingJobsClient.sendJobsUpdate, kwargs = {'jobId': execution['jobId'], 'status': jobExecutionStatus.JOB_EXECUTION_SUCCEEDED, 'statusDetails': statusDetails, 'expectedVersion': execution['versionNumber'], 'executionNumber': execution['executionNumber']}).start()
else:
print('Start next saw no execution: ' + message.payload.decode('utf-8'))
self.done = True
def executeJob(self, execution):
print('Executing job ID, version, number: {}, {}, {}'.format(execution['jobId'], execution['versionNumber'], execution['executionNumber']))
print('With jobDocument: ' + json.dumps(execution['jobDocument']))
def newJobReceived(self, client, userdata, message):
payload = json.loads(message.payload.decode('utf-8'))
if 'execution' in payload:
self._attemptStartNextJob()
else:
print('Notify next saw no execution')
self.done = True
def processJobs(self):
self.done = False
self._attemptStartNextJob()
def startNextRejected(self, client, userdata, message):
        print('Start next rejected: ' + message.payload.decode('utf-8'))
self.jobsRejected += 1
def updateJobSuccessful(self, client, userdata, message):
self.jobsSucceeded += 1
def updateJobRejected(self, client, userdata, message):
self.jobsRejected += 1
def _attemptStartNextJob(self):
statusDetails = {'StartedBy': 'ClientToken: {} on {}'.format(self.clientToken, datetime.datetime.now().isoformat())}
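        # As above, dispatch on a worker thread so the MQTT network thread stays free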
threading.Thread(target=self.awsIoTMQTTThingJobsClient.sendJobsStartNext, kwargs = {'statusDetails': statusDetails}).start()
def isDone(self):
return self.done
def getStats(self):
stats = {}
stats['jobsStarted'] = self.jobsStarted
stats['jobsSucceeded'] = self.jobsSucceeded
stats['jobsRejected'] = self.jobsRejected
return stats
# Read in command-line parameters
parser = argparse.ArgumentParser()
parser.add_argument("-n", "--thingName", action="store", dest="thingName", help="Your AWS IoT ThingName to process jobs for")
parser.add_argument("-e", "--endpoint", action="store", required=True, dest="host", help="Your AWS IoT custom endpoint")
parser.add_argument("-r", "--rootCA", action="store", required=True, dest="rootCAPath", help="Root CA file path")
parser.add_argument("-c", "--cert", action="store", dest="certificatePath", help="Certificate file path")
parser.add_argument("-k", "--key", action="store", dest="privateKeyPath", help="Private key file path")
parser.add_argument("-p", "--port", action="store", dest="port", type=int, help="Port number override")
parser.add_argument("-w", "--websocket", action="store_true", dest="useWebsocket", default=False,
help="Use MQTT over WebSocket")
parser.add_argument("-id", "--clientId", action="store", dest="clientId", default="basicJobsSampleClient",
help="Targeted client id")
args = parser.parse_args()
host = args.host
rootCAPath = args.rootCAPath
certificatePath = args.certificatePath
privateKeyPath = args.privateKeyPath
port = args.port
useWebsocket = args.useWebsocket
clientId = args.clientId
thingName = args.thingName
if args.useWebsocket and args.certificatePath and args.privateKeyPath:
parser.error("X.509 cert authentication and WebSocket are mutual exclusive. Please pick one.")
exit(2)
if not args.useWebsocket and (not args.certificatePath or not args.privateKeyPath):
parser.error("Missing credentials for authentication.")
exit(2)
# Port defaults
if args.useWebsocket and not args.port: # When no port override for WebSocket, default to 443
port = 443
if not args.useWebsocket and not args.port: # When no port override for non-WebSocket, default to 8883
port = 8883
# Configure logging
logger = logging.getLogger("AWSIoTPythonSDK.core")
logger.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)
# Init AWSIoTMQTTClient
myAWSIoTMQTTClient = None
if useWebsocket:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId, useWebsocket=True)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath)
else:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath, privateKeyPath, certificatePath)
# AWSIoTMQTTClient connection configuration
myAWSIoTMQTTClient.configureAutoReconnectBackoffTime(1, 32, 20)
myAWSIoTMQTTClient.configureConnectDisconnectTimeout(10) # 10 sec
myAWSIoTMQTTClient.configureMQTTOperationTimeout(10)  # 10 sec
jobsClient = AWSIoTMQTTThingJobsClient(clientId, thingName, QoS=1, awsIoTMQTTClient=myAWSIoTMQTTClient)
print('Connecting to MQTT server and setting up callbacks...')
jobsClient.connect()
jobsMsgProc = JobsMessageProcessor(jobsClient, clientId)
print('Starting to process jobs...')
jobsMsgProc.processJobs()
while not jobsMsgProc.isDone():
time.sleep(2)
print('Done processing jobs')
print('Stats: ' + json.dumps(jobsMsgProc.getStats()))
jobsClient.disconnect()
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTShadowClient
import logging
import time
import json
import argparse
class shadowCallbackContainer:
def __init__(self, deviceShadowInstance):
self.deviceShadowInstance = deviceShadowInstance
# Custom Shadow callback
def customShadowCallback_Delta(self, payload, responseStatus, token):
# payload is a JSON string ready to be parsed using json.loads(...)
# in both Py2.x and Py3.x
print("Received a delta message:")
payloadDict = json.loads(payload)
deltaMessage = json.dumps(payloadDict["state"])
print(deltaMessage)
print("Request to update the reported state...")
newPayload = '{"state":{"reported":' + deltaMessage + '}}'
self.deviceShadowInstance.shadowUpdate(newPayload, None, 5)
print("Sent.")
# Read in command-line parameters
parser = argparse.ArgumentParser()
parser.add_argument("-e", "--endpoint", action="store", required=True, dest="host", help="Your AWS IoT custom endpoint")
parser.add_argument("-r", "--rootCA", action="store", required=True, dest="rootCAPath", help="Root CA file path")
parser.add_argument("-c", "--cert", action="store", dest="certificatePath", help="Certificate file path")
parser.add_argument("-k", "--key", action="store", dest="privateKeyPath", help="Private key file path")
parser.add_argument("-p", "--port", action="store", dest="port", type=int, help="Port number override")
parser.add_argument("-w", "--websocket", action="store_true", dest="useWebsocket", default=False,
help="Use MQTT over WebSocket")
parser.add_argument("-n", "--thingName", action="store", dest="thingName", default="Bot", help="Targeted thing name")
parser.add_argument("-id", "--clientId", action="store", dest="clientId", default="ThingShadowEcho",
help="Targeted client id")
args = parser.parse_args()
host = args.host
rootCAPath = args.rootCAPath
certificatePath = args.certificatePath
privateKeyPath = args.privateKeyPath
port = args.port
useWebsocket = args.useWebsocket
thingName = args.thingName
clientId = args.clientId
if args.useWebsocket and args.certificatePath and args.privateKeyPath:
parser.error("X.509 cert authentication and WebSocket are mutual exclusive. Please pick one.")
exit(2)
if not args.useWebsocket and (not args.certificatePath or not args.privateKeyPath):
parser.error("Missing credentials for authentication.")
exit(2)
# Port defaults
if args.useWebsocket and not args.port: # When no port override for WebSocket, default to 443
port = 443
if not args.useWebsocket and not args.port: # When no port override for non-WebSocket, default to 8883
port = 8883
# Configure logging
logger = logging.getLogger("AWSIoTPythonSDK.core")
logger.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)
# Init AWSIoTMQTTShadowClient
myAWSIoTMQTTShadowClient = None
if useWebsocket:
myAWSIoTMQTTShadowClient = AWSIoTMQTTShadowClient(clientId, useWebsocket=True)
myAWSIoTMQTTShadowClient.configureEndpoint(host, port)
myAWSIoTMQTTShadowClient.configureCredentials(rootCAPath)
else:
myAWSIoTMQTTShadowClient = AWSIoTMQTTShadowClient(clientId)
myAWSIoTMQTTShadowClient.configureEndpoint(host, port)
myAWSIoTMQTTShadowClient.configureCredentials(rootCAPath, privateKeyPath, certificatePath)
# AWSIoTMQTTShadowClient configuration
myAWSIoTMQTTShadowClient.configureAutoReconnectBackoffTime(1, 32, 20)
myAWSIoTMQTTShadowClient.configureConnectDisconnectTimeout(10) # 10 sec
myAWSIoTMQTTShadowClient.configureMQTTOperationTimeout(5) # 5 sec
# Connect to AWS IoT
myAWSIoTMQTTShadowClient.connect()
# Create a deviceShadow with persistent subscription
deviceShadowHandler = myAWSIoTMQTTShadowClient.createShadowHandlerWithName(thingName, True)
shadowCallbackContainer_Bot = shadowCallbackContainer(deviceShadowHandler)
# Listen on deltas
deviceShadowHandler.shadowRegisterDeltaCallback(shadowCallbackContainer_Bot.customShadowCallback_Delta)
# Loop forever
while True:
    time.sleep(1)
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
import logging
import time
import argparse
import json
AllowedActions = ['both', 'publish', 'subscribe']
# Custom MQTT message callback
def customCallback(client, userdata, message):
print("Received a new message: ")
print(message.payload)
print("from topic: ")
print(message.topic)
print("--------------\n\n")
# Read in command-line parameters
parser = argparse.ArgumentParser()
parser.add_argument("-e", "--endpoint", action="store", required=True, dest="host", help="Your AWS IoT custom endpoint")
parser.add_argument("-r", "--rootCA", action="store", required=True, dest="rootCAPath", help="Root CA file path")
parser.add_argument("-c", "--cert", action="store", dest="certificatePath", help="Certificate file path")
parser.add_argument("-k", "--key", action="store", dest="privateKeyPath", help="Private key file path")
parser.add_argument("-p", "--port", action="store", dest="port", type=int, help="Port number override")
parser.add_argument("-w", "--websocket", action="store_true", dest="useWebsocket", default=False,
help="Use MQTT over WebSocket")
parser.add_argument("-id", "--clientId", action="store", dest="clientId", default="basicPubSub",
help="Targeted client id")
parser.add_argument("-t", "--topic", action="store", dest="topic", default="sdk/test/Python", help="Targeted topic")
parser.add_argument("-m", "--mode", action="store", dest="mode", default="both",
help="Operation modes: %s"%str(AllowedActions))
parser.add_argument("-M", "--message", action="store", dest="message", default="Hello World!",
help="Message to publish")
args = parser.parse_args()
host = args.host
rootCAPath = args.rootCAPath
certificatePath = args.certificatePath
privateKeyPath = args.privateKeyPath
port = args.port
useWebsocket = args.useWebsocket
clientId = args.clientId
topic = args.topic
if args.mode not in AllowedActions:
parser.error("Unknown --mode option %s. Must be one of %s" % (args.mode, str(AllowedActions)))
exit(2)
if args.useWebsocket and args.certificatePath and args.privateKeyPath:
parser.error("X.509 cert authentication and WebSocket are mutual exclusive. Please pick one.")
exit(2)
if not args.useWebsocket and (not args.certificatePath or not args.privateKeyPath):
parser.error("Missing credentials for authentication.")
exit(2)
# Port defaults
if args.useWebsocket and not args.port: # When no port override for WebSocket, default to 443
port = 443
if not args.useWebsocket and not args.port: # When no port override for non-WebSocket, default to 8883
port = 8883
# Configure logging
logger = logging.getLogger("AWSIoTPythonSDK.core")
logger.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)
# Init AWSIoTMQTTClient
myAWSIoTMQTTClient = None
if useWebsocket:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId, useWebsocket=True)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath)
else:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath, privateKeyPath, certificatePath)
# AWSIoTMQTTClient connection configuration
myAWSIoTMQTTClient.configureAutoReconnectBackoffTime(1, 32, 20)
myAWSIoTMQTTClient.configureOfflinePublishQueueing(-1) # Infinite offline Publish queueing
myAWSIoTMQTTClient.configureDrainingFrequency(2) # Draining: 2 Hz
myAWSIoTMQTTClient.configureConnectDisconnectTimeout(10) # 10 sec
myAWSIoTMQTTClient.configureMQTTOperationTimeout(5) # 5 sec
# Connect and subscribe to AWS IoT
myAWSIoTMQTTClient.connect()
if args.mode == 'both' or args.mode == 'subscribe':
myAWSIoTMQTTClient.subscribe(topic, 1, customCallback)
time.sleep(2)
# Publish to the same topic in a loop forever
loopCount = 0
while True:
if args.mode == 'both' or args.mode == 'publish':
message = {}
message['message'] = args.message
message['sequence'] = loopCount
messageJson = json.dumps(message)
myAWSIoTMQTTClient.publish(topic, messageJson, 1)
if args.mode == 'publish':
print('Published topic %s: %s\n' % (topic, messageJson))
loopCount += 1
    time.sleep(1)
from AWSIoTPythonSDK.MQTTLib import AWSIoTMQTTClient
import logging
import time
import argparse
import json
AllowedActions = ['both', 'publish', 'subscribe']
# Custom MQTT message callback
def customCallback(client, userdata, message):
print("Received a new message: ")
print(message.payload)
print("from topic: ")
print(message.topic)
print("--------------\n\n")
# Read in command-line parameters
parser = argparse.ArgumentParser()
parser.add_argument("-e", "--endpoint", action="store", required=True, dest="host", help="Your AWS IoT custom endpoint")
parser.add_argument("-r", "--rootCA", action="store", required=True, dest="rootCAPath", help="Root CA file path")
parser.add_argument("-c", "--cert", action="store", dest="certificatePath", help="Certificate file path")
parser.add_argument("-k", "--key", action="store", dest="privateKeyPath", help="Private key file path")
parser.add_argument("-p", "--port", action="store", dest="port", type=int, help="Port number override")
parser.add_argument("-w", "--websocket", action="store_true", dest="useWebsocket", default=False,
help="Use MQTT over WebSocket")
parser.add_argument("-id", "--clientId", action="store", dest="clientId", default="basicPubSub",
help="Targeted client id")
parser.add_argument("-t", "--topic", action="store", dest="topic", default="sdk/test/Python", help="Targeted topic")
parser.add_argument("-m", "--mode", action="store", dest="mode", default="both",
help="Operation modes: %s"%str(AllowedActions))
parser.add_argument("-M", "--message", action="store", dest="message", default="Hello World!",
help="Message to publish")
args = parser.parse_args()
host = args.host
rootCAPath = args.rootCAPath
certificatePath = args.certificatePath
privateKeyPath = args.privateKeyPath
port = args.port
useWebsocket = args.useWebsocket
clientId = args.clientId
topic = args.topic
if args.mode not in AllowedActions:
parser.error("Unknown --mode option %s. Must be one of %s" % (args.mode, str(AllowedActions)))
exit(2)
if args.useWebsocket and args.certificatePath and args.privateKeyPath:
parser.error("X.509 cert authentication and WebSocket are mutual exclusive. Please pick one.")
exit(2)
if not args.useWebsocket and (not args.certificatePath or not args.privateKeyPath):
parser.error("Missing credentials for authentication.")
exit(2)
# Port defaults
if args.useWebsocket and not args.port: # When no port override for WebSocket, default to 443
port = 443
if not args.useWebsocket and not args.port: # When no port override for non-WebSocket, default to 8883
port = 8883
# Configure logging
logger = logging.getLogger("AWSIoTPythonSDK.core")
logger.setLevel(logging.DEBUG)
streamHandler = logging.StreamHandler()
formatter = logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s')
streamHandler.setFormatter(formatter)
logger.addHandler(streamHandler)
# Init AWSIoTMQTTClient
myAWSIoTMQTTClient = None
if useWebsocket:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId, useWebsocket=True)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath)
else:
myAWSIoTMQTTClient = AWSIoTMQTTClient(clientId)
myAWSIoTMQTTClient.configureEndpoint(host, port)
myAWSIoTMQTTClient.configureCredentials(rootCAPath, privateKeyPath, certificatePath)
# AWSIoTMQTTClient connection configuration
myAWSIoTMQTTClient.configureAutoReconnectBackoffTime(1, 32, 20)
myAWSIoTMQTTClient.configureOfflinePublishQueueing(-1) # Infinite offline Publish queueing
myAWSIoTMQTTClient.configureDrainingFrequency(2) # Draining: 2 Hz
myAWSIoTMQTTClient.configureConnectDisconnectTimeout(10) # 10 sec
myAWSIoTMQTTClient.configureMQTTOperationTimeout(5) # 5 sec
# AWSIoTMQTTClient socket configuration
# import pysocks to help us build a socket that supports a proxy configuration
import socks
# set proxy arguments (for SOCKS5 proxy: proxy_type=2, for HTTP proxy: proxy_type=3)
proxy_config = {"proxy_addr":<proxy_addr>, "proxy_port":<proxy_port>, "proxy_type":<proxy_type>}
# create anonymous function to handle socket creation
socket_factory = lambda: socks.create_connection((host, port), **proxy_config)
myAWSIoTMQTTClient.configureSocketFactory(socket_factory)
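# Note: configureSocketFactory accepts any zero-argument callable that returns a connected socket,
# giving the application control over connection establishment (see CHANGELOG 1.4.7)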
# Connect and subscribe to AWS IoT
myAWSIoTMQTTClient.connect()
if args.mode == 'both' or args.mode == 'subscribe':
myAWSIoTMQTTClient.subscribe(topic, 1, customCallback)
time.sleep(2)
# Publish to the same topic in a loop forever
loopCount = 0
while True:
if args.mode == 'both' or args.mode == 'publish':
message = {}
message['message'] = args.message
message['sequence'] = loopCount
messageJson = json.dumps(message)
myAWSIoTMQTTClient.publish(topic, messageJson, 1)
if args.mode == 'publish':
print('Published topic %s: %s\n' % (topic, messageJson))
loopCount += 1
    time.sleep(1)