hexsha | size | ext | lang | max_stars_repo_path | max_stars_repo_name | max_stars_repo_head_hexsha | max_stars_repo_licenses | max_stars_count | max_stars_repo_stars_event_min_datetime | max_stars_repo_stars_event_max_datetime | max_issues_repo_path | max_issues_repo_name | max_issues_repo_head_hexsha | max_issues_repo_licenses | max_issues_count | max_issues_repo_issues_event_min_datetime | max_issues_repo_issues_event_max_datetime | max_forks_repo_path | max_forks_repo_name | max_forks_repo_head_hexsha | max_forks_repo_licenses | max_forks_count | max_forks_repo_forks_event_min_datetime | max_forks_repo_forks_event_max_datetime | avg_line_length | max_line_length | alphanum_fraction | cells | cell_types | cell_type_groups |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
d07afa8bcf831890715156c19a55ead158ee6504 | 179,422 | ipynb | Jupyter Notebook | 04 Seaborn/Seaborn_learning_03.ipynb | ketangangal/Ml_Projects | 4ee0415ce12a95b0eb2abaacf2286a84405c01a2 | [
"Apache-2.0"
] | null | null | null | 04 Seaborn/Seaborn_learning_03.ipynb | ketangangal/Ml_Projects | 4ee0415ce12a95b0eb2abaacf2286a84405c01a2 | [
"Apache-2.0"
] | null | null | null | 04 Seaborn/Seaborn_learning_03.ipynb | ketangangal/Ml_Projects | 4ee0415ce12a95b0eb2abaacf2286a84405c01a2 | [
"Apache-2.0"
] | null | null | null | 396.951327 | 103,944 | 0.920071 | [
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"df = pd.read_csv(\"C:\\\\Users\\\\Ketan Gangal\\\\OneDrive\\\\Desktop\\\\python_basic\\\\Ml\\\\00 ml_course\\\\05-Seaborn\\\\country_table.csv\")\n",
"_____no_output_____"
],
[
"df",
"_____no_output_____"
],
[
"# index and columns for heat map should be labels\ndf = df.set_index('Countries')",
"_____no_output_____"
],
[
"df",
"_____no_output_____"
],
[
"plt.figure(dpi = 200)\nsns.heatmap(df.drop(columns = 'Life expectancy'),lw = 0.5,annot = True,\n cmap = 'viridis')",
"_____no_output_____"
],
[
"plt.figure(dpi = 300)\nsns.clustermap(df.drop(columns = 'Life expectancy'),lw = 0.5,annot = True,\n cmap = 'viridis')",
"_____no_output_____"
],
[
"plt.figure(dpi = 300)\nsns.clustermap(df.drop(columns = 'Life expectancy'),lw = 0.5,annot = True,\n cmap = 'viridis',col_cluster=False)",
"_____no_output_____"
],
[
"plt.figure(dpi = 300)\nsns.clustermap(df.drop(columns = 'Life expectancy'),lw = 0.5,annot = True,\n cmap = 'viridis')",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
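The cells in the notebook above read a local CSV (`country_table.csv`) that is not part of this dump. A minimal, self-contained sketch of the same preparation steps, using a made-up table in place of the real file (the country names and column values below are illustrative, not from the dataset), shows why the country names must be moved into the index before plotting:

```python
import pandas as pd

# Made-up stand-in for the notebook's local country_table.csv
df = pd.DataFrame({
    'Countries': ['Brazil', 'Canada', 'Japan'],
    'GDP growth': [1.1, 1.9, 0.7],
    'Unemployment': [12.0, 5.7, 2.4],
    'Life expectancy': [75, 82, 84],
})

# sns.heatmap labels rows and columns from the DataFrame's index/columns,
# so the country names must leave the data and become the index
df = df.set_index('Countries')

# Drop the column on a different scale before plotting, as the notebook does
data = df.drop(columns='Life expectancy')
print(data.shape)
print(list(data.index))
```

With `data` in this shape, `sns.heatmap(data, annot=True, cmap='viridis')` would label each row with a country name, matching the figures produced in the notebook.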
d07b18428ba5900d325e65461b59d1e63cc9e2c9 | 314,408 | ipynb | Jupyter Notebook | code/visualizing_tracts.ipynb | gkiar/alzhippo | 2f1008eae61e2abe0cd2b029fa9bbc2497d48e55 | [
"MIT"
] | 1 | 2017-12-13T23:29:01.000Z | 2017-12-13T23:29:01.000Z | code/visualizing_tracts.ipynb | gkiar/alzhippo | 2f1008eae61e2abe0cd2b029fa9bbc2497d48e55 | [
"MIT"
] | null | null | null | code/visualizing_tracts.ipynb | gkiar/alzhippo | 2f1008eae61e2abe0cd2b029fa9bbc2497d48e55 | [
"MIT"
] | null | null | null | 176.138936 | 121,219 | 0.853436 | [
[
[
"# Alzhippo Pr0gress\n\n##### Possible Tasks\n- **Visualizing fibers** passing through ERC and hippo, for both ipsi and contra cxns (4-figs) (GK)\n- **Dilate hippocampal parcellations**, to cover entire hippocampus by nearest neighbour (JV)\n- **Voxelwise ERC-to-hippocampal** projections + clustering (Both)",
"_____no_output_____"
],
[
"## Visualizing fibers\n\n1. Plot group average connectome\n2. Find representative subject X (i.e. passes visual inspection match to the group)\n3. Visualize fibers with parcellation\n4. Repeat 3. on dilated parcellation\n5. If connections appear more symmetric in 4., regenerate graphs with dilated parcellation",
"_____no_output_____"
],
[
"### 1. Plot group average connectome",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport networkx as nx\nimport nibabel as nib\nimport scipy.stats as stats\nimport matplotlib.pyplot as plt\nfrom nilearn import plotting\nimport os\nimport seaborn as sns\nimport pandas\n\n%matplotlib notebook",
"_____no_output_____"
],
[
"def matrixplotter(data, log=True, title=\"Connectivity between ERC and Hippocampus\"):\n plotdat = np.log(data + 1) if log else data\n plt.imshow(plotdat)\n labs = ['ERC-L', 'Hippo-L-noise', 'Hippo-L-tau',\n 'ERC-R', 'Hippo-R-noise', 'Hippo-R-tau']\n plt.xticks(np.arange(0, 6), labs, rotation=40)\n plt.yticks(np.arange(0, 6), labs)\n plt.title(title)\n plt.colorbar()\n plt.show() ",
"_____no_output_____"
],
[
"avg = np.load('../data/connection_matrix.npy')\nmatrixplotter(np.mean(avg, axis=2))",
"_____no_output_____"
]
],
[
[
"### 2. Find representative subject",
"_____no_output_____"
]
],
[
[
"tmp = np.reshape(avg.T, (355, 36))\ntmp[0]\ncorrs = np.corrcoef(tmp)[-1]\ncorrs[corrs == 1] = 0\nbestfit = int(np.where(corrs == np.max(corrs))[0])\nprint(\"Most similar graph: {}\".format(bestfit))",
"Most similar graph: 333\n"
],
[
"dsets = ['../data/graphs/BNU1/combined_erc_hippo_labels/',\n '../data/graphs/BNU3/',\n '../data/graphs/HNU1/']\nfiles = [os.path.join(d,f) for d in dsets for f in os.listdir(d)]\ngraph_fname = files[bestfit]\ngx = nx.read_weighted_edgelist(graph_fname)\nadjx = np.asarray(nx.adjacency_matrix(gx).todense())\nmatrixplotter(adjx)\nprint(graph_fname)",
"_____no_output_____"
]
],
[
[
"**N.B.**: The fibers from the subject/session shown above were SCP'd from the following location on Compute Canada's Cedar machine by @gkiar. They are too large for a git repository, but they were downloaded to the `data/fibers/` directory from the root of this project. Please contact @gkiar if you'd like access to this file, in lieu of better public storage:\n\n> /project/6008063/gkiar/ndmg/connectomics/ndmg-d/HNU1/fibers/sub-0025444_ses-2_dwi_fibers.npz",
"_____no_output_____"
],
[
"### 3. Visualize fibers with parcellation",
"_____no_output_____"
],
[
"Because I don't have VTK/Dipy locally, this was done in Docker with the script in `./code/npz2trackviz.py` and submitted to the scheduler with `./code/npzdriver.sh`.\n\nThe command to run this in Docker, from the base directory of this project was:\n\n docker run -ti \\\n -v /Users/greg/code/gkiar/alzhippo/data/:/data \\\n -v /Users/greg/code/gkiar/alzhippo/code/:/proj \\\n --entrypoint python2.7 \\\n bids/ndmg:v0.1.0 \\\n /proj/npz2trackviz.py /data/fibers/sub-0025444_ses-2_dwi_fibers.npz /data/combined_erc_hippo_labels.nii.gz\n\nThe resulting `.trk` files were viewed locally with [TrackVis](http://www.trackvis.org/) to make the screenshot below.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
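The `connection_matrix.npy` file loaded in step 1 of the notebook above is not included here, but the representative-subject search in step 2 (flatten each connectome, correlate, take the best match) can be sketched with random data standing in for the real 6×6×355 array. This is a variant that correlates each subject explicitly against the group mean rather than against the last row of the stacked matrix:

```python
import numpy as np

rng = np.random.default_rng(42)

# Random stand-in for the notebook's 6 x 6 x n_subjects connectome stack
conn = rng.random((6, 6, 10))

# Group-average connectome across subjects
group_mean = conn.mean(axis=2)

# Flatten each subject's 6x6 matrix to a 36-vector and correlate it
# with the flattened group average
flat = conn.reshape(-1, conn.shape[2]).T          # shape (subjects, 36)
corrs = np.array([np.corrcoef(v, group_mean.ravel())[0, 1] for v in flat])

best = int(np.argmax(corrs))
print('Most similar graph:', best)
```

The subject whose flattened connectome correlates best with the group average is then used for the fiber visualization steps that follow.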
d07b1df7af012c4983f405e07d0e3084553b34ff | 15,269 | ipynb | Jupyter Notebook | materials/Module 1/0_intro.ipynb | alortiz27/Python-669 | 707b2fc42b8536646ae4160f5d2521dece7da11f | [
"MIT"
] | 5 | 2019-01-14T20:43:15.000Z | 2022-01-23T06:56:46.000Z | materials/Module 1/0_intro.ipynb | alortiz27/Python-669 | 707b2fc42b8536646ae4160f5d2521dece7da11f | [
"MIT"
] | null | null | null | materials/Module 1/0_intro.ipynb | alortiz27/Python-669 | 707b2fc42b8536646ae4160f5d2521dece7da11f | [
"MIT"
] | 3 | 2021-07-06T17:28:58.000Z | 2022-01-23T06:56:48.000Z | 61.817814 | 835 | 0.684655 | [
[
[
"# Course introduction",
"_____no_output_____"
],
[
"## A. Overview",
"_____no_output_____"
],
[
"### Am I ready to take this course?\n\nYes. Probably. Some programming experience will help, but is not required. If you have no programming experience, I strongly encourage you to go through the first handful of modules on the [Codecademy Python course](https://www.codecademy.com/learn/learn-python) as soon as possible. While that course utilizes Python 2, we will be using Python 3 in our course here. BUT...many of the basics are identical between the two versions. \n\nThere are <b>a lot</b> of online resources that you can use to supplement things we learn in class. Some examples are:\n\n * [python.org Tutorial](https://docs.python.org/3/tutorial/index.html)\n * [Learn Python](https://www.learnpython.org/)\n * [Google](http://google.com) (Just type in your question and follow the first [stackoverflow](http://stackoverflow.com) link. This is surprisingly effective; do this first.)",
"_____no_output_____"
],
[
"### What computational resources do I need for class?\n\nYou will need a laptop that will provide you access to the course (i.e. internet access) and a Python environment to follow along. ",
"_____no_output_____"
],
[
"### How is this for geosciences specifically?\n\nThe goal of this class is to provide information for all fields in geoscience. To that end, I will try to cover topics from geology, geography, atmospheric sciences, and oceanography. Specifically, I will focus on 1D timeseries, and 2D geospatial (i.e., on a map) analysis. If you have any topics you would like to cover, please let me know, and I will do my best to accommodate.",
"_____no_output_____"
],
[
"## Class setup\n\n### Class format\n\nWe will go through course materials during class time. You should bring a computer to class so that you can follow along and participate in exercises. Also, course materials are interactive, so you can learn by running code snippets as we go and asking questions. Much like learning a new spoken language, hands-on coding is one the <b>best</b> ways to learn a new language. \n\n\n### Course materials\n\nThe course materials are available in the [class repository](https://github.com/snifflesnrumjum/python4geosciences). They are in the form of [Jupyter notebooks](http://jupyter.org/). More information on notebooks in the next section.\n\nYou'll do your work either on your own computer, in a Google Colab notebook, or through the VOAL provided by Texas A&M University. To access the VOAL when off campus, you need to first set up a VPN connection. Set this up for your computer by visiting `https://connect.tamu.edu` and follow instructions there. You'll need to sign in with your NetID, and click on the little blue link that says \"AnyConnect VPN\" if and when you find that \"Web-based installation was unsuccessful\" to install Cisco AnyConnect (you will no longer use the web-based installer after this). When you open the Cisco application on your computer, you will need to fill in \"connect.tamu.edu\" in the little box, then use your NetID and university password to connect. Then you can run this application to use your computer as if you are on campus.\n\n\n### Course textbook\n\nThere is no textbook for the course. But if you'd like an outside resource, here are three recommendations:\n\n1. Learning Python by Mark Lutz (available electronically through TAMU Library http://library.tamu.edu/)\n2. Beginning Python by Magnus Lie Hetland (available electronically through TAMU Library http://library.tamu.edu/)\n3. Allen Downey has written a number of books on Python and related scientific subjects. 
And as a bonus, they are free (digital versions): http://greenteapress.com/wp/. In particular you would want to check out Think Python (2nd edition).\n ",
"_____no_output_____"
],
[
"## B. Jupyter notebooks\n\nThis file format makes it easy to seamlessly combine text and code. The text can be plain or formatted with [Markdown](https://daringfireball.net/projects/markdown/). The code can be written in over 40 languages including Python, R, and Scala. Most importantly, the code can be interacted with when the notebook is opened in a local (that is, on your computer) iPython server. Alternatively, it can simply be viewed through a github repository (like [this very notebook](https://github.com/snifflesnrumjum/python4geosciences/blob/master/materials/0_intro.ipynb)) or through [nbviewer](http://nbviewer.ipython.org/).\n\nYou'll be able to run class materials (in the form of Jupyter notebooks) on your own computer, on Google Colab or the VOAL via your web browser as well as create and work on homework assignments. If you prefer, you are welcome to run Python on your own computer, but you will need to do that mostly on your own. If you go that route, I recommend using Python 3 (which we will be using in class) and a distribution from [Anaconda](https://www.anaconda.com/products/individual). \n\n\n### Create a new notebook\n\nStart up your local notebook server in your new repo and create a new Jupyter notebook from the local server page.\n\n### Choose syntax for a cell\n\nNotebooks are built of cells. Cells can have multiple built-in formats including code and Markdown for text. You can select the desired format from a dropdown menu at the top.\n\nIf you want to type words, use \"Markdown\"; if you want to write code, choose \"code\".\n\n### Move between cells\n\nTo run a given cell, type `[shift-enter]` which active in that cell. You can run all of the cells with Cell > Run all; other variations are available in that drop down menu.\n\n\n### Homework\n\nWe'll discuss homework soon and go through details. It will be in the form of Jupyter notebooks and will be submitted through the Canvas LMS.\n",
"_____no_output_____"
],
[
"---\n\nThe material below is bonus for any students that are interested in using a terminal window on their own computer for running Python. We may go through it in class.",
"_____no_output_____"
],
[
"## Command-line interface\n\nA command-line interface is a way to interact with your computer using text instead of a Graphical User Interface (GUI), a GUI being visually based with icons etc. We will use these in this class. On a Macintosh or Linux machine, this is a terminal window. On a PC this is often called a command prompt.\n\nHere are some commonly-used commands:\n\n* `cd [path]`: change directory from current location to [path]. `cd ..` can be used to move up a single directory, and `cd ../..` moves up two directories, etc.\n* `pwd`: print working directory, as in write out the current location in the terminal window.\n* `ls`: list files in current directory. `ls -l` list files in long format to include more information, `ls -a` to list all files even those that are usually not shown because the have a `.` in front, `ls -h` to show file sizes in human readable format. Flags can always be combined to use multiple options at once, as in `ls -ah` to show all files in human readable format.\n* [tab]: Tab completion. You can always push tab in the terminal window to see available options. As you have some letters entered and push tab, the options will be limited to those that fit the pattern you have started.\n* `mkdir [dirname]`: make directory called dirname.\n* `rm [filename]`: remove a file called filename. To remove a directory called dirname, use `rm -r [dirname]`.",
"_____no_output_____"
],
[
"## Short git and GitHub tutorial (optional)\n\nClass materials are available on a [GitHub](http://github.org) repository. GitHub is a way to share and access code online which has been version-controlled using git. Version control allows changes in code to be tracked over time; this is important for reproducibility, retrieving code in case of accidents, and working on code in groups. Git is one way to version control your code — other methods include subversion (svn), cvs, and mercurial. More information on this is provided below.\n\nRemember: you can always google to learn more! Google is an infinite resource that you can ask at any time of the day. Here we summarize a brief overview of how to use git. GitHub has a [cheatsheet](https://education.github.com/git-cheat-sheet-education.pdf) available.\n\nTo get changes in a file in a local version of a repository tracked, saved, and then shared with the internet (on github), do the following:\n\n* `git add` to initially tell the system to track your file and subsequently to tell the system that you want to take into account new changes to the file (you can also add more than one file in this process). Then \n* `git commit -m [commit note]` to save the changes. Then\n* `git push` to share the changes with the version of your repository on github. Now you should be able to look at your repo on github and see your updated file there.\n\n**GitHub Desktop**\n\nAfter you have made your repository on GitHub (the website), you should clone it to GitHub Desktop (which is on your local machine). (This should be very easy if you are properly signed into your github account in GitHub Desktop.) Then to get your file tracked and pushed to GitHub (the website):\n\n* While inspecting the relevant repository, any untracked files or changes to tracked files are shown in the middle window (it is white with horizontal lines). To do the equivalent of `git add`, you should check the box of the file.\n* To commit your changes, fill out the form at the bottom of the same window. One short window leaves space for a \"Summary\" of your changes, and if you have more to say you can put it in the \"Description\" box.\n* To push your local changes out to GitHub online, use the Sync button on the upper right hand corner of the window. As a side note, this sync button is also how you should pull down changes to a repository you are following (the equivalent of `git pull`).\n\n\nNote that you do not want to have a directory covered by two git repositories. So for file structure for this class, for example, you might want to have one directory for the class (\"Python_x89\") which contains two version-controlled subdirectories: the course materials (python4geosciences) and your homework repository (\"homework\"). That will keep everything properly separated. \n\n\n\n[Git as explained by XKCD](https://xkcd.com/1597/)\n\n### `git status`\n\nType this in a git-monitored subdirectory on your computer to see the status of files that are under git version control and which files are not being monitored.\n\n### `git add`\n\nUse this to add local files to your git repository. `git add [filename]`.\n\n### `git commit`\n\nUse this to save to your repository local file changes. First you need to add the file with `git add`, then you can `git commit -m [commit message]`, where the commit message is a concise and useful note to explain what changes you have made. You may skip the `git add` step if you would like to commit at once all changes that have been made (you can see what changes would be committed by first consulting `git status`) with `git commit -am [commit message]`. The `-a` flag stands for \"all\" as in commit all changes.\n\n### `git push`\n\nTo move your local changes to github so that they are saved and also for sharing with others, you need to `git push`. After making whatever commits you want to make, you can run this command to finalize your changes on github.\n\n### `git merge`\n\nIf changes have been made in multiple places (say, by another person working on the same code base, or by yourself on two different machines) between `git push`es, git will try to merge if possible — which it will do if the changes don't overlap and therefore don't require your input. If the changes do overlap, you will have to merge the two versions of the code together. You probably won't need to do this in this class.\n\n\n### Get set up with GitHub\n\nYou'll need an account on github if you don't already have one, and to have git installed on your computer. We will interact with github through a terminal window in class and through the website. You may also download the [GitHub Desktop application](https://desktop.github.com/) to use if you prefer, though we won't be able to help you much with the details of it.\n\n### Create a new repo\n\nIn your account on the github webpage, click on the Repositories tab and then the green \"New\" button in the upper right. You'll need to keep this repo public. After creating this new repo, follow the instructions on the next page for quick setup or to create a new repository on the command line. This makes it so that you have the repo both on github and on your local machine. \n\n### Clone a repo\n\nIn a terminal window in the location you want to save the repo materials, type: `git clone [repo address, e.g. https://github.com/snifflesnrumjum/python4geosciences]`.",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
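The `git add` / `git commit` cycle described in the tutorial above can be run end-to-end in any empty directory. This is a hedged sketch of only the local part of the workflow (the repository name, file, identity, and commit message are all made up; `git push` is omitted because it needs a remote and credentials):

```shell
set -e
mkdir demo_repo && cd demo_repo
git init -q                               # start a new local repository
git config user.name "Student"            # local identity so commit works
git config user.email "student@example.com"
echo "print('hello')" > hello.py
git status --short                        # hello.py shows as untracked (??)
git add hello.py                          # tell git to track the file
git commit -q -m "Add hello script"       # save the change locally
git log --oneline                         # the new commit appears in history
cd ..                                     # return to the parent directory
```

After `git remote add origin [repo address]`, a `git push` would publish this commit to GitHub, completing the cycle described above.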
d07b29fb3534ebf561aa66d0b69940c79648171c | 58,949 | ipynb | Jupyter Notebook | courses/modsim2018/tasks/Tasks_DuringLecture18/BMC-master/notebooks/Optimization.ipynb | raissabthibes/bmc | 840800fb94ea3bf188847d0771ca7197dfec68e3 | [
"MIT"
] | null | null | null | courses/modsim2018/tasks/Tasks_DuringLecture18/BMC-master/notebooks/Optimization.ipynb | raissabthibes/bmc | 840800fb94ea3bf188847d0771ca7197dfec68e3 | [
"MIT"
] | null | null | null | courses/modsim2018/tasks/Tasks_DuringLecture18/BMC-master/notebooks/Optimization.ipynb | raissabthibes/bmc | 840800fb94ea3bf188847d0771ca7197dfec68e3 | [
"MIT"
] | null | null | null | 62.182489 | 17,068 | 0.715161 | [
[
[
"# Optimization\n\n> Marcos Duarte \n> Laboratory of Biomechanics and Motor Control ([http://demotu.org/](http://demotu.org/)) \n> Federal University of ABC, Brazil",
"_____no_output_____"
],
[
"<div style=\"text-align: right\">\n<i>If there occur some changes in nature, the amount of action necessary for this change must be as small as possible.</i> \n<br>Maupertuis (sec XVIII)\n</div>\n \n**Optimization is the process of finding the best value from possible alternatives with regard to a certain criterion** ([Wikipedia](http://en.wikipedia.org/wiki/Mathematical_optimization)). \n\nTypically, such best value is the value that maximizes or minimizes the criterion. In this context, to solve a (mathematical) optimization problem is to find the maximum or minimum (a.k.a., a stationary point) of a function (and we can use maximum or minimum interchangeably because the maximum of a function is the minimum of the negative of that function). \nTo solve an optimization problem, we first have to model the problem and define the objective, the variables, and the constraints of the problem. In optimization, these terms are usually defined as:\n\n1. Objective function (or also, cost, loss, utility, or fitness function): a function describing what we want to optimize. \n2. Design variable(s): variables that will be manipulated to optimize the cost function. \n3. Constraint functions: a set of constraints, equalities or inequalities that constrains the possible solutions to possible values of the design variables (candidate solutions or feasible solutions or feasible set).\n\nA feasible solution that minimizes (or maximizes) the objective function is called an optimal solution.\n\nThe optimization problem is the calculation of the minimum or maximum values of an objective function over a set of **unknown** possible values of the design variables. \nEven in the case of a finite number of possible values of the objective function and design variables (e.g., after discretization and a manual or a grid search), in general the evaluation of the objective function is computationally expensive and should be avoided. \nOf note, even if there is no other option, a random search is in fact more efficient than a manual or a grid search! See [Bergstra, Bengio (2012)](http://jmlr.csail.mit.edu/papers/volume13/bergstra12a/bergstra12a.pdf).\n\nA typical problem of optimization: [Knapsack problem](https://en.wikipedia.org/wiki/Knapsack_problem).\n\nRead more about that in [Introduction to Optimization](http://neos-guide.org/content/optimization-introduction) from the [NEOS Guide](http://neos-guide.org/).",
"_____no_output_____"
],
[
"## Some jargon in mathematical optimization\n\n - **Linear versus nonlinear optimization**: linear optimization refers to when the objective function and the constraints are linear mathematical functions. When the objective function is linear, an optimal solution is always found at the constraint boundaries and a local optimum is also a global optimum. See [Wikipedia 1](https://en.wikipedia.org/wiki/Linear_programming) and [Wikipedia 2](https://en.wikipedia.org/wiki/Nonlinear_programming). \n - **Constrained versus unconstrained optimization**: in unconstrained optimization there are no constraints on the design variables; constrained optimization restricts them with equalities or inequalities. \n - **Convex optimization**: the field of optimization that deals with finding the minimum of convex functions (or the maximum of concave functions) over a convex constraint set. The convexity of a function facilitates the optimization because a local minimum must be a global minimum and first-order conditions (the first derivatives) are sufficient conditions for finding the optimal solution. Note that although convex optimization is a particular case of nonlinear optimization, it is a relatively simple optimization problem, with robust and mature methods of solution. See [Wikipedia](https://en.wikipedia.org/wiki/Convex_optimization). \n - **Multivariate optimization**: optimization of a function of several variables.\n - **Multimodal optimization**: optimization of a function with several local minima to find the multiple (locally) optimal solutions, as opposed to a single best solution. \n - **Multi-objective optimization**: optimization involving more than one objective function to be optimized simultaneously. \n - **Optimal control**: finding a control law for a given system such that a certain optimality criterion is achieved. See [Wikipedia](https://en.wikipedia.org/wiki/Optimal_control). \n - **Quadratic programming**: optimization of a quadratic function subject to linear constraints. See [Wikipedia](https://en.wikipedia.org/wiki/Quadratic_programming). \n - **Simplex algorithm**: linear optimization algorithm that begins at a starting vertex and moves along the edges of the polytope (the feasible region) until it reaches the vertex of the optimum solution. See [Wikipedia](https://en.wikipedia.org/wiki/Simplex_algorithm). ",
"_____no_output_____"
],
[
"## Maxima and minima\n\nIn mathematics, the maximum and minimum of a function are the largest and smallest values that the function takes at a point either within a neighborhood (local) or on the function entire domain (global) ([Wikipedia](http://en.wikipedia.org/wiki/Maxima_and_minima)). \n\nFor a function of one variable, if the maximum or minimum of a function is not at the limits of the domain and if at least the first and second derivatives of the function exist, a maximum and minimum can be found as the point where the first derivative of the function is zero. If the second derivative on that point is positive, then it's a minimum, if it is negative, it's a maximum.\n\n<div class='center-align'><figure><img src='./../images/maxmin.png' width=350 alt='minima and maxima of a function'/> <figcaption><center><i>Figure. Maxima and minima of a function of one variable.</i></center></figcaption> </figure></div>\n\n - Note that the requirement that the second derivative on the extremum to be positive for a minimum or negative for a maximum is sufficient but not a necessary condition. For instance, the function $f(x)=x^4$ has an extremum in $x=0$ since $f'(x)=4x^3$ and $f'(0)=0$, but its second derivative at $x=0$ is also zero: $f''(x)=12x^2;\\: f''(0)=0$. In fact, the requirement is that the first non-zero derivative on that point should be positive for a minimum or negative for a maximum: $f''''(0)=24$; the extremum is a minimum.",
"_____no_output_____"
],
[
"Let's now apply optimization to solve a problem with a univariate function.",
"_____no_output_____"
]
],
[
[
"# import Python libraries\nimport numpy as np\n%matplotlib inline\nimport matplotlib\nimport matplotlib.pyplot as plt\nimport sympy as sym\nfrom sympy.plotting import plot\nimport pandas as pd\nfrom IPython.display import display\nfrom IPython.core.display import Math",
"_____no_output_____"
]
],
[
[
"### Example 1: Maximum volume of a cardboard box\n\nWe want to make a box from a square cardboard with side $a$ such that its volume should be maximum. \nWhat is the optimal distance where the square cardboard should be cut and folded to make a box with maximum volume?\n\n<div class='center-align'><figure><img src='./../images/box.png' width=450 alt='box optimization'/> <figcaption><center><i>Figure. A box to be made from a cardboard such that its volume should be maximum. Where we should cut?</i></center></figcaption> </figure></div>",
"_____no_output_____"
],
[
"If the distance where to cut and fold the cardboard is $b$, see figure above, the volume of the box will be:\n\n\\begin{equation}\n\\begin{array}{l l}\nV(b) = b(a-2b)(a-2b) \\\\\n\\\\\nV(b) = a^2b - 4ab^2 + 4b^3\n\\end{array}\n\\label{}\n\\end{equation}\n\nIn the context of optimization: \n**The expression for $V$ is the cost function, $b$ is the design variable, and the constraint is that feasible values of $b$ are in the interval $]0, \\dfrac{a}{2}[$, i.e., $b>0$ and $b<\\dfrac{a}{2}$.** \n\nThe first and second derivatives of $V$ w.r.t. $b$ are:\n\n\\begin{equation}\n\\begin{array}{l l}\n\\dfrac{\\mathrm{d}V}{\\mathrm{d}b} = a^2 - 8ab + 12b^2 \\\\\n\\\\\n\\dfrac{\\mathrm{d}^2 V}{\\mathrm{d}b^2} = - 8a + 24b\n\\end{array}\n\\label{}\n\\end{equation}\n\nWe have to find the values for $b$ where the first derivative of $V$ is zero (the extrema) and then use the expression for the second derivative of $V$ to find whether each of these extrema is a minimum (positive value) or a maximum (negative value). \nLet's use Sympy for that:",
"_____no_output_____"
]
],
[
[
"a, b = sym.symbols('a b')\nV = b*(a - 2*b)*(a - 2*b)\nVdiff = sym.expand(sym.diff(V, b))\nroots = sym.solve(Vdiff, b)\ndisplay(Math(sym.latex('Roots:') + sym.latex(roots)))",
"_____no_output_____"
],
[
"roots",
"_____no_output_____"
]
],
[
[
"Discarding the solution $b=\\dfrac{a}{2}$ (where $V=0$, which is a minimum), $b=\\dfrac{a}{6}$ results in the maximum volume. \nWe can check that by plotting the volume of the cardboard box for $a=1$ and $b: [0,\\:0.5]$:",
"_____no_output_____"
]
],
[
[
"plot(V.subs({a: 1}), (b, 0, .5), xlabel='b', ylabel='V')\ndisplay(Math(sym.latex('V_{a=1}^{max}(b=%s)=%s'\n %(roots[0].evalf(n=4, subs={a: 1}), V.evalf(n=3, subs={a: 1, b: roots[0]})))))",
"_____no_output_____"
]
],
[
[
" - Note that although the problem above is a case of nonlinear constrained optimization, because the objective function is univariate, well-conditioned and the constraints are linear inequalities, the optimization is simple. Unfortunately, this is seldom the case.",
"_____no_output_____"
],
[
"## Curve fitting as an optimization problem\n\nCurve fitting is the process of fitting a model, expressed in terms of a mathematical function, that depends on adjustable parameters to a series of data points and once adjusted, that curve has the best fit to the data points.\n\nThe general approach to the fitting procedure involves the definition of a merit function that measures the agreement between data and model. The model parameters are then adjusted to yield the best-fit parameters as a problem of minimization (an optimization problem, where the merit function is the cost function). \n\nA classical solution, termed least-squares fitting, is to find the best fit by minimizing the sum of the squared differences between data points and the model function (the sum of squared residuals as the merit function).\n\nFor more on curve fitting see the video below and the notebook [Curve fitting](http://nbviewer.jupyter.org/github/demotu/BMC/blob/master/notebooks/CurveFitting.ipynb).",
"_____no_output_____"
]
],
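To make the least-squares idea concrete, here is a minimal sketch that fits a hypothetical straight-line model $y = ax + b$ by minimizing the sum of squared residuals via the closed-form normal equations (the function name `fit_line` and the sample data are illustrative, not from the companion notebook):

```python
def fit_line(x, y):
    """Least-squares fit of y = a*x + b via the normal equations.

    Minimizing the sum of squared residuals sum((y_i - a*x_i - b)^2)
    leads to a 2x2 linear system with the closed-form solution below.
    """
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    b = (sy - a * sx) / n                          # intercept
    return a, b

# Noise-free points on the line y = 2x + 1 recover the parameters exactly
a, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])
```

For noisy data and nonlinear models, `scipy.optimize.curve_fit` performs this same merit-function minimization in general form.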
[
[
"from IPython.display import YouTubeVideo\nYouTubeVideo('Rxp7o7_RxII', width=480, height=360, rel=0)",
"_____no_output_____"
]
],
[
[
"## Gradient descent\n\nGradient descent is a first-order iterative optimization algorithm for finding the minimum of a function ([Wikipedia](https://en.wikipedia.org/wiki/Gradient_descent)). \nIn the gradient descent algorithm, a local minimum of a function is found by starting from an initial point and taking steps proportional to the negative of the derivative (gradient) of the function at the current point, checking at each step whether the function value at the new point is lower than at the previous point, until a local minimum is reached (hopefully). \n\nIt follows that, if\n\n\\begin{equation}\nx_{n+1} = x_n - \\gamma \\nabla f(x_n)\n\\label{}\n\\end{equation}\n\nfor $\\gamma$ small enough, then $f(x_{n}) \\geq f(x_{n+1})$.\n\nThis process is repeated iteratively until the step size (which is proportional to the gradient!) is below a required precision (hopefully the sequence $x_{n}$ converges to the desired local minimum).",
"_____no_output_____"
],
[
"### Example 2: Minimum of a function by gradient descent\n\nFrom https://en.wikipedia.org/wiki/Gradient_descent: \nCalculate the minimum of $f(x)=x^4-3x^3+2$.",
"_____no_output_____"
]
],
[
[
"# From https://en.wikipedia.org/wiki/Gradient_descent\n# The local minimum of $f(x)=x^4-3x^3+2$ is at x=9/4\n\ncur_x = 6 # The algorithm starts at x=6\ngamma = 0.01 # step size multiplier\nprecision = 0.00001\nstep_size = 1 # initial step size\nmax_iters = 10000 # maximum number of iterations\niters = 0 # iteration counter\n\n\nf = lambda x: x**4 - 3*x**3 + 2 # lambda function for f(x)\ndf = lambda x: 4*x**3 - 9*x**2 # lambda function for the gradient of f(x)\n\nwhile (step_size > precision) & (iters < max_iters):\n prev_x = cur_x\n cur_x -= gamma*df(prev_x)\n step_size = abs(cur_x - prev_x)\n iters+=1\n\nprint('True local minimum at {} with function value {}.'.format(9/4, f(9/4)))\nprint('Local minimum by gradient descent at {} with function value {}.'.format(cur_x, f(cur_x)))",
"_____no_output_____"
]
],
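The same local minimum can be cross-checked with a reusable sketch of the algorithm. The stopping criterion here is on the gradient magnitude instead of the step size; `gradient_descent`, its tolerances, and the iteration cap are illustrative choices, not part of the original cell:

```python
def gradient_descent(df, x0, gamma=0.01, tol=1e-8, max_iters=100_000):
    """Minimal 1-D gradient descent: step against the gradient until it is ~0."""
    x = x0
    for _ in range(max_iters):
        g = df(x)
        if abs(g) < tol:  # gradient ~0 -> (hopefully) a local minimum
            break
        x -= gamma * g
    return x

# f(x) = x^4 - 3x^3 + 2 has its local minimum at x = 9/4 = 2.25
x_min = gradient_descent(lambda x: 4*x**3 - 9*x**2, x0=6)
```

Note that stopping on the gradient magnitude directly tests the first-order optimality condition, whereas the step-size criterion in the cell above only tests it indirectly (the step is proportional to the gradient).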
[
[
"## Multivariate optimization\n \nWhen there is more than one design variable (the cost function depends on more than one variable), it's a multivariate optimization. The general idea of finding minimum and maximum values where the derivatives are zero still holds for a multivariate function. The second derivative of a multivariate function can be described by the Hessian matrix:\n\n\\begin{equation}\n\\mathbf{H} = \\begin{bmatrix}{\\dfrac {\\partial ^{2}f}{\\partial x_{1}^{2}}}&{\\dfrac {\\partial ^{2}f}{\\partial x_{1}\\,\\partial x_{2}}}&\\cdots &{\\dfrac {\\partial ^{2}f}{\\partial x_{1}\\,\\partial x_{n}}}\\\\[2.2ex]{\\dfrac {\\partial ^{2}f}{\\partial x_{2}\\,\\partial x_{1}}}&{\\dfrac {\\partial ^{2}f}{\\partial x_{2}^{2}}}&\\cdots &{\\dfrac {\\partial ^{2}f}{\\partial x_{2}\\,\\partial x_{n}}}\\\\[2.2ex]\\vdots &\\vdots &\\ddots &\\vdots \\\\[2.2ex]{\\dfrac {\\partial ^{2}f}{\\partial x_{n}\\,\\partial x_{1}}}&{\\dfrac {\\partial ^{2}f}{\\partial x_{n}\\,\\partial x_{2}}}&\\cdots &{\\dfrac {\\partial ^{2}f}{\\partial x_{n}^{2}}}\n\\end{bmatrix}\n\\label{}\n\\end{equation}\n\nLet's see now a classical problem in biomechanics where optimization is useful and there is more than one design variable.",
"_____no_output_____"
],
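A Hessian can also be approximated numerically. The central finite-difference sketch below is an illustrative plain-Python helper (it is not used elsewhere in this notebook):

```python
def hessian(f, p, h=1e-4):
    """Approximate the Hessian of a scalar function f at point p.

    Uses the central finite-difference formula
    d2f/(dxi dxj) ~ (f(p+hi+hj) - f(p+hi-hj) - f(p-hi+hj) + f(p-hi-hj)) / (4h^2).
    """
    n = len(p)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            pp, pm, mp, mm = (list(p) for _ in range(4))
            pp[i] += h; pp[j] += h
            pm[i] += h; pm[j] -= h
            mp[i] -= h; mp[j] += h
            mm[i] -= h; mm[j] -= h
            H[i][j] = (f(pp) - f(pm) - f(mp) + f(mm)) / (4 * h * h)
    return H

# For f(x, y) = x^2 + 3y^2 the exact Hessian is [[2, 0], [0, 6]]
H = hessian(lambda v: v[0]**2 + 3*v[1]**2, [1.0, 1.0])
```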
[
"## The distribution problem in biomechanics\n\nUsing the inverse dynamics approach in biomechanics, we can determine the net force and torque acting on a joint if we know the external forces on the segments and the kinematics and inertial properties of the segments. But with this approach we are unable to determine the individual muscle forces that created such torque, as expressed in the following equation:\n\n\\begin{equation}\nM_{total} = M_1 + M_2 + \\dots + M_n = r_1F_1 + r_2F_2 + \\dots + r_nF_n\n\\label{}\n\\end{equation}\n\nwhere $r_i$ is the moment arm of the force $F_i$ that generates a torque $M_i$, a portion of the (known) total torque $M_{total}$. \n\nEven if we know the moment arm of each muscle (e.g., from cadaveric data or from image analysis), the equation above has $n$ unknowns. Because there is more than one muscle that potentially created such torque, there are more unknowns than equations, and the problem is underdetermined. So, the problem is to find how the torque is distributed among the muscles of that joint.\n\nOne solution is to consider that we (biological systems) optimize our effort in order to minimize energy expenditure, stresses on our tissues, fatigue, etc. The principle of least action, stated in the opening of this text, is an allusion that optimization might be ubiquitous in nature. \nWith this rationale, let's solve the distribution problem in biomechanics using optimization and find the minimum force of each muscle necessary to complete a given task.\n\nThe following cost functions have been proposed to solve the distribution problem in biomechanics:\n\n\\begin{equation}\n\\begin{array}{l l}\n\\displaystyle\\sum_{i=1}^N F_i \\quad &\\text{e.g., Seireg and Arvikar (1973)}\n\\\\\n\\displaystyle\\sum_{i=1}^N F_i^2 \\quad &\n\\\\\n\\displaystyle\\sum_{i=1}^N \\left(\\dfrac{F_i}{pcsa_i}\\right)^2 \\quad &\\text{e.g., Crowninshield and Brand (1981)}\n\\\\\n\\displaystyle\\sum_{i=1}^N \\left(\\dfrac{F_i}{M_{max,i}}\\right)^3 \\quad &\\text{e.g., Herzog (1987)}\n\\end{array}\n\\label{}\n\\end{equation}\n\nwhere $pcsa_i$ is the physiological cross-sectional area of muscle $i$ and $M_{max,i}$ is the maximum torque muscle $i$ can produce. \nEach muscle force $F_i$ is a design variable and the following constraints must be satisfied:\n\n\\begin{equation}\n\\begin{array}{l l}\n0 \\leq F_i \\leq F_{max}\n\\\\\n\\displaystyle\\sum_{i=1}^N r_i \\times F_i = M\n\\end{array}\n\\label{}\n\\end{equation}\n\nLet's apply this concept to solve a distribution problem in biomechanics.",
"_____no_output_____"
],
[
"### Muscle force estimation\n\nConsider the following main flexors of the elbow joint (see figure below): biceps long head, biceps short head, and brachialis. Suppose that the elbow net joint torque determined using inverse dynamics is 20 Nm (flexor). How much did each of these muscles contribute to the net torque?\n\n<div class='center-align'><figure><img src='./../images/elbowflexors.png' alt='Elbow flexors'/> <figcaption><center><i>Figure. A view in OpenSim of the arm26 model showing three elbow flexors (Biceps long and short heads and Brachialis).</i></center></figcaption> </figure></div>\n\nFor the optimization, we will need experimental data for the moment arm, maximum moment, and *pcsa* of each muscle. Let's import these data from the OpenSim arm26 model:",
"_____no_output_____"
]
],
[
[
"# time elbow_flexion BIClong BICshort BRA\nr_ef = np.loadtxt('./../data/r_elbowflexors.mot', skiprows=7)\nf_ef = np.loadtxt('./../data/f_elbowflexors.mot', skiprows=7)",
"_____no_output_____"
]
],
[
[
"The maximum isometric forces of these muscles are defined in the arm26 model as: Biceps long head: 624.3 N, Biceps short head: 435.56 N, and Brachialis: 987.26 N. Let's compute the maximum torques that each muscle could produce considering a static situation at the different elbow flexion angles:",
"_____no_output_____"
]
],
[
[
"m_ef = r_ef*1\nm_ef[:, 2:] = r_ef[:, 2:]*f_ef[:, 2:]",
"_____no_output_____"
]
],
[
[
"And let's visualize these data:",
"_____no_output_____"
]
],
[
[
"labels = ['Biceps long head', 'Biceps short head', 'Brachialis']\nfig, ax = plt.subplots(nrows=1, ncols=3, sharex=True, figsize=(10, 4))\nax[0].plot(r_ef[:, 1], r_ef[:, 2:])\n#ax[0].set_xlabel('Elbow angle $(\\,^o)$')\nax[0].set_title('Moment arm (m)')\nax[1].plot(f_ef[:, 1], f_ef[:, 2:])\nax[1].set_xlabel('Elbow angle $(\\,^o)$', fontsize=16)\nax[1].set_title('Maximum force (N)')\nax[2].plot(m_ef[:, 1], m_ef[:, 2:])\n#ax[2].set_xlabel('Elbow angle $(\\,^o)$')\nax[2].set_title('Maximum torque (Nm)')\nax[2].legend(labels, loc='best', framealpha=.5)\nax[2].set_xlim(np.min(r_ef[:, 1]), np.max(r_ef[:, 1]))\nplt.tight_layout()\nplt.show()",
"_____no_output_____"
]
],
[
[
"These data don't have the *pcsa* value of each muscle. We will estimate the *pcsa* considering that the amount of maximum muscle force generated per area is constant and equal to 50N/cm$^2$. Consequently, the *pcsa* (in cm$^2$) for each muscle is:",
"_____no_output_____"
]
],
[
[
"a_ef = np.array([624.3, 435.56, 987.26])/50 # 50 N/cm2\nprint(a_ef)",
"_____no_output_____"
]
],
[
[
"### Static versus dynamic optimization\n\nIn the context of biomechanics, we can solve the distribution problem separately for each angle (instant) of the elbow; we will refer to that as static optimization. However, there is no guarantee that when we analyze all these solutions across the range of angles, they will be the best solution overall. One reason is that static optimization ignores the time history of the muscle force. Dynamic optimization refers to the optimization over a period of time. For such, we will need to input a cost function spanning the entire period of time at once. Dynamic optimization usually has a higher computational cost than static optimization.\n\nFor now, we will solve the present problem using static optimization.",
"_____no_output_____"
],
[
"### Solution of the optimization problem\n\nFor the present case, we are dealing with a problem of minimization, multidimensional (function of several variables), nonlinear, constrained, and we can't assume that the cost function is convex. Numerical optimization is hardly a simple task. There are many different algorithms and public and commercial software for performing optimization. For instance, look at [NEOS Server](http://www.neos-server.org/neos/), a free internet-based service for solving numerical optimization problems. \nWe will solve the present problem using the [scipy.optimize](http://docs.scipy.org/doc/scipy/reference/optimize.html#module-scipy.optimize) package which provides several optimization algorithms. We will use the function `minimize`:\n\n```python\nscipy.optimize.minimize(fun, x0, args=(), method=None, jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)\n\"\"\"Minimization of scalar function of one or more variables.\"\"\"\n```\n\nNow, let's write Python functions for each cost function:",
"_____no_output_____"
]
],
[
[
"from scipy.optimize import minimize",
"_____no_output_____"
],
[
"def cf_f1(x):\n \"\"\"Cost function: sum of forces.\"\"\" \n return x[0] + x[1] + x[2]\n\ndef cf_f2(x):\n \"\"\"Cost function: sum of forces squared.\"\"\"\n return x[0]**2 + x[1]**2 + x[2]**2\n\ndef cf_fpcsa2(x, a):\n \"\"\"Cost function: sum of squared muscle stresses.\"\"\"\n return (x[0]/a[0])**2 + (x[1]/a[1])**2 + (x[2]/a[2])**2\n\ndef cf_fmmax3(x, m):\n \"\"\"Cost function: sum of cubic forces normalized by moments.\"\"\"\n return (x[0]/m[0])**3 + (x[1]/m[1])**3 + (x[2]/m[2])**3",
"_____no_output_____"
]
],
[
[
"Let's also define the Jacobian for each cost function (which is an optional parameter for the optimization):",
"_____no_output_____"
]
],
[
[
"def cf_f1d(x):\n \"\"\"Derivative of cost function: sum of forces.\"\"\"\n dfdx0 = 1\n dfdx1 = 1\n dfdx2 = 1\n return np.array([dfdx0, dfdx1, dfdx2])\n\ndef cf_f2d(x):\n \"\"\"Derivative of cost function: sum of forces squared.\"\"\"\n dfdx0 = 2*x[0]\n dfdx1 = 2*x[1]\n dfdx2 = 2*x[2]\n return np.array([dfdx0, dfdx1, dfdx2])\n\ndef cf_fpcsa2d(x, a):\n \"\"\"Derivative of cost function: sum of squared muscle stresses.\"\"\"\n dfdx0 = 2*x[0]/a[0]**2\n dfdx1 = 2*x[1]/a[1]**2\n dfdx2 = 2*x[2]/a[2]**2\n return np.array([dfdx0, dfdx1, dfdx2])\n\ndef cf_fmmax3d(x, m):\n \"\"\"Derivative of cost function: sum of cubic forces normalized by moments.\"\"\"\n dfdx0 = 3*x[0]**2/m[0]**3\n dfdx1 = 3*x[1]**2/m[1]**3\n dfdx2 = 3*x[2]**2/m[2]**3\n return np.array([dfdx0, dfdx1, dfdx2])",
"_____no_output_____"
]
],
[
[
"Let's define initial values:",
"_____no_output_____"
]
],
[
[
"M = 20 # desired torque at the elbow\niang = 69 # which will give the closest value to 90 degrees\nr = r_ef[iang, 2:]\nf0 = f_ef[iang, 2:]\na = a_ef\nm = m_ef[iang, 2:]\nx0 = f_ef[iang, 2:]/10 # far from the correct answer for the sum of torques\nprint('M =', M)\nprint('x0 =', x0)\nprint('r * x0 =', np.sum(r*x0))",
"_____no_output_____"
]
],
[
[
"Inequality constraints (such as boundaries in our problem) can be entered with the parameter `bounds` to the `minimize` function:",
"_____no_output_____"
]
],
[
[
"bnds = ((0, f0[0]), (0, f0[1]), (0, f0[2]))",
"_____no_output_____"
]
],
[
[
"Equality constraints (such as requiring that the sum of torques equal the desired torque in our problem), as well as inequality constraints, can be entered with the parameter `constraints` to the `minimize` function (and we can also opt to enter the Jacobian of these constraints):",
"_____no_output_____"
]
],
[
[
"# use this in combination with the parameter bounds:\ncons = ({'type': 'eq',\n 'fun' : lambda x, r, f0, M: np.array([r[0]*x[0] + r[1]*x[1] + r[2]*x[2] - M]), \n 'jac' : lambda x, r, f0, M: np.array([r[0], r[1], r[2]]), 'args': (r, f0, M)})",
"_____no_output_____"
],
[
"# to enter everything as constraints:\ncons = ({'type': 'eq',\n 'fun' : lambda x, r, f0, M: np.array([r[0]*x[0] + r[1]*x[1] + r[2]*x[2] - M]), \n 'jac' : lambda x, r, f0, M: np.array([r[0], r[1], r[2]]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: f0[0]-x[0],\n 'jac' : lambda x, r, f0, M: np.array([-1, 0, 0]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: f0[1]-x[1],\n 'jac' : lambda x, r, f0, M: np.array([0, -1, 0]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: f0[2]-x[2],\n 'jac' : lambda x, r, f0, M: np.array([0, 0, -1]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: x[0],\n 'jac' : lambda x, r, f0, M: np.array([1, 0, 0]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: x[1],\n 'jac' : lambda x, r, f0, M: np.array([0, 1, 0]), 'args': (r, f0, M)},\n {'type': 'ineq', 'fun' : lambda x, r, f0, M: x[2],\n 'jac' : lambda x, r, f0, M: np.array([0, 0, 1]), 'args': (r, f0, M)})",
"_____no_output_____"
]
],
[
[
"Although more verbose, if all the Jacobians of the constraints are also provided, this alternative seems to work better than passing bounds to the optimization process (smaller error in the final result and fewer iterations). \n\nGiven the characteristics of the problem, if we use the function `minimize` we are limited to the SLSQP (Sequential Least SQuares Programming) solver. \n\nFinally, let's run the optimization for the four different cost functions and find the optimal muscle forces:",
"_____no_output_____"
]
],
[
[
"f1r = minimize(fun=cf_f1, x0=x0, args=(), jac=cf_f1d,\n constraints=cons, method='SLSQP',\n options={'disp': True})",
"_____no_output_____"
],
[
"f2r = minimize(fun=cf_f2, x0=x0, args=(), jac=cf_f2d,\n constraints=cons, method='SLSQP',\n options={'disp': True})",
"_____no_output_____"
],
[
"fpcsa2r = minimize(fun=cf_fpcsa2, x0=x0, args=(a,), jac=cf_fpcsa2d,\n constraints=cons, method='SLSQP',\n options={'disp': True})",
"_____no_output_____"
],
[
"fmmax3r = minimize(fun=cf_fmmax3, x0=x0, args=(m,), jac=cf_fmmax3d,\n constraints=cons, method='SLSQP',\n options={'disp': True})",
"_____no_output_____"
]
],
[
[
"Let's compare the results for the different cost functions:",
"_____no_output_____"
]
],
[
[
"dat = np.vstack((np.around(r*100,1), np.around(a,1), np.around(f0,0), np.around(m,1)))\nopt = np.around(np.vstack((f1r.x, f2r.x, fpcsa2r.x, fmmax3r.x)), 1)\ner = ['-', '-', '-', '-',\n np.sum(r*f1r.x)-M, np.sum(r*f2r.x)-M, np.sum(r*fpcsa2r.x)-M, np.sum(r*fmmax3r.x)-M]\ndata = np.vstack((np.vstack((dat, opt)).T, er)).T\n\n# raw strings so that sequences such as '\\t' in the LaTeX labels are not read as escape characters\nrows = [r'$\\text{Moment arm}\\;[cm]$', r'$pcsa\\;[cm^2]$', r'$F_{max}\\;[N]$', r'$M_{max}\\;[Nm]$',\n r'$\\sum F_i$', r'$\\sum F_i^2$', r'$\\sum(F_i/pcsa_i)^2$', r'$\\sum(F_i/M_{max,i})^3$']\ncols = ['Biceps long head', 'Biceps short head', 'Brachialis', 'Error in M']\ndf = pd.DataFrame(data, index=rows, columns=cols)\nprint('\\nComparison of different cost functions for solving the distribution problem')\ndf",
"_____no_output_____"
]
],
[
[
"## Comments\n\nThe results show that the estimations for the muscle forces depend on the cost function used in the optimization. Which one is correct? This is a difficult question and it's dependent on the goal of the actual task being modeled. Glitsch and Baumann (1997) investigated the effect of different cost functions on the optimization of walking and running and the predicted muscles forces were compared with the electromyographic activity of the corresponding muscles of the lower limb. They found that, among the analyzed cost functions, the minimization of the sum of squared muscle stresses resulted in the best similarity with the actual electromyographic activity.\n\nIn general, one should always test different algorithms and different initial values before settling for the solution found. Downey (2011), Kitchin (2013), and Kiusalaas (2013) present more examples on numerical optimization. The [NEOS Guide](http://neos-guide.org/) is a valuable source of information on this topic and [OpenOpt](http://openopt.org/) is a good alternative software for numerical optimization in Python.",
"_____no_output_____"
],
[
"## Exercises\n\n1. Find the extrema of the function $f(x)=x^3-7.5x^2+18x-10$ analytically and determine whether each is a minimum or a maximum. \n2. Find the minimum of $f(x)=x^3-7.5x^2+18x-10$ using the gradient descent algorithm. \n3. Regarding the distribution problem for the elbow muscles presented in this text: \n a. Test different initial values for the optimization. \n b. Test other values for the elbow angle where the results are likely to change. \n \n4. In an experiment to estimate the forces of the elbow flexors, an elbow flexor moment of 10 Nm was found through inverse dynamics. \nConsider the following data for maximum force (F0), moment arm (r), and pcsa (A) of the brachialis, brachioradialis, and biceps brachii muscles: F0 (N): 1000, 250, 700; r (cm): 2, 5, 4; A (cm$^2$): 33, 8, 23, respectively (data from Robertson et al. (2013)). \n a. Use static optimization to estimate the muscle forces. \n b. Test the robustness of the results using different initial values for the muscle forces. \n c. Compare the results for different cost functions.",
"_____no_output_____"
],
[
"## References\n\n- Bergstra B, Bengio Y (2012) [Random Search for Hyper-Parameter Optimization](http://jmlr.csail.mit.edu/papers/volume13/bergstra12a/bergstra12a.pdf). Journal of Machine Learning Research, 13, 281-305. \n- Crowninshield RD, Brand RA (1981) [A physiologically based criterion of muscle force prediction in locomotion](http://www.ncbi.nlm.nih.gov/pubmed/7334039). Journal of Biomechanics, 14, 793–801. \n- Downey AB (2014) [Physical Modeling in MATLAB](http://greenteapress.com/wp/physical-modeling-in-matlab-2e/). 2nd edition. Green Tea Press. \n- Herzog W (1987) [Individual muscle force estimations using a non-linear optimal design](http://www.ncbi.nlm.nih.gov/pubmed/3682873). J Neurosci Methods, 21, 167-179. \n- Glitsch U, Baumann W (1997) [The three-dimensional determination of internal loads in the lower extremity](http://www.ncbi.nlm.nih.gov/pubmed/9456380). Journal of Biomechanics, 30, 1123–1131. \n- Kitchin J (2013) [pycse - Python Computations in Science and Engineering](http://kitchingroup.cheme.cmu.edu/pycse/). \n- Kiusalaas (2013) [Numerical methods in engineering with Python 3](http://books.google.com.br/books?id=aJkXoxxoCoUC). 3rd edition. Cambridge University Press. \n- Nigg BM and Herzog W (2006) [Biomechanics of the Musculo-skeletal System](https://books.google.com.br/books?id=hOIeAQAAIAAJ&dq=editions:ISBN0470017678). 3rd Edition. Wiley. \n- Robertson G, Caldwell G, Hamill J, Kamen G (2013) [Research Methods in Biomechanics](http://books.google.com.br/books?id=gRn8AAAAQBAJ). 2nd Edition. Human Kinetics. \n- Seireg A, Arvikar RJ (1973) [A mathematical model for evaluation of forces in lower extremeties of the musculo-skeletal system](http://www.ncbi.nlm.nih.gov/pubmed/4706941). Journal of Biomechanics, 6, 313–322, IN19–IN20, 323–326.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
d07b2a0ac5ea5d32113dea569694c16891b0d3ec | 235,719 | ipynb | Jupyter Notebook | pacote-download/02-Pandas-Data-Visualization-Exercise.ipynb | stephacastro/Coleta_dados | 1f7df4fd025f8724516b8d337c54b119cf9e552d | [
"MIT"
] | null | null | null | pacote-download/02-Pandas-Data-Visualization-Exercise.ipynb | stephacastro/Coleta_dados | 1f7df4fd025f8724516b8d337c54b119cf9e552d | [
"MIT"
] | null | null | null | pacote-download/02-Pandas-Data-Visualization-Exercise.ipynb | stephacastro/Coleta_dados | 1f7df4fd025f8724516b8d337c54b119cf9e552d | [
"MIT"
] | null | null | null | 401.565588 | 22,992 | 0.943369 | [
[
[
"___\n\n<a href='http://www.pieriandata.com'><img src='../Pierian_Data_Logo.png'/></a>\n___\n<center><em>Copyright Pierian Data</em></center>\n<center><em>For more information, visit us at <a href='http://www.pieriandata.com'>www.pieriandata.com</a></em></center>",
"_____no_output_____"
],
[
"# Pandas Data Visualization Exercises\n\nThis is just a quick exercise to review the various plots we showed earlier. Use <tt>df3.csv</tt> to replicate the following plots.\n\n<div class=\"alert alert-danger\" style=\"margin: 10px\"><strong>IMPORTANT NOTE!</strong> Make sure you don't run the cells directly above the example output shown, <br>otherwise you will end up writing over the example output!</div>",
"_____no_output_____"
]
],
[
[
"# RUN THIS CELL\nimport pandas as pd\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\ndf3 = pd.read_csv('df3.csv')\nprint(len(df3))\nprint(df3.head())",
"500\n weekday produced defective\n0 1.Monday 73 7\n1 2.Tuesday 75 10\n2 3.Wednesday 86 7\n3 4.Thursday 64 7\n4 5.Friday 70 6\n"
]
],
[
[
"So <tt>df3</tt> has 500 records and 3 columns. The data represents factory production numbers and reported numbers of defects on certain days of the week.",
"_____no_output_____"
],
[
"### 1. Recreate this scatter plot of 'produced' vs 'defective'. Note the color and size of the points. Also note the figure size. See if you can figure out how to stretch it in a similar fashion.",
"_____no_output_____"
]
],
[
[
"# 1. Recreate this scatter plot of 'produced' vs 'defective'. Note the color and size of the points.\n# Also note the figure size. See if you can figure out how to stretch it in a similar fashion.\n# CODE HERE\ndf3.plot.scatter(x='produced', y='defective', c='red', figsize=(12,3), s=20)",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 2. Create a histogram of the 'produced' column.",
"_____no_output_____"
]
],
[
[
"# 2. Create a histogram of the 'produced' column.\ndf3['produced'].plot.hist()",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 3. Recreate the following histogram of 'produced', tightening the x-axis and adding lines between bars.",
"_____no_output_____"
]
],
[
[
"# 3. Recreate the following histogram of 'produced', tightening the x-axis and adding lines between bars.\ndf3['produced'].plot.hist(edgecolor='k').autoscale(axis='x', tight=True)",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 4. Create a boxplot that shows 'produced' for each 'weekday' (hint: this is a groupby operation)",
"_____no_output_____"
]
],
[
[
"# 4. Create a boxplot that shows 'produced' for each 'weekday' (hint: this is a groupby operation)\ndf3[['weekday','produced']].boxplot(by='weekday', figsize=(12,5))",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 5. Create a KDE plot of the 'defective' column",
"_____no_output_____"
]
],
[
[
"# 5. Create a KDE plot of the 'defective' column\ndf3['defective'].plot.kde()",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 6. For the above KDE plot, figure out how to increase the linewidth and make the linestyle dashed.<br>(Note: You would usually <em>not</em> dash a KDE plot line)",
"_____no_output_____"
]
],
[
[
"# 6. For the KDE plot above, figure out how to increase the linewidth and make the linestyle dashed.\n# (Note: You would usually not dash a KDE plot line)\ndf3['defective'].plot.kde(ls='--', lw=5)",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"### 7. Create a <em>blended</em> area plot of all the columns for just the rows up to 30. (hint: use .loc)",
"_____no_output_____"
]
],
[
[
"# 7. Create a blended area plot of all the columns for just the rows up to 30. (hint: use .loc)\nax = df3.loc[0:30].plot.area(stacked=False, alpha=0.4)\nax.legend(loc=0)",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"## Bonus Challenge!\n\n<strong>Notice how the legend in our previous figure overlapped some of actual diagram.<br> Can you figure out how to display the legend outside of the plot as shown below?</strong>",
"_____no_output_____"
]
],
[
[
"ax = df3.loc[0:30].plot.area(stacked=False, alpha=0.4)\nax.legend(loc=0, bbox_to_anchor=(1.3,0.5))",
"_____no_output_____"
],
[
"# DON'T WRITE HERE",
"_____no_output_____"
]
],
[
[
"# Great Job!",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
d07b337b5f1236c3ec476b2a697032607ba6c9d4 | 6,965 | ipynb | Jupyter Notebook | algoExpert/disk_stacking/solution.ipynb | maple1eaf/learning_algorithm | a9296c083ba9b79af1be10f56365ee8c95d3aac6 | [
"MIT"
] | null | null | null | algoExpert/disk_stacking/solution.ipynb | maple1eaf/learning_algorithm | a9296c083ba9b79af1be10f56365ee8c95d3aac6 | [
"MIT"
] | null | null | null | algoExpert/disk_stacking/solution.ipynb | maple1eaf/learning_algorithm | a9296c083ba9b79af1be10f56365ee8c95d3aac6 | [
"MIT"
] | null | null | null | 27.207031 | 203 | 0.483274 | [
[
[
"# Disk Stacking\n[link](https://www.algoexpert.io/questions/Disk%20Stacking)",
"_____no_output_____"
],
[
"## My Solution",
"_____no_output_____"
]
],
[
[
"def diskStacking(disks):\n # Write your code here.\n disks.sort(key=lambda x: x[1])\n globalMaxHeight = 0\n prevDiskIdx = [None for _ in range(len(disks) + 1)]\n opt = [0 for _ in range(len(disks) + 1)]\n for i in range(len(disks)):\n opt[i + 1] = disks[i][2]\n for i in range(len(opt)):\n maxHeight = opt[i]\n for j in range(i):\n if disks[i - 1][0] > disks[j][0] \\\n and disks[i - 1][1] > disks[j][1] \\\n and disks[i - 1][2] > disks[j][2]:\n height = opt[j + 1] + disks[i - 1][2]\n if height > maxHeight:\n maxHeight = height\n prevDiskIdx[i] = j + 1\n opt[i] = maxHeight\n if maxHeight > globalMaxHeight:\n globalMaxHeight = maxHeight\n globalMaxHeightIdx = i\n\n res = []\n idx = globalMaxHeightIdx\n while prevDiskIdx[idx] != None:\n res.append(idx - 1)\n idx = prevDiskIdx[idx]\n if idx != 0:\n res.append(idx - 1)\n return [disks[res[i]] for i in reversed(range(len(res)))]",
"_____no_output_____"
],
[
"def diskStacking(disks):\n # Write your code here.\n disks.sort(key=lambda x: x[1])\n globalMaxHeight = 0\n prevDiskIdx = [None for _ in disks]\n opt = [disk[2] for disk in disks]\n for i in range(len(opt)):\n for j in range(i):\n if isStackable(disks, i, j):\n height = opt[j] + disks[i][2]\n if height > opt[i]:\n opt[i] = height\n prevDiskIdx[i] = j\n if opt[i] > globalMaxHeight:\n globalMaxHeight = opt[i]\n globalMaxHeightIdx = i\n\n res = []\n idx = globalMaxHeightIdx\n while idx is not None:\n res.append(idx)\n idx = prevDiskIdx[idx]\n return [disks[res[i]] for i in reversed(range(len(res)))]\n \ndef isStackable(disks, lower, upper):\n return disks[lower][0] > disks[upper][0] \\\n and disks[lower][1] > disks[upper][1] \\\n and disks[lower][2] > disks[upper][2]",
"_____no_output_____"
]
],
[
[
"## Expert Solution",
"_____no_output_____"
]
],
[
[
"# O(n^2) time | O(n) space\ndef diskStacking(disks):\n disks.sort(key=lambda disk:disk[2])\n heights = [disk[2] for disk in disks]\n sequences = [None for disk in disks]\n maxHeightIdx = 0\n for i in range(1, len(disks)):\n currentDisk = disks[i]\n for j in range(0, i):\n otherDisk = disks[j]\n if areValidDimensions(otherDisk, currentDisk):\n if heights[i] <= currentDisk[2] + heights[j]:\n heights[i] = currentDisk[2] + heights[j]\n sequences[i] = j\n if heights[i] >= heights[maxHeightIdx]:\n maxHeightIdx = i\n return buildSequence(disks, sequences, maxHeightIdx)\n\ndef areValidDimensions(o, c):\n return o[0] < c[0] and o[1] < c[1] and o[2] < c[2]\n\ndef buildSequence(array, sequences, currentIdx):\n sequence = []\n while currentIdx is not None:\n sequence.append(array[currentIdx])\n currentIdx = sequences[currentIdx]\n return list(reversed(sequence))",
"_____no_output_____"
]
],
[
[
"## Thoughts\n- The difference between my solutions 1 and 2 is how they handle the edge case: solution 1 treats the empty input as an edge case and stores its optimal value, 0, at `opt[0]`.\n\n- Why we need to sort first:\n - If we don't sort, the initial order is arbitrary. In an extreme situation, the largest disk of the optimal solution may sit at index 0, and our solution would not produce a correct answer.\n - If we sort first, we at least guarantee that the correct result can be built from the sorted input array.",
"_____no_output_____"
]
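A quick usage sketch exercises the dynamic-programming solution on a small input (the expert functions are reproduced here so the cell runs standalone; the sample disks are an arbitrary example):

```python
def diskStacking(disks):
    disks.sort(key=lambda disk: disk[2])  # sort by height first
    heights = [disk[2] for disk in disks]
    sequences = [None for disk in disks]
    maxHeightIdx = 0
    for i in range(1, len(disks)):
        currentDisk = disks[i]
        for j in range(0, i):
            otherDisk = disks[j]
            if areValidDimensions(otherDisk, currentDisk):
                if heights[i] <= currentDisk[2] + heights[j]:
                    heights[i] = currentDisk[2] + heights[j]
                    sequences[i] = j
        if heights[i] >= heights[maxHeightIdx]:
            maxHeightIdx = i
    return buildSequence(disks, sequences, maxHeightIdx)

def areValidDimensions(o, c):
    # every dimension of the disk below (c) must strictly exceed the one above (o)
    return o[0] < c[0] and o[1] < c[1] and o[2] < c[2]

def buildSequence(array, sequences, currentIdx):
    sequence = []
    while currentIdx is not None:
        sequence.append(array[currentIdx])
        currentIdx = sequences[currentIdx]
    return list(reversed(sequence))

# Disks are [width, depth, height]; the tallest valid stack here has height 2 + 3 + 5 = 10
stack = diskStacking([[2, 1, 2], [3, 2, 3], [2, 2, 8], [2, 3, 4], [1, 3, 1], [4, 4, 5]])
# stack == [[2, 1, 2], [3, 2, 3], [4, 4, 5]]
```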
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07b3afc6a6e0d394dd4a6e6e8d0425c1904dc97 | 5,469 | ipynb | Jupyter Notebook | data-wrangling/.ipynb_checkpoints/make_s1_s2_rdrp_reference-checkpoint.ipynb | nextstrain/seasonal-cov | 9dee4e145d003b839729b86ff9c7b74bb8483e55 | [
"MIT"
] | 4 | 2020-03-24T21:54:12.000Z | 2020-03-26T19:12:05.000Z | data-wrangling/.ipynb_checkpoints/make_s1_s2_rdrp_reference-checkpoint.ipynb | blab/seasonal-cov-adaptive-evolution | 9dee4e145d003b839729b86ff9c7b74bb8483e55 | [
"MIT"
] | null | null | null | data-wrangling/.ipynb_checkpoints/make_s1_s2_rdrp_reference-checkpoint.ipynb | blab/seasonal-cov-adaptive-evolution | 9dee4e145d003b839729b86ff9c7b74bb8483e55 | [
"MIT"
] | null | null | null | 38.244755 | 133 | 0.538124 | [
[
[
"import re\nfrom Bio import SeqIO\nfrom Bio.Seq import Seq\nfrom Bio.SeqRecord import SeqRecord\nfrom Bio.Alphabet import IUPAC\nfrom Bio.SeqFeature import SeqFeature, FeatureLocation",
"_____no_output_____"
],
[
"#first 6 aas of each domain\n#from uniprot: NL63 (Q6Q1S2), 229e(P15423), oc43 (P36334), hku1 (Q0ZME7)\n#nl63 s1 domain definition: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC2693060/\ns1_domains = {'nl63': 'FFTCNS', '229e': 'CQTTNG', 'oc43': 'AVIGDL', 'hku1': 'AVIGDF'}\ns2_domains = {'nl63': 'SSDNGI', '229e': 'IIAVQP', 'oc43': 'AITTGY', 'hku1': 'SISASY'}\n\nrdrp_domains_start = {'oc43': 'SKDTNF'}\nrdrp_domains_end = {'oc43': 'RSAVMQ'}",
"_____no_output_____"
],
[
"def write_gene_reference(gene_seq, gene_id, gene_name, gene_description, cov_type, outfile):\n gene_record = SeqRecord(gene_seq, id= gene_id, \n name= gene_name, \n description= gene_description)\n source_feature = SeqFeature(FeatureLocation(0, len(gene_seq)), type='source', \n qualifiers={'organsism':cov_type, \"mol_type\":\"genomic RNA\"}) \n gene_record.features.append(source_feature)\n cds_feature = SeqFeature(FeatureLocation(0, len(gene_seq)), type='CDS', qualifiers={'translation':gene_seq.translate()})\n gene_record.features.append(cds_feature)\n\n SeqIO.write(gene_record, outfile, 'genbank')",
"_____no_output_____"
],
[
"def make_s1_s2_reference(cov):\n spike_reference = '../'+str(cov)+'/config/'+str(cov)+'_spike_reference.gb'\n\n with open(spike_reference, \"r\") as handle:\n for record in SeqIO.parse(handle, \"genbank\"):\n nt_seq = record.seq\n aa_seq = record.seq.translate()\n \n s1_regex = re.compile(f'{s1_domains[cov]}.*(?={s2_domains[cov]})')\n s1_aa = s1_regex.search(str(aa_seq)).group()\n s1_aa_coords = [(aa.start(0), aa.end(0)) for aa in re.finditer(s1_regex, str(aa_seq))][0]\n s1_nt_coords = [s1_aa_coords[0]*3, s1_aa_coords[1]*3]\n s1_nt_seq = nt_seq[s1_nt_coords[0]: s1_nt_coords[1]]\n \n s2_regex = re.compile(f'{s2_domains[cov]}.*')\n s2_aa = s2_regex.search(str(aa_seq)).group()\n s2_aa_coords = [(aa.start(0), aa.end(0)) for aa in re.finditer(s2_regex, str(aa_seq))][0]\n s2_nt_coords = [s2_aa_coords[0]*3, s2_aa_coords[1]*3]\n s2_nt_seq = nt_seq[s2_nt_coords[0]: s2_nt_coords[1]]\n\n write_gene_reference(s1_nt_seq, record.id, str(cov)+'_S1', 'spike s1 subdomain', \n cov, '../'+str(cov)+'/config/'+str(cov)+'_s1_reference.gb')\n write_gene_reference(s2_nt_seq, record.id, str(cov)+'_S2', 'spike s2 subdomain', \n cov, '../'+str(cov)+'/config/'+str(cov)+'_s2_reference.gb')",
"_____no_output_____"
],
[
"# covs = ['oc43', '229e', 'nl63', 'hku1']\ncovs = ['229e']\nfor cov in covs:\n make_s1_s2_reference(cov)",
"_____no_output_____"
],
[
"def make_rdrp_reference(cov):\n replicase_reference = '../'+str(cov)+'/config/'+str(cov)+'_replicase1ab_reference.gb'\n\n with open(replicase_reference, \"r\") as handle:\n for record in SeqIO.parse(handle, \"genbank\"):\n nt_seq = record.seq\n aa_seq = record.seq.translate()\n \n rdrp_regex = re.compile(f'{rdrp_domains_start[cov]}.*{rdrp_domains_end[cov]}')\n rdrp_aa = rdrp_regex.search(str(aa_seq)).group()\n rdrp_aa_coords = [(aa.start(0), aa.end(0)) for aa in re.finditer(rdrp_regex, str(aa_seq))][0]\n rdrp_nt_coords = [rdrp_aa_coords[0]*3, rdrp_aa_coords[1]*3]\n rdrp_nt_seq = nt_seq[rdrp_nt_coords[0]: rdrp_nt_coords[1]]\n\n write_gene_reference(rdrp_nt_seq, record.id, str(cov)+'_rdrp', 'rna-dependent rna polymerase', \n cov, '../'+str(cov)+'/config/'+str(cov)+'_rdrp_reference.gb')",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07b4bf6f1baf18636c4d7997aea557b5d569b3a | 4,120 | ipynb | Jupyter Notebook | digital-image-processing/notebooks/edges/plot_active_contours.ipynb | sinamedialab/courses | 720a78ebf4b4fb77f57a73870480233646f9a51d | [
"MIT"
] | null | null | null | digital-image-processing/notebooks/edges/plot_active_contours.ipynb | sinamedialab/courses | 720a78ebf4b4fb77f57a73870480233646f9a51d | [
"MIT"
] | null | null | null | digital-image-processing/notebooks/edges/plot_active_contours.ipynb | sinamedialab/courses | 720a78ebf4b4fb77f57a73870480233646f9a51d | [
"MIT"
] | null | null | null | 57.222222 | 1,209 | 0.609709 | [
[
[
"%matplotlib inline",
"_____no_output_____"
]
],
[
[
"\n# Active Contour Model\n\nThe active contour model is a method to fit open or closed splines to lines or\nedges in an image [1]_. It works by minimising an energy that is in part\ndefined by the image and part by the spline's shape: length and smoothness. The\nminimization is done implicitly in the shape energy and explicitly in the\nimage energy.\n\nIn the following two examples the active contour model is used (1) to segment\nthe face of a person from the rest of an image by fitting a closed curve\nto the edges of the face and (2) to find the darkest curve between two fixed\npoints while obeying smoothness considerations. Typically it is a good idea to\nsmooth images a bit before analyzing, as done in the following examples.\n\nWe initialize a circle around the astronaut's face and use the default boundary\ncondition ``boundary_condition='periodic'`` to fit a closed curve. The default\nparameters ``w_line=0, w_edge=1`` will make the curve search towards edges,\nsuch as the boundaries of the face.\n\n.. [1] *Snakes: Active contour models*. Kass, M.; Witkin, A.; Terzopoulos, D.\n International Journal of Computer Vision 1 (4): 321 (1988).\n DOI:`10.1007/BF00133570`\n",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom skimage.color import rgb2gray\nfrom skimage import data\nfrom skimage.filters import gaussian\nfrom skimage.segmentation import active_contour\n\n\nimg = data.astronaut()\nimg = rgb2gray(img)\n\ns = np.linspace(0, 2*np.pi, 400)\nr = 100 + 100*np.sin(s)\nc = 220 + 100*np.cos(s)\ninit = np.array([r, c]).T\n\nsnake = active_contour(gaussian(img, 3),\n init, alpha=0.015, beta=10, gamma=0.001,\n coordinates='rc')\n\nfig, ax = plt.subplots(figsize=(7, 7))\nax.imshow(img, cmap=plt.cm.gray)\nax.plot(init[:, 1], init[:, 0], '--r', lw=3)\nax.plot(snake[:, 1], snake[:, 0], '-b', lw=3)\nax.set_xticks([]), ax.set_yticks([])\nax.axis([0, img.shape[1], img.shape[0], 0])\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"Here we initialize a straight line between two points, `(5, 136)` and\n`(424, 50)`, and require that the spline has its end points there by giving\nthe boundary condition `boundary_condition='fixed'`. We furthermore\nmake the algorithm search for dark lines by giving a negative `w_line` value.\n\n",
"_____no_output_____"
]
],
[
[
"img = data.text()\n\nr = np.linspace(136, 50, 100)\nc = np.linspace(5, 424, 100)\ninit = np.array([r, c]).T\n\nsnake = active_contour(gaussian(img, 1), init, boundary_condition='fixed',\n alpha=0.1, beta=1.0, w_line=-5, w_edge=0, gamma=0.1,\n coordinates='rc')\n\nfig, ax = plt.subplots(figsize=(9, 5))\nax.imshow(img, cmap=plt.cm.gray)\nax.plot(init[:, 1], init[:, 0], '--r', lw=3)\nax.plot(snake[:, 1], snake[:, 0], '-b', lw=3)\nax.set_xticks([]), ax.set_yticks([])\nax.axis([0, img.shape[1], img.shape[0], 0])\n\nplt.show()",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07b4ee30c5b453955af61c2acc26beee47efa65 | 196,306 | ipynb | Jupyter Notebook | Assignment_7/heart_failure_linear_model/HeartFali.ipynb | KyleLeePiupiupiu/CS677_Assignment | c38278e81f4e58cc6ef020fade2c075e9fc09bf7 | [
"MIT"
] | null | null | null | Assignment_7/heart_failure_linear_model/HeartFali.ipynb | KyleLeePiupiupiu/CS677_Assignment | c38278e81f4e58cc6ef020fade2c075e9fc09bf7 | [
"MIT"
] | null | null | null | Assignment_7/heart_failure_linear_model/HeartFali.ipynb | KyleLeePiupiupiu/CS677_Assignment | c38278e81f4e58cc6ef020fade2c075e9fc09bf7 | [
"MIT"
] | null | null | null | 271.515906 | 25,806 | 0.912672 | [
[
[
"import pandas as pd\nimport numpy as np\nimport sklearn.neighbors\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.neighbors import KNeighborsClassifier\nfrom sklearn.metrics import accuracy_score\nimport matplotlib.pyplot as plt\nfrom sklearn.metrics import confusion_matrix\nfrom sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nfrom sklearn.neighbors import NearestCentroid\nimport math\nfrom sklearn.linear_model import LogisticRegression\nimport seaborn as sn\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"df = pd.read_csv('./heart_failure_clinical_records_dataset.csv')\ndf = df[['creatinine_phosphokinase', 'serum_creatinine', 'serum_sodium', 'platelets', 'DEATH_EVENT']]\ndf",
"_____no_output_____"
]
],
[
[
"# Question1",
"_____no_output_____"
]
],
[
[
"df0 = df[df.DEATH_EVENT == 0]\ndf0Feature = df0[['creatinine_phosphokinase', 'serum_creatinine', 'serum_sodium', 'platelets']]\ndf1 = df[df.DEATH_EVENT == 1]\ndf1Feature = df1[['creatinine_phosphokinase', 'serum_creatinine', 'serum_sodium', 'platelets']]",
"_____no_output_____"
],
[
"# corr matrix for death_event 0\nsn.heatmap(df0Feature.corr(), annot=True)\nplt.show()",
"_____no_output_____"
],
[
"# corr matrix for death_event 1\nsn.heatmap(df1Feature.corr(), annot=True)\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Question2\n\nGroup3--> \nX = serum sodium, Y = serum creatinine",
"_____no_output_____"
]
],
[
[
"def residual(yTest, yPredict):\n temp = 0\n for (a, b) in zip(yTest, yPredict):\n temp += (a-b) * (a - b)\n return temp",
"_____no_output_____"
],
[
"zeroSet = df0[[ 'serum_creatinine', 'serum_sodium']]\noneSet = df1[[ 'serum_creatinine', 'serum_sodium']]",
"_____no_output_____"
],
[
"# for death_evetn = 0\n# simple linear regression\nx = zeroSet['serum_sodium']\ny = zeroSet['serum_creatinine']\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('linear')\n\n# quadratic\nx = zeroSet['serum_sodium']\ny = zeroSet['serum_creatinine']\ndegree = 2\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('quadratic')\n\n\n# cubic\nx = zeroSet['serum_sodium']\ny = zeroSet['serum_creatinine']\ndegree = 3\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('cubic')\n\n# GLM\nx = zeroSet['serum_sodium']\nx = np.log(x)\ny = zeroSet['serum_creatinine']\n\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', 
label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('y = alog(x) + b')\n\n# GLM\nx = zeroSet['serum_sodium']\nx = np.log(x)\ny = zeroSet['serum_creatinine']\ny = np.log(y)\n\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(np.exp(yTest), np.exp(yPredict)))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('log(y) = alog(x) + b')\n\n",
"[-0.02006178 3.95868491]\n37.7522760272153\n[ 4.38716045e-03 -1.22100846e+00 8.60833650e+01]\n41.029217477716145\n[ 2.74904980e-04 -1.08363379e-01 1.41794911e+01 -6.14446542e+02]\n36.324014896471446\n[-2.82231768 15.09510415]\n37.64074858610694\n[-1.47469291 7.3446293 ]\n38.938603465563176\n"
],
[
"# for death_evetn = 1\n# simple linear regression\nx = oneSet['serum_sodium']\ny = oneSet['serum_creatinine']\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('linear')\n\n\n# quadratic\nx = oneSet['serum_sodium']\ny = oneSet['serum_creatinine']\ndegree = 2\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('quadratic')\n\n\n# cubic\nx = oneSet['serum_sodium']\ny = oneSet['serum_creatinine']\ndegree = 3\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('cubic')\n\n# GLM\nx = oneSet['serum_sodium']\nx = np.log(x)\ny = oneSet['serum_creatinine']\n\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(yTest, yPredict))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label 
= 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('y = alog(x) + b')\n\n# GLM\nx = oneSet['serum_sodium']\nx = np.log(x)\ny = oneSet['serum_creatinine']\ny = np.log(y)\n\ndegree = 1\n\nxTrain, xTest, yTrain, yTest = train_test_split(x, y, test_size= 0.5, random_state = 0)\n\nweights = np.polyfit(xTrain, yTrain, degree)\nmodel = np.poly1d(weights)\nyPredict = model(xTest)\nprint(weights)\nprint(residual(np.exp(yTest), np.exp(yPredict)))\nplt.figure()\nplt.scatter(xTest, yTest, color = 'green', label = 'True Data')\nplt.scatter(xTest, yPredict, color = 'red', label = 'Predict')\nplt.grid()\nplt.legend()\nplt.title('log(y) = alog(x) + b')\n\n\n\n",
"[-0.01236989 3.8018824 ]\n38.63069354525988\n[ 9.69912611e-03 -2.65111621e+00 1.83120347e+02]\n61.60974727399684\n[ 6.93416020e-03 -2.82186374e+00 3.82460077e+02 -1.72620285e+04]\n2607.9355538202913\n[-1.80477484 10.98470858]\n38.58311899419233\n[-3.18264613 16.14955754]\n22.58026509775523\n"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
d07b6df868c91058303a9d89c18614be4d499854 | 1,871 | ipynb | Jupyter Notebook | crds/tests/dev/bestrefs-cos.ipynb | stscieisenhamer/crds | d32010f3ac9cb4b03b4a5e99ab7e9c44861b14b3 | [
"BSD-3-Clause"
] | 7 | 2016-12-13T20:35:34.000Z | 2022-03-10T06:46:13.000Z | crds/tests/dev/bestrefs-cos.ipynb | stscieisenhamer/crds | d32010f3ac9cb4b03b4a5e99ab7e9c44861b14b3 | [
"BSD-3-Clause"
] | 86 | 2016-03-15T19:04:19.000Z | 2022-02-16T15:09:40.000Z | crds/tests/dev/bestrefs-cos.ipynb | stscieisenhamer/crds | d32010f3ac9cb4b03b4a5e99ab7e9c44861b14b3 | [
"BSD-3-Clause"
] | 24 | 2016-03-18T20:50:14.000Z | 2022-02-11T16:26:57.000Z | 22.27381 | 143 | 0.477285 | [
[
[
"empty"
]
]
] | [
"empty"
] | [
[
"empty"
]
] |
d07b750c1a86d5b2987351e1d2e8200f7fe3a60f | 305,914 | ipynb | Jupyter Notebook | code/image_analysis_twilight.ipynb | yongrong-qiu/mouse-scene-cam | a9afd7e8cd7abba32e4807b9018339edbe4ca13c | [
"MIT"
] | null | null | null | code/image_analysis_twilight.ipynb | yongrong-qiu/mouse-scene-cam | a9afd7e8cd7abba32e4807b9018339edbe4ca13c | [
"MIT"
] | null | null | null | code/image_analysis_twilight.ipynb | yongrong-qiu/mouse-scene-cam | a9afd7e8cd7abba32e4807b9018339edbe4ca13c | [
"MIT"
] | 1 | 2021-05-27T09:18:15.000Z | 2021-05-27T09:18:15.000Z | 279.628885 | 33,476 | 0.918297 | [
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nfrom matplotlib.ticker import FormatStrFormatter\nimport cv2\nimport glob\nimport h5py\nfrom skimage.morphology import disk\nfrom scipy.stats import pearsonr\nfrom scipy.ndimage import gaussian_filter\n%matplotlib inline \n%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"# for plot figures\nplt.rcParams['svg.fonttype'] = 'none'\ndef adjust_spines(ax, spines):\n for loc, spine in ax.spines.items():\n if loc in spines:\n spine.set_position(('outward', 2)) \n else:\n spine.set_color('none') \n if 'left' in spines:\n ax.yaxis.set_ticks_position('left')\n else:\n ax.yaxis.set_ticks([])\n if 'bottom' in spines:\n ax.xaxis.set_ticks_position('bottom')\n else:\n ax.xaxis.set_ticks([])",
"_____no_output_____"
],
[
"#import data\nmovie_name = \"../data/image_twilight_bgr.h5\"\n\n#read movie real data, real means: after spectral calibration, before gamma correction for the screen\ndef read_sunriseset_from_h5(filename):\n h5f = h5py.File(filename,'r')\n img_sunrises=h5f['sunrises_bgr_real'][:]\n img_sunsets=h5f['sunsets_bgr_real'][:]\n h5f.close()\n return img_sunrises,img_sunsets\nimg_sunrises,img_sunsets=read_sunriseset_from_h5(movie_name)\nprint (img_sunrises.shape)\nprint (img_sunsets.shape)",
"(6, 437, 437, 3)\n(8, 437, 437, 3)\n"
],
[
"#show one example, image real value\nplt.imshow(img_sunrises[5][...,::-1])",
"_____no_output_____"
],
[
"#to better visulaize image, use gamma correction to transfer image real to image view\ndef img_real2view(img):\n gamma_correction=lambda x:np.power(x,1.0/2.2)\n img_shape=img.shape\n # gray image\n if np.size(img_shape)==2:\n #uint8\n if np.max(img)>1:\n temp_view=np.zeros_like(img,dtype=np.float32)\n temp_view=np.float32(img)/255.0#float32, 1.0\n temp_view=gamma_correction(temp_view)\n temp_view2=np.zeros_like(img,dtype=np.uint8)\n temp_view2=np.uint8(temp_view*255)\n return temp_view2\n #float\n if np.max(img)<2:\n return gamma_correction(img)\n #color image\n if np.size(img_shape)==3:\n #uint8\n if np.max(img)>1:\n temp_view=np.zeros_like(img,dtype=np.float32)\n temp_view=np.float32(img)/255.0#1.0\n temp_view=gamma_correction(temp_view)\n temp_view2=np.zeros_like(img,dtype=np.uint8)\n temp_view2=np.uint8(temp_view*255)#255\n return temp_view2\n #float\n if np.max(img)<2:\n return gamma_correction(img)\n",
"_____no_output_____"
],
[
"#show one example, image view value\nplt.imshow(img_real2view(img_sunrises[1])[...,::-1])",
"_____no_output_____"
]
],
[
[
"### Functions",
"_____no_output_____"
]
],
[
[
"#function: gaussian kernel 1d\n#input: sigma: std\n# order: A positive order corresponds to convolution with\n# that derivative of a Gaussian, use 0 here\n# radius: radius of the filter\ndef my_gaussian_kernel1d(sigma, order, radius):\n \"\"\"\n Computes a 1D Gaussian convolution kernel.\n \"\"\"\n if order < 0:\n raise ValueError('order must be non-negative')\n p = np.polynomial.Polynomial([0, 0, -0.5 / (sigma * sigma)])\n x = np.arange(-radius, radius + 1)\n phi_x = np.exp(p(x), dtype=np.double)\n phi_x /= phi_x.sum()\n if order > 0:\n q = np.polynomial.Polynomial([1])\n p_deriv = p.deriv()\n for _ in range(order):\n # f(x) = q(x) * phi(x) = q(x) * exp(p(x))\n # f'(x) = (q'(x) + q(x) * p'(x)) * phi(x)\n q = q.deriv() + q * p_deriv\n phi_x *= q(x)\n return phi_x\n\n#function: gaussian filter 2d\ndef my_gaussian_kernel2d(sigma,order,radius):\n g_ker_1d=my_gaussian_kernel1d(sigma, order, radius)\n g_ker_2d=np.outer(g_ker_1d, g_ker_1d)\n g_ker_2d /=g_ker_2d.sum()\n return g_ker_2d\n\n#function: my difference of gaussian kernel 1d\n#input: centersigma is the center sigma, surround sigma=1.5*centersigma, centersigma=RFradius\n# radius: defalt 3*centersigma \n#output: kernel size length: 1+3*centersigma*2\ndef my_DOG_kernel1d(centersigma,order,radius):\n surroundsigma=1.5*centersigma\n center_kernel1d=my_gaussian_kernel1d(centersigma,order,radius)\n surround_kernel1d=my_gaussian_kernel1d(surroundsigma,order,radius)\n out_kernel1d=center_kernel1d-surround_kernel1d\n return out_kernel1d\n\n#function: my difference of gaussian kernel 2d, mimic retina center-surround onoff\n#input: centersigma is the center sigma, surround sigma=1.5*centersigma\n# radius: kernelradius, defalt 3*centersigma \n#output: kernel size length: 1+3*centersigma*2\ndef my_DOG_kernel2d(centersigma,order,radius):\n surroundsigma=1.5*centersigma\n center_kernel2d=my_gaussian_kernel2d(centersigma,order,radius)\n surround_kernel2d=my_gaussian_kernel2d(surroundsigma,order,radius)\n 
out_kernel2d=center_kernel2d-surround_kernel2d\n return out_kernel2d\n\n#function, calculate ONOFF for single pixel\n#input:\n#img: gray image, float, 1.0 (when phase srambled image, may be a little larger than 1.0)\n#(xx,yy): center coordinate, xx: along height, yy: along width, RFradius: radius of center\n#output:\n#onoff value\ndef ONOFF_single(img,xx,yy,centersigma):\n surroundsigma=np.round(1.5*centersigma)\n kernelradius=3*centersigma\n temp=img[xx-kernelradius:xx+kernelradius+1,yy-kernelradius:yy+kernelradius+1]\n center_kernel2d=my_gaussian_kernel2d(centersigma,0,kernelradius)\n surround_kernel2d=my_gaussian_kernel2d(surroundsigma,0,kernelradius)\n centermean=np.sum(temp*center_kernel2d)\n surroundmean=np.sum(temp*surround_kernel2d)\n onoff=(centermean-surroundmean)/(centermean+surroundmean+1e-8)\n return onoff\n\n#input: \n#centersigma is the center sigma\n#img: image or image region, float\n#output: onoff_img, float, -1.0 to 1.0\ndef onoff_wholeimg(img,centersigma):\n kernelradius=3*centersigma\n onoff_img=np.zeros((img.shape[0],img.shape[1]))\n for ii in np.arange(kernelradius,img.shape[0]-kernelradius-1):\n for jj in np.arange(kernelradius,img.shape[1]-kernelradius-1):\n onoff_img[ii,jj]=ONOFF_single(img,ii,jj,centersigma)\n if img.shape[0]==437:\n mask_con=np.zeros((437,437),np.uint8)\n cv2.circle(mask_con,(218,218),radius=218-kernelradius,color=255,thickness=-1)\n mask_con=np.float32(mask_con/255.0)\n onoff_img=np.multiply(onoff_img,mask_con)\n return onoff_img\n\n#input: onoff_seed: random seed for contrast calculation\n#onoff_num: random pick numbers\n#centersigma is the center sigma\n#img: image or image region, float 1.0 (when phase srambled, may be a little larger than 1.0)\n#output: the onoff value distribution\ndef onoff_random(onoff_seed,onoff_num,centersigma,img):\n kernelradius=3*centersigma\n np.random.seed(onoff_seed+866)\n walk_height=np.random.choice(np.arange(kernelradius,img.shape[0]-kernelradius-1),onoff_num,replace=False)\n 
np.random.seed(onoff_seed+899)\n walk_width=np.random.choice(np.arange(kernelradius,img.shape[1]-kernelradius-1),onoff_num,replace=False)\n onoffs=np.zeros(onoff_num)\n for ii in range(onoff_num):\n onoffs[ii]=ONOFF_single(img,walk_height[ii],walk_width[ii],centersigma)\n return onoffs\n\n#input: onoff_seed: random seed for contrast calculation\n#onoff_num: total random pick numbers=numberofimages* each_random_pick_numbers\n#centersigma is the center sigma\n#imgs: images, all gray images, float 1.0 (when phase srambled, may be a little larger than 1.0)\n# format like: numberofimages*height*width\n#output: the onoff value distribution\ndef onoff_random_imgs(onoff_seed,onoff_num,centersigma,imgs):\n num_imgs=imgs.shape[0]\n onoffs=[]\n for ii in range(num_imgs):\n onoffs.append(onoff_random(onoff_seed+ii,int(np.round(onoff_num/num_imgs)),centersigma,imgs[ii]))\n onoffs=np.array(onoffs)\n onoffs=onoffs.flatten()\n return onoffs\n\n#input: onoff_seed: random seed for onoff and local contrast(rms2) calculation\n#onoff_num: random pick numbers\n#centersigma is the center sigma for onoff\n#RFradius for local contrast(rms2)\n#img: image or image region, float 1.0 (when phase srambled, may be a little larger than 1.0)\n#output: the onoff and local contrast (rms2) value distribution\ndef onoff_rms2_random(onoff_seed,onoff_num,centersigma,RFradius,img):\n kernelradius=3*centersigma\n np.random.seed(onoff_seed+1866)\n walk_height=np.random.choice(np.arange(kernelradius,img.shape[0]-kernelradius-1),onoff_num,replace=False)\n np.random.seed(onoff_seed+2899)\n walk_width=np.random.choice(np.arange(kernelradius,img.shape[1]-kernelradius-1),onoff_num,replace=False)\n onoffs=np.zeros(onoff_num)\n rms2s=np.zeros(onoff_num)\n tempdisk=np.float64(disk(RFradius))\n for ii in range(onoff_num):\n onoffs[ii]=ONOFF_single(img,walk_height[ii],walk_width[ii],centersigma)\n temp=img[walk_height[ii]-RFradius:walk_height[ii]+RFradius+1,\\\n walk_width[ii]-RFradius:walk_width[ii]+RFradius+1]\n 
temp=temp[np.nonzero(tempdisk)]\n rms2s[ii]=np.std(temp,ddof=1)/(np.mean(temp)+1e-8)\n return onoffs,rms2s\n\n#input: onoff_seed: random seed for contrast calculation\n#onoff_num: total random pick numbers=numberofimages* each_random_pick_numbers\n#centersigma is the center sigma for onoff\n#RFradius for local contrast(rms2)\n#imgs: images, all gray images, float 1.0 (when phase srambled, may be a little larger than 1.0)\n# format like: numberofimages*height*width\n#output: the onoff and local contrast (rms2) value distribution\ndef onoff_rms2_random_imgs(onoff_seed,onoff_num,centersigma,RFradius,imgs):\n num_imgs=imgs.shape[0]\n onoffs=[]\n rms2s=[]\n for ii in range(num_imgs):\n temp_onoff,temp_rms2=onoff_rms2_random(onoff_seed+ii,int(np.round(onoff_num/num_imgs)),\\\n centersigma,RFradius,imgs[ii])\n onoffs.append(temp_onoff)\n rms2s.append(temp_rms2)\n onoffs=np.array(onoffs)\n onoffs=onoffs.flatten()\n rms2s=np.array(rms2s)\n rms2s=rms2s.flatten()\n return onoffs,rms2s\n\n#function, get the rms2 image of one image, input: \n#img: image or image region, float, 1.0, could be a little larger than 1.0 for phase scrambled image\n#RFradius: the radius of the crop area to be estimated the rms2\n#output: rms2_img, float, nonnegative\ndef rms2_wholeimg(img,RFradius):\n tempdisk=np.float64(disk(RFradius))\n rms2_img=np.zeros((img.shape[0],img.shape[1]))\n for ii in np.arange(RFradius,img.shape[0]-RFradius-1):\n for jj in np.arange(RFradius,img.shape[1]-RFradius-1):\n temp=img[ii-RFradius:ii+RFradius+1,jj-RFradius:jj+RFradius+1]\n temp=temp[np.nonzero(tempdisk)]#circular kernel\n rms2_img[ii,jj]=np.std(temp,ddof=1)/(np.mean(temp)+1e-8)\n if img.shape[0]==437:#whole image frame, not crop\n mask_con=np.zeros((437,437),np.uint8)\n cv2.circle(mask_con,(218,218),radius=218-RFradius,color=255,thickness=-1)\n mask_con=np.float32(mask_con/255.0)\n rms2_img=np.multiply(rms2_img,mask_con)\n return rms2_img\n\n#input: onoff_seed: random seed for local contrast(rms2) 
calculation\n#onoff_num: random pick numbers\n#RFradius for local contrast(rms2)\n#img: image or image region, float 1.0 (when phase srambled, may be a little larger than 1.0)\n#output: the local contrast (rms2) value distribution\ndef rms2_random(onoff_seed,onoff_num,RFradius,img):\n np.random.seed(onoff_seed+1866)\n walk_height=np.random.choice(np.arange(RFradius,img.shape[0]-RFradius-1),onoff_num)\n np.random.seed(onoff_seed+2899)\n walk_width=np.random.choice(np.arange(RFradius,img.shape[1]-RFradius-1),onoff_num)\n rms2s=np.zeros(onoff_num)\n tempdisk=np.float64(disk(RFradius))\n for ii in range(onoff_num):\n temp=img[walk_height[ii]-RFradius:walk_height[ii]+RFradius+1,\\\n walk_width[ii]-RFradius:walk_width[ii]+RFradius+1]\n temp=temp[np.nonzero(tempdisk)]\n rms2s[ii]=np.std(temp,ddof=1)/(np.mean(temp)+1e-8)\n return rms2s",
"_____no_output_____"
],
[
"#bootstrapping\n#apply bootstrapping to estimate standard deviation (error)\n#statistics can be offratios, median, mean\n#for offratios, be careful with the threshold\n#data: for statistics offratios, median, mean: numpy array with shape (sample_size,1)\n#num_exp: number of experiments, with replacement\ndef bootstrap(statistics,data,num_exp=10000,seed=66):\n if statistics == 'offratios':\n def func(x): return len(x[np.where(x<0)])/len(x[np.where(x>0)]) \n elif statistics == 'median':\n def func(x): return np.median(x)\n elif statistics == 'mean':\n def func(x): return np.mean(x)\n sta_boot=np.zeros((num_exp))\n num_data=len(data)\n for ii in range(num_exp):\n np.random.seed(seed+ii)\n tempind=np.random.choice(num_data,num_data,replace=True)\n sta_boot[ii]=func(data[tempind])\n return np.percentile(sta_boot,2.5),np.percentile(sta_boot,97.5)",
"_____no_output_____"
]
],
[
[
"### Sunrise: Intensity profile in the dome, radiated from the sun",
"_____no_output_____"
]
],
[
[
"def createLineIterator(P1, P2, img):\n \"\"\"\n Produces and array that consists of the coordinates and intensities of each pixel in a line between two points\n\n Parameters:\n -P1: a numpy array that consists of the coordinate of the first point (x,y)\n -P2: a numpy array that consists of the coordinate of the second point (x,y)\n -img: the image being processed\n\n Returns:\n -it: a numpy array that consists of the coordinates and intensities of each pixel in the radii \n (shape: [numPixels, 3], row = [x,y,intensity]) \n \"\"\"\n #define local variables for readability\n imageH = img.shape[0]\n imageW = img.shape[1]\n P1X = P1[0]\n P1Y = P1[1]\n P2X = P2[0]\n P2Y = P2[1]\n\n #difference and absolute difference between points\n #used to calculate slope and relative location between points\n dX = P2X - P1X\n dY = P2Y - P1Y\n dXa = np.abs(dX)\n dYa = np.abs(dY)\n\n #predefine numpy array for output based on distance between points\n itbuffer = np.empty(shape=(np.maximum(dYa,dXa),3),dtype=np.float32)\n itbuffer.fill(np.nan)\n\n #Obtain coordinates along the line using a form of Bresenham's algorithm\n negY = P1Y > P2Y\n negX = P1X > P2X\n if P1X == P2X: #vertical line segment\n itbuffer[:,0] = P1X\n if negY:\n itbuffer[:,1] = np.arange(P1Y - 1,P1Y - dYa - 1,-1)\n else:\n itbuffer[:,1] = np.arange(P1Y+1,P1Y+dYa+1) \n elif P1Y == P2Y: #horizontal line segment\n itbuffer[:,1] = P1Y\n if negX:\n itbuffer[:,0] = np.arange(P1X-1,P1X-dXa-1,-1)\n else:\n itbuffer[:,0] = np.arange(P1X+1,P1X+dXa+1)\n else: #diagonal line segment\n steepSlope = dYa > dXa\n if steepSlope:\n #slope = dX.astype(np.float32)/dY.astype(np.float32)\n slope = dX/dY\n if negY:\n itbuffer[:,1] = np.arange(P1Y-1,P1Y-dYa-1,-1)\n else:\n itbuffer[:,1] = np.arange(P1Y+1,P1Y+dYa+1)\n itbuffer[:,0] = (slope*(itbuffer[:,1]-P1Y)).astype(np.int) + P1X\n else:\n #slope = dY.astype(np.float32)/dX.astype(np.float32)\n slope = dY/dX\n if negX:\n itbuffer[:,0] = np.arange(P1X-1,P1X-dXa-1,-1)\n else:\n itbuffer[:,0] = 
np.arange(P1X+1,P1X+dXa+1)\n itbuffer[:,1] = (slope*(itbuffer[:,0]-P1X)).astype(np.int) + P1Y\n\n #Remove points outside of image\n colX = itbuffer[:,0]\n colY = itbuffer[:,1]\n itbuffer = itbuffer[(colX >= 0) & (colY >=0) & (colX<imageW) & (colY<imageH)]\n\n #Get intensities from img ndarray\n itbuffer[:,2] = img[itbuffer[:,1].astype(np.uint),itbuffer[:,0].astype(np.uint)]\n\n return itbuffer",
"_____no_output_____"
],
[
"#show line\ntemp=img_real2view(img_sunrises[0])\nlineeg=cv2.line(temp,(198,233),(53,161),(0,0,255),5)\nplt.imshow(lineeg[...,::-1])",
"_____no_output_____"
],
[
"#one example\npoint1=(198,233)\npoint2=(53,161)\ntemp=createLineIterator(point1, point2, img_sunrises[0,...,0])\nprint (temp.shape)\n\n#intensity profile\npoint1s=[[198,233],[198,233],[201,222]]\npoint2s=[[53,161],[53,161],[56,150]]\nintenpro=np.zeros((3,2,145),np.uint8)#3 time points, 2 color channel (UV and G),135 pixels\nfor ii in range(3):\n for jj in range(2):\n intenpro[ii,jj]=createLineIterator(point1s[ii], point2s[ii], img_sunrises[ii*2,...,jj])[:,2]\nintenpro=intenpro/255.0",
"(145, 3)\n"
],
[
"#plot intensity profile in 3 time points\nfig, ax = plt.subplots(nrows=1, ncols=1,figsize=(3,3))\nax.plot(intenpro[0,0],color='purple',linestyle='-',label='UV; Time 0')\nax.plot(intenpro[1,0],color='purple',linestyle='--',label='UV; Time 2')\nax.plot(intenpro[2,0],color='purple',linestyle=':',label='UV; Time 4')\nax.plot(intenpro[0,1],color='g',linestyle='-',label='G; Time 0')\nax.plot(intenpro[1,1],color='g',linestyle='--',label='G; Time 2')\nax.plot(intenpro[2,1],color='g',linestyle=':',label='G; Time 4')\nax.legend(loc='best',fontsize=16)\nax.set_xticks([0,75,150])\nax.set_xticklabels(([0,35,70]))\nax.set_ylim([0,1.0])\nax.set_yticks([0,0.5,1.0])\nax.set_xlabel('RF (degree)', fontsize=16)\nax.set_ylabel('Intensity', fontsize=16)\nadjust_spines(ax, ['left', 'bottom'])\nhandles, labels = ax.get_legend_handles_labels()\nlgd = ax.legend(handles, labels, loc='center left',frameon=False, bbox_to_anchor=(1, 0.5))",
"_____no_output_____"
],
[
"#plot intensity profile in 3 time points\nfig, ax = plt.subplots(nrows=1, ncols=1,figsize=(3,3))\nax.plot(intenpro[0,0],color='blueviolet',linestyle='-',label='UV; Time 0')\nax.plot(intenpro[1,0],color='violet',linestyle='-',label='UV; Time 2')\nax.plot(intenpro[2,0],color='purple',linestyle='-',label='UV; Time 4')\nax.plot(intenpro[0,1],color='lime',linestyle='-',label='G; Time 0')\nax.plot(intenpro[1,1],color='g',linestyle='-',label='G; Time 2')\nax.plot(intenpro[2,1],color='yellowgreen',linestyle='-',label='G; Time 4')\nax.legend(loc='best',fontsize=16)\nax.set_xticks([0,75,150])\nax.set_xticklabels(([0,35,70]))\nax.set_ylim([0,1.0])\nax.set_yticks([0,0.5,1.0])\nax.set_xlabel('RF (degree)', fontsize=16)\nax.set_ylabel('Intensity', fontsize=16)\nadjust_spines(ax, ['left', 'bottom'])\nhandles, labels = ax.get_legend_handles_labels()\nlgd = ax.legend(handles, labels, loc='center left',frameon=False, bbox_to_anchor=(1, 0.5))",
"_____no_output_____"
]
],
[
[
"### Sunrise: Dome and tree intensity change along time points",
"_____no_output_____"
]
],
[
[
"temp=img_real2view(img_sunrises[5])\nrecteg=cv2.rectangle(temp,(168,35),(228,55),(255,255,255),1)\nplt.imshow(recteg[...,::-1])",
"_____no_output_____"
],
[
"#dome intensity\ndomeinten_median=np.zeros((6,2)) #6 time points, 2 color channel (UV and G)\ndomeinten_std=np.zeros((6,2))\ndomeinten_lowq_higq=np.zeros((6,2,2)) #6 time points, 2 color channel (UV and G), low and high quantiles(percentiles)\nfor ii in range(6):\n for jj in range(2):\n temp=img_sunrises[ii,35:55,168:228,jj]/255\n domeinten_median[ii,jj]=np.median(temp)\n domeinten_std[ii,jj]=np.std(temp)\n low_perc,high_perc=bootstrap('median',temp,num_exp=10000,seed=66)\n domeinten_lowq_higq[ii,jj,0] = domeinten_median[ii,jj]-low_perc #low\n domeinten_lowq_higq[ii,jj,1] =-domeinten_median[ii,jj]+high_perc #high\n#tree intensity\ntreeinten_median=np.zeros((6,2))#6 time points, 2 color channel (UV and G)\ntreeinten_std=np.zeros((6,2))\ntreeinten_lowq_higq=np.zeros((6,2,2)) #6 time points, 2 color channel (UV and G), low and high quantiles(percentiles)\nfor ii in range(6):\n for jj in range(2):\n temp=img_sunrises[ii,80:100,230:280,jj]/255\n treeinten_median[ii,jj]=np.median(temp)\n treeinten_std[ii,jj]=np.std(temp)\n low_perc,high_perc=bootstrap('median',temp,num_exp=10000,seed=6666)\n treeinten_lowq_higq[ii,jj,0] = treeinten_median[ii,jj]-low_perc #low\n treeinten_lowq_higq[ii,jj,1] =-treeinten_median[ii,jj]+high_perc #high",
"_____no_output_____"
],
[
"#median, errorbar: 2.5-97.5 percentils\ntimepoints=[0,1,2,3,4,5]\nfig, ax = plt.subplots(nrows=1, ncols=1,figsize=(3,3))\nax.errorbar(timepoints,domeinten_median[:,0],yerr=(domeinten_lowq_higq[:,0,0],domeinten_lowq_higq[:,0,1]),marker='o',\\\n color='purple',linestyle='-', label='Dome UV',alpha=1.0, capsize=4)\nax.errorbar(timepoints,domeinten_median[:,1],yerr=(domeinten_lowq_higq[:,1,0],domeinten_lowq_higq[:,1,1]),marker='o',\\\n color='g', linestyle='-', label='Dome G',alpha=1.0, capsize=4)\nax.errorbar(timepoints,treeinten_median[:,0],yerr=(treeinten_lowq_higq[:,0,0],treeinten_lowq_higq[:,0,1]),marker='o',\\\n color='purple',linestyle='--',label='Tree UV',alpha=1.0, capsize=4)\nax.errorbar(timepoints,treeinten_median[:,1],yerr=(treeinten_lowq_higq[:,1,0],treeinten_lowq_higq[:,1,1]),marker='o',\\\n color='g', linestyle='--',label='Tree G',alpha=1.0, capsize=4)\nax.legend(loc='best',fontsize=16)\nax.set_xticks([0,1,2,3,4,5])\nax.set_ylim([0,0.09])\nax.set_yticks([0,0.03,0.06,0.09])\nax.set_xlabel('Time point', fontsize=16)\nax.set_ylabel('Intensity median', fontsize=16)\nadjust_spines(ax, ['left', 'bottom'])\nhandles, labels = ax.get_legend_handles_labels()\nlgd = ax.legend(handles, labels, loc='center left',frameon=False, bbox_to_anchor=(1, 0.5))",
"_____no_output_____"
]
],
[
[
"### Sunrise: Crms change along time points",
"_____no_output_____"
]
],
[
[
"#pick a rectangular area for the tree, not close to the sun, near the edge\ntemp=img_real2view(img_sunrises[5])\nrecteg=cv2.rectangle(temp,(160,50),(340,100),(0,0,255),5)\nplt.imshow(recteg[...,::-1])",
"_____no_output_____"
],
[
"#RF: 2,10 degrees\nRFradius=np.array([2,12])\nonoff_num=100\n#Crms\nrms2_time=np.zeros((6,2,2,onoff_num))#6 time points, 2 color channel (UV and G),2 RFs, 100 data\nrms2_means=np.zeros((6,2,2))#6 time points, 2 color channel (UV and G),2 RFs\nrms2_stds=np.zeros((6,2,2))\nrms2_lowq_higq=np.zeros((6,2,2,2)) #the last channel: low and high quantiles(percentiles)\nfor ii in range(6):\n for jj in range(2):\n for kk in range(2):\n temp=img_sunrises[ii,50:100,160:340,jj]/255\n temprms2s=rms2_random(566+ii*10,onoff_num,RFradius[kk],temp)\n rms2_time[ii,jj,kk]=temprms2s\n rms2_means[ii,jj,kk]=np.mean(temprms2s)\n rms2_stds[ii,jj,kk]=np.std(temprms2s)\n low_perc,high_perc=bootstrap('mean',temprms2s,num_exp=10000,seed=888)\n rms2_lowq_higq[ii,jj,kk,0] = rms2_means[ii,jj,kk]-low_perc #low\n rms2_lowq_higq[ii,jj,kk,1] =-rms2_means[ii,jj,kk]+high_perc #high",
"_____no_output_____"
],
[
"#mean, errorbar: 2.5-97.5 percentiles\ntimepoints=[0,1,2,3,4,5]\nfig, ax = plt.subplots(nrows=1, ncols=1,figsize=(3,3))\nax.errorbar(timepoints,rms2_means[:,0,1],yerr=(rms2_lowq_higq[:,0,1,0],rms2_lowq_higq[:,0,1,1]),marker='o',\\\n color='purple',linestyle='-',label='UV; RF=10',alpha=1.0, capsize=4)\nax.errorbar(timepoints,rms2_means[:,1,1],yerr=(rms2_lowq_higq[:,1,1,0],rms2_lowq_higq[:,1,1,1]),marker='o',\\\n color='g', linestyle='-',label='G; RF=10',alpha=1.0, capsize=4)\nax.legend(loc='best',fontsize=16)\nax.set_xticks([0,1,2,3,4,5])\nax.set_yticks([0,0.2,0.4])\nax.set_xlabel('Time point', fontsize=16)\nax.set_ylabel('Crms mean', fontsize=16)\nadjust_spines(ax, ['left', 'bottom'])\nhandles, labels = ax.get_legend_handles_labels()\nlgd = ax.legend(handles, labels, loc='center left',frameon=False, bbox_to_anchor=(1, 0.5))",
"_____no_output_____"
]
],
[
[
"### Sunset: Conoff and Crms of tree",
"_____no_output_____"
]
],
[
[
"#pick a rectangular area for the tree\ntemp=img_real2view(img_sunsets[1])\nrecteg=cv2.rectangle(temp,(130,50),(340,200),(0,0,255),5)\nplt.imshow(recteg[...,::-1])",
"_____no_output_____"
],
[
"RFradius=np.array([2,7,12,16])\nonoff_num=200\n#upper visual field, UV channel\nupper_UV_RF_rms2s=np.zeros((4,onoff_num))\nfor ii in range(4):\n temp=img_sunsets[1,50:200,130:340,0]/255\n upper_UV_RF_rms2s[ii]=rms2_random(566+ii*10,onoff_num,RFradius[ii],temp)\n#upper visual field, G channel\nupper_G_RF_rms2s=np.zeros((4,onoff_num))\nfor ii in range(4):\n temp=img_sunsets[1,50:200,130:340,1]/255\n upper_G_RF_rms2s[ii]=rms2_random(566+ii*10,onoff_num,RFradius[ii],temp)",
"_____no_output_____"
],
[
"#calculate rms2medians\nRFradius=np.array([2,7,12,16])\n#upper visual field, UV channel\nupper_UV_RF_rms2medians=np.zeros(4)\nupper_UV_RF_rms2stds=np.zeros(4)\nupper_UV_RF_rms2lowqs=np.zeros(4) #lower_quartile\nupper_UV_RF_rms2higqs=np.zeros(4) #upper_quartile\nfor ii in range(4):\n upper_UV_RF_rms2medians[ii]=np.median(upper_UV_RF_rms2s[ii])\n upper_UV_RF_rms2stds[ii]=np.std(upper_UV_RF_rms2s[ii])\n low_perc,high_perc=bootstrap('median',upper_UV_RF_rms2s[ii],num_exp=10000,seed=66)\n upper_UV_RF_rms2lowqs[ii] = upper_UV_RF_rms2medians[ii]-low_perc\n upper_UV_RF_rms2higqs[ii] =-upper_UV_RF_rms2medians[ii]+high_perc \n#upper visual field, G channel\nupper_G_RF_rms2medians=np.zeros(4)\nupper_G_RF_rms2stds=np.zeros(4)\nupper_G_RF_rms2lowqs=np.zeros(4) #lower_quartile\nupper_G_RF_rms2higqs=np.zeros(4) #upper_quartile\nfor ii in range(4):\n upper_G_RF_rms2medians[ii]=np.median(upper_G_RF_rms2s[ii])\n upper_G_RF_rms2stds[ii]=np.std(upper_G_RF_rms2s[ii])\n low_perc,high_perc=bootstrap('median',upper_G_RF_rms2s[ii],num_exp=10000,seed=66)\n upper_G_RF_rms2lowqs[ii] = upper_G_RF_rms2medians[ii]-low_perc\n upper_G_RF_rms2higqs[ii] =-upper_G_RF_rms2medians[ii]+high_perc ",
"_____no_output_____"
],
[
"#median, errorbar: 2.5-97.5 percentiles\nRFs=np.array([2,6,10,14])\nfig, ax = plt.subplots(nrows=1, ncols=1,figsize=(3,3))\nax.errorbar(RFs,upper_UV_RF_rms2medians,yerr=(upper_UV_RF_rms2lowqs,upper_UV_RF_rms2higqs),marker='o',\\\n color='purple',linestyle='-',label='Upper UV',alpha=1.0, capsize=4)\nax.errorbar(RFs,upper_G_RF_rms2medians, yerr=(upper_G_RF_rms2lowqs,upper_G_RF_rms2higqs), marker='o',\\\n color='g', linestyle='-',label='Upper G', alpha=1.0, capsize=4)\nax.legend(loc='best',fontsize=16)\nax.set_xticks([2,6,10,14])\nax.set_yticks([0,0.2,0.4,0.6])\nax.set_xlabel('RF (degree)', fontsize=16)\nax.set_ylabel('Crms median', fontsize=16)\nadjust_spines(ax, ['left', 'bottom'])\nhandles, labels = ax.get_legend_handles_labels()\nlgd = ax.legend(handles, labels, loc='center left',frameon=False, bbox_to_anchor=(1, 0.5))",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
d07b79a4451b05f6091079add38241dd19508d78 | 113,876 | ipynb | Jupyter Notebook | methods/transformers/notebooks/04-onnx-export.ipynb | INK-USC/RiddleSense | a3d57eaf084da9cf6b77692c608e2cd2870fbd97 | [
"MIT"
] | 3 | 2021-07-06T20:02:31.000Z | 2022-03-27T13:13:01.000Z | methods/transformers/notebooks/04-onnx-export.ipynb | INK-USC/RiddleSense | a3d57eaf084da9cf6b77692c608e2cd2870fbd97 | [
"MIT"
] | null | null | null | methods/transformers/notebooks/04-onnx-export.ipynb | INK-USC/RiddleSense | a3d57eaf084da9cf6b77692c608e2cd2870fbd97 | [
"MIT"
] | null | null | null | 123.109189 | 35,385 | 0.834926 | [
[
[
"<h1><center>How to export 🤗 Transformers Models to ONNX ?<h1><center>",
"_____no_output_____"
],
[
"[ONNX](http://onnx.ai/) is an open format for machine learning models. It allows you to save your neural network's computation graph in a framework-agnostic way, which can be particularly helpful when deploying deep learning models.\n\nIndeed, businesses may have other requirements _(languages, hardware, ...)_ for which the training framework is not the best suited in inference scenarios. In that context, having a representation of the actual computation graph that can be shared across various business units and teams across an organization can be a desirable component.\n\nAlong with the serialization format, ONNX also provides a runtime library which allows efficient, hardware-specific execution of the ONNX graph. This is done through the [onnxruntime](https://microsoft.github.io/onnxruntime/) project, which already includes collaborations with many hardware vendors to seamlessly deploy models on various platforms.\n\nThrough this notebook we'll walk you through the process of converting a PyTorch or TensorFlow transformers model to [ONNX](http://onnx.ai/) and leveraging [onnxruntime](https://microsoft.github.io/onnxruntime/) to run inference tasks on models from 🤗 __transformers__",
"_____no_output_____"
],
[
"## Exporting a 🤗 transformers model to ONNX\n\n---\n\nExporting models _(either PyTorch or TensorFlow)_ is easily achieved through the conversion tool provided as part of the 🤗 __transformers__ repository.\n\nUnder the hood the process is essentially the following:\n\n1. Allocate the model from transformers (**PyTorch or TensorFlow**)\n2. Forward dummy inputs through the model so that **ONNX** can record the set of operations executed\n3. Optionally define dynamic axes on input and output tensors\n4. Save the graph along with the network parameters",
"_____no_output_____"
]
],
[
[
"import sys\n!{sys.executable} -m pip install --upgrade git+https://github.com/huggingface/transformers\n!{sys.executable} -m pip install --upgrade torch==1.6.0+cpu torchvision==0.7.0+cpu -f https://download.pytorch.org/whl/torch_stable.html\n!{sys.executable} -m pip install --upgrade onnxruntime==1.4.0\n!{sys.executable} -m pip install -i https://test.pypi.org/simple/ ort-nightly\n!{sys.executable} -m pip install --upgrade onnxruntime-tools",
"Collecting git+https://github.com/huggingface/transformers\n Cloning https://github.com/huggingface/transformers to /tmp/pip-req-build-9rvbp9p8\n Running command git clone -q https://github.com/huggingface/transformers /tmp/pip-req-build-9rvbp9p8\nRequirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (1.18.1)\nRequirement already satisfied, skipping upgrade: tokenizers==0.8.1.rc2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.8.1rc2)\nRequirement already satisfied, skipping upgrade: packaging in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (20.4)\nRequirement already satisfied, skipping upgrade: filelock in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (3.0.12)\nRequirement already satisfied, skipping upgrade: requests in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (2.23.0)\nRequirement already satisfied, skipping upgrade: tqdm>=4.27 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (4.46.1)\nRequirement already satisfied, skipping upgrade: regex!=2019.12.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (2020.6.8)\nRequirement already satisfied, skipping upgrade: sentencepiece!=0.1.92 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.1.91)\nRequirement already satisfied, skipping upgrade: sacremoses in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers==3.0.2) (0.0.43)\nRequirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->transformers==3.0.2) (2.4.7)\nRequirement already 
satisfied, skipping upgrade: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->transformers==3.0.2) (1.15.0)\nRequirement already satisfied, skipping upgrade: chardet<4,>=3.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (3.0.4)\nRequirement already satisfied, skipping upgrade: idna<3,>=2.5 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (2.9)\nRequirement already satisfied, skipping upgrade: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (1.25.9)\nRequirement already satisfied, skipping upgrade: certifi>=2017.4.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers==3.0.2) (2020.6.20)\nRequirement already satisfied, skipping upgrade: click in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers==3.0.2) (7.1.2)\nRequirement already satisfied, skipping upgrade: joblib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers==3.0.2) (0.15.1)\nBuilding wheels for collected packages: transformers\n Building wheel for transformers (setup.py) ... 
\u001b[?25ldone\n\u001b[?25h Created wheel for transformers: filename=transformers-3.0.2-py3-none-any.whl size=883063 sha256=5f2caef76450921ae2e5b10abbbaab436e9c87c83486114fa08d305e4396d4cd\n Stored in directory: /tmp/pip-ephem-wheel-cache-kftypcjz/wheels/42/68/45/c63edff61c292f2dfd4df4ef6522dcbecc603e7af82813c1d7\nSuccessfully built transformers\nInstalling collected packages: transformers\n Attempting uninstall: transformers\n Found existing installation: transformers 3.0.2\n Uninstalling transformers-3.0.2:\n Successfully uninstalled transformers-3.0.2\nSuccessfully installed transformers-3.0.2\nLooking in links: https://download.pytorch.org/whl/torch_stable.html\nRequirement already up-to-date: torch==1.6.0+cpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.6.0+cpu)\nRequirement already up-to-date: torchvision==0.7.0+cpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (0.7.0+cpu)\nRequirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torch==1.6.0+cpu) (1.18.1)\nRequirement already satisfied, skipping upgrade: future in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torch==1.6.0+cpu) (0.18.2)\nRequirement already satisfied, skipping upgrade: pillow>=4.1.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from torchvision==0.7.0+cpu) (7.2.0)\nRequirement already up-to-date: onnxruntime==1.4.0 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.0)\nRequirement already satisfied, skipping upgrade: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime==1.4.0) (3.12.2)\nRequirement already satisfied, skipping upgrade: numpy>=1.16.6 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime==1.4.0) (1.18.1)\nRequirement already satisfied, skipping upgrade: setuptools in 
/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime==1.4.0) (47.1.1.post20200604)\nRequirement already satisfied, skipping upgrade: six>=1.9 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime==1.4.0) (1.15.0)\nLooking in indexes: https://test.pypi.org/simple/\nRequirement already satisfied: ort-nightly in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.0.dev202008262)\nRequirement already satisfied: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from ort-nightly) (3.12.2)\nRequirement already satisfied: numpy>=1.16.6 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from ort-nightly) (1.18.1)\nRequirement already satisfied: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->ort-nightly) (47.1.1.post20200604)\nRequirement already satisfied: six>=1.9 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->ort-nightly) (1.15.0)\nRequirement already up-to-date: onnxruntime-tools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.4.2)\nRequirement already satisfied, skipping upgrade: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (1.18.1)\nRequirement already satisfied, skipping upgrade: coloredlogs in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (14.0)\nRequirement already satisfied, skipping upgrade: py3nvml in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (0.2.6)\nRequirement already satisfied, skipping upgrade: psutil in /home/mfuntowicz/.local/lib/python3.8/site-packages/psutil-5.7.0-py3.8-linux-x86_64.egg (from onnxruntime-tools) (5.7.0)\nRequirement already satisfied, skipping upgrade: packaging in 
/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (20.4)\nRequirement already satisfied, skipping upgrade: py-cpuinfo in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (5.0.0)\nRequirement already satisfied, skipping upgrade: onnx in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-tools) (1.7.0)\nRequirement already satisfied, skipping upgrade: humanfriendly>=7.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from coloredlogs->onnxruntime-tools) (8.2)\nRequirement already satisfied, skipping upgrade: xmltodict in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from py3nvml->onnxruntime-tools) (0.12.0)\nRequirement already satisfied, skipping upgrade: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->onnxruntime-tools) (1.15.0)\nRequirement already satisfied, skipping upgrade: pyparsing>=2.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from packaging->onnxruntime-tools) (2.4.7)\nRequirement already satisfied, skipping upgrade: typing-extensions>=3.6.2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx->onnxruntime-tools) (3.7.4.2)\nRequirement already satisfied, skipping upgrade: protobuf in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx->onnxruntime-tools) (3.12.2)\n"
],
[
"!rm -rf onnx/\nfrom pathlib import Path\nfrom transformers.convert_graph_to_onnx import convert\n\n# Handles all the above steps for you\nconvert(framework=\"pt\", model=\"bert-base-cased\", output=Path(\"onnx/bert-base-cased.onnx\"), opset=11)\n\n# Tensorflow \n# convert(framework=\"tf\", model=\"bert-base-cased\", output=\"onnx/bert-base-cased.onnx\", opset=11)",
"loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json from cache at /home/mfuntowicz/.cache/torch/transformers/b945b69218e98b3e2c95acf911789741307dec43c698d35fad11c1ae28bda352.9da767be51e1327499df13488672789394e2ca38b877837e52618a67d7002391\nModel config BertConfig {\n \"architectures\": [\n \"BertForMaskedLM\"\n ],\n \"attention_probs_dropout_prob\": 0.1,\n \"gradient_checkpointing\": false,\n \"hidden_act\": \"gelu\",\n \"hidden_dropout_prob\": 0.1,\n \"hidden_size\": 768,\n \"initializer_range\": 0.02,\n \"intermediate_size\": 3072,\n \"layer_norm_eps\": 1e-12,\n \"max_position_embeddings\": 512,\n \"model_type\": \"bert\",\n \"num_attention_heads\": 12,\n \"num_hidden_layers\": 12,\n \"pad_token_id\": 0,\n \"type_vocab_size\": 2,\n \"vocab_size\": 28996\n}\n\n"
]
],
[
[
"## How to leverage the runtime for inference over an ONNX graph\n\n---\n\nAs mentioned in the introduction, **ONNX** is a serialization format, and many side projects can load the saved graph and run the actual computations from it. Here, we'll focus on the official [onnxruntime](https://microsoft.github.io/onnxruntime/). The runtime is implemented in C++ for performance reasons and provides APIs/bindings for C++, C, C#, Java and Python.\n\nIn the case of this notebook, we will use the Python API to highlight how to load a serialized **ONNX** graph and run inference workloads on various backends through **onnxruntime**.\n\n**onnxruntime** is available on PyPI:\n\n- onnxruntime: ONNX + MLAS (Microsoft Linear Algebra Subprograms)\n- onnxruntime-gpu: ONNX + MLAS + CUDA\n",
"_____no_output_____"
]
],
[
[
"!pip install transformers onnxruntime-gpu onnx psutil matplotlib",
"Requirement already satisfied: transformers in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (3.0.2)\nRequirement already satisfied: onnxruntime-gpu in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.3.0)\nRequirement already satisfied: onnx in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (1.7.0)\nRequirement already satisfied: psutil in /home/mfuntowicz/.local/lib/python3.8/site-packages/psutil-5.7.0-py3.8-linux-x86_64.egg (5.7.0)\nRequirement already satisfied: matplotlib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (3.3.1)\nRequirement already satisfied: tqdm>=4.27 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (4.46.1)\nRequirement already satisfied: numpy in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (1.18.1)\nRequirement already satisfied: sacremoses in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.0.43)\nRequirement already satisfied: regex!=2019.12.17 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (2020.6.8)\nRequirement already satisfied: filelock in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (3.0.12)\nRequirement already satisfied: sentencepiece!=0.1.92 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.1.91)\nRequirement already satisfied: requests in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (2.23.0)\nRequirement already satisfied: packaging in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (20.4)\nRequirement already satisfied: tokenizers==0.8.1.rc2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from transformers) (0.8.1rc2)\nRequirement already satisfied: protobuf in 
/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnxruntime-gpu) (3.12.2)\nRequirement already satisfied: six in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx) (1.15.0)\nRequirement already satisfied: typing-extensions>=3.6.2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from onnx) (3.7.4.2)\nRequirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.3 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2.4.7)\nRequirement already satisfied: kiwisolver>=1.0.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (1.2.0)\nRequirement already satisfied: python-dateutil>=2.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2.8.1)\nRequirement already satisfied: cycler>=0.10 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (0.10.0)\nRequirement already satisfied: pillow>=6.2.0 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (7.2.0)\nRequirement already satisfied: certifi>=2020.06.20 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from matplotlib) (2020.6.20)\nRequirement already satisfied: click in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers) (7.1.2)\nRequirement already satisfied: joblib in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from sacremoses->transformers) (0.15.1)\nRequirement already satisfied: urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (1.25.9)\nRequirement already satisfied: chardet<4,>=3.0.2 in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (3.0.4)\nRequirement already satisfied: idna<3,>=2.5 in 
/home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from requests->transformers) (2.9)\nRequirement already satisfied: setuptools in /home/mfuntowicz/miniconda3/envs/pytorch/lib/python3.8/site-packages (from protobuf->onnxruntime-gpu) (47.1.1.post20200604)\n"
]
],
[
[
"## Preparing for an Inference Session\n\n---\n\nInference is done using a specific backend definition which turns on hardware-specific optimizations of the graph.\n\nOptimizations are basically of three kinds:\n\n- **Constant Folding**: Convert static variables to constants in the graph\n- **Dead Code Elimination**: Remove nodes never accessed in the graph\n- **Operator Fusing**: Merge multiple instructions into one (Linear -> ReLU can be fused into LinearReLU)\n\nONNX Runtime automatically applies most optimizations by setting specific `SessionOptions`.\n\nNote: Some of the latest optimizations that are not yet integrated into ONNX Runtime are available in the [optimization script](https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers) that tunes models for the best performance.",
"_____no_output_____"
]
],
[
[
"# # An optional step unless\n# # you want to get a model with mixed precision for perf accelartion on newer GPU\n# # or you are working with Tensorflow(tf.keras) models or pytorch models other than bert\n\n# !pip install onnxruntime-tools\n# from onnxruntime_tools import optimizer\n\n# # Mixed precision conversion for bert-base-cased model converted from Pytorch\n# optimized_model = optimizer.optimize_model(\"bert-base-cased.onnx\", model_type='bert', num_heads=12, hidden_size=768)\n# optimized_model.convert_model_float32_to_float16()\n# optimized_model.save_model_to_file(\"bert-base-cased.onnx\")\n\n# # optimizations for bert-base-cased model converted from Tensorflow(tf.keras)\n# optimized_model = optimizer.optimize_model(\"bert-base-cased.onnx\", model_type='bert_keras', num_heads=12, hidden_size=768)\n# optimized_model.save_model_to_file(\"bert-base-cased.onnx\")\n\n\n# optimize transformer-based models with onnxruntime-tools\nfrom onnxruntime_tools import optimizer\nfrom onnxruntime_tools.transformers.onnx_model_bert import BertOptimizationOptions\n\n# disable embedding layer norm optimization for better model size reduction\nopt_options = BertOptimizationOptions('bert')\nopt_options.enable_embed_layer_norm = False\n\nopt_model = optimizer.optimize_model(\n 'onnx/bert-base-cased.onnx',\n 'bert', \n num_heads=12,\n hidden_size=768,\n optimization_options=opt_options)\nopt_model.save_model_to_file('bert.opt.onnx')\n",
"_____no_output_____"
],
[
"from os import environ\nfrom psutil import cpu_count\n\n# Constants from the performance optimization available in onnxruntime\n# It needs to be done before importing onnxruntime\nenviron[\"OMP_NUM_THREADS\"] = str(cpu_count(logical=True))\nenviron[\"OMP_WAIT_POLICY\"] = 'ACTIVE'\n\nfrom onnxruntime import GraphOptimizationLevel, InferenceSession, SessionOptions, get_all_providers",
"_____no_output_____"
],
[
"from contextlib import contextmanager\nfrom dataclasses import dataclass\nfrom time import time\nfrom tqdm import trange\n\ndef create_model_for_provider(model_path: str, provider: str) -> InferenceSession: \n \n assert provider in get_all_providers(), f\"provider {provider} not found, {get_all_providers()}\"\n\n # Few properties that might have an impact on performances (provided by MS)\n options = SessionOptions()\n options.intra_op_num_threads = 1\n options.graph_optimization_level = GraphOptimizationLevel.ORT_ENABLE_ALL\n\n # Load the model as a graph and prepare the CPU backend \n session = InferenceSession(model_path, options, providers=[provider])\n session.disable_fallback()\n \n return session\n\n\n@contextmanager\ndef track_infer_time(buffer: [int]):\n start = time()\n yield\n end = time()\n\n buffer.append(end - start)\n\n\n@dataclass\nclass OnnxInferenceResult:\n model_inference_time: [int] \n optimized_model_path: str",
"_____no_output_____"
]
],
[
[
"## Forwarding through our optimized ONNX model running on CPU\n\n---\n\nWhen the model is loaded for inference over a specific provider, for instance **CPUExecutionProvider** as above, an optimized graph can be saved. This graph might include various optimizations, and you might be able to see some **higher-level** operations in the graph _(through [Netron](https://github.com/lutzroeder/Netron) for instance)_ such as:\n- **EmbedLayerNormalization**\n- **Attention**\n- **FastGeLU**\n\nThese operations are an example of the kind of optimization **onnxruntime** is doing, for instance here gathering multiple operations into a bigger one _(Operator Fusing)_.",
"_____no_output_____"
]
],
[
[
"from transformers import BertTokenizerFast\n\ntokenizer = BertTokenizerFast.from_pretrained(\"bert-base-cased\")\ncpu_model = create_model_for_provider(\"onnx/bert-base-cased.onnx\", \"CPUExecutionProvider\")\n\n# Inputs are provided through numpy array\nmodel_inputs = tokenizer(\"My name is Bert\", return_tensors=\"pt\")\ninputs_onnx = {k: v.cpu().detach().numpy() for k, v in model_inputs.items()}\n\n# Run the model (None = get all the outputs)\nsequence, pooled = cpu_model.run(None, inputs_onnx)\n\n# Print information about outputs\n\nprint(f\"Sequence output: {sequence.shape}, Pooled output: {pooled.shape}\")",
"loading file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt from cache at /home/mfuntowicz/.cache/torch/transformers/5e8a2b4893d13790ed4150ca1906be5f7a03d6c4ddf62296c383f6db42814db2.e13dbb970cb325137104fb2e5f36fe865f27746c6b526f6352861b1980eb80b1\n"
]
],
[
[
"# Benchmarking PyTorch model\n\n_Note: PyTorch model benchmark is run on CPU_",
"_____no_output_____"
]
],
[
[
"from transformers import BertModel\n\nPROVIDERS = {\n (\"cpu\", \"PyTorch CPU\"),\n# Uncomment this line to enable GPU benchmarking\n# (\"cuda:0\", \"PyTorch GPU\")\n}\n\nresults = {}\n\nfor device, label in PROVIDERS:\n \n # Move inputs to the correct device\n model_inputs_on_device = {\n arg_name: tensor.to(device)\n for arg_name, tensor in model_inputs.items()\n }\n\n # Add PyTorch to the providers\n model_pt = BertModel.from_pretrained(\"bert-base-cased\").to(device)\n for _ in trange(10, desc=\"Warming up\"):\n model_pt(**model_inputs_on_device)\n\n # Compute \n time_buffer = []\n for _ in trange(100, desc=f\"Tracking inference time on PyTorch\"):\n with track_infer_time(time_buffer):\n model_pt(**model_inputs_on_device)\n\n # Store the result\n results[label] = OnnxInferenceResult(\n time_buffer, \n None\n ) ",
"loading configuration file https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json from cache at /home/mfuntowicz/.cache/torch/transformers/b945b69218e98b3e2c95acf911789741307dec43c698d35fad11c1ae28bda352.9da767be51e1327499df13488672789394e2ca38b877837e52618a67d7002391\nModel config BertConfig {\n \"architectures\": [\n \"BertForMaskedLM\"\n ],\n \"attention_probs_dropout_prob\": 0.1,\n \"gradient_checkpointing\": false,\n \"hidden_act\": \"gelu\",\n \"hidden_dropout_prob\": 0.1,\n \"hidden_size\": 768,\n \"initializer_range\": 0.02,\n \"intermediate_size\": 3072,\n \"layer_norm_eps\": 1e-12,\n \"max_position_embeddings\": 512,\n \"model_type\": \"bert\",\n \"num_attention_heads\": 12,\n \"num_hidden_layers\": 12,\n \"pad_token_id\": 0,\n \"type_vocab_size\": 2,\n \"vocab_size\": 28996\n}\n\nloading weights file https://cdn.huggingface.co/bert-base-cased-pytorch_model.bin from cache at /home/mfuntowicz/.cache/torch/transformers/d8f11f061e407be64c4d5d7867ee61d1465263e24085cfa26abf183fdc830569.3fadbea36527ae472139fe84cddaa65454d7429f12d543d80bfc3ad70de55ac2\nAll model checkpoint weights were used when initializing BertModel.\n\nAll the weights of BertModel were initialized from the model checkpoint at bert-base-cased.\nIf your task is similar to the task the model of the checkpoint was trained on, you can already use BertModel for predictions without further training.\nWarming up: 100%|██████████| 10/10 [00:00<00:00, 39.30it/s]\nTracking inference time on PyTorch: 100%|██████████| 100/100 [00:02<00:00, 41.09it/s]\n"
]
],
[
[
"## Benchmarking PyTorch & ONNX on CPU\n\n_**Disclaimer: results may vary depending on the actual hardware used to run the model**_",
"_____no_output_____"
]
],
[
[
"PROVIDERS = {\n (\"CPUExecutionProvider\", \"ONNX CPU\"),\n# Uncomment this line to enable GPU benchmarking\n# (\"CUDAExecutionProvider\", \"ONNX GPU\")\n}\n\n\nfor provider, label in PROVIDERS:\n # Create the model with the specified provider\n model = create_model_for_provider(\"onnx/bert-base-cased.onnx\", provider)\n\n # Keep track of the inference time\n time_buffer = []\n\n # Warm up the model\n model.run(None, inputs_onnx)\n\n # Compute \n for _ in trange(100, desc=f\"Tracking inference time on {provider}\"):\n with track_infer_time(time_buffer):\n model.run(None, inputs_onnx)\n\n # Store the result\n results[label] = OnnxInferenceResult(\n time_buffer,\n model.get_session_options().optimized_model_filepath\n )",
"Tracking inference time on CPUExecutionProvider: 100%|██████████| 100/100 [00:01<00:00, 63.62it/s]\n"
],
[
"%matplotlib inline\n\nimport matplotlib\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport os\n\n\n# Compute average inference time + std\ntime_results = {k: np.mean(v.model_inference_time) * 1e3 for k, v in results.items()}\ntime_results_std = np.std([v.model_inference_time for v in results.values()]) * 1000\n\nplt.rcdefaults()\nfig, ax = plt.subplots(figsize=(16, 12))\nax.set_ylabel(\"Avg Inference time (ms)\")\nax.set_title(\"Average inference time (ms) for each provider\")\nax.bar(time_results.keys(), time_results.values(), yerr=time_results_std)\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Quantization support from transformers\n\nQuantization enables the use of integer (_instead of floating-point_) arithmetic to run neural network models faster. From a high-level point of view, quantization maps the float32 range of values to int8 with as little loss in model performance as possible.\n\nHugging Face provides a conversion tool as part of the transformers repository to easily export quantized models to ONNX Runtime. For more information, please refer to the following: \n\n- [Hugging Face Documentation on ONNX Runtime quantization supports](https://huggingface.co/transformers/master/serialization.html#quantization)\n- [Intel's Explanation of Quantization](https://nervanasystems.github.io/distiller/quantization.html)\n\nWith this method, the accuracy of the model remains at the same level as the full-precision model. If you want to see benchmarks on model performance, we recommend reading the [ONNX Runtime notebook](https://github.com/microsoft/onnxruntime/blob/master/onnxruntime/python/tools/quantization/notebooks/Bert-GLUE_OnnxRuntime_quantization.ipynb) on the subject.",
"_____no_output_____"
],
[
"# Benchmarking PyTorch quantized model",
"_____no_output_____"
]
],
[
[
"import torch \n\n# Quantize\nmodel_pt_quantized = torch.quantization.quantize_dynamic(\n model_pt.to(\"cpu\"), {torch.nn.Linear}, dtype=torch.qint8\n)\n\n# Warm up \nmodel_pt_quantized(**model_inputs)\n\n# Benchmark PyTorch quantized model\ntime_buffer = []\nfor _ in trange(100):\n with track_infer_time(time_buffer):\n model_pt_quantized(**model_inputs)\n \nresults[\"PyTorch CPU Quantized\"] = OnnxInferenceResult(\n time_buffer,\n None\n)",
"100%|██████████| 100/100 [00:01<00:00, 90.15it/s]\n"
]
],
[
[
"# Benchmarking ONNX quantized model",
"_____no_output_____"
]
],
[
[
"from pathlib import Path\n\nfrom transformers.convert_graph_to_onnx import quantize\n\n# Transformers allows you to easily convert a float32 model to quantized int8 with ONNX Runtime\nquantized_model_path = quantize(Path(\"bert.opt.onnx\"))\n\n# Then you just have to load it through ONNX Runtime as you would normally do\nquantized_model = create_model_for_provider(quantized_model_path.as_posix(), \"CPUExecutionProvider\")\n\n# Warm up the overall model to have a fair comparison\noutputs = quantized_model.run(None, inputs_onnx)\n\n# Evaluate performance\ntime_buffer = []\nfor _ in trange(100, desc=f\"Tracking inference time on CPUExecutionProvider with quantized model\"):\n    with track_infer_time(time_buffer):\n        outputs = quantized_model.run(None, inputs_onnx)\n\n# Store the result\nresults[\"ONNX CPU Quantized\"] = OnnxInferenceResult(\n    time_buffer, \n    quantized_model_path\n) ",
"As of onnxruntime 1.4.0, models larger than 2GB will fail to quantize due to protobuf constraint.\nThis limitation will be removed in the next release of onnxruntime.\nQuantized model has been written at bert.onnx: ✔\n"
]
],
[
[
"## Show the inference performance of each providers ",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n\nimport matplotlib\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport os\n\n\n# Compute average inference time + std\ntime_results = {k: np.mean(v.model_inference_time) * 1e3 for k, v in results.items()}\ntime_results_std = np.std([v.model_inference_time for v in results.values()]) * 1000\n\nplt.rcdefaults()\nfig, ax = plt.subplots(figsize=(16, 12))\nax.set_ylabel(\"Avg Inference time (ms)\")\nax.set_title(\"Average inference time (ms) for each provider\")\nax.bar(time_results.keys(), time_results.values(), yerr=time_results_std)\nplt.show()",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07b8140f8e8196d1730572356e02db10289d7d4 | 167,659 | ipynb | Jupyter Notebook | exercises/1_CUDA_programming_model.ipynb | pjarosik/ius-2021-gpu-short-course | 5731cd9f7caa37db21463c07c38ec0b8dfd48040 | [
"Unlicense"
] | null | null | null | exercises/1_CUDA_programming_model.ipynb | pjarosik/ius-2021-gpu-short-course | 5731cd9f7caa37db21463c07c38ec0b8dfd48040 | [
"Unlicense"
] | null | null | null | exercises/1_CUDA_programming_model.ipynb | pjarosik/ius-2021-gpu-short-course | 5731cd9f7caa37db21463c07c38ec0b8dfd48040 | [
"Unlicense"
] | null | null | null | 80.799518 | 106,785 | 0.813186 | [
[
[
"# Lecture 1. Introduction to CUDA Programming Model and Toolkit.\n\n",
"_____no_output_____"
],
[
"Welcome to the GPU short course exercises!\n\nDuring this course we will introduce you to the syntax of CUDA programming in Python using the `numba.cuda` package.",
"_____no_output_____"
],
[
"During the first lecture we will try to focus on optimizing the following function, which adds two vectors together using a single CPU core.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\ndef add_vectors(a, b):\n assert len(a) == len(b)\n \n n = len(a)\n result = [None]*n\n for i in range(n):\n result[i] = a[i] + b[i]\n return result\n\nadd_vectors([1, 2, 3, 4], \n [4, 5, 6, 7])",
"_____no_output_____"
]
],
[
[
"Let's measure the time needed to execute `add_vectors` for large arrays:",
"_____no_output_____"
]
],
[
[
"a = np.random.rand(2**24) # ~ 1e7 elements\nb = np.random.rand(2**24)",
"_____no_output_____"
],
[
"%%timeit -n 2\nadd_vectors(a, b)",
"2 loops, best of 5: 6.46 s per loop\n"
]
],
[
[
"In the following sections we will show you how to optimize the above implementation by reimplementing vector addition on the GPU.",
"_____no_output_____"
],
[
"## Exercise 1.1. CUDA kernels and CUDA threads.\n\n",
"_____no_output_____"
],
[
"#### 1.1.1. One-dimensional grid.",
"_____no_output_____"
],
[
"Let's do all the necessary Python imports first.",
"_____no_output_____"
]
],
[
[
"import math\nfrom numba import cuda\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"The `numba.cuda` package provides the ability to write CUDA kernels directly in the Python language. We will describe Numba in more detail later. For now, it is enough to understand that:\n- the code that is executed by each GPU thread separately is called a *GPU kernel*,\n- in Numba, a GPU kernel is a Python function with the `@cuda.jit` decorator.\n\nThe code below creates a CUDA kernel which simply does nothing (*NOP*).",
"_____no_output_____"
]
],
[
[
"@cuda.jit\ndef my_first_gpu_kernel():\n pass",
"_____no_output_____"
]
],
[
[
"The following line launches the kernel to be executed by 64 thread blocks, each block containing 256 threads:",
"_____no_output_____"
]
],
[
[
"my_first_gpu_kernel[64, 256]()",
"_____no_output_____"
]
],
[
[
"Ok, now that we know the syntax for writing and launching CUDA kernel code, we can proceed with porting the `add_vectors` function to the GPU device. \n\nThe key is to note that vector element additions are *independent tasks* - that is, for each `i` and `j`, the result of `a[i]+b[i]` does not depend on `a[j]+b[j]` and vice versa.\n\nSo, let's move line 9 of the `add_vectors` function to the new GPU kernel implementation: ",
"_____no_output_____"
]
],
[
[
"@cuda.jit\ndef add_vectors_kernel(result, a, b):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n\n # Make sure not to go outside the grid area!\n if i >= len(result):\n return\n\n result[i] = a[i] + b[i]",
"_____no_output_____"
]
],
[
[
"The above code is executed by each CUDA thread separately. \n\nThe fields `blockIdx`, `blockDim` and `threadIdx` allow you to determine the position of the current thread in the whole grid of threads executed by the CUDA device:\n- `blockIdx` is the identifier of the currently executed block of threads in the grid,\n- `blockDim` is the size of a single block of threads,\n- `threadIdx` is the identifier of the currently executed thread within a single block.\n\nA grid of threads can have one, two or three dimensions, described by the `(cuda.gridDim.z, cuda.gridDim.y, cuda.gridDim.x)` tuple. \n\nEach thread in a block has coordinates `(cuda.threadIdx.z, cuda.threadIdx.y, cuda.threadIdx.x)`. \n\nEach block in a grid has coordinates `(cuda.blockIdx.z, cuda.blockIdx.y, cuda.blockIdx.x)`. \n\nThe `x` coordinate changes the fastest: two adjacent threads in the same block differ in the value of the `x` coordinate by 1.\n\nGrid and block dimensions can be specified via the `grid_size` and `block_size` launch parameters: a single scalar value means that a 1-D grid will be used, a pair of values imposes a 2-D grid, and three values a 3-D grid. ",
"_____no_output_____"
],
[
"Now, we would like to run the above kernel for each `result[i]`. Let's assume for a moment that we want the above CUDA kernel to be executed by 256 threads in parallel - i.e. one block will consists of 256 threads. To cover the entire input array, the kernel has to be executed by $\\left\\lceil \\frac{n}{256} \\right\\rceil$ blocks of threads. ",
"_____no_output_____"
]
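,
[
"As a quick aside (our illustrative addition, not part of the original exercise), the grid size formula above can be checked with plain Python - `math.ceil(n/block_size)` and the integer-only idiom `(n + block_size - 1) // block_size` give the same block count:",
"_____no_output_____"
],
[
"# Quick sanity check (illustrative addition): two equivalent ways to compute\n# the number of thread blocks needed to cover n elements with block_size\n# threads per block.\nimport math\n\nn, block_size = 2**24, 256\ngrid_a = math.ceil(n / block_size)\ngrid_b = (n + block_size - 1) // block_size  # integer-only idiom\nprint(grid_a, grid_b)  # 65536 65536\nassert grid_a == grid_b and grid_a * block_size >= n",
"_____no_output_____"
]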
],
[
[
"def add_vectors_gpu(a, b):\n assert len(a) == len(b)\n # Create output array in the GPU memory.\n result = cuda.device_array(shape=a.shape, dtype=a.dtype)\n\n block_size = 256\n grid_size = math.ceil(len(a)/block_size)\n add_vectors_kernel[grid_size, block_size](result, a, b)\n return result.copy_to_host()",
"_____no_output_____"
],
[
"%%timeit\nadd_vectors_gpu(a, b)",
"1 loop, best of 5: 138 ms per loop\n"
]
],
[
[
"Congratulations! Your very first GPU kernel, wrapped in the `add_vectors_gpu` function, executes much faster than its CPU counterpart.\nOf course, writing CUDA kernels is not the only part of preparing a GPU processing pipeline. One of the other important things to consider is the heterogeneous nature of CPU-GPU processing: the data transfer between the GPU and the host computer.",
"_____no_output_____"
],
[
"#### 1.1.2. Two-dimensional grid.",
"_____no_output_____"
],
[
"In the previous example, the grid of threads was defined in a single dimension, i.e. the variables `grid_size` and `block_size` were single scalar values. This time we will implement a function that adds two **matrices**, and we will use a 2-D grid of threads for this purpose.",
"_____no_output_____"
],
[
"Let's implement `add_matrices` for the CPU first. ",
"_____no_output_____"
]
],
[
[
"import itertools\nimport numpy as np \n\ndef add_matrices(a, b):\n a = np.array(a)\n b = np.array(b)\n assert a.shape == b.shape\n \n height, width = a.shape\n result = np.zeros(a.shape, dtype=a.dtype)\n \n for i, j in itertools.product(range(height), range(width)):\n result[i, j] = a[i, j] + b[i, j]\n return result\n\nadd_matrices(\n # a = \n [[ 1, 2, 3, 4], \n [ 4, 5, 6, 7]],\n # b = \n [[-1, -2, -3, -4],\n [ 1, 1, 1, 1]])",
"_____no_output_____"
],
[
"A = np.random.rand(2**12, 2**12) # ~ 1e7 elements\nB = np.random.rand(2**12, 2**12)\n\nC = add_matrices(A, B)\nnp.testing.assert_equal(C, A+B)",
"_____no_output_____"
],
[
"%%timeit -n 2\nadd_matrices(A, B)",
"2 loops, best of 5: 8.95 s per loop\n"
]
],
[
[
"Similarly to the `add_vectors_kernel` implementation, `add_matrices_kernel` will compute a single matrix element. This time, we will use the `y` coordinate to address matrix elements in the second dimension:\n\n",
"_____no_output_____"
]
],
[
[
"@cuda.jit\ndef add_matrices_kernel(result, a, b):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n j = cuda.blockIdx.y*cuda.blockDim.y + cuda.threadIdx.y\n\n height, width = result.shape\n\n # Make sure we are not accessing data outside available space!\n if i >= width or j >= height:\n return\n\n result[j, i] = a[j, i] + b[j, i]",
"_____no_output_____"
]
],
[
[
"Now, we must also pass the second dimension of the grid to the GPU kernel invocation parameters, as the implementation assumes a 2-D grid layout. The parameters `grid_size` and `block_size` now have to be pairs of integer values: ",
"_____no_output_____"
]
],
[
[
"def add_matrices_gpu(a, b):\n    assert a.shape == b.shape\n    # Create output array in the GPU memory.\n    result = cuda.device_array(shape=a.shape, dtype=a.dtype)\n\n    height, width = a.shape\n    block_size = (16, 16)\n    grid_size = (math.ceil(width/block_size[0]), math.ceil(height/block_size[1]))\n    add_matrices_kernel[grid_size, block_size](result, a, b)\n    return result.copy_to_host()\n\nadd_matrices_gpu(\n    # a = \n    np.array([[ 1, 2, 3, 4], \n              [ 4, 5, 6, 7]]),\n    # b = \n    np.array([[-1, -2, -3, -4],\n              [ 1, 1, 1, 1]]))",
"_____no_output_____"
]
],
[
[
"Let's test the implementation first:",
"_____no_output_____"
]
],
[
[
"C = add_matrices_gpu(A, B)\nnp.testing.assert_equal(C, A+B)",
"_____no_output_____"
]
],
[
[
"Now let's compare the CPU and GPU processing times:",
"_____no_output_____"
]
],
[
[
"%%timeit\nadd_matrices_gpu(A, B)",
"10 loops, best of 5: 140 ms per loop\n"
]
],
[
[
"We leave the implementation of adding two 3D arrays as homework for the students.",
"_____no_output_____"
],
[
"#### 1.1.3. See also\n\n- CUDA kernels in Numba: introduction: https://numba.readthedocs.io/en/0.52.0/cuda/kernels.html#introduction\n\n- Matrix multiplication example https://numba.readthedocs.io/en/0.52.0/cuda/examples.html#matrix-multiplication",
"_____no_output_____"
],
[
"## Exercise 1.2. Transferring data to and from GPU memory.\n\n",
"_____no_output_____"
],
[
"In the previous examples, we passed the `numpy` arrays directly to the GPU kernel code. The `numpy` arrays were stored in the host PC's main memory. As GPU computing can only be performed on data located in GPU memory, the Numba package implicitly transferred the data from the PC's memory to GPU global memory first.\n\nThe data transfers can also be run explicitly, if necessary:",
"_____no_output_____"
]
],
[
[
"block_size = 256\ngrid_size = math.ceil(len(a)/block_size)\n\n# Create an array for the result in the GPU global memory. \nresult_gpu = cuda.device_array(shape=a.shape, dtype=a.dtype)\n# Here are the explicit data transfers from host PC memory to GPU global memory:\na_gpu = cuda.to_device(a)\nb_gpu = cuda.to_device(b)\n\nadd_vectors_kernel[grid_size, block_size](result_gpu, a_gpu, b_gpu)\n# After the computations are done, transfer the results to the host PC memory.\nresult = result_gpu.copy_to_host()",
"_____no_output_____"
]
],
[
[
"Data transfer to and from GPU memory is only possible with GPU global memory. The following functions are available in Numba:\n\n- create an array in the GPU global memory: `numba.cuda.device_array`, or `numba.cuda.device_array_like`,\n- host PC to GPU global memory transfer: `numba.cuda.to_device`\n- GPU global memory to host PC memory transfer: `gpu_array.copy_to_host`, where `gpu_array` is a GPU array. \n\nThe complete list of Numba's functions for data transfer to and from GPU is available here:\nhttps://numba.readthedocs.io/en/0.52.0/cuda/memory.html#data-transfer",
"_____no_output_____"
],
[
"The advantage of heterogeneous programming with CUDA is that computing performed on the GPU can be done in parallel with the operations performed by the CPU -- both the CPU and the GPU are separate processing devices that can work simultaneously. In other words, CUDA kernel invocations are **asynchronous**: when we invoke a GPU kernel, the only job the CPU does is to enqueue the kernel to be executed on the GPU; then it returns immediately.\n(In fact, this is not always true for kernels written in Numba - the first launch of a CUDA kernel may also require Python code compilation. This topic will be discussed later in this lecture.)\n\nFor example, the following GPU kernel call takes much less time than we've seen so far:",
"_____no_output_____"
]
],
[
[
"%%timeit -n 100\nadd_vectors_kernel[grid_size, block_size](result_gpu, a_gpu, b_gpu)",
"100 loops, best of 5: 172 µs per loop\n"
]
],
[
[
"The difference is that we did not transfer data from the GPU to the CPU. Let's try now with the GPU -> CPU transfer:",
"_____no_output_____"
]
],
[
[
"%%timeit -n 100\nadd_vectors_kernel[grid_size, block_size](result_gpu, a_gpu, b_gpu)\nresult = result_gpu.copy_to_host()",
"100 loops, best of 5: 34.6 ms per loop\n"
]
],
[
[
"The difference is due to the fact that the data transfer is a blocking operation - it waits for all queued operations to be performed, then performs the transfer. To wait for the kernels to execute explicitly, without transferring the result data to host PC, run `cuda.default_stream().synchronize()`.",
"_____no_output_____"
]
],
[
[
"%%timeit -n 100\nadd_vectors_kernel[grid_size, block_size](result_gpu, a_gpu, b_gpu)\ncuda.default_stream().synchronize()",
"100 loops, best of 5: 1.8 ms per loop\n"
]
],
[
[
"CUDA streams will be covered in more detail later in this short-course.",
"_____no_output_____"
],
[
"### 1.2.1 See also",
"_____no_output_____"
],
[
"- CUDA Toolkit documentation: device memory management (CUDA C++): https://docs.nvidia.com/cuda/cuda-c-programming-guide/index.html#device-memory\n- The complete list of Numba's functions for data transfer to and from GPU: https://numba.readthedocs.io/en/0.52.0/cuda/memory.html#data-transfer\n",
"_____no_output_____"
],
[
"## Exercise 1.3. CUDA Toolkit and Numba package.",
"_____no_output_____"
],
[
"Nvidia provides several tools in its Toolkit that help in the implementation and testing of GPU code. In this exercise we will show you how to:\n- check what parameters your hardware has, e.g. programmatically determine how many GPUs are available, how much memory of each kind they have, and so on,\n- debug and memcheck your Python CUDA kernels,\n- profile CUDA code execution time.\n\nAlso, we will introduce the Numba Python package in more detail, which we will use throughout the whole course.",
"_____no_output_____"
],
[
"### Exercise 1.3.1. CUDA device diagnostics.",
"_____no_output_____"
],
[
"The most basic diagnostic tool for GPU cards is `nvidia-smi`, which displays the current status of all available GPU cards.\n\n(NOTE: the `nvidia-smi` tool is not available on Nvidia Jetson processors. For SoC chips, please use the built-in `tegrastats` or install [`jtop`](https://pypi.org/project/jetson-stats/)).",
"_____no_output_____"
]
],
[
[
"! nvidia-smi",
"Sun Jun 27 10:32:17 2021 \n+-----------------------------------------------------------------------------+\n| NVIDIA-SMI 465.27 Driver Version: 460.32.03 CUDA Version: 11.2 |\n|-------------------------------+----------------------+----------------------+\n| GPU Name Persistence-M| Bus-Id Disp.A | Volatile Uncorr. ECC |\n| Fan Temp Perf Pwr:Usage/Cap| Memory-Usage | GPU-Util Compute M. |\n| | | MIG M. |\n|===============================+======================+======================|\n| 0 Tesla T4 Off | 00000000:00:04.0 Off | 0 |\n| N/A 60C P0 66W / 70W | 1256MiB / 15109MiB | 86% Default |\n| | | N/A |\n+-------------------------------+----------------------+----------------------+\n \n+-----------------------------------------------------------------------------+\n| Processes: |\n| GPU GI CI PID Type Process name GPU Memory |\n| ID ID Usage |\n|=============================================================================|\n+-----------------------------------------------------------------------------+\n"
]
],
[
[
"`nvidia-smi` outputs information about:\n- the installed NVIDIA driver and CUDA Toolkit,\n- for each available GPU:\n - temperature, memory usage and GPU utilization,\n - processes that are currently running on that GPU.\n\n ",
"_____no_output_____"
],
[
"`nvidia-smi` is a command line tool, so use it in your shell to quickly check the state of your GPU. \n\nThe CUDA SDK also provides a programmatic way to access the device description at application run time, e.g. to check that we are not exceeding the available GPU global memory.\n\nIn the SDK, the CUDA device description is called *device properties*. \n\nTo get the device properties, we will use the `cupy` package, which exposes the CUDA SDK interface in Python in a convenient way.\n\nLet's first check how many GPU cards we have:\n\n",
"_____no_output_____"
]
],
[
[
"import cupy as cp\ncp.cuda.runtime.getDeviceCount()",
"_____no_output_____"
]
],
[
[
"Now, let's check:\n\n- what the name of the device is and what its compute capability is,\n- what the GPU clock frequency is,\n- how much global, shared and constant memory our GPU card has.",
"_____no_output_____"
]
],
[
[
"device_props = cp.cuda.runtime.getDeviceProperties(0)\n\nprint(f\"Device: {device_props['name']} (cc {device_props['major']}.{device_props['minor']})\")\nprint(f\"GPU clock frequency: {device_props['clockRate']/1e3} MHz\")\nprint(\"Available memory: \")\nprint(f\"- global memory: {device_props['totalGlobalMem']/2**20} MiB\")\nprint(f\"- shared memory per thread block: {device_props['sharedMemPerBlock']} B\")\nprint(f\"- constant memory: {device_props['totalConstMem']} B\")",
"Device: b'Tesla T4' (cc 7.5)\nGPU clock frequency: 1590.0 MHz\nAvailable memory: \n- global memory: 15109.75 MiB\n- shared memory per thread block: 49152 B\n- constant memory: 65536 B\n"
]
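,
[
"# Back-of-the-envelope sketch (our addition, not from the original notebook):\n# theoretical peak global-memory bandwidth derived from device properties.\n# The values below are example numbers for a Tesla T4 (an assumption); on your\n# machine read device_props['memoryClockRate'] and device_props['memoryBusWidth'].\nmemory_clock_khz = 5001000  # 'memoryClockRate' is reported in kHz\nbus_width_bits = 256        # 'memoryBusWidth' is reported in bits\n\n# DDR memory transfers data on both clock edges, hence the factor of 2.\npeak_bw_gb_s = 2 * (memory_clock_khz * 1e3) * (bus_width_bits / 8) / 1e9\nprint(f\"Theoretical peak memory bandwidth: {peak_bw_gb_s:.1f} GB/s\")",
"_____no_output_____"
]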
],
[
[
"The complete list of device properties is available [here](https://docs.nvidia.com/cuda/cuda-runtime-api/group__CUDART__DEVICE.html#group__CUDART__DEVICE_1g1bf9d625a931d657e08db2b4391170f0).\n",
"_____no_output_____"
],
[
"### Exercise 1.3.2. Numba.",
"_____no_output_____"
],
[
"- Numba is a just-in-time (JIT) compiler for Python.\n- It generates machine code from Python bytecode using the LLVM compiler library, which results in a significant speed up.\n- It works best on code that uses NumPy arrays and functions, and loops.\n- Numba can target NVIDIA CUDA and (experimentally) AMD ROC GPUs. In other words, it allows for (relatively) easy creation of Python code executed on the GPU, which results (potentially) in a significant speed-up.\n\nNumba documentation is available here: https://numba.pydata.org/numba-doc/latest/index.html",
"_____no_output_____"
],
[
"The thing that needs to be stressed here is that Numba is a JIT compiler - that means it compiles a given function to machine code *lazily*, **on the first function call**. Compilation is performed only once - the first time a given function is run. After that, a cached version of the machine code is used.\n\nLet's see how long it takes to execute a brand new kernel for the first time:",
"_____no_output_____"
]
],
[
[
"@cuda.jit\ndef increment(x):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n x[i] += 2",
"_____no_output_____"
],
[
"%time increment[1, 5](np.arange(5))",
"CPU times: user 125 ms, sys: 5.01 ms, total: 130 ms\nWall time: 132 ms\n"
]
],
[
[
"The second kernel execution takes:",
"_____no_output_____"
]
],
[
[
"%time increment[1, 5](np.arange(5))",
"CPU times: user 1.52 ms, sys: 10 µs, total: 1.53 ms\nWall time: 1.34 ms\n"
]
],
[
[
"### Exercise 1.3.3. CUDA-MEMCHECK and debugging Numba code.\n",
"_____no_output_____"
],
[
"#### 1.3.3.1 CUDA-MEMCHECK",
"_____no_output_____"
],
[
"CUDA-MEMCHECK is a tool available in the CUDA SDK that makes it possible to check whether a CUDA application makes any of the following errors:\n- misaligned and out of bounds memory access errors,\n- shared memory data races,\n- uninitialized accesses to global memory. ",
"_____no_output_____"
],
[
"Let's debug the Python script below in order to detect any memory issues it may cause. According to the Numba [documentation](https://numba.pydata.org/numba-doc/latest/user/troubleshoot.html#debug-info), we can pass the `debug=True` parameter to the `@cuda.jit` decorator in order to get some more information about the analyzed kernel. Let's do that: save the cell below to a Python script, and run CUDA-MEMCHECK on the Python interpreter.",
"_____no_output_____"
]
],
[
[
"%%writefile 1_3_3_memcheck.py\n\nimport os \nimport numpy as np\nfrom numba import cuda\nimport math\n\[email protected](debug=True)\ndef add_vectors_invalid(result, a, b):\n pass\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n # What are we missing here?\n result[i] = a[i] + b[i]\n\n\na = np.arange(255)\nb = np.arange(255)\nresult = cuda.device_array(a.shape, dtype=a.dtype)\n\nadd_vectors_invalid[1, 256](result, a, b)\nresult_host = result.copy_to_host()",
"Writing 1_3_3_memcheck.py\n"
]
],
[
[
"The only thing the above cell does is save the Python code to the `1_3_3_memcheck.py` script in the current directory (you can check it using the `! pwd` command). Now, we can run `cuda-memcheck` along with the Python interpreter in order to see if there are any issues with the script.",
"_____no_output_____"
]
],
[
[
"! cuda-memcheck --show-backtrace no python 1_3_3_memcheck.py",
"========= CUDA-MEMCHECK\n========= Invalid __global__ read of size 8\n========= at 0x00000490 in cudapy::__main__::add_vectors_invalid$241(Array<__int64, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>)\n========= by thread (255,0,0) in block (0,0,0)\n========= Address 0x7fb806200ff8 is out of bounds\n=========\n========= Program hit CUDA_ERROR_LAUNCH_FAILED (error 719) due to \"unspecified launch failure\" on CUDA API call to cuMemcpyDtoH_v2.\n=========\nTraceback (most recent call last):\n File \"1_3_3_memcheck.py\", line 19, in <module>\n add_vectors_invalid[1, 256](result, a, b)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/compiler.py\", line 770, in __call__\n self.stream, self.sharedmem)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/compiler.py\", line 862, in call\n kernel.launch(args, griddim, blockdim, stream, sharedmem)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/compiler.py\", line 655, in launch\n driver.device_to_host(ctypes.addressof(excval), excmem, excsz)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/cudadrv/driver.py\", line 2345, in device_to_host\n fn(host_pointer(dst), device_pointer(src), size, *varargs)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/cudadrv/driver.py\", line 302, in safe_cuda_api_call\n self._check_error(fname, retcode)\n File \"/usr/local/lib/python3.7/dist-packages/numba/cuda/cudadrv/driver.py\", line 337, in _check_error\n raise CudaAPIError(retcode, msg)\nnumba.cuda.cudadrv.driver.CudaAPIError: [719] Call to cuMemcpyDtoH results in CUDA_ERROR_LAUNCH_FAILED\n========= ERROR SUMMARY: 2 errors\n"
]
],
[
[
"As we can see, CUDA-MEMCHECK detected that the `add_vectors_invalid` kernel was not properly executed by thread `255`. What is causing the issue?",
"_____no_output_____"
],
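[
"A hint (our addition): the script launches `add_vectors_invalid[1, 256]`, i.e. 256 threads, while the arrays hold only 255 elements, so thread `255` reads one element past the end of `a` and `b`. The bounds check used in `add_vectors_kernel` earlier (`if i >= len(result): return`) prevents exactly this. The same off-by-one can be reproduced in plain Python:",
"_____no_output_____"
],
[
"# Plain-Python illustration (our addition) of the out-of-bounds access that\n# CUDA-MEMCHECK reported for thread 255: 256 thread indices, but 255 elements.\nimport numpy as np\n\na = np.arange(255)\nfor i in range(256):  # one index per launched thread\n    if i >= len(a):   # the guard missing from add_vectors_invalid\n        print(f\"thread {i} would read out of bounds\")",
"_____no_output_____"
],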
[
"#### 1.3.3.2 Debugging Numba kernels",
"_____no_output_____"
],
[
"The CUDA SDK includes debuggers that can be run on GPU kernel code, in case it is necessary to trace the cause of an issue. A list of CUDA debuggers that can be run on C++ CUDA kernel code is available here: [Linux](https://docs.nvidia.com/cuda/cuda-gdb/index.html), [Windows](https://docs.nvidia.com/nsight-visual-studio-edition/cuda-debugger/).\n\nFor applications that use Python with Numba to generate GPU machine code, users can run the Python debugger (`pdb`) directly on the kernel code using the CUDA simulator. More details about the simulator can be found [here](https://numba.pydata.org/numba-doc/dev/cuda/simulator.html). ",
"_____no_output_____"
],
[
"Let's use the CUDA simulator to print debug data directly in the kernel code. To be able to run the CUDA simulator, set the `NUMBA_ENABLE_CUDASIM` environment variable to `1`. ",
"_____no_output_____"
]
],
[
[
"%%writefile 1_3_3_numba_debugger.py\n\n# Turn on CUDA simulator.\nimport os \nos.environ['NUMBA_ENABLE_CUDASIM'] = '1'\n\nimport numpy as np\nfrom numba import cuda\nimport math\n\[email protected]\ndef add_vectors(result, a, b):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n # What are we missing here?\n result[i] = a[i] + b[i]\n if i == 10:\n print(f\"{result[i]} = {a[i]} + {b[i]}\")\n # or use PDB: import pdb; pdb.set_trace()\n\na = np.arange(255)\nb = np.arange(255)\nresult = cuda.device_array(a.shape, dtype=a.dtype)\n\nadd_vectors[1, 255](result, a, b)\nresult_host = result.copy_to_host()",
"Writing 1_3_3_numba_debugger.py\n"
],
[
"! python 1_3_3_numba_debugger.py",
"20 = 10 + 10\n"
]
],
[
[
"### Exercise 1.3.4. Profiling GPU code.\n",
"_____no_output_____"
],
[
"Sometimes, to better understand and optimize the performance of a GPU application, it is necessary to perform a dynamic program analysis and to measure specific metrics, for example the execution time and memory requirements of a particular CUDA kernel. The utility that performs such analysis is usually called a *code profiler*.\n\nNVIDIA provides a number of tools that enable code profiling. Some of them allow you to perform inspections from the command line, while others provide a graphical user interface that clearly presents various code metrics. In this exercise we will introduce the tools available in the CUDA ecosystem.\n\nNOTE: Currently, NVIDIA is migrating to the new profiling toolkits *NVIDIA Nsight Systems* and *NVIDIA Nsight Compute*. We will extend this exercise with examples of their use in the future.",
"_____no_output_____"
],
[
"#### 1.3.4.1. NVPROF",
"_____no_output_____"
],
[
"`nvprof` is a CUDA SDK tool that allows you to acquire profiling data directly from the command line. Documentation for the tool is available in NVIDIA's Profiler User's Guide.\n\nWe will use `nvprof` for the rest of this course.",
"_____no_output_____"
],
[
"Let's try profiling our `add_vectors_kernel` code first.",
"_____no_output_____"
]
],
[
[
"%%writefile 1_3_4_nvprof_add_vectors.py\n\nimport math\nfrom numba import cuda\nimport numpy as np\n\[email protected]\ndef add_vectors_kernel(result, a, b):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n # Make sure not to go outside the grid area!\n if i >= len(result):\n return\n result[i] = a[i] + b[i]\n\ny = cuda.device_array(4)\nadd_vectors_kernel[1, 4](y, np.array([1, 2, 3, 4]), np.array([4, 5, 6, 7]))\nresult = y.copy_to_host()\n\nnp.testing.assert_equal(result, [5, 7, 9, 11])",
"Writing 1_3_4_nvprof_add_vectors.py\n"
]
],
[
[
"The usage is the following:\n```\nnvprof [options] [application] [application-arguments]\n```",
"_____no_output_____"
],
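[
"Among the many available options, a per-invocation timeline of GPU activities can be printed with the `--print-gpu-trace` option, for example:\n\n```\nnvprof --print-gpu-trace python 1_3_4_nvprof_add_vectors.py\n```\n\nBelow we start with the default summary mode.",
"_____no_output_____"
],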
[
"For example, to run the above Python script:",
"_____no_output_____"
]
],
[
[
"! nvprof python 1_3_4_nvprof_add_vectors.py",
"==568== NVPROF is profiling process 568, command: python3 1_3_4_nvprof_add_vectors.py\n==568== Profiling application: python3 1_3_4_nvprof_add_vectors.py\n==568== Profiling result:\n Type Time(%) Time Calls Avg Min Max Name\n GPU activities: 46.64% 5.7600us 3 1.9200us 1.6000us 2.5280us [CUDA memcpy DtoH]\n 26.94% 3.3270us 1 3.3270us 3.3270us 3.3270us cudapy::__main__::add_vectors_kernel$241(Array<double, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>)\n 26.43% 3.2640us 2 1.6320us 1.3760us 1.8880us [CUDA memcpy HtoD]\n API calls: 77.42% 192.63ms 1 192.63ms 192.63ms 192.63ms cuDevicePrimaryCtxRetain\n 21.86% 54.384ms 1 54.384ms 54.384ms 54.384ms cuLinkAddData\n 0.22% 540.14us 1 540.14us 540.14us 540.14us cuDeviceTotalMem\n 0.15% 384.52us 1 384.52us 384.52us 384.52us cuModuleLoadDataEx\n 0.09% 223.19us 3 74.397us 7.6010us 199.53us cuMemAlloc\n 0.08% 199.26us 101 1.9720us 243ns 80.285us cuDeviceGetAttribute\n 0.04% 103.58us 1 103.58us 103.58us 103.58us cuLinkComplete\n 0.04% 94.693us 2 47.346us 35.640us 59.053us cuDeviceGetName\n 0.03% 67.610us 3 22.536us 15.027us 36.832us cuMemcpyDtoH\n 0.02% 56.992us 1 56.992us 56.992us 56.992us cuLinkCreate\n 0.01% 29.286us 2 14.643us 10.217us 19.069us cuMemcpyHtoD\n 0.01% 28.299us 1 28.299us 28.299us 28.299us cuMemGetInfo\n 0.01% 25.681us 1 25.681us 25.681us 25.681us cuLaunchKernel\n 0.00% 11.264us 13 866ns 242ns 3.7900us cuCtxGetCurrent\n 0.00% 6.1790us 1 6.1790us 6.1790us 6.1790us cuDeviceGetPCIBusId\n 0.00% 5.6830us 11 516ns 235ns 1.7140us cuCtxGetDevice\n 0.00% 3.5440us 3 1.1810us 706ns 1.7690us cuDeviceGet\n 0.00% 3.0540us 1 3.0540us 3.0540us 3.0540us cuInit\n 0.00% 2.7270us 4 681ns 274ns 1.2660us cuDeviceGetCount\n 0.00% 1.7460us 5 349ns 177ns 736ns cuFuncGetAttribute\n 0.00% 1.7400us 1 1.7400us 1.7400us 1.7400us cuLinkDestroy\n 0.00% 1.7180us 1 1.7180us 1.7180us 1.7180us cuCtxPushCurrent\n 0.00% 1.6600us 1 1.6600us 1.6600us 1.6600us cuDriverGetVersion\n 
0.00% 1.4600us 1 1.4600us 1.4600us 1.4600us cuModuleGetFunction\n 0.00% 1.0940us 1 1.0940us 1.0940us 1.0940us cuDeviceComputeCapability\n 0.00% 725ns 1 725ns 725ns 725ns cudaRuntimeGetVersion\n 0.00% 368ns 1 368ns 368ns 368ns cuDeviceGetUuid\n"
]
],
[
[
"By default, `nvprof` outputs all GPU and API call activity. Here we are mainly interested in the GPU trace, so we can turn off API-call profiling with the `--trace gpu` option.",
"_____no_output_____"
]
],
[
[
"! nvprof --trace gpu python 1_3_4_nvprof_add_vectors.py",
"==590== NVPROF is profiling process 590, command: python3 1_3_4_nvprof_add_vectors.py\n==590== Profiling application: python3 1_3_4_nvprof_add_vectors.py\n==590== Profiling result:\n Type Time(%) Time Calls Avg Min Max Name\n GPU activities: 44.50% 5.3120us 3 1.7700us 1.6000us 2.0800us [CUDA memcpy DtoH]\n 27.88% 3.3280us 1 3.3280us 3.3280us 3.3280us cudapy::__main__::add_vectors_kernel$241(Array<double, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>, Array<__int64, int=1, C, mutable, aligned>)\n 27.61% 3.2960us 2 1.6480us 1.3760us 1.9200us [CUDA memcpy HtoD]\nNo API activities were profiled.\n"
]
],
[
[
"CUDA GPU activities include:\n- CUDA kernel executions,\n- GPU-to-host memory transfers (`DtoH`) and host-to-GPU memory transfers (`HtoD`).",
"_____no_output_____"
],
[
"Let's add one more kernel to the above code:",
"_____no_output_____"
]
],
[
[
"%%writefile 1_3_4_nvprof_increment_add_vectors.py\n\nimport math\nfrom numba import cuda\nimport numpy as np\n\[email protected]\ndef increment(a):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n # Make sure not to go outside the grid area!\n if i >= len(a):\n return\n a[i] += 1\n\[email protected]\ndef add_vectors_kernel(result, a, b):\n i = cuda.blockIdx.x*cuda.blockDim.x + cuda.threadIdx.x\n # Make sure not to go outside the grid area!\n if i >= len(result):\n return\n result[i] = a[i] + b[i]\n\nresult_gpu = cuda.device_array(2**24)\n\na_gpu = cuda.to_device(np.random.rand(2**24))\nb_gpu = cuda.to_device(np.random.rand(2**24))\n\nblock_size = 256\ngrid_size = math.ceil(len(result_gpu)/block_size)\nincrement[grid_size, block_size](a_gpu)\nadd_vectors_kernel[grid_size, block_size](result_gpu, a_gpu, b_gpu)\nresult = result_gpu.copy_to_host()",
"Writing 1_3_4_nvprof_increment_add_vectors.py\n"
],
[
"! nvprof --trace gpu python 1_3_4_nvprof_increment_add_vectors.py",
"==612== NVPROF is profiling process 612, command: python3 1_3_4_nvprof_increment_add_vectors.py\n==612== Profiling application: python3 1_3_4_nvprof_increment_add_vectors.py\n==612== Profiling result:\n Type Time(%) Time Calls Avg Min Max Name\n GPU activities: 54.69% 57.733ms 2 28.867ms 28.240ms 29.493ms [CUDA memcpy HtoD]\n 42.87% 45.252ms 1 45.252ms 45.252ms 45.252ms [CUDA memcpy DtoH]\n 1.45% 1.5324ms 1 1.5324ms 1.5324ms 1.5324ms cudapy::__main__::add_vectors_kernel$242(Array<double, int=1, C, mutable, aligned>, Array<double, int=1, C, mutable, aligned>, Array<double, int=1, C, mutable, aligned>)\n 0.99% 1.0485ms 1 1.0485ms 1.0485ms 1.0485ms cudapy::__main__::increment$241(Array<double, int=1, C, mutable, aligned>)\nNo API activities were profiled.\n"
]
],
[
[
"In the above case, `nvprof` should display the execution times for both kernels.",
"_____no_output_____"
],
[
"#### 1.3.4.2. NVIDIA Visual Profiler",
"_____no_output_____"
],
[
"NVIDIA Visual Profiler (NVVP) allows you to visualize profiling results, such as kernel launches and memory transfers, on a graphical timeline.",
"_____no_output_____"
],
[
"Let's export the profiling results to a file that can be loaded by NVVP. We can use `nvprof` for this purpose; just add the `--export-profile` parameter.",
"_____no_output_____"
]
],
[
[
"! nvprof --trace gpu --export-profile nvvp_example.nvvp -f python 1_3_4_nvprof_increment_add_vectors.py",
"==634== NVPROF is profiling process 634, command: python3 1_3_4_nvprof_increment_add_vectors.py\n==634== Generated result file: /content/nvvp_example.nvvp\n"
]
],
[
[
"Next, let's load the profiling results into NVVP:\n1. Open NVVP.\n2. Press File -> Open and choose the exported file.\n\nAs a result, you should see an image similar to the one below. The generated graph presents the execution of each individual GPU activity over time. CUDA kernel launches are placed on the `Compute` lane (in the order in which they are executed: `increment` first, then `add_vectors`), and CUDA memory transfers are on the `MemCpy` lanes (variable `a`, then variable `b`, both `HtoD`, then `result`, `DtoH`).",
"_____no_output_____"
],
[
"",
"_____no_output_____"
],
[
"#### 1.3.4.3. NVIDIA Nsight Compute and Systems\n\nIn preparation...",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
d07b8cb7a69dc976b4b5a7a793de790bc5e3d972 | 7,497 | ipynb | Jupyter Notebook | COVID-19.ipynb | NillsF/covid-jupyter | 6037ddf6b6d27128ea973ebeb8abb09d8942a239 | [
"MIT"
] | null | null | null | COVID-19.ipynb | NillsF/covid-jupyter | 6037ddf6b6d27128ea973ebeb8abb09d8942a239 | [
"MIT"
] | null | null | null | COVID-19.ipynb | NillsF/covid-jupyter | 6037ddf6b6d27128ea973ebeb8abb09d8942a239 | [
"MIT"
] | null | null | null | 32.881579 | 359 | 0.554488 | [
[
[
"%matplotlib inline\nimport matplotlib\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport pandas as pd",
"_____no_output_____"
],
[
"cases = pd.read_csv('https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_confirmed_US.csv')\ndeaths = pd.read_csv('https://raw.githubusercontent.com/CSSEGISandData/COVID-19/master/csse_covid_19_data/csse_covid_19_time_series/time_series_covid19_deaths_US.csv')\n",
"_____no_output_____"
],
[
"print(cases.head())\nprint(deaths.head())",
"_____no_output_____"
],
[
"cases_CA = cases[cases[\"Province_State\"] == \"California\"]",
"_____no_output_____"
],
[
"cases_CA_indexed = cases_CA.set_index(\"Admin2\")\ncases_CA_T = cases_CA_indexed.T",
"_____no_output_____"
],
[
"cases_clean = cases_CA_T.drop(['UID','iso2','iso3','code3','FIPS','Province_State','Country_Region','Lat','Long_','Combined_Key'])",
"_____no_output_____"
],
[
"deaths_clean = deaths[deaths[\"Province_State\"] == \"California\"].set_index(\"Admin2\").T.drop(['UID','iso2','iso3','code3','FIPS','Province_State','Country_Region','Lat','Long_','Combined_Key']).drop(\"Population\",axis=0)",
"_____no_output_____"
],
[
"counties = ['Alameda',\n 'San Francisco',\n 'San Mateo',\n 'Santa Clara']",
"_____no_output_____"
],
[
"plot = cases_clean[counties].plot()\nplot.set_title(\"COVID-19 cases in Bay Area Counties\")",
"_____no_output_____"
],
[
"plot = deaths_clean[counties].plot(figsize=(20,10))\nplot.set_title(\"COVID-19 deaths in Bay Area Counties\")\n",
"_____no_output_____"
],
[
"cases_diff = cases_clean.diff().rolling(window=7).mean()\ndeaths_diff = deaths_clean.diff().rolling(window=7).mean()",
"_____no_output_____"
],
[
"pop = pd.read_csv('https://gist.githubusercontent.com/NillsF/7923a8c7f27ca98ec75b7e1529f259bb/raw/3bedefbe2e242addba3fb47cbcd239fbed16cd54/california.csv')\npop[\"CTYNAME\"] = pop[\"CTYNAME\"].str.replace(\" County\", \"\")\n",
"_____no_output_____"
],
[
"pop2 = pop.drop('GrowthRate',axis=1).set_index('CTYNAME')",
"_____no_output_____"
],
[
"cases_pm = cases_clean.copy()\nfor c in pop2.index.tolist():\n cases_pm[c] = cases_pm[c]/pop2.loc[c , : ]['Pop']\ncases_pm = cases_pm*1000000\n\ndeaths_pm = deaths_clean.copy()\nfor c in pop2.index.tolist():\n deaths_pm[c] = deaths_pm[c]/pop2.loc[c , : ]['Pop']\ndeaths_pm = deaths_pm*1000000",
"_____no_output_____"
],
[
"cases_pm_diff = cases_pm.diff().rolling(window=7).mean()\ndeaths_pm_diff = deaths_pm.diff().rolling(window=7).mean()",
"_____no_output_____"
],
[
"plot = cases_diff[counties].plot(figsize=(20,10))\nplot.set_title(\"7 day moving avg of new COVID-19 cases \")",
"_____no_output_____"
],
[
"plot = cases_pm_diff[counties].plot(figsize=(20,10))\nplot.set_title(\"7 day moving avg of new COVID-19 cases per million inhabitants \")",
"_____no_output_____"
],
[
"plot = deaths_pm_diff[counties].plot(figsize=(20,10))\nplot.set_title(\"7 day moving avg of daily COVID-19 deaths per million inhabitants \")",
"_____no_output_____"
],
[
"plot = cases_pm.sort_values(axis=1,by='7/20/20',ascending=False).iloc[:, : 10].plot(figsize=(20,10))\nplot.set_title(\"Top 10 counties by COVID-19 cases per million inhabitants\")",
"_____no_output_____"
],
[
"plot = deaths_pm.sort_values(axis=1,by='7/20/20',ascending=False).iloc[:, : 10].plot(figsize=(20,10))\nplot.set_title(\"Top 10 counties by COVID-19 deaths per million inhabitants\")",
"_____no_output_____"
],
[
"plot = cases_pm_diff.sort_values(axis=1,by='7/20/20',ascending=False).iloc[:, : 10].plot(figsize=(20,10))\nplot.set_title(\"Top 10 counties by 7 day rolling avg COVID-19 case increases per million inhabitants\")",
"_____no_output_____"
],
[
"plot = deaths_pm_diff.sort_values(axis=1,by='7/20/20',ascending=False).iloc[:, : 10].plot(figsize=(20,10))\nplot.set_title(\"Top 10 counties by 7 day rolling avg COVID-19 daily deaths per million inhabitants\")",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07b8d733e9dd44e43efe3c4c4479a100de79c81 | 62,379 | ipynb | Jupyter Notebook | Colab 2/CS224W - Colab 2_tobias.ipynb | victorcroisfelt/aau-cs224w-ml-with-graphs | adb38651be8da98cc574f127763c785ed16dfb5a | [
"Unlicense",
"MIT"
] | null | null | null | Colab 2/CS224W - Colab 2_tobias.ipynb | victorcroisfelt/aau-cs224w-ml-with-graphs | adb38651be8da98cc574f127763c785ed16dfb5a | [
"Unlicense",
"MIT"
] | null | null | null | Colab 2/CS224W - Colab 2_tobias.ipynb | victorcroisfelt/aau-cs224w-ml-with-graphs | adb38651be8da98cc574f127763c785ed16dfb5a | [
"Unlicense",
"MIT"
] | null | null | null | 39.935339 | 1,069 | 0.566601 | [
[
[
"# **CS224W - Colab 2**",
"_____no_output_____"
],
[
"In Colab 2, we will work to construct our own graph neural network using PyTorch Geometric (PyG) and then apply that model on two Open Graph Benchmark (OGB) datasets. These two datasets will be used to benchmark your model's performance on two different graph-based tasks: 1) node property prediction, predicting properties of single nodes and 2) graph property prediction, predicting properties of entire graphs or subgraphs.\n\nFirst, we will learn how PyTorch Geometric stores graphs as PyTorch tensors.\n\nThen, we will load and inspect one of the Open Graph Benchmark (OGB) datasets by using the `ogb` package. OGB is a collection of realistic, large-scale, and diverse benchmark datasets for machine learning on graphs. The `ogb` package not only provides data loaders for each dataset but also model evaluators.\n\nLastly, we will build our own graph neural network using PyTorch Geometric. We will then train and evaluate our model on the OGB node property prediction and graph property prediction tasks.\n\n**Note**: Make sure to **sequentially run all the cells in each section**, so that the intermediate variables / packages will carry over to the next cell\n\nWe recommend you save a copy of this colab in your drive so you don't lose progress!\n\nHave fun and good luck on Colab 2 :)",
"_____no_output_____"
],
[
"# Device\nYou might need to use a GPU for this Colab to run quickly.\n\nPlease click `Runtime` and then `Change runtime type`. Then set the `hardware accelerator` to **GPU**.",
"_____no_output_____"
],
[
"# Setup\nAs discussed in Colab 0, the installation of PyG on Colab can be a little bit tricky. First let us check which version of PyTorch you are running",
"_____no_output_____"
]
],
[
[
"import torch\nimport os\nprint(\"PyTorch has version {}\".format(torch.__version__))",
"PyTorch has version 1.10.0\n"
]
],
[
[
"Download the necessary packages for PyG. Make sure that your version of torch matches the output from the cell above. In case of any issues, more information can be found on the [PyG's installation page](https://pytorch-geometric.readthedocs.io/en/latest/notes/installation.html).",
"_____no_output_____"
]
],
[
[
"# Install torch geometric\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n !pip install torch-scatter -f https://pytorch-geometric.com/whl/torch-1.9.0+cu111.html\n !pip install torch-sparse -f https://pytorch-geometric.com/whl/torch-1.9.0+cu111.html\n !pip install torch-geometric\n !pip install ogb",
"Looking in links: https://pytorch-geometric.com/whl/torch-1.9.0+cu111.html\nCollecting torch-scatter\n Downloading https://data.pyg.org/whl/torch-1.9.0%2Bcu111/torch_scatter-2.0.9-cp38-cp38-win_amd64.whl (4.0 MB)\nInstalling collected packages: torch-scatter\nSuccessfully installed torch-scatter-2.0.9\nLooking in links: https://pytorch-geometric.com/whl/torch-1.9.0+cu111.html\nCollecting torch-sparse\n Downloading https://data.pyg.org/whl/torch-1.9.0%2Bcu111/torch_sparse-0.6.12-cp38-cp38-win_amd64.whl (4.0 MB)\nRequirement already satisfied: scipy in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-sparse) (1.7.1)\nRequirement already satisfied: numpy<1.23.0,>=1.16.5 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scipy->torch-sparse) (1.20.3)\nInstalling collected packages: torch-sparse\nSuccessfully installed torch-sparse-0.6.12\nCollecting torch-geometric\n Downloading torch_geometric-2.0.2.tar.gz (325 kB)\nRequirement already satisfied: numpy in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (1.20.3)\nRequirement already satisfied: tqdm in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (4.62.3)\nRequirement already satisfied: scipy in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (1.7.1)\nRequirement already satisfied: networkx in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (2.6.3)\nRequirement already satisfied: scikit-learn in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (1.0.1)\nRequirement already satisfied: requests in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (2.26.0)\nRequirement already satisfied: pandas in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (1.3.4)\nCollecting rdflib\n Downloading rdflib-6.0.2-py3-none-any.whl (407 kB)\nCollecting googledrivedownloader\n Downloading googledrivedownloader-0.4-py2.py3-none-any.whl (3.9 kB)\nRequirement 
already satisfied: jinja2 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (2.11.3)\nRequirement already satisfied: pyparsing in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (3.0.4)\nCollecting yacs\n Downloading yacs-0.1.8-py3-none-any.whl (14 kB)\nRequirement already satisfied: PyYAML in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch-geometric) (6.0)\nRequirement already satisfied: MarkupSafe>=0.23 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from jinja2->torch-geometric) (1.1.1)\nRequirement already satisfied: python-dateutil>=2.7.3 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from pandas->torch-geometric) (2.8.2)\nRequirement already satisfied: pytz>=2017.3 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from pandas->torch-geometric) (2021.3)\nRequirement already satisfied: six>=1.5 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from python-dateutil>=2.7.3->pandas->torch-geometric) (1.16.0)\nRequirement already satisfied: setuptools in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from rdflib->torch-geometric) (58.0.4)\nCollecting isodate\n Downloading isodate-0.6.0-py2.py3-none-any.whl (45 kB)\nRequirement already satisfied: certifi>=2017.4.17 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->torch-geometric) (2021.10.8)\nRequirement already satisfied: charset-normalizer~=2.0.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->torch-geometric) (2.0.4)\nRequirement already satisfied: urllib3<1.27,>=1.21.1 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->torch-geometric) (1.26.7)\nRequirement already satisfied: idna<4,>=2.5 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->torch-geometric) (3.3)\nRequirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scikit-learn->torch-geometric) (2.2.0)\nRequirement already satisfied: joblib>=0.11 in 
c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scikit-learn->torch-geometric) (1.1.0)\nRequirement already satisfied: colorama in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from tqdm->torch-geometric) (0.4.4)\nBuilding wheels for collected packages: torch-geometric\n Building wheel for torch-geometric (setup.py): started\n Building wheel for torch-geometric (setup.py): finished with status 'done'\n Created wheel for torch-geometric: filename=torch_geometric-2.0.2-py3-none-any.whl size=535570 sha256=a9e81f1d66cb10c742eff0a1d0d81fad65898556c4f39f8a94dee2f77ab5364a\n Stored in directory: c:\\users\\yx84ax\\appdata\\local\\pip\\cache\\wheels\\41\\cd\\57\\4187ae4860bff8a3a432ca291ea2574a7682f87331bfa0551d\nSuccessfully built torch-geometric\nInstalling collected packages: isodate, yacs, rdflib, googledrivedownloader, torch-geometric\nSuccessfully installed googledrivedownloader-0.4 isodate-0.6.0 rdflib-6.0.2 torch-geometric-2.0.2 yacs-0.1.8\nCollecting ogb\n Downloading ogb-1.3.2-py3-none-any.whl (78 kB)\nRequirement already satisfied: tqdm>=4.29.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (4.62.3)\nRequirement already satisfied: scikit-learn>=0.20.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.0.1)\nRequirement already satisfied: numpy>=1.16.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.20.3)\nCollecting outdated>=0.2.0\n Downloading outdated-0.2.1-py3-none-any.whl (7.5 kB)\nRequirement already satisfied: torch>=1.6.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.10.0)\nRequirement already satisfied: pandas>=0.24.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.3.4)\nRequirement already satisfied: urllib3>=1.24.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.26.7)\nRequirement already satisfied: six>=1.12.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from ogb) (1.16.0)\nRequirement already satisfied: requests in 
c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from outdated>=0.2.0->ogb) (2.26.0)\nCollecting littleutils\n Downloading littleutils-0.2.2.tar.gz (6.6 kB)\nRequirement already satisfied: pytz>=2017.3 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from pandas>=0.24.0->ogb) (2021.3)\nRequirement already satisfied: python-dateutil>=2.7.3 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from pandas>=0.24.0->ogb) (2.8.2)\nRequirement already satisfied: scipy>=1.1.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scikit-learn>=0.20.0->ogb) (1.7.1)\nRequirement already satisfied: joblib>=0.11 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scikit-learn>=0.20.0->ogb) (1.1.0)\nRequirement already satisfied: threadpoolctl>=2.0.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from scikit-learn>=0.20.0->ogb) (2.2.0)\nRequirement already satisfied: typing_extensions in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from torch>=1.6.0->ogb) (3.10.0.2)\nRequirement already satisfied: colorama in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from tqdm>=4.29.0->ogb) (0.4.4)\nRequirement already satisfied: idna<4,>=2.5 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->outdated>=0.2.0->ogb) (3.3)\nRequirement already satisfied: certifi>=2017.4.17 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->outdated>=0.2.0->ogb) (2021.10.8)\nRequirement already satisfied: charset-normalizer~=2.0.0 in c:\\users\\yx84ax\\anaconda3\\lib\\site-packages (from requests->outdated>=0.2.0->ogb) (2.0.4)\nBuilding wheels for collected packages: littleutils\n Building wheel for littleutils (setup.py): started\n Building wheel for littleutils (setup.py): finished with status 'done'\n Created wheel for littleutils: filename=littleutils-0.2.2-py3-none-any.whl size=7048 sha256=cd4b0d285ba3b55442cc053ab4775f139c4f7cac897b1330ce6d202aff5d7587\n Stored in directory: 
c:\\users\\yx84ax\\appdata\\local\\pip\\cache\\wheels\\6a\\33\\c4\\0ef84d7f5568c2823e3d63a6e08988852fb9e4bc822034870a\nSuccessfully built littleutils\nInstalling collected packages: littleutils, outdated, ogb\nSuccessfully installed littleutils-0.2.2 ogb-1.3.2 outdated-0.2.1\n"
]
],
[
[
"# 1) PyTorch Geometric (Datasets and Data)\n",
"_____no_output_____"
],
[
"PyTorch Geometric has two classes for storing and/or transforming graphs into tensor format. One is `torch_geometric.datasets`, which contains a variety of common graph datasets. Another is `torch_geometric.data`, which provides the data handling of graphs in PyTorch tensors.\n\nIn this section, we will learn how to use `torch_geometric.datasets` and `torch_geometric.data` together.",
"_____no_output_____"
],
[
"## PyG Datasets\n\nThe `torch_geometric.datasets` class has many common graph datasets. Here we will explore its usage through one example dataset.",
"_____no_output_____"
]
],
[
[
"from torch_geometric.datasets import TUDataset\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n root = './enzymes'\n name = 'ENZYMES'\n\n # The ENZYMES dataset\n pyg_dataset= TUDataset(root, name)\n\n # You will find that there are 600 graphs in this dataset\n print(pyg_dataset)",
"Downloading https://www.chrsmrrs.com/graphkerneldatasets/ENZYMES.zip\nExtracting enzymes\\ENZYMES\\ENZYMES.zip\nProcessing...\n"
]
],
[
[
"## Question 1: What is the number of classes and number of features in the ENZYMES dataset? (5 points)",
"_____no_output_____"
]
],
[
[
"def get_num_classes(pyg_dataset):\n # TODO: Implement a function that takes a PyG dataset object\n # and returns the number of classes for that dataset.\n\n num_classes = 0\n\n ############# Your code here ############\n ## (~1 line of code)\n ## Note\n ## 1. Colab autocomplete functionality might be useful.\n num_classes = pyg_dataset.num_classes\n #########################################\n\n return num_classes\n\ndef get_num_features(pyg_dataset):\n # TODO: Implement a function that takes a PyG dataset object\n # and returns the number of features for that dataset.\n\n num_features = 0\n\n ############# Your code here ############\n ## (~1 line of code)\n ## Note\n ## 1. Colab autocomplete functionality might be useful.\n num_features = pyg_dataset.num_features\n #########################################\n\n return num_features\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n num_classes = get_num_classes(pyg_dataset)\n num_features = get_num_features(pyg_dataset)\n print(\"{} dataset has {} classes\".format(name, num_classes))\n print(\"{} dataset has {} features\".format(name, num_features))",
"ENZYMES dataset has 6 classes\nENZYMES dataset has 3 features\n"
]
],
[
[
"## PyG Data\n\nEach PyG dataset stores a list of `torch_geometric.data.Data` objects, where each `torch_geometric.data.Data` object represents a graph. We can easily get the `Data` object by indexing into the dataset.\n\nFor more information such as what is stored in the `Data` object, please refer to the [documentation](https://pytorch-geometric.readthedocs.io/en/latest/modules/data.html#torch_geometric.data.Data).",
"_____no_output_____"
],
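[
"It can also help to inspect the tensors stored in a `Data` object directly. The attributes below are standard PyG `Data` fields (the exact shapes depend on the graph):\n\n```python\ndata = pyg_dataset[0]\nprint(data.x.shape) # node feature matrix, [num_nodes, num_features]\nprint(data.edge_index.shape) # connectivity in COO format, [2, num_directed_edges]\nprint(data.y) # graph-level label\n```",
"_____no_output_____"
],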
[
"## Question 2: What is the label of the graph with index 100 in the ENZYMES dataset? (5 points)",
"_____no_output_____"
]
],
[
[
"def get_graph_class(pyg_dataset, idx):\n # TODO: Implement a function that takes a PyG dataset object,\n # an index of a graph within the dataset, and returns the class/label \n # of the graph (as an integer).\n\n label = -1\n\n ############# Your code here ############\n ## (~1 line of code)\n graph = pyg_dataset[idx]\n label = int(graph.y)\n #########################################\n\n return label\n\n# Here pyg_dataset is a dataset for graph classification\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n graph_0 = pyg_dataset[0]\n print(graph_0)\n idx = 100\n label = get_graph_class(pyg_dataset, idx)\n print('Graph with index {} has label {}'.format(idx, label))",
"Data(edge_index=[2, 168], x=[37, 3], y=[1])\nGraph with index 100 has label 4\n"
]
],
[
[
"## Question 3: How many edges does the graph with index 200 have? (5 points)",
"_____no_output_____"
]
],
[
[
"def get_graph_num_edges(pyg_dataset, idx):\n # TODO: Implement a function that takes a PyG dataset object,\n # the index of a graph in the dataset, and returns the number of \n # edges in the graph (as an integer). You should not count an edge \n # twice if the graph is undirected. For example, in an undirected \n # graph G, if two nodes v and u are connected by an edge, this edge\n # should only be counted once.\n\n num_edges = 0\n\n ############# Your code here ############\n ## Note:\n ## 1. You can't return the data.num_edges directly - WHY NOT?\n ## 2. We assume the graph is undirected\n ## 3. Look at the PyG dataset built in functions\n ## (~4 lines of code)\n # Integer division keeps the result an int, as the docstring requires.\n num_edges = pyg_dataset[idx].num_edges // 2\n num_edges_2 = pyg_dataset[idx]['edge_index'].size()[1] // 2\n assert num_edges == num_edges_2\n #########################################\n\n return num_edges\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n idx = 200\n num_edges = get_graph_num_edges(pyg_dataset, idx)\n print('Graph with index {} has {} edges'.format(idx, num_edges))",
"_____no_output_____"
]
],
[
[
"# 2) Open Graph Benchmark (OGB)\n\nThe Open Graph Benchmark (OGB) is a collection of realistic, large-scale, and diverse benchmark datasets for machine learning on graphs. Its datasets are automatically downloaded, processed, and split using the OGB Data Loader. The model performance can then be evaluated by using the OGB Evaluator in a unified manner.",
"_____no_output_____"
],
[
"## Dataset and Data\n\nOGB also supports PyG dataset and data classes. Here we take a look on the `ogbn-arxiv` dataset.",
"_____no_output_____"
]
],
[
[
"import torch_geometric.transforms as T\nfrom ogb.nodeproppred import PygNodePropPredDataset\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n dataset_name = 'ogbn-arxiv'\n # Load the dataset and transform it to sparse tensor\n dataset = PygNodePropPredDataset(name=dataset_name,\n transform=T.ToSparseTensor())\n print('The {} dataset has {} graph'.format(dataset_name, len(dataset)))\n\n # Extract the graph\n data = dataset[0]\n print(data)",
"Downloading http://snap.stanford.edu/ogb/data/nodeproppred/arxiv.zip\n"
]
],
[
[
"## Question 4: How many features are in the ogbn-arxiv graph? (5 points)",
"_____no_output_____"
]
],
[
[
"def graph_num_features(data):\n # TODO: Implement a function that takes a PyG data object,\n # and returns the number of features in the graph (as an integer).\n\n num_features = 0\n\n ############# Your code here ############\n ## (~1 line of code)\n num_features = data.num_features\n #########################################\n\n return num_features\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n num_features = graph_num_features(data)\n print('The graph has {} features'.format(num_features))",
"The graph has 128 features\n"
]
],
[
[
"# 3) GNN: Node Property Prediction\n\nIn this section we will build our first graph neural network using PyTorch Geometric. Then we will apply it to the task of node property prediction (node classification).\n\nSpecifically, we will use GCN as the foundation for our graph neural network ([Kipf et al. (2017)](https://arxiv.org/pdf/1609.02907.pdf)). To do so, we will work with PyG's built-in `GCNConv` layer. ",
"_____no_output_____"
],
[
"## Setup",
"_____no_output_____"
]
],
[
[
"import torch\nimport pandas as pd\nimport torch.nn.functional as F\nprint(torch.__version__)\n\n# The PyG built-in GCNConv\nfrom torch_geometric.nn import GCNConv\n\nimport torch_geometric.transforms as T\nfrom ogb.nodeproppred import PygNodePropPredDataset, Evaluator",
"1.10.0\n"
]
],
[
[
"## Load and Preprocess the Dataset",
"_____no_output_____"
]
],
[
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n  dataset_name = 'ogbn-arxiv'\n  dataset = PygNodePropPredDataset(name=dataset_name,\n                                   transform=T.ToSparseTensor())\n  data = dataset[0]\n\n  # Make the adjacency matrix symmetric\n  data.adj_t = data.adj_t.to_symmetric()\n\n  device = 'cpu'\n\n  # If you use a GPU, the device should be cuda\n  print('Device: {}'.format(device))\n\n  data = data.to(device)\n  split_idx = dataset.get_idx_split()\n  train_idx = split_idx['train'].to(device)\n\n  print(graph_num_features(data))",
"Device: cpu\n128\n"
]
],
[
[
"## GCN Model\n\nNow we will implement our GCN model!\n\nPlease follow the figure below to implement the `forward` function.\n\n\n",
"_____no_output_____"
]
],
[
[
"class GCN(torch.nn.Module):\n def __init__(self, input_dim, hidden_dim, output_dim, num_layers,\n dropout, return_embeds=False):\n # TODO: Implement a function that initializes self.convs, \n # self.bns, and self.softmax.\n\n super(GCN, self).__init__()\n\n # A list of GCNConv layers\n self.convs = None\n\n # A list of 1D batch normalization layers\n self.bns = None\n\n # The log softmax layer\n self.softmax = None\n\n ############# Your code here ############\n ## Note:\n ## 1. You should use torch.nn.ModuleList for self.convs and self.bns\n ## 2. self.convs has num_layers GCNConv layers\n ## 3. self.bns has num_layers - 1 BatchNorm1d layers\n ## 4. You should use torch.nn.LogSoftmax for self.softmax\n ## 5. The parameters you can set for GCNConv include 'in_channels' and \n ## 'out_channels'. For more information please refer to the documentation:\n ## https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#torch_geometric.nn.conv.GCNConv\n ## 6. The only parameter you need to set for BatchNorm1d is 'num_features'\n ## For more information please refer to the documentation: \n ## https://pytorch.org/docs/stable/generated/torch.nn.BatchNorm1d.html\n ## (~10 lines of code)\n self.num_layers = num_layers\n \n self.convs = torch.nn.ModuleList() # convolutional layers\n self.bns = torch.nn.ModuleList() # batch normalization layers \n \n in_size = [input_dim] + (num_layers - 1)*[hidden_dim] # [input_dim, hidden_dim, ... , hidden_dim, output_dim]\n out_size = (num_layers - 1)*[hidden_dim] + [output_dim] # [hidden_dim, ... 
, hidden_dim, output_dim]\n for i in range(num_layers): \n self.convs.append(GCNConv(in_channels = in_size[i], out_channels = out_size[i]))\n if i < num_layers - 1: \n self.bns.append(torch.nn.BatchNorm1d(out_size[i]))\n \n self.softmax = torch.nn.LogSoftmax(1) # along feature vectors \n\n #########################################\n\n # Probability of an element getting zeroed\n self.dropout = dropout\n\n # Skip classification layer and return node embeddings\n self.return_embeds = return_embeds\n\n def reset_parameters(self):\n for conv in self.convs:\n conv.reset_parameters()\n for bn in self.bns:\n bn.reset_parameters()\n\n def forward(self, x, adj_t):\n # TODO: Implement a function that takes the feature tensor x and\n # edge_index tensor adj_t and returns the output tensor as\n # shown in the figure.\n\n out = None\n\n ############# Your code here ############\n ## Note:\n ## 1. Construct the network as shown in the figure\n ## 2. torch.nn.functional.relu and torch.nn.functional.dropout are useful\n ## For more information please refer to the documentation:\n ## https://pytorch.org/docs/stable/nn.functional.html\n ## 3. Don't forget to set F.dropout training to self.training\n ## 4. If return_embeds is True, then skip the last softmax layer\n ## (~7 lines of code)\n for i in range(self.num_layers):\n x = self.convs[i](x, adj_t) # GCN convolutional\n if i < self.num_layers - 1:\n x = self.bns[i](x)\n x = F.relu(x)\n x = F.dropout(x, p = self.dropout, training = self.training) \n # last layer\n if self.return_embeds: \n out = x\n if not self.return_embeds:\n out = self.softmax(x)\n\n #########################################\n\n return out",
"_____no_output_____"
],
[
"def train(model, data, train_idx, optimizer, loss_fn):\n # TODO: Implement a function that trains the model by \n # using the given optimizer and loss_fn.\n model.train()\n loss = 0\n\n ############# Your code here ############\n ## Note:\n ## 1. Zero grad the optimizer\n ## 2. Feed the data into the model\n ## 3. Slice the model output and label by train_idx\n ## 4. Feed the sliced output and label to loss_fn\n ## (~4 lines of code)\n optimizer.zero_grad()\n output = model(data['x'], data['adj_t'])[train_idx]\n label = (data.y.to(device)[train_idx]).flatten()\n loss = loss_fn(output,label)\n #########################################\n\n loss.backward()\n optimizer.step()\n\n return loss.item()",
"_____no_output_____"
],
[
"# Test function here\[email protected]_grad()\ndef test(model, data, split_idx, evaluator, save_model_results=False):\n # TODO: Implement a function that tests the model by \n # using the given split_idx and evaluator.\n model.eval()\n\n # The output of model on all data\n out = None\n\n ############# Your code here ############\n ## (~1 line of code)\n ## Note:\n ## 1. No index slicing here\n out = model(data['x'], data['adj_t'])\n\n #########################################\n\n y_pred = out.argmax(dim=-1, keepdim=True)\n\n train_acc = evaluator.eval({\n 'y_true': data.y[split_idx['train']],\n 'y_pred': y_pred[split_idx['train']],\n })['acc']\n valid_acc = evaluator.eval({\n 'y_true': data.y[split_idx['valid']],\n 'y_pred': y_pred[split_idx['valid']],\n })['acc']\n test_acc = evaluator.eval({\n 'y_true': data.y[split_idx['test']],\n 'y_pred': y_pred[split_idx['test']],\n })['acc']\n\n if save_model_results:\n print (\"Saving Model Predictions\")\n\n data = {}\n data['y_pred'] = y_pred.view(-1).cpu().detach().numpy()\n\n df = pd.DataFrame(data=data)\n # Save locally as csv\n df.to_csv('ogbn-arxiv_node.csv', sep=',', index=False)\n\n\n return train_acc, valid_acc, test_acc",
"_____no_output_____"
],
[
"# Please do not change the args\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n args = {\n 'device': device,\n 'num_layers': 3,\n 'hidden_dim': 256,\n 'dropout': 0.5,\n 'lr': 0.01,\n 'epochs': 100,\n }\n args",
"_____no_output_____"
],
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n model = GCN(data.num_features, args['hidden_dim'],\n dataset.num_classes, args['num_layers'],\n args['dropout']).to(device)\n evaluator = Evaluator(name='ogbn-arxiv')",
"_____no_output_____"
],
[
"import copy\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n # reset the parameters to initial random value\n model.reset_parameters()\n\n optimizer = torch.optim.Adam(model.parameters(), lr=args['lr'])\n loss_fn = F.nll_loss\n\n best_model = None\n best_valid_acc = 0\n\n for epoch in range(1, 1 + args[\"epochs\"]):\n loss = train(model, data, train_idx, optimizer, loss_fn)\n result = test(model, data, split_idx, evaluator)\n train_acc, valid_acc, test_acc = result\n if valid_acc > best_valid_acc:\n best_valid_acc = valid_acc\n best_model = copy.deepcopy(model)\n print(f'Epoch: {epoch:02d}, '\n f'Loss: {loss:.4f}, '\n f'Train: {100 * train_acc:.2f}%, '\n f'Valid: {100 * valid_acc:.2f}% '\n f'Test: {100 * test_acc:.2f}%')",
"Epoch: 01, Loss: 4.0669, Train: 19.65%, Valid: 26.05% Test: 23.55%\nEpoch: 02, Loss: 2.3643, Train: 21.23%, Valid: 20.62% Test: 26.14%\nEpoch: 03, Loss: 1.9636, Train: 38.83%, Valid: 39.31% Test: 43.51%\nEpoch: 04, Loss: 1.7614, Train: 45.58%, Valid: 46.42% Test: 43.25%\nEpoch: 05, Loss: 1.6479, Train: 44.97%, Valid: 43.31% Test: 41.47%\nEpoch: 06, Loss: 1.5548, Train: 41.26%, Valid: 37.87% Test: 38.38%\nEpoch: 07, Loss: 1.4922, Train: 38.69%, Valid: 34.45% Test: 36.02%\nEpoch: 08, Loss: 1.4355, Train: 38.75%, Valid: 33.85% Test: 35.05%\nEpoch: 09, Loss: 1.3906, Train: 40.15%, Valid: 35.92% Test: 37.75%\nEpoch: 10, Loss: 1.3588, Train: 41.95%, Valid: 39.16% Test: 41.82%\nEpoch: 11, Loss: 1.3255, Train: 44.21%, Valid: 41.95% Test: 44.63%\nEpoch: 12, Loss: 1.3049, Train: 47.28%, Valid: 44.72% Test: 46.91%\nEpoch: 13, Loss: 1.2808, Train: 50.88%, Valid: 48.63% Test: 49.80%\nEpoch: 14, Loss: 1.2598, Train: 53.35%, Valid: 51.80% Test: 52.59%\nEpoch: 15, Loss: 1.2450, Train: 56.01%, Valid: 55.26% Test: 56.23%\nEpoch: 16, Loss: 1.2245, Train: 58.28%, Valid: 58.04% Test: 58.90%\nEpoch: 17, Loss: 1.2115, Train: 59.59%, Valid: 59.43% Test: 60.01%\nEpoch: 18, Loss: 1.1933, Train: 60.01%, Valid: 59.97% Test: 60.65%\nEpoch: 19, Loss: 1.1832, Train: 60.30%, Valid: 60.00% Test: 61.01%\nEpoch: 20, Loss: 1.1701, Train: 60.35%, Valid: 59.16% Test: 60.97%\nEpoch: 21, Loss: 1.1615, Train: 61.14%, Valid: 59.69% Test: 61.31%\nEpoch: 22, Loss: 1.1469, Train: 62.09%, Valid: 60.61% Test: 61.84%\nEpoch: 23, Loss: 1.1390, Train: 62.83%, Valid: 61.29% Test: 62.25%\nEpoch: 24, Loss: 1.1310, Train: 63.19%, Valid: 61.81% Test: 62.68%\nEpoch: 25, Loss: 1.1229, Train: 63.38%, Valid: 62.14% Test: 62.86%\nEpoch: 26, Loss: 1.1150, Train: 63.84%, Valid: 62.45% Test: 63.11%\nEpoch: 27, Loss: 1.1105, Train: 64.55%, Valid: 63.18% Test: 63.62%\nEpoch: 28, Loss: 1.1051, Train: 65.49%, Valid: 64.38% Test: 64.56%\nEpoch: 29, Loss: 1.0975, Train: 66.58%, Valid: 65.90% Test: 66.08%\nEpoch: 30, Loss: 1.0912, 
Train: 67.25%, Valid: 66.93% Test: 67.32%\nEpoch: 31, Loss: 1.0838, Train: 67.47%, Valid: 67.48% Test: 67.89%\nEpoch: 32, Loss: 1.0811, Train: 67.64%, Valid: 67.63% Test: 68.16%\nEpoch: 33, Loss: 1.0747, Train: 67.93%, Valid: 67.66% Test: 68.23%\nEpoch: 34, Loss: 1.0667, Train: 68.22%, Valid: 67.94% Test: 68.32%\nEpoch: 35, Loss: 1.0592, Train: 68.48%, Valid: 68.26% Test: 68.44%\nEpoch: 36, Loss: 1.0587, Train: 68.80%, Valid: 68.55% Test: 68.72%\nEpoch: 37, Loss: 1.0526, Train: 69.13%, Valid: 68.97% Test: 69.01%\nEpoch: 38, Loss: 1.0482, Train: 69.65%, Valid: 69.53% Test: 69.22%\nEpoch: 39, Loss: 1.0459, Train: 70.07%, Valid: 69.81% Test: 69.19%\nEpoch: 40, Loss: 1.0429, Train: 70.31%, Valid: 69.79% Test: 68.95%\nEpoch: 41, Loss: 1.0346, Train: 70.46%, Valid: 69.87% Test: 69.21%\nEpoch: 42, Loss: 1.0320, Train: 70.56%, Valid: 70.19% Test: 69.51%\nEpoch: 43, Loss: 1.0282, Train: 70.75%, Valid: 70.35% Test: 69.91%\nEpoch: 44, Loss: 1.0241, Train: 71.00%, Valid: 70.51% Test: 70.03%\nEpoch: 45, Loss: 1.0210, Train: 71.16%, Valid: 70.68% Test: 70.12%\nEpoch: 46, Loss: 1.0168, Train: 71.18%, Valid: 70.66% Test: 70.21%\nEpoch: 47, Loss: 1.0149, Train: 71.14%, Valid: 70.61% Test: 70.16%\nEpoch: 48, Loss: 1.0110, Train: 71.16%, Valid: 70.60% Test: 69.92%\nEpoch: 49, Loss: 1.0082, Train: 71.31%, Valid: 70.49% Test: 69.56%\nEpoch: 50, Loss: 1.0033, Train: 71.48%, Valid: 70.29% Test: 69.13%\nEpoch: 51, Loss: 0.9980, Train: 71.61%, Valid: 70.31% Test: 69.33%\nEpoch: 52, Loss: 1.0007, Train: 71.76%, Valid: 70.52% Test: 69.63%\nEpoch: 53, Loss: 0.9961, Train: 71.80%, Valid: 70.70% Test: 69.87%\nEpoch: 54, Loss: 0.9917, Train: 71.92%, Valid: 70.72% Test: 69.76%\nEpoch: 55, Loss: 0.9929, Train: 72.02%, Valid: 70.76% Test: 69.62%\nEpoch: 56, Loss: 0.9883, Train: 71.97%, Valid: 70.97% Test: 70.09%\nEpoch: 57, Loss: 0.9884, Train: 71.92%, Valid: 71.08% Test: 70.27%\nEpoch: 58, Loss: 0.9825, Train: 71.98%, Valid: 70.94% Test: 69.95%\nEpoch: 59, Loss: 0.9790, Train: 72.09%, Valid: 
70.51% Test: 69.01%\nEpoch: 60, Loss: 0.9784, Train: 72.03%, Valid: 70.01% Test: 67.76%\nEpoch: 61, Loss: 0.9755, Train: 72.11%, Valid: 70.22% Test: 68.10%\nEpoch: 62, Loss: 0.9731, Train: 72.19%, Valid: 70.74% Test: 69.52%\nEpoch: 63, Loss: 0.9744, Train: 72.29%, Valid: 70.82% Test: 69.93%\nEpoch: 64, Loss: 0.9696, Train: 72.34%, Valid: 70.77% Test: 69.60%\nEpoch: 65, Loss: 0.9667, Train: 72.44%, Valid: 70.56% Test: 69.21%\nEpoch: 66, Loss: 0.9670, Train: 72.57%, Valid: 71.00% Test: 69.74%\nEpoch: 67, Loss: 0.9659, Train: 72.65%, Valid: 71.29% Test: 70.09%\nEpoch: 68, Loss: 0.9591, Train: 72.76%, Valid: 71.47% Test: 70.33%\nEpoch: 69, Loss: 0.9604, Train: 72.74%, Valid: 71.57% Test: 70.46%\nEpoch: 70, Loss: 0.9569, Train: 72.81%, Valid: 71.43% Test: 70.59%\nEpoch: 71, Loss: 0.9561, Train: 72.76%, Valid: 71.34% Test: 70.63%\nEpoch: 72, Loss: 0.9527, Train: 72.90%, Valid: 71.48% Test: 70.74%\nEpoch: 73, Loss: 0.9495, Train: 72.98%, Valid: 71.46% Test: 70.35%\nEpoch: 74, Loss: 0.9488, Train: 72.92%, Valid: 71.26% Test: 69.92%\nEpoch: 75, Loss: 0.9486, Train: 72.98%, Valid: 71.55% Test: 70.66%\nEpoch: 76, Loss: 0.9453, Train: 72.95%, Valid: 71.54% Test: 71.07%\nEpoch: 77, Loss: 0.9422, Train: 73.00%, Valid: 71.32% Test: 70.65%\nEpoch: 78, Loss: 0.9434, Train: 72.98%, Valid: 71.05% Test: 69.73%\nEpoch: 79, Loss: 0.9423, Train: 72.86%, Valid: 70.51% Test: 68.76%\nEpoch: 80, Loss: 0.9423, Train: 73.12%, Valid: 71.00% Test: 70.03%\nEpoch: 81, Loss: 0.9396, Train: 73.23%, Valid: 71.70% Test: 71.04%\nEpoch: 82, Loss: 0.9378, Train: 73.27%, Valid: 71.73% Test: 70.89%\nEpoch: 83, Loss: 0.9375, Train: 73.27%, Valid: 71.44% Test: 70.31%\nEpoch: 84, Loss: 0.9313, Train: 73.33%, Valid: 71.40% Test: 70.19%\nEpoch: 85, Loss: 0.9337, Train: 73.43%, Valid: 71.55% Test: 70.41%\nEpoch: 86, Loss: 0.9318, Train: 73.49%, Valid: 71.44% Test: 70.28%\nEpoch: 87, Loss: 0.9307, Train: 73.43%, Valid: 71.56% Test: 70.45%\nEpoch: 88, Loss: 0.9281, Train: 73.28%, Valid: 71.70% Test: 71.24%\nEpoch: 
89, Loss: 0.9269, Train: 73.23%, Valid: 71.87% Test: 71.41%\nEpoch: 90, Loss: 0.9253, Train: 73.47%, Valid: 71.74% Test: 70.77%\nEpoch: 91, Loss: 0.9198, Train: 73.59%, Valid: 71.60% Test: 70.51%\nEpoch: 92, Loss: 0.9225, Train: 73.62%, Valid: 71.79% Test: 71.17%\nEpoch: 93, Loss: 0.9195, Train: 73.60%, Valid: 71.84% Test: 71.16%\nEpoch: 94, Loss: 0.9202, Train: 73.74%, Valid: 71.45% Test: 70.24%\nEpoch: 95, Loss: 0.9155, Train: 73.81%, Valid: 71.58% Test: 70.70%\nEpoch: 96, Loss: 0.9169, Train: 73.88%, Valid: 71.90% Test: 71.29%\nEpoch: 97, Loss: 0.9154, Train: 73.99%, Valid: 71.77% Test: 70.85%\nEpoch: 98, Loss: 0.9128, Train: 73.94%, Valid: 71.38% Test: 70.30%\nEpoch: 99, Loss: 0.9101, Train: 73.67%, Valid: 71.29% Test: 70.26%\nEpoch: 100, Loss: 0.9117, Train: 73.53%, Valid: 71.39% Test: 70.86%\n"
]
],
[
[
"## Question 5: What are your `best_model` validation and test accuracies? (20 points)\n\nRun the cell below to see the results of your best model and save your model's predictions to a file named *ogbn-arxiv_node.csv*. \n\nYou can view this file by clicking on the *Folder* icon on the left side panel. As in Colab 1, when you submit your assignment, you will have to download this file and attach it to your submission.",
"_____no_output_____"
]
],
[
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n best_result = test(best_model, data, split_idx, evaluator, save_model_results=True)\n train_acc, valid_acc, test_acc = best_result\n print(f'Best model: '\n f'Train: {100 * train_acc:.2f}%, '\n f'Valid: {100 * valid_acc:.2f}% '\n f'Test: {100 * test_acc:.2f}%')",
"Saving Model Predictions\nBest model: Train: 73.88%, Valid: 71.90% Test: 71.29%\n"
]
],
[
[
"# 4) GNN: Graph Property Prediction\n\nIn this section we will create a graph neural network for graph property prediction (graph classification).\n",
"_____no_output_____"
],
[
"## Load and preprocess the dataset",
"_____no_output_____"
]
],
[
[
"from ogb.graphproppred import PygGraphPropPredDataset, Evaluator\nfrom torch_geometric.data import DataLoader\nfrom tqdm.notebook import tqdm\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n # Load the dataset \n dataset = PygGraphPropPredDataset(name='ogbg-molhiv')\n\n device = 'cuda' if torch.cuda.is_available() else 'cpu'\n print('Device: {}'.format(device))\n\n split_idx = dataset.get_idx_split()\n\n # Check task type\n print('Task type: {}'.format(dataset.task_type))",
"_____no_output_____"
],
[
"# Load the dataset splits into corresponding dataloaders\n# We will train the graph classification task on a batch of 32 graphs\n# Shuffle the order of graphs for training set\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n train_loader = DataLoader(dataset[split_idx[\"train\"]], batch_size=32, shuffle=True, num_workers=0)\n valid_loader = DataLoader(dataset[split_idx[\"valid\"]], batch_size=32, shuffle=False, num_workers=0)\n test_loader = DataLoader(dataset[split_idx[\"test\"]], batch_size=32, shuffle=False, num_workers=0)",
"_____no_output_____"
],
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n # Please do not change the args\n args = {\n 'device': device,\n 'num_layers': 5,\n 'hidden_dim': 256,\n 'dropout': 0.5,\n 'lr': 0.001,\n 'epochs': 30,\n }\n args",
"_____no_output_____"
]
],
[
[
"## Graph Prediction Model",
"_____no_output_____"
],
[
"### Graph Mini-Batching\nBefore diving into the actual model, we introduce the concept of mini-batching with graphs. In order to parallelize the processing of a mini-batch of graphs, PyG combines the graphs into a single disconnected graph data object (*torch_geometric.data.Batch*). *torch_geometric.data.Batch* inherits from *torch_geometric.data.Data* (introduced earlier) and contains an additional attribute called `batch`. \n\nThe `batch` attribute is a vector mapping each node to the index of its corresponding graph within the mini-batch:\n\n batch = [0, ..., 0, 1, ..., n - 2, n - 1, ..., n - 1]\n\nThis attribute is crucial for associating which graph each node belongs to and can be used to e.g. average the node embeddings for each graph individually to compute graph level embeddings. \n\n",
"_____no_output_____"
],
[
"### Implementation\nNow, we have all of the tools to implement a GCN Graph Prediction model! \n\nWe will reuse the existing GCN model to generate `node_embeddings` and then use `Global Pooling` over the nodes to create graph level embeddings that can be used to predict properties for each graph. Remember that the `batch` attribute will be essential for performing Global Pooling over our mini-batch of graphs.",
"_____no_output_____"
]
],
[
[
"from ogb.graphproppred.mol_encoder import AtomEncoder\nfrom torch_geometric.nn import global_add_pool, global_mean_pool\n\n### GCN to predict graph property\nclass GCN_Graph(torch.nn.Module):\n  def __init__(self, hidden_dim, output_dim, num_layers, dropout):\n    super(GCN_Graph, self).__init__()\n\n    # Load encoders for Atoms in molecule graphs\n    self.node_encoder = AtomEncoder(hidden_dim)\n\n    # Node embedding model\n    # Note that the input_dim and output_dim are set to hidden_dim\n    self.gnn_node = GCN(hidden_dim, hidden_dim,\n        hidden_dim, num_layers, dropout, return_embeds=True)\n\n    self.pool = None\n\n    ############# Your code here ############\n    ## Note:\n    ## 1. Initialize self.pool as a global mean pooling layer\n    ## For more information please refer to the documentation:\n    ## https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#global-pooling-layers\n    self.pool = global_mean_pool\n\n    #########################################\n\n    # Output layer\n    self.linear = torch.nn.Linear(hidden_dim, output_dim)\n\n\n  def reset_parameters(self):\n    self.gnn_node.reset_parameters()\n    self.linear.reset_parameters()\n\n  def forward(self, batched_data):\n    # TODO: Implement a function that takes as input a \n    # mini-batch of graphs (torch_geometric.data.Batch) and \n    # returns the predicted graph property for each graph. \n    #\n    # NOTE: Since we are predicting graph level properties,\n    # your output will be a tensor with dimension equaling\n    # the number of graphs in the mini-batch\n\n    # Extract important attributes of our mini-batch\n    x, edge_index, batch = batched_data.x, batched_data.edge_index, batched_data.batch\n    embed = self.node_encoder(x)\n\n    out = None\n\n    ############# Your code here ############\n    ## Note:\n    ## 1. Construct node embeddings using existing GCN model\n    ## 2. Use the global pooling layer to aggregate features for each individual graph\n    ## For more information please refer to the documentation:\n    ## https://pytorch-geometric.readthedocs.io/en/latest/modules/nn.html#global-pooling-layers\n    ## 3. Use a linear layer to predict each graph's property\n    ## (~3 lines of code)\n    out = self.gnn_node(embed, edge_index)  # node embeddings\n    out = self.pool(out, batch)             # graph-level embeddings\n    out = self.linear(out)                  # graph property prediction\n\n    #########################################\n\n    return out",
"_____no_output_____"
],
[
"def train(model, device, data_loader, optimizer, loss_fn):\n  # TODO: Implement a function that trains your model by \n  # using the given optimizer and loss_fn.\n  model.train()\n  loss = 0\n\n  for step, batch in enumerate(tqdm(data_loader, desc=\"Iteration\")):\n    batch = batch.to(device)\n\n    if batch.x.shape[0] == 1 or batch.batch[-1] == 0:\n      pass\n    else:\n      ## ignore nan targets (unlabeled) when computing training loss.\n      is_labeled = batch.y == batch.y\n\n      ############# Your code here ############\n      ## Note:\n      ## 1. Zero grad the optimizer\n      ## 2. Feed the data into the model\n      ## 3. Use `is_labeled` mask to filter output and labels\n      ## 4. You may need to change the type of label to torch.float32\n      ## 5. Feed the output and label to the loss_fn\n      ## (~3 lines of code)\n      optimizer.zero_grad()\n      out = model(batch)\n      loss = loss_fn(out[is_labeled], batch.y[is_labeled].type(torch.float32))\n\n      #########################################\n\n      loss.backward()\n      optimizer.step()\n\n  return loss.item()",
"_____no_output_____"
],
[
"# The evaluation function\ndef eval(model, device, loader, evaluator, save_model_results=False, save_file=None):\n model.eval()\n y_true = []\n y_pred = []\n\n for step, batch in enumerate(tqdm(loader, desc=\"Iteration\")):\n batch = batch.to(device)\n\n if batch.x.shape[0] == 1:\n pass\n else:\n with torch.no_grad():\n pred = model(batch)\n\n y_true.append(batch.y.view(pred.shape).detach().cpu())\n y_pred.append(pred.detach().cpu())\n\n y_true = torch.cat(y_true, dim = 0).numpy()\n y_pred = torch.cat(y_pred, dim = 0).numpy()\n\n input_dict = {\"y_true\": y_true, \"y_pred\": y_pred}\n\n if save_model_results:\n print (\"Saving Model Predictions\")\n \n # Create a pandas dataframe with a two columns\n # y_pred | y_true\n data = {}\n data['y_pred'] = y_pred.reshape(-1)\n data['y_true'] = y_true.reshape(-1)\n\n df = pd.DataFrame(data=data)\n # Save to csv\n df.to_csv('ogbg-molhiv_graph_' + save_file + '.csv', sep=',', index=False)\n\n return evaluator.eval(input_dict)",
"_____no_output_____"
],
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n model = GCN_Graph(args['hidden_dim'],\n dataset.num_tasks, args['num_layers'],\n args['dropout']).to(device)\n evaluator = Evaluator(name='ogbg-molhiv')",
"_____no_output_____"
],
[
"import copy\n\nif 'IS_GRADESCOPE_ENV' not in os.environ:\n model.reset_parameters()\n\n optimizer = torch.optim.Adam(model.parameters(), lr=args['lr'])\n loss_fn = torch.nn.BCEWithLogitsLoss()\n\n best_model = None\n best_valid_acc = 0\n\n for epoch in range(1, 1 + args[\"epochs\"]):\n print('Training...')\n loss = train(model, device, train_loader, optimizer, loss_fn)\n\n print('Evaluating...')\n train_result = eval(model, device, train_loader, evaluator)\n val_result = eval(model, device, valid_loader, evaluator)\n test_result = eval(model, device, test_loader, evaluator)\n\n train_acc, valid_acc, test_acc = train_result[dataset.eval_metric], val_result[dataset.eval_metric], test_result[dataset.eval_metric]\n if valid_acc > best_valid_acc:\n best_valid_acc = valid_acc\n best_model = copy.deepcopy(model)\n print(f'Epoch: {epoch:02d}, '\n f'Loss: {loss:.4f}, '\n f'Train: {100 * train_acc:.2f}%, '\n f'Valid: {100 * valid_acc:.2f}% '\n f'Test: {100 * test_acc:.2f}%')",
"_____no_output_____"
]
],
[
[
"## Question 6: What are your `best_model` validation and test ROC-AUC scores? (20 points)\n\nRun the cell below to see the results of your best model and save your model's predictions over the validation and test datasets. The resulting files are named *ogbg-molhiv_graph_valid.csv* and *ogbg-molhiv_graph_test.csv*. \n\nAgain, you can view these files by clicking on the *Folder* icon on the left side panel. As in Colab 1, when you submit your assignment, you will have to download these files and attach them to your submission.",
"_____no_output_____"
]
],
[
[
"if 'IS_GRADESCOPE_ENV' not in os.environ:\n train_acc = eval(best_model, device, train_loader, evaluator)[dataset.eval_metric]\n valid_acc = eval(best_model, device, valid_loader, evaluator, save_model_results=True, save_file=\"valid\")[dataset.eval_metric]\n test_acc = eval(best_model, device, test_loader, evaluator, save_model_results=True, save_file=\"test\")[dataset.eval_metric]\n\n print(f'Best model: '\n f'Train: {100 * train_acc:.2f}%, '\n f'Valid: {100 * valid_acc:.2f}% '\n f'Test: {100 * test_acc:.2f}%')",
"_____no_output_____"
]
],
[
[
"## Question 7 (Optional): Experiment with the two other global pooling layers in Pytorch Geometric.",
"_____no_output_____"
],
[
"# Submission\n\nYou will need to submit four files on Gradescope to complete this notebook. \n\n1. Your completed *CS224W_Colab2.ipynb*. From the \"File\" menu select \"Download .ipynb\" to save a local copy of your completed Colab. **PLEASE DO NOT CHANGE THE NAME!** The autograder depends on the .ipynb file being called \"CS224W_Colab2.ipynb\".\n2. *ogbn-arxiv_node.csv* \n3. *ogbg-molhiv_graph_valid.csv*\n4. *ogbg-molhiv_graph_test.csv*\n\nDownload the csv files by selecting the *Folder* icon on the left panel. \n\nTo submit your work, zip the files downloaded in steps 1-4 above and submit to gradescope. **NOTE:** DO NOT rename any of the downloaded files. ",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
d07ba9c785b52ba86bb617b194f251d62a543e2d | 6,431 | ipynb | Jupyter Notebook | notebooks/00-data/04-boston.ipynb | mixerupper/mltools-fi_cate | 9a43af3f58d91cadce584863f388111abbb80b39 | [
"MIT"
] | null | null | null | notebooks/00-data/04-boston.ipynb | mixerupper/mltools-fi_cate | 9a43af3f58d91cadce584863f388111abbb80b39 | [
"MIT"
] | 1 | 2021-03-31T19:52:04.000Z | 2021-03-31T20:03:02.000Z | notebooks/00-data/04-boston.ipynb | mixerupper/mltools-fi_cate | 9a43af3f58d91cadce584863f388111abbb80b39 | [
"MIT"
] | 1 | 2021-06-01T08:45:07.000Z | 2021-06-01T08:45:07.000Z | 28.709821 | 223 | 0.545949 | [
[
[
"import pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt\n%matplotlib inline\nimport seaborn as sns\nsns.set_theme(style=\"darkgrid\")\nimport scipy as sp\nfrom sklearn.ensemble import RandomForestRegressor\nfrom sklearn.impute import SimpleImputer\nfrom datetime import datetime\nfrom sklearn.datasets import load_boston\nimport os",
"_____no_output_____"
],
[
"exec(open(\"../../header.py\").read())",
"_____no_output_____"
],
[
"print(load_boston().DESCR)",
".. _boston_dataset:\n\nBoston house prices dataset\n---------------------------\n\n**Data Set Characteristics:** \n\n :Number of Instances: 506 \n\n :Number of Attributes: 13 numeric/categorical predictive. Median Value (attribute 14) is usually the target.\n\n :Attribute Information (in order):\n - CRIM per capita crime rate by town\n - ZN proportion of residential land zoned for lots over 25,000 sq.ft.\n - INDUS proportion of non-retail business acres per town\n - CHAS Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)\n - NOX nitric oxides concentration (parts per 10 million)\n - RM average number of rooms per dwelling\n - AGE proportion of owner-occupied units built prior to 1940\n - DIS weighted distances to five Boston employment centres\n - RAD index of accessibility to radial highways\n - TAX full-value property-tax rate per $10,000\n - PTRATIO pupil-teacher ratio by town\n - B 1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town\n - LSTAT % lower status of the population\n - MEDV Median value of owner-occupied homes in $1000's\n\n :Missing Attribute Values: None\n\n :Creator: Harrison, D. and Rubinfeld, D.L.\n\nThis is a copy of UCI ML housing dataset.\nhttps://archive.ics.uci.edu/ml/machine-learning-databases/housing/\n\n\nThis dataset was taken from the StatLib library which is maintained at Carnegie Mellon University.\n\nThe Boston house-price data of Harrison, D. and Rubinfeld, D.L. 'Hedonic\nprices and the demand for clean air', J. Environ. Economics & Management,\nvol.5, 81-102, 1978. Used in Belsley, Kuh & Welsch, 'Regression diagnostics\n...', Wiley, 1980. N.B. Various transformations are used in the table on\npages 244-261 of the latter.\n\nThe Boston house-price data has been used in many machine learning papers that address regression\nproblems. \n \n.. topic:: References\n\n - Belsley, Kuh & Welsch, 'Regression diagnostics: Identifying Influential Data and Sources of Collinearity', Wiley, 1980. 
244-261.\n - Quinlan,R. (1993). Combining Instance-Based and Model-Based Learning. In Proceedings on the Tenth International Conference of Machine Learning, 236-243, University of Massachusetts, Amherst. Morgan Kaufmann.\n\n"
]
],
[
[
"# Import Data",
"_____no_output_____"
]
],
[
[
"def boston_df(sklearn_dataset):\n X = pd.DataFrame(sklearn_dataset.data, columns=sklearn_dataset.feature_names)\n y = pd.DataFrame(sklearn_dataset.target, columns = ['MEDV'])\n\n return X, y",
"_____no_output_____"
],
[
"X, y = boston_df(load_boston())",
"_____no_output_____"
],
[
"X.columns",
"_____no_output_____"
]
],
[
[
"# Save data",
"_____no_output_____"
]
],
[
[
"folder_name = 'boston'",
"_____no_output_____"
],
[
"try:\n os.mkdir(processed_root(folder_name))\nexcept FileExistsError:\n print(\"Folder already exists\")",
"_____no_output_____"
],
[
"X.to_csv(processed_root(f\"{folder_name}/X.csv\"), index = False)\ny.to_csv(processed_root(f\"{folder_name}/y.csv\"), index = False)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
d07baf7457a14b30f5efddf29924dab2b1788eca | 208,831 | ipynb | Jupyter Notebook | Notebooks/CreateFigures/Figures13_and_14_new.ipynb | HiRISE-MachineLearning/HiRISE-MachineLearning-Python | 66b81ca24bf8828fbcb218da3359d05349e6c381 | [
"0BSD"
] | null | null | null | Notebooks/CreateFigures/Figures13_and_14_new.ipynb | HiRISE-MachineLearning/HiRISE-MachineLearning-Python | 66b81ca24bf8828fbcb218da3359d05349e6c381 | [
"0BSD"
] | null | null | null | Notebooks/CreateFigures/Figures13_and_14_new.ipynb | HiRISE-MachineLearning/HiRISE-MachineLearning-Python | 66b81ca24bf8828fbcb218da3359d05349e6c381 | [
"0BSD"
] | null | null | null | 513.09828 | 83,192 | 0.944955 | [
[
[
"import matplotlib\nimport matplotlib.pyplot as plt\nimport numpy as np\nimport os\nimport pandas as pd\nfrom sklearn.metrics import confusion_matrix,balanced_accuracy_score,roc_auc_score,roc_curve,recall_score,precision_score\nfrom p4tools import io",
"_____no_output_____"
],
[
"ResultsPath = '../../Data/SummaryResults/'\nFiguresPath = '../../Data/Figures/'\nif not os.path.isdir(FiguresPath):\n os.mkdir(FiguresPath)\n \nNumRepeats=3\n\nResultsList=[]\nfor Rep in range(NumRepeats):\n ResultsList.append(pd.read_csv(ResultsPath+'TileClassifier_LORO_final_repeat'+str(Rep)+'.csv'))\n \nY_true=ResultsList[-1]['GroundTruth'].astype('uint8')\n\nY_pred=ResultsList[0]['ClassifierConf'].values\nfor Rep in range(1,NumRepeats):\n Y_pred=Y_pred+ResultsList[Rep]['ClassifierConf'].values\nY_pred=Y_pred/NumRepeats\n\nResults_df = ResultsList[-1]\nResults_df['ClassifierConf']=Y_pred",
"_____no_output_____"
],
[
"Recall95PC_Threshold=0.5\n\nAUC = roc_auc_score(Y_true, Y_pred)\nconf_matrix = confusion_matrix(Y_true,Y_pred>Recall95PC_Threshold,labels=[0,1])\nSensitivity = conf_matrix[1,1]/(conf_matrix[1,0]+conf_matrix[1,1])\nSpecificity = conf_matrix[0,0]/(conf_matrix[0,0]+conf_matrix[0,1])\nPrecision = conf_matrix[1,1]/(conf_matrix[1,1]+conf_matrix[0,1])\nBalanced_accuracy = balanced_accuracy_score(Y_true, Y_pred>Recall95PC_Threshold)\n\nprint('Number of tiles classified in Leave-One-Region-Out Cross-Validation= ',Results_df.shape[0])\nprint('')\nprint('Confusion matrix = ')\nprint(conf_matrix)\nprint('')\nprint('sensitivity=',round(100*Sensitivity,2),'%')\nprint('Specificity=',round(100*Specificity,2),'%')\nprint('Precision=',round(100*Precision,2),'%')\nprint('AUC=',round(AUC,3))\nprint('Balanced Accuracy =',round(100*Balanced_accuracy))\n\n",
"Number of tiles classified in Leave-One-Region-Out Cross-Validation= 42904\n\nConfusion matrix = \n[[ 9302 599]\n [ 6161 26842]]\n\nsensitivity= 81.33 %\nSpecificity= 93.95 %\nPrecision= 97.82 %\nAUC= 0.934\nBalanced Accuracy = 88.0\n"
],
[
"Recall95PC_Threshold=0.24\n\nAUC = roc_auc_score(Y_true, Y_pred)\nconf_matrix = confusion_matrix(Y_true,Y_pred>Recall95PC_Threshold,labels=[0,1])\nSensitivity = conf_matrix[1,1]/(conf_matrix[1,0]+conf_matrix[1,1])\nSpecificity = conf_matrix[0,0]/(conf_matrix[0,0]+conf_matrix[0,1])\nPrecision = conf_matrix[1,1]/(conf_matrix[1,1]+conf_matrix[0,1])\nBalanced_accuracy = balanced_accuracy_score(Y_true, Y_pred>Recall95PC_Threshold)\n\nprint('Number of tiles classified in Leave-One-Region-Out Cross-Validation= ',Results_df.shape[0])\nprint('')\nprint('Confusion matrix = ')\nprint(conf_matrix)\nprint('')\nprint('sensitivity=',round(100*Sensitivity,2),'%')\nprint('Specificity=',round(100*Specificity,2),'%')\nprint('Precision=',round(100*Precision,2),'%')\nprint('AUC=',round(AUC,3))\nprint('Balanced Accuracy =',round(100*Balanced_accuracy))\n\n",
"Number of tiles classified in Leave-One-Region-Out Cross-Validation= 42904\n\nConfusion matrix = \n[[ 5388 4513]\n [ 1661 31342]]\n\nsensitivity= 94.97 %\nSpecificity= 54.42 %\nPrecision= 87.41 %\nAUC= 0.934\nBalanced Accuracy = 75.0\n"
],
[
"fig = plt.figure(figsize=(10,10))\nfpr, tpr, thresholds = roc_curve(Y_true, Y_pred)\nplt.plot(1-fpr,tpr,linewidth=3)\nplt.xlabel('Specificity',fontsize=20)\nplt.ylabel('Recall',fontsize=20)\n\nplt.plot(Specificity,Sensitivity,'og',linewidth=30,markersize=20)\n\nplt.text(0.5, 0.5, 'AUC='+str(round(AUC,2)), fontsize=20)\nplt.text(0.2, 0.9, '95% recall point at', fontsize=20,color='green')\nplt.text(0.2, 0.85, '54% specificity', fontsize=20,color='green')\n\n\nmatplotlib.rc('xtick', labelsize=20) \nmatplotlib.rc('ytick', labelsize=20)\n\nfig.tight_layout()\nplt.savefig(FiguresPath+'Figure14.pdf')\nplt.show()",
"_____no_output_____"
],
[
"#regions\nregion_names_df = io.get_region_names()\nregion_names_df = region_names_df.set_index('obsid')\nregion_names_df.at['ESP_012620_0975','roi_name'] = 'Buffalo'\nregion_names_df.at['ESP_012277_0975','roi_name'] = 'Buffalo'\nregion_names_df.at['ESP_012348_0975','roi_name'] = 'Taichung'\n\n#other meta data\nImageResults_df = io.get_meta_data()\nImageResults_df = ImageResults_df.set_index('OBSERVATION_ID')\nImageResults_df = pd.concat([ImageResults_df, region_names_df], axis=1, sort=False)\nImageResults_df=ImageResults_df.dropna()\nUniqueP4Regions = ImageResults_df['roi_name'].unique()\nprint(\"Number of P4 regions = \",len(UniqueP4Regions))\n\nBAs=[]\nfor ToLeaveOut in UniqueP4Regions:\n This_df = Results_df[Results_df['Region']==ToLeaveOut]\n y_true = This_df['GroundTruth'].values\n y_pred = This_df['ClassifierConf'].values\n Balanced_accuracy_cl = balanced_accuracy_score(y_true, y_pred>0.5)\n BAs.append(Balanced_accuracy_cl)\nregions_sorted=[x for y, x in sorted(zip(BAs,UniqueP4Regions))]\n\nfig=plt.figure(figsize=(15,15))\nplt.bar(regions_sorted,100*np.array(sorted(BAs)))\nax=fig.gca()\nax.set_xticks(np.arange(0,len(regions_sorted)))\nax.set_xticklabels(regions_sorted,rotation=90,fontsize=20)\nax.set_ylabel('Balanced Accuracy (%)',fontsize=30)\nmatplotlib.rc('xtick', labelsize=30) \nmatplotlib.rc('ytick', labelsize=30)\nfig.tight_layout()\nplt.savefig(FiguresPath+'Figure15.pdf')\nplt.show()",
"Number of P4 regions = 28\n"
],
[
"#regions\nregion_names_df = io.get_region_names()\nregion_names_df = region_names_df.set_index('obsid')\nregion_names_df.at['ESP_012620_0975','roi_name'] = 'Buffalo'\nregion_names_df.at['ESP_012277_0975','roi_name'] = 'Buffalo'\nregion_names_df.at['ESP_012348_0975','roi_name'] = 'Taichung'\n\n#other meta data\nImageResults_df = io.get_meta_data()\nImageResults_df = ImageResults_df.set_index('OBSERVATION_ID')\nImageResults_df = pd.concat([ImageResults_df, region_names_df], axis=1, sort=False)\nImageResults_df=ImageResults_df.dropna()\nUniqueP4Regions = ImageResults_df['roi_name'].unique()\nprint(\"Number of P4 regions = \",len(UniqueP4Regions))\n\nRecalls=[]\nfor ToLeaveOut in UniqueP4Regions:\n This_df = Results_df[Results_df['Region']==ToLeaveOut]\n y_true = This_df['GroundTruth'].values\n y_pred = This_df['ClassifierConf'].values\n Recall_cl = recall_score(y_true, y_pred>Recall95PC_Threshold)\n Recalls.append(Recall_cl)\nregions_sorted=[x for y, x in sorted(zip(Recalls,UniqueP4Regions))]\n\nPrecisions=[]\nfor ToLeaveOut in regions_sorted:\n This_df = Results_df[Results_df['Region']==ToLeaveOut]\n y_true = This_df['GroundTruth'].values\n y_pred = This_df['ClassifierConf'].values\n Precision_cl = precision_score(y_true, y_pred>Recall95PC_Threshold)\n Precisions.append(Precision_cl)\n\nfig=plt.figure(figsize=(15,15))\nplt.bar(regions_sorted,100*np.array(sorted(Recalls)))\nplt.bar(regions_sorted,100*np.array(Precisions),alpha=0.5,color='g')\nax=fig.gca()\nax.set_xticks(np.arange(0,len(regions_sorted)))\nax.set_xticklabels(regions_sorted,rotation=90,fontsize=20)\nax.set_ylabel('Recall (%), Precision (%)',fontsize=30)\nmatplotlib.rc('xtick', labelsize=30) \nmatplotlib.rc('ytick', labelsize=30)\nplt.ylim([0,110])\nfig.tight_layout()\nfig.legend(['Recall','Precision'],fontsize=20,loc='upper right')\nplt.savefig(FiguresPath+'Figure15_new.pdf')\nplt.show()",
"Number of P4 regions = 28\n"
],
[
"sorted(Recalls)",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07bd5b9bfff73e71c351e507810d863bafc02c1 | 199,935 | ipynb | Jupyter Notebook | example/isotree_example.ipynb | ankane/isotree-1 | 67940e7afaf8332bf2ca95722cec90c56280b04f | [
"BSD-2-Clause"
] | 1 | 2020-09-18T17:21:49.000Z | 2020-09-18T17:21:49.000Z | example/isotree_example.ipynb | ankane/isotree-1 | 67940e7afaf8332bf2ca95722cec90c56280b04f | [
"BSD-2-Clause"
] | null | null | null | example/isotree_example.ipynb | ankane/isotree-1 | 67940e7afaf8332bf2ca95722cec90c56280b04f | [
"BSD-2-Clause"
] | null | null | null | 523.390052 | 188,140 | 0.940246 | [
[
[
"# Example 1: Detecting an obvious outlier",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom isotree import IsolationForest\n\n### Random data from a standard normal distribution\nnp.random.seed(1)\nn = 100\nm = 2\nX = np.random.normal(size = (n, m))\n\n### Will now add obvious outlier point (3, 3) to the data\nX = np.r_[X, np.array([3, 3]).reshape((1, m))]\n\n### Fit a small isolation forest model\niso = IsolationForest(ntrees = 10, ndim = 2, nthreads = 1)\niso.fit(X)\n\n### Check which row has the highest outlier score\npred = iso.predict(X)\nprint(\"Point with highest outlier score: \",\n X[np.argsort(-pred)[0], ])",
"Point with highest outlier score: [3. 3.]\n"
]
],
[
[
"# Example 2: Plotting outlier and density regions",
"_____no_output_____"
]
],
[
[
"import numpy as np, pandas as pd\nfrom isotree import IsolationForest\nimport matplotlib.pyplot as plt\nfrom pylab import rcParams\n%matplotlib inline\nrcParams['figure.figsize'] = 10, 8\n\nnp.random.seed(1)\ngroup1 = pd.DataFrame({\n \"x\" : np.random.normal(loc=-1, scale=.4, size = 1000),\n \"y\" : np.random.normal(loc=-1, scale=.2, size = 1000),\n})\ngroup2 = pd.DataFrame({\n \"x\" : np.random.normal(loc=+1, scale=.2, size = 1000),\n \"y\" : np.random.normal(loc=+1, scale=.4, size = 1000),\n})\nX = pd.concat([group1, group2], ignore_index=True)\n\n### Now add an obvious outlier which is within the 1d ranges\n### (As an interesting test, remove it and see what happens,\n### or check how its score changes when using sub-sampling)\nX = X.append(pd.DataFrame({\"x\" : [-1], \"y\" : [1]}), ignore_index = True)\n\n### Single-variable Isolation Forest\niso_simple = IsolationForest(ndim=1, ntrees=100,\n penalize_range=False,\n prob_pick_pooled_gain=0)\niso_simple.fit(X)\n\n### Extended Isolation Forest\niso_ext = IsolationForest(ndim=2, ntrees=100,\n penalize_range=False,\n prob_pick_pooled_gain=0)\niso_ext.fit(X)\n\n### SCiForest\niso_sci = IsolationForest(ndim=2, ntrees=100, ntry=10,\n penalize_range=True,\n prob_pick_avg_gain=1,\n prob_pick_pooled_gain=0)\niso_sci.fit(X)\n\n### Fair-Cut Forest\niso_fcf = IsolationForest(ndim=2, ntrees=100,\n penalize_range=False,\n prob_pick_avg_gain=0,\n prob_pick_pooled_gain=1)\niso_fcf.fit(X)\n\n### Plot as a heatmap\npts = np.linspace(-3, 3, 250)\nspace = np.array( np.meshgrid(pts, pts) ).reshape((2, -1)).T\nZ_sim = iso_simple.predict(space)\nZ_ext = iso_ext.predict(space)\nZ_sci = iso_sci.predict(space)\nZ_fcf = iso_fcf.predict(space)\nspace_index = pd.MultiIndex.from_arrays([space[:, 0], space[:, 1]])\n\ndef plot_space(Z, space_index, X):\n df = pd.DataFrame({\"z\" : Z}, index = space_index)\n df = df.unstack()\n df = df[df.columns.values[::-1]]\n plt.imshow(df, extent = [-3, 3, -3, 3], cmap = 'hot_r')\n plt.scatter(x = X['x'], y = X['y'], alpha = .15, c = 'navy')\n\nplt.suptitle(\"Outlier and Density Regions\", fontsize = 20)\n\nplt.subplot(2, 2, 1)\nplot_space(Z_sim, space_index, X)\nplt.title(\"Isolation Forest\", fontsize=15)\n\nplt.subplot(2, 2, 2)\nplot_space(Z_ext, space_index, X)\nplt.title(\"Extended Isolation Forest\", fontsize=15)\n\nplt.subplot(2, 2, 3)\nplot_space(Z_sci, space_index, X)\nplt.title(\"SCiForest\", fontsize=15)\n\nplt.subplot(2, 2, 4)\nplot_space(Z_fcf, space_index, X)\nplt.title(\"Fair-Cut Forest\", fontsize=15)\nplt.show()\n\nprint(\"(Note that the upper-left corner has an outlier point,\\n\\\n and that there is a slight slide in the axes of the heat colors and the points)\")",
"_____no_output_____"
]
],
[
[
"# Example 3: calculating pairwise distances",
"_____no_output_____"
]
],
[
[
"import numpy as np, pandas as pd\nfrom isotree import IsolationForest\nfrom scipy.spatial.distance import cdist\n\n### Generate random multivariate-normal data\nnp.random.seed(1)\nn = 1000\nm = 10\n\n### This is a random PSD matrix to use as covariance\nS = np.random.normal(size = (m, m))\nS = S.T.dot(S)\n\nmu = np.random.normal(size = m, scale = 2)\nX = np.random.multivariate_normal(mu, S, n)\n\n### Fitting the model\niso = IsolationForest(prob_pick_avg_gain=0, prob_pick_pooled_gain=0)\niso.fit(X)\n\n### Calculate approximate distance\nD_sep = iso.predict_distance(X, square_mat = True)\n\n### Compare against other distances\nD_euc = cdist(X, X, metric = \"euclidean\")\nD_cos = cdist(X, X, metric = \"cosine\")\nD_mah = cdist(X, X, metric = \"mahalanobis\")\n\n### Correlations\nprint(\"Correlations between different distance metrics\")\npd.DataFrame(\n np.corrcoef([D_sep.reshape(-1), D_euc.reshape(-1), D_cos.reshape(-1), D_mah.reshape(-1)]),\n columns = ['SeparationDepth', 'Euclidean', 'Cosine', 'Mahalanobis'],\n index = ['SeparationDepth', 'Euclidean', 'Cosine', 'Mahalanobis']\n)",
"Correlations between different distance metrics\n"
]
],
[
[
"# Example 4: imputing missing values",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom isotree import IsolationForest\n\n### Generate random multivariate-normal data\nnp.random.seed(1)\nn = 1000\nm = 5\n\n### This is a random PSD matrix to use as covariance\nS = np.random.normal(size = (m, m))\nS = S.T.dot(S)\n\nmu = np.random.normal(size = m)\nX = np.random.multivariate_normal(mu, S, n)\n\n### Set some values randomly as missing\nvalues_NA = (np.random.random(size = n * m) <= .15).reshape((n, m))\nX_na = X.copy()\nX_na[values_NA] = np.nan\n\n### Fitting the model\niso = IsolationForest(build_imputer=True, prob_pick_pooled_gain=1, ntry=10)\niso.fit(X_na)\n\n### Impute missing values\nX_imputed = iso.transform(X_na)\nprint(\"MSE for imputed values w/model: %f\\n\" % np.mean((X[values_NA] - X_imputed[values_NA])**2))\n\n### Comparison against simple mean imputation\nX_means = np.nanmean(X_na, axis = 0)\nX_imp_mean = X_na.copy()\nfor cl in range(m):\n X_imp_mean[np.isnan(X_imp_mean[:,cl]), cl] = X_means[cl]\n \nprint(\"MSE for imputed values w/means: %f\\n\" % np.mean((X[values_NA] - X_imp_mean[values_NA])**2))",
"MSE for imputed values w/model: 3.176113\n\nMSE for imputed values w/means: 5.540559\n\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07be0e29f62edb9c7ce08b8566cc89876e111fd | 794,909 | ipynb | Jupyter Notebook | 06_blog/06_blog_01.ipynb | yunshuipiao/hands-on-ml-with-sklearn-tf-python3 | 2b11560a7183a1ca9e823d4586e392456cbdd4d4 | [
"Apache-2.0"
] | 5 | 2018-05-19T06:36:09.000Z | 2022-02-14T10:53:27.000Z | 06_blog/06_blog_01.ipynb | yunshuipiao/hands-on-ml-with-sklearn-tf-python3 | 2b11560a7183a1ca9e823d4586e392456cbdd4d4 | [
"Apache-2.0"
] | null | null | null | 06_blog/06_blog_01.ipynb | yunshuipiao/hands-on-ml-with-sklearn-tf-python3 | 2b11560a7183a1ca9e823d4586e392456cbdd4d4 | [
"Apache-2.0"
] | 3 | 2018-05-01T11:33:34.000Z | 2020-06-23T05:22:35.000Z | 884.214683 | 191,080 | 0.954104 | [
[
[
"# Decision Trees\n-----",
"_____no_output_____"
]
],
[
[
"# Setup\n\n# Common imports\nimport numpy as np\nimport os\n\n# to make this notebook's output stable across runs\nnp.random.seed(42)\n\n# To plot pretty figures\n%matplotlib inline\nimport matplotlib\nimport matplotlib.pyplot as plt\nplt.rcParams['axes.labelsize'] = 14\nplt.rcParams['xtick.labelsize'] = 12\nplt.rcParams['ytick.labelsize'] = 12\n\n# Where to save the figures\nPROJECT_ROOT_DIR = \"..\"\nCHAPTER_ID = \"decision_trees\"\n\ndef image_path(fig_id):\n return os.path.join(PROJECT_ROOT_DIR, \"images\", CHAPTER_ID, fig_id)\n\ndef save_fig(fig_id, tight_layout=True):\n print(\"Saving figure\", fig_id)\n if tight_layout:\n plt.tight_layout()\n plt.savefig(image_path(fig_id) + \".png\", format='png', dpi=300)",
"_____no_output_____"
]
],
[
[
"# 训练与可视化",
"_____no_output_____"
]
],
[
[
"from sklearn.datasets import load_iris\nfrom sklearn.tree import DecisionTreeClassifier\n\niris = load_iris()\nX = iris.data[:, 2:] # petal length and width\ny = iris.target\n\ntree_clf = DecisionTreeClassifier(max_depth=2, random_state=42)\ntree_clf.fit(X, y)",
"_____no_output_____"
],
[
"from sklearn.tree import export_graphviz\n\nexport_graphviz(tree_clf, out_file=image_path(\"iris_tree.dot\"), \n feature_names=iris.feature_names[2:], \n class_names = iris.target_names, \n rounded=True,\n filled=True,\n )",
"_____no_output_____"
]
],
[
[
"The dot file generated above can be converted to an image with the command `$ dot -Tpng iris_tree.dot -o iris_tree.png`, shown below:\n \n\n\nThe figure shows how the tree makes predictions. Suppose you want to classify an iris flower; you start at the root node. First check the petal length: if it is less than 2.45 cm, move down to the left child node (depth 1, left). In this case it is a leaf node, so it asks no further questions and directly predicts the Setosa class. If the petal length is greater than 2.45 cm, move down to the right child node instead. Since it is not a leaf node, it asks another question: if the petal width is less than 1.75 cm, the flower is most likely a Versicolor (depth 2, left); otherwise it is likely a Virginica (depth 2, right).",
"_____no_output_____"
],
[
"The node attributes mean the following: `samples` counts how many training instances the node applies to. For example, the right node at depth 1 contains 100 instances whose petal length is greater than 2.45 cm,\nand 54 of those have a petal width smaller than 1.75 cm. `value` tells how many training instances of each class the node applies to. \nThe `gini` score measures a node's impurity: a score of 0 means all training instances at the node belong to the same class, as in the Setosa leaf. \nThe gini score of the i-th node is given by $G_i = 1 - \\sum_{k=1}^{n} p_{i,k}^{2}$, \nwhere $p_{i,k}$ is the ratio of class-k instances among the training instances at node i. \nFor example, the depth-2 left node has a gini score of $1-(0/54)^{2}-(49/54)^{2}-(5/54)^{2} \\approx 0.168$.",
"_____no_output_____"
],
[
"Note: scikit-learn uses the CART algorithm, which produces only binary trees. Algorithms such as ID3, however, can produce decision trees whose nodes have more than two children.",
"_____no_output_____"
]
],
[
[
"from matplotlib.colors import ListedColormap\n\ndef plot_decision_boundary(clf, X, y, axes=[0, 7.5, 0, 3], iris=True, legend=False, plot_training=True):\n x1s = np.linspace(axes[0], axes[1], 100)\n x2s = np.linspace(axes[2], axes[3], 100)\n x1, x2 = np.meshgrid(x1s, x2s)\n X_new = np.c_[x1.ravel(), x2.ravel()]\n y_pred = clf.predict(X_new).reshape(x1.shape)\n custom_cmap = ListedColormap(['#fafab0','#9898ff','#a0faa0'])\n plt.contourf(x1, x2, y_pred, alpha=0.3, cmap=custom_cmap, linewidth=10)\n if not iris:\n custom_cmap2 = ListedColormap(['#7d7d58','#4c4c7f','#507d50'])\n plt.contour(x1, x2, y_pred, cmap=custom_cmap2, alpha=0.8)\n if plot_training:\n plt.plot(X[:, 0][y==0], X[:, 1][y==0], \"yo\", label=\"Iris-Setosa\")\n plt.plot(X[:, 0][y==1], X[:, 1][y==1], \"bs\", label=\"Iris-Versicolor\")\n plt.plot(X[:, 0][y==2], X[:, 1][y==2], \"g^\", label=\"Iris-Virginica\")\n plt.axis(axes)\n if iris:\n plt.xlabel(\"Petal length\", fontsize=14)\n plt.ylabel(\"Petal width\", fontsize=14)\n else:\n plt.xlabel(r\"$x_1$\", fontsize=18)\n plt.ylabel(r\"$x_2$\", fontsize=18, rotation=0)\n if legend:\n plt.legend(loc=\"lower right\", fontsize=14)\n \nplt.figure(figsize=(8, 4))\nplot_decision_boundary(tree_clf, X, y)\nplt.plot([2.45, 2.45], [0, 3], \"k-\", linewidth=2)\nplt.plot([2.45, 7.5], [1.75, 1.75], \"k--\", linewidth=2)\nplt.plot([4.95, 4.95], [0, 1.75], \"k:\", linewidth=2)\nplt.plot([4.85, 4.85], [1.75, 3], \"k:\", linewidth=2)\nplt.text(1.40, 1.0, \"Depth=0\", fontsize=15)\nplt.text(3.2, 1.80, \"Depth=1\", fontsize=13)\nplt.text(4.05, 0.5, \"(Depth=2)\", fontsize=11)\n\nsave_fig(\"decision_tree_decision_boundaries_plot\")\nplt.show()",
"/anaconda3/lib/python3.6/site-packages/matplotlib/contour.py:967: UserWarning: The following kwargs were not used by contour: 'linewidth'\n s)\n"
]
],
[
[
"The figure above shows this decision tree's decision boundaries. The vertical line represents the decision boundary of the root node (depth 0): petal length = 2.45 cm. \nSince the left area is pure (gini = 0, a single class), it is not split any further. The right area is impure, however, so the depth-1 right node splits it further at petal width = 1.75 cm. \nSince max_depth was set to 2, the decision tree stops right there. If you set max_depth to 3, the two depth-2 nodes would each add another decision boundary (represented by the dotted lines). ",
"_____no_output_____"
],
[
"Note: as you can see, a decision tree's decision process is easy to understand, which is why it is called a white-box model. In contrast, random forests and neural networks are generally considered black-box models. \nThey make great predictions, and you can easily check the calculations they performed to make those predictions, yet it is usually hard to explain why the predictions were made. \nDecision trees provide nice and simple classification rules that can even be applied manually if need be.",
"_____no_output_____"
],
[
"# Making Predictions and Estimating Class Probabilities",
"_____no_output_____"
]
],
[
[
"tree_clf.predict_proba([[5, 1.5]])",
"_____no_output_____"
],
[
"tree_clf.predict([[5, 1.5]])",
"_____no_output_____"
]
],
[
[
"### CART: Classification and Regression Trees\nScikit-learn uses the CART algorithm to train (grow) decision trees. The idea is quite simple: first split the training set into two subsets using a single feature k and a threshold $t_k$ (e.g. petal length <= 2.45 cm). The key question is how to choose this pair. \nThe algorithm searches for the pair (k, $t_k$) that produces the purest subsets, weighted by their size, by minimizing the following cost function:\n#### CART cost function for classification\n$J(k, t_k) = \\frac{m_{left}}{m}G_{left} + \\frac{m_{right}}{m}G_{right} $\n\n\nOnce it has successfully split the training set in two, it splits the subsets recursively using the same logic. It stops when it reaches the given maximum depth (max_depth), or when it cannot find a split that reduces impurity (the subsets are already pure). \nA few other hyperparameters control additional stopping conditions (min_samples_split, min_samples_leaf, min_weight_fraction_leaf, max_leaf_nodes). ",
"_____no_output_____"
],
[
"### Computational Complexity\nBy default, the Gini impurity measure is used, but you can also use the entropy impurity measure instead.\n",
"_____no_output_____"
]
],
[
[
"not_widest_versicolor = (X[:, 1] != 1.8) | (y==2)\nX_tweaked = X[not_widest_versicolor]\ny_tweaked = y[not_widest_versicolor]\n\ntree_clf_tweaked = DecisionTreeClassifier(max_depth=2, random_state=40)\ntree_clf_tweaked.fit(X_tweaked, y_tweaked)",
"_____no_output_____"
],
[
"plt.figure(figsize=(8, 4))\nplot_decision_boundary(tree_clf_tweaked, X_tweaked, y_tweaked, legend=False)\nplt.plot([0, 7.5], [0.8, 0.8], \"k-\", linewidth=2)\nplt.plot([0, 7.5], [1.75, 1.75], \"k-\", linewidth=2)\nplt.text(1.0, 0.9, \"Depth=0\", fontsize=15)\nplt.text(1.0, 1.80, \"Depth=0\", fontsize=13)\n\nsave_fig(\"decision_tree_instability_plot\")\nplt.show()",
"/anaconda3/lib/python3.6/site-packages/matplotlib/contour.py:967: UserWarning: The following kwargs were not used by contour: 'linewidth'\n s)\n"
]
],
[
[
"### Regularization Hyperparameters\nAs the following case shows, to avoid overfitting the training data you need to restrict the decision tree's degrees of freedom; this process is called regularization. \nThe max_depth hyperparameter controls how deep the tree can grow (unrestricted by default). Reducing max_depth will regularize the model and thus reduce the risk of overfitting. ",
"_____no_output_____"
]
],
[
[
"from sklearn.datasets import make_moons\n\nXm, ym = make_moons(n_samples=100, noise=0.25, random_state=53)\n\ndeep_tree_clf1 = DecisionTreeClassifier(random_state=42)\ndeep_tree_clf2 = DecisionTreeClassifier(min_samples_leaf=4, random_state=42)\ndeep_tree_clf1.fit(Xm, ym)\ndeep_tree_clf2.fit(Xm, ym)\n\nplt.figure(figsize=(11, 4))\nplt.subplot(121)\nplot_decision_boundary(deep_tree_clf1, Xm, ym, axes=[-1.5, 2.5, -1, 1.5], iris=False)\nplt.title(\"No restrictions\", fontsize=16)\nplt.subplot(122)\nplot_decision_boundary(deep_tree_clf2, Xm, ym, axes=[-1.5, 2.5, -1, 1.5], iris=False)\nplt.title(\"min_samples_leaf = {}\".format(deep_tree_clf2.min_samples_leaf), fontsize=14)\n\nsave_fig(\"min_samples_leaf_plot\")\nplt.show()",
"/anaconda3/lib/python3.6/site-packages/matplotlib/contour.py:967: UserWarning: The following kwargs were not used by contour: 'linewidth'\n s)\n"
]
],
[
[
"The DecisionTreeClassifier class has a few other hyperparameters: min_samples_split (the minimum number of samples a node must have before it can be split), min_samples_leaf (the minimum number of samples a leaf node must have), min_weight_fraction_leaf (same as min_samples_leaf, but expressed as a fraction of the total number of weighted instances), max_leaf_nodes (the maximum number of leaf nodes) and max_features (the maximum number of features evaluated for splitting at each node). Increasing min_* hyperparameters or reducing max_* hyperparameters will regularize the model.\n\nOther algorithms work by first training the decision tree without restrictions, then deleting (pruning) unnecessary nodes. \nA node whose children all provide purity improvements that are not statistically significant is considered unnecessary, along with its children. ",
"_____no_output_____"
]
],
[
[
"angle = np.pi / 180 * 20\nrotation_maxtrix = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])\nXr = X.dot(rotation_maxtrix)\n\ntree_clf_r = DecisionTreeClassifier(random_state=42)\ntree_clf_r.fit(Xr, y)\n\n\nplt.figure(figsize=(8, 3))\nplot_decision_boundary(tree_clf_r, Xr, y, axes=[0.5, 7.5, -1.0, 1], iris=False)\n\nplt.show()",
"/anaconda3/lib/python3.6/site-packages/matplotlib/contour.py:967: UserWarning: The following kwargs were not used by contour: 'linewidth'\n s)\n"
]
],
[
[
"# Instability\nSo far, decision trees have a lot going for them: they are simple to understand and interpret, easy to use, versatile, and powerful. \nHowever, they do have a few limitations. First, decision trees love orthogonal decision boundaries (all splits are perpendicular to an axis), which makes them sensitive to training set rotation. As the right-hand figure below shows, after a 45-degree rotation the tree still classifies the data well, but the boundary will likely not generalize well. One way to work around this problem is PCA (introduced later).\n\nMore generally, decision trees are very sensitive to small variations in the training data. For example, removing a single instance, as in the figure above, yields a very different classification result. \nRandom forests can limit this instability by averaging predictions over many trees, and they cope better with outliers and small variations.",
"_____no_output_____"
]
],
[
[
"np.random.seed(6)\nXs = np.random.rand(100, 2) - 0.5\nys = (Xs[:, 0] > 0).astype(np.float32) * 2\n\nangle = np.pi / 4\nrotation_matrix = np.array([[np.cos(angle), -np.sin(angle)], [np.sin(angle), np.cos(angle)]])\nXsr = Xs.dot(rotation_matrix)\n\ntree_clf_s = DecisionTreeClassifier(random_state=42)\ntree_clf_s.fit(Xs, ys)\ntree_clf_sr = DecisionTreeClassifier(random_state=42)\ntree_clf_sr.fit(Xsr, ys)\n\nplt.figure(figsize=(11, 4))\nplt.subplot(121)\nplot_decision_boundary(tree_clf_s, Xs, ys, axes=[-0.7, 0.7, -0.7, 0.7], iris=False)\nplt.subplot(122)\nplot_decision_boundary(tree_clf_sr, Xsr, ys, axes=[-0.7, 0.7, -0.7, 0.7], iris=False)\n\nsave_fig(\"sensitivity_to_rotation_plot\")\nplt.show()",
"/anaconda3/lib/python3.6/site-packages/matplotlib/contour.py:967: UserWarning: The following kwargs were not used by contour: 'linewidth'\n s)\n"
]
],
[
[
"### Regression Trees",
"_____no_output_____"
]
],
[
[
"import numpy as np\n# Quadratic training set plus noise\nnp.random.seed(42)\nm = 200\nX = np.random.rand(m ,1)\ny = 4 * (X - 0.5) ** 2\ny = y + np.random.randn(m, 1) / 10",
"_____no_output_____"
],
[
"from sklearn.tree import DecisionTreeRegressor\n\ntree_reg = DecisionTreeRegressor(max_depth=2, random_state=42)\ntree_reg.fit(X, y)",
"_____no_output_____"
]
],
[
[
"This regression decision tree has a maximum depth of 2; the dot output looks like this:\n \nIt is very similar to the classification tree. The main difference is that instead of predicting a class at each node, it predicts a value. \nFor example, suppose you want to make a prediction for x1 = 0.6. You traverse the tree starting at the root and eventually reach a leaf node that predicts value = 0.1106. \nThis prediction is simply the average target value of the 110 training instances associated with this leaf node. It results in a mean squared error (MSE) equal to 0.0151 over these 110 instances.\n\nNote that the predicted value for each region is always the average target value of the instances in that region. The algorithm splits each region in a way that makes most training instances as close as possible to that predicted value.",
"_____no_output_____"
]
],
[
[
"from sklearn.tree import DecisionTreeRegressor\n\ntree_reg1 = DecisionTreeRegressor(random_state=42, max_depth=2)\ntree_reg2 = DecisionTreeRegressor(random_state=42, max_depth=3)\ntree_reg1.fit(X, y)\ntree_reg2.fit(X, y)\n\n\ndef plot_regression_predictions(tree_reg, X, y, axes=[0, 1, -0.2, 1], ylabel=\"$y$\"):\n x1 = np.linspace(axes[0], axes[1], 500).reshape(-1, 1)\n y_pred = tree_reg.predict(x1)\n plt.axis(axes)\n \n if ylabel:\n plt.ylabel(ylabel, fontsize=18, rotation=0)\n plt.plot(X, y, \"b.\")\n plt.plot(x1, y_pred, \"r.-\", linewidth=2, label=r\"$\\hat{y}$\")\n \n\nplt.figure(figsize=(11, 4))\nplt.subplot(121)\nplot_regression_predictions(tree_reg1, X, y)\nfor split, style in ((0.1973, \"k-\"), (0.0917, \"k--\"), (0.7718, \"k--\")):\n plt.plot([split, split], [-0.2, 1], style, linewidth=2)\nplt.text(0.21, 0.65, \"Depth=0\", fontsize=15)\nplt.text(0.01, 0.2, \"Depth=1\", fontsize=13)\nplt.text(0.65, 0.8, \"Depth=1\", fontsize=13)\nplt.legend(loc=\"upper center\", fontsize=18)\nplt.title(\"max_depth=2\", fontsize=14)\n\nplt.subplot(122)\nplot_regression_predictions(tree_reg2, X, y, ylabel=None)\nfor split, style in ((0.1973, \"k-\"), (0.0917, \"k--\"), (0.7718, \"k--\")):\n plt.plot([split, split], [-0.2, 1], style, linewidth=2)\nfor split in (0.0458, 0.1298, 0.2873, 0.9040):\n plt.plot([split, split], [-0.2, 1], \"k:\", linewidth=1)\nplt.text(0.3, 0.5, \"Depth=2\", fontsize=13)\nplt.title(\"max_depth=3\", fontsize=14)\n\nsave_fig(\"tree_regression_plot\")\nplt.show()",
"Saving figure tree_regression_plot\n"
],
[
"# Export the regression tree to a dot file\nexport_graphviz(\n tree_reg1,\n out_file=image_path(\"regression_tree.dot\"),\n feature_names=[\"x1\"],\n rounded=True,\n filled=True\n )",
"_____no_output_____"
],
[
"tree_reg1 = DecisionTreeRegressor(random_state=42)\ntree_reg2 = DecisionTreeRegressor(random_state=42, min_samples_leaf=10)\ntree_reg1.fit(X, y)\ntree_reg2.fit(X, y)\n\nx1 = np.linspace(0, 1, 500).reshape(-1, 1)\ny_pred1 = tree_reg1.predict(x1)\ny_pred2 = tree_reg2.predict(x1)\n\nplt.figure(figsize=(11, 4))\n\nplt.subplot(121)\nplt.plot(X, y, \"b.\")\nplt.plot(x1, y_pred1, \"r.-\", linewidth=2, label=r\"$\\hat{y}$\")\nplt.axis([0, 1, -0.2, 1.1])\nplt.xlabel(\"$x_1$\", fontsize=18)\nplt.ylabel(\"$y$\", fontsize=18, rotation=0)\nplt.legend(loc=\"upper center\", fontsize=18)\nplt.title(\"No restrictions\", fontsize=14)\n\nplt.subplot(122)\nplt.plot(X, y, \"b.\")\nplt.plot(x1, y_pred2, \"r.-\", linewidth=2, label=r\"$\\hat{y}$\")\nplt.axis([0, 1, -0.2, 1.1])\nplt.xlabel(\"$x_1$\", fontsize=18)\nplt.title(\"min_samples_leaf={}\".format(tree_reg2.min_samples_leaf), fontsize=14)\n\nsave_fig(\"tree_regression_regularization_plot\")",
"Saving figure tree_regression_regularization_plot\n"
]
],
[
[
"\nAs the figures above show, the regression tree splits the data set by minimizing the MSE. \nDecision trees are prone to overfitting when dealing with regression tasks. \nIn the left figure, with default hyperparameters and no restrictions, the model easily overfits. Setting min_samples_leaf constrains the model more reasonably. ",
"_____no_output_____"
],
[
"# Exercises",
"_____no_output_____"
],
[
"#### 1. What is the approximate depth of a decision tree trained (without restrictions) on a training set with one million instances?",
"_____no_output_____"
],
[
"#### 2. Is a node's Gini impurity generally lower or greater than its parent's? Is it generally the case, or always the case?",
"_____no_output_____"
],
[
"#### 3. If a decision tree is overfitting the training set, is it a good idea to try decreasing max_depth?",
"_____no_output_____"
],
[
"#### 4. If a decision tree is underfitting the training set, is it a good idea to try scaling the input features?",
"_____no_output_____"
],
[
"#### 5. If it takes one hour to train a decision tree on a training set containing 1 million instances, roughly how much time will it take to train another decision tree on a training set containing 10 million instances?",
"_____no_output_____"
],
[
"#### 6. If your training set contains 100,000 instances, will setting presort=True speed up training?",
"_____no_output_____"
],
[
"#### 7. Train and fine-tune a decision tree for the moons dataset.\na. Generate a moons dataset using make_moons(n_samples=10000, noise=0.4). \nb. Split it into a training set and a test set using train_test_split().\nc. Use grid search with cross-validation to find good hyperparameter values. Try various values for max_leaf_nodes.\nd. Train it on the full training set using these hyperparameters, and measure your model's performance on the test set. You should get roughly 85% to 87% accuracy.",
"_____no_output_____"
],
[
"#### 8. Grow a forest. \na. Continuing the previous exercise, generate 1,000 subsets of the training set, each containing 100 instances selected randomly.\nb. Train one decision tree on each subset, using the best hyperparameter values found above, and evaluate these 1,000 trees on the test set. \nSince they were trained on smaller sets, these decision trees will likely perform worse than the first decision tree, achieving only about 80% accuracy.\nc. For each test set instance, generate the predictions of the 1,000 decision trees, and keep only the most frequent prediction (you can use SciPy's mode() function for this). This gives you majority-vote predictions over the test set. \nd. Evaluate these predictions on the test set: you should obtain a slightly higher accuracy than your first model (about 0.5 to 1.5% higher). Congratulations, you have trained a random forest classifier!",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
d07befb4b0f0b6f43e73200e80fba734a9ed27de | 199,324 | ipynb | Jupyter Notebook | examples/05-example-save-load-networks-shapes.ipynb | tanvim/test | 3fc7deab27a33183e49e7868aac361a650727ff3 | [
"MIT"
] | null | null | null | examples/05-example-save-load-networks-shapes.ipynb | tanvim/test | 3fc7deab27a33183e49e7868aac361a650727ff3 | [
"MIT"
] | null | null | null | examples/05-example-save-load-networks-shapes.ipynb | tanvim/test | 3fc7deab27a33183e49e7868aac361a650727ff3 | [
"MIT"
] | null | null | null | 922.796296 | 187,232 | 0.93792 | [
[
[
"# Use OSMnx to construct place boundaries and street networks, and save as various file formats for working with later\n\n - [Overview of OSMnx](http://geoffboeing.com/2016/11/osmnx-python-street-networks/)\n - [GitHub repo](https://github.com/gboeing/osmnx)\n - [Examples, demos, tutorials](https://github.com/gboeing/osmnx/tree/master/examples)",
"_____no_output_____"
]
],
[
[
"import osmnx as ox\n%matplotlib inline\nox.config(log_file=True, log_console=True, use_cache=True)",
"_____no_output_____"
],
[
"place = 'Piedmont, California, USA'",
"_____no_output_____"
]
],
[
[
"## Get place shape geometries from OSM and save as shapefile",
"_____no_output_____"
]
],
[
[
"gdf = ox.gdf_from_place(place)\ngdf.loc[0, 'geometry']",
"_____no_output_____"
],
[
"# save place boundary geometry as ESRI shapefile\nox.save_gdf_shapefile(gdf, filename='place-shape')",
"_____no_output_____"
]
],
[
[
"## Construct street network and save as shapefile to work with in GIS",
"_____no_output_____"
]
],
[
[
"G = ox.graph_from_place(place, network_type='drive')\nG_projected = ox.project_graph(G)",
"_____no_output_____"
],
[
"# save street network as ESRI shapefile\nox.save_graph_shapefile(G_projected, filename='network-shape')",
"_____no_output_____"
]
],
[
[
"## Save street network as GraphML to work with in Gephi or NetworkX",
"_____no_output_____"
]
],
[
[
"# save street network as GraphML file\nox.save_graphml(G_projected, filename='network.graphml')",
"_____no_output_____"
]
],
[
[
"## Save street network as SVG to work with in Illustrator",
"_____no_output_____"
]
],
[
[
"# save street network as SVG\nfig, ax = ox.plot_graph(G_projected, show=False, save=True, \n filename='network', file_format='svg')",
"_____no_output_____"
]
],
[
[
"## Load street network from saved GraphML file",
"_____no_output_____"
]
],
[
[
"G2 = ox.load_graphml('network.graphml')\nfig, ax = ox.plot_graph(G2)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07bfa328056cf8d4d9677b187f8fff3cc986a3a | 17,906 | ipynb | Jupyter Notebook | _notebooks/Matrix Data Scrapper.ipynb | lalalaNomNomNom/Notebooks | 91f9d184fc75df125615ef2ed0b5f389ab482df5 | [
"Apache-2.0"
] | null | null | null | _notebooks/Matrix Data Scrapper.ipynb | lalalaNomNomNom/Notebooks | 91f9d184fc75df125615ef2ed0b5f389ab482df5 | [
"Apache-2.0"
] | null | null | null | _notebooks/Matrix Data Scrapper.ipynb | lalalaNomNomNom/Notebooks | 91f9d184fc75df125615ef2ed0b5f389ab482df5 | [
"Apache-2.0"
] | null | null | null | 36.692623 | 162 | 0.628337 | [
[
[
"# TAG MATRIX DATA SCRAPER\n- author: Richard Castro \n- sticky_rank: 1\n- toc: true\n- badges: true\n- comments: false\n- categories: [Matrix]\n- image: images/scraper.jpg",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport requests\nimport datetime\ndate=datetime.datetime.now().strftime(\"%Y-%m-%d\")\nimport dash\nimport dash_core_components as dcc",
"_____no_output_____"
],
[
"df=pd.read_csv(\"https://raw.githubusercontent.com/pcm-dpc/COVID-19/master/dati-province/dpc-covid19-ita-province.csv\")\ndf.to_csv('../data/clients/CHEP/'+date+'-Italy.csv')",
"_____no_output_____"
]
],
[
[
"# CANADA STUFF",
"_____no_output_____"
]
],
[
[
"#SOURCE - https://github.com/eebrown/data2019nCoV\ndf=pd.read_csv('https://raw.githubusercontent.com/eebrown/data2019nCoV/master/data-raw/covid19.csv')\ndf=df.rename(columns={'pruid':'uid', 'prname':'province'})\ncol=['uid', 'province','date', 'numconf', 'numprob',\n 'numdeaths', 'numtotal', 'numtested', 'numrecover', 'percentrecover',\n 'ratetested', 'numtoday', 'percentoday', 'ratetotal', 'ratedeaths',\n 'numdeathstoday', 'percentdeath', 'numtestedtoday', 'numrecoveredtoday',\n 'percentactive', 'numactive', 'rateactive', 'numtotal_last14',\n 'ratetotal_last14', 'numdeaths_last14', 'ratedeaths_last14']\ndf_ca=df[col]\ndf_ca.set_index('date', inplace=True)\ndf_ca.to_csv('../data/sources/canada/'+date+'-eeBrown.csv')",
"_____no_output_____"
],
[
"#SOURCE - https://www12.statcan.gc.ca/census-recensement/index-eng.cfm\ndf=pd.read_csv('https://www12.statcan.gc.ca/census-recensement/2016/dp-pd/hlt-fst/pd-pl/Tables/CompFile.cfm?Lang=Eng&T=301&OFT=FULLCSV')\ndf_cacen=df\ndf_cacen.to_csv('../data/sources/canada/'+date+'-ca_census.csv')",
"_____no_output_____"
],
[
"#SOURCE - ISHABERRY\n#PROVINCE LEVEL CASE DATA\ndf=pd.read_csv('https://raw.githubusercontent.com/ishaberry/Covid19Canada/master/timeseries_prov/cases_timeseries_prov.csv')\ndf.rename(columns={'date_report':'date'}, inplace=True)\ndf.set_index('date')\ndf_Isha=df\ndf_Isha.to_csv('../data/sources/canada/'+date+'Isha_Prov_Cases.csv')",
"_____no_output_____"
],
[
"#SOURCE - ISHABERRY\n#HEALTH REGION LEVEL CASE DATA\ndf=pd.read_csv('https://raw.githubusercontent.com/ishaberry/Covid19Canada/master/timeseries_hr/cases_timeseries_hr.csv')\ndf.rename(columns={'date_report':'date'}, inplace=True)\ndf.set_index('date')\ndf_Isha=df\ndf_Isha.to_csv('../data/sources/canada/'+date+'Isha_HR_Cases.csv')\n",
"_____no_output_____"
],
[
"#SOURCE - ISHABERRY\n#PROVINCE LEVEL TEST DATA\ndf=pd.read_csv('https://raw.githubusercontent.com/ishaberry/Covid19Canada/master/timeseries_prov/testing_timeseries_prov.csv')\ndf.rename(columns={'date_testing':'date'}, inplace=True)\ndf.set_index('date')\ndf_Isha=df\ndf_Isha.to_csv('../data/sources/canada/'+date+'Isha_Province_Testing.csv')",
"_____no_output_____"
]
],
[
[
"# World-o-Meter",
"_____no_output_____"
]
],
[
[
"#WORLD O METER DATA\n#NEW YORK COUNTY DATA\nimport datetime\ndate=datetime.datetime.now().strftime(\"%Y-%m-%d\")\n\nweb=requests.get('https://www.worldometers.info/coronavirus/usa/new-york')\nny=pd.read_html(web.text)\nny=ny[1]\nny.columns=map(str.lower, ny.columns)\nny.to_csv('../data/sources/worldometer/'+date+'-NY-County-Data.csv')\n\n#CALIFORNIA COUNTY DATA\ncad=requests.get('https://www.worldometers.info/coronavirus/usa/california')\nca=pd.read_html(cad.text)\nca=ca[1]\nca.columns=map(str.lower, ca.columns)\nca.to_csv('../data/sources/worldometer/'+date+'-CA-County-Data.csv')\n\n#NEW JERSEY COUNTY DATA\nnjd=requests.get('https://www.worldometers.info/coronavirus/usa/new-jersey')\nnj=pd.read_html(njd.text)\nnj=nj[1]\nnj.columns=map(str.lower, nj.columns)\nnj.to_csv('../data/sources/worldometer/'+date+'-NJ-County-Data.csv')\n\n#OHIO COUNTY DATA\nohd=requests.get('https://www.worldometers.info/coronavirus/usa/ohio/')\noh=pd.read_html(ohd.text)\noh=oh[1]\noh.columns=map(str.lower, oh.columns)\noh.to_csv('../data/sources/worldometer/'+date+'-OH-County-Data.csv')\n\n#SOUTH CAROLINA COUNTY DATA\nscd=requests.get('https://www.worldometers.info/coronavirus/usa/south-carolina/')\nsc=pd.read_html(scd.text)\nsc=sc[1]\nsc.columns=map(str.lower, sc.columns)\nsc.to_csv('../data/sources/worldometer/'+date+'-SC-County-Data.csv')\n\n#PA COUNTY DATA\npad=requests.get('https://www.worldometers.info/coronavirus/usa/pennsylvania/')\npa=pd.read_html(pad.text)\npa=pa[1]\npa.columns=map(str.lower, pa.columns)\npa.to_csv('../data/sources/worldometer/'+date+'-PA-County-Data.csv')\n\n#WASHINGTON COUNTY DATA\nwad=requests.get('https://www.worldometers.info/coronavirus/usa/washington/')\nwa=pd.read_html(wad.text)\nwa=wa[1]\nwa.columns=map(str.lower, wa.columns)\nwa.to_csv('../data/sources/worldometer/'+date+'-WA-County-Data.csv')\n\n#US STATE LEVEL 
DATA\nwe=requests.get('https://www.worldometers.info/coronavirus/country/us/')\nus=pd.read_html(we.text)\nus=us[1]\nus.to_csv('../data/sources/worldometer/'+date+'-US-State-Data.csv')",
"_____no_output_____"
]
],
[
[
"# rt live",
"_____no_output_____"
]
],
[
[
"rtlive=pd.read_csv('https://d14wlfuexuxgcm.cloudfront.net/covid/rt.csv')\nrtlive.to_csv('../_data/data_sources/rtlive/rtlive'+date+'.csv')",
"_____no_output_____"
]
],
[
[
"# Mobility Reports\n\n<ul>\n <li>Google Mobility Reports</li>\n <li>Apple Mobility Reports</li>\n</ul>",
"_____no_output_____"
]
],
[
[
"#GOOGLE AND APPLE MOBILITY DATA BY COUNTY \n#apple=pd.read_csv('https://covid19-static.cdn-apple.com/covid19-mobility-data/2014HotfixDev8/v3/en-us/applemobilitytrends-2020-08-08.csv')\n#apple.to_csv('../_data/Data_Sources/Mobility_Reports/apple.csv')\ngoogle=pd.read_csv('https://www.gstatic.com/covid19/mobility/Global_Mobility_Report.csv')\ngoogle.to_csv('../_data/Data_Sources/google/google.csv')",
"_____no_output_____"
]
],
[
[
"# WORLD-O-METER DATASETS\n\n<ul>\n <li>NEW YORK\n <li>CALIFORNIA\n <li>NEW JERSEY\n <li>PA\n <li>SOUTH CAROLINA\n <li>OHIO\n <li>WASHINGTON STATE\n</ul>\n",
"_____no_output_____"
]
],
[
[
"healthDepartment=requests.get('https://data.ct.gov/Health-and-Human-Services/COVID-19-Tests-Cases-and-Deaths-By-Town-/28fr-iqnx/data')\nhd=pd.read_html(healthDepartment.text)",
"_____no_output_____"
]
],
[
[
"# COUNTY HEALTH DEPARTMANT DATASETS\n",
"_____no_output_____"
]
],
[
[
"#HEALTH DEPARTMENTS DATA\nflData=pd.read_csv('https://opendata.arcgis.com/datasets/222c9d85e93540dba523939cfb718d76_0.csv?outSR=%7B%22latestWkid%22%3A4326%2C%22wkid%22%3A4326%7D')\nflData.to_csv('../_data/Data_Sources/Fl-Data.csv')\n\nmiData=pd.read_excel('https://www.michigan.gov/documents/coronavirus/Covid-19_Tests_by_County_2020-08-08_698830_7.xlsx')\nmiData.to_csv('../_data/Data_Sources/Health-Department-Data/MI-Tests-County.csv')\n\nmiData2=pd.read_excel('https://www.michigan.gov/documents/coronavirus/Cases_by_County_and_Date_2020-08-08_698828_7.xlsx')\nmiData2.to_csv('../_data/Data_Sources/Health-Department-Data/MI-Cases-County.csv')\n\nmiData3=pd.read_excel('https://www.michigan.gov/documents/coronavirus/Cases_and_Deaths_by_County_2020-08-08_698827_7.xlsx')\nmiData3.to_csv('../_data/Data_Sources/Health-Department-Data/MI-Deaths-Cases-County.csv')\n\n\nmiData4=pd.read_csv('https://raw.githubusercontent.com/jeffcore/covid-19-usa-by-state/master/COVID-19-Cases-USA-By-County.csv')\nmiData4.to_csv('../_data/Data_Sources/Health-Department-Data/COVID-19-Cases-USA-By-County.csv')\n\nmiData5=pd.read_csv('https://raw.githubusercontent.com/jeffcore/covid-19-usa-by-state/master/COVID-19-Deaths-USA-By-County.csv')\nmiData5.to_csv('../_data/Data_Sources/Health-Department-Data/COVID-19-Deaths-USA-By-County.csv')\n\nmiData6=pd.read_csv('https://raw.githubusercontent.com/jeffcore/covid-19-usa-by-state/master/COVID-19-Cases-USA-By-State.csv')\nmiData6.to_csv('../_data/Data_Sources/Health-Department-Data/COVID-19-Cases-USA-By-State.csv')\n\nmiData7=pd.read_csv('https://raw.githubusercontent.com/jeffcore/covid-19-usa-by-state/master/COVID-19-Deaths-USA-By-State.csv')\nmiData7.to_csv('../_data/Data_Sources/Health-Department-Data/COVID-19-Deaths-USA-By-State.csv')",
"_____no_output_____"
],
[
"#ANOTHER MODULE \nfrom bs4 import BeautifulSoup\nurl=requests.get('https://covidactnow.org/us/fl/county/taylor_county?s=846164')\nsoup = BeautifulSoup(requests.get(url).text)\nsoup.findAll(\"table\")[0].findAll(\"tr\")[0]",
"_____no_output_____"
]
],
[
[
"# COVID TRACKER DATA",
"_____no_output_____"
]
],
[
[
"#COVID TRACKER DATA\n\nda1=pd.read_html('https://covidtracking.com/data/state/alabama')\nda1[1].to_csv('../_data/Data_Sources/CovidTracker/Alabama.csv')\n\nda2=pd.read_html('https://covidtracking.com/data/state/alaska')\nda2[1].to_csv('../_data/Data_Sources/CovidTracker/Alaska.csv')\n\nda3=pd.read_html('https://covidtracking.com/data/state/arizona')\nda3[1].to_csv('../_data/Data_Sources/CovidTracker/Arizona.csv')\n\nAR_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/arkansas')\nAR_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-ARKANSAS.csv')\n\nCA_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/california')\nCA_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-CALIFORNIA.csv') \n\nGA_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/GEORGIA')\nGA_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-GEORGIA.csv')\n\nKS_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/KANSAS')\nKS_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-KANSAS.csv')\n\nFL_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/FLORIDA')\nFL_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-FLORIDA.csv')\n\nIL_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/ILLINOIS')\nIL_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-ILLINOIS.csv')\n\nOH_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/OHIO')\nOH_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-OHIO.csv')\n\nTN_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/TENNESSEE')\nTN_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-TENNESSEE.csv')\n\nNE_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/NEBRASKA')\nNE_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-NEBRASKA.csv')\n\nPA_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/PENNSYLVA
NIA')\nPA_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-PENNSYLVANIA.csv')\n\nNC_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/NORTH-CAROLINA')\nNC_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-NORTHCAROLINA.csv')\n\nKY_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/KENTUCKY')\nKY_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-KENTUCKY.csv')\n\nCO_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/COLORADO')\nCO_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-COLORADO.csv')\n\nKS_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/KENTUCKY')\nKS_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-KENTUCKY.csv')\n\nNJ_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/NEW-JERSEY')\nNJ_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-NEWJERSEY.csv')\n\nMN_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/MINNESOTA')\nMN_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-MINNESOTA.csv')\n\nMI_COVIDTRACKER=pd.read_html('https://covidtracking.com/data/state/MICHIGAN')\nMI_COVIDTRACKER[1].to_csv('../_data/data_sources/covidtracker/'+date+'-MICHIGAN.csv')",
"_____no_output_____"
]
],
[
[
"# NEW YORK TIMES DATA",
"_____no_output_____"
]
],
[
[
"#NEW YORK TIMES DATA\nMASK_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/mask-use/mask-use-by-county.csv')\nMASK_NYT.to_csv('../_data/data_sources/NYT/MASKUSAGE-'+date+'.csv')\n\nCASESDEATHSC_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/us-counties.csv')\nCASESDEATHSC_NYT.to_csv('../_data/data_sources/NYT/CASES-DEATHS-COUNTY-'+date+'.csv')\n\nCASESDEATHS_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/us-states.csv')\nCASESDEATHS_NYT.to_csv('../_data/data_sources/NYT/CASES-DEATHS-STATE-'+date+'.csv')\n\nCASESDEATHSCD_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/live/us-counties.csv')\nCASESDEATHSCD_NYT.to_csv('../_data/data_sources/NYT/CASES-DEATHS-COUNTY-DAILY-'+date+'.csv')\n\nCASESDEATHSD_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/live/us-states.csv')\nCASESDEATHSD_NYT.to_csv('../_data/data_sources/NYT/CASES-DEATHS-STATE-DAILY-'+date+'.csv')\n\nEXDEATHS_NYT=pd.read_csv('https://raw.githubusercontent.com/nytimes/covid-19-data/master/excess-deaths/deaths.csv')\nEXDEATHS_NYT.to_csv('../_data/data_sources/NYT/EXCESS-DEATHS-CITY-'+date+'.csv')",
"_____no_output_____"
]
],
[
[
"# linlab",
"_____no_output_____"
]
],
[
[
"LINLAB=pd.read_csv('https://raw.githubusercontent.com/lin-lab/COVID19-Viz/master/clean_data/rt_table_export.csv')\nLINLAB.to_csv('../data/sources/LINLAB/'+date+'-RTCOUNTYLEVEL.csv')",
"_____no_output_____"
],
[
"wew=requests.get('https://www.geonames.org/statistics/')\nus=pd.read_html(wew.text)\nus[1].to_csv('../_data/data_sources/worldCountryPopulations.csv')\n\n",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
d07c0aa76191118a3f4f8bc322300a747e5fd1ce | 1,549 | ipynb | Jupyter Notebook | Data-Scrapping/webApi/GoogleApi.ipynb | pulkit10251/cbData-Science | 93350f07a02e34cd0507cbb69229e6926fa17a62 | [
"MIT"
] | null | null | null | Data-Scrapping/webApi/GoogleApi.ipynb | pulkit10251/cbData-Science | 93350f07a02e34cd0507cbb69229e6926fa17a62 | [
"MIT"
] | null | null | null | Data-Scrapping/webApi/GoogleApi.ipynb | pulkit10251/cbData-Science | 93350f07a02e34cd0507cbb69229e6926fa17a62 | [
"MIT"
] | null | null | null | 19.123457 | 139 | 0.528728 | [
[
[
"import requests",
"_____no_output_____"
],
[
"url=\"https://maps.googleapis.com/maps/api/geocode/json?\"",
"_____no_output_____"
],
[
"parameters={\n \"address\":\"naman poojan Samagri rohini\",\n \"key\":\"AIzaSyDxpzAOiOie2lqiUfMhWegOvmbKH25TNlE\"\n}",
"_____no_output_____"
],
[
"r=requests.get(url,parameters)\nprint(r.url)",
"https://maps.googleapis.com/maps/api/geocode/json?address=naman+poojan+Samagri+rohini&key=AIzaSyDxpzAOiOie2lqiUfMhWegOvmbKH25TNlE\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code"
]
] |
d07c10f15749ea770ef0c6bae3d5b11659255dce | 2,777 | ipynb | Jupyter Notebook | TTR.ipynb | Mukesh7197/anlp-1 | eff3de6349b9ba4cab3702d36ecbacb6cb551611 | [
"MIT"
] | 40 | 2019-08-03T08:35:15.000Z | 2022-02-18T04:46:59.000Z | TTR.ipynb | Mukesh7197/anlp-1 | eff3de6349b9ba4cab3702d36ecbacb6cb551611 | [
"MIT"
] | 2 | 2019-08-19T09:15:44.000Z | 2020-09-30T05:41:43.000Z | TTR.ipynb | Mukesh7197/anlp-1 | eff3de6349b9ba4cab3702d36ecbacb6cb551611 | [
"MIT"
] | 66 | 2019-08-04T23:40:26.000Z | 2021-11-23T18:11:50.000Z | 31.91954 | 214 | 0.601368 | [
[
[
"<a href=\"https://colab.research.google.com/github/Ramaseshanr/ANLP/blob/master/TTR.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"# MIT License\n# Copyright (c) 2019.\n# Project Name: ANLP\n# File name: code.py\n# Last modification date: 2/28/19 2:32 PM\n# Author: ramaseshan\n# Email:[email protected]\n# /TTR.py\n# \n\nimport nltk\nfrom nltk.corpus import stopwords\n#get the stop words for English\nstop_words = set(stopwords.words('english'))\nwords_bryant = nltk.Text(nltk.corpus.gutenberg.words('bryant-stories.txt'))\nwords_emma = nltk.Text(nltk.corpus.gutenberg.words('austen-emma.txt'))\n#convert to small letters\nwords_bryant = [word.lower() for word in words_bryant if word.isalpha()]\nwords_emma = [word.lower() for word in words_emma if word.isalpha()]\n#remove stop words\nwords_bryant = [word.lower() for word in words_bryant if word not in stop_words][:15000]\nwords_emma = [word.lower() for word in words_emma if word not in stop_words][:15000]\nTTR_bryant = len(set(words_bryant))/len(words_bryant)\nTTR_emma = len(set(words_emma))/len(words_emma)\nprint('Number of tokens, Vocabulary, Type-token ratio (Bryant stories) = ', len(words_bryant), len(set(words_bryant)), TTR_bryant)\nprint('Number of tokens, Vocabulary, Type-token ratio (Jane Austen Emma) = ', len(words_emma), len(set(words_emma)), TTR_emma)\n",
"3826 7079\nNumber of tokens, Vocabulary, Type-token ratio (Bryant stories) = 15000 2796 0.1864\nNumber of tokens, Vocabulary, Type-token ratio (Jane Austen Emma) = 15000 3274 0.21826666666666666\n"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
]
] |
d07c12b8804d1efb5efe70765acb40b340a18d2b | 954,124 | ipynb | Jupyter Notebook | bitcoin_script_load_data_postgres.ipynb | himanshudce/Realtime-Database-with-Firebase | 8baf547097a84c96e883ee8f9b5cceda431a8ddb | [
"MIT"
] | null | null | null | bitcoin_script_load_data_postgres.ipynb | himanshudce/Realtime-Database-with-Firebase | 8baf547097a84c96e883ee8f9b5cceda431a8ddb | [
"MIT"
] | null | null | null | bitcoin_script_load_data_postgres.ipynb | himanshudce/Realtime-Database-with-Firebase | 8baf547097a84c96e883ee8f9b5cceda431a8ddb | [
"MIT"
] | 1 | 2021-12-11T16:14:14.000Z | 2021-12-11T16:14:14.000Z | 43.639041 | 138 | 0.657796 | [
[
[
"Import all libraries",
"_____no_output_____"
]
],
[
[
"import requests\nimport json\nimport time\nimport multiprocessing\nimport psycopg2",
"_____no_output_____"
]
],
[
[
"Functions for loading the data and counting rows in meanwhile",
"_____no_output_____"
]
],
[
[
"# Postgres\ndef load_data_postgres(data):\n start = time.time()\n for line in data:\n response_insert = requests.post('http://localhost:3001/postgres/write/crypto',json=line).text\n end = time.time()\n print(\"postgres\",end-start)\n\n# Firebase\ndef load_data_firebase(data):\n start = time.time()\n prev_time = start\n for line in data:\n response_insert = requests.post('http://localhost:3001/firebase/write/crypto',json=line).text\n end = time.time()\n print(\"firebase\",end-start)\n\ndef time_measurement(num,l):\n start = time.time()\n prev_time = start\n query_list_stat_all=[]\n for i in range(num):\n time.sleep(20)\n time_now = time.time()-start\n # print or write to log here\n prev_time = time.time() \n response_write = int(requests.get('http://localhost:3001/postgres/read/crypto').text)\n #running queries after 20 sec\n quer_stat = query_postgres(l)\n quer_stat.append(response_write)\n time_now = time.time()-start\n quer_stat.append(time_now)\n \n print(\"No of rows {} and time {}\".format(response_write,time_now))\n print(\"query_stats\",quer_stat)\n print(\"\\n\\n\")\n query_list_stat_all.append(quer_stat)\n \n return query_list_stat_all",
"_____no_output_____"
],
[
"def query_postgres(query_list):\n time_list = []\n# row_count = 0\n q_all_start = time.time() \n for i,que in enumerate(query_list):\n temp = {}\n temp['query'] = que\n start = time.time()\n response_query = requests.post('http://localhost:3001/postgres/query/crypto',json=temp)\n temp[\"id\"] = i+1\n# if i==0:\n# print(response_query)\n# row_count = json.loads(response_query.text)['rowCount']\n end = time.time()\n temp[\"query_time\"] = end-start\n# temp['row_count'] = row_count\n time_list.append(temp)\n \n q_all_end = time.time()\n \n final_list = [time_list,(q_all_end-q_all_start)]\n# time_list.append(q_all_end-q_all_start)\n# temp={}\n# temp['all_query_time'] = \n# time_list.append(temp)\n print(\"time taken to run all the queries\", q_all_end-q_all_start)\n return final_list\n \n\ndef query_firebase(query_list):\n \n q_all_start = time.time()\n for i,que in enumerate(query_list):\n temp = {}\n temp['query'] = que\n start = time.time()\n # post request \n response_query = requests.post('http://localhost:3001/firebase/query/crypto',json=temp)\n print(response_query)\n \n end = time.time()\n print(\"time taken for query {} is {}\".format(i+1,(end-start)))\n \n q_all_end = time.time()\n print(\"time taken to run all the queries\", q_all_end-q_all_start)",
"_____no_output_____"
],
[
"# l = [\"\"\"select count(*) from crypto_tab;\"\"\",\n# \"\"\" select * from crypto_tab order by bitcoin_info->>'Date' DESC limit 100;\"\"\",\n# \"\"\"select count(*) from crypto_tab group by bitcoin_info->>'Symbol';\"\"\",\n# \"\"\"select * from crypto_tab where (bitcoin_info->>'Volume')::float > 2;\"\"\"\n# \"\"\"select * from crypto_tab where bitcoin_info->>'Date' = '2018-12-07 21:52:00';\"\"\",\n# \"\"\"select * from crypto_tab where (bitcoin_info->>'High')::float > 3800 and bitcoin_info->>'Symbol' = 'BTCUSD';\"\"\"\n# ]\n# query_postgres(l)",
"_____no_output_____"
],
[
"l = [\"\"\"select count(*) from crypto_tab;\"\"\",\n \"\"\" select * from crypto_tab order by bitcoin_info->>'Date' DESC limit 100;\"\"\",\n \"\"\"select * from crypto_tab where bitcoin_info->>'Date' = '2020-12-31 21:52:00';\"\"\",\n \"\"\"select * from crypto_tab where (bitcoin_info->>'High')::float > 3800 and (bitcoin_info->>'High')::float < 4500; \"\"\",\n \"\"\"select * from crypto_tab where (bitcoin_info->>'Volume')::float > 2;\"\"\"\n ]\n# query_postgres(l)",
"_____no_output_____"
]
],
[
[
"Select which data to load",
"_____no_output_____"
]
],
[
[
"# # BITCOIN DATA\n# json_file_path= './data/bitcoin_data_sample.json'\n# with open(json_file_path) as json_file:\n# bitcoin_data = json.load(json_file)",
"_____no_output_____"
],
[
"# # ETHEREUM DATA\n# json_file_path= 'data/trajectory_data_60k.json'\n# with open(json_file_path) as json_file:\n# ethereum_data = json.load(json_file)",
"_____no_output_____"
],
[
"# # LITECOIN DATA\n# json_file_path= './bitcoin_merged_data.json'\n# with open(json_file_path) as json_file:\n# data = json.load(json_file)",
"_____no_output_____"
],
[
"# crpto DATA\n# json_file_path= './data/bitcoin_merged_data.json'\n# with open(json_file_path) as json_file:\n# crypto_data = json.load(json_file)",
"_____no_output_____"
],
[
"# crypto_updated = crypto_data*2",
"_____no_output_____"
],
[
"load_data_postgres(crypto_data)",
"postgres 401.75011825561523\n"
],
[
"# p1 = multiprocessing.Process(target=load_data_postgres, args=(crypto_data, ))\n# p2 = multiprocessing.Process(target=load_data_postgres, args=(bitcoin_data, ))\n# p3 = multiprocessing.Process(target=load_data_postgres, args=(bitcoin_data, ))\n# p4 = multiprocessing.Process(target=time_measurement, args=(len(bitcoin_data), ))\n\n# # starting process 1\n# p1.start()\n# # starting process 2\n# p2.start()\n# # starting process 3\n# p3.start()\n# # starting process 4\n# p4.start()\n\n# # wait until process 1 is finished\n# p1.join()\n# # wait until process 2 is finished\n# p2.join()\n# # wait until process 3 is finished\n# p3.join()\n# # wait until process 4 is finished\n# p4.join()",
"_____no_output_____"
],
[
"# load_data_postgres(ethereum_data)",
"_____no_output_____"
],
[
"# time_list = query_postgres(l)",
"_____no_output_____"
],
[
"# time_measurement(len(data),l)",
"_____no_output_____"
]
],
[
[
"Run process for Firebase",
"_____no_output_____"
]
],
[
[
"\np1 = multiprocessing.Process(target=load_data_postgres, args=(bitcoin_data, ))\np2 = multiprocessing.Process(target=load_data_postgres, args=(bitcoin_data, ))\np3 = multiprocessing.Process(target=load_data_postgres, args=(bitcoin_data, ))\np4 = multiprocessing.Process(target=time_measurement, args=(len(bitcoin_data), ))\n\n# starting process 1\np1.start()\n# starting process 2\np2.start()\n# starting process 3\np3.start()\n# starting process 4\np4.start()\n\n# wait until process 1 is finished\np1.join()\n# wait until process 2 is finished\np2.join()\n# wait until process 3 is finished\np3.join()\n# wait until process 4 is finished\np4.join()",
"_____no_output_____"
]
],
[
[
"Run process for PostgreSQL",
"_____no_output_____"
]
],
[
[
"load_data_postgres(data)",
"postgres 0.7907586097717285\n"
],
[
"# BITCOIN DATA\njson_file_path= './data/trajectory_data_60k.json'\nwith open(json_file_path) as json_file:\n data = json.load(json_file)",
"_____no_output_____"
],
[
"len(data)",
"_____no_output_____"
],
[
"sql = \"\"\"insert into trajectory_path(path_info) values(%s)\"\"\"\n\n\n \n\nsql_count = \"\"\"select count(*) from trajectory_path\"\"\"\n\n\n \n\n\nstart_time = time.time()\nprev_time = start_time\nprv_count = 0\n\n\nwith psycopg2.connect(host='localhost', port=5432, database='applicationdb', user='postgres', password='4258') as conn:\n\n with conn.cursor() as cur:\n\n\n \n\n # q_all_start = time.time()\n\n # for i,que in enumerate(l):\n # temp = {}\n # temp['query'] = que\n # start = time.time()\n\n # # post request <<\n # cur.execute(que)\n # # post request >>\n\n # end = time.time()\n # print(\"time taken for query {} is {}\".format(i+1,(end-start)))\n\n # q_all_end = time.time()\n # print(\"time taken to run all the queries\", q_all_end-q_all_start)\n\n\n #start_time = time.time()\n print(start_time)\n counter = 0\n for line in data:\n line = str(line)\n line = line.replace('\\'','\\\"')\n try:\n x = cur.execute(sql, (line,))\n except (Exception, psycopg2.DatabaseError) as error:\n print(error)\n print(time.time()-start_time)\n continue\n\n\n \n\n dt = time.time() - prev_time\n if dt > 1:\n # print or write to log here\n prev_time = time.time()\n cur.execute(sql_count)\n result = cur.fetchone()\n print(result[0])\n\n\n conn.commit()\n\n # except (Exception, psycopg2.DatabaseError) as error:\n # print(error)\n # print(time.time()-start_time)\n end_time = time.time()\n\n\n \n\n print(end_time)\n print(\"time taken to load the data is {} seconds\".format(end_time-start_time))",
"1637452712.3010046\n2058\n4605\n7130\n9652\n12216\n14725\n17256\n19806\n22350\n24918\n27493\n30004\n32538\n35064\n37589\n40129\n42657\n45291\n47736\n50190\n52728\nmissile\nCONTEXT: PL/pgSQL function process_dist_calculation() line 4 at RAISE\n\n21.097785234451294\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.098284006118774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.09968662261963\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.100196361541748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.101358890533447\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.102198123931885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.102198123931885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.103246450424194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.104194402694702\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.10523796081543\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.106194257736206\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.106194257736206\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.10735774040222\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.10835909843445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.10835909843445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.109365463256836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.110358238220215\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n
aborted, commands ignored until end of transaction block\n\n21.495038509368896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.496001720428467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.4969961643219\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.49815607070923\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.50003218650818\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.50099754333496\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.502007246017456\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.502996921539307\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.503058433532715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.504002571105957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.506002187728882\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.50699257850647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.50699257850647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.50823974609375\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.509299755096436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.509608507156372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.510585069656372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.5113046169281\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.512262105941772\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n21.513263940811157\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.514359951019287\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.515374183654785\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.515624046325684\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.516271591186523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.51729917526245\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.51858377456665\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.519259214401245\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.52026891708374\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.521299600601196\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.521299600601196\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.52230143547058\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.523484468460083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.524298667907715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.524298667907715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.52525568008423\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.5262668132782\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.52726459503174\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.528300523757935\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.52926206588745\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n21.52926206588745\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.530298709869385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.53130078315735\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.532261848449707\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.53327226638794\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.53445553779602\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.535258769989014\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.536307096481323\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.536561727523804\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.537299394607544\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.538389682769775\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.539388418197632\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.54026436805725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.54145622253418\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.542261600494385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.54326295852661\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.54426622390747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.544395923614502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.54526424407959\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n21.54637598991394\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.547277688980103\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.548477172851562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.549275636672974\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55026340484619\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55128049850464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55149531364441\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.552305698394775\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55331039428711\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.554269075393677\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.555256843566895\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.556262493133545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55729913711548\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.55729913711548\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.558632135391235\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.559301376342773\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.56025719642639\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.561272382736206\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.56225895881653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.563299655914307\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n21.564299821853638\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.564299821853638\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.56529974937439\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.56649136543274\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.567461252212524\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.568260431289673\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.568260431289673\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.56931447982788\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.570889472961426\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.571263790130615\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.572267055511475\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.572267055511475\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.573298931121826\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.574263334274292\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.57630228996277\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.57738947868347\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.578558444976807\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.579299449920654\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.580557346343994\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n21.581257343292236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.582270622253418\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.583266973495483\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.584299564361572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.585315942764282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.585315942764282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.586259126663208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.587299823760986\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.588299989700317\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.589276552200317\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.590256214141846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.592259407043457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.593260526657104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.593260526657104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.595260858535767\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.596268892288208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.596268892288208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.59826374053955\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.599265575408936\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n21.599265575408936\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.60030436515808\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.60129976272583\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.602269172668457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.604262590408325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.604262590408325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.605559587478638\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.606263399124146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.60726284980774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.60726284980774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.608299016952515\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.609308004379272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.610313177108765\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.611262798309326\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.61236548423767\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.613269090652466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.613269090652466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.61430025100708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.61526346206665\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.616267681121826\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n21.618260383605957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.619357109069824\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.62126874923706\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.62126874923706\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.622267484664917\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.624265670776367\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.624265670776367\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.626262664794922\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.626317024230957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.627270460128784\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.62825846672058\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.62929892539978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.62929892539978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.630300521850586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.63127326965332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.632615089416504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.63325548171997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.63429546356201\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.635263442993164\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n21.635525703430176\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.636515617370605\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.637351751327515\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.638275146484375\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.639302492141724\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64047932624817\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64130210876465\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64130210876465\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.642334461212158\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64325499534607\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.644261598587036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.645554304122925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64626431465149\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64726233482361\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64826536178589\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64826536178589\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.64937472343445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.650465488433838\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.651259183883667\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.652257680892944\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n21.653314352035522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.654300451278687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.65528106689453\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.65528106689453\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.656269550323486\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.6573383808136\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.658260583877563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.659531354904175\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.660300254821777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.66130018234253\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.66158413887024\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.66230058670044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.663404941558838\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.66426157951355\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.665261030197144\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.666268587112427\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.66738271713257\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.668259382247925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.669299840927124\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n21.669299840927124\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.670299530029297\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.671257495880127\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.67227077484131\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.673603057861328\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.674766778945923\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.675257921218872\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.676265478134155\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.677301168441772\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.677550792694092\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.678263902664185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.679336547851562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.680264234542847\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.68125629425049\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.68125629425049\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.682302474975586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.683299779891968\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.68429970741272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.685261487960815\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n21.685261487960815\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.687371492385864\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.688256978988647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.68930220603943\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.689534664154053\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.69049620628357\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.69129991531372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.692268133163452\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.693275451660156\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.694257497787476\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.694257497787476\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.695300340652466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.69630455970764\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.69726324081421\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.69726324081421\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.698406219482422\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.699259996414185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.700281381607056\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.701258659362793\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.702301025390625\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n21.70326066017151\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.704301118850708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.705263137817383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.705263137817383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.706302404403687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.707306146621704\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.708309650421143\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.70954442024231\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.710299968719482\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.71126127243042\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.71126127243042\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.713056325912476\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.71325707435608\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.715256452560425\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.715256452560425\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.716742038726807\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.7173011302948\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.718271017074585\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n21.719301462173462\ncurrent transaction is aborted, commands ignored until end of 
is aborted, commands ignored until end of transaction block\n\n22.06630277633667\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.067267417907715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.068259239196777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.069401025772095\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07026171684265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.071456909179688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07230019569397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07230019569397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.073625802993774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07425856590271\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.075268030166626\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.076271295547485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07729935646057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.07828998565674\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.079299688339233\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.079299688339233\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.080302953720093\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.081259965896606\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.083261966705322\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n22.083261966705322\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.084299087524414\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.08554434776306\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.086339235305786\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.087300300598145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.087300300598145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.088307857513428\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.09026336669922\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.091301441192627\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.092267990112305\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.092267990112305\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.09363341331482\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.094268321990967\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.095317125320435\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.09630298614502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.097290754318237\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.097290754318237\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.098491430282593\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.09925866127014\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.100301265716553\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.101261615753174\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.102259397506714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.103307485580444\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.10426354408264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.10426354408264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.10569953918457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.106300592422485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.10726308822632\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.10726308822632\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.108545780181885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.109492301940918\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.110487937927246\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.111484050750732\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.111484050750732\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.11251711845398\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.11352229118347\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.11449146270752\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.11547875404358\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.116488456726074\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n22.11750817298889\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.1184823513031\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.119496822357178\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.120522022247314\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.12152338027954\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.12152338027954\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.12247920036316\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.123493432998657\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.12466335296631\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.125487327575684\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.126482009887695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.126482009887695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.127527713775635\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.128522157669067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.130492448806763\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.13153386116028\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.132482528686523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.132482528686523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.133673667907715\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.1347918510437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.13552212715149\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.136483907699585\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.137483835220337\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.13865089416504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.14048194885254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.141481161117554\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.141481161117554\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.142478942871094\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.143563508987427\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.14448118209839\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.146480560302734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.146480560302734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.147478103637695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.148483276367188\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.14948582649231\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.15056848526001\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.151480674743652\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.15247941017151\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.154479026794434\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n22.155482053756714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.1564838886261\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.1564838886261\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.15748143196106\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.158493757247925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.15956139564514\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.160912036895752\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.161627769470215\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.162484169006348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.162811040878296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.163912296295166\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.16448426246643\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.165483713150024\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.16673994064331\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.16748309135437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.16848134994507\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.16848134994507\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.169729471206665\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.170802116394043\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.171489477157593\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.172497034072876\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.173809051513672\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.174522876739502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.17549228668213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.176482915878296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.177483081817627\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.178485870361328\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.17948865890503\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.180480003356934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.182481050491333\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.18348526954651\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.184497594833374\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.18549084663391\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.18651008605957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.187484741210938\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.188488245010376\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.188690662384033\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.18952465057373\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.190491199493408\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n22.19149088859558\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.192481517791748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.19348955154419\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.194527626037598\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.195486783981323\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.195486783981323\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.196574211120605\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.197522163391113\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.19852590560913\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.1995267868042\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.201481580734253\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.201481580734253\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.20248508453369\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.203529834747314\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.20452308654785\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.205524682998657\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.206136465072632\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.206523656845093\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.207484245300293\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n22.208524465560913\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.20954942703247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.21052312850952\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.21052312850952\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.211478233337402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.212526559829712\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.213489055633545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.2145357131958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.215485095977783\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.215485095977783\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.216749668121338\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.217530012130737\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.218522548675537\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.219524145126343\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.219629526138306\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.220479488372803\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.222489595413208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.222489595413208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.22364115715027\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.22448205947876\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n22.2255220413208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.226524353027344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.226524353027344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.22913432121277\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.22948145866394\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.230480909347534\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.231523513793945\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.232523202896118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.232523202896118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.233500480651855\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.234493255615234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.23548460006714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.236788034439087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.237508535385132\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.23852252960205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.239485263824463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.240477800369263\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.241493225097656\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.242493629455566\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n22.242493629455566\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.24352192878723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.244577884674072\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.245522260665894\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.245522260665894\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.24648904800415\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.247480869293213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.248488903045654\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.24948811531067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.24948811531067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.250521898269653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.251723527908325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.252521991729736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.252521991729736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.253488779067993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.25448751449585\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.256498336791992\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.25748324394226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.25748324394226\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.258820295333862\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.259629249572754\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.260528564453125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.260528564453125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.261653184890747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.26248288154602\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.263484477996826\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.263882160186768\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.26547861099243\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.265710830688477\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.26652240753174\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.267524003982544\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.26847791671753\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.269489526748657\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.27048921585083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.27048921585083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.271522521972656\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.272855043411255\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.27352285385132\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.274522304534912\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n22.27557134628296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.276491165161133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.277626991271973\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.278486251831055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.278486251831055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.279485940933228\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.280659914016724\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.281522274017334\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.28248357772827\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.28248357772827\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.284488916397095\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.284488916397095\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.285524368286133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.286526203155518\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.287484645843506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.288485288619995\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.288485288619995\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.28948163986206\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.290480375289917\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n22.291483402252197\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.291483402252197\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.293195247650146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.293477773666382\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.294490814208984\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.295522928237915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.295522928237915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.296482801437378\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.2975332736969\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.298710584640503\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.299526691436768\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.300108909606934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.300524473190308\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.30152416229248\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.302632808685303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.303526878356934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.304484844207764\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.30552864074707\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.30570936203003\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.306522369384766\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.307523250579834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.30837106704712\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.309433221817017\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.30992341041565\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.31039571762085\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.311448574066162\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.313395261764526\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.313395261764526\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.31442666053772\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.315465927124023\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.31639575958252\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.31639575958252\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.318408966064453\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.319393396377563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.319393396377563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.32102084159851\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.321393489837646\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.322402477264404\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.323433876037598\ncurrent transaction 
end of transaction block\n\n22.6577889919281\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.6577889919281\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.65879511833191\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.659816026687622\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.661064624786377\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.661789417266846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.662802696228027\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.663795709609985\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.663795709609985\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.665762662887573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.666894674301147\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.66787052154541\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.668790102005005\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.668790102005005\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.669837951660156\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67086887359619\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67174768447876\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.672749757766724\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.6739399433136\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.674752235412598\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67579436302185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67579436302185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67674970626831\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.67775321006775\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.678757429122925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.678757429122925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.679790496826172\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.680857181549072\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.681793451309204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.682799577713013\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.683794021606445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.683794021606445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.684789180755615\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.685749292373657\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.686805725097656\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.687747955322266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.688745737075806\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.688745737075806\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.689789295196533\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n22.690845251083374\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.69198703765869\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.69198703765869\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.69282364845276\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.693801403045654\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.694749355316162\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.69579029083252\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.696951627731323\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.697747945785522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.698758363723755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.699008464813232\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.70077681541443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.701797485351562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.703518867492676\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.703812837600708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.704790830612183\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.70574641227722\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.70674991607666\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.70775008201599\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n22.70879054069519\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.70879054069519\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.709796667099\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.711790561676025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.712746381759644\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.71374821662903\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.715749502182007\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.715749502182007\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.716752529144287\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.71803045272827\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.718834161758423\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.719942331314087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.720759868621826\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.72275400161743\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.723748922348022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.724748373031616\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.724748373031616\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.725751161575317\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.726746082305908\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.727749586105347\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.7291738986969\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.72974944114685\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.73074746131897\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.73182511329651\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.73274803161621\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.733746767044067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.734748601913452\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.735785722732544\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.73674726486206\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.737756729125977\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.738747596740723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.73974871635437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.74074673652649\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.74074673652649\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.74276900291443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.74378514289856\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.744747638702393\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.745747804641724\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.746748685836792\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n22.74698233604431\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.74875044822693\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.749878406524658\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.751749515533447\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.752010583877563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.753760814666748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.754758596420288\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.755751848220825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.75775384902954\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.759754180908203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.759754180908203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.760753870010376\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.76176166534424\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.762895345687866\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.763914585113525\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.764799118041992\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.765798330307007\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.76679277420044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.76679277420044\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n22.76775884628296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.76874542236328\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.769750118255615\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.770901441574097\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.771777391433716\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.77275824546814\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.77375102043152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.77375102043152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.774925231933594\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.775749444961548\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.776758193969727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.777754306793213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.777754306793213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.778791904449463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.77979040145874\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.780794858932495\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.78178882598877\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.78178882598877\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.78283953666687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.783774614334106\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n22.78487253189087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.785749197006226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.78679060935974\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.787856340408325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.788751363754272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.788751363754272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.78975248336792\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.790791034698486\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.791749477386475\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.792751789093018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.792751789093018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.793956518173218\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.79479217529297\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.79582715034485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.796748876571655\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.79775595664978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.798746824264526\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.798746824264526\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.800788640975952\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n22.800788640975952\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.8017897605896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.8027560710907\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.804757595062256\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.80575728416443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.80676555633545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.807751893997192\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.808796405792236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.808796405792236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.810303688049316\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.81075119972229\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.812747955322266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.812747955322266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.813879013061523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.814747095108032\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.815967082977295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.816940784454346\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.817776203155518\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.817776203155518\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.8188054561615\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.8197922706604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.82076668739319\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.821884870529175\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.823023319244385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.823752880096436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.82474970817566\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.82575249671936\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.826791763305664\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.826791763305664\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.827790021896362\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.828937768936157\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.829794883728027\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.830744743347168\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.830744743347168\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.83175492286682\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.83280062675476\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.833752393722534\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.834819793701172\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.83579158782959\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n22.836790084838867\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.836790084838867\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.838112592697144\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.838751792907715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.840753316879272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.840753316879272\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.8422212600708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.84279179573059\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.843751907348633\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.84478998184204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.84574842453003\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.84574842453003\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.847118854522705\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.847962617874146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.848796129226685\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.849751234054565\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.849751234054565\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.85090684890747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.851752042770386\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n22.852762460708618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.85402774810791\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.85475206375122\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.855947256088257\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.856791019439697\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.856791019439697\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.85775589942932\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.858882188796997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.859763383865356\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.860787868499756\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.86179494857788\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.86179494857788\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.862788677215576\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.863789558410645\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.864789485931396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.864789485931396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.86675190925598\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.867751598358154\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.86874794960022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.869750499725342\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n22.869750499725342\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.870789527893066\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.872023820877075\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.872758626937866\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.87375497817993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.87375497817993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.874792337417603\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.876195669174194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.87679624557495\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.877750158309937\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.87878918647766\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.87975239753723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.88075351715088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.881752729415894\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.882789134979248\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.882789134979248\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.88379192352295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.885096788406372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.885096788406372\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n22.885788917541504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.886789798736572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.888983726501465\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.889784574508667\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89079189300537\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89079189300537\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.891794443130493\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89311122894287\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.893746614456177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89475131034851\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89588737487793\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89674735069275\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.89674735069275\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.897794246673584\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.898969411849976\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.899808168411255\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.90075707435608\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.901750087738037\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n22.90374755859375\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n22.90475630760193\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n[output truncated: the same PostgreSQL error "current transaction is aborted, commands ignored until end of transaction block" repeats verbatim with elapsed-time stamps running from ~22.90 s to ~23.24 s]
block\n\n23.243788957595825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.24519395828247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.245789051055908\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.245789051055908\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.246901035308838\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.248809814453125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.249747276306152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.249747276306152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.250791549682617\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.251791954040527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.251791954040527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.252788305282593\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.25378918647766\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.254755973815918\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.255804777145386\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.256746530532837\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.257747650146484\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.257747650146484\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.25879192352295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.259753942489624\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.260788917541504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.261753797531128\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.261753797531128\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.26420831680298\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.264791011810303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.26574969291687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.266748428344727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.2679123878479\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.268794059753418\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.26975417137146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.270751237869263\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.270751237869263\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.27230739593506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.272789239883423\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.273752450942993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.27475142478943\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.2757887840271\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.27676272392273\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.278747081756592\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n23.27978801727295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.27978801727295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.280752182006836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.282751083374023\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.282751083374023\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.284745931625366\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.285751342773438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.28678798675537\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.28696298599243\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.28779125213623\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.2890305519104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.28979229927063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.29177188873291\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.29275131225586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.293922424316406\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.294747591018677\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.295747756958008\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.295747756958008\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.297805309295654\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.298073768615723\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.298744440078735\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.29995059967041\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.300806522369385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.301748037338257\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.30275058746338\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.30275058746338\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.304757833480835\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.304757833480835\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.305747747421265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.30679941177368\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.307748317718506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.30797839164734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.308748722076416\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.30986475944519\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.310749292373657\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.311763048171997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.312747955322266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.313992738723755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.314752101898193\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n23.31575107574463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.31575107574463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.317057609558105\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.317747354507446\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.31974697113037\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.320756673812866\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.320756673812866\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.321794748306274\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.322750091552734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.324037551879883\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.324746131896973\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.325755834579468\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.326807022094727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.327789306640625\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.327789306640625\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.328909158706665\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.329941034317017\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.330745458602905\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.331850051879883\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n23.332757711410522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.333796739578247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.33478832244873\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.33478832244873\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.335922241210938\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.336926221847534\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.337966203689575\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.33875346183777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.3398916721344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.34079122543335\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.34174656867981\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.342746257781982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.342746257781982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.344082593917847\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.344866514205933\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.345746994018555\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.347278118133545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.3477463722229\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.348745584487915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.34984278678894\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n23.350743532180786\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.350743532180786\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.35174870491028\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.353745460510254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.353745460510254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.35474467277527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.355788946151733\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.356791496276855\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.356791496276855\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.357788562774658\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.358850717544556\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.35975193977356\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.360780000686646\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.360982418060303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.361791372299194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.36284351348877\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.36379098892212\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.36379098892212\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.364811182022095\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n23.365745782852173\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.366756200790405\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.367745876312256\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.368751287460327\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.369789361953735\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.369789361953735\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.370789527893066\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.371790647506714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.372753381729126\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.37374711036682\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.374804258346558\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.375749588012695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.375749588012695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.376750707626343\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.377933502197266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.37878942489624\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.379748821258545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.379748821258545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.381752014160156\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n23.381752014160156\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.383026361465454\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.383887767791748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.384759187698364\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.385789394378662\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.385964393615723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.386749267578125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.388750553131104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.389752626419067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.389752626419067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.390801906585693\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.39191722869873\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.392752170562744\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.39375066757202\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.394757747650146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.39575695991516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.396751165390015\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.396751165390015\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.39779758453369\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.398756742477417\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.399752855300903\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4007568359375\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.401752710342407\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.402759075164795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.402759075164795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.403793573379517\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.404797315597534\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.405760049819946\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.40675163269043\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.40775156021118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.40875768661499\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.40974736213684\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.410788774490356\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.410788774490356\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41189980506897\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41284441947937\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41375708580017\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.414782762527466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.415767431259155\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n23.416789054870605\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41778826713562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41808581352234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.41879105567932\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.419748544692993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.420835733413696\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.421749353408813\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.42276096343994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.423750400543213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.425054788589478\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4257493019104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4257493019104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.426788568496704\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.427753686904907\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4287531375885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4287531375885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.429794788360596\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.43075156211853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.431997060775757\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.432788372039795\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.433788776397705\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.433788776397705\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.434854984283447\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.435900926589966\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.436781644821167\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.437790155410767\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.4387526512146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.439934253692627\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.440937042236328\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.441749095916748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.441749095916748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.442754983901978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.443770170211792\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.44482970237732\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.44482970237732\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.446750164031982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.446750164031982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.44789171218872\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.448745727539062\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n23.448745727539062\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.45179796218872\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.452757358551025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.453757286071777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.454755783081055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.454755783081055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.45574688911438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.456748485565186\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.45775604248047\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.458857536315918\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.45975160598755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.460750818252563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.460750818252563\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.46184277534485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.462754726409912\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.46376132965088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.464747667312622\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.465994358062744\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.466790199279785\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n23.46774935722351\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.468793153762817\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.46980571746826\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.470752000808716\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.471765756607056\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.472748517990112\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.473748683929443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.474752187728882\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.47579073905945\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.47579073905945\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.477753400802612\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.47876501083374\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.479748249053955\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.480748176574707\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.480748176574707\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.481746673583984\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.482841730117798\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.483744859695435\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.48474884033203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.48474884033203\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n23.485748529434204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n[identical error output truncated: the same PostgreSQL "current transaction is aborted" message repeats with successive timestamps from ~23.49 up to ~23.83]
aborted, commands ignored until end of transaction block\n\n23.829897165298462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.830939054489136\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83174467086792\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83274531364441\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83274531364441\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.833784580230713\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.834757804870605\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.835784435272217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.835784435272217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83678650856018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.837743997573853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83875346183777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.83974266052246\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.840744495391846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.841748476028442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.842752933502197\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.843815803527832\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.844833612442017\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.845746278762817\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n23.847742795944214\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.848744869232178\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.84974503517151\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.850744009017944\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.85174822807312\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.85174822807312\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.853748321533203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.854747772216797\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.85574769973755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.85574769973755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.85690712928772\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.857789278030396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.8587486743927\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.859747886657715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.860748529434204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.86174702644348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.86174702644348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.86278462409973\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.864033222198486\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.86485719680786\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.865748643875122\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.865748643875122\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.867743968963623\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.868744611740112\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.869744062423706\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.870748043060303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.871748447418213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.87207269668579\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.872748136520386\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.874741792678833\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.874741792678833\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.876023054122925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.87675404548645\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.87784218788147\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.878743886947632\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.879750967025757\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.880746364593506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.88274073600769\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.88274073600769\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n23.88391351699829\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.885117292404175\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.885849714279175\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.88674807548523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.88674807548523\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.88778781890869\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.889191389083862\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.889789581298828\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.890753507614136\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.891955614089966\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.892791509628296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.893749952316284\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.894906044006348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.895958423614502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.89674401283264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.897443056106567\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.897786378860474\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.899046659469604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.899786949157715\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n23.900749683380127\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.901759386062622\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.90274453163147\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.90375328063965\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.90375328063965\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.904785871505737\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.90587568283081\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.907055139541626\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.90782856941223\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.909751892089844\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.91078495979309\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.911083459854126\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.91178560256958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.913022756576538\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.9137864112854\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.914745092391968\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.916749238967896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.916749238967896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.917784929275513\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.918936491012573\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n23.919790744781494\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.920784950256348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.920784950256348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.92174744606018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.92374610900879\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.924747228622437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.925785779953003\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.925785779953003\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.927387714385986\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.927793502807617\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.92875099182129\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.929901123046875\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.931747913360596\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.931747913360596\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.93274450302124\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.933785915374756\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.934748649597168\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.935750007629395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.936794757843018\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n23.93708825111389\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.937785387039185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.93928575515747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.940112590789795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.9407856464386\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.941760063171387\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.941760063171387\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.942756175994873\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.943798065185547\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.944787979125977\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.94590950012207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.946746587753296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.947874784469604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.948885917663574\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.949756145477295\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95080876350403\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.952167510986328\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95308232307434\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95375943183899\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95474624633789\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.95474624633789\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95574450492859\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.956751346588135\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.9580078125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.95879817008972\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.959749221801758\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.960753202438354\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96175503730774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96274709701538\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.962884664535522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96375346183777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96474838256836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96578598022461\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.966840267181396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96776008605957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96776008605957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.96925115585327\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.969825267791748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.97074866294861\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n23.971840620040894\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.972792625427246\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.973468780517578\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.973785400390625\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.974889516830444\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.97575569152832\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.97684144973755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.977757453918457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.979785203933716\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.979785203933716\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.9807448387146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.981752634048462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.98278570175171\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.98278570175171\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.983749389648438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.984750509262085\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.985989332199097\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.987064838409424\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.987788438796997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.988787174224854\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n23.989886045455933\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.99074625968933\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.99174666404724\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.99321699142456\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.99378514289856\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.994749546051025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.994749546051025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.99578619003296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.996755123138428\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.997879028320312\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.998755931854248\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n23.999788999557495\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.00078558921814\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.001790285110474\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.002748489379883\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.003787994384766\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.003787994384766\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.004748344421387\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.005777835845947\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n24.007750272750854\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.008787870407104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.009089708328247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.010006189346313\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01096749305725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01096749305725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.011969089508057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.014020204544067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01598024368286\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01614737510681\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01700448989868\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.01816201210022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.018967151641846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.019965648651123\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.02096700668335\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.021963119506836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.02206325531006\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.02296781539917\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.024195194244385\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.024972915649414\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.025969982147217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.027361154556274\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.02796983718872\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.029006719589233\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.029972314834595\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.03096842765808\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.03096842765808\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.031972646713257\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.03445029258728\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.03496503829956\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.035962343215942\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.037084579467773\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.03796911239624\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.038963794708252\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.0399751663208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.040977478027344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.04198122024536\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.043172359466553\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.043971300125122\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n24.044999837875366\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.04595923423767\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.04697322845459\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.047977209091187\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.047977209091187\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.049003839492798\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.049972534179688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.051098585128784\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.052016496658325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.052016496658325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.05296039581299\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.053961038589478\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.05500054359436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.056973457336426\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.05728530883789\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.05801033973694\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.059006214141846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.059961318969727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.059961318969727\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.06198000907898\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.062968730926514\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.063968181610107\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.064964771270752\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.064964771270752\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.066001653671265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.067975759506226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.0689640045166\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.07025980949402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.071970462799072\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.07296919822693\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.07296919822693\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.073962688446045\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.075321435928345\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.07615566253662\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.077077627182007\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.077967882156372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.077967882156372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.07941460609436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.0800039768219\ncurrent 
aborted, commands ignored until end of transaction block\n\n24.414266347885132\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.414266347885132\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.415263652801514\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.415263652801514\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.416313648223877\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.417290210723877\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.418302297592163\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.418302297592163\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.41928005218506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.41928005218506\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.420263290405273\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.420263290405273\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.42133617401123\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.42326593399048\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.424264907836914\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.425271034240723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.426262855529785\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.427265882492065\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.428264617919922\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.428264617919922\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.429263591766357\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.430266857147217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.430266857147217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.431330680847168\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.432266235351562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.432266235351562\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.43336820602417\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.434596300125122\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.435264587402344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.435264587402344\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.436267137527466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.437270164489746\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.437270164489746\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.438339233398438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.438339233398438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.439265251159668\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.439265251159668\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44026470184326\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.44026470184326\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.441266298294067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.441266298294067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.442265033721924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.443268060684204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.444278240203857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.444278240203857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.445504426956177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44626522064209\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44626522064209\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.447299003601074\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.447299003601074\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.447299003601074\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44830298423767\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44926691055298\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.44926691055298\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.450302362442017\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.451273441314697\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.452264070510864\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45326566696167\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n24.45432710647583\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45432710647583\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.455410480499268\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.456264972686768\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.456264972686768\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45730710029602\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45826745033264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45826745033264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45927095413208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.45927095413208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.460320234298706\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.461265325546265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.46226978302002\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.463269233703613\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.46426486968994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.46426486968994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.465310096740723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.466302394866943\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.466302394866943\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.46737766265869\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.46737766265869\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.468266487121582\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.469262838363647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.469262838363647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.471274614334106\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.471274614334106\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.472267627716064\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.47329545021057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.47380566596985\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.474266052246094\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.475269556045532\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.475269556045532\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.47630739212036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.47676944732666\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.477269172668457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.478262662887573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.47984790802002\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.480265378952026\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.481266021728516\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n24.481266021728516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48230266571045\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48230266571045\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.483264446258545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.483264446258545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48427438735962\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48427438735962\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.485286951065063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.485286951065063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.486345529556274\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.486345529556274\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48731780052185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48731780052185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48844575881958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48940110206604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.48940110206604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.49029779434204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.491427183151245\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.492267370224\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.493558406829834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.494266748428345\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.495264053344727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.495264053344727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.496264934539795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.496264934539795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.497262239456177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.497262239456177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.49827790260315\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.499279975891113\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.500271797180176\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.50131320953369\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.50131320953369\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.502268075942993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.502268075942993\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.503270387649536\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.503270387649536\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.504263162612915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.505268812179565\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.506273984909058\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.506273984909058\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.507265090942383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.50843858718872\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.509269952774048\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.510263919830322\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.510263919830322\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51126194000244\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.512269735336304\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51326608657837\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51326608657837\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.514355897903442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51526403427124\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51526403427124\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.516319036483765\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.516319036483765\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51745629310608\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.518266916275024\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51926326751709\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.51926326751709\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n24.5205295085907\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.521265506744385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.521265506744385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.522271633148193\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.522271633148193\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.523464918136597\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.52427339553833\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.52529525756836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.52529525756836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.526269912719727\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.527273416519165\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.528355836868286\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.529263496398926\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.5302631855011\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.5302631855011\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.531270742416382\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.531270742416382\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.533276796340942\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.534315586090088\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.53526496887207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.53526496887207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.536622047424316\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.537299156188965\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.537299156188965\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.538269758224487\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.538269758224487\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.53926706314087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.540271759033203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.540271759033203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.541272163391113\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.541272163391113\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.542263507843018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.542263507843018\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.54326820373535\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.544273376464844\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.544273376464844\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.546458959579468\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.5472629070282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.548274278640747\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n24.548274278640747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.549271821975708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.549271821975708\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55030655860901\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55030655860901\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.551405429840088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.551405429840088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55226492881775\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55226492881775\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.553267240524292\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55426788330078\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.554369926452637\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.555306434631348\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.556314945220947\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55727481842041\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55727481842041\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.558265447616577\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.558265447616577\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.55926537513733\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.56026601791382\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.56126379966736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.562283277511597\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.563488960266113\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.5644109249115\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.56526279449463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.56526279449463\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.566267013549805\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.5672709941864\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.568276405334473\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.56926703453064\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.56926703453064\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.570272207260132\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57126498222351\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57126498222351\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.572316884994507\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.572316884994507\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.573265075683594\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57426428794861\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57426428794861\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n24.57626485824585\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.577266931533813\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57826566696167\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57826566696167\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.57926630973816\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.58026647567749\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.58126449584961\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.58227229118347\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.58227229118347\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.583324909210205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.58459758758545\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.585264444351196\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.585264444351196\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.586270570755005\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.587269067764282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.588645935058594\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.589263677597046\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.59027647972107\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.591312646865845\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n24.591312646865845\ncurrent transaction is aborted, commands ignored until end of transaction block\n
transaction is aborted, commands ignored until end of transaction block\n\n24.90026354789734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.900421619415283\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.90139603614807\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.902263402938843\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.902263402938843\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.90326237678528\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.9052631855011\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.9052631855011\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.906264066696167\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.907262325286865\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.90854287147522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.910265684127808\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.910265684127808\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91126775741577\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91226553916931\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91226553916931\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.913264513015747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91450834274292\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.916263818740845\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n24.917264461517334\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91826367378235\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.918412923812866\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.91926598548889\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.920353889465332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.920382976531982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.922266721725464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.924487829208374\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.925583839416504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.926363229751587\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.927598476409912\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.928633213043213\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.929466009140015\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.930264711380005\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.930264711380005\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.931264877319336\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.93253254890442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.93326449394226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.934958696365356\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.936287879943848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.937276601791382\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.938263654708862\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.93948268890381\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.940263986587524\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.940263986587524\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.941264629364014\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.942473888397217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.94327163696289\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.944275379180908\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.945274591445923\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.946280241012573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.947264671325684\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.947935581207275\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.948264360427856\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.949265003204346\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.951273918151855\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.9532630443573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.954461812973022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.954512357711792\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n24.955263137817383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.956263303756714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.956263303756714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.95726203918457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.95726203918457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.958261728286743\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.958261728286743\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.959262132644653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.959262132644653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.960269927978516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.961276054382324\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.96326470375061\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.964285612106323\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.965264081954956\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.966267108917236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.966267108917236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.96726417541504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.96726417541504\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.968263626098633\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n24.968263626098633\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.970264196395874\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.970264196395874\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.97126317024231\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.97227144241333\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.973261833190918\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.973261833190918\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.974262714385986\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.974262714385986\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.975263595581055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.975263595581055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.97626256942749\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.97726535797119\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.97826862335205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.980297803878784\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.981278657913208\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.983336448669434\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.983336448669434\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.98526692390442\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n24.986289262771606\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.98640251159668\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.987296104431152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.98826265335083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.98826265335083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.989264011383057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.989264011383057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.990262269973755\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.991315603256226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.991315603256226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.992266178131104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.992266178131104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.994288206100464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.994288206100464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.995355367660522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.996267318725586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.996267318725586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.997262239456177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.997262239456177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.998265504837036\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n24.998265504837036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n24.99926471710205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.000568628311157\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.001261949539185\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.002268314361572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.002268314361572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.003270864486694\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.003270864486694\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.00426197052002\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.005262851715088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.005262851715088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.006484270095825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.007564067840576\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.008422374725342\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.00926399230957\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.010314464569092\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0112624168396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.01132845878601\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.012263774871826\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n25.014278650283813\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.014278650283813\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.015288829803467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.015288829803467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.016265392303467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.016265392303467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.017263412475586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.017263412475586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.018261909484863\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.01927399635315\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.020264863967896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.020264863967896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.02127432823181\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.02127432823181\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.022271156311035\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.023308038711548\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.024264097213745\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.02539849281311\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.02626943588257\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.02729344367981\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.029266119003296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.029266119003296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.030280828475952\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.03135871887207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.032264947891235\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.033267974853516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.034592151641846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.03526759147644\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.03626251220703\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0372633934021\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0372633934021\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.038533687591553\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0392644405365\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0392644405365\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.040263175964355\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.041268348693848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.04228663444519\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.04228663444519\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.043437957763672\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.044262647628784\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.044262647628784\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.045264959335327\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.045440196990967\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.046260833740234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.046260833740234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.047266721725464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.048476457595825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.049262285232544\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.049262285232544\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05026340484619\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05026340484619\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05126118659973\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05126118659973\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.052263259887695\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.052428483963013\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05326199531555\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05326199531555\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05463433265686\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n25.055262804031372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.055262804031372\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.056262016296387\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.05726194381714\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.058262586593628\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.058262586593628\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.059267044067383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06033706665039\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.061288356781006\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.061288356781006\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.062263011932373\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06327176094055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06327176094055\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.064292669296265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.064292669296265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06527352333069\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06527352333069\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06626319885254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06626319885254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.06726360321045\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n25.068264484405518\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.068264484405518\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.068264484405518\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.069262742996216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.069262742996216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.070287942886353\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.071017742156982\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.071274757385254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.072373867034912\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.073262214660645\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.073262214660645\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.074262380599976\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.07548999786377\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.076263904571533\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.076263904571533\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.076263904571533\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.077306509017944\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.078320026397705\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.078320026397705\ncurrent transaction is aborted, commands ignored 
until end of transaction block\n\n25.07926321029663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.07926321029663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.080263137817383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.080263137817383\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.081262350082397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.081262350082397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.082275390625\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0832622051239\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.0832622051239\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.084261178970337\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.084261178970337\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.08526086807251\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.08526086807251\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.086294889450073\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.086294889450073\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.087263107299805\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.088263034820557\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.088263034820557\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.08927011489868\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.09026336669922\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.09026336669922\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.091352224349976\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.091352224349976\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.092260599136353\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.093263864517212\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.093263864517212\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.093263864517212\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.094263553619385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.094263553619385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.095265865325928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.095265865325928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.096262454986572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.097272634506226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.097272634506226\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.098336219787598\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.098336219787598\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.099288940429688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.099288940429688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.100261926651\ncurrent 
current transaction is aborted, commands ignored until end of transaction block
is aborted, commands ignored until end of transaction block\n\n25.350278854370117\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35126566886902\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35126566886902\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.353275060653687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35426902770996\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35526466369629\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35526466369629\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.356263637542725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35726284980774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35726284980774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.35826849937439\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.359400272369385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.361266136169434\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.361266136169434\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.36326551437378\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.36326551437378\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.36426615715027\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.36526393890381\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.36644196510315\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.36827063560486\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.3692626953125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37026858329773\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37026858329773\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37026858329773\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.371264219284058\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.372263431549072\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.372263431549072\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37326407432556\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.374263048171997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.374263048171997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37526798248291\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.376272678375244\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.376272678375244\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37726378440857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37726378440857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.378293991088867\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.378293991088867\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37926721572876\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.37926721572876\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.381475925445557\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.382266759872437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.382266759872437\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.383274793624878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.383274793624878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38434362411499\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38434362411499\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38526439666748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38526439666748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38626480102539\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38626480102539\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.387269020080566\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38827109336853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.38926362991333\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.390270471572876\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39126491546631\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39186692237854\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39226245880127\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39226245880127\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.39326000213623\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39426589012146\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.394630670547485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39577317237854\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39626407623291\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39626407623291\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39726233482361\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39726233482361\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.398263692855835\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39833402633667\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.39833402633667\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.399261236190796\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.399261236190796\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.400264739990234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.400264739990234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.401272535324097\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.401272535324097\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.402263879776\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.402263879776\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.40326189994812\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.40326189994812\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.404274940490723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.404274940490723\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.405264377593994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.405264377593994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.405264377593994\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.406262636184692\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.407262563705444\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.407262563705444\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.409263372421265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.409263372421265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.410267114639282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.410267114639282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.411260843276978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.411260843276978\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.4122633934021\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.4122633934021\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.413362979888916\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.413362979888916\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n25.414340257644653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.414340257644653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.415273904800415\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.415273904800415\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.416263580322266\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.417267084121704\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.417267084121704\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.417267084121704\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.41826367378235\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.41826367378235\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.419260501861572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.419260501861572\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.420260429382324\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.420260429382324\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.420260429382324\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.421262741088867\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.42263102531433\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.423266410827637\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.423266410827637\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.424428462982178\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.425299644470215\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.42543649673462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.426496505737305\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.426496505737305\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.427260875701904\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.427260875701904\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.42826819419861\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.42826819419861\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.429268836975098\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.429268836975098\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.430402040481567\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.430402040481567\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.431267976760864\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.43235754966736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.43235754966736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.433263301849365\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.433263301849365\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.434264659881592\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.434264659881592\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n25.434264659881592\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.43526315689087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.43526315689087\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.436266660690308\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.437314748764038\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.437314748764038\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.438263416290283\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.438263416290283\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.439342975616455\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.439342975616455\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.440265417099\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.440265417099\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.440265417099\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.441298723220825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.441298723220825\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.442265510559082\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.44329047203064\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.444304943084717\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.444304943084717\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n25.444304943084717\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.445266008377075\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.445266008377075\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.446424961090088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.446424961090088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.447301864624023\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.447301864624023\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.448302507400513\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.448302507400513\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.44926381111145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.44926381111145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.45028281211853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.451263189315796\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.451263189315796\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.452460527420044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.452460527420044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.453301429748535\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.453301429748535\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.454267740249634\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.454267740249634\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.455304861068726\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.455304861068726\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.45626664161682\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.45626664161682\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.457453966140747\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.458266496658325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.458266496658325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.45931077003479\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.45931077003479\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.460302591323853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.460302591323853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.461297750473022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.461297750473022\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46230173110962\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46230173110962\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46326518058777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46326518058777\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.464810848236084\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46526789665222\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.46526789665222\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.466265439987183\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.466265439987183\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.467334508895874\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.467334508895874\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46830153465271\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.46830153465271\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.469656467437744\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.470265865325928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.470265865325928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.471315383911133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.471315383911133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.472294569015503\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.472294569015503\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.473299264907837\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.473299264907837\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.474300622940063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.474300622940063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.475406885147095\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n25.475406885147095\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.476340532302856\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.476340532302856\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.47831439971924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.47831439971924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.479264497756958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.48030686378479\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.48030686378479\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.481264352798462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.481264352798462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.482303380966187\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.482303380966187\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.48336124420166\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.484275341033936\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.485270023345947\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.486275911331177\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.487430095672607\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.488271713256836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.488271713256836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.48926854133606\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n25.49057126045227\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.491272449493408\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.491272449493408\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.492267608642578\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.493419647216797\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.494269371032715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.494269371032715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.495290279388428\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.496268272399902\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.49727177619934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.498265504837036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.498265504837036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.49935507774353\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.500316619873047\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.5012686252594\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.5012686252594\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.50226926803589\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.503273248672485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.504273653030396\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n25.505274772644043\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.50628089904785\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.50727391242981\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.5082745552063\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.50944471359253\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.51048231124878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.51148748397827\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.513484716415405\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.51447367668152\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.516470432281494\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.517473459243774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.51847267150879\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.520470142364502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.520470142364502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.52247929573059\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.523521423339844\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.52546763420105\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.526476860046387\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.528472423553467\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.529473066329956\ncurrent 
is aborted, commands ignored until end of transaction block\n\n25.88373064994812\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.88473391532898\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.885738849639893\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.886726140975952\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.8877272605896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.8887722492218\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.8887722492218\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.890769958496094\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.890769958496094\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.891780614852905\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.891780614852905\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.89302897453308\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.89372968673706\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.894726037979126\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.89574694633484\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.896737098693848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.897735118865967\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.898975610733032\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.899736404418945\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.900728464126587\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.9017391204834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.9017391204834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.90272831916809\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.903817176818848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.903817176818848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.904727458953857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.905890464782715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.905890464782715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.906728744506836\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.90774154663086\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.90872597694397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.90872597694397\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.9099702835083\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.910966873168945\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.910966873168945\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.911970615386963\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.912967920303345\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.913065671920776\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.913975954055786\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.914979219436646\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.915966510772705\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.91696548461914\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.917105674743652\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.918015241622925\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.91897177696228\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.91897177696228\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.919970512390137\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.920974254608154\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.921972036361694\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.922972679138184\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.92397713661194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.924975395202637\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.926971435546875\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.927975177764893\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.92897057533264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.92897057533264\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.930005311965942\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.930005311965942\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n25.93096661567688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.931969165802002\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.932968139648438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.932968139648438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.932968139648438\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.934004545211792\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93496561050415\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93496561050415\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.935967683792114\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93697190284729\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93897247314453\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93897247314453\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93996834754944\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.93996834754944\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.94096803665161\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.941966772079468\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.94297981262207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.94297981262207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.94396471977234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.945067167282104\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n25.94596791267395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.946012496948242\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.947449445724487\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.947967290878296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.947967290878296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.948973417282104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.94998860359192\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.950350761413574\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.950971841812134\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.951971530914307\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.951971530914307\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.952967643737793\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.952967643737793\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.953965187072754\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.953965187072754\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.954999208450317\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.955965757369995\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.955965757369995\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.95697283744812\ncurrent transaction is aborted, commands ignored 
until end of transaction block\n\n25.95697283744812\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.958001375198364\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.958001375198364\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.9589684009552\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.95996880531311\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.960971117019653\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.962175130844116\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.962971448898315\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.963965892791748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.965096712112427\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96596622467041\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96596622467041\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96696448326111\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96696448326111\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96843409538269\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.968969106674194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.968969106674194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.96997308731079\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.970977544784546\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n25.97201156616211\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.973974466323853\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.975005865097046\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.97608518600464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.976966619491577\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.97798776626587\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.97910451889038\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.97999143600464\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.981077909469604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.981993913650513\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.981993913650513\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.98300838470459\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.983968257904053\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.983968257904053\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.984976530075073\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.9864399433136\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.986993074417114\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.986993074417114\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.98802161216736\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.988974809646606\ncurrent transaction is 
aborted, commands ignored until end of transaction block\n\n25.988974809646606\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.989969730377197\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.990965843200684\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.992010831832886\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.992010831832886\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.99297285079956\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.99297285079956\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.994009017944336\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.994999408721924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.994999408721924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.995968341827393\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.995968341827393\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.995968341827393\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.997064352035522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.997971296310425\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.997971296310425\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.999234676361084\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.999234676361084\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n25.99996304512024\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n26.001009702682495\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.001983642578125\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.002968549728394\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.002968549728394\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.00397038459778\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.00397038459778\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.004972457885742\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.004972457885742\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.007065773010254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.007065773010254\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.00797700881958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.00896382331848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.00896382331848\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.010002374649048\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.010002374649048\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01096749305725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01096749305725\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01196551322937\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01296615600586\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01296615600586\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n26.01405620574951\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01405620574951\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.014966011047363\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.014966011047363\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01602339744568\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.016975164413452\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.016975164413452\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.016975164413452\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01796793937683\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01896858215332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.01896858215332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.020020723342896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.020973443984985\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.020973443984985\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.022006511688232\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.022006511688232\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.02300000190735\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.023226022720337\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.023226022720337\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n26.023966550827026\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.023966550827026\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.025169849395752\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.025969982147217\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.02696418762207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.02696418762207\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.028008460998535\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.028008460998535\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.029112815856934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.029112815856934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.030121088027954\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.030309200286865\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.031071662902832\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.031071662902832\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.031962156295776\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.031962156295776\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03296709060669\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03398871421814\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03498077392578\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n26.03498077392578\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03605580329895\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03605580329895\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03696918487549\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.03696918487549\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.038001537322998\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.038001537322998\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.038963556289673\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.038963556289673\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.040010929107666\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.041041374206543\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.041041374206543\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.041974782943726\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.041974782943726\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04304003715515\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04304003715515\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.044007062911987\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.044007062911987\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.045108795166016\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.045108795166016\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n26.04596209526062\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04596209526062\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04696822166443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04798173904419\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.04798173904419\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.048970222473145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.048970222473145\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.050005197525024\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.050005197525024\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.051095962524414\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.051095962524414\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05200433731079\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05200433731079\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05296754837036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05296754837036\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.054022789001465\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.054967164993286\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.054967164993286\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05601215362549\ncurrent transaction is aborted, commands ignored until end of 
transaction block\n\n26.05617904663086\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.056967735290527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.056967735290527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.058003902435303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.058003902435303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05896759033203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05896759033203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05997395515442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.05997395515442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.06102752685547\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.06224775314331\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.062967538833618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.062967538833618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.062967538833618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.064006328582764\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.064006328582764\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.065001964569092\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.065001964569092\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.0660297870636\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.0660297870636\ncurrent 
transaction is aborted, commands ignored until end of transaction block

26.0669686794281
transaction is aborted, commands ignored until end of transaction block\n\n26.36606192588806\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.367048263549805\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.367048263549805\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.368085622787476\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.369064807891846\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.36938762664795\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.370043992996216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.370043992996216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37104296684265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37104296684265\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.372041702270508\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37305188179016\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37305188179016\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37404727935791\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37404727935791\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.375072717666626\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.375072717666626\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.376044511795044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.377054452896118\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n26.377054452896118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.378063678741455\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.37904977798462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38004446029663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38004446029663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38105082511902\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.382047176361084\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.383044481277466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.383044481277466\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.384039878845215\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.384039878845215\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38504648208618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38504648208618\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38609266281128\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38609266281128\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.387041330337524\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38809561729431\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.38809561729431\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.389044523239136\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.390040636062622\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n26.390040636062622\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.391048192977905\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.391048192977905\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.392048358917236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.392048358917236\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.39304828643799\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.39404320716858\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.395041942596436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.395041942596436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.39704394340515\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.39711570739746\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.3980655670166\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.3980655670166\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.399043321609497\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.399043321609497\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.400044918060303\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4020516872406\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.403085470199585\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.403085470199585\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n26.404049158096313\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.40504240989685\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.406455278396606\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.407108783721924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.408041954040527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.408041954040527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.409039735794067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.409039735794067\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.41007971763611\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.41007971763611\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.411100149154663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4120991230011\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.412189245224\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.413100481033325\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.41509771347046\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.416097164154053\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.416097164154053\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.417097806930542\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.418100357055664\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.419095516204834\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n26.419095516204834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42009997367859\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.420294761657715\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.421101570129395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.421101570129395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.422093391418457\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42310118675232\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42310118675232\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42410111427307\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.425103187561035\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42609453201294\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42609453201294\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.427101850509644\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42810821533203\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.42909836769104\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43009352684021\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.431111097335815\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.431251764297485\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43209719657898\ncurrent transaction is aborted, commands ignored until end 
of transaction block\n\n26.433094263076782\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.433094263076782\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.434093236923218\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.434093236923218\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43509531021118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43609309196472\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43709373474121\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.43713355064392\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.438101053237915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.438101053237915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.439104318618774\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44009256362915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44009256362915\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4411039352417\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44222903251648\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44309949874878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44409680366516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.44409680366516\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.445109844207764\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.446096897125244\ncurrent 
transaction is aborted, commands ignored until end of transaction block\n\n26.446096897125244\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.447094440460205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.447094440460205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.448164701461792\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.449098587036133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.449098587036133\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.45109510421753\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.45210027694702\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.454095602035522\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.45524787902832\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.456180334091187\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.45712447166443\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.458102464675903\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.459096670150757\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.459096670150757\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.460233211517334\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.460233211517334\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.46109914779663\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.462093830108643\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n26.463093757629395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.463093757629395\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.464098691940308\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.46509623527527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.46509623527527\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.467108964920044\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.468118906021118\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.469139099121094\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4700984954834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4700984954834\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.471111536026\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.47231149673462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.47231149673462\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.473096132278442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.473096132278442\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.474095106124878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.474095106124878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.475093841552734\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.476092100143433\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n26.477097034454346\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.47810173034668\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.479125499725342\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.48009753227234\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.48110866546631\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.482104539871216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.482104539871216\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.483094453811646\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.484093189239502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.484093189239502\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4851016998291\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.486100435256958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.486100435256958\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.487285614013672\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.487285614013672\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.488097190856934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.488097190856934\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.48941469192505\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.48941469192505\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.490095615386963\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n26.4910945892334\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.49321722984314\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.495120763778687\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.496095657348633\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.497310161590576\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.498316764831543\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.4991512298584\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.50031304359436\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.501097440719604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.50209403038025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.50309443473816\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.503268241882324\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.504093170166016\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.505097150802612\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.506093502044678\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.507105350494385\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.50820016860962\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.51009774208069\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.51009774208069\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n26.511236667633057\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.5121066570282\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.513179063796997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.513179063796997\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.514092922210693\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.516101598739624\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.51709008216858\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.51709008216858\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.518096208572388\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.51909899711609\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.520095825195312\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.520095825195312\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.521094799041748\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.522096872329712\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.522096872329712\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.523096561431885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.524108171463013\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.526097059249878\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.52709674835205\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.52709674835205\ncurrent transaction 
is aborted, commands ignored until end of transaction block\n\n26.5281822681427\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.52909564971924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.52909564971924\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.53009581565857\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.531126737594604\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.532094478607178\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.53309726715088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.53309726715088\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.534093379974365\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.535112380981445\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.536102056503296\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.537614345550537\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.53809142112732\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.5390944480896\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.54009437561035\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.541090726852417\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.541090726852417\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.542519092559814\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.543588876724243\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n26.544092655181885\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.545091152191162\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.545091152191162\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.546093702316284\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.547094106674194\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.548092365264893\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.550099849700928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.550099849700928\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.55109667778015\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.55109667778015\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.552096605300903\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.553096294403076\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.55509853363037\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.556097507476807\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.556097507476807\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.55732774734497\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.558091402053833\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.559091806411743\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.559091806411743\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.56009316444397\ncurrent 
[repeated log output truncated: the PostgreSQL error "current transaction is aborted, commands ignored until end of transaction block" is logged continuously, each occurrence paired with an elapsed-time stamp running from ~26.561 s to ~26.847 s]
transaction is aborted, commands ignored until end of transaction block\n\n26.84725260734558\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.848265171051025\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.849255800247192\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.850252151489258\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.850252151489258\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.85125207901001\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.85125207901001\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.853269338607788\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.854254007339478\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.854254007339478\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.855253219604492\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.855253219604492\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.856253623962402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.856253623962402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.85732865333557\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.85732865333557\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.858454942703247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.858454942703247\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.85926342010498\ncurrent transaction is aborted, commands ignored until 
end of transaction block\n\n26.860337734222412\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.860337734222412\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.861251831054688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.861251831054688\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86225199699402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86225199699402\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86325478553772\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86325478553772\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.864253044128418\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.864253044128418\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.865258932113647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.865258932113647\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.866254329681396\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86729645729065\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86729645729065\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.86825394630432\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.869293451309204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.869293451309204\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.8702552318573\ncurrent transaction is aborted, commands ignored until end of transaction 
block\n\n26.8702552318573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.8702552318573\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.87161612510681\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.872258186340332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.872258186340332\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.87325620651245\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.87325620651245\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.874255657196045\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.874255657196045\ncurrent transaction is aborted, commands ignored until end of transaction block\n\n26.875293731689453\n1637452739.1762984\ntime taken to load the data is 26.875293731689453 seconds\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
d07c3afce1efcdc9aa1357c796c40ebe35f18159 | 459,414 | ipynb | Jupyter Notebook | exercise/chap_9/13_reconstruction_error.ipynb | vaibhavkumar11/hands-on-ml | 6a36c85dee4599ff1a87d47111e795b69b43821b | [
"MIT"
] | null | null | null | exercise/chap_9/13_reconstruction_error.ipynb | vaibhavkumar11/hands-on-ml | 6a36c85dee4599ff1a87d47111e795b69b43821b | [
"MIT"
] | null | null | null | exercise/chap_9/13_reconstruction_error.ipynb | vaibhavkumar11/hands-on-ml | 6a36c85dee4599ff1a87d47111e795b69b43821b | [
"MIT"
] | null | null | null | 1,316.372493 | 174,268 | 0.959013 | [
[
[
"import numpy as np\nfrom sklearn.datasets import fetch_olivetti_faces\nimport matplotlib.pyplot as plt\n\nnp.random.seed(42)\n%matplotlib inline",
"_____no_output_____"
],
[
"data = fetch_olivetti_faces()\nx = data.data\ny = data.target",
"_____no_output_____"
],
[
"print(x.shape)\nprint(y.shape)",
"(400, 4096)\n(400,)\n"
],
[
"plt.imshow(x[0].reshape(64, 64), cmap='gray')",
"_____no_output_____"
],
[
"# Looking on a random set of images\nfig = plt.figure(figsize=(9, 9))\ncols = 4\nrows = 5\nfor ind in range(1, cols*rows+1):\n img = x[np.random.randint(x.shape[0])].reshape(64, 64)\n fig.add_subplot(rows, cols, ind)\n plt.imshow(img, cmap='gray')\n plt.axis(\"off\")\nplt.show()",
"_____no_output_____"
],
[
"x.shape",
"_____no_output_____"
],
[
"# Splitting into train and test set and having equal proportions\nfrom sklearn.model_selection import StratifiedShuffleSplit\n\nsplit_test = StratifiedShuffleSplit(n_splits=1, test_size=0.1, random_state=42)\nfor train_valid_ind, test_ind in split_test.split(x, y):\n x_train_valid, x_test = x[train_valid_ind], x[test_ind]\n y_train_valid, y_test = y[train_valid_ind], y[test_ind]\n\nsplit_valid = StratifiedShuffleSplit(n_splits=1, test_size=0.2, random_state=42)\nfor train_ind, valid_ind in split_valid.split(x_train_valid, y_train_valid):\n x_train, x_valid = x_train_valid[train_ind], x_train_valid[valid_ind]\n y_train, y_valid = y_train_valid[train_ind], y_train_valid[valid_ind]",
"_____no_output_____"
]
],
[
[
"### PCA Reduction",
"_____no_output_____"
]
],
[
[
"from sklearn.decomposition import PCA\n\npca = PCA(n_components=0.99)\nx_train_pca = pca.fit_transform(x_train)\nx_valid_pca = pca.transform(x_valid)",
"_____no_output_____"
],
[
"def plot_faces(faces, label, n_rows = 4, n_cols = 5): \n plt.figure(figsize=(8, 5))\n for index, (face, label) in enumerate(zip(faces, label)):\n plt.subplot(n_rows, n_cols, index+1)\n plt.imshow(face.reshape(64, 64), cmap='gray')\n plt.axis(\"off\")\n plt.title(label)\n plt.show()",
"_____no_output_____"
]
],
[
[
"### Modifying Images ",
"_____no_output_____"
]
],
[
[
"from scipy import ndimage\n\n# rotate, flip, and darken the images\n# flipping and darkening are taken from the solution, as they turned out to be easier\nx_transformed = []\nfor face in x_train[:20]:\n    transform = ndimage.rotate(face.reshape(64, 64), angle=np.random.choice([90, 180]),\n                      mode='constant')[:,::-1]\n    transform[:, 1:-1] *= np.random.choice([1, 0.3])\n    x_transformed.append(transform)\n    \nx_transformed = np.array(x_transformed)",
"_____no_output_____"
],
[
"def error(pca, x):\n x_pca = pca.transform(x)\n x_reconstruct = pca.inverse_transform(x_pca)\n return np.square(x_reconstruct - x).mean(axis=-1)",
"_____no_output_____"
],
[
"error(pca, x_train[:20]).mean()",
"_____no_output_____"
],
[
"error(pca, x_transformed.reshape(-1, 4096)).mean()",
"_____no_output_____"
]
],
[
[
"### The reconstruction error is not large",
"_____no_output_____"
]
],
[
[
"plot_faces(x_transformed, y_train[:20])",
"_____no_output_____"
],
[
"x_transformed_pca = pca.transform(x_transformed.reshape(-1, 4096))\nplot_faces(pca.inverse_transform(x_transformed_pca), y_train[:20])",
"_____no_output_____"
]
],
[
[
"### All reconstructed images look similar because PCA captures one type of face alignment, while the transformed images have varying alignments",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
d07c4818f7eb13b06ca540f94db03e841df2efad | 6,806 | ipynb | Jupyter Notebook | docs/_downloads/52d4aaa33601a2b3990ace6aa45546ce/numpy_extensions_tutorial.ipynb | leejh1230/PyTorch-tutorials-kr | ebbf44b863ff96c597631e28fc194eafa590c9eb | [
"BSD-3-Clause"
] | 1 | 2019-12-05T05:16:44.000Z | 2019-12-05T05:16:44.000Z | docs/_downloads/52d4aaa33601a2b3990ace6aa45546ce/numpy_extensions_tutorial.ipynb | leejh1230/PyTorch-tutorials-kr | ebbf44b863ff96c597631e28fc194eafa590c9eb | [
"BSD-3-Clause"
] | null | null | null | docs/_downloads/52d4aaa33601a2b3990ace6aa45546ce/numpy_extensions_tutorial.ipynb | leejh1230/PyTorch-tutorials-kr | ebbf44b863ff96c597631e28fc194eafa590c9eb | [
"BSD-3-Clause"
] | null | null | null | 47.263889 | 1,701 | 0.597708 | [
[
[
"%matplotlib inline",
"_____no_output_____"
]
],
[
[
"\nCreating Extensions Using numpy and scipy\n=========================================\n**Author**: `Adam Paszke <https://github.com/apaszke>`_\n\n**Updated by**: `Adam Dziedzic <https://github.com/adam-dziedzic>`_\n\nIn this tutorial, we shall go through two tasks:\n\n1. Create a neural network layer with no parameters.\n\n - This calls into **numpy** as part of its implementation\n\n2. Create a neural network layer that has learnable weights\n\n - This calls into **SciPy** as part of its implementation\n",
"_____no_output_____"
]
],
[
[
"import torch\nfrom torch.autograd import Function",
"_____no_output_____"
]
],
[
[
"Parameter-less example\n----------------------\n\nThis layer doesn’t particularly do anything useful or mathematically\ncorrect.\n\nIt is aptly named BadFFTFunction\n\n**Layer Implementation**\n\n",
"_____no_output_____"
]
],
[
[
"from numpy.fft import rfft2, irfft2\n\n\nclass BadFFTFunction(Function):\n @staticmethod\n def forward(ctx, input):\n numpy_input = input.detach().numpy()\n result = abs(rfft2(numpy_input))\n return input.new(result)\n\n @staticmethod\n def backward(ctx, grad_output):\n numpy_go = grad_output.numpy()\n result = irfft2(numpy_go)\n return grad_output.new(result)\n\n# since this layer does not have any parameters, we can\n# simply declare this as a function, rather than as an nn.Module class\n\n\ndef incorrect_fft(input):\n return BadFFTFunction.apply(input)",
"_____no_output_____"
]
],
[
[
"**Example usage of the created layer:**\n\n",
"_____no_output_____"
]
],
[
[
"input = torch.randn(8, 8, requires_grad=True)\nresult = incorrect_fft(input)\nprint(result)\nresult.backward(torch.randn(result.size()))\nprint(input)",
"_____no_output_____"
]
],
[
[
"Parametrized example\n--------------------\n\nIn deep learning literature, this layer is confusingly referred\nto as convolution while the actual operation is cross-correlation\n(the only difference is that filter is flipped for convolution,\nwhich is not the case for cross-correlation).\n\nImplementation of a layer with learnable weights, where cross-correlation\nhas a filter (kernel) that represents weights.\n\nThe backward pass computes the gradient wrt the input and the gradient wrt the filter.\n\n",
"_____no_output_____"
]
],
[
[
"from numpy import flip\nimport numpy as np\nfrom scipy.signal import convolve2d, correlate2d\nfrom torch.nn.modules.module import Module\nfrom torch.nn.parameter import Parameter\n\n\nclass ScipyConv2dFunction(Function):\n @staticmethod\n def forward(ctx, input, filter, bias):\n # detach so we can cast to NumPy\n input, filter, bias = input.detach(), filter.detach(), bias.detach()\n result = correlate2d(input.numpy(), filter.numpy(), mode='valid')\n result += bias.numpy()\n ctx.save_for_backward(input, filter, bias)\n return torch.as_tensor(result, dtype=input.dtype)\n\n @staticmethod\n def backward(ctx, grad_output):\n grad_output = grad_output.detach()\n input, filter, bias = ctx.saved_tensors\n grad_output = grad_output.numpy()\n grad_bias = np.sum(grad_output, keepdims=True)\n grad_input = convolve2d(grad_output, filter.numpy(), mode='full')\n # the previous line can be expressed equivalently as:\n # grad_input = correlate2d(grad_output, flip(flip(filter.numpy(), axis=0), axis=1), mode='full')\n grad_filter = correlate2d(input.numpy(), grad_output, mode='valid')\n return torch.from_numpy(grad_input), torch.from_numpy(grad_filter).to(torch.float), torch.from_numpy(grad_bias).to(torch.float)\n\n\nclass ScipyConv2d(Module):\n def __init__(self, filter_width, filter_height):\n super(ScipyConv2d, self).__init__()\n self.filter = Parameter(torch.randn(filter_width, filter_height))\n self.bias = Parameter(torch.randn(1, 1))\n\n def forward(self, input):\n return ScipyConv2dFunction.apply(input, self.filter, self.bias)",
"_____no_output_____"
]
],
[
[
"**Example usage:**\n\n",
"_____no_output_____"
]
],
[
[
"module = ScipyConv2d(3, 3)\nprint(\"Filter and bias: \", list(module.parameters()))\ninput = torch.randn(10, 10, requires_grad=True)\noutput = module(input)\nprint(\"Output from the convolution: \", output)\noutput.backward(torch.randn(8, 8))\nprint(\"Gradient for the input map: \", input.grad)",
"_____no_output_____"
]
],
[
[
"**Check the gradients:**\n\n",
"_____no_output_____"
]
],
[
[
"from torch.autograd.gradcheck import gradcheck\n\nmoduleConv = ScipyConv2d(3, 3)\n\ninput = [torch.randn(20, 20, dtype=torch.double, requires_grad=True)]\ntest = gradcheck(moduleConv, input, eps=1e-6, atol=1e-4)\nprint(\"Are the gradients correct: \", test)",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07c5c19cc4299188e0912ec99cd6f4ab652c83b | 20,933 | ipynb | Jupyter Notebook | site/ja/beta/guide/autograph.ipynb | abviv/docs | 4ab9d373f693ce804dc21529997de104e3d9e47c | [
"Apache-2.0"
] | 1 | 2019-08-22T13:15:19.000Z | 2019-08-22T13:15:19.000Z | site/ja/beta/guide/autograph.ipynb | abviv/docs | 4ab9d373f693ce804dc21529997de104e3d9e47c | [
"Apache-2.0"
] | null | null | null | site/ja/beta/guide/autograph.ipynb | abviv/docs | 4ab9d373f693ce804dc21529997de104e3d9e47c | [
"Apache-2.0"
] | 1 | 2020-10-31T13:13:43.000Z | 2020-10-31T13:13:43.000Z | 28.596995 | 429 | 0.488798 | [
[
[
"##### Copyright 2018 The TensorFlow Authors.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");",
"_____no_output_____"
]
],
[
[
"#@title Licensed under the Apache License, Version 2.0 (the \"License\"); { display-mode: \"form\" }\n# you may not use this file except in compliance with the License.\n# You may obtain a copy of the License at\n#\n# https://www.apache.org/licenses/LICENSE-2.0\n#\n# Unless required by applicable law or agreed to in writing, software\n# distributed under the License is distributed on an \"AS IS\" BASIS,\n# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n# See the License for the specific language governing permissions and\n# limitations under the License.",
"_____no_output_____"
]
],
[
[
"# tf.function and AutoGraph in TensorFlow 2.0",
"_____no_output_____"
],
[
"<table class=\"tfo-notebook-buttons\" align=\"left\">\n <td>\n <a target=\"_blank\" href=\"https://www.tensorflow.org/beta/guide/autograph\"><img src=\"https://www.tensorflow.org/images/tf_logo_32px.png\" />View on TensorFlow.org</a>\n </td>\n <td>\n <a target=\"_blank\" href=\"https://colab.research.google.com/github/tensorflow/docs/blob/master/site/ja/beta/guide/autograph.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" />Run in Google Colab</a>\n </td>\n <td>\n <a target=\"_blank\" href=\"https://github.com/tensorflow/docs/blob/master/site/ja/beta/guide/autograph.ipynb\"><img src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" />View source on GitHub</a>\n </td>\n</table>",
"_____no_output_____"
],
[
"Note: This document was translated by the TensorFlow community. Since community translations are **best-effort**, we cannot guarantee that the translation is accurate or reflects the latest state of the [official English documentation](https://www.tensorflow.org/?hl=en). If you have suggestions for improving this translation, please send a pull request to the [tensorflow/docs](https://github.com/tensorflow/docs) GitHub repository. To get involved in translating or reviewing community translations, please contact the [[email protected] mailing list](https://groups.google.com/a/tensorflow.org/forum/#!forum/docs-ja).",
"_____no_output_____"
],
[
"TensorFlow 2.0 delivers the ease of use of eager execution together with the power of TensorFlow 1.0. At the core of this integration is `tf.function`, which converts a subset of Python syntax into portable, high-performance TensorFlow graphs.\n\nA compelling feature of `tf.function` is AutoGraph, which lets you write graph code using natural Python syntax.\nFor a list of the Python features available with AutoGraph, see [AutoGraph Capabilities and Limitations](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/autograph/LIMITATIONS.md). For details on `tf.function`, see the RFC [TF 2.0: Functions, not Sessions](https://github.com/tensorflow/community/blob/master/rfcs/20180918-functions-not-sessions-20.md). For details on AutoGraph, see `tf.autograph`.\n\nThis tutorial walks through the basic features of `tf.function` and AutoGraph.",
"_____no_output_____"
],
[
"## Setup\n\nImport TensorFlow 2.0 Preview Nightly and enable TF 2.0 mode.",
"_____no_output_____"
]
],
[
[
"from __future__ import absolute_import, division, print_function, unicode_literals\nimport numpy as np",
"_____no_output_____"
],
[
"!pip install tensorflow==2.0.0-beta1\nimport tensorflow as tf",
"_____no_output_____"
]
],
[
[
"## The `tf.function` decorator\n\nWhen you annotate a function with `tf.function`, you can still call it like any other function. But at execution time it will be compiled into a graph. This gives you the benefits of faster execution, running on GPU or TPU, and exporting to SavedModel.",
"_____no_output_____"
]
],
[
[
"@tf.function\ndef simple_nn_layer(x, y):\n return tf.nn.relu(tf.matmul(x, y))\n\n\nx = tf.random.uniform((3, 3))\ny = tf.random.uniform((3, 3))\n\nsimple_nn_layer(x, y)",
"_____no_output_____"
]
],
[
[
"If you inspect the result of the annotation, you can see that it is a special callable that handles all interactions with the TensorFlow runtime.",
"_____no_output_____"
]
],
[
[
"simple_nn_layer",
"_____no_output_____"
]
],
[
[
"If your code uses multiple functions, you don't need to annotate them all: any function called from an annotated function will also run in graph mode.",
"_____no_output_____"
]
],
[
[
"def linear_layer(x):\n return 2 * x + 1\n\n\[email protected]\ndef deep_net(x):\n return tf.nn.relu(linear_layer(x))\n\n\ndeep_net(tf.constant((1, 2, 3)))",
"_____no_output_____"
]
],
[
[
"A function can be faster than eager code when the graph consists of many lightweight ops. But when the graph consists of a few expensive ops (such as convolutions), you may not see much of a speedup.",
"_____no_output_____"
]
],
[
[
"import timeit\nconv_layer = tf.keras.layers.Conv2D(100, 3)\n\[email protected]\ndef conv_fn(image):\n return conv_layer(image)\n\nimage = tf.zeros([1, 200, 200, 100])\n# warm up\nconv_layer(image); conv_fn(image)\nprint(\"Eager conv:\", timeit.timeit(lambda: conv_layer(image), number=10))\nprint(\"Function conv:\", timeit.timeit(lambda: conv_fn(image), number=10))\nprint(\"Note how there's not much difference in performance for convolutions\")\n",
"_____no_output_____"
],
[
"lstm_cell = tf.keras.layers.LSTMCell(10)\n\[email protected]\ndef lstm_fn(input, state):\n return lstm_cell(input, state)\n\ninput = tf.zeros([10, 10])\nstate = [tf.zeros([10, 10])] * 2\n# warm up\nlstm_cell(input, state); lstm_fn(input, state)\nprint(\"eager lstm:\", timeit.timeit(lambda: lstm_cell(input, state), number=10))\nprint(\"function lstm:\", timeit.timeit(lambda: lstm_fn(input, state), number=10))\n",
"_____no_output_____"
]
],
[
[
"## Using Python control flow\n\nWhen you use data-dependent control flow inside `tf.function`, you can use Python control flow statements, and AutoGraph will convert them into the corresponding TensorFlow ops. For example, an `if` statement that depends on a `Tensor` is converted into `tf.cond()`.\n\nIn the example below, `x` is a `Tensor`, but the `if` statement works as you would expect:",
"_____no_output_____"
]
],
[
[
"@tf.function\ndef square_if_positive(x):\n if x > 0:\n x = x * x\n else:\n x = 0\n return x\n\n\nprint('square_if_positive(2) = {}'.format(square_if_positive(tf.constant(2))))\nprint('square_if_positive(-2) = {}'.format(square_if_positive(tf.constant(-2))))",
"_____no_output_____"
]
],
[
[
"Note: This example uses a simple conditional with scalar values. In real-world code, <a href=\"#batching\">batching</a> is typically used.",
"_____no_output_____"
],
[
"AutoGraph supports common Python statements such as `while`, `for`, `if`, `break`, `continue`, and `return`, including their nested use. That means you can use a `Tensor`-valued expression as the condition of a `while` or `if` statement, and you can iterate over the elements of a `Tensor` in a `for` loop.",
"_____no_output_____"
]
],
[
[
"@tf.function\ndef sum_even(items):\n s = 0\n for c in items:\n if c % 2 > 0:\n print(c)\n continue\n s += c\n return s\n\n\nsum_even(tf.constant([10, 12, 15, 20]))",
"_____no_output_____"
]
],
[
[
"For more advanced users, AutoGraph also provides a low-level API. In the following example, you can inspect the code that AutoGraph generates.",
"_____no_output_____"
]
],
[
[
"print(tf.autograph.to_code(sum_even.python_function))",
"_____no_output_____"
]
],
[
[
"Here is an example of more complex control flow:",
"_____no_output_____"
]
],
[
[
"@tf.function\ndef fizzbuzz(n):\n msg = tf.constant('')\n for i in tf.range(n):\n if tf.equal(i % 3, 0):\n tf.print('Fizz')\n elif tf.equal(i % 5, 0):\n tf.print('Buzz')\n else:\n tf.print(i)\n\nfizzbuzz(tf.constant(15))",
"_____no_output_____"
]
],
[
[
"## Using AutoGraph with Keras\n\nYou can use `tf.function` with object methods as well. For example, you can decorate your custom Keras models, typically by annotating the model's `call` function. See `tf.keras` for more details.",
"_____no_output_____"
]
],
[
[
"class CustomModel(tf.keras.models.Model):\n\n @tf.function\n def call(self, input_data):\n if tf.reduce_mean(input_data) > 0:\n return input_data\n else:\n return input_data // 2\n\n\nmodel = CustomModel()\n\nmodel(tf.constant([-2, -4]))",
"_____no_output_____"
]
],
[
[
"## Side effects\n\nJust as in eager mode, you can usually execute operations with side effects, such as `tf.assign` and `tf.print`, inside `tf.function`, and it will add the necessary control dependencies to preserve execution order.",
"_____no_output_____"
]
],
[
[
"v = tf.Variable(5)\n\[email protected]\ndef find_next_odd():\n v.assign(v + 1)\n if tf.equal(v % 2, 0):\n v.assign(v + 1)\n\n\nfind_next_odd()\nv",
"_____no_output_____"
]
],
[
[
"## Example: training a simple model\n\nAutoGraph lets you move much more computation inside TensorFlow than you have seen so far. For example, a training loop is just control flow, so it can actually be brought into TensorFlow.",
"_____no_output_____"
],
[
"### Download the data",
"_____no_output_____"
]
],
[
[
"def prepare_mnist_features_and_labels(x, y):\n x = tf.cast(x, tf.float32) / 255.0\n y = tf.cast(y, tf.int64)\n return x, y\n\ndef mnist_dataset():\n (x, y), _ = tf.keras.datasets.mnist.load_data()\n ds = tf.data.Dataset.from_tensor_slices((x, y))\n ds = ds.map(prepare_mnist_features_and_labels)\n ds = ds.take(20000).shuffle(20000).batch(100)\n return ds\n\ntrain_dataset = mnist_dataset()",
"_____no_output_____"
]
],
[
[
"### Define the model",
"_____no_output_____"
]
],
[
[
"model = tf.keras.Sequential((\n tf.keras.layers.Reshape(target_shape=(28 * 28,), input_shape=(28, 28)),\n tf.keras.layers.Dense(100, activation='relu'),\n tf.keras.layers.Dense(100, activation='relu'),\n tf.keras.layers.Dense(10)))\nmodel.build()\noptimizer = tf.keras.optimizers.Adam()",
"_____no_output_____"
]
],
[
[
"### Define the training loop",
"_____no_output_____"
]
],
[
[
"compute_loss = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)\n\ncompute_accuracy = tf.keras.metrics.SparseCategoricalAccuracy()\n\n\ndef train_one_step(model, optimizer, x, y):\n with tf.GradientTape() as tape:\n logits = model(x)\n loss = compute_loss(y, logits)\n\n grads = tape.gradient(loss, model.trainable_variables)\n optimizer.apply_gradients(zip(grads, model.trainable_variables))\n\n compute_accuracy(y, logits)\n return loss\n\n\[email protected]\ndef train(model, optimizer):\n train_ds = mnist_dataset()\n step = 0\n loss = 0.0\n accuracy = 0.0\n for x, y in train_ds:\n step += 1\n loss = train_one_step(model, optimizer, x, y)\n if tf.equal(step % 10, 0):\n tf.print('Step', step, ': loss', loss, '; accuracy', compute_accuracy.result())\n return step, loss, accuracy\n\nstep, loss, accuracy = train(model, optimizer)\nprint('Final step', step, ': loss', loss, '; accuracy', compute_accuracy.result())",
"_____no_output_____"
]
],
[
[
"## Batching\n\nIn real applications, batching is important for performance. The best code to convert to AutoGraph is code where the control flow is decided at the _batch_ level. If you make decisions at the individual _example_ level, try using batch APIs to maintain performance.\n\nFor example, suppose you have the following Python code:",
"_____no_output_____"
]
],
[
[
"def square_if_positive(x):\n return [i ** 2 if i > 0 else i for i in x]\n\n\nsquare_if_positive(range(-5, 5))",
"_____no_output_____"
]
],
[
[
"To run the equivalent in TensorFlow, you may be tempted to write it like this (and it would actually work!):",
"_____no_output_____"
]
],
[
[
"@tf.function\ndef square_if_positive_naive(x):\n result = tf.TensorArray(tf.int32, size=x.shape[0])\n for i in tf.range(x.shape[0]):\n if x[i] > 0:\n result = result.write(i, x[i] ** 2)\n else:\n result = result.write(i, x[i])\n return result.stack()\n\n\nsquare_if_positive_naive(tf.range(-5, 5))",
"_____no_output_____"
]
],
[
[
"However, in this case, you can also write it like this:",
"_____no_output_____"
]
],
[
[
"def square_if_positive_vectorized(x):\n return tf.where(x > 0, x ** 2, x)\n\n\nsquare_if_positive_vectorized(tf.range(-5, 5))",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07c74df801bb14dc823cfcd60db129948504440 | 5,197 | ipynb | Jupyter Notebook | 10_pipeline/00_Overview.ipynb | MarcusFra/workshop | 83f16d41f5e10f9c23242066f77a14bb61ac78d7 | [
"Apache-2.0"
] | 2,327 | 2020-03-01T09:47:34.000Z | 2021-11-25T12:38:42.000Z | 10_pipeline/00_Overview.ipynb | MarcusFra/workshop | 83f16d41f5e10f9c23242066f77a14bb61ac78d7 | [
"Apache-2.0"
] | 209 | 2020-03-01T17:14:12.000Z | 2021-11-08T20:35:42.000Z | 10_pipeline/00_Overview.ipynb | MarcusFra/workshop | 83f16d41f5e10f9c23242066f77a14bb61ac78d7 | [
"Apache-2.0"
] | 686 | 2020-03-03T17:24:51.000Z | 2021-11-25T23:39:12.000Z | 40.601563 | 289 | 0.662113 | [
[
[
"# SageMaker Pipelines to Train a BERT-Based Text Classifier\n\nIn this lab, we will do the following:\n* Define a set of Workflow Parameters that can be used to parametrize a Workflow Pipeline\n* Define a Processing step that performs cleaning and feature engineering, splitting the input data into train and test data sets\n* Define a Training step that trains a model on the pre-processed train data set\n* Define a Processing step that evaluates the trained model's performance on the test data set\n* Define a Register Model step that creates a model package from the estimator and model artifacts used in training\n* Define a Conditional step that measures a condition based on output from prior steps and conditionally executes the Register Model step\n* Define and create a Pipeline in a Workflow DAG, with the defined parameters and steps defined\n* Start a Pipeline execution and wait for execution to complete",
"_____no_output_____"
],
[
"# Terminology\n\nAmazon SageMaker Pipelines support the following steps:\n\n* Pipelines - A Directed Acyclic Graph of steps and conditions to orchestrate SageMaker jobs and resource creation.\n* Processing Job steps - A simplified, managed experience on SageMaker to run data processing workloads, such as feature engineering, data validation, model evaluation, and model interpretation.\n* Training Job steps - An iterative process that teaches a model to make predictions by presenting examples from a training dataset.\n* Conditional step execution - Provides conditional execution of branches in a pipeline.\n* Registering Models - Creates a model package resource in the Model Registry that can be used to create deployable models in Amazon SageMaker.\n* Parametrized Pipeline executions - Allows pipeline executions to vary by supplied parameters.\n* Transform Job steps - A batch transform to preprocess datasets to remove noise or bias that interferes with training or inference from your dataset, get inferences from large datasets, and run inference when you don't need a persistent endpoint.",
"_____no_output_____"
],
[
"# Our BERT Pipeline\n\nIn the Processing Step, we perform Feature Engineering to create BERT embeddings from the `review_body` text using the pre-trained BERT model, and split the dataset into train, validation and test files. To optimize for Tensorflow training, we saved the files in TFRecord format. \n\nIn the Training Step, we fine-tune the BERT model to our Customer Reviews Dataset and add a new classification layer to predict the `star_rating` for a given `review_body`.\n\nIn the Evaluation Step, we take the trained model and a test dataset as input, and produce a JSON file containing classification evaluation metrics.\n\nIn the Condition Step, we decide whether to register this model if the accuracy of the model, as determined by our evaluation step exceeded some value. \n\n\n\nThe pipeline that we create follows a typical Machine Learning Application pattern of pre-processing, training, evaluation, and model registration:\n\n",
"_____no_output_____"
],
[
"# Release Resources",
"_____no_output_____"
]
],
[
[
"%%html\n\n<p><b>Shutting down your kernel for this notebook to release resources.</b></p>\n<button class=\"sm-command-button\" data-commandlinker-command=\"kernelmenu:shutdown\" style=\"display:none;\">Shutdown Kernel</button>\n\n<script>\ntry {\n els = document.getElementsByClassName(\"sm-command-button\");\n els[0].click();\n}\ncatch(err) {\n // NoOp\n} \n</script>",
"_____no_output_____"
],
[
"%%javascript\n\ntry {\n Jupyter.notebook.save_checkpoint();\n Jupyter.notebook.session.delete();\n}\ncatch(err) {\n // NoOp\n}",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
]
] |
d07c8b12f245fed1915283fa95e612d5db228288 | 71,719 | ipynb | Jupyter Notebook | notebooks/legacy/legacy/07. Check MC logic.ipynb | tomkimpson/ML4L | ffa8360cb80df25bd6af4fa5cc39b42bd6f405cd | [
"MIT"
] | 1 | 2022-02-23T12:31:56.000Z | 2022-02-23T12:31:56.000Z | notebooks/legacy/legacy/07. Check MC logic.ipynb | tomkimpson/ML4L | ffa8360cb80df25bd6af4fa5cc39b42bd6f405cd | [
"MIT"
] | null | null | null | notebooks/legacy/legacy/07. Check MC logic.ipynb | tomkimpson/ML4L | ffa8360cb80df25bd6af4fa5cc39b42bd6f405cd | [
"MIT"
] | null | null | null | 88.215252 | 40,296 | 0.770479 | [
[
[
"#These dictionaries describe the local hour of the satellite\nlocal_times = {\"aquaDay\":\"13:30\",\n \"terraDay\":\"10:30\",\n \"terraNight\":\"22:30\",\n \"aquaNight\":\"01:30\"\n }\n# and are used to load the correct file for dealing with the date-line.\nmin_hours = {\"aquaDay\":2,\n \"terraDay\":-1,\n \"aquaNight\":-1,\n \"terraNight\":11}\nmax_hours = {\"aquaDay\":24,\n \"terraDay\":22,\n \"aquaNight\":13,\n \"terraNight\":24}",
"_____no_output_____"
],
[
"import xarray as xr\nimport pandas as pd\nimport numpy as np\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"#Data loader for the satellite data,\n#returns a complete global map (regular lat-lon)\n#but sparsely filled, with only one stripe of data\n#showing where the satellite passed that hour\ndef get_satellite_slice(date : str,\n utc_hour : int,\n satellite='aquaDay',\n latitude_bound = None #Recommend only using |lat| < 70 degrees\n ):\n #Due to crossing of the datetime, some times will be saved different date\n if utc_hour < min_hours[satellite]:\n file_date = str((np.datetime64(date) - np.timedelta64(1,'D')))\n elif utc_hour > max_hours[satellite]:\n file_date = str((np.datetime64(date) + np.timedelta64(1,'D')))\n else:\n file_date = date\n \n #print ('the UTC hour is', utc_hour)\n #print ('the file date is', file_date)\n #Open .tif file\n sat_xr = xr.open_rasterio(f'{satellite_folder}/{satellite}_errorGTE03K_04km_{file_date}.tif')\n #Rename spatial dimensions\n sat_xr = sat_xr.rename({'x':'longitude','y':'latitude'})\n \n #Create time delta to change local to UTC\n time_delta = pd.to_timedelta(sat_xr.longitude.data/15,unit='H') \n \n #Convert local satellite time to UTC and round to nearest hour\n time = (pd.to_datetime([file_date + \" \" + \"13:30\"]*time_delta.shape[0]) - time_delta).round('H')\n \n #display(time)\n #print(time)\n #Select desired hour\n \n dt = np.datetime64(f'{date} {utc_hour:02}:00:00')\n right_time = np.expand_dims(time == dt,axis=(0,1))\n \n \n #print ('right time', right_time.sum())\n if right_time.sum() == 0:\n print(\"Warning: Correct time not found in dataset, likely problem in file selection\")\n #Make subset\n subset = np.logical_and(np.isfinite(sat_xr),right_time)\n if subset.sum() == 0:\n print(f\"Warning: No valid data found for {date} {utc_hour:02}h\")\n if latitude_bound is not None:\n #print(f\"Subsetting < {latitude_bound}\")\n subset = np.logical_and(subset,np.expand_dims(np.abs(sat_xr.latitude) < latitude_bound,axis=(0,-1)))\n #Select valid data\n test_subset = sat_xr.where(subset).load()\n sat_xr.close()\n sat_xr = None\n 
#display(test_subset)\n #display(test_subset[0,::-1,:])\n #display(test_subset.squeeze('band'))\n \n \n return test_subset[0,::-1,:]",
"_____no_output_____"
],
[
"satellite_folder = '/network/group/aopp/predict/TIP016_PAXTON_RPSPEEDY/ML4L/ECMWF_files/raw/'\ntest_data = get_satellite_slice('2018-01-03',11,latitude_bound=70)",
"_____no_output_____"
],
[
"plt.figure(dpi=200)\n#Make grid for plotting purposes\nxv, yv = np.meshgrid(test_data.longitude, test_data.latitude, indexing='xy')\n#Contour plot\nplt.contourf(xv,yv,test_data,levels=np.arange(230,320,10))\nplt.xlim(0,100)\nplt.ylim(-80,-60)\n\n\nplt.colorbar()",
"_____no_output_____"
],
[
"blob = [True,False]",
"_____no_output_____"
],
[
"sum(blob)",
"_____no_output_____"
],
[
"def get_era_data(date : str,\n utc_hour : int,\n field = 't2m'):\n\n print ('date = ', '_'.join(date.split('-')))\n month = '_'.join(date.split('-')[:-1])\n \n print ('month', month)\n \n ds_era = xr.open_dataset(f'{era_folder}/sfc_unstructured_{month}.grib',engine='cfgrib')\n #Grab correct field\n da = ds_era[field]\n #\n time_str = f\"{date} {utc_hour:02}:00:00\" \n print (time_str)\n da = da.sel(time=time_str)\n #Relabel longitude coordinate to be consistent with MODIS\n da = da.assign_coords({\"longitude\": (((da.longitude + 180) % 360) - 180)})\n \n #Load data, perhaps this is too early?\n da = da.load()\n \n #Close file, attempt to not have memory leaks\n ds_era.close()\n ds_era = None\n return da",
"_____no_output_____"
],
[
"era_folder = '/network/group/aopp/predict/TIP016_PAXTON_RPSPEEDY/ML4L/ECMWF_files/raw/'\nda = get_era_data('2019-01-01',10)",
"Ignoring index file '/network/group/aopp/predict/TIP016_PAXTON_RPSPEEDY/ML4L/ECMWF_files/raw//sfc_unstructured_2019_01.grib.923a8.idx' incompatible with GRIB file\n"
]
],
[
[
"---",
"_____no_output_____"
]
],
[
[
"root = '/network/group/aopp/predict/TIP016_PAXTON_RPSPEEDY/ML4L/ECMWF_files/raw/'\nf = 'aquaDay_errorGTE03K_04km_2019-01-01.tif'\n\nsat_xr = xr.open_rasterio(root+f) #load the temperature data array",
"_____no_output_____"
],
[
"sat_xr = sat_xr.rename({'x':'longitude','y':'latitude'})",
"_____no_output_____"
],
[
"time_delta = pd.to_timedelta(sat_xr.longitude.data/15,unit='H') ",
"_____no_output_____"
],
[
"file_date = '2019-01-01'\nlocal_times = \"13:30\"\ntime = (pd.to_datetime([file_date + \" \" + local_times]*time_delta.shape[0]) - time_delta).round('H')",
"_____no_output_____"
],
[
"time",
"_____no_output_____"
],
[
"#Select desired hour\ndate = '2019-01-01'\nutc_hour = 10\ndt = np.datetime64(f'{date} {utc_hour:02}:00:00')\nright_time = np.expand_dims(time == dt,axis=(0,1))\nif right_time.sum() == 0:\n print(\"Warning: Correct time not found in dataset, likely problem in file selection\")\n#Make subset\nsubset = np.logical_and(np.isfinite(sat_xr),right_time)",
"_____no_output_____"
],
[
"subset",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07c9405f07f31c5588cadf391759c1f873aa845 | 82,902 | ipynb | Jupyter Notebook | pattern-classification/stat_pattern_class/supervised/parametric/3_stat_superv_parametric.ipynb | gopala-kr/ds-notebooks | bc35430ecdd851f2ceab8f2437eec4d77cb59423 | [
"MIT"
] | 1 | 2021-12-13T15:41:48.000Z | 2021-12-13T15:41:48.000Z | pattern-classification/stat_pattern_class/supervised/parametric/3_stat_superv_parametric.ipynb | gopala-kr/ds-notebooks | bc35430ecdd851f2ceab8f2437eec4d77cb59423 | [
"MIT"
] | 15 | 2021-09-12T15:06:13.000Z | 2022-03-31T19:02:08.000Z | pattern-classification/stat_pattern_class/supervised/parametric/3_stat_superv_parametric.ipynb | gopala-kr/ds-notebooks | bc35430ecdd851f2ceab8f2437eec4d77cb59423 | [
"MIT"
] | 1 | 2022-01-29T00:37:52.000Z | 2022-01-29T00:37:52.000Z | 212.569231 | 26,625 | 0.885648 | [
[
[
"empty"
]
]
] | [
"empty"
] | [
[
"empty"
]
] |
d07c9dc28cc4831d94b3b3478e4adf640dd96d90 | 7,489 | ipynb | Jupyter Notebook | tensor__cpu/matplot/fft.ipynb | Zhang-O/small | bfb41b2267159bd5e408dba524713d3bc0b28074 | [
"MIT"
] | 1 | 2017-09-25T03:16:00.000Z | 2017-09-25T03:16:00.000Z | tensor__cpu/matplot/fft.ipynb | Zhang-O/small | bfb41b2267159bd5e408dba524713d3bc0b28074 | [
"MIT"
] | null | null | null | tensor__cpu/matplot/fft.ipynb | Zhang-O/small | bfb41b2267159bd5e408dba524713d3bc0b28074 | [
"MIT"
] | null | null | null | 21.644509 | 115 | 0.462412 | [
[
[
"import numpy as np\nfrom scipy.fftpack import fft\nimport matplotlib.pyplot as plt\n%matplotlib",
"Using matplotlib backend: Qt5Agg\n"
],
[
"x = np.linspace(0, 1, 1400)\ny = 7 * np.sin(2 * np.pi * 180 * x) + 2.8 * np.sin(2 * np.pi * 390 * x) + 5.1 * np.sin(2 * np.pi * 600 * x)",
"_____no_output_____"
],
[
"yy = fft(y)\nyy",
"_____no_output_____"
],
[
"yy.real",
"_____no_output_____"
],
[
"yf = abs(yy)\nyf",
"_____no_output_____"
],
[
"abs(1 + 1j)",
"_____no_output_____"
],
[
"plt.subplot(221)\nplt.plot(x[0: 180], y[0:180])\nplt.title('original wave')",
"_____no_output_____"
],
[
"yf1 = abs(yy) / len(x)\nyf2 = yf1[range(int(len(x) / 2))]\n\nxf = np.arange(len(x))\nxf2=xf[range(int(len(x) / 2))]\n\n",
"_____no_output_____"
],
[
"plt.subplot(222)\nplt.plot(xf,yf,'r')\nplt.title('FFT of Mixed wave(two sides frequency range)',fontsize=7,color='#7A378B') #注意这里的颜色可以查询颜色代码表\n\n\nplt.subplot(223)\nplt.plot(xf,yf1,'g')\nplt.title('FFT of Mixed wave(normalization)',fontsize=9,color='r')\n\n\nplt.subplot(224)\nplt.plot(xf2,yf2,'b')\nplt.title('FFT of Mixed wave)',fontsize=10,color='#F08080')",
"_____no_output_____"
],
[
"# http://blog.csdn.net/ouening/article/details/70339341 \n# python实现傅立叶级数展开\nx = np.mgrid[-10: 10.02: 0.02]\ndef fourier1():\n s = np.pi / 2\n print(s)\n\n for i in range(1,100,1):\n s0 = 2 / np.pi * (1 - (-1) ** i) / i ** 2 * np.cos(i * x) \n s = s + s0\n if len(s) == 2:\n print('hehe')\n print(s)\n# plt.plot(x[0:10], s[0:10], 'orange', linewidth=0.6)\n# plt.title('fourier1')\n# plt.show()\n# print(s[0:10])\nfourier1()",
"1.5707963267948966\n"
],
[
"from pylab import *\nfrom scipy.io import wavfile\nimport math\n\nsampFreq, snd = wavfile.read('440_sine.wav')\nprint(sampFreq)\nsnd = snd / (2.**15)\n# snd = snd\n\n# snd.shape\ns1 = snd[:, 0]\n\ntimeArray = arange(0, 5292.0, 1) #[0s, 1s], 5292个点\ntimeArray = timeArray / sampFreq #[0s, 0.114s]\ntimeArray = timeArray * 1000 #[0ms, 114ms]\n\nplt.subplot(3, 1, 1)\nplt.plot(timeArray, s1, color='k')\nylabel('Amplitude')\nxlabel('Time (ms)')\n\nn = len(s1)\np = fft(s1) #执行傅立叶变换\n\nnUniquePts = int(math.ceil((n + 1) / 2.0))\np = p[0: nUniquePts]\np = abs(p)\n\np = p / float(n) #除以采样点数,去除幅度对信号长度或采样频率的依赖\np = p**2 #求平方得到能量\n\n#乘2(详见技术手册)\n#奇nfft排除奈奎斯特点\nif n % 2 > 0: #fft点数为奇\n p[1:len(p)] = p[1:len(p)]*2\nelse: #fft点数为偶\n p[1:len(p)-1] = p[1:len(p)-1] * 2\n\nfreqArray = arange(0, nUniquePts, 1.0) * (sampFreq / n)\nplt.subplot(3, 1, 2)\nplt.plot(freqArray/1000, 10*log10(p), color='k')\nxlabel('Freqency (kHz)')\nylabel('Power (dB)')\n\nplt.subplot(3, 1, 3)\nplt.plot(freqArray/1000, p, color='k')\n\n",
"44100\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07ca88e21920fb620a8251a69221a24171f726f | 12,805 | ipynb | Jupyter Notebook | wholesale_decisionTree_classification.ipynb | Ethan-Jeong/test_machinelearning | 2f9d0c4dd554de03ce151db5a9ad11bf36a693e2 | [
"Apache-2.0"
] | null | null | null | wholesale_decisionTree_classification.ipynb | Ethan-Jeong/test_machinelearning | 2f9d0c4dd554de03ce151db5a9ad11bf36a693e2 | [
"Apache-2.0"
] | null | null | null | wholesale_decisionTree_classification.ipynb | Ethan-Jeong/test_machinelearning | 2f9d0c4dd554de03ce151db5a9ad11bf36a693e2 | [
"Apache-2.0"
] | null | null | null | 30.343602 | 268 | 0.35861 | [
[
[
"<a href=\"https://colab.research.google.com/github/Ethan-Jeong/test_machinelearning/blob/master/wholesale_decisionTree_classification.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"!ls",
"sample_data Wholesale_customers_data.csv\n"
],
[
"!ls -l # 자세히 보기",
"total 20\ndrwxr-xr-x 1 root root 4096 Jun 15 13:37 sample_data\n-rw-r--r-- 1 root root 15021 Jul 2 04:45 Wholesale_customers_data.csv\n"
],
[
"!pwd",
"/content\n"
],
[
"!ls -l ./sample_data # 디렉토리 내용물 자세히 보기",
"total 55504\n-rwxr-xr-x 1 root root 1697 Jan 1 2000 anscombe.json\n-rw-r--r-- 1 root root 301141 Jun 15 13:37 california_housing_test.csv\n-rw-r--r-- 1 root root 1706430 Jun 15 13:37 california_housing_train.csv\n-rw-r--r-- 1 root root 18289443 Jun 15 13:37 mnist_test.csv\n-rw-r--r-- 1 root root 36523880 Jun 15 13:37 mnist_train_small.csv\n-rwxr-xr-x 1 root root 930 Jan 1 2000 README.md\n"
],
[
"!ls -l ./Wholesale_customers_data.csv",
"-rw-r--r-- 1 root root 15021 Jul 2 04:45 ./Wholesale_customers_data.csv\n"
],
[
"import pandas as pd\n\ndf = pd.read_excel('./wholesale.xls') # y가 없는 데이터 불러와서 y 생성 후 붙이기\ndf.info()\ndf.head(5)\n",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 440 entries, 0 to 439\nData columns (total 10 columns):\n # Column Non-Null Count Dtype\n--- ------ -------------- -----\n 0 Unnamed: 0 440 non-null int64\n 1 Channel 440 non-null int64\n 2 Region 440 non-null int64\n 3 Fresh 440 non-null int64\n 4 Milk 440 non-null int64\n 5 Grocery 440 non-null int64\n 6 Frozen 440 non-null int64\n 7 Detergents_Paper 440 non-null int64\n 8 Delicassen 440 non-null int64\n 9 label 440 non-null int64\ndtypes: int64(10)\nmemory usage: 34.5 KB\n"
],
[
"Y = df['label']\nX = df.iloc[:,1:9]\nY.shape , X.shape",
"_____no_output_____"
],
[
"from sklearn.tree import DecisionTreeClassifier\ndtree = DecisionTreeClassifier()\ndtree.fit(X,Y)",
"_____no_output_____"
],
[
"dtree.score(X,Y)",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07ca97c0eef5ac752649c6072b47dc78a9f76f1 | 5,848 | ipynb | Jupyter Notebook | 2021_07_15/lab three.ipynb | kashaf874/lab_one | 386ae47e219e7d811c54b98bfc7eca7684ed95c4 | [
"Apache-2.0"
] | null | null | null | 2021_07_15/lab three.ipynb | kashaf874/lab_one | 386ae47e219e7d811c54b98bfc7eca7684ed95c4 | [
"Apache-2.0"
] | null | null | null | 2021_07_15/lab three.ipynb | kashaf874/lab_one | 386ae47e219e7d811c54b98bfc7eca7684ed95c4 | [
"Apache-2.0"
] | null | null | null | 23.869388 | 104 | 0.475034 | [
[
[
"class Person:\n def __init__(self, fname, lname):\n self.firstname = fname\n self.lastname = lname\n\n def printname(self):\n print(self.firstname, self.lastname)\n\nclass Student(Person):\n def __init__(self, fname, lname, year):\n super().__init__(fname, lname)\n self.graduationyear = year\n\nx = Student(\"naz\", \"kashaf\", 2021)\nprint(x.graduationyear)",
"2021\n"
]
],
[
[
"# Question break down\n\nin this code we will create a child class in which access parent class\n\n# creat a class \"student\"\n\nfollowing parameter\n\"i. Name\nii. Batch No\niii. Address\"\n\n## creat new child class \"DevnationStudents\"\n\n1.in which we will call parent class const\n save \"course\"\"marks\" list as empty\n2.Add only those courses in the list which are not already in the\nlist\n3.add dict\n4.Marks list should not contains duplicate entries of same\ncourses\n",
"_____no_output_____"
]
],
[
[
"class Student:\n def __init__(self,name,batch_no,address):\n self.name = name\n self.batch_no = batch_no\n self.address = address\n \n def set_name(self):\n return self.name\n def set_batch_no(self):\n return self.batch_no\n def set_adress(self):\n return self.address\n\n \n def get_name(self):\n return self.name\n def get_batch_no(self):\n return self.batch_no\n def get_adress(self):\n return self.adress\n\nclass Devnationstudents(Student):\n def __init__(self,name,batch_no, address, courses=[], marks=[]):\n super().__init__(name,batch_no,address)\n self.courses = courses\n self.marks = marks\n self.marks = []\n \n def set_courses(self, courses):\n for i in courses:\n if i not in self.courses:\n self.courses.append(i)\n\n def check_point(self, point):\n dup = False\n for default_y in self.marks:\n if default_y[\"courses\"] != point:\n dup = True\n return dup\n \n def set_marks(self, marks):\n \n for new_x in marks:\n if new_x[\"courses\"] in self.courses:\n if (self.check_point(new_x[\"course\"])):\n continue\n else:\n self.marks.append(new_x)\n \n\n def get_courses(self):\n return self.courses\n def get_marks(self):\n return self.marks ",
"_____no_output_____"
],
[
"x = Devnationstudents(\"kashaf\", \"naz\", \"swl\")\nprint(x)",
"<__main__.Devnationstudents object at 0x000002416EB1A850>\n"
],
[
"out_put = Devnationstudents(\"kashaf\", 4, \"Swl\", ['data science','full stack'], [30, 40 , 20])",
"_____no_output_____"
],
[
"out_put.get_courses()",
"_____no_output_____"
],
[
"out_put.set_courses([\"Python\", \"R\" ,'full stack'])",
"_____no_output_____"
],
[
"out_put.get_courses()",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07cadc3f5a6ef9c04831999f962110378d2fcaa | 76,191 | ipynb | Jupyter Notebook | tfx_addons/feature_selection/nb/Example.ipynb | vulkomilev/tfx-addons | df5cd1e6217b5a40425a643456a93b04e483a73d | [
"Apache-2.0"
] | 70 | 2021-03-16T17:35:04.000Z | 2022-03-28T08:20:07.000Z | tfx_addons/feature_selection/nb/Example.ipynb | vulkomilev/tfx-addons | df5cd1e6217b5a40425a643456a93b04e483a73d | [
"Apache-2.0"
] | 104 | 2021-04-04T01:10:03.000Z | 2022-03-31T14:25:17.000Z | tfx_addons/feature_selection/nb/Example.ipynb | vulkomilev/tfx-addons | df5cd1e6217b5a40425a643456a93b04e483a73d | [
"Apache-2.0"
] | 38 | 2021-03-19T17:53:08.000Z | 2022-03-31T14:24:13.000Z | 46.260474 | 2,348 | 0.512698 | [
[
[
"from component import FeatureSelection\nfrom tfx.components import CsvExampleGen\nfrom tfx.orchestration.experimental.interactive.interactive_context import InteractiveContext",
"_____no_output_____"
],
[
"context = InteractiveContext()",
"WARNING:absl:InteractiveContext pipeline_root argument not provided: using temporary directory /var/folders/55/gk05zk9j6596lk4gv9p8sk680000gn/T/tfx-interactive-2021-11-15T00_50_13.753592-we9hrzdh as root for pipeline outputs.\nWARNING:absl:InteractiveContext metadata_connection_config not provided: using SQLite ML Metadata database at /var/folders/55/gk05zk9j6596lk4gv9p8sk680000gn/T/tfx-interactive-2021-11-15T00_50_13.753592-we9hrzdh/metadata.sqlite.\n"
],
[
"import urllib.request\nimport tempfile\nimport os\n\n# getting data and setup CsvExampleGen\nDATA_ROOT = tempfile.mkdtemp(prefix='tfx-data') # Create a temporary directory.\n_data_url = 'https://raw.githubusercontent.com/tensorflow/tfx/master/tfx/examples/penguin/data/labelled/penguins_processed.csv'\n_data_filepath = os.path.join(DATA_ROOT, \"data.csv\")\nurllib.request.urlretrieve(_data_url, _data_filepath)",
"_____no_output_____"
],
[
"example_gen = CsvExampleGen(input_base=DATA_ROOT)\ncontext.run(example_gen)",
"WARNING:apache_beam.runners.interactive.interactive_environment:Dependencies required for Interactive Beam PCollection visualization are not available, please use: `pip install apache-beam[interactive]` to install necessary dependencies to enable all data visualization features.\n"
],
[
"# give path to the module file\nfeature_selector = FeatureSelection(orig_examples = example_gen.outputs['examples'],\n module_file=\"module_file\")",
"_____no_output_____"
],
[
"context.run(feature_selector)",
"2021-11-15 00:50:15.530296: I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations: AVX2 FMA\nTo enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.\n2021-11-15 00:50:15.563654: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:176] None of the MLIR Optimization Passes are enabled (registered 2)\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07cb7268e9ceaea5ade80a3bb3864ffdd60dc96 | 126,484 | ipynb | Jupyter Notebook | Regression/Gradient Boosting Machine/GradientBoostingRegressor_MinMaxScaler.ipynb | surya2365/ds-seed | 74ef58479333fed95522f7b691f1209f7d70fc95 | [
"Apache-2.0"
] | null | null | null | Regression/Gradient Boosting Machine/GradientBoostingRegressor_MinMaxScaler.ipynb | surya2365/ds-seed | 74ef58479333fed95522f7b691f1209f7d70fc95 | [
"Apache-2.0"
] | null | null | null | Regression/Gradient Boosting Machine/GradientBoostingRegressor_MinMaxScaler.ipynb | surya2365/ds-seed | 74ef58479333fed95522f7b691f1209f7d70fc95 | [
"Apache-2.0"
] | null | null | null | 159.299748 | 73,963 | 0.862915 | [
[
[
"# GradientBoostingRegressor with MinMaxScaler\r\n\r\n",
"_____no_output_____"
],
[
"This Code template is for regression analysis using a simple GradientBoostingRegressor based on the Gradient Boosting Ensemble Learning Technique using MinMaxScalerfor Feature Rescaling.",
"_____no_output_____"
],
[
"### Required Packages",
"_____no_output_____"
]
],
[
[
"import warnings\r\nimport numpy as np \r\nimport pandas as pd \r\nimport matplotlib.pyplot as plt \r\nimport seaborn as se \r\nfrom sklearn.preprocessing import MinMaxScaler\r\nfrom sklearn.model_selection import train_test_split\r\nfrom sklearn.ensemble import GradientBoostingRegressor \r\nfrom sklearn.metrics import r2_score, mean_absolute_error, mean_squared_error \r\nwarnings.filterwarnings('ignore')",
"_____no_output_____"
]
],
[
[
"### Initialization\n\nFilepath of CSV file",
"_____no_output_____"
]
],
[
[
"#filepath\r\nfile_path = \"\"",
"_____no_output_____"
]
],
[
[
"List of features which are required for model training .",
"_____no_output_____"
]
],
[
[
"#x_values\r\nfeatures=[]",
"_____no_output_____"
]
],
[
[
"Target feature for prediction.",
"_____no_output_____"
]
],
[
[
"#y_value\r\ntarget = ''",
"_____no_output_____"
]
],
[
[
"### Data Fetching\n\nPandas is an open-source, BSD-licensed library providing high-performance, easy-to-use data manipulation and data analysis tools.\n\nWe will use panda's library to read the CSV file using its storage path.And we use the head function to display the initial row or entry.",
"_____no_output_____"
]
],
[
[
"df=pd.read_csv(file_path)\r\ndf.head()",
"_____no_output_____"
]
],
[
[
"### Feature Selections\n\nIt is the process of reducing the number of input variables when developing a predictive model. Used to reduce the number of input variables to both reduce the computational cost of modelling and, in some cases, to improve the performance of the model.\n\nWe will assign all the required input features to X and target/outcome to Y.",
"_____no_output_____"
]
],
[
[
"X = df[features]\r\nY = df[target]",
"_____no_output_____"
]
],
[
[
"### Data Preprocessing\n\nSince the majority of the machine learning models in the Sklearn library doesn't handle string category data and Null value, we have to explicitly remove or replace null values. The below snippet have functions, which removes the null value if any exists. And convert the string classes data in the datasets by encoding them to integer classes.\n",
"_____no_output_____"
]
],
[
[
"def NullClearner(df):\r\n if(isinstance(df, pd.Series) and (df.dtype in [\"float64\",\"int64\"])):\r\n df.fillna(df.mean(),inplace=True)\r\n return df\r\n elif(isinstance(df, pd.Series)):\r\n df.fillna(df.mode()[0],inplace=True)\r\n return df\r\n else:return df\r\ndef EncodeX(df):\r\n return pd.get_dummies(df)",
"_____no_output_____"
]
],
[
[
"Calling preprocessing functions on the feature and target set.",
"_____no_output_____"
]
],
[
[
"x=X.columns.to_list()\r\nfor i in x:\r\n X[i]=NullClearner(X[i])\r\nX=EncodeX(X)\r\nY=NullClearner(Y)\r\nX.head()",
"_____no_output_____"
]
],
[
[
"#### Correlation Map\n\nIn order to check the correlation between the features, we will plot a correlation matrix. It is effective in summarizing a large amount of data where the goal is to see patterns.",
"_____no_output_____"
]
],
[
[
"f,ax = plt.subplots(figsize=(18, 18))\r\nmatrix = np.triu(X.corr())\r\nse.heatmap(X.corr(), annot=True, linewidths=.5, fmt= '.1f',ax=ax, mask=matrix)\r\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Data Splitting\n\nThe train-test split is a procedure for evaluating the performance of an algorithm. The procedure involves taking a dataset and dividing it into two subsets. The first subset is utilized to fit/train the model. The second subset is used for prediction. The main motive is to estimate the performance of the model on new data.",
"_____no_output_____"
]
],
[
[
"X_train, X_test, y_train, y_test = train_test_split(X, Y, test_size = 0.2, random_state = 123)#performing datasplitting",
"_____no_output_____"
]
],
[
[
"## Data Rescaling\nMinMaxScaler subtracts the minimum value in the feature and then divides by the range, where range is the difference between the original maximum and original minimum.\n\nWe will fit an object of MinMaxScaler to train data then transform the same data via fit_transform(X_train) method, following which we will transform test data via transform(X_test) method.",
"_____no_output_____"
]
],
[
[
"minmax_scaler = MinMaxScaler()\r\nX_train = minmax_scaler.fit_transform(X_train)\r\nX_test = minmax_scaler.transform(X_test)",
"_____no_output_____"
]
],
[
[
"### Model\n\nGradient Boosting builds an additive model in a forward stage-wise fashion; it allows for the optimization of arbitrary differentiable loss functions. In each stage a regression tree is fit on the negative gradient of the given loss function.\n\n#### Model Tuning Parameters\n\n 1. loss : {‘ls’, ‘lad’, ‘huber’, ‘quantile’}, default=’ls’\n> Loss function to be optimized. ‘ls’ refers to least squares regression. ‘lad’ (least absolute deviation) is a highly robust loss function solely based on order information of the input variables. ‘huber’ is a combination of the two. ‘quantile’ allows quantile regression (use `alpha` to specify the quantile).\n\n 2. learning_ratefloat, default=0.1\n> Learning rate shrinks the contribution of each tree by learning_rate. There is a trade-off between learning_rate and n_estimators.\n\n 3. n_estimators : int, default=100\n> The number of trees in the forest.\n\n 4. criterion : {‘friedman_mse’, ‘mse’, ‘mae’}, default=’friedman_mse’\n> The function to measure the quality of a split. Supported criteria are ‘friedman_mse’ for the mean squared error with improvement score by Friedman, ‘mse’ for mean squared error, and ‘mae’ for the mean absolute error. The default value of ‘friedman_mse’ is generally the best as it can provide a better approximation in some cases.\n\n 5. max_depth : int, default=3\n> The maximum depth of the individual regression estimators. The maximum depth limits the number of nodes in the tree. Tune this parameter for best performance; the best value depends on the interaction of the input variables.\n\n 6. max_features : {‘auto’, ‘sqrt’, ‘log2’}, int or float, default=None\n> The number of features to consider when looking for the best split: \n\n 7. 
random_state : int, RandomState instance or None, default=None\n> Controls both the randomness of the bootstrapping of the samples used when building trees (if <code>bootstrap=True</code>) and the sampling of the features to consider when looking for the best split at each node (if `max_features < n_features`).\n\n 8. verbose : int, default=0\n> Controls the verbosity when fitting and predicting.\n \n 9. n_iter_no_change : int, default=None\n> <code>n_iter_no_change</code> is used to decide if early stopping will be used to terminate training when validation score is not improving. By default it is set to None to disable early stopping. If set to a number, it will set aside <code>validation_fraction</code> size of the training data as validation and terminate training when validation score is not improving in all of the previous <code>n_iter_no_change</code> numbers of iterations. The split is stratified.\n \n 10. tol : float, default=1e-4\n> Tolerance for the early stopping. When the loss is not improving by at least tol for <code>n_iter_no_change</code> iterations (if set to a number), the training stops.",
"_____no_output_____"
]
],
[
[
"# Build Model here\r\nmodel = GradientBoostingRegressor(random_state = 123)\r\nmodel.fit(X_train, y_train)",
"_____no_output_____"
]
],
[
[
"#### Model Accuracy\n\nWe will use the trained model to make a prediction on the test set.Then use the predicted value for measuring the accuracy of our model.\n\n> **score**: The **score** function returns the coefficient of determination <code>R<sup>2</sup></code> of the prediction.",
"_____no_output_____"
]
],
[
[
"print(\"Accuracy score {:.2f} %\\n\".format(model.score(X_test,y_test)*100))",
"Accuracy score 94.47 %\n\n"
]
],
[
[
 **r2_score**">
"> **r2_score**: The **r2_score** function computes the percentage of variability in the target that is explained by our model. \n\n> **mae**: The **mean absolute error** function calculates the total error (the average absolute distance between the real data and the predicted data) of our model. \n\n> **mse**: The **mean squared error** function squares the errors (penalizing the model for large errors). ",
"_____no_output_____"
]
],
[
[
"y_pred=model.predict(X_test)\r\nprint(\"R2 Score: {:.2f} %\".format(r2_score(y_test,y_pred)*100))\r\nprint(\"Mean Absolute Error {:.2f}\".format(mean_absolute_error(y_test,y_pred)))\r\nprint(\"Mean Squared Error {:.2f}\".format(mean_squared_error(y_test,y_pred)))",
"R2 Score: 94.47 %\nMean Absolute Error 3.02\nMean Squared Error 16.01\n"
]
],
[
[
"#### Feature Importances\nThe Feature importance refers to techniques that assign a score to features based on how useful they are for making the prediction.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(8,6))\r\nn_features = len(X.columns)\r\nplt.barh(range(n_features), model.feature_importances_, align='center')\r\nplt.yticks(np.arange(n_features), X.columns)\r\nplt.xlabel(\"Feature importance\")\r\nplt.ylabel(\"Feature\")\r\nplt.ylim(-1, n_features)",
"_____no_output_____"
]
],
[
[
"#### Prediction Plot\n\nTo compare the model's predictions against the ground truth, we plot the actual target values for the first 20 test records and overlay the corresponding predictions, with the record number on the x-axis and the target value on the y-axis.",
"_____no_output_____"
]
],
[
[
"plt.figure(figsize=(14,10))\r\nplt.plot(range(20),y_test[0:20], color = \"blue\")\r\nplt.plot(range(20),model.predict(X_test[0:20]), color = \"red\")\r\nplt.legend([\"Actual\",\"prediction\"]) \r\nplt.title(\"Predicted vs True Value\")\r\nplt.xlabel(\"Record number\")\r\nplt.ylabel(target)\r\nplt.show()",
"_____no_output_____"
]
],
[
[
"#### Creator: Ganapathi Thota , Github: [Profile](https://github.com/Shikiz)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07cc1429350e6847877e2e7df4031771e0e9334 | 34,644 | ipynb | Jupyter Notebook | Project_Plagiarism_Detection/.ipynb_checkpoints/1_Data_Exploration-checkpoint.ipynb | iahsanujunda/ml_sagemaker_studies | c31a09a75900df40255dfaf4af8ddc4ff51dc9e1 | [
"MIT"
] | null | null | null | Project_Plagiarism_Detection/.ipynb_checkpoints/1_Data_Exploration-checkpoint.ipynb | iahsanujunda/ml_sagemaker_studies | c31a09a75900df40255dfaf4af8ddc4ff51dc9e1 | [
"MIT"
] | null | null | null | Project_Plagiarism_Detection/.ipynb_checkpoints/1_Data_Exploration-checkpoint.ipynb | iahsanujunda/ml_sagemaker_studies | c31a09a75900df40255dfaf4af8ddc4ff51dc9e1 | [
"MIT"
] | null | null | null | 36.352571 | 456 | 0.485163 | [
[
[
"# Plagiarism Text Data\n\nIn this project, you will be tasked with building a plagiarism detector that examines a text file and performs binary classification; labeling that file as either plagiarized or not, depending on how similar the text file is when compared to a provided source text. \n\nThe first step in working with any dataset is loading the data in and noting what information is included in the dataset. This is an important step in eventually working with this data, and knowing what kinds of features you have to work with as you transform and group the data!\n\nSo, this notebook is all about exploring the data and noting patterns about the features you are given and the distribution of data. \n\n> There are no exercises or questions in this notebook; it is only meant for exploration. This notebook will not be required in your final project submission.\n\n---",
"_____no_output_____"
],
[
"## Read in the Data\n\nThe cell below will download the necessary data and extract the files into the folder `data/`.\n\nThis data is a slightly modified version of a dataset created by Paul Clough (Information Studies) and Mark Stevenson (Computer Science), at the University of Sheffield. You can read all about the data collection and corpus, at [their university webpage](https://ir.shef.ac.uk/cloughie/resources/plagiarism_corpus.html). \n\n> **Citation for data**: Clough, P. and Stevenson, M. Developing A Corpus of Plagiarised Short Answers, Language Resources and Evaluation: Special Issue on Plagiarism and Authorship Analysis, In Press. [Download]",
"_____no_output_____"
]
],
[
[
"!wget https://s3.amazonaws.com/video.udacity-data.com/topher/2019/January/5c4147f9_data/data.zip\n!unzip data",
"--2020-06-07 09:58:24-- https://s3.amazonaws.com/video.udacity-data.com/topher/2019/January/5c4147f9_data/data.zip\nResolving s3.amazonaws.com (s3.amazonaws.com)... 52.216.144.206\nConnecting to s3.amazonaws.com (s3.amazonaws.com)|52.216.144.206|:443... connected.\nHTTP request sent, awaiting response... 200 OK\nLength: 113826 (111K) [application/zip]\nSaving to: ‘data.zip’\n\ndata.zip 100%[===================>] 111.16K --.-KB/s in 0.04s \n\n2020-06-07 09:58:24 (2.97 MB/s) - ‘data.zip’ saved [113826/113826]\n\nArchive: data.zip\n creating: data/\n inflating: data/.DS_Store \n creating: __MACOSX/\n creating: __MACOSX/data/\n inflating: __MACOSX/data/._.DS_Store \n inflating: data/file_information.csv \n inflating: __MACOSX/data/._file_information.csv \n inflating: data/g0pA_taska.txt \n inflating: __MACOSX/data/._g0pA_taska.txt \n inflating: data/g0pA_taskb.txt \n inflating: __MACOSX/data/._g0pA_taskb.txt \n inflating: data/g0pA_taskc.txt \n inflating: __MACOSX/data/._g0pA_taskc.txt \n inflating: data/g0pA_taskd.txt \n inflating: __MACOSX/data/._g0pA_taskd.txt \n inflating: data/g0pA_taske.txt \n inflating: __MACOSX/data/._g0pA_taske.txt \n inflating: data/g0pB_taska.txt \n inflating: __MACOSX/data/._g0pB_taska.txt \n inflating: data/g0pB_taskb.txt \n inflating: __MACOSX/data/._g0pB_taskb.txt \n inflating: data/g0pB_taskc.txt \n inflating: __MACOSX/data/._g0pB_taskc.txt \n inflating: data/g0pB_taskd.txt \n inflating: __MACOSX/data/._g0pB_taskd.txt \n inflating: data/g0pB_taske.txt \n inflating: __MACOSX/data/._g0pB_taske.txt \n inflating: data/g0pC_taska.txt \n inflating: __MACOSX/data/._g0pC_taska.txt \n inflating: data/g0pC_taskb.txt \n inflating: __MACOSX/data/._g0pC_taskb.txt \n inflating: data/g0pC_taskc.txt \n inflating: __MACOSX/data/._g0pC_taskc.txt \n inflating: data/g0pC_taskd.txt \n inflating: __MACOSX/data/._g0pC_taskd.txt \n inflating: data/g0pC_taske.txt \n inflating: __MACOSX/data/._g0pC_taske.txt \n inflating: data/g0pD_taska.txt \n inflating: 
__MACOSX/data/._g0pD_taska.txt \n inflating: data/g0pD_taskb.txt \n inflating: __MACOSX/data/._g0pD_taskb.txt \n inflating: data/g0pD_taskc.txt \n inflating: __MACOSX/data/._g0pD_taskc.txt \n inflating: data/g0pD_taskd.txt \n inflating: __MACOSX/data/._g0pD_taskd.txt \n inflating: data/g0pD_taske.txt \n inflating: __MACOSX/data/._g0pD_taske.txt \n inflating: data/g0pE_taska.txt \n inflating: __MACOSX/data/._g0pE_taska.txt \n inflating: data/g0pE_taskb.txt \n inflating: __MACOSX/data/._g0pE_taskb.txt \n inflating: data/g0pE_taskc.txt \n inflating: __MACOSX/data/._g0pE_taskc.txt \n inflating: data/g0pE_taskd.txt \n inflating: __MACOSX/data/._g0pE_taskd.txt \n inflating: data/g0pE_taske.txt \n inflating: __MACOSX/data/._g0pE_taske.txt \n inflating: data/g1pA_taska.txt \n inflating: __MACOSX/data/._g1pA_taska.txt \n inflating: data/g1pA_taskb.txt \n inflating: __MACOSX/data/._g1pA_taskb.txt \n inflating: data/g1pA_taskc.txt \n inflating: __MACOSX/data/._g1pA_taskc.txt \n inflating: data/g1pA_taskd.txt \n inflating: __MACOSX/data/._g1pA_taskd.txt \n inflating: data/g1pA_taske.txt \n inflating: __MACOSX/data/._g1pA_taske.txt \n inflating: data/g1pB_taska.txt \n inflating: __MACOSX/data/._g1pB_taska.txt \n inflating: data/g1pB_taskb.txt \n inflating: __MACOSX/data/._g1pB_taskb.txt \n inflating: data/g1pB_taskc.txt \n inflating: __MACOSX/data/._g1pB_taskc.txt \n inflating: data/g1pB_taskd.txt \n inflating: __MACOSX/data/._g1pB_taskd.txt \n inflating: data/g1pB_taske.txt \n inflating: __MACOSX/data/._g1pB_taske.txt \n inflating: data/g1pD_taska.txt \n inflating: __MACOSX/data/._g1pD_taska.txt \n inflating: data/g1pD_taskb.txt \n inflating: __MACOSX/data/._g1pD_taskb.txt \n inflating: data/g1pD_taskc.txt \n inflating: __MACOSX/data/._g1pD_taskc.txt \n inflating: data/g1pD_taskd.txt \n inflating: __MACOSX/data/._g1pD_taskd.txt \n inflating: data/g1pD_taske.txt \n inflating: __MACOSX/data/._g1pD_taske.txt \n inflating: data/g2pA_taska.txt \n inflating: 
__MACOSX/data/._g2pA_taska.txt \n inflating: data/g2pA_taskb.txt \n inflating: __MACOSX/data/._g2pA_taskb.txt \n inflating: data/g2pA_taskc.txt \n inflating: __MACOSX/data/._g2pA_taskc.txt \n inflating: data/g2pA_taskd.txt \n inflating: __MACOSX/data/._g2pA_taskd.txt \n inflating: data/g2pA_taske.txt \n inflating: __MACOSX/data/._g2pA_taske.txt \n inflating: data/g2pB_taska.txt \n inflating: __MACOSX/data/._g2pB_taska.txt \n inflating: data/g2pB_taskb.txt \n inflating: __MACOSX/data/._g2pB_taskb.txt \n inflating: data/g2pB_taskc.txt \n inflating: __MACOSX/data/._g2pB_taskc.txt \n inflating: data/g2pB_taskd.txt \n inflating: __MACOSX/data/._g2pB_taskd.txt \n inflating: data/g2pB_taske.txt \n inflating: __MACOSX/data/._g2pB_taske.txt \n inflating: data/g2pC_taska.txt \n inflating: __MACOSX/data/._g2pC_taska.txt \n inflating: data/g2pC_taskb.txt \n inflating: __MACOSX/data/._g2pC_taskb.txt \n inflating: data/g2pC_taskc.txt \n inflating: __MACOSX/data/._g2pC_taskc.txt \n inflating: data/g2pC_taskd.txt \n inflating: __MACOSX/data/._g2pC_taskd.txt \n inflating: data/g2pC_taske.txt \n inflating: __MACOSX/data/._g2pC_taske.txt \n inflating: data/g2pE_taska.txt \n inflating: __MACOSX/data/._g2pE_taska.txt \n inflating: data/g2pE_taskb.txt \n inflating: __MACOSX/data/._g2pE_taskb.txt \n inflating: data/g2pE_taskc.txt \n inflating: __MACOSX/data/._g2pE_taskc.txt \n inflating: data/g2pE_taskd.txt \n inflating: __MACOSX/data/._g2pE_taskd.txt \n inflating: data/g2pE_taske.txt \n inflating: __MACOSX/data/._g2pE_taske.txt \n inflating: data/g3pA_taska.txt \n inflating: __MACOSX/data/._g3pA_taska.txt \n inflating: data/g3pA_taskb.txt \n inflating: __MACOSX/data/._g3pA_taskb.txt \n inflating: data/g3pA_taskc.txt \n inflating: __MACOSX/data/._g3pA_taskc.txt \n inflating: data/g3pA_taskd.txt \n inflating: __MACOSX/data/._g3pA_taskd.txt \n inflating: data/g3pA_taske.txt \n inflating: __MACOSX/data/._g3pA_taske.txt \n inflating: data/g3pB_taska.txt \n inflating: 
__MACOSX/data/._g3pB_taska.txt \n inflating: data/g3pB_taskb.txt \n inflating: __MACOSX/data/._g3pB_taskb.txt \n inflating: data/g3pB_taskc.txt \n inflating: __MACOSX/data/._g3pB_taskc.txt \n inflating: data/g3pB_taskd.txt \n inflating: __MACOSX/data/._g3pB_taskd.txt \n inflating: data/g3pB_taske.txt \n inflating: __MACOSX/data/._g3pB_taske.txt \n inflating: data/g3pC_taska.txt \n inflating: __MACOSX/data/._g3pC_taska.txt \n inflating: data/g3pC_taskb.txt \n inflating: __MACOSX/data/._g3pC_taskb.txt \n inflating: data/g3pC_taskc.txt \n inflating: __MACOSX/data/._g3pC_taskc.txt \n inflating: data/g3pC_taskd.txt \n inflating: __MACOSX/data/._g3pC_taskd.txt \n inflating: data/g3pC_taske.txt \n inflating: __MACOSX/data/._g3pC_taske.txt \n inflating: data/g4pB_taska.txt \n inflating: __MACOSX/data/._g4pB_taska.txt \n inflating: data/g4pB_taskb.txt \n inflating: __MACOSX/data/._g4pB_taskb.txt \n inflating: data/g4pB_taskc.txt \n inflating: __MACOSX/data/._g4pB_taskc.txt \n inflating: data/g4pB_taskd.txt \n inflating: __MACOSX/data/._g4pB_taskd.txt \n inflating: data/g4pB_taske.txt \n inflating: __MACOSX/data/._g4pB_taske.txt \n inflating: data/g4pC_taska.txt \n inflating: __MACOSX/data/._g4pC_taska.txt \n inflating: data/g4pC_taskb.txt \n inflating: __MACOSX/data/._g4pC_taskb.txt \n inflating: data/g4pC_taskc.txt \n inflating: __MACOSX/data/._g4pC_taskc.txt \n inflating: data/g4pC_taskd.txt \n inflating: __MACOSX/data/._g4pC_taskd.txt \n inflating: data/g4pC_taske.txt \n inflating: __MACOSX/data/._g4pC_taske.txt \n inflating: data/g4pD_taska.txt \n inflating: __MACOSX/data/._g4pD_taska.txt \n inflating: data/g4pD_taskb.txt \n inflating: __MACOSX/data/._g4pD_taskb.txt \n inflating: data/g4pD_taskc.txt \n inflating: __MACOSX/data/._g4pD_taskc.txt \n inflating: data/g4pD_taskd.txt \n inflating: __MACOSX/data/._g4pD_taskd.txt \n inflating: data/g4pD_taske.txt \n inflating: __MACOSX/data/._g4pD_taske.txt \n inflating: data/g4pE_taska.txt \n inflating: 
__MACOSX/data/._g4pE_taska.txt \n inflating: data/g4pE_taskb.txt \n inflating: __MACOSX/data/._g4pE_taskb.txt \n inflating: data/g4pE_taskc.txt \n inflating: __MACOSX/data/._g4pE_taskc.txt \n inflating: data/g4pE_taskd.txt \n inflating: __MACOSX/data/._g4pE_taskd.txt \n inflating: data/g4pE_taske.txt \n inflating: __MACOSX/data/._g4pE_taske.txt \n inflating: data/orig_taska.txt \n inflating: __MACOSX/data/._orig_taska.txt \n inflating: data/orig_taskb.txt \n inflating: data/orig_taskc.txt \n inflating: __MACOSX/data/._orig_taskc.txt \n inflating: data/orig_taskd.txt \n inflating: __MACOSX/data/._orig_taskd.txt \n inflating: data/orig_taske.txt \n inflating: __MACOSX/data/._orig_taske.txt \n inflating: data/test_info.csv \n inflating: __MACOSX/data/._test_info.csv \n inflating: __MACOSX/._data \n"
],
[
"# import libraries\nimport pandas as pd\nimport numpy as np\nimport os",
"_____no_output_____"
]
],
[
[
"This plagiarism dataset is made of multiple text files; each of these files has characteristics that are summarized in a `.csv` file named `file_information.csv`, which we can read in using `pandas`.",
"_____no_output_____"
]
],
[
[
"csv_file = 'data/file_information.csv'\nplagiarism_df = pd.read_csv(csv_file)\n\n# print out the first few rows of data info\nplagiarism_df.head(10)",
"_____no_output_____"
]
],
[
[
"## Types of Plagiarism\n\nEach text file is associated with one **Task** (task A-E) and one **Category** of plagiarism, which you can see in the above DataFrame.\n\n### Five task types, A-E\n\nEach text file contains an answer to one short question; these questions are labeled as tasks A-E.\n* Each task, A-E, is about a topic that might be included in the Computer Science curriculum that was created by the authors of this dataset. \n * For example, Task A asks the question: \"What is inheritance in object oriented programming?\"\n\n### Four categories of plagiarism \n\nEach text file has an associated plagiarism label/category:\n\n1. `cut`: An answer is plagiarized; it is copy-pasted directly from the relevant Wikipedia source text.\n2. `light`: An answer is plagiarized; it is based on the Wikipedia source text and includes some copying and paraphrasing.\n3. `heavy`: An answer is plagiarized; it is based on the Wikipedia source text but expressed using different words and structure. Since this doesn't copy directly from a source text, this will likely be the most challenging kind of plagiarism to detect.\n4. `non`: An answer is not plagiarized; the Wikipedia source text is not used to create this answer.\n5. `orig`: This is a specific category for the original, Wikipedia source text. We will use these files only for comparison purposes.\n\n> So, out of the submitted files, the only category that does not contain any plagiarism is `non`.\n\nIn the next cell, print out some statistics about the data.",
"_____no_output_____"
]
],
[
[
"# print out some stats about the data\nprint('Number of files: ', plagiarism_df.shape[0]) # .shape[0] gives the rows \n# .unique() gives unique items in a specified column\nprint('Number of unique tasks/question types (A-E): ', (len(plagiarism_df['Task'].unique())))\nprint('Unique plagiarism categories: ', (plagiarism_df['Category'].unique()))",
"Number of files: 100\nNumber of unique tasks/question types (A-E): 5\nUnique plagiarism categories: ['non' 'cut' 'light' 'heavy' 'orig']\n"
]
],
[
[
"You should see the number of text files in the dataset as well as some characteristics about the `Task` and `Category` columns. **Note that the file count of 100 *includes* the 5 _original_ wikipedia files for tasks A-E.** If you take a look at the files in the `data` directory, you'll notice that the original, source texts start with the filename `orig_` as opposed to `g` for \"group.\" \n\n> So, in total there are 100 files, 95 of which are answers (submitted by people) and 5 of which are the original, Wikipedia source texts.\n\nYour end goal will be to use this information to classify any given answer text into one of two categories, plagiarized or not-plagiarized.",
"_____no_output_____"
],
[
"### Distribution of Data\n\nNext, let's look at the distribution of data. In this course, we've talked about traits like class imbalance that can inform how you develop an algorithm. So, here, we'll ask: **How evenly is our data distributed among different tasks and plagiarism levels?**\n\nBelow, you should notice two things:\n* Our dataset is quite small, especially with respect to examples of varying plagiarism levels.\n* The data is distributed fairly evenly across task and plagiarism types.",
"_____no_output_____"
]
],
[
[
"# Show counts by different tasks and amounts of plagiarism\n\n# group and count by task\ncounts_per_task=plagiarism_df.groupby(['Task']).size().reset_index(name=\"Counts\")\nprint(\"\\nTask:\")\ndisplay(counts_per_task)\n\n# group by plagiarism level\ncounts_per_category=plagiarism_df.groupby(['Category']).size().reset_index(name=\"Counts\")\nprint(\"\\nPlagiarism Levels:\")\ndisplay(counts_per_category)\n\n# group by task AND plagiarism level\ncounts_task_and_plagiarism=plagiarism_df.groupby(['Task', 'Category']).size().reset_index(name=\"Counts\")\nprint(\"\\nTask & Plagiarism Level Combos :\")\ndisplay(counts_task_and_plagiarism)",
"\nTask:\n"
]
],
[
[
"It may also be helpful to look at this last DataFrame, graphically.\n\nBelow, you can see that the counts follow a pattern broken down by task. Each task has one source text (original) and the highest number on `non` plagiarized cases.",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\n%matplotlib inline\n\n# counts\ngroup = ['Task', 'Category']\ncounts = plagiarism_df.groupby(group).size().reset_index(name=\"Counts\")\n\nplt.figure(figsize=(8,5))\nplt.bar(range(len(counts)), counts['Counts'], color = 'blue')",
"_____no_output_____"
]
],
[
[
"## Up Next\n\nThis notebook is just about data loading and exploration, and you do not need to include it in your final project submission. \n\nIn the next few notebooks, you'll use this data to train a complete plagiarism classifier. You'll be tasked with extracting meaningful features from the text data, reading in answers to different tasks and comparing them to the original Wikipedia source text. You'll engineer similarity features that will help identify cases of plagiarism. Then, you'll use these features to train and deploy a classification model in a SageMaker notebook instance. ",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07cd651b1b059e976774a791ddfea53e829cc24 | 3,770 | ipynb | Jupyter Notebook | Binary Search/1015/1337. The K Weakest Rows in a Matrix.ipynb | YuHe0108/Leetcode | 90d904dde125dd35ee256a7f383961786f1ada5d | [
"Apache-2.0"
] | 1 | 2020-08-05T11:47:47.000Z | 2020-08-05T11:47:47.000Z | Binary Search/1015/1337. The K Weakest Rows in a Matrix.ipynb | YuHe0108/LeetCode | b9e5de69b4e4d794aff89497624f558343e362ad | [
"Apache-2.0"
] | null | null | null | Binary Search/1015/1337. The K Weakest Rows in a Matrix.ipynb | YuHe0108/LeetCode | b9e5de69b4e4d794aff89497624f558343e362ad | [
"Apache-2.0"
] | null | null | null | 24.640523 | 92 | 0.377719 | [
[
[
"Description:\n You are given an m * n matrix mat consisting of soldiers and civilians, represented by 1 and 0 respectively.\n Return the indices of the k weakest rows in the matrix, ordered from weakest to strongest.\n Row i is weaker than row j if the number of soldiers in row i is less than in row j, or if both rows\n have the same number of soldiers but i < j.\n Soldiers always stand at the front of a row, meaning the 1s always appear before the 0s.\n\nExample 1:\n Input: mat = \n [[1,1,0,0,0],\n [1,1,1,1,0],\n [1,0,0,0,0],\n [1,1,0,0,0],\n [1,1,1,1,1]], \n k = 3\n Output: [2,0,3]: 0:2, 2:1, 3:2\n Explanation:\n Number of soldiers in each row:\n row 0 -> 2 \n row 1 -> 4 \n row 2 -> 1 \n row 3 -> 2 \n row 4 -> 5 \n Sorting the rows from weakest to strongest gives [2,0,3,1,4]\n 2:1, 0:2, 3:2, 1:4, 4:5\nExample 2:\n Input: mat = \n [[1,0,0,0],\n [1,1,1,1],\n [1,0,0,0],\n [1,0,0,0]], \n k = 2\n Output: [0,2]\n Explanation: \n Number of soldiers in each row:\n row 0 -> 1 \n row 1 -> 4 \n row 2 -> 1 \n row 3 -> 1 \n Sorting the rows from weakest to strongest gives [0,2,3,1]\n\nConstraints:\n 1. m == mat.length\n 2. n == mat[i].length\n 3. 2 <= n, m <= 100\n 4. 1 <= k <= m\n 5. matrix[i][j] is either 0 or 1",
"_____no_output_____"
]
],
[
[
"class Solution:\n def kWeakestRows(self, mat, k):\n rows, cols = len(mat), len(mat[0])\n powers = {}\n for i in range(rows):\n sum_pow = self.count_power(mat[i])\n powers[i] = sum_pow\n print(powers)\n res = []\n powers = sorted(powers.items(), key = lambda x: x[1])\n for i in range(k):\n res.append(powers[i][0])\n return res\n \n def count_power(self, pow_list):\n count = 0\n left, right = 0, len(pow_list) - 1\n while left <= right:\n mid = left + (right - left) // 2\n if pow_list[mid] == 1:\n count += (mid - left + 1)\n left = mid + 1\n else:\n right = mid - 1\n return count",
"_____no_output_____"
],
[
"solution = Solution()\nsolution.kWeakestRows([[1,1,1,1,1],[1,0,0,0,0],[1,1,0,0,0],[1,1,1,1,0],[1,1,1,1,1]],3)",
"{0: 5, 1: 1, 2: 2, 3: 4, 4: 5}\n"
]
]
] | [
"raw",
"code"
] | [
[
"raw"
],
[
"code",
"code"
]
] |
d07d018dafaf3356948660f50abbf7cea51710ca | 17,733 | ipynb | Jupyter Notebook | _posts/python-v3/basic/table/table_sub.ipynb | bmb804/documentation | 57826d25e0afea7fff6a8da9abab8be2f7a4b48c | [
"CC-BY-3.0"
] | 2 | 2019-06-24T23:55:53.000Z | 2019-07-08T12:22:56.000Z | _posts/python-v3/basic/table/table_sub.ipynb | bmb804/documentation | 57826d25e0afea7fff6a8da9abab8be2f7a4b48c | [
"CC-BY-3.0"
] | 15 | 2020-06-30T21:21:30.000Z | 2021-08-02T21:16:33.000Z | _posts/python-v3/basic/table/table_sub.ipynb | bmb804/documentation | 57826d25e0afea7fff6a8da9abab8be2f7a4b48c | [
"CC-BY-3.0"
] | 1 | 2019-11-10T04:01:48.000Z | 2019-11-10T04:01:48.000Z | 38.053648 | 1,321 | 0.506626 | [
[
[
"#### New to Plotly?\nPlotly's Python library is free and open source! [Get started](https://plot.ly/python/getting-started/) by downloading the client and [reading the primer](https://plot.ly/python/getting-started/).\n<br>You can set up Plotly to work in [online](https://plot.ly/python/getting-started/#initialization-for-online-plotting) or [offline](https://plot.ly/python/getting-started/#initialization-for-offline-plotting) mode, or in [jupyter notebooks](https://plot.ly/python/getting-started/#start-plotting-online).\n<br>We also have a quick-reference [cheatsheet](https://images.plot.ly/plotly-documentation/images/python_cheat_sheet.pdf) (new!) to help you get started!\n#### Version Check\nNote: Tables are available in version <b>2.1.0+</b><br>\nRun `pip install plotly --upgrade` to update your Plotly version",
"_____no_output_____"
]
],
[
[
"import plotly\nplotly.__version__",
"_____no_output_____"
]
],
[
[
"#### Import CSV Data",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport re\n\nimport plotly.plotly as py\nimport plotly.graph_objs as go\n\ndf = pd.read_csv('https://raw.githubusercontent.com/plotly/datasets/master/Mining-BTC-180.csv')\n\n# remove min:sec:millisec from dates \nfor i, row in enumerate(df['Date']):\n p = re.compile(' 00:00:00')\n datetime = p.split(df['Date'][i])[0]\n df.iloc[i, 1] = datetime\n\ntable = go.Table(\n header=dict(\n values=list(df.columns),\n line = dict(color='rgb(50, 50, 50)'),\n align = ['left'] * 5,\n fill = dict(color='#EDFAFF')\n ),\n cells=dict(\n values=[df.iloc[j] for j in range(10)],\n line = dict(color='rgb(50, 50, 50)'),\n align = ['left'] * 5,\n fill = dict(color='#f5f5fa')\n )\n)\n\npy.iplot([table])",
"_____no_output_____"
]
],
[
[
"#### Table and Right Aligned Plots\nIn Plotly there is no native way to insert a Plotly Table into a subplot. To do this, create your own `Layout` object and define multiple `xaxis` and `yaxis` attributes to split up the chart area into different domains. Then, for the traces you wish to insert in your final chart, set their `xaxis` and `yaxis` individually to map to the domains defined in the `Layout`. See the example below to see how to align 3 Scatter plots to the right and a Table on the top.",
"_____no_output_____"
]
],
[
[
"import plotly.plotly as py\nimport plotly.graph_objs as go\n\ntable_trace1 = go.Table(\n domain=dict(x=[0, 0.5],\n y=[0, 1.0]),\n columnwidth = [30] + [33, 35, 33],\n columnorder=[0, 1, 2, 3, 4],\n header = dict(height = 50,\n values = [['<b>Date</b>'],['<b>Number<br>transactions</b>'], \n ['<b>Output-volume(BTC)</b>'], ['<b>Market-Price</b>']], \n line = dict(color='rgb(50, 50, 50)'),\n align = ['left'] * 5,\n font = dict(color=['rgb(45, 45, 45)'] * 5, size=14),\n fill = dict(color='#d562be')),\n cells = dict(values = [df.iloc[j][1:5] for j in range(25)],\n line = dict(color='#506784'),\n align = ['left'] * 5,\n font = dict(color=['rgb(40, 40, 40)'] * 5, size=12),\n format = [None] + [\", .2f\"] * 2 + [',.4f'], \n prefix = [None] * 2 + ['$', u'\\u20BF'],\n suffix=[None] * 4,\n height = 27,\n fill = dict(color=['rgb(235, 193, 238)', 'rgba(228, 222, 249, 0.65)']))\n)\n\ntrace1=go.Scatter(\n x=df['Date'],\n y=df['Hash-rate'],\n xaxis='x1',\n yaxis='y1',\n mode='lines',\n line=dict(width=2, color='#9748a1'),\n name='hash-rate-TH/s'\n)\n\ntrace2=go.Scatter(\n x=df['Date'],\n y=df['Mining-revenue-USD'],\n xaxis='x2',\n yaxis='y2',\n mode='lines',\n line=dict(width=2, color='#b04553'),\n name='mining revenue'\n)\n\ntrace3=go.Scatter(\n x=df['Date'],\n y=df['Transaction-fees-BTC'],\n xaxis='x3',\n yaxis='y3',\n mode='lines',\n line=dict(width=2, color='#af7bbd'),\n name='transact-fee'\n)\n\naxis=dict(\n showline=True,\n zeroline=False,\n showgrid=True,\n mirror=True, \n ticklen=4, \n gridcolor='#ffffff',\n tickfont=dict(size=10)\n)\n\nlayout1 = dict(\n width=950,\n height=800,\n autosize=False,\n title='Bitcoin mining stats for 180 days',\n margin = dict(t=100),\n showlegend=False, \n xaxis1=dict(axis, **dict(domain=[0.55, 1], anchor='y1', showticklabels=False)),\n xaxis2=dict(axis, **dict(domain=[0.55, 1], anchor='y2', showticklabels=False)), \n xaxis3=dict(axis, **dict(domain=[0.55, 1], anchor='y3')), \n yaxis1=dict(axis, **dict(domain=[0.66, 1.0], anchor='x1', 
hoverformat='.2f')), \n yaxis2=dict(axis, **dict(domain=[0.3 + 0.03, 0.63], anchor='x2', tickprefix='$', hoverformat='.2f')),\n yaxis3=dict(axis, **dict(domain=[0.0, 0.3], anchor='x3', tickprefix=u'\\u20BF', hoverformat='.2f')),\n plot_bgcolor='rgba(228, 222, 249, 0.65)',\n annotations=[\n dict(\n showarrow=False,\n text='The last 20 records',\n xref='paper',\n yref='paper',\n x=0,\n y=1.01,\n xanchor='left',\n yanchor='bottom', \n font=dict(size=15)\n )\n ]\n)\n\nfig1 = dict(data=[table_trace1, trace1, trace2, trace3], layout=layout1)\npy.iplot(fig1)",
"_____no_output_____"
]
],
[
[
"#### Vertical Table and Graph Subplot",
"_____no_output_____"
]
],
[
[
"import plotly.plotly as py\nimport plotly.graph_objs as go\n\ntable_trace2 = go.Table(\n domain=dict(x=[0, 1],\n y=[0, 1.0]), \n columnwidth = [30] + [33, 35, 33],\n columnorder=[0, 1, 2, 3, 4],\n header = dict(height = 50,\n values = [['<b>Date</b>'],['<b>Hash Rate, TH/sec</b>'], \n ['<b>Mining revenue</b>'], ['<b>Transaction fees</b>']], \n line = dict(color='rgb(50, 50, 50)'),\n align = ['left'] * 5,\n font = dict(color=['rgb(45, 45, 45)'] * 5, size=14),\n fill = dict(color='#d562be')),\n cells = dict(values = [df['Date'][-20:], df['Hash-rate'][-20:], df['Mining-revenue-USD'][-20:],\n df['Transaction-fees-BTC'][-20:]],\n line = dict(color='#506784'),\n align = ['left'] * 5,\n font = dict(color=['rgb(40, 40, 40)'] * 5, size=12),\n format = [None] + [\", .2f\"] * 2 + [',.4f'], \n prefix = [None] * 2 + ['$', u'\\u20BF'],\n suffix=[None] * 4,\n height = 27,\n fill = dict(color=['rgb(235, 193, 238)', 'rgba(228, 222, 249, 0.65)']))\n)\n\ntrace4=go.Scatter(\n x=df['Date'],\n y=df['Hash-rate'],\n xaxis='x1',\n yaxis='y1',\n mode='lines',\n line=dict(width=2, color='#9748a1'),\n name='hash-rate-TH/s'\n)\n\ntrace5=go.Scatter(\n x=df['Date'],\n y=df['Mining-revenue-USD'],\n xaxis='x2',\n yaxis='y2',\n mode='lines',\n line=dict(width=2, color='#b04553'),\n name='mining revenue'\n)\n\ntrace6=go.Scatter(\n x=df['Date'],\n y=df['Transaction-fees-BTC'],\n xaxis='x3',\n yaxis='y3',\n mode='lines',\n line=dict(width=2, color='#af7bbd'),\n name='transact-fee'\n)\n\naxis=dict(\n showline=True,\n zeroline=False,\n showgrid=True,\n mirror=True, \n ticklen=4, \n gridcolor='#ffffff',\n tickfont=dict(size=10)\n)\n\nlayout2 = dict(\n width=950,\n height=800,\n autosize=False,\n title='Bitcoin mining stats for 180 days',\n margin = dict(t=100),\n showlegend=False, \n xaxis1=dict(axis, **dict(domain=[0, 1], anchor='y1', showticklabels=False)),\n xaxis2=dict(axis, **dict(domain=[0, 1], anchor='y2', showticklabels=False)), \n xaxis3=dict(axis, **dict(domain=[0, 1], anchor='y3')), \n 
yaxis1=dict(axis, **dict(domain=[2 * 0.21 + 0.02 + 0.02, 0.68], anchor='x1', hoverformat='.2f')), \n yaxis2=dict(axis, **dict(domain=[0.21 + 0.02, 2 * 0.21 + 0.02], anchor='x2', tickprefix='$', hoverformat='.2f')),\n yaxis3=dict(axis, **dict(domain=[0.0, 0.21], anchor='x3', tickprefix=u'\\u20BF', hoverformat='.2f')),\n plot_bgcolor='rgba(228, 222, 249, 0.65)',\n annotations=[\n dict(\n showarrow=False,\n text='The last 20 records',\n xref='paper',\n yref='paper',\n x=0.415,\n y=1.01,\n xanchor='left',\n yanchor='bottom', \n font=dict(size=15)\n )\n ]\n)\n\nfig2 = dict(data=[table_trace2, trace4, trace5, trace6], layout=layout2)\npy.iplot(fig2)",
"_____no_output_____"
]
],
[
[
"#### Reference\nSee https://plot.ly/python/reference/#table for more information regarding chart attributes! <br>\nFor examples of Plotly Tables, see: https://plot.ly/python/table/",
"_____no_output_____"
]
],
[
[
"from IPython.display import display, HTML\n\ndisplay(HTML('<link href=\"//fonts.googleapis.com/css?family=Open+Sans:600,400,300,200|Inconsolata|Ubuntu+Mono:400,700\" rel=\"stylesheet\" type=\"text/css\" />'))\ndisplay(HTML('<link rel=\"stylesheet\" type=\"text/css\" href=\"http://help.plot.ly/documentation/all_static/css/ipython-notebook-custom.css\">'))\n\n! pip install git+https://github.com/plotly/publisher.git --upgrade\nimport publisher\npublisher.publish(\n 'table-subplots.ipynb', 'python/table-subplots/', 'Table and Chart Subplots',\n 'How to create a subplot with tables and charts in Python with Plotly.',\n title = 'Table and Chart Subplots | plotly',\n has_thumbnail='true', thumbnail='table_subplots.jpg',\n language='python',\n display_as='multiple_axes', order=11)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07d10a9dbd787597c1364277cbecd35a9680187 | 125,931 | ipynb | Jupyter Notebook | IT_Survey_Application/building_the_dash_application.ipynb | anjumrohra/Data_visualization_with_Plotly_and_Dash | a2f4672b2a5f1c6ca846590de38ca73d6f725cc6 | [
"Apache-2.0"
] | 1 | 2021-04-08T05:53:25.000Z | 2021-04-08T05:53:25.000Z | IT_Survey_Application/building_the_dash_application.ipynb | anjumrohra/Data_visualization_with_Plotly_and_Dash | a2f4672b2a5f1c6ca846590de38ca73d6f725cc6 | [
"Apache-2.0"
] | null | null | null | IT_Survey_Application/building_the_dash_application.ipynb | anjumrohra/Data_visualization_with_Plotly_and_Dash | a2f4672b2a5f1c6ca846590de38ca73d6f725cc6 | [
"Apache-2.0"
] | null | null | null | 104.507054 | 73,201 | 0.759638 | [
[
[
"### Name: Anjum Rohra",
"_____no_output_____"
],
[
"# Overview\n\nAs a popular finance journalist in Europe, you know everyone is waiting for the IT Salary Survey report you release every 3 years. The IT sector is booming, and younger aspirants keep up with the trends through the beautiful visualizations your report contains.\n\nGiven the survey data from 2018 - 2020, it’s time to put your creative hat on and lay out insightful visualizations for the masses.",
"_____no_output_____"
],
[
"### Importing the required libraries",
"_____no_output_____"
]
],
[
[
"import plotly.express as px\nimport pandas as pd\n\nimport dash\nimport dash_core_components as dcc\nimport dash_html_components as html\nfrom dash.dependencies import Output, Input",
"_____no_output_____"
]
],
[
[
"### Importing data",
"_____no_output_____"
]
],
[
[
"data_18 = pd.read_csv('Survey_2018.csv')\ndata_19 = pd.read_csv('Survey_2019.csv')\ndata_20 = pd.read_csv('Survey_2020.csv')",
"_____no_output_____"
]
],
[
[
"### Imputing the missing values",
"_____no_output_____"
],
[
"#### 2020 imputation",
"_____no_output_____"
]
],
[
[
"data_20 = data_20.rename(columns = {'Position ': 'Position'})",
"_____no_output_____"
],
[
"data_20.isnull().sum()",
"_____no_output_____"
],
[
"data_20['Age'].fillna(data_20['Age'].mean(), inplace=True)",
"_____no_output_____"
],
[
"data_20['Gender'].fillna('Male', inplace=True)",
"_____no_output_____"
],
[
"data_20['Position'].fillna('Software Engineer', inplace=True)",
"_____no_output_____"
],
[
"data_20['Position'].replace({'Account Managet': 'Account Manager', 'agile master ':'Agile Coach', 'Data analyst ':'Data Analyst', 'Data Analyst ':'Data Analyst', 'Dana Analyst':'Data Analyst', 'Fullstack engineer, ну или Software engineer': 'Fullstack engineer, Software engineer', 'Software Architekt':'Software Architect', 'DatabEngineer':'Data Engineer', 'data engineer':'Data Engineer', 'support engineer':'Support Engineer', 'Systemadministrator':'System Administrator', 'Team lead':'Team Lead', 'Tech Leader':'Tech Lead', 'Technical Lead':'Tech Lead', 'Security engineer':'Security Engineer' },\n inplace=True)",
"_____no_output_____"
],
[
"data_20['Your main technology / programming language'].replace({'Javascript / Typescript':'JavaScript/TypeScript','JavaScript / TypeScript':'JavaScript/TypeScript','TypeScript, JavaScript':'JavaScript/TypeScript','JavaScript/Typescript':'JavaScript/TypeScript','JavaScript / typescript':'JavaScript/TypeScript','Javascript/Typescript':'JavaScript/TypeScript','JavaScript, TypeScript':'JavaScript/TypeScript','kotlin':'Kotlin','Javascript':'JavaScript','JavaScript ':'JavaScript','Javascript ':'JavaScript','javascript':'JavaScript','Typescript':'TypeScript','typescript':'TypeScript','Typescript ':'TypeScript','python':'Python','Python ':'Python','pythin':'Python','python ':'Python','Pyrhon':'Python','scala':'Scala','Java / Scala':'Java/Scala','python, scala':'Scala / Python','C, C++':'C/C++','C++/c':'C/C++','C++, C#':'C/C++','c/c++':'C/C++','--':'Java','-':'Java'},\n inplace=True)",
"_____no_output_____"
],
[
"data_20['Your main technology / programming language'].fillna('Java', inplace=True)",
"_____no_output_____"
],
[
"data_20['Total years of experience'].replace({'1,5':'1.5','2,5':'2.5','1 (as QA Engineer) / 11 in total':'11','15, thereof 8 as CTO':'15','6 (not as a data scientist, but as a lab scientist)':'6','383':'38.3','less than year':'0'}, inplace=True)",
"_____no_output_____"
],
[
"data_20['Total years of experience'].mode()",
"_____no_output_____"
],
[
"data_20['Total years of experience'].unique()",
"_____no_output_____"
],
[
"data_20['Total years of experience']=data_20['Total years of experience'].astype('float64')",
"_____no_output_____"
],
[
"data_20['Total years of experience'].median()",
"_____no_output_____"
],
[
"data_20['Total years of experience'].fillna(data_20['Total years of experience'].median(), inplace=True)",
"_____no_output_____"
],
[
"data_20['Years of experience in Germany'].unique()",
"_____no_output_____"
],
[
"data_20['Years of experience in Germany'].replace({'⁰':'0','-':'10','< 1':'<1','1,7':'1.7','2,5':'2.5','0,5':'0.5','1,5':'1.5','4,5':'4.5','3,5':'3.5','4 month':'4 months','less than year':'<1'}, inplace=True)",
"_____no_output_____"
],
[
"data_20['Years of experience in Germany'].mode()",
"_____no_output_____"
],
[
"data_20['Years of experience in Germany'].fillna('2', inplace=True)",
"_____no_output_____"
],
[
"data_20['Years of experience in Germany'].unique()",
"_____no_output_____"
],
[
"data_18.drop('Timestamp',axis=1,inplace=True)\ndata_19.drop('Zeitstempel',axis=1,inplace=True)\ndata_20.drop('Timestamp',axis=1,inplace=True)",
"_____no_output_____"
],
[
"data_20['Seniority level'].fillna('Senior', inplace=True)\ndata_20['Other technologies/programming languages you use often'].fillna('Javascript / Typescript', inplace=True)\ndata_20['Yearly bonus + stocks in EUR'].fillna('0', inplace=True)\ndata_20['Annual bonus+stocks one year ago. Only answer if staying in same country'].fillna('0', inplace=True)\ndata_20['Number of vacation days'].fillna('30', inplace=True)\ndata_20['Employment status'].fillna('Full-time employee', inplace=True)\ndata_20['Contract duration'].fillna('Unlimited contract', inplace=True)\ndata_20['Main language at work'].fillna('English', inplace=True)\ndata_20['Company size'].fillna('1000+', inplace=True)\ndata_20['Company type'].fillna('Product', inplace=True)\ndata_20['Have you lost your job due to the coronavirus outbreak?'].fillna('No', inplace=True)\ndata_20['Have you received additional monetary support from your employer due to Work From Home? If yes, how much in 2020 in EUR'].fillna('0', inplace=True)\ndata_20['Annual brutto salary (without bonus and stocks) one year ago. Only answer if staying in the same country'].fillna(data_20['Annual brutto salary (without bonus and stocks) one year ago. Only answer if staying in the same country'].median(), inplace=True)\ndata_20['Have you been forced to have a shorter working week (Kurzarbeit)? If yes, how many hours per week'].fillna(data_20['Have you been forced to have a shorter working week (Kurzarbeit)? If yes, how many hours per week'].mean(), inplace=True)",
"_____no_output_____"
],
[
"data_20.isnull().sum()",
"_____no_output_____"
]
],
[
[
"#### 2018 imputation",
"_____no_output_____"
]
],
[
[
"data_18.isnull().sum()",
"_____no_output_____"
],
[
"data_18['Gender'].fillna('M', inplace=True)\ndata_18['City'].fillna('Berlin', inplace=True)\ndata_18['Position'].fillna('Java Developer', inplace=True)\ndata_18['Your level'].fillna('Senior', inplace=True)\ndata_18['Are you getting any Stock Options?'].fillna('No', inplace=True)\ndata_18['Main language at work'].fillna('English', inplace=True)\ndata_18['Company size'].fillna('100-1000', inplace=True)\ndata_18['Company type'].fillna('Product', inplace=True)\ndata_18['Age'].fillna(data_18['Age'].mean(), inplace=True)\ndata_18['Years of experience'].fillna(data_18['Years of experience'].mean(), inplace=True)\ndata_18['Current Salary'].fillna(data_18['Current Salary'].mean(), inplace=True)\ndata_18['Salary one year ago'].fillna(data_18['Salary one year ago'].mean(), inplace=True)\ndata_18['Salary two years ago'].fillna(data_18['Salary two years ago'].mean(), inplace=True)",
"_____no_output_____"
],
[
"data_18['Gender'].value_counts()",
"_____no_output_____"
],
[
"data_18['Gender'].replace({'M':'Male','F':'Female'}, inplace=True)",
"_____no_output_____"
],
[
"data_18['Company type'].unique()",
"_____no_output_____"
],
[
"data_18['Company type'].replace({'Consulting Company':'Consulting','Consultancy':'Consulting','Consulting Company':'Consulting','Consult':'Consulting','IT Consulting ':'IT Consulting','IT Consultancy ':'IT Consulting','IT Consultants':'IT Consulting','Outsorce':'Outsourcing','Outsource':'Outsourcing','E-Commerce firm':'E-Commerce','e-commerce':'E-Commerce','Ecommerce':'E-Commerce'}, inplace=True)",
"_____no_output_____"
],
[
"data_18.isnull().sum()",
"_____no_output_____"
]
],
[
[
"#### 2019 imputation",
"_____no_output_____"
]
],
[
[
"data_19.drop('0',axis=1,inplace=True)",
"_____no_output_____"
],
[
"data_19.isnull().sum()",
"_____no_output_____"
],
[
"data_19['Seniority level'].fillna('Senior', inplace=True)\ndata_19['Position (without seniority)'].fillna('Backend Developer', inplace=True)\ndata_19['Your main technology / programming language'].fillna('Python', inplace=True)\ndata_19['Main language at work'].fillna('English', inplace=True)\ndata_19['Company name '].fillna('Zalando', inplace=True)\ndata_19['Company size'].fillna('100-1000', inplace=True)\ndata_19['Company type'].fillna('Product', inplace=True)\ndata_19['Contract duration'].fillna('unlimited', inplace=True)\ndata_19['Company business sector'].fillna('Commerce', inplace=True)\ndata_19['Age'].fillna(data_19['Age'].mean(), inplace=True)\ndata_19['Years of experience'].fillna(data_19['Years of experience'].mean(), inplace=True)\ndata_19['Yearly brutto salary (without bonus and stocks)'].fillna(data_19['Yearly brutto salary (without bonus and stocks)'].mean(), inplace=True)\ndata_19['Yearly bonus'].fillna(data_19['Yearly bonus'].mean(), inplace=True)\ndata_19['Yearly stocks'].fillna(data_19['Yearly stocks'].mean(), inplace=True)\ndata_19['Yearly brutto salary (without bonus and stocks) one year ago. Only answer if staying in same country'].fillna(data_19['Yearly brutto salary (without bonus and stocks) one year ago. Only answer if staying in same country'].mean(), inplace=True)\ndata_19['Yearly bonus one year ago. Only answer if staying in same country'].fillna(data_19['Yearly bonus one year ago. Only answer if staying in same country'].mean(), inplace=True)\ndata_19['Yearly stocks one year ago. Only answer if staying in same country'].fillna(data_19['Yearly stocks one year ago. Only answer if staying in same country'].mean(), inplace=True)\ndata_19['Number of vacation days'].fillna(data_19['Number of vacation days'].mean(), inplace=True)\ndata_19['Number of home office days per month'].fillna(data_19['Number of home office days per month'].mean(), inplace=True)",
"_____no_output_____"
],
[
"data_19['Seniority level'].value_counts()",
"_____no_output_____"
],
[
"data_19['Position (without seniority)'].unique()",
"_____no_output_____"
],
[
"data_19['Your main technology / programming language'].unique()",
"_____no_output_____"
],
[
"data_19['Main language at work'].unique()",
"_____no_output_____"
],
[
"data_19['Company name '].unique()",
"_____no_output_____"
],
[
"data_19['Company name '].replace({'google':'Google','check24':'Check24','CHECK24':'Check24','Here':'HERE'}, inplace=True)",
"_____no_output_____"
],
[
"data_19['Company business sector'].unique()",
"_____no_output_____"
],
[
"data_19.isnull().sum()",
"_____no_output_____"
]
],
[
[
"## Data Visualization with Dash Application",
"_____no_output_____"
]
],
[
[
"app = dash.Dash(__name__)",
"_____no_output_____"
],
[
"app.layout = html.Div([\n html.Div([\n dcc.Dropdown(id='years', multi=False, clearable=False,\n options=[{'label':x, 'value':x} for x in sorted(['2018','2019','2020'])],\n value=\"2018\")\n ],style={'width':'50%'}),\n \n html.Div([\n dcc.Dropdown(id='parameters', multi=False, clearable=False,\n options=[{'label':x, 'value':x} for x in sorted(['City','Gender','Seniority level','Main language at work','Current Salary'])],\n value=\"Gender\")\n ],style={'width':'50%'}),\n \n# html.Div([\n# dcc.Graph(id='my-pieplot', figure={})\n# ]),\n \n html.Div([\n dcc.Graph(id='my-plot', figure={})\n ])\n])",
"_____no_output_____"
],
[
"# Callback - app interactivity section------------------------------------\[email protected](\n #Output(component_id='my-pieplot', component_property='figure'),\n Output(component_id='my-plot', component_property='figure'),\n Input(component_id='years', component_property='value'),\n Input(component_id='parameters', component_property='value')\n)\ndef update_graph(year_chosen, parameter):\n print(year_chosen)\n print(parameter)\n if (year_chosen == '2018'):\n if (parameter == 'City'):\n city_18 = data_18['City'].value_counts().head(10)\n city_18 = pd.DataFrame(city_18)\n fig = px.bar(data_frame=city_18,y='City',color='City')\n fig.update_xaxes(title_text=\"<b>Cities</b>\")\n fig.update_layout(title_text=\"<b>Respondent's City Analysis (2018)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Gender'):\n df = data_18\n fig=px.pie(data_frame=df,names='Gender')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Gender ratio of the respondents in 2018</b>\")\n elif (parameter == 'Seniority level'):\n fig=px.pie(data_frame=data_18,names='Your level')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Seniority levels in 2018</b>\")\n elif (parameter == 'Main language at work'):\n language = data_18['Main language at work'].value_counts()\n language = pd.DataFrame(language)\n fig = px.bar(data_frame=language,y='Main language at work',color='Main language at work')\n fig.update_xaxes(title_text=\"<b>Languages</b>\")\n fig.update_layout(title_text=\"<b>Language spoken at work (2018)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Current Salary'): \n experience = data_18.groupby('Years of experience')\n mean_salary_18 = experience['Current Salary'].mean()\n mean_salary_18=pd.DataFrame(mean_salary_18)\n mean_salary_18 = mean_salary_18.reset_index(drop=False)\n fig = 
px.line(data_frame=mean_salary_18,x='Years of experience',y='Current Salary')\n fig.update_layout(title_text=\"<b>Salary info for the year 2018</b>\")\n \n elif (year_chosen == '2019'):\n if (parameter == 'City'):\n city_19 = data_19['City'].value_counts().head(10)\n city_19 = pd.DataFrame(city_19)\n fig = px.bar(data_frame=city_19,y='City',color='City')\n fig.update_xaxes(title_text=\"<b>Cities</b>\")\n fig.update_layout(title_text=\"<b>Respondent's City Analysis (2019)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Gender'):\n df = data_19\n fig=px.pie(data_frame=df,names='Gender')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Gender ratio of the respondents in 2019</b>\")\n elif (parameter == 'Seniority level'):\n fig=px.pie(data_frame=data_19,names='Seniority level')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Seniority levels in 2019</b>\")\n elif (parameter == 'Main language at work'):\n language = data_19['Main language at work'].value_counts()\n language = pd.DataFrame(language)\n fig = px.bar(data_frame=language,y='Main language at work',color='Main language at work')\n fig.update_xaxes(title_text=\"<b>Languages</b>\")\n fig.update_layout(title_text=\"<b>Language spoken at work (2019)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Current Salary'):\n experience = data_19.groupby('Years of experience')\n mean_salary_19 = experience['Yearly brutto salary (without bonus and stocks)'].mean()\n mean_salary_19=pd.DataFrame(mean_salary_19)\n mean_salary_19 = mean_salary_19.reset_index(drop=False)\n fig = px.line(data_frame=mean_salary_19,x='Years of experience',y='Yearly brutto salary (without bonus and stocks)')\n fig.update_layout(title_text=\"<b>Salary info for the year 2019</b>\")\n \n elif (year_chosen == '2020'):\n if (parameter == 'City'):\n city_20 = 
data_20['City'].value_counts().head(10)\n city_20 = pd.DataFrame(city_20)\n fig = px.bar(data_frame=city_20,y='City',color='City')\n fig.update_xaxes(title_text=\"<b>Cities</b>\")\n fig.update_layout(title_text=\"<b>Respondent's City Analysis (2020)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Gender'):\n df = data_20\n fig=px.pie(data_frame=df,names='Gender')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Gender ratio of the respondents in 2020</b>\")\n elif (parameter == 'Seniority level'):\n fig=px.pie(data_frame=data_20,names='Seniority level')\n fig.update_traces(textinfo = 'label+percent')\n fig.update_layout(title_text=\"<b>Seniority levels in 2020</b>\")\n elif (parameter == 'Main language at work'):\n language = data_20['Main language at work'].value_counts()\n language = pd.DataFrame(language)\n fig = px.bar(data_frame=language,y='Main language at work',color='Main language at work')\n fig.update_xaxes(title_text=\"<b>Languages</b>\")\n fig.update_layout(title_text=\"<b>Language spoken at work (2020)</b>\")\n fig.update_yaxes(title_text=\"<b>Employee count</b> \", secondary_y=False)\n elif (parameter == 'Current Salary'):\n experience = data_20.groupby('Total years of experience')\n mean_salary_20 = experience['Yearly brutto salary (without bonus and stocks) in EUR'].mean()\n mean_salary_20=pd.DataFrame(mean_salary_20)\n mean_salary_20 = mean_salary_20.reset_index(drop=False)\n fig = px.line(data_frame=mean_salary_20,x='Total years of experience',y='Yearly brutto salary (without bonus and stocks) in EUR')\n fig.update_layout(title_text=\"<b>Salary info for the year 2020</b>\")\n return fig",
"_____no_output_____"
],
[
"if __name__=='__main__':\n app.run_server(debug=False, port=8001)",
"Dash is running on http://127.0.0.1:8001/\n\nDash is running on http://127.0.0.1:8001/\n\n * Serving Flask app \"__main__\" (lazy loading)\n * Environment: production\n WARNING: This is a development server. Do not use it in a production deployment.\n Use a production WSGI server instead.\n * Debug mode: off\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
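The salary branch of the callback above repeats the same pandas idiom for every year: group by the experience column, take the mean salary, and `reset_index` before handing the frame to `px.line`. A minimal, self-contained sketch of that aggregation on toy data (the values here are made up):

```python
import pandas as pd

df = pd.DataFrame({
    "Years of experience": [1, 1, 2, 2, 5],
    "Current Salary": [40000, 50000, 60000, 70000, 90000],
})

# Same aggregation shape the callback feeds into px.line
experience = df.groupby("Years of experience")
mean_salary = experience["Current Salary"].mean()
mean_salary = pd.DataFrame(mean_salary).reset_index(drop=False)
```

After `reset_index`, the experience values come back as an ordinary column, which is what `px.line(..., x=..., y=...)` expects.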
d07d11e4cbb55afe2da56e43f26ea36547e936bc | 6,216 | ipynb | Jupyter Notebook | 2020/ferran/03/03.ipynb | bbglab/adventofcode | 65b6d8331d10f229b59232882d60024b08d69294 | [
"MIT"
] | null | null | null | 2020/ferran/03/03.ipynb | bbglab/adventofcode | 65b6d8331d10f229b59232882d60024b08d69294 | [
"MIT"
] | null | null | null | 2020/ferran/03/03.ipynb | bbglab/adventofcode | 65b6d8331d10f229b59232882d60024b08d69294 | [
"MIT"
] | 3 | 2016-12-02T09:20:42.000Z | 2021-12-01T13:31:07.000Z | 39.341772 | 281 | 0.410714 | [
[
[
"# Day 3",
"_____no_output_____"
]
],
[
[
"--- Day 3: Toboggan Trajectory ---\nWith the toboggan login problems resolved, you set off toward the airport. While travel by toboggan might be easy, it's certainly not safe: there's very minimal steering and the area is covered in trees. You'll need to see which angles will take you near the fewest trees.\n\nDue to the local geology, trees in this area only grow on exact integer coordinates in a grid. You make a map (your puzzle input) of the open squares (.) and trees (#) you can see. For example:\n\n..##.......\n#...#...#..\n.#....#..#.\n..#.#...#.#\n.#...##..#.\n..#.##.....\n.#.#.#....#\n.#........#\n#.##...#...\n#...##....#\n.#..#...#.#\nThese aren't the only trees, though; due to something you read about once involving arboreal genetics and biome stability, the same pattern repeats to the right many times:\n\n..##.........##.........##.........##.........##.........##....... --->\n#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..\n.#....#..#..#....#..#..#....#..#..#....#..#..#....#..#..#....#..#.\n..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#\n.#...##..#..#...##..#..#...##..#..#...##..#..#...##..#..#...##..#.\n..#.##.......#.##.......#.##.......#.##.......#.##.......#.##..... --->\n.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#\n.#........#.#........#.#........#.#........#.#........#.#........#\n#.##...#...#.##...#...#.##...#...#.##...#...#.##...#...#.##...#...\n#...##....##...##....##...##....##...##....##...##....##...##....#\n.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.#.#..#...#.# --->\nYou start on the open square (.) 
in the top-left corner and need to reach the bottom (below the bottom-most row on your map).\n\nThe toboggan can only follow a few specific slopes (you opted for a cheaper model that prefers rational numbers); start by counting all the trees you would encounter for the slope right 3, down 1:\n\nFrom your starting position at the top-left, check the position that is right 3 and down 1. Then, check the position that is right 3 and down 1 from there, and so on until you go past the bottom of the map.\n\nThe locations you'd check in the above example are marked here with O where there was an open square and X where there was a tree:\n\n..##.........##.........##.........##.........##.........##....... --->\n#..O#...#..#...#...#..#...#...#..#...#...#..#...#...#..#...#...#..\n.#....X..#..#....#..#..#....#..#..#....#..#..#....#..#..#....#..#.\n..#.#...#O#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#..#.#...#.#\n.#...##..#..X...##..#..#...##..#..#...##..#..#...##..#..#...##..#.\n..#.##.......#.X#.......#.##.......#.##.......#.##.......#.##..... --->\n.#.#.#....#.#.#.#.O..#.#.#.#....#.#.#.#....#.#.#.#....#.#.#.#....#\n.#........#.#........X.#........#.#........#.#........#.#........#\n#.##...#...#.##...#...#.X#...#...#.##...#...#.##...#...#.##...#...\n#...##....##...##....##...#X....##...##....##...##....##...##....#\n.#..#...#.#.#..#...#.#.#..#...X.#.#..#...#.#.#..#...#.#.#..#...#.# --->\nIn this example, traversing the map using this slope would cause you to encounter 7 trees.\n\nStarting at the top-left corner of your map and following a slope of right 3 and down 1, how many trees would you encounter?",
"_____no_output_____"
]
],
[
[
"trees = []\nwith open('input.txt', 'rt') as f:\n trees = f.read().splitlines()\nrows = len(trees)\ncols = len(trees[0])",
"_____no_output_____"
],
[
"i, j = 0, 0\npath = ''\nwhile(i < rows):\n path += trees[i][j % cols]\n i += 1\n j += 3\npath.count('#')",
"_____no_output_____"
]
],
[
[
"--- Part Two ---\nTime to check the rest of the slopes - you need to minimize the probability of a sudden arboreal stop, after all.\n\nDetermine the number of trees you would encounter if, for each of the following slopes, you start at the top-left corner and traverse the map all the way to the bottom:\n\nRight 1, down 1.\nRight 3, down 1. (This is the slope you already checked.)\nRight 5, down 1.\nRight 7, down 1.\nRight 1, down 2.\nIn the above example, these slopes would find 2, 7, 3, 4, and 2 tree(s) respectively; multiplied together, these produce the answer 336.\n\nWhat do you get if you multiply together the number of trees encountered on each of the listed slopes?",
"_____no_output_____"
]
],
[
[
"def tree_path(down, right):\n path = ''\n i, j = 0, 0\n while(i < rows):\n path += trees[i][j % cols]\n i += down\n j += right\n return path.count('#')\n\nprod = 1\nfor s in [(1, 1), (1, 3), (1, 5), (1, 7), (2, 1)]:\n prod *= tree_path(*s)\nprod",
"_____no_output_____"
]
]
] | [
"markdown",
"raw",
"code",
"raw",
"code"
] | [
[
"markdown"
],
[
"raw"
],
[
"code",
"code"
],
[
"raw"
],
[
"code"
]
] |
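The modulo-based wrap-around used in `tree_path` above can be checked against the sample map from the puzzle text; this standalone sketch reproduces the published sample answers (7 trees for right 3 / down 1, and a product of 336 over all five slopes):

```python
SAMPLE = """..##.......
#...#...#..
.#....#..#.
..#.#...#.#
.#...##..#.
..#.##.....
.#.#.#....#
.#........#
#.##...#...
#...##....#
.#..#...#.#""".splitlines()

def tree_path(grid, down, right):
    """Count trees ('#') hit on a slope; the pattern repeats to the right."""
    cols = len(grid[0])
    return sum(row[(i * right) % cols] == "#"
               for i, row in enumerate(grid[::down]))

hits = tree_path(SAMPLE, 1, 3)          # right 3, down 1
product = 1
for down, right in [(1, 1), (1, 3), (1, 5), (1, 7), (2, 1)]:
    product *= tree_path(SAMPLE, down, right)
```

`grid[::down]` handles the "down 2" slope by skipping every other row, so the column index only ever advances by `right` per visited row.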
d07d138aa11370a00a415491c90b40467d62594e | 16,712 | ipynb | Jupyter Notebook | database/08_pymongo.ipynb | dayoungMM/TIL | b844ef5621657908d4c256cdfe233462dd075e8b | [
"MIT"
] | null | null | null | database/08_pymongo.ipynb | dayoungMM/TIL | b844ef5621657908d4c256cdfe233462dd075e8b | [
"MIT"
] | null | null | null | database/08_pymongo.ipynb | dayoungMM/TIL | b844ef5621657908d4c256cdfe233462dd075e8b | [
"MIT"
] | null | null | null | 36.489083 | 369 | 0.521422 | [
[
[
"from pymongo import MongoClient\nclient = MongoClient(host='localhost', port=27017)\ndb = client['test']\ncursor = db.score.find()\nfor data in list(cursor):\n print(data['a'])",
"99.0\n0.0\n1.0\n2.0\n3.0\n4.0\n5.0\n5.0\n"
],
[
"db.score.save({'a':6, 'exam': 6})",
"C:\\ProgramData\\Anaconda3\\lib\\site-packages\\ipykernel_launcher.py:1: DeprecationWarning: save is deprecated. Use insert_one or replace_one instead\n \"\"\"Entry point for launching an IPython kernel.\n"
],
[
"# rewind the cursor (which has advanced) back to the beginning\ncursor.rewind()\nfor data in list(cursor):\n print(data['a'])",
"99.0\n0.0\n1.0\n2.0\n3.0\n4.0\n5.0\n5.0\n6\n"
],
[
"# update a matching document\ndb.score.update( { 'a' : 5 }, { '$set' : { 'exam' : 10 } } )\n\ncursor.rewind()\nfor data in list(cursor):\n print(data)",
"{'_id': ObjectId('5e2fdcd609d7368d39e3b799'), 'a': 99.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79a'), 'a': 0.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79b'), 'a': 1.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79c'), 'a': 2.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79d'), 'a': 3.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79e'), 'a': 4.0}\n{'_id': ObjectId('5e2fde8f09d7368d39e3b79f'), 'a': 5.0, 'exam': 10}\n{'_id': ObjectId('5e2fdeb609d7368d39e3b7a0'), 'a': 5.0, 'exam': 5.0}\n{'_id': ObjectId('5e2ff20eae8b72b003d0b13b'), 'a': 6, 'exam': 6}\n"
],
[
"# delete one document\ndb.score.delete_one( { 'a' : 5 } )\n\ncursor.rewind()\nfor data in list(cursor):\n print(data)\n",
"{'_id': ObjectId('5e2fdcd609d7368d39e3b799'), 'a': 99.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79a'), 'a': 0.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79b'), 'a': 1.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79c'), 'a': 2.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79d'), 'a': 3.0}\n{'_id': ObjectId('5e2fdd0109d7368d39e3b79e'), 'a': 4.0}\n{'_id': ObjectId('5e2fdeb609d7368d39e3b7a0'), 'a': 5.0, 'exam': 5.0}\n{'_id': ObjectId('5e2ff20eae8b72b003d0b13b'), 'a': 6, 'exam': 6}\n"
]
],
[
[
"### Practice Problem\n- Save the results scraped from Interpark Tour into MongoDB\n Collection : tour\n- Fields : image path-img, detail page link-link, product title-title,\n product description-desc, price-price",
"_____no_output_____"
],
[
"### Running Chrome headless\n```python\noptions = webdriver.ChromeOptions()\noptions.add_argument(\"headless\")\noptions.add_argument(\"window-size=1920x1080\")\ndriver = webdriver.Chrome(executable_path='C:/Code/chromedriver.exe', options=options)\n```",
"_____no_output_____"
]
],
[
[
"from selenium import webdriver as wd\ndriver = wd.Chrome(executable_path='C:/Code/chromedriver.exe')\nfrom selenium.webdriver.common.by import By\nfrom selenium.webdriver.support.ui import WebDriverWait\nfrom selenium.webdriver.support import expected_conditions as EC\nimport time\n\nfrom pymongo import MongoClient\nclient = MongoClient(host='localhost', port=27017)\ndb = client['test']\n\n\ndriver.get('http://tour.interpark.com')\ndriver.implicitly_wait(10)\ndriver.find_element_by_id('SearchGNBText').send_keys('달랏')\ndriver.find_element_by_css_selector('button.search-btn').click()\nWebDriverWait(driver, 10).until(\n EC.presence_of_element_located((By.CLASS_NAME, 'oTravelBox'))\n)\ndriver.find_element_by_css_selector('.oTravelBox .moreBtn').click()",
"_____no_output_____"
],
[
"\nfor page in range(1, 2):\n driver.execute_script(\"searchModule.SetCategoryList(%s, '')\" % page)\n time.sleep(2)\n boxItems = driver.find_elements_by_css_selector('.panelZone > .oTravelBox > .boxList > li')\n for li in boxItems:\n tour_dict = {}\n tour_dict['이미지']= li.find_element_by_css_selector('img.img').get_attribute('src')\n tour_dict['링크'] = li.find_element_by_css_selector('a').get_attribute('onclick')\n tour_dict['상품명'] = li.find_element_by_css_selector('h5.proTit').text\n tour_dict['추가설명'] = li.find_element_by_css_selector('.proSub').text\n tour_dict['가격'] = li.find_element_by_css_selector('strong.proPrice').text\n db.tour.save(tour_dict)\n# for info in li.find_elements_by_css_selector('.info-row .proInfo'):\n# print(info.text)\n# print('=' * 20)\n",
"C:\\ProgramData\\Anaconda3\\lib\\site-packages\\ipykernel_launcher.py:12: DeprecationWarning: save is deprecated. Use insert_one or replace_one instead\n if sys.path[0] == '':\n"
],
[
"driver.close()",
"_____no_output_____"
],
[
"cursor = db.tour.find()\nfor data in list(cursor):\n print(data)",
"{'_id': ObjectId('5e2ff5a7ae8b72b003d0b13d'), '이미지': 'http://img.modetour.com/eagle/photoimg/56584/bfile/636453195339574051.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=Q1011029','')\", '상품명': '●연중 봄의 도시, 달랏● [5성호텔] 달랏 3색 골프(54홀) + 관광 3박 5일', '추가설명': '', '가격': '1,480,000 원~'}\n{'_id': ObjectId('5e2ff5f9ae8b72b003d0b13f'), '이미지': 'http://tourimage.interpark.com/product/tour/00161/A10/280/A1031420_1_027.jpg', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1031420','')\", '상품명': '[달랏4일] 낭만도시 달랏 직항+랑비앙 포함 3박4일', '추가설명': '★직항 취항 기념 초특가★', '가격': '365,000 원~'}\n{'_id': ObjectId('5e2ff5faae8b72b003d0b140'), '이미지': 'http://tourimage.interpark.com/product/tour/00161/A10/280/A1024924_1_620.jpg', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1024924','')\", '상품명': '[나트랑/달랏5일] 시원한 낭만도시 랑비앙관광 5일', '추가설명': '★떠오르는 베트남 인생여행지★', '가격': '434,700 원~'}\n{'_id': ObjectId('5e2ff5faae8b72b003d0b141'), '이미지': 'http://tourimage.interpark.com/product/tour/00161/A10/280/A1025044_3_337.jpg', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1025044','')\", '상품명': '[대구출발_나트랑5일] 지금 핫한 그곳! 
아름다운 나트랑/달랏 관광', '추가설명': '머드온천&나트랑 시내관광 포함-동양의 나폴리 나트랑&달랏여행♥', '가격': '499,000 원~'}\n{'_id': ObjectId('5e2ff5faae8b72b003d0b142'), '이미지': 'http://tourimage.interpark.com/product/tour/00161/A10/280/A1031349_1_470.jpg', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1031349','')\", '상품명': '[나트랑 5일★연합] 나트랑 1박/ 달랏 2박 꽉찬 관광일정', '추가설명': '신규취항★뱀부항공/기내식제공/수하물20KG', '가격': '349,000 원~'}\n{'_id': ObjectId('5e2ff5faae8b72b003d0b143'), '이미지': 'http://img.modetour.com/eagle/photoimg/42164/bfile/636724338253061415.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1023759','')\", '상품명': '*출발확정*[관광형] 나트랑x달랏 3박5일 [4성(1박)+달랏(2박)]', '추가설명': '최대 7만원 쿠폰할인 (출발일 2020.01.01~2020.03.31)', '가격': '405,000 원~'}\n{'_id': ObjectId('5e2ff5faae8b72b003d0b144'), '이미지': 'http://tourimage.interpark.com/product/tour/00161/A10/280/A1025396_7_460.jpg', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1025396','')\", '상품명': '[부산출발_나트랑5일] 떠오르는 여행지 나트랑 + 봄의 도시 달랏을 동시에!! 
실속 관광형', '추가설명': '전 일전 준특급호텔 + 아름다운 나트랑 달랏 관광♡', '가격': '499,000 원~'}\n{'_id': ObjectId('5e2ff5fbae8b72b003d0b145'), '이미지': 'http://img.modetour.com/eagle/photoimg/59122/bfile/636724338211701392.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1031529','')\", '상품명': '[영원한봄의도시] 달랏 패키지 3박4일 [5성호텔]', '추가설명': '', '가격': '471,000 원~'}\n{'_id': ObjectId('5e2ff5fbae8b72b003d0b146'), '이미지': 'http://img.modetour.com/eagle/photoimg/42164/bfile/636724338249268660.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1025809','')\", '상품명': '*출발확정*[증편기][관광형] 나트랑x달랏 3박5일 [4성(1박)+달랏(2박)]', '추가설명': '', '가격': '352,000 원~'}\n{'_id': ObjectId('5e2ff5fbae8b72b003d0b147'), '이미지': 'http://img.modetour.com/eagle/photoimg/42171/bfile/636724338177366577.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=A1025810','')\", '상품명': '[증편기][관광형] 나트랑x달랏 3박5일 [5성(1박)+달랏(2박)]', '추가설명': '', '가격': '405,000 원~'}\n{'_id': ObjectId('5e2ff5fbae8b72b003d0b148'), '이미지': 'http://img.modetour.com/eagle/photoimg/56584/bfile/636453195339574051.JPG', '링크': \"searchModule.OnClickDetail('http://tour.interpark.com/goods/detail/?BaseGoodsCd=Q1011029','')\", '상품명': '●연중 봄의 도시, 달랏● [5성호텔] 달랏 3색 골프(54홀) + 관광 3박 5일', '추가설명': '', '가격': '1,480,000 원~'}\n"
],
[
"from pymongo import MongoClient\nclient = MongoClient(host='localhost', port=27017)\ndb = client['test']\n\ndb.tour.count()",
"C:\\ProgramData\\Anaconda3\\lib\\site-packages\\ipykernel_launcher.py:5: DeprecationWarning: count is deprecated. Use estimated_document_count or count_documents instead. Please note that $where must be replaced by $expr, $near must be replaced by $geoWithin with $center, and $nearSphere must be replaced by $geoWithin with $centerSphere\n \"\"\"\n"
],
[
"import pandas as pd\ncursor=db.tour.find()\nresult = []\nfor data in cursor:\n result.append([data['상품명'],data['가격']] )\n\npd.DataFrame(result, columns = ['상품명','가격'])",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07d39bb773f256e08cf8016527557d68db98f52 | 99,631 | ipynb | Jupyter Notebook | 6/old/CIFAR.ipynb | isaiah-li/Building-Machine-Learning-Projects-with-TensorFlow | d01acb378a74d6589f01adf6bfbc4fe20ad4f0b4 | [
"MIT"
] | 270 | 2016-11-21T13:54:44.000Z | 2022-02-18T01:50:49.000Z | 6/old/CIFAR.ipynb | isaiah-li/Building-Machine-Learning-Projects-with-TensorFlow | d01acb378a74d6589f01adf6bfbc4fe20ad4f0b4 | [
"MIT"
] | 12 | 2016-12-28T00:06:09.000Z | 2021-09-29T06:19:24.000Z | 6/old/CIFAR.ipynb | isaiah-li/Building-Machine-Learning-Projects-with-TensorFlow | d01acb378a74d6589f01adf6bfbc4fe20ad4f0b4 | [
"MIT"
] | 192 | 2016-11-21T13:54:47.000Z | 2021-11-25T08:33:18.000Z | 268.54717 | 76,656 | 0.863918 | [
[
[
"%matplotlib inline\nimport glob\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport tensorflow as tf\nimport tensorflow.contrib.learn as skflow\nfrom sklearn import metrics\n\ndatadir='/home/bonnin/dev/cifar-10-batches-bin/'\n\nplt.ion()\nG = glob.glob (datadir + '*.bin')\nA = np.fromfile(G[0],dtype=np.uint8).reshape([10000,3073])\nlabels = A [:,0]\nimages = A [:,1:].reshape([10000,3,32,32]).transpose (0,2,3,1)\nplt.imshow(images[15])\nprint(labels[11])\nimages_unroll = A [:,1:]\n",
"9\n"
],
[
"''' Convert class labels from scalars to one-hot vectors.'''\ndef dense_to_one_hot (labels_dense, num_classes=10):\n num_labels = labels_dense.shape [0]\n index_offset = np.arange (num_labels) * num_classes\n labels_one_hot = np.zeros ((num_labels,num_classes))\n labels_one_hot.flat [index_offset +labels_dense.ravel()]=1\n return labels_one_hot\n\nlabels_hot = dense_to_one_hot(labels, num_classes= 10)\nsess= tf.InteractiveSession()\n#classifier=skflow.TensorFlowLinearClassifier(n_classes= 10, batch_size= 100, steps= 1000, learning_rate=0.1)\n#classifier.fit(images_unroll,labels)\n#score = metrics.accuracy_score (labels, classifier.predict (images_unroll))\n#print ('Accuracy: {0:f}',format(score))\n#W = classifier.weights_\n\n#sx.sy = (16,32)\n#f.con = plt. subplots(sx,sy, sharex='col', sharey='row')\n#for xx in range(sx):\n# for yy in range(sy):\n# con [xx.yy].pcolormesh\n\n#need to read in data in [n_samples. Height. Width.\n#so element A [0.0.0] 3 RGB bytes\n\ndef max_pool_2x2(tensor_in):\n\treturn tf.nn.max_pool(tensor_in, ksize= [1,2,2,1], strides= [1,2,2,1], padding='SAME')\n\ndef conv_model (X, y):\n X= tf.reshape(X, [-1, 32, 32, 3])\n with tf.variable_scope('conv_layer1'):\n h_conv1=skflow.ops.conv2d(X, n_filters=16, filter_shape=[5,5], bias=True, activation=tf.nn.relu)#print (h_conv1)\n h_pool1=max_pool_2x2(h_conv1)#print (h_pool1)\n with tf.variable_scope('conv_layer2'):\n h_conv2=skflow.ops.conv2d(h_pool1, n_filters=16, filter_shape=[5,5], bias=True, activation=tf.nn.relu)\n #print (h_conv2)\n h_pool2=max_pool_2x2(h_conv2)\n #print (h_pool2)\n #needs work\n h_pool2_flat = tf.reshape(h_pool2, [-1,8*8*16 ])\n h_fc1 = skflow.ops.dnn(h_pool2_flat, [96,48], activation=tf.nn.relu , dropout=0.5)\n return skflow.models.logistic_regression(h_fc1,y)\n\nimages = np.array(images,dtype=np.float32)\nclassifier = skflow.TensorFlowEstimator(model_fn=conv_model, n_classes=10, batch_size=100, steps=2000, learning_rate=0.01)\n\n%time classifier.fit(images, labels, logdir='/tmp/cnn_train/')\n%time score =metrics.accuracy_score(labels, classifier.predict(images))\nprint ('Accuracy: {0:f}'.format(score))\n\n#Examining fitted weights\n#First Convolutional Layer\nprint ('1st Convolutional Layer weights and Bias')\nprint (classifier.get_tensor_value('conv_layer1/convolution/filters:0'))\nprint (classifier.get_tensor_value('conv_layer1/convolution/filters:1'))\n ",
"Exception AssertionError: AssertionError() in <bound method InteractiveSession.__del__ of <tensorflow.python.client.session.InteractiveSession object at 0x7fdea7e7ecd0>> ignored\n"
]
]
] | [
"code"
] | [
[
"code",
"code"
]
] |
d07d604af3ba7538684891d8af2c7ce8d6496db2 | 614,535 | ipynb | Jupyter Notebook | .ipynb_checkpoints/crypto_arbitrage-checkpoint.ipynb | justcb/bitcoin_arb | 6c1c9189672951e05d99a0cfd8377e4fc50cc5c1 | [
"MIT"
] | null | null | null | .ipynb_checkpoints/crypto_arbitrage-checkpoint.ipynb | justcb/bitcoin_arb | 6c1c9189672951e05d99a0cfd8377e4fc50cc5c1 | [
"MIT"
] | null | null | null | .ipynb_checkpoints/crypto_arbitrage-checkpoint.ipynb | justcb/bitcoin_arb | 6c1c9189672951e05d99a0cfd8377e4fc50cc5c1 | [
"MIT"
] | 1 | 2022-01-15T05:22:55.000Z | 2022-01-15T05:22:55.000Z | 210.745885 | 84,284 | 0.888734 | [
[
[
"## Crypto Arbitrage\n\nIn this Challenge, you'll take on the role of an analyst at a high-tech investment firm. The vice president (VP) of your department is considering arbitrage opportunities in Bitcoin and other cryptocurrencies. As Bitcoin trades on markets across the globe, can you capitalize on simultaneous price dislocations in those markets by using the powers of Pandas?\n\nFor this assignment, you’ll sort through historical trade data for Bitcoin on two exchanges: Bitstamp and Coinbase. Your task is to apply the three phases of financial analysis to determine if any arbitrage opportunities exist for Bitcoin.\n\nThis aspect of the Challenge will consist of 3 phases.\n\n1. Collect the data.\n\n2. Prepare the data.\n\n3. Analyze the data. \n\n",
"_____no_output_____"
],
[
"### Import the required libraries and dependencies.",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nfrom pathlib import Path\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"## Collect the Data\n\nTo collect the data that you’ll need, complete the following steps:\n\nInstructions. \n\n1. Using the Pandas `read_csv` function and the `Path` module, import the data from `bitstamp.csv` file, and create a DataFrame called `bitstamp`. Set the DatetimeIndex as the Timestamp column, and be sure to parse and format the dates.\n\n2. Use the `head` (and/or the `tail`) function to confirm that Pandas properly imported the data.\n\n3. Repeat Steps 1 and 2 for `coinbase.csv` file.",
"_____no_output_____"
],
[
"### Step 1: Using the Pandas `read_csv` function and the `Path` module, import the data from `bitstamp.csv` file, and create a DataFrame called `bitstamp`. Set the DatetimeIndex as the Timestamp column, and be sure to parse and format the dates.",
"_____no_output_____"
]
],
[
[
"# Read in the CSV file called \"bitstamp.csv\" using the Path module. \n# The CSV file is located in the Resources folder.\n# Set the index to the column \"Timestamp\"\n# Set the parse_dates and infer_datetime_format parameters\nbitstamp = pd.read_csv(\n    Path(\"Resources/bitstamp.csv\"),\n    index_col = \"Timestamp\",\n    parse_dates = True,\n    infer_datetime_format = True\n)",
"_____no_output_____"
]
],
[
[
"### Step 2: Use the `head` (and/or the `tail`) function to confirm that Pandas properly imported the data.",
"_____no_output_____"
]
],
[
[
"# Use the head (and/or tail) function to confirm that the data was imported properly.\ndisplay(bitstamp.head())\ndisplay(bitstamp.tail())",
"_____no_output_____"
]
],
[
[
"### Step 3: Repeat Steps 1 and 2 for `coinbase.csv` file.",
"_____no_output_____"
]
],
[
[
"# Read in the CSV file called \"coinbase.csv\" using the Path module. \n# The CSV file is located in the Resources folder.\n# Set the index to the column \"Timestamp\"\n# Set the parse_dates and infer_datetime_format parameters\ncoinbase = pd.read_csv(\n Path(\"Resources/coinbase.csv\"),\n index_col = \"Timestamp\",\n parse_dates = True,\n infer_datetime_format = True\n)",
"_____no_output_____"
],
[
"# Use the head (and/or tail) function to confirm that the data was imported properly.\ndisplay(coinbase.head())\ndisplay(coinbase.tail())",
"_____no_output_____"
]
],
[
[
"## Prepare the Data\n\nTo prepare and clean your data for analysis, complete the following steps:\n\n1. For the bitstamp DataFrame, replace or drop all `NaN`, or missing, values in the DataFrame.\n\n2. Use the `str.replace` function to remove the dollar signs ($) from the values in the Close column.\n\n3. Convert the data type of the Close column to a `float`.\n\n4. Review the data for duplicated values, and drop them if necessary.\n\n5. Repeat Steps 1–4 for the coinbase DataFrame.",
"_____no_output_____"
],
[
"### Step 1: For the bitstamp DataFrame, replace or drop all `NaN`, or missing, values in the DataFrame.",
"_____no_output_____"
]
],
[
[
"# For the bitstamp DataFrame, replace or drop all NaNs or missing values in the DataFrame\n\n# Displays the null count\ndisplay(bitstamp.isnull().sum())\n\n# Uses fillna() to replace all null values with 0\nbitstamp = bitstamp.fillna(0)\n\n# Verifies nulls have been replaced\ndisplay(bitstamp.isnull().sum())",
"_____no_output_____"
]
],
[
[
"### Step 2: Use the `str.replace` function to remove the dollar signs ($) from the values in the Close column.",
"_____no_output_____"
]
],
[
[
"# Display head to show before $ removed\ndisplay(bitstamp.head())\n\n# Use the str.replace function to remove the dollar sign, $\nbitstamp.loc[:,\"Close\"] = bitstamp.loc[:,\"Close\"].str.replace(\"$\",\"\")\n\n# Display head to show after has been removed.\ndisplay(bitstamp.head())",
"_____no_output_____"
]
],
[
[
"### Step 3: Convert the data type of the Close column to a `float`.",
"_____no_output_____"
]
],
[
[
"# Convert the Close data type to a float\n# display type before change\ndisplay(bitstamp[\"Close\"].dtypes)\n\n# change the column type to float\nbitstamp.loc[:, \"Close\"] = bitstamp.loc[:, \"Close\"].astype(\"float\")\n\n# display the new type\ndisplay(bitstamp[\"Close\"].dtypes)\n\n",
"_____no_output_____"
]
],
[
[
"### Step 4: Review the data for duplicated values, and drop them if necessary.",
"_____no_output_____"
]
],
[
[
"# Review the data for duplicate values, and drop them if necessary\n\n# Count the number of duplicates\ndisplay(bitstamp.duplicated().sum())\n\n# Drop the duplicates\nbitstamp = bitstamp.drop_duplicates()\n\n# Count the number of duplicates\ndisplay(bitstamp.duplicated().sum())",
"_____no_output_____"
]
],
[
[
"### Step 5: Repeat Steps 1–4 for the coinbase DataFrame.",
"_____no_output_____"
]
],
[
[
"# Step 1 - Remove the Nulls\n\n# Displays the null count\ndisplay(coinbase.isnull().sum())\n\n# Uses fillna() to replace all null values with 0\ncoinbase = coinbase.fillna(0)\n\n# Verifies nulls have been replaced\ndisplay(coinbase.isnull().sum())\n\n\n# Step 2 - Remove $ from Close column\n\n# Display head to show before $ removed\ndisplay(coinbase.head())\n\n# Use the str.replace function to remove the dollar sign, $\ncoinbase.loc[:,\"Close\"] = coinbase.loc[:,\"Close\"].str.replace(\"$\",\"\")\n\n# Display head to show after has been removed.\ndisplay(coinbase.head())\n\n\n# Step 3 - Convert the Close data type to a float\n\n# Display type before change\ndisplay(coinbase[\"Close\"].dtypes)\n\n# Change the column type to float\ncoinbase.loc[:, \"Close\"] = coinbase.loc[:, \"Close\"].astype(\"float\")\n\n# Display the new type\ndisplay(coinbase[\"Close\"].dtypes)\n\n\n# Step 4 - Review the data for duplicate values, and drop them if necessary\n\n# Count the number of duplicates\ndisplay(coinbase.duplicated().sum())\n\n# Drop the duplicates\ncoinbase = coinbase.drop_duplicates()\n\n# Count the number of duplicates\ndisplay(coinbase.duplicated().sum())\n\n",
"_____no_output_____"
]
],
[
[
"## Analyze the Data\n\nYour analysis consists of the following tasks: \n\n1. Choose the columns of data on which to focus your analysis.\n\n2. Get the summary statistics and plot the data.\n\n3. Focus your analysis on specific dates.\n\n4. Calculate the arbitrage profits.",
"_____no_output_____"
],
[
"### Step 1: Choose columns of data on which to focus your analysis.\n\nSelect the data you want to analyze. Use `loc` or `iloc` to select the following columns of data for both the bitstamp and coinbase DataFrames:\n\n* Timestamp (index)\n\n* Close\n",
"_____no_output_____"
]
],
[
[
"# Use loc or iloc to select `Timestamp (the index)` and `Close` from bitstamp DataFrame\nbitstamp_sliced = bitstamp.iloc[:,[3]]\n\n# Review the first five rows of the DataFrame\nbitstamp_sliced.head()",
"_____no_output_____"
],
[
"# Use loc or iloc to select `Timestamp (the index)` and `Close` from coinbase DataFrame\ncoinbase_sliced = coinbase.iloc[:,[3]]\n\n# Review the first five rows of the DataFrame\ncoinbase_sliced.head()",
"_____no_output_____"
]
],
[
[
"### Step 2: Get summary statistics and plot the data.\n\nSort through the time series data associated with the bitstamp and coinbase DataFrames to identify potential arbitrage opportunities. To do so, complete the following steps:\n\n1. Generate the summary statistics for each DataFrame by using the `describe` function.\n\n2. For each DataFrame, create a line plot for the full period of time in the dataset. Be sure to tailor the figure size, title, and color to each visualization.\n\n3. In one plot, overlay the visualizations that you created in Step 2 for bitstamp and coinbase. Be sure to adjust the legend and title for this new visualization.\n\n4. Using the `loc` and `plot` functions, plot the price action of the assets on each exchange for different dates and times. Your goal is to evaluate how the spread between the two exchanges changed across the time period that the datasets define. Did the degree of spread change as time progressed?",
"_____no_output_____"
]
],
[
[
"# Generate the summary statistics for the bitstamp DataFrame\nbitstamp_sliced.describe()\n",
"_____no_output_____"
],
[
"# Generate the summary statistics for the coinbase DataFrame\ncoinbase_sliced.describe()",
"_____no_output_____"
],
[
"# Create a line plot for the bitstamp DataFrame for the full length of time in the dataset \n# Be sure that the figure size, title, and color are tailored to each visualization\nbitstamp_sliced.plot(figsize=(10, 5), title=\"Bitstamp Closing Prices\", color=\"blue\")",
"_____no_output_____"
],
[
"# Create a line plot for the coinbase DataFrame for the full length of time in the dataset \n# Be sure that the figure size, title, and color are tailored to each visualization\ncoinbase_sliced.plot(figsize=(10, 5), title=\"Coinbase Closing Prices\", color=\"orange\")",
"_____no_output_____"
],
[
"# Overlay the visualizations for the bitstamp and coinbase DataFrames in one plot\n# The plot should visualize the prices over the full length of the dataset\n# Be sure to include the parameters: legend, figure size, title, and color and label\ncoinbase_sliced[\"Close\"].plot(legend = True, figsize=(15, 7), title=\"Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
],
[
"# Using the loc and plot functions, create an overlay plot that visualizes \n# the price action of both DataFrames for a one month period early in the dataset\n# Be sure to include the parameters: legend, figure size, title, and color and label\ncoinbase_sliced[\"Close\"].loc[\"2018-01-01\" : \"2018-01-31\"].plot(legend = True, figsize=(15, 7), title=\"Jan. 2018 - Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].loc[\"2018-01-01\" : \"2018-01-31\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
],
[
"# Using the loc and plot functions, create an overlay plot that visualizes \n# the price action of both DataFrames for a one month period later in the dataset\n# Be sure to include the parameters: legend, figure size, title, and color and label \ncoinbase_sliced[\"Close\"].loc[\"2018-03-01\" : \"2018-03-31\"].plot(legend = True, figsize=(15, 7), title=\"March 2018 - Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].loc[\"2018-03-01\" : \"2018-03-31\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
]
],
[
[
"**Question** Based on the visualizations of the different time periods, has the degree of spread changed as time progressed?\n\n**Answer** Yes. Early in the timeframe, there were more frequent and more severe divergences between the price of Bitcoin on Bitstamp and Coinbase.",
"_____no_output_____"
],
[
"### Step 3: Focus Your Analysis on Specific Dates\n\nFocus your analysis on specific dates by completing the following steps:\n\n1. Select three dates to evaluate for arbitrage profitability. Choose one date that’s early in the dataset, one from the middle of the dataset, and one from the later part of the time period.\n\n2. For each of the three dates, generate the summary statistics and then create a box plot. This big-picture view is meant to help you gain a better understanding of the data before you perform your arbitrage calculations. As you compare the data, what conclusions can you draw?",
"_____no_output_____"
]
],
[
[
"# Create an overlay plot that visualizes the two dataframes over a period of one day early in the dataset. \n# Be sure that the plots include the parameters `legend`, `figsize`, `title`, `color` and `label` \ncoinbase_sliced[\"Close\"].loc[\"2018-01-24\" : \"2018-01-24\"].plot(legend = True, figsize=(15, 7), title=\"Jan. 24, 2018 - Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].loc[\"2018-01-24\" : \"2018-01-24\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
],
[
"# Using the early date that you have selected, calculate the arbitrage spread \n# by subtracting the coinbase lower closing prices from the bitstamp higher closing prices\narbitrage_spread_early = bitstamp_sliced['Close'].loc['2018-01-24'] - coinbase_sliced['Close'].loc['2018-01-24']\n\n# Generate summary statistics for the early DataFrame\narbitrage_spread_early.describe()",
"_____no_output_____"
],
[
"# Visualize the arbitrage spread from early in the dataset in a box plot\narbitrage_spread_early.plot(kind = \"box\", legend = True, figsize=(15, 7), title=\"Arbitrage Spread - Early\", color=\"blue\")",
"_____no_output_____"
],
[
"# Create an overlay plot that visualizes the two dataframes over a period of one day from the middle of the dataset. \n# Be sure that the plots include the parameters `legend`, `figsize`, `title`, `color` and `label` \ncoinbase_sliced[\"Close\"].loc[\"2018-02-23\" : \"2018-02-23\"].plot(legend = True, figsize=(15, 7), title=\"Feb. 23, 2018 - Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].loc[\"2018-02-23\" : \"2018-02-23\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
],
[
"# Using the date in the middle that you have selected, calculate the arbitrage spread \n# by subtracting the coinbase lower closing prices from the bitstamp higher closing prices\narbitrage_spread_middle = bitstamp_sliced['Close'].loc['2018-02-23'] - coinbase_sliced['Close'].loc['2018-02-23']\n\n# Generate summary statistics \narbitrage_spread_middle.describe()",
"_____no_output_____"
],
[
"# Visualize the arbitrage spread from the middle of the dataset in a box plot\narbitrage_spread_middle.plot(kind = \"box\", legend = True, figsize=(15, 7), title=\"Arbitrage Spread - Middle\", color=\"blue\")",
"_____no_output_____"
],
[
"# Create an overlay plot that visualizes the two dataframes over a period of one day from late in the dataset. \n# Be sure that the plots include the parameters `legend`, `figsize`, `title`, `color` and `label` \ncoinbase_sliced[\"Close\"].loc[\"2018-03-23\" : \"2018-03-23\"].plot(legend = True, figsize=(15, 7), title=\"March 23, 2018 - Bitstamp v. Coinbase\", color=\"orange\", label = \"Coinbase\") \nbitstamp_sliced[\"Close\"].loc[\"2018-03-23\" : \"2018-03-23\"].plot(legend = True, figsize=(15, 7), color=\"blue\", label = \"Bitstamp\")",
"_____no_output_____"
],
[
"# Using the late date that you have selected, calculate the arbitrage spread \n# by subtracting the coinbase lower closing prices from the bitstamp higher closing prices\narbitrage_spread_late = bitstamp_sliced['Close'].loc['2018-03-23'] - coinbase_sliced['Close'].loc['2018-03-23']\n\n\n# Generate summary statistics for the late DataFrame\narbitrage_spread_late.describe()",
"_____no_output_____"
],
[
"# Visualize the arbitrage spread from late in the dataset in a box plot\narbitrage_spread_late.plot(kind = \"box\", legend = True, figsize=(15, 7), title=\"Arbitrage Spread - Late\", color=\"blue\")",
"_____no_output_____"
]
],
[
[
"### Step 4: Calculate the Arbitrage Profits\n\nCalculate the potential profits for each date that you selected in the previous section. Your goal is to determine whether arbitrage opportunities still exist in the Bitcoin market. Complete the following steps:\n\n1. For each of the three dates, measure the arbitrage spread between the two exchanges by subtracting the lower-priced exchange from the higher-priced one. Then use a conditional statement to generate the summary statistics for each arbitrage_spread DataFrame, where the spread is greater than zero.\n\n2. For each of the three dates, calculate the spread returns. To do so, divide the instances that have a positive arbitrage spread (that is, a spread greater than zero) by the price of Bitcoin from the exchange you’re buying on (that is, the lower-priced exchange). Review the resulting DataFrame.\n\n3. For each of the three dates, narrow down your trading opportunities even further. To do so, determine the number of times your trades with positive returns exceed the 1% minimum threshold that you need to cover your costs.\n\n4. Generate the summary statistics of your spread returns that are greater than 1%. How do the average returns compare among the three dates?\n\n5. For each of the three dates, calculate the potential profit, in dollars, per trade. To do so, multiply the spread returns that were greater than 1% by the cost of what was purchased. Make sure to drop any missing values from the resulting DataFrame.\n\n6. Generate the summary statistics, and plot the results for each of the three DataFrames.\n\n7. Calculate the potential arbitrage profits that you can make on each day. To do so, sum the elements in the profit_per_trade DataFrame.\n\n8. Using the `cumsum` function, plot the cumulative sum of each of the three DataFrames. Can you identify any patterns or trends in the profits across the three time periods?\n\n(NOTE: The starter code displays only one date. You'll want to do this analysis for two additional dates).",
"_____no_output_____"
],
[
"#### 1. For each of the three dates, measure the arbitrage spread between the two exchanges by subtracting the lower-priced exchange from the higher-priced one. Then use a conditional statement to generate the summary statistics for each arbitrage_spread DataFrame, where the spread is greater than zero.\n\n*NOTE*: For illustration, only one of the three dates is shown in the starter code below.",
"_____no_output_____"
]
],
[
[
"# For the date early in the dataset, measure the arbitrage spread between the two exchanges\n# by subtracting the lower-priced exchange from the higher-priced one\n\n# Determines the arbitrage spread at three different times - early, middle, and late\narbitrage_spread_early = bitstamp_sliced['Close'].loc['2018-01-24'] - coinbase_sliced['Close'].loc['2018-01-24']\narbitrage_spread_middle = bitstamp_sliced['Close'].loc['2018-02-23'] - coinbase_sliced['Close'].loc['2018-02-23']\narbitrage_spread_late = bitstamp_sliced['Close'].loc['2018-03-23'] - coinbase_sliced['Close'].loc['2018-03-23']\n\n# Filters for positive returns at three different times - early, middle, and late\nspread_return_early_positive = arbitrage_spread_early[arbitrage_spread_early>0]\ndisplay(spread_return_early_positive.describe())\n\nspread_return_middle_positive = arbitrage_spread_middle[arbitrage_spread_middle>0]\ndisplay(spread_return_middle_positive.describe())\n\nspread_return_late_positive = arbitrage_spread_late[arbitrage_spread_late>0]\ndisplay(spread_return_late_positive.describe())\n",
"_____no_output_____"
]
],
[
[
"#### 2. For each of the three dates, calculate the spread returns. To do so, divide the instances that have a positive arbitrage spread (that is, a spread greater than zero) by the price of Bitcoin from the exchange you’re buying on (that is, the lower-priced exchange). Review the resulting DataFrame.",
"_____no_output_____"
]
],
[
[
"# For the date early in the dataset, calculate the spread returns by dividing the instances when the arbitrage spread is positive (> 0) \n# by the price of Bitcoin from the exchange you are buying on (the lower-priced exchange).\nspread_return_early= spread_return_early_positive / coinbase_sliced['Close'].loc['2018-01-24']\nspread_return_middle= spread_return_middle_positive / coinbase_sliced['Close'].loc['2018-02-23']\nspread_return_late= spread_return_late_positive / coinbase_sliced['Close'].loc['2018-03-23']\n\n\n\n# Review the spread return DataFrame\ndisplay(spread_return_early.describe())\ndisplay(spread_return_middle.describe())\ndisplay(spread_return_late.describe())",
"_____no_output_____"
]
],
[
[
"#### 3. For each of the three dates, narrow down your trading opportunities even further. To do so, determine the number of times your trades with positive returns exceed the 1% minimum threshold that you need to cover your costs.",
"_____no_output_____"
]
],
[
[
"# For the date early in the dataset, determine the number of times your trades with positive returns \n# exceed the 1% minimum threshold (.01) that you need to cover your costs\nprofitable_trades_early = spread_return_early[spread_return_early > .01]\nprofitable_trades_middle = spread_return_middle[spread_return_middle > .01]\nprofitable_trades_late = spread_return_late[spread_return_late > .01]\n\n# Review the first five profitable trades\ndisplay(profitable_trades_early.head())\ndisplay(profitable_trades_middle.head())\ndisplay(profitable_trades_late.head())",
"_____no_output_____"
]
],
[
[
"#### 4. Generate the summary statistics of your spread returns that are greater than 1%. How do the average returns compare among the three dates?",
"_____no_output_____"
]
],
[
[
"# For the date early in the dataset, generate the summary statistics for the profitable trades\n# or your trades where the spread returns are greater than 1%\n\ndisplay(profitable_trades_early.describe())\ndisplay(profitable_trades_middle.describe())\ndisplay(profitable_trades_late.describe())",
"_____no_output_____"
]
],
[
[
"#### 5. For each of the three dates, calculate the potential profit, in dollars, per trade. To do so, multiply the spread returns that were greater than 1% by the cost of what was purchased. Make sure to drop any missing values from the resulting DataFrame.",
"_____no_output_____"
]
],
[
[
"# For the date early in the dataset, calculate the potential profit per trade in dollars \n# Multiply the profitable trades by the cost of the Bitcoin that was purchased\nprofit_early = spread_return_early[spread_return_early > .01] * coinbase_sliced['Close'].loc['2018-01-24']\nprofit_middle = spread_return_middle[spread_return_middle > .01] * coinbase_sliced['Close'].loc['2018-02-23']\nprofit_late = spread_return_late[spread_return_late > .01] * coinbase_sliced['Close'].loc['2018-03-23']\n\n\n\n# Drop any missing values from the profit DataFrame\nprofit_per_trade_early = profit_early.dropna()\nprofit_per_trade_middle = profit_middle.dropna()\nprofit_per_trade_late = profit_late.dropna()\n\n# View the early profit DataFrame\ndisplay(profit_per_trade_early.head())\ndisplay(profit_per_trade_middle.head())\ndisplay(profit_per_trade_late.head())",
"_____no_output_____"
]
],
[
[
"#### 6. Generate the summary statistics, and plot the results for each of the three DataFrames.",
"_____no_output_____"
]
],
[
[
"# Generate the summary statistics for the early profit per trade DataFrame\ndisplay(profit_per_trade_early.describe())\ndisplay(profit_per_trade_middle.describe())\ndisplay(profit_per_trade_late.describe())",
"_____no_output_____"
],
[
"# Plot the results for the early profit per trade DataFrame\nprofit_per_trade_early.plot(legend = True, figsize=(15, 7), title=\"Profit from Trades Made on Jan. 24, 2018\", color=\"green\", label = \"Profit\")",
"_____no_output_____"
]
],
[
[
"#### 7. Calculate the potential arbitrage profits that you can make on each day. To do so, sum the elements in the profit_per_trade DataFrame.",
"_____no_output_____"
]
],
[
[
"# Calculate the sum of the potential profits for the early profit per trade DataFrame\ndisplay(profit_per_trade_early.sum())\ndisplay(profit_per_trade_middle.sum())\ndisplay(profit_per_trade_late.sum())",
"_____no_output_____"
]
],
[
[
"#### 8. Using the `cumsum` function, plot the cumulative sum of each of the three DataFrames. Can you identify any patterns or trends in the profits across the three time periods?",
"_____no_output_____"
]
],
[
[
"# Use the cumsum function to calculate the cumulative profits over time for the early profit per trade DataFrame\ncumulative_profit_early = profit_per_trade_early.cumsum()",
"_____no_output_____"
],
[
"# Plot the cumulative sum of profits for the early profit per trade DataFrame\ncumulative_profit_early.plot()",
"_____no_output_____"
]
],
[
[
"**Question:** After reviewing the profit information across each date from the different time periods, can you identify any patterns or trends?\n \n**Answer:** The clearest pattern is that arbitrage opportunities do not last very long. So, it is important to act on them early because over time, the spread narrows and eventually disappears. This is consistent with the idea of efficient marketplaces. Over time, the market finds and exploits inefficiencies, thereby eliminating them. This does not mean that they do not exist; it just means that they are rarer in established markets than emerging ones and, even in emerging markets, do not last long when there is a sufficiently profitable spread or volume such that the profit is high.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
d07d62989cd1ea5fe835b38cfc47520bb4234980 | 181 | ipynb | Jupyter Notebook | content_raj/Untitled1.ipynb | xbsd/CS109 | a61c6861cfe68791451c4c59d2deeb3507c5f7f9 | [
"MIT"
] | null | null | null | content_raj/Untitled1.ipynb | xbsd/CS109 | a61c6861cfe68791451c4c59d2deeb3507c5f7f9 | [
"MIT"
] | null | null | null | content_raj/Untitled1.ipynb | xbsd/CS109 | a61c6861cfe68791451c4c59d2deeb3507c5f7f9 | [
"MIT"
] | null | null | null | 20.111111 | 88 | 0.690608 | [
[
[
"empty"
]
]
] | [
"empty"
] | [
[
"empty"
]
] |
d07d62eee6df38f8460d1531f231744fc6a19a53 | 618,156 | ipynb | Jupyter Notebook | Library_design/CTP-10_Aire/Merfish_design_from_XJP.ipynb | shiwei23/Chromatin_Analysis_Scripts | 909b9b81de8fcf04dd4c39ac21a84864ce2003ff | [
"MIT"
] | null | null | null | Library_design/CTP-10_Aire/Merfish_design_from_XJP.ipynb | shiwei23/Chromatin_Analysis_Scripts | 909b9b81de8fcf04dd4c39ac21a84864ce2003ff | [
"MIT"
] | null | null | null | Library_design/CTP-10_Aire/Merfish_design_from_XJP.ipynb | shiwei23/Chromatin_Analysis_Scripts | 909b9b81de8fcf04dd4c39ac21a84864ce2003ff | [
"MIT"
] | null | null | null | 68.12387 | 17,564 | 0.710123 | [
[
[
"Design MERFISH probes using the example inputs from Jeff Moffitt. The original MATLAB design pipeline can be found at https://github.com/ZhuangLab/MERFISH_analysis.",
"_____no_output_____"
],
[
"# Prepare inputs",
"_____no_output_____"
]
],
[
[
"# Download the input data\n# This is for the UNIX-like operating systems. If you are using Windows, just download the files accordingly.\n!mkdir temporary_data\n!wget http://zhuang.harvard.edu/merfish/MERFISHData/MERFISH_Examples2.zip -O temporary_data/MERFISH_Examples2.zip\n!unzip -o temporary_data/MERFISH_Examples2.zip -d temporary_data \n# Make a path for output\n!mkdir temporary_data/MERFISH_Examples2/outputs",
"_____no_output_____"
],
[
"# Define all the input files you need in this script\nimport os\nref_folder = r'\\\\10.245.74.212\\Chromatin_NAS_2\\Libraries\\CTP-10_Aire\\MERFISH_designer'\n#os.makedirs(os.path.join(ref_folder, 'MERFISH_Examples2', 'outputs'))\n\n\ncodebook_file = os.path.join(ref_folder, r'MERFISH_Examples2\\codebook.csv')\ntranscripts_fasta_file = os.path.join(ref_folder, r'MERFISH_Examples2\\transcripts.fasta')\nfpkm_tracking_file = os.path.join(ref_folder, r'MERFISH_Examples2\\isoforms.fpkm_tracking')\nreadout_fasta_file = os.path.join(ref_folder, r'MERFISH_Examples2\\readouts.fasta')\nforward_primer_file = r'\\\\10.245.74.212\\Chromatin_NAS_2\\Libraries\\Primers\\forward_primers.fasta'\nreverse_primer_file = r'\\\\10.245.74.212\\Chromatin_NAS_2\\Libraries\\Primers\\reverse_primers.fasta'\nncRNA_file = os.path.join(ref_folder, r'MERFISH_Examples2\\Homo_sapiens.GRCh38.ncrna.fa')\n\n# Define the output files\nottable_transcriptome_file = os.path.join(ref_folder, r'MERFISH_Examples2\\outputs\\ottable_transcriptome.pkl')\nselected_primers_file = os.path.join(ref_folder, r'MERFISH_Examples2\\outputs\\selected_primers.csv')\nprobe_output_file = os.path.join(ref_folder, r'MERFISH_Examples2\\outputs\\designed_probes.csv')\ntranscript_level_report_file = os.path.join(ref_folder, r'MERFISH_Examples2\\outputs\\transcript_level_report.csv')",
"_____no_output_____"
],
[
"print(transcripts_fasta_file)",
"\\\\10.245.74.212\\Chromatin_NAS_2\\Libraries\\CTP-10_Aire\\MERFISH_designer\\MERFISH_Examples2\\transcripts.fasta\n"
]
],
[
[
"# Initialize data structures",
"_____no_output_____"
]
],
[
[
"# Import the modules\nimport os\nimport sys\n\nfrom IPython.display import display\nimport MERFISH_probe_design\n\nimport MERFISH_probe_design.IO.file_io as fio\nimport MERFISH_probe_design.probe_design.probe_dict as p_d\nimport MERFISH_probe_design.probe_design.OTTable_dict as ot\nimport MERFISH_probe_design.probe_design.readout_sequences as rs\nimport MERFISH_probe_design.probe_design.probe_selection as ps\nimport MERFISH_probe_design.probe_design.quality_check as qc\nfrom MERFISH_probe_design.probe_design import filters\nfrom MERFISH_probe_design.probe_design import plot\nfrom MERFISH_probe_design.probe_design import primer_design",
"_____no_output_____"
]
],
[
[
"## Load the transcriptome",
"_____no_output_____"
]
],
[
[
"# Load the transcriptome as a pandas data frame\ntranscriptome = fio.load_transcriptome(transcripts_fasta_file, fpkm_tracking_file)\n\n# Make sure that the transcriptome data frame has the standard column names.\n# The standard columns are: transcript_id, sequence, gene_id, gene_short_name and FPKM.\n# Also remove the non-standard columns for clarity.\ntranscriptome = qc.check_and_standardize_transcriptome(transcriptome, remove_non_standard_columns=True)\n\ntranscriptome # Let's have a look at what's inside the transcriptome",
"Loaded 198798 transcripts.\nLoaded FPKMs for 215942 transcripts of 66008 genes.\nKept 198664 transcripts of 60655 genes after merging.\n"
],
[
"# Load the transcriptome as a pandas data frame\ntranscriptome = fio.load_transcriptome(transcripts_fasta_file, )#fpkm_tracking_file)\n\n# Make sure that the transcriptome data frame has the standard column names.\n# The standard columns are: transcript_id, sequence, gene_id, gene_short_name and FPKM.\n# Also remove the non-standard columns for clarity.\ntranscriptome = qc.check_and_standardize_transcriptome(transcriptome, remove_non_standard_columns=True)\n\ntranscriptome # Let's have a look at what's inside the transcriptome",
"Loaded 198798 transcripts.\n"
],
[
"# Load the codebook\ncb_version, cb_name, bit_names, barcode_table = fio.load_merlin_codebook(codebook_file)\ngene_ids = list(barcode_table['name'][barcode_table['id'] != '']) # Get the non-blank gene names\ntranscript_ids = set(barcode_table['id'][barcode_table['id'] != '']) # Get the non-blank transcript ids\nbarcode_table # Let's have a look at the barcode table",
"_____no_output_____"
],
[
"# Initialize the probe dictionary, which carries the probes throughout the design process.\nprobe_dict = p_d.init_probe_dict(gene_ids, transcriptome, 'gene_short_name', K=30)\np_d.print_probe_dict(probe_dict)",
"Found 943 transcripts for 130 target genes.\nGene\tTranscript\tN_probes\nVPS13D\n\tENST00000613099.4\t16216\n\tENST00000620676.4\t16291\n\tENST00000476169.1\t446\n\tENST00000489961.1\t414\n\tENST00000011700.10\t10940\n\tENST00000460333.5\t4881\n\tENST00000487188.1\t3961\n\tENST00000469054.1\t538\n\tENST00000543710.5\t5647\n\tENST00000466732.2\t348\n\tENST00000476045.5\t351\n\tENST00000481484.1\t331\n\tENST00000543766.2\t3809\n\tENST00000473099.1\t432\nPRDM2\n\tENST00000484063.6\t599\n\tENST00000376048.9\t2659\n\tENST00000311066.9\t7408\n\tENST00000235372.11\t7928\n\tENST00000491134.5\t1071\n\tENST00000502727.5\t553\n\tENST00000502724.5\t512\n\tENST00000413440.5\t6146\n\tENST00000343137.8\t5768\n\tENST00000491815.1\t629\n\tENST00000503842.5\t676\n\tENST00000407521.7\t484\n\tENST00000505823.5\t539\n\tENST00000487453.1\t617\nLUZP1\n\tENST00000418342.5\t8392\n\tENST00000302291.8\t8869\n\tENST00000314174.5\t3686\n\tENST00000471849.5\t1161\n\tENST00000475164.2\t584\nCNR2\n\tENST00000374472.4\t5225\nAHDC1\n\tENST00000247087.9\t6305\n\tENST00000374011.6\t6409\n\tENST00000487743.2\t388\n\tENST00000490295.1\t482\n\tENST00000480033.2\t517\nAGO3\n\tENST00000324350.9\t1697\n\tENST00000373191.8\t19658\n\tENST00000397828.3\t1006\n\tENST00000246314.10\t3182\n\tENST00000491443.2\t519\n\tENST00000471099.1\t573\nRAB3B\n\tENST00000371655.3\t12815\nUSP24\n\tENST00000294383.6\t10520\n\tENST00000484447.5\t585\n\tENST00000480962.1\t377\n\tENST00000472566.1\t702\n\tENST00000512504.1\t539\n\tENST00000482197.1\t571\nMAN1A2\n\tENST00000356554.7\t8547\n\tENST00000482811.1\t695\n\tENST00000449370.6\t2053\n\tENST00000421535.5\t649\n\tENST00000422329.1\t458\nNOTCH2\n\tENST00000256646.6\t11360\n\tENST00000493703.1\t524\n\tENST00000478864.1\t491\n\tENST00000479412.2\t3192\n\tENST00000579475.7\t2925\n\tENST00000602566.5\t5017\n\tENST00000612822.1\t5012\n\tENST00000489731.1\t385\nTPR\n\tENST00000367478.8\t9679\n\tENST00000467810.1\t552\n\tENST00000492973.1\t424\n\tENST00000481347.1\t2698\n\tENST00000
491783.5\t853\n\tENST00000469463.1\t391\n\tENST00000474852.1\t2728\n\tENST00000613151.1\t2449\n\tENST00000451586.1\t486\nPLXNA2\n\tENST00000367033.3\t11415\n\tENST00000483048.1\t2831\n\tENST00000480053.1\t332\n\tENST00000463510.1\t469\n\tENST00000460870.1\t727\n\tENST00000486969.1\t910\nDIEXF\n\tENST00000491415.6\t8417\n\tENST00000457820.1\t2866\nPTPN14\n\tENST00000366956.9\t12956\n\tENST00000543945.5\t3813\n\tENST00000473261.1\t413\n\tENST00000486173.1\t458\n\tENST00000491277.1\t5523\nCENPF\n\tENST00000464322.5\t660\n\tENST00000366955.7\t10278\n\tENST00000495259.1\t316\n\tENST00000614578.1\t499\n\tENST00000467765.1\t1398\n\tENST00000469862.1\t705\nFAM208B\n\tENST00000328090.9\t8597\n\tENST00000477043.5\t485\n\tENST00000473789.5\t477\n\tENST00000496681.5\t493\n\tENST00000480839.6\t511\n\tENST00000532424.5\t404\n\tENST00000482419.1\t395\n\tENST00000463468.5\t531\n\tENST00000527124.1\t545\n\tENST00000532080.1\t456\n\tENST00000380270.3\t709\n\tENST00000487196.1\t547\n\tENST00000459693.1\t1812\nKIAA1462\n\tENST00000375377.1\t9236\n\tENST00000464386.1\t655\n\tENST00000465712.1\t561\nCHST3\n\tENST00000373115.4\t6941\nRBM20\n\tENST00000369519.3\t7204\n\tENST00000471172.1\t451\n\tENST00000480343.2\t544\n\tENST00000465774.2\t876\nBUB3\n\tENST00000368865.8\t7799\n\tENST00000368859.6\t638\n\tENST00000368858.9\t1332\n\tENST00000407911.2\t866\n\tENST00000481952.1\t761\nCKAP5\n\tENST00000529230.5\t7092\n\tENST00000312055.9\t6503\n\tENST00000354558.7\t6399\n\tENST00000533413.5\t3055\n\tENST00000525896.1\t803\n\tENST00000527333.5\t469\n\tENST00000526876.2\t541\n\tENST00000526943.1\t534\n\tENST00000528593.1\t337\n\tENST00000532321.1\t557\n\tENST00000526496.1\t553\n\tENST00000525248.1\t669\nMALAT1\n\tENST00000534336.1\t8679\n\tENST00000619449.1\t765\n\tENST00000620902.1\t323\n\tENST00000617791.1\t365\n\tENST00000544868.2\t451\n\tENST00000610481.1\t324\n\tENST00000508832.2\t1490\n\tENST00000616527.4\t543\n\tENST00000625158.1\t1560\n\tENST00000618925.1\t395\n\tENST00000618249.1\t66\n\t
ENST00000620465.4\t304\n\tENST00000612781.1\t205\n\tENST00000618132.1\t696\n\tENST00000613376.1\t103\n\tENST00000610851.1\t555\n\tENST00000618227.1\t564\n\tENST00000617489.1\t289\n\tENST00000616691.1\t558\nRNF169\n\tENST00000299563.4\t7794\n\tENST00000527301.1\t311\nFZD4\n\tENST00000531380.1\t7354\nAMOTL1\n\tENST00000299004.13\t624\n\tENST00000317829.12\t8815\n\tENST00000433060.2\t8941\n\tENST00000539727.1\t449\n\tENST00000537191.1\t828\nCBL\n\tENST00000264033.4\t11436\nITPR2\n\tENST00000381340.7\t11376\n\tENST00000451599.6\t2548\n\tENST00000538984.1\t557\n\tENST00000540429.1\t371\n\tENST00000543958.1\t515\n\tENST00000545902.1\t552\n\tENST00000540791.1\t546\n\tENST00000545235.1\t511\n\tENST00000242737.5\t552\n\tENST00000536627.1\t540\nSLC38A1\n\tENST00000398637.9\t8037\n\tENST00000549049.5\t3167\n\tENST00000439706.5\t3407\n\tENST00000546893.5\t2303\n\tENST00000552197.5\t2570\n\tENST00000612161.4\t1578\n\tENST00000548979.1\t514\n\tENST00000549633.5\t1047\n\tENST00000551506.1\t557\n\tENST00000550173.1\t862\n\tENST00000546519.1\t280\nDIP2B\n\tENST00000549620.5\t2647\n\tENST00000301180.9\t8564\n\tENST00000546719.1\t2268\n\tENST00000552328.1\t400\n\tENST00000546732.1\t3485\nCBX5\n\tENST00000209875.8\t11499\n\tENST00000439541.6\t1803\n\tENST00000547872.1\t716\n\tENST00000550411.5\t865\n\tENST00000552562.1\t646\n\tENST00000618078.1\t2598\nANKRD52\n\tENST00000548241.1\t622\n\tENST00000267116.7\t8659\n\tENST00000548081.1\t527\n\tENST00000551023.1\t364\nLRP1\n\tENST00000553277.5\t1664\n\tENST00000243077.7\t14868\n\tENST00000338962.8\t2140\n\tENST00000554174.1\t2146\n\tENST00000556830.1\t1818\n\tENST00000553446.1\t263\n\tENST00000554118.1\t389\n\tENST00000555941.1\t470\n\tENST00000555124.1\t521\n\tENST00000556247.1\t582\n\tENST00000451724.6\t1255\n\tENST00000556356.1\t5085\nSSH1\n\tENST00000326495.9\t13011\n\tENST00000546433.5\t5592\n\tENST00000551165.5\t2325\n\tENST00000326470.9\t2403\n\tENST00000547862.6\t596\n\tENST00000548522.5\t573\n\tENST00000547381.1\t447\n\tENST0000054
6697.1\t554\n\tENST00000546812.1\t559\nBRCA2\n\tENST00000380152.7\t11957\n\tENST00000544455.5\t10955\n\tENST00000530893.6\t1982\n\tENST00000614259.1\t7921\n\tENST00000528762.1\t466\n\tENST00000470094.1\t813\n\tENST00000533776.1\t494\nPROSER1\n\tENST00000352251.7\t5139\n\tENST00000625998.2\t5073\n\tENST00000484434.3\t2121\n\tENST00000468017.1\t408\n\tENST00000492646.1\t476\n\tENST00000602512.1\t540\n\tENST00000602899.1\t545\n\tENST00000496138.2\t389\n\tENST00000436678.1\t472\n\tENST00000418503.1\t869\n\tENST00000602534.1\t547\nAKAP11\n\tENST00000025301.3\t9891\nMCF2L\n\tENST00000375608.7\t3619\n\tENST00000397036.4\t1515\n\tENST00000442625.2\t2681\n\tENST00000397030.5\t6551\n\tENST00000535094.6\t3723\n\tENST00000421756.5\t3531\n\tENST00000433807.5\t528\n\tENST00000409954.6\t545\n\tENST00000375604.6\t6416\n\tENST00000375597.8\t3384\n\tENST00000486210.1\t519\n\tENST00000397024.1\t461\n\tENST00000486806.1\t332\n\tENST00000473345.5\t370\n\tENST00000480321.1\t5698\n\tENST00000469558.1\t587\n\tENST00000397021.5\t547\n\tENST00000494043.1\t867\n\tENST00000423251.1\t705\n\tENST00000464800.5\t691\n\tENST00000475524.1\t575\n\tENST00000397017.6\t3874\n\tENST00000487354.1\t524\n\tENST00000413354.5\t923\n\tENST00000261963.11\t2390\n\tENST00000453297.5\t1790\n\tENST00000439475.1\t424\n\tENST00000420013.5\t537\n\tENST00000441756.1\t341\n\tENST00000491028.1\t544\n\tENST00000469415.1\t2998\n\tENST00000488765.1\t3176\nC14orf132\n\tENST00000555004.1\t7745\n\tENST00000553764.1\t533\n\tENST00000556728.1\t745\n\tENST00000553782.1\t507\nDYNC1H1\n\tENST00000360184.8\t14304\n\tENST00000555204.1\t324\n\tENST00000554854.1\t692\n\tENST00000556791.1\t909\n\tENST00000553423.1\t510\n\tENST00000555800.1\t445\n\tENST00000556139.1\t734\n\tENST00000556499.1\t525\n\tENST00000555102.1\t532\n\tENST00000556229.1\t553\n\tENST00000555062.1\t1322\nHERC2\n\tENST00000261609.11\t15308\n\tENST00000566635.5\t2085\n\tENST00000562136.1\t603\n\tENST00000568206.1\t1989\n\tENST00000564519.1\t294\n\tENST00000569772.1\t56
9\n\tENST00000567869.1\t3494\n\tENST00000569335.1\t553\n\tENST00000564734.5\t3092\n\tENST00000563670.1\t589\n\tENST00000564383.1\t442\n\tENST00000563945.1\t462\nTHBS1\n\tENST00000397591.2\t553\n\tENST00000260356.5\t7746\n\tENST00000497720.1\t498\n\tENST00000466755.1\t471\n\tENST00000490247.1\t402\n\tENST00000560894.1\t508\n\tENST00000484734.1\t1262\n\tENST00000559746.1\t534\nSTARD9\n\tENST00000564158.5\t6020\n\tENST00000290607.11\t15538\n\tENST00000563872.5\t524\n\tENST00000568493.1\t2440\n\tENST00000569419.1\t637\n\tENST00000562139.1\t627\n\tENST00000562619.1\t7208\nFBN1\n\tENST00000316623.9\t11727\n\tENST00000559133.5\t4872\n\tENST00000561429.1\t631\n\tENST00000560720.1\t657\n\tENST00000537463.6\t3737\n\tENST00000560820.1\t550\n\tENST00000560355.1\t4080\n\tENST00000558230.1\t453\nUSP8\n\tENST00000560297.5\t623\n\tENST00000560954.5\t553\n\tENST00000307179.8\t4214\n\tENST00000396444.7\t18997\n\tENST00000425032.7\t3530\n\tENST00000558892.1\t587\n\tENST00000560885.5\t496\n\tENST00000559329.5\t1888\n\tENST00000560730.5\t1420\n\tENST00000625664.2\t1539\n\tENST00000560982.5\t381\n\tENST00000561211.5\t580\n\tENST00000558091.5\t469\n\tENST00000559242.5\t403\n\tENST00000561330.1\t786\n\tENST00000560527.1\t506\n\tENST00000561206.1\t527\n\tENST00000419830.2\t2373\n\tENST00000560379.1\t708\nTMOD2\n\tENST00000249700.8\t9162\n\tENST00000539962.6\t2971\n\tENST00000560576.1\t2319\n\tENST00000435126.6\t1461\n\tENST00000561300.1\t513\n\tENST00000561407.1\t506\nTHSD4\n\tENST00000355327.7\t9171\n\tENST00000620694.1\t387\n\tENST00000567745.5\t1189\n\tENST00000357769.4\t3374\n\tENST00000261862.7\t3104\n\tENST00000567838.1\t2616\n\tENST00000567776.2\t5560\nTSPAN3\n\tENST00000267970.8\t6438\n\tENST00000560715.5\t2863\n\tENST00000346495.6\t1618\n\tENST00000424443.7\t1501\n\tENST00000561277.5\t1563\n\tENST00000559494.1\t1297\n\tENST00000558745.5\t1311\n\tENST00000558394.5\t521\n\tENST00000560514.1\t609\n\tENST00000559373.1\t364\nCEMIP\n\tENST00000220244.7\t7197\n\tENST00000356249.9\t7268\n\
tENST00000394685.7\t7328\n\tENST00000611615.1\t3507\n\tENST00000560027.1\t619\n\tENST00000495041.1\t356\n\tENST00000559966.1\t459\nZNF592\n\tENST00000560079.6\t8102\n\tENST00000559607.1\t3881\n\tENST00000299927.4\t7834\n\tENST00000618477.1\t179\nSRRM2\n\tENST00000301740.12\t9324\n\tENST00000576924.5\t3285\n\tENST00000575009.5\t1117\n\tENST00000575701.4\t831\n\tENST00000576894.5\t2181\n\tENST00000630499.2\t1178\n\tENST00000576415.5\t432\n\tENST00000571378.5\t2841\n\tENST00000574340.1\t550\n\tENST00000570971.2\t1103\n\tENST00000575870.5\t667\n\tENST00000573451.5\t412\n\tENST00000571372.1\t433\n\tENST00000570655.1\t546\n\tENST00000573498.5\t965\n\tENST00000572278.1\t682\n\tENST00000576076.1\t885\n\tENST00000572952.1\t1089\n\tENST00000576674.1\t564\n\tENST00000574593.1\t1411\n\tENST00000572883.1\t858\n\tENST00000572721.1\t812\n\tENST00000573311.1\t703\n\tENST00000571041.1\t469\n\tENST00000570705.1\t646\n\tENST00000576878.1\t1112\n\tENST00000574331.5\t1730\n\tENST00000574866.1\t609\n\tENST00000573692.1\t430\n\tENST00000570539.1\t402\n\tENST00000573583.2\t514\nCREBBP\n\tENST00000262367.9\t10774\n\tENST00000382070.7\t7569\n\tENST00000571763.5\t519\n\tENST00000576720.1\t3549\n\tENST00000574740.1\t494\n\tENST00000573517.5\t687\n\tENST00000570939.1\t835\n\tENST00000572569.1\t524\n\tENST00000573672.1\t893\n\tENST00000571826.5\t598\n\tENST00000572134.1\t817\n\tENST00000575237.1\t340\nTNRC6A\n\tENST00000315183.11\t8274\n\tENST00000562829.1\t248\n\tENST00000395799.7\t8409\n\tENST00000491718.5\t7927\n\tENST00000450465.6\t3671\n\tENST00000568903.5\t914\n\tENST00000567232.1\t494\n\tENST00000561726.1\t514\n\tENST00000568806.1\t269\n\tENST00000477487.5\t2711\n\tENST00000462400.1\t2521\n\tENST00000568750.1\t580\n\tENST00000563201.1\t247\n\tENST00000569376.5\t581\n\tENST00000569634.1\t523\n\tENST00000569098.1\t558\n\tENST00000464539.1\t1824\nCCDC113\n\tENST00000569374.1\t527\n\tENST00000219299.8\t5243\n\tENST00000443128.6\t5081\n\tENST00000561517.5\t528\n\tENST00000616795.1\t862\n\tENST
00000566498.1\t465\nCDYL2\n\tENST00000570137.6\t8130\n\tENST00000562812.5\t2057\n\tENST00000563890.5\t2057\n\tENST00000566173.3\t2057\n\tENST00000561616.2\t626\n\tENST00000567924.1\t424\n\tENST00000562753.1\t528\nPRPF8\n\tENST00000571958.1\t351\n\tENST00000304992.10\t7266\n\tENST00000572621.5\t7416\n\tENST00000572723.1\t930\n\tENST00000575116.1\t431\n\tENST00000576585.1\t2128\n\tENST00000573681.1\t1015\n\tENST00000573725.1\t720\n\tENST00000572445.1\t619\n\tENST00000574217.1\t281\n\tENST00000577001.1\t3245\n\tENST00000576958.1\t387\n\tENST00000576407.1\t537\n\tENST00000573716.1\t513\n\tENST00000574728.1\t437\n\tENST00000571346.1\t836\nMYH10\n\tENST00000360416.7\t7733\n\tENST00000379980.8\t7665\n\tENST00000269243.8\t7633\n\tENST00000476737.1\t563\n\tENST00000488329.1\t509\n\tENST00000465458.1\t770\n\tENST00000469865.1\t475\n\tENST00000584124.1\t288\n\tENST00000459986.1\t682\n\tENST00000472728.1\t354\n\tENST00000411957.1\t703\nC17orf51\n\tENST00000540508.1\t373\n\tENST00000538604.1\t508\n\tENST00000391411.9\t7885\n\tENST00000535846.1\t679\n\tENST00000412778.3\t848\nRAD51D\n\tENST00000345365.10\t9959\n\tENST00000394589.8\t2258\n\tENST00000588372.5\t2185\n\tENST00000588594.5\t1380\n\tENST00000335858.11\t1324\n\tENST00000586044.5\t1542\n\tENST00000587977.5\t1703\n\tENST00000590016.5\t1499\n\tENST00000586210.5\t1165\n\tENST00000460118.6\t1035\n\tENST00000587405.5\t647\n\tENST00000592577.5\t733\n\tENST00000590631.1\t532\n\tENST00000415064.6\t957\n\tENST00000587982.5\t653\n\tENST00000592430.5\t669\n\tENST00000585947.5\t643\n\tENST00000585982.5\t567\n\tENST00000585343.5\t574\n\tENST00000586186.2\t260\n\tENST00000592850.5\t357\n\tENST00000592928.2\t178\n\tENST00000589506.1\t535\nPRKCA\n\tENST00000578063.5\t1704\n\tENST00000583361.1\t490\n\tENST00000413366.7\t8722\n\tENST00000284384.6\t2689\n\tENST00000583775.1\t292\nSMIM5\n\tENST00000375215.3\t1517\n\tENST00000581115.1\t633\n\tENST00000537494.1\t4315\nFASN\n\tENST00000306749.2\t8531\n\tENST00000584610.1\t1223\n\tENST0000058038
2.1\t652\n\tENST00000579758.1\t824\n\tENST00000578424.1\t807\n\tENST00000579410.1\t395\nALPK2\n\tENST00000361673.3\t7274\n\tENST00000589204.1\t2744\n\tENST00000587842.1\t542\n\tENST00000587399.1\t818\n\tENST00000590662.1\t2197\n\tENST00000590642.1\t526\nRNF152\n\tENST00000312828.3\t9472\n\tENST00000591306.5\t653\n\tENST00000588064.5\t520\n\tENST00000619552.1\t763\n\tENST00000588396.1\t428\nDSEL\n\tENST00000310045.7\t9502\nSIPA1L3\n\tENST00000222345.10\t7958\n\tENST00000599644.1\t516\n\tENST00000476317.2\t579\n\tENST00000595982.1\t134\n\tENST00000595384.1\t756\n\tENST00000601222.1\t517\n\tENST00000594553.1\t345\n\tENST00000601881.1\t539\n\tENST00000600919.2\t378\n\tENST00000595495.1\t555\n\tENST00000601054.1\t543\n\tENST00000596403.1\t782\nXDH\n\tENST00000379416.3\t5659\n\tENST00000491727.5\t552\n\tENST00000476043.1\t446\nYIPF4\n\tENST00000238831.8\t11920\n\tENST00000495355.1\t515\n\tENST00000437765.1\t699\n\tENST00000441084.1\t283\nHEATR5B\n\tENST00000425467.5\t404\n\tENST00000233099.5\t6876\n\tENST00000471051.1\t807\n\tENST00000467978.1\t582\n\tENST00000478810.1\t721\nSPTBN1\n\tENST00000356805.8\t8453\n\tENST00000615901.4\t10197\n\tENST00000389980.7\t2135\n\tENST00000602898.1\t2913\n\tENST00000333896.5\t9046\n\tENST00000496323.1\t2196\n\tENST00000467371.1\t3303\nUSP34\n\tENST00000411912.5\t4280\n\tENST00000398571.6\t11328\n\tENST00000436269.1\t1798\n\tENST00000492604.1\t498\n\tENST00000463046.5\t6278\n\tENST00000490552.1\t1764\n\tENST00000498268.1\t468\n\tENST00000476716.5\t577\n\tENST00000467128.5\t502\n\tENST00000472689.5\t747\n\tENST00000472706.5\t527\n\tENST00000490527.1\t531\n\tENST00000483672.1\t336\n\tENST00000482250.1\t486\n\tENST00000487547.1\t583\n\tENST00000468703.3\t199\n\tENST00000453734.1\t2679\n\tENST00000492882.1\t319\n\tENST00000484179.5\t898\n\tENST00000494867.1\t766\n\tENST00000460004.1\t543\n\tENST00000453133.1\t1380\nSLC9A2\n\tENST00000233969.2\t5381\n\tENST00000467657.1\t369\n\tENST00000469286.1\t555\nSULT1C2\n\tENST00000251481.10\t2766\n\tENS
T00000326853.9\t2799\n\tENST00000438339.5\t547\n\tENST00000409880.5\t1177\n\tENST00000437390.6\t1330\n\tENST00000442801.5\t843\n\tENST00000409067.2\t899\n\tENST00000492554.1\t144\n\tENST00000495441.1\t6337\nAGPS\n\tENST00000264167.8\t7735\n\tENST00000409888.1\t786\n\tENST00000460342.1\t547\nFZD5\n\tENST00000295417.3\t6679\nAGAP1\n\tENST00000409457.5\t2136\n\tENST00000336665.9\t5398\n\tENST00000304032.12\t10803\n\tENST00000402604.6\t534\n\tENST00000409538.5\t6109\n\tENST00000619456.4\t1028\n\tENST00000614409.4\t9903\n\tENST00000428334.6\t10061\n\tENST00000448025.5\t728\n\tENST00000418654.1\t797\n\tENST00000466575.5\t512\n\tENST00000482882.1\t835\n\tENST00000453371.2\t2109\nCEP250\n\tENST00000446710.5\t839\n\tENST00000420564.5\t673\n\tENST00000476146.5\t521\n\tENST00000397527.5\t15674\n\tENST00000461386.5\t2266\n\tENST00000397524.5\t1390\n\tENST00000425934.5\t2127\n\tENST00000465987.1\t351\n\tENST00000474829.1\t327\n\tENST00000425096.1\t600\n\tENST00000487467.1\t312\n\tENST00000425525.1\t889\n\tENST00000422671.1\t2619\n\tENST00000621352.1\t578\nTTPAL\n\tENST00000262605.8\t6183\n\tENST00000372904.7\t6205\n\tENST00000372906.2\t1110\n\tENST00000456317.1\t897\n\tENST00000461134.1\t3426\nNRIP1\n\tENST00000318948.6\t7527\n\tENST00000400199.5\t7533\n\tENST00000400202.5\t7642\n\tENST00000411932.1\t691\nSLC5A3\n\tENST00000381151.3\t11545\nIL17RA\n\tENST00000477874.1\t622\n\tENST00000319363.10\t8578\n\tENST00000612619.1\t8477\n\tENST00000459971.1\t592\nRP4-671O14.6\n\tENST00000624919.1\t14233\nCRTAP\n\tENST00000320954.10\t6601\n\tENST00000449224.1\t1782\n\tENST00000485310.1\t533\nNKTR\n\tENST00000232978.12\t7314\n\tENST00000468735.6\t7935\n\tENST00000460910.5\t253\n\tENST00000487466.5\t2306\n\tENST00000429888.5\t7296\n\tENST00000617821.4\t7290\n\tENST00000459950.1\t2105\n\tENST00000442970.5\t617\n\tENST00000445842.1\t531\n\tENST00000478488.1\t669\n\tENST00000465584.5\t3810\n\tENST00000466553.1\t687\n\tENST00000498730.5\t4883\n\tENST00000472127.1\t767\n\tENST00000508351.5\t847\n
\tENST00000460807.2\t865\n\tENST00000464315.1\t515\n\tENST00000472258.1\t547\n\tENST00000490189.1\t552\nFYCO1\n\tENST00000535325.5\t8549\n\tENST00000433878.5\t4288\n\tENST00000296137.6\t8475\n\tENST00000438446.1\t1043\n\tENST00000471739.1\t516\nBSN\n\tENST00000296452.4\t15926\n\tENST00000467456.1\t833\nPOLQ\n\tENST00000621776.4\t9026\n\tENST00000264233.5\t8746\n\tENST00000474243.1\t584\n\tENST00000488282.1\t810\nUMPS\n\tENST00000232607.6\t6709\n\tENST00000462091.5\t1972\n\tENST00000628619.1\t199\n\tENST00000467167.5\t2213\n\tENST00000460034.5\t2052\n\tENST00000474588.5\t1760\n\tENST00000479719.5\t2311\n\tENST00000497791.5\t1662\n\tENST00000498715.1\t803\n\tENST00000487622.5\t770\n\tENST00000495751.1\t529\nPLXNA1\n\tENST00000393409.2\t9037\n\tENST00000503234.5\t4417\n\tENST00000503363.1\t381\n\tENST00000505278.1\t507\nDNAJC13\n\tENST00000260818.10\t7701\n\tENST00000486798.5\t2277\n\tENST00000471925.1\t1410\n\tENST00000464766.1\t746\n\tENST00000506813.1\t772\n\tENST00000513822.1\t309\n\tENST00000463038.1\t517\n\tENST00000509279.1\t644\nKPNA4\n\tENST00000334256.8\t8952\n\tENST00000483437.1\t481\n\tENST00000469804.1\t761\nPIK3CA\n\tENST00000477735.1\t216\n\tENST00000263967.3\t9064\n\tENST00000468036.1\t591\n\tENST00000462255.1\t717\nAFAP1\n\tENST00000360265.8\t7450\n\tENST00000358461.6\t7487\n\tENST00000420658.5\t7739\n\tENST00000505447.5\t1488\n\tENST00000513842.1\t1609\n\tENST00000382543.3\t2561\n\tENST00000508415.1\t517\n\tENST00000614385.1\t327\n\tENST00000513856.1\t735\n\tENST00000612691.1\t729\nFAM184B\n\tENST00000265018.3\t6593\nSLC7A11\n\tENST00000280612.9\t9616\n\tENST00000509248.1\t768\nSMARCA5\n\tENST00000283131.3\t7894\n\tENST00000515531.1\t500\n\tENST00000508573.1\t334\nANKH\n\tENST00000284268.6\t8174\n\tENST00000502585.1\t801\n\tENST00000503939.5\t581\n\tENST00000515517.1\t586\n\tENST00000513115.1\t668\n\tENST00000503389.1\t258\n\tENST00000505140.1\t1837\nFBN2\n\tENST00000262464.8\t10695\n\tENST00000619499.4\t10125\n\tENST00000508053.5\t11103\n\tENST000005
07835.5\t1855\n\tENST00000508989.5\t4958\n\tENST00000511489.1\t713\n\tENST00000502468.5\t1918\n\tENST00000620257.1\t1912\n\tENST00000514742.1\t1948\nAFF4\n\tENST00000265343.9\t9523\n\tENST00000378595.7\t3319\n\tENST00000478588.5\t868\n\tENST00000378593.6\t401\n\tENST00000425658.1\t484\n\tENST00000477369.5\t699\n\tENST00000491831.5\t1638\n\tENST00000465484.1\t1700\n\tENST00000421773.1\t512\nSKP1\n\tENST00000353411.10\t9445\n\tENST00000522552.5\t1654\n\tENST00000521216.5\t870\n\tENST00000517691.1\t2004\n\tENST00000517625.5\t847\n\tENST00000523966.5\t870\n\tENST00000522855.5\t773\n\tENST00000328392.10\t543\n\tENST00000519054.5\t625\n\tENST00000519321.5\t663\n\tENST00000520417.1\t581\n\tENST00000524288.1\t703\n\tENST00000523359.1\t719\nARL10\n\tENST00000310389.5\t10816\n\tENST00000507151.1\t336\n\tENST00000514533.1\t229\n\tENST00000503175.1\t351\nFAF2\n\tENST00000510730.5\t655\n\tENST00000261942.6\t4486\n\tENST00000506061.1\t308\n\tENST00000510446.1\t727\n\tENST00000504983.1\t584\n\tENST00000513627.1\t627\nSCUBE3\n\tENST00000274938.7\t7327\nUBR2\n\tENST00000372903.6\t2394\n\tENST00000372899.5\t7828\n\tENST00000372901.1\t7828\nPHIP\n\tENST00000275034.4\t10431\n\tENST00000479165.1\t5665\nDOPEY1\n\tENST00000237163.9\t7966\n\tENST00000349129.6\t8181\n\tENST00000369739.7\t7714\n\tENST00000604380.1\t482\n\tENST00000493541.1\t602\n\tENST00000484282.1\t5444\n\tENST00000481979.1\t717\nASCC3\n\tENST00000369162.6\t8117\n\tENST00000518006.1\t664\n\tENST00000324696.8\t3589\n\tENST00000522650.5\t2799\n\tENST00000468245.2\t1198\n\tENST00000324723.10\t823\n\tENST00000369143.2\t3404\nSOD2\n\tENST00000538183.6\t14237\n\tENST00000367055.8\t961\n\tENST00000546260.5\t4273\n\tENST00000367054.6\t786\n\tENST00000546087.5\t2529\n\tENST00000444946.6\t888\n\tENST00000337404.8\t962\n\tENST00000535459.5\t552\n\tENST00000541573.5\t544\n\tENST00000540491.1\t425\n\tENST00000545162.5\t542\n\tENST00000535561.5\t522\n\tENST00000537657.5\t478\n\tENST00000401980.3\t494\n\tENST00000452684.2\t2016\n\tENST000
00535372.1\t1689\nIGF2R\n\tENST00000356956.5\t14015\n\tENST00000464636.1\t541\n\tENST00000487607.1\t330\n\tENST00000475834.1\t729\n\tENST00000475584.1\t568\nEGFR\n\tENST00000455089.5\t3815\n\tENST00000342916.7\t2210\n\tENST00000454757.6\t5435\n\tENST00000344576.6\t2835\n\tENST00000420316.6\t1541\n\tENST00000275493.6\t9792\n\tENST00000442591.5\t2500\n\tENST00000459688.1\t423\n\tENST00000463948.1\t532\n\tENST00000450046.1\t662\n\tENST00000485503.1\t636\nLMTK2\n\tENST00000297293.5\t8917\n\tENST00000493372.1\t511\nSLC35B4\n\tENST00000378509.8\t6768\n\tENST00000416907.5\t1350\n\tENST00000466599.1\t425\n\tENST00000470969.2\t1004\nKIAA1147\n\tENST00000536163.5\t7267\n\tENST00000482493.1\t1818\nXKR5\n\tENST00000618990.4\t4856\n\tENST00000618742.2\t4864\nKIF13B\n\tENST00000524189.5\t8716\n\tENST00000523130.1\t1238\n\tENST00000522355.5\t2601\n\tENST00000521515.1\t2736\n\tENST00000523968.1\t1721\n\tENST00000522846.1\t1936\nPRKDC\n\tENST00000314191.6\t13480\n\tENST00000338368.7\t12755\n\tENST00000536429.1\t630\n\tENST00000536483.1\t805\n\tENST00000521331.5\t1058\n\tENST00000536710.1\t525\n\tENST00000432581.2\t554\n\tENST00000546304.1\t560\n\tENST00000541488.1\t690\n\tENST00000535375.1\t435\n\tENST00000540819.1\t508\nUBR5\n\tENST00000520539.5\t10268\n\tENST00000220959.8\t9389\n\tENST00000517465.1\t1066\n\tENST00000518205.5\t1902\n\tENST00000521922.5\t8979\n\tENST00000521767.1\t1734\n\tENST00000521312.1\t531\n\tENST00000521566.1\t534\n\tENST00000519528.1\t296\n\tENST00000520898.5\t553\n\tENST00000519365.1\t573\n\tENST00000518145.1\t569\n\tENST00000522327.1\t672\nSAMD12\n\tENST00000409003.4\t8837\n\tENST00000453675.6\t1953\n\tENST00000524796.5\t596\n\tENST00000445741.5\t491\n\tENST00000527515.1\t173\n\tENST00000314727.8\t2143\n\tENST00000526328.5\t563\n\tENST00000526765.5\t478\nKLHL9\n\tENST00000359039.4\t5681\nTLN1\n\tENST00000314888.9\t8794\n\tENST00000489255.2\t769\n\tENST00000464379.5\t785\n\tENST00000466916.1\t417\n\tENST00000465002.1\t514\n\tENST00000486788.2\t522\n\tENST000
00495712.1\t341\n\tENST00000378192.2\t1664\nZCCHC6\n\tENST00000277141.10\t5695\n\tENST00000375960.6\t4910\n\tENST00000375957.5\t2233\n\tENST00000375963.7\t5350\n\tENST00000469004.1\t596\n\tENST00000375948.2\t929\n\tENST00000375947.1\t582\nTSTD2\n\tENST00000375173.1\t3047\n\tENST00000341170.4\t4291\n\tENST00000375172.6\t1474\n\tENST00000375165.5\t1583\n\tENST00000375163.5\t555\n\tENST00000484708.1\t471\nPAPPA\n\tENST00000328252.3\t10930\n\tENST00000460463.1\t325\n\tENST00000483254.1\t761\nZBTB43\n\tENST00000449886.5\t5822\n\tENST00000373464.4\t5936\n\tENST00000450858.1\t681\n\tENST00000373457.1\t5666\n\tENST00000497064.1\t164\nGPR107\n\tENST00000347136.10\t7180\n\tENST00000372406.5\t7324\n\tENST00000372410.7\t6962\n\tENST00000493417.5\t2804\n\tENST00000610997.1\t2784\n\tENST00000462907.1\t360\n\tENST00000483935.1\t148\n\tENST00000415344.1\t619\nPRRC2B\n\tENST00000405995.5\t9128\n\tENST00000357304.8\t11013\n\tENST00000489593.1\t768\n\tENST00000422467.1\t506\n\tENST00000456307.1\t496\n\tENST00000451855.1\t1733\n\tENST00000458550.5\t870\n\tENST00000320547.7\t4921\n\tENST00000465931.1\t5337\nGTF3C4\n\tENST00000483873.6\t1388\n\tENST00000372146.4\t9014\nCOL5A1\n\tENST00000371817.7\t8442\n\tENST00000618395.4\t8408\n\tENST00000464187.1\t1801\n\tENST00000469093.1\t641\n\tENST00000463925.1\t504\n\tENST00000460264.5\t884\n\tENST00000371820.3\t599\n\tENST00000465877.1\t554\nMED14\n\tENST00000324817.5\t7955\n\tENST00000416199.5\t528\n\tENST00000433003.1\t1451\n\tENST00000472736.1\t471\n\tENST00000496531.2\t586\n\tENST00000482034.5\t491\n\tENST00000492219.1\t516\n\tENST00000495865.1\t496\n\tENST00000463072.1\t517\nUSP9X\n\tENST00000378308.6\t8342\n\tENST00000324545.8\t12372\n\tENST00000467173.1\t640\n\tENST00000465386.1\t450\n\tENST00000463829.1\t637\n\tENST00000462850.1\t519\n\tENST00000487625.1\t552\n\tENST00000485180.1\t728\nNHSL2\n\tENST00000633930.1\t13631\n\tENST00000631833.1\t423\n\tENST00000631375.1\t12564\n\tENST00000632230.1\t2926\n\tENST00000373677.1\t3949\n\tENST00000
510661.2\t3053\n\tENST00000623354.1\t2240\n"
],
[
"# The probe_dict is just a dictionary of dictionaries of pandas data frames.\n# Let's have a look at the data frame of an example transcript.\nprobe_dict['VPS13D']['ENST00000613099.4']",
"_____no_output_____"
],
[
"# Select the transcripts that we want to target\n# The target transcripts are already defined in the codebook\nprobe_dict = p_d.select_transcripts_by_ids(probe_dict, transcript_ids)\np_d.print_probe_dict(probe_dict) # This excludes all transcripts that are not our direct targets",
"Gene\tTranscript\tN_probes\nVPS13D\n\tENST00000613099.4\t16216\nPRDM2\n\tENST00000343137.8\t5768\nLUZP1\n\tENST00000418342.5\t8392\nCNR2\n\tENST00000374472.4\t5225\nAHDC1\n\tENST00000374011.6\t6409\nAGO3\n\tENST00000373191.8\t19658\nRAB3B\n\tENST00000371655.3\t12815\nUSP24\n\tENST00000294383.6\t10520\nMAN1A2\n\tENST00000356554.7\t8547\nNOTCH2\n\tENST00000256646.6\t11360\nTPR\n\tENST00000367478.8\t9679\nPLXNA2\n\tENST00000367033.3\t11415\nDIEXF\n\tENST00000491415.6\t8417\nPTPN14\n\tENST00000366956.9\t12956\nCENPF\n\tENST00000366955.7\t10278\nFAM208B\n\tENST00000328090.9\t8597\nKIAA1462\n\tENST00000375377.1\t9236\nCHST3\n\tENST00000373115.4\t6941\nRBM20\n\tENST00000369519.3\t7204\nBUB3\n\tENST00000368865.8\t7799\nCKAP5\n\tENST00000312055.9\t6503\nMALAT1\n\tENST00000534336.1\t8679\nRNF169\n\tENST00000299563.4\t7794\nFZD4\n\tENST00000531380.1\t7354\nAMOTL1\n\tENST00000433060.2\t8941\nCBL\n\tENST00000264033.4\t11436\nITPR2\n\tENST00000381340.7\t11376\nSLC38A1\n\tENST00000398637.9\t8037\nDIP2B\n\tENST00000301180.9\t8564\nCBX5\n\tENST00000209875.8\t11499\nANKRD52\n\tENST00000267116.7\t8659\nLRP1\n\tENST00000243077.7\t14868\nSSH1\n\tENST00000326495.9\t13011\nBRCA2\n\tENST00000380152.7\t11957\nPROSER1\n\tENST00000352251.7\t5139\nAKAP11\n\tENST00000025301.3\t9891\nMCF2L\n\tENST00000480321.1\t5698\nC14orf132\n\tENST00000555004.1\t7745\nDYNC1H1\n\tENST00000360184.8\t14304\nHERC2\n\tENST00000261609.11\t15308\nTHBS1\n\tENST00000260356.5\t7746\nSTARD9\n\tENST00000290607.11\t15538\nFBN1\n\tENST00000316623.9\t11727\nUSP8\n\tENST00000396444.7\t18997\nTMOD2\n\tENST00000249700.8\t9162\nTHSD4\n\tENST00000355327.7\t9171\nTSPAN3\n\tENST00000267970.8\t6438\nCEMIP\n\tENST00000220244.7\t7197\nZNF592\n\tENST00000559607.1\t3881\nSRRM2\n\tENST00000301740.12\t9324\nCREBBP\n\tENST00000382070.7\t7569\nTNRC6A\n\tENST00000315183.11\t8274\nCCDC113\n\tENST00000219299.8\t5243\nCDYL2\n\tENST00000570137.6\t8130\nPRPF8\n\tENST00000304992.10\t7266\nMYH10\n\tENST00000269243.8\t7633\nC17orf51\n\tENST0000039
1411.9\t7885\nRAD51D\n\tENST00000345365.10\t9959\nPRKCA\n\tENST00000413366.7\t8722\nSMIM5\n\tENST00000537494.1\t4315\nFASN\n\tENST00000306749.2\t8531\nALPK2\n\tENST00000361673.3\t7274\nRNF152\n\tENST00000312828.3\t9472\nDSEL\n\tENST00000310045.7\t9502\nSIPA1L3\n\tENST00000222345.10\t7958\nXDH\n\tENST00000379416.3\t5659\nYIPF4\n\tENST00000238831.8\t11920\nHEATR5B\n\tENST00000233099.5\t6876\nSPTBN1\n\tENST00000356805.8\t8453\nUSP34\n\tENST00000398571.6\t11328\nSLC9A2\n\tENST00000233969.2\t5381\nSULT1C2\n\tENST00000495441.1\t6337\nAGPS\n\tENST00000264167.8\t7735\nFZD5\n\tENST00000295417.3\t6679\nAGAP1\n\tENST00000614409.4\t9903\nCEP250\n\tENST00000397527.5\t15674\nTTPAL\n\tENST00000262605.8\t6183\nNRIP1\n\tENST00000400199.5\t7533\nSLC5A3\n\tENST00000381151.3\t11545\nIL17RA\n\tENST00000319363.10\t8578\nRP4-671O14.6\n\tENST00000624919.1\t14233\nCRTAP\n\tENST00000320954.10\t6601\nNKTR\n\tENST00000429888.5\t7296\nFYCO1\n\tENST00000296137.6\t8475\nBSN\n\tENST00000296452.4\t15926\nPOLQ\n\tENST00000264233.5\t8746\nUMPS\n\tENST00000232607.6\t6709\nPLXNA1\n\tENST00000393409.2\t9037\nDNAJC13\n\tENST00000260818.10\t7701\nKPNA4\n\tENST00000334256.8\t8952\nPIK3CA\n\tENST00000263967.3\t9064\nAFAP1\n\tENST00000358461.6\t7487\nFAM184B\n\tENST00000265018.3\t6593\nSLC7A11\n\tENST00000280612.9\t9616\nSMARCA5\n\tENST00000283131.3\t7894\nANKH\n\tENST00000284268.6\t8174\nFBN2\n\tENST00000619499.4\t10125\nAFF4\n\tENST00000265343.9\t9523\nSKP1\n\tENST00000353411.10\t9445\nARL10\n\tENST00000310389.5\t10816\nFAF2\n\tENST00000261942.6\t4486\nSCUBE3\n\tENST00000274938.7\t7327\nUBR2\n\tENST00000372901.1\t7828\nPHIP\n\tENST00000275034.4\t10431\nDOPEY1\n\tENST00000369739.7\t7714\nASCC3\n\tENST00000369162.6\t8117\nSOD2\n\tENST00000538183.6\t14237\nIGF2R\n\tENST00000356956.5\t14015\nEGFR\n\tENST00000275493.6\t9792\nLMTK2\n\tENST00000297293.5\t8917\nSLC35B4\n\tENST00000378509.8\t6768\nKIAA1147\n\tENST00000536163.5\t7267\nXKR5\n\tENST00000618990.4\t4856\nKIF13B\n\tENST00000524189.5\t8716\nPRKDC\n\tENST0
0000314191.6\t13480\nUBR5\n\tENST00000521922.5\t8979\nSAMD12\n\tENST00000409003.4\t8837\nKLHL9\n\tENST00000359039.4\t5681\nTLN1\n\tENST00000314888.9\t8794\nZCCHC6\n\tENST00000375963.7\t5350\nTSTD2\n\tENST00000341170.4\t4291\nPAPPA\n\tENST00000328252.3\t10930\nZBTB43\n\tENST00000373464.4\t5936\nGPR107\n\tENST00000372410.7\t6962\nPRRC2B\n\tENST00000357304.8\t11013\nGTF3C4\n\tENST00000372146.4\t9014\nCOL5A1\n\tENST00000371817.7\t8442\nMED14\n\tENST00000324817.5\t7955\nUSP9X\n\tENST00000378308.6\t8342\nNHSL2\n\tENST00000633930.1\t13631\n"
],
[
"# Initialize the off-target counting tables\n# OTTable for rRNA/tRNAs\nncRNAs = fio.load_fasta_into_df(ncRNA_file)\nottable_rtRNAs = ot.get_OTTable_for_rtRNAs(ncRNAs, 15)",
"Found 587 rRNAs/tRNAs from 37612 non-coding RNAs.\n"
],
[
"# OTTables for the genes we target\ngene_ottable_dict = ot.get_gene_OTTables(transcriptome, gene_ids, 'gene_short_name', 17)",
"Generate OTTable for gene AKAP11.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene CBX5.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene CCDC113.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene CEMIP.\nConstruct a OTTable using 5/7 transcripts with FPKM > 0.\nGenerate OTTable for gene SIPA1L3.\nConstruct a OTTable using 12/12 transcripts with FPKM > 0.\nGenerate OTTable for gene UMPS.\nConstruct a OTTable using 11/11 transcripts with FPKM > 0.\nGenerate OTTable for gene HEATR5B.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene SLC9A2.\nConstruct a OTTable using 2/3 transcripts with FPKM > 0.\nGenerate OTTable for gene YIPF4.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene LRP1.\nConstruct a OTTable using 12/12 transcripts with FPKM > 0.\nGenerate OTTable for gene TMOD2.\nConstruct a OTTable using 5/6 transcripts with FPKM > 0.\nGenerate OTTable for gene NOTCH2.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene THBS1.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene DNAJC13.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene HERC2.\nConstruct a OTTable using 12/12 transcripts with FPKM > 0.\nGenerate OTTable for gene FAF2.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene TTPAL.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene PIK3CA.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene CBL.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene AGPS.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene POLQ.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene FAM184B.\nConstruct a 
OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene AFF4.\nConstruct a OTTable using 9/9 transcripts with FPKM > 0.\nGenerate OTTable for gene ANKRD52.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene TSPAN3.\nConstruct a OTTable using 10/10 transcripts with FPKM > 0.\nGenerate OTTable for gene MYH10.\nConstruct a OTTable using 10/11 transcripts with FPKM > 0.\nGenerate OTTable for gene SCUBE3.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene PHIP.\nConstruct a OTTable using 2/2 transcripts with FPKM > 0.\nGenerate OTTable for gene EGFR.\nConstruct a OTTable using 11/11 transcripts with FPKM > 0.\nGenerate OTTable for gene SLC7A11.\nConstruct a OTTable using 2/2 transcripts with FPKM > 0.\nGenerate OTTable for gene SMARCA5.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene ANKH.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene STARD9.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene USP24.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene FZD5.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene FYCO1.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene BSN.\nConstruct a OTTable using 1/2 transcripts with FPKM > 0.\nGenerate OTTable for gene LMTK2.\nConstruct a OTTable using 2/2 transcripts with FPKM > 0.\nGenerate OTTable for gene RNF169.\nConstruct a OTTable using 2/2 transcripts with FPKM > 0.\nGenerate OTTable for gene DIP2B.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene SRRM2.\nConstruct a OTTable using 30/31 transcripts with FPKM > 0.\nGenerate OTTable for gene PRPF8.\nConstruct a OTTable using 16/16 transcripts with FPKM > 0.\nGenerate OTTable for gene FASN.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate 
OTTable for gene DSEL.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene ARL10.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene CKAP5.\nConstruct a OTTable using 12/12 transcripts with FPKM > 0.\nGenerate OTTable for gene RNF152.\nConstruct a OTTable using 2/5 transcripts with FPKM > 0.\nGenerate OTTable for gene PRKDC.\nConstruct a OTTable using 11/11 transcripts with FPKM > 0.\nGenerate OTTable for gene TLN1.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene TNRC6A.\nConstruct a OTTable using 17/17 transcripts with FPKM > 0.\nGenerate OTTable for gene FBN1.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene IL17RA.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene CRTAP.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene MED14.\nConstruct a OTTable using 9/9 transcripts with FPKM > 0.\nGenerate OTTable for gene SSH1.\nConstruct a OTTable using 9/9 transcripts with FPKM > 0.\nGenerate OTTable for gene FAM208B.\nConstruct a OTTable using 13/13 transcripts with FPKM > 0.\nGenerate OTTable for gene PAPPA.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene KPNA4.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene TSTD2.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene PRDM2.\nConstruct a OTTable using 14/14 transcripts with FPKM > 0.\nGenerate OTTable for gene RAD51D.\nConstruct a OTTable using 22/23 transcripts with FPKM > 0.\nGenerate OTTable for gene PROSER1.\nConstruct a OTTable using 9/11 transcripts with FPKM > 0.\nGenerate OTTable for gene SKP1.\nConstruct a OTTable using 13/13 transcripts with FPKM > 0.\nGenerate OTTable for gene THSD4.\nConstruct a OTTable using 6/7 transcripts with FPKM > 0.\nGenerate OTTable for gene MAN1A2.\nConstruct a OTTable 
using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene SPTBN1.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene IGF2R.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene PRRC2B.\nConstruct a OTTable using 9/9 transcripts with FPKM > 0.\nGenerate OTTable for gene AFAP1.\nConstruct a OTTable using 7/10 transcripts with FPKM > 0.\nGenerate OTTable for gene KLHL9.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene DYNC1H1.\nConstruct a OTTable using 11/11 transcripts with FPKM > 0.\nGenerate OTTable for gene ALPK2.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene CENPF.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene PTPN14.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene PLXNA2.\nConstruct a OTTable using 5/6 transcripts with FPKM > 0.\nGenerate OTTable for gene TPR.\nConstruct a OTTable using 9/9 transcripts with FPKM > 0.\nGenerate OTTable for gene BUB3.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene ASCC3.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene RBM20.\nConstruct a OTTable using 4/4 transcripts with FPKM > 0.\nGenerate OTTable for gene DOPEY1.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene RAB3B.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene COL5A1.\nConstruct a OTTable using 6/8 transcripts with FPKM > 0.\nGenerate OTTable for gene GTF3C4.\nConstruct a OTTable using 2/2 transcripts with FPKM > 0.\nGenerate OTTable for gene GPR107.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene UBR2.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene CHST3.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for 
gene AGO3.\nConstruct a OTTable using 6/6 transcripts with FPKM > 0.\nGenerate OTTable for gene ZBTB43.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene AHDC1.\nConstruct a OTTable using 5/5 transcripts with FPKM > 0.\nGenerate OTTable for gene CNR2.\nConstruct a OTTable using 1/1 transcripts with FPKM > 0.\nGenerate OTTable for gene KIAA1462.\nConstruct a OTTable using 3/3 transcripts with FPKM > 0.\nGenerate OTTable for gene ZCCHC6.\nConstruct a OTTable using 7/7 transcripts with FPKM > 0.\nGenerate OTTable for gene USP9X.\nConstruct a OTTable using 8/8 transcripts with FPKM > 0.\nGenerate OTTable for gene SLC35B4.\n"
],
[
"%%time\n# OTTable for the transcriptome.\n# This table is large, so cache it to disk to avoid rebuilding it next time\nif os.path.exists(ottable_transcriptome_file):\n    ottable_transcriptome = ot.OTTable.load_pkl(ottable_transcriptome_file)\nelse:\n    ottable_transcriptome = ot.get_OTTable_for_transcriptome(transcriptome, 17)\n    ottable_transcriptome.save_pkl(ottable_transcriptome_file)",
"Load the OTTable from \\\\10.245.74.212\\Chromatin_NAS_2\\Libraries\\CTP-10_Aire\\MERFISH_designer\\MERFISH_Examples2\\outputs\\ottable_transcriptome.pkl.\nWall time: 1min 8s\n"
]
],
[
[
"# Select target regions",
"_____no_output_____"
]
],
[
[
"# Calculate and plot the GC contents of the target regions\nfilters.calc_gc_for_probe_dict(probe_dict, column_key_seq='target_sequence', column_key_write='target_GC')\nplot.plot_hist(probe_dict, column_key='target_GC')",
"_____no_output_____"
],
[
"# Filter GC content and plot the GC content after filtering\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_GC', lower_bound=43, upper_bound=63)\nplot.plot_hist(probe_dict, column_key='target_GC')",
"VPS13D\n\tENST00000613099.4: 9831 / 16216 probes passed the filter 43 < target_GC < 63.\nPRDM2\n\tENST00000343137.8: 2957 / 5768 probes passed the filter 43 < target_GC < 63.\nLUZP1\n\tENST00000418342.5: 5628 / 8392 probes passed the filter 43 < target_GC < 63.\nCNR2\n\tENST00000374472.4: 3270 / 5225 probes passed the filter 43 < target_GC < 63.\nAHDC1\n\tENST00000374011.6: 1922 / 6409 probes passed the filter 43 < target_GC < 63.\nAGO3\n\tENST00000373191.8: 6449 / 19658 probes passed the filter 43 < target_GC < 63.\nRAB3B\n\tENST00000371655.3: 7044 / 12815 probes passed the filter 43 < target_GC < 63.\nUSP24\n\tENST00000294383.6: 5285 / 10520 probes passed the filter 43 < target_GC < 63.\nMAN1A2\n\tENST00000356554.7: 2645 / 8547 probes passed the filter 43 < target_GC < 63.\nNOTCH2\n\tENST00000256646.6: 7147 / 11360 probes passed the filter 43 < target_GC < 63.\nTPR\n\tENST00000367478.8: 3849 / 9679 probes passed the filter 43 < target_GC < 63.\nPLXNA2\n\tENST00000367033.3: 7110 / 11415 probes passed the filter 43 < target_GC < 63.\nDIEXF\n\tENST00000491415.6: 4434 / 8417 probes passed the filter 43 < target_GC < 63.\nPTPN14\n\tENST00000366956.9: 6552 / 12956 probes passed the filter 43 < target_GC < 63.\nCENPF\n\tENST00000366955.7: 4402 / 10278 probes passed the filter 43 < target_GC < 63.\nFAM208B\n\tENST00000328090.9: 3935 / 8597 probes passed the filter 43 < target_GC < 63.\nKIAA1462\n\tENST00000375377.1: 4477 / 9236 probes passed the filter 43 < target_GC < 63.\nCHST3\n\tENST00000373115.4: 3720 / 6941 probes passed the filter 43 < target_GC < 63.\nRBM20\n\tENST00000369519.3: 4248 / 7204 probes passed the filter 43 < target_GC < 63.\nBUB3\n\tENST00000368865.8: 3480 / 7799 probes passed the filter 43 < target_GC < 63.\nCKAP5\n\tENST00000312055.9: 3915 / 6503 probes passed the filter 43 < target_GC < 63.\nMALAT1\n\tENST00000534336.1: 3176 / 8679 probes passed the filter 43 < target_GC < 63.\nRNF169\n\tENST00000299563.4: 3686 / 7794 probes passed the filter 43 < 
target_GC < 63.\nFZD4\n\tENST00000531380.1: 3752 / 7354 probes passed the filter 43 < target_GC < 63.\nAMOTL1\n\tENST00000433060.2: 5059 / 8941 probes passed the filter 43 < target_GC < 63.\nCBL\n\tENST00000264033.4: 6104 / 11436 probes passed the filter 43 < target_GC < 63.\nITPR2\n\tENST00000381340.7: 4731 / 11376 probes passed the filter 43 < target_GC < 63.\nSLC38A1\n\tENST00000398637.9: 3340 / 8037 probes passed the filter 43 < target_GC < 63.\nDIP2B\n\tENST00000301180.9: 4857 / 8564 probes passed the filter 43 < target_GC < 63.\nCBX5\n\tENST00000209875.8: 5136 / 11499 probes passed the filter 43 < target_GC < 63.\nANKRD52\n\tENST00000267116.7: 5074 / 8659 probes passed the filter 43 < target_GC < 63.\nLRP1\n\tENST00000243077.7: 8057 / 14868 probes passed the filter 43 < target_GC < 63.\nSSH1\n\tENST00000326495.9: 7263 / 13011 probes passed the filter 43 < target_GC < 63.\nBRCA2\n\tENST00000380152.7: 2796 / 11957 probes passed the filter 43 < target_GC < 63.\nPROSER1\n\tENST00000352251.7: 2436 / 5139 probes passed the filter 43 < target_GC < 63.\nAKAP11\n\tENST00000025301.3: 3007 / 9891 probes passed the filter 43 < target_GC < 63.\nMCF2L\n\tENST00000480321.1: 3175 / 5698 probes passed the filter 43 < target_GC < 63.\nC14orf132\n\tENST00000555004.1: 4780 / 7745 probes passed the filter 43 < target_GC < 63.\nDYNC1H1\n\tENST00000360184.8: 9257 / 14304 probes passed the filter 43 < target_GC < 63.\nHERC2\n\tENST00000261609.11: 8930 / 15308 probes passed the filter 43 < target_GC < 63.\nTHBS1\n\tENST00000260356.5: 3762 / 7746 probes passed the filter 43 < target_GC < 63.\nSTARD9\n\tENST00000290607.11: 11459 / 15538 probes passed the filter 43 < target_GC < 63.\nFBN1\n\tENST00000316623.9: 6648 / 11727 probes passed the filter 43 < target_GC < 63.\nUSP8\n\tENST00000396444.7: 6703 / 18997 probes passed the filter 43 < target_GC < 63.\nTMOD2\n\tENST00000249700.8: 3592 / 9162 probes passed the filter 43 < target_GC < 63.\nTHSD4\n\tENST00000355327.7: 5271 / 9171 probes 
passed the filter 43 < target_GC < 63.\nTSPAN3\n\tENST00000267970.8: 3585 / 6438 probes passed the filter 43 < target_GC < 63.\nCEMIP\n\tENST00000220244.7: 4644 / 7197 probes passed the filter 43 < target_GC < 63.\nZNF592\n\tENST00000559607.1: 2507 / 3881 probes passed the filter 43 < target_GC < 63.\nSRRM2\n\tENST00000301740.12: 5716 / 9324 probes passed the filter 43 < target_GC < 63.\nCREBBP\n\tENST00000382070.7: 3782 / 7569 probes passed the filter 43 < target_GC < 63.\nTNRC6A\n\tENST00000315183.11: 4395 / 8274 probes passed the filter 43 < target_GC < 63.\nCCDC113\n\tENST00000219299.8: 2969 / 5243 probes passed the filter 43 < target_GC < 63.\nCDYL2\n\tENST00000570137.6: 3982 / 8130 probes passed the filter 43 < target_GC < 63.\nPRPF8\n\tENST00000304992.10: 5646 / 7266 probes passed the filter 43 < target_GC < 63.\nMYH10\n\tENST00000269243.8: 4417 / 7633 probes passed the filter 43 < target_GC < 63.\nC17orf51\n\tENST00000391411.9: 4616 / 7885 probes passed the filter 43 < target_GC < 63.\nRAD51D\n\tENST00000345365.10: 5525 / 9959 probes passed the filter 43 < target_GC < 63.\nPRKCA\n\tENST00000413366.7: 5256 / 8722 probes passed the filter 43 < target_GC < 63.\nSMIM5\n\tENST00000537494.1: 2381 / 4315 probes passed the filter 43 < target_GC < 63.\nFASN\n\tENST00000306749.2: 2516 / 8531 probes passed the filter 43 < target_GC < 63.\nALPK2\n\tENST00000361673.3: 4789 / 7274 probes passed the filter 43 < target_GC < 63.\nRNF152\n\tENST00000312828.3: 3527 / 9472 probes passed the filter 43 < target_GC < 63.\nDSEL\n\tENST00000310045.7: 2990 / 9502 probes passed the filter 43 < target_GC < 63.\nSIPA1L3\n\tENST00000222345.10: 2974 / 7958 probes passed the filter 43 < target_GC < 63.\nXDH\n\tENST00000379416.3: 3621 / 5659 probes passed the filter 43 < target_GC < 63.\nYIPF4\n\tENST00000238831.8: 3664 / 11920 probes passed the filter 43 < target_GC < 63.\nHEATR5B\n\tENST00000233099.5: 3915 / 6876 probes passed the filter 43 < target_GC < 63.\nSPTBN1\n\tENST00000356805.8: 
5304 / 8453 probes passed the filter 43 < target_GC < 63.\nUSP34\n\tENST00000398571.6: 4610 / 11328 probes passed the filter 43 < target_GC < 63.\nSLC9A2\n\tENST00000233969.2: 2301 / 5381 probes passed the filter 43 < target_GC < 63.\nSULT1C2\n\tENST00000495441.1: 3642 / 6337 probes passed the filter 43 < target_GC < 63.\nAGPS\n\tENST00000264167.8: 1891 / 7735 probes passed the filter 43 < target_GC < 63.\nFZD5\n\tENST00000295417.3: 2310 / 6679 probes passed the filter 43 < target_GC < 63.\nAGAP1\n\tENST00000614409.4: 5179 / 9903 probes passed the filter 43 < target_GC < 63.\nCEP250\n\tENST00000397527.5: 10399 / 15674 probes passed the filter 43 < target_GC < 63.\nTTPAL\n\tENST00000262605.8: 3194 / 6183 probes passed the filter 43 < target_GC < 63.\nNRIP1\n\tENST00000400199.5: 2431 / 7533 probes passed the filter 43 < target_GC < 63.\nSLC5A3\n\tENST00000381151.3: 4072 / 11545 probes passed the filter 43 < target_GC < 63.\nIL17RA\n\tENST00000319363.10: 4708 / 8578 probes passed the filter 43 < target_GC < 63.\nRP4-671O14.6\n\tENST00000624919.1: 8447 / 14233 probes passed the filter 43 < target_GC < 63.\nCRTAP\n\tENST00000320954.10: 3909 / 6601 probes passed the filter 43 < target_GC < 63.\nNKTR\n\tENST00000429888.5: 3287 / 7296 probes passed the filter 43 < target_GC < 63.\nFYCO1\n\tENST00000296137.6: 4999 / 8475 probes passed the filter 43 < target_GC < 63.\nBSN\n\tENST00000296452.4: 7358 / 15926 probes passed the filter 43 < target_GC < 63.\nPOLQ\n\tENST00000264233.5: 3925 / 8746 probes passed the filter 43 < target_GC < 63.\nUMPS\n\tENST00000232607.6: 4130 / 6709 probes passed the filter 43 < target_GC < 63.\nPLXNA1\n\tENST00000393409.2: 3706 / 9037 probes passed the filter 43 < target_GC < 63.\nDNAJC13\n\tENST00000260818.10: 3381 / 7701 probes passed the filter 43 < target_GC < 63.\nKPNA4\n\tENST00000334256.8: 2963 / 8952 probes passed the filter 43 < target_GC < 63.\nPIK3CA\n\tENST00000263967.3: 2036 / 9064 probes passed the filter 43 < target_GC < 
63.\nAFAP1\n\tENST00000358461.6: 3615 / 7487 probes passed the filter 43 < target_GC < 63.\nFAM184B\n\tENST00000265018.3: 2966 / 6593 probes passed the filter 43 < target_GC < 63.\nSLC7A11\n\tENST00000280612.9: 2533 / 9616 probes passed the filter 43 < target_GC < 63.\nSMARCA5\n\tENST00000283131.3: 2366 / 7894 probes passed the filter 43 < target_GC < 63.\nANKH\n\tENST00000284268.6: 4128 / 8174 probes passed the filter 43 < target_GC < 63.\nFBN2\n\tENST00000619499.4: 6374 / 10125 probes passed the filter 43 < target_GC < 63.\nAFF4\n\tENST00000265343.9: 3693 / 9523 probes passed the filter 43 < target_GC < 63.\nSKP1\n\tENST00000353411.10: 4213 / 9445 probes passed the filter 43 < target_GC < 63.\nARL10\n\tENST00000310389.5: 5504 / 10816 probes passed the filter 43 < target_GC < 63.\nFAF2\n\tENST00000261942.6: 2762 / 4486 probes passed the filter 43 < target_GC < 63.\nSCUBE3\n\tENST00000274938.7: 4645 / 7327 probes passed the filter 43 < target_GC < 63.\nUBR2\n\tENST00000372901.1: 3478 / 7828 probes passed the filter 43 < target_GC < 63.\nPHIP\n\tENST00000275034.4: 3043 / 10431 probes passed the filter 43 < target_GC < 63.\nDOPEY1\n\tENST00000369739.7: 3749 / 7714 probes passed the filter 43 < target_GC < 63.\nASCC3\n\tENST00000369162.6: 2959 / 8117 probes passed the filter 43 < target_GC < 63.\nSOD2\n\tENST00000538183.6: 6107 / 14237 probes passed the filter 43 < target_GC < 63.\nIGF2R\n\tENST00000356956.5: 8288 / 14015 probes passed the filter 43 < target_GC < 63.\nEGFR\n\tENST00000275493.6: 5401 / 9792 probes passed the filter 43 < target_GC < 63.\nLMTK2\n\tENST00000297293.5: 4107 / 8917 probes passed the filter 43 < target_GC < 63.\nSLC35B4\n\tENST00000378509.8: 3324 / 6768 probes passed the filter 43 < target_GC < 63.\nKIAA1147\n\tENST00000536163.5: 3721 / 7267 probes passed the filter 43 < target_GC < 63.\nXKR5\n\tENST00000618990.4: 2802 / 4856 probes passed the filter 43 < target_GC < 63.\nKIF13B\n\tENST00000524189.5: 4380 / 8716 probes passed the filter 43 < 
target_GC < 63.\nPRKDC\n\tENST00000314191.6: 6622 / 13480 probes passed the filter 43 < target_GC < 63.\nUBR5\n\tENST00000521922.5: 4950 / 8979 probes passed the filter 43 < target_GC < 63.\nSAMD12\n\tENST00000409003.4: 3900 / 8837 probes passed the filter 43 < target_GC < 63.\nKLHL9\n\tENST00000359039.4: 1902 / 5681 probes passed the filter 43 < target_GC < 63.\nTLN1\n\tENST00000314888.9: 5418 / 8794 probes passed the filter 43 < target_GC < 63.\nZCCHC6\n\tENST00000375963.7: 2297 / 5350 probes passed the filter 43 < target_GC < 63.\nTSTD2\n\tENST00000341170.4: 2288 / 4291 probes passed the filter 43 < target_GC < 63.\nPAPPA\n\tENST00000328252.3: 5869 / 10930 probes passed the filter 43 < target_GC < 63.\nZBTB43\n\tENST00000373464.4: 2831 / 5936 probes passed the filter 43 < target_GC < 63.\nGPR107\n\tENST00000372410.7: 4043 / 6962 probes passed the filter 43 < target_GC < 63.\nPRRC2B\n\tENST00000357304.8: 6042 / 11013 probes passed the filter 43 < target_GC < 63.\nGTF3C4\n\tENST00000372146.4: 4033 / 9014 probes passed the filter 43 < target_GC < 63.\nCOL5A1\n\tENST00000371817.7: 3039 / 8442 probes passed the filter 43 < target_GC < 63.\nMED14\n\tENST00000324817.5: 3125 / 7955 probes passed the filter 43 < target_GC < 63.\nUSP9X\n\tENST00000378308.6: 3718 / 8342 probes passed the filter 43 < target_GC < 63.\nNHSL2\n\tENST00000633930.1: 7177 / 13631 probes passed the filter 43 < target_GC < 63.\n"
],
[
"# Calculate and filter the melting temperature using the method in JM's MATLAB code.\n# Alternatively, you can use the newer Tm calculation method that is commented out\n# in the two subsequent cells.\nfilters.calc_tm_JM_for_probe_dict(probe_dict, monovalentSalt=0.3, probe_conc=5e-9,\n                                  column_key_seq='target_sequence', column_key_write='target_Tm')\nplot.plot_hist(probe_dict, column_key='target_Tm')\n\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_Tm', lower_bound=66, upper_bound=76)\nplot.plot_hist(probe_dict, column_key='target_Tm')",
"_____no_output_____"
],
[
"## Calculate and plot the melting temperatures (Tm)\n#filters.calc_tm_for_probe_dict(probe_dict, Na_conc=300, fmd_percentile=30, probe_conc=0.05,\n#                               column_key_seq='target_sequence', column_key_write='target_Tm')\n#plot.plot_hist(probe_dict, column_key='target_Tm')",
"_____no_output_____"
],
[
"## Filter Tm and plot the Tm distribution after filtering\n#filters.filter_probe_dict_by_metric(probe_dict, 'target_Tm', lower_bound=46, upper_bound=56)\n#plot.plot_hist(probe_dict, column_key='target_Tm')",
"_____no_output_____"
],
[
"# Calculate and plot the off-targets to rRNA/tRNAs\not.calc_OTs(probe_dict, ottable_rtRNAs, 'target_sequence', 'target_OT_rtRNA', 15)\nplot.plot_hist(probe_dict, 'target_OT_rtRNA', y_max=400)\n# Filter out probes that have any rRNA/tRNA off-targets\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_OT_rtRNA', upper_bound=0.5)\nplot.plot_hist(probe_dict, 'target_OT_rtRNA')",
"_____no_output_____"
],
[
"# Get the FPKMs of the transcripts\ntranscript_fpkms = dict(zip(list(transcriptome['transcript_id']), list(transcriptome['FPKM'])))\n\n# Calculate the specificities and isoform specificities of the target regions\not.calc_specificity(probe_dict, ottable_transcriptome, gene_ottable_dict, transcript_fpkms,\n 'target_sequence', 'target_specificity', 'target_isospecificity', 17)\n\nplot.plot_hist(probe_dict, 'target_specificity')\nplot.plot_hist(probe_dict, 'target_isospecificity')",
"_____no_output_____"
],
[
"# Filter the specificities and isoform specificities of the target regions\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_specificity', lower_bound=0.75)\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_isospecificity', lower_bound=0.75)\nplot.plot_hist(probe_dict, 'target_specificity')\nplot.plot_hist(probe_dict, 'target_isospecificity')",
"VPS13D\n\tENST00000613099.4: 9512 / 9545 probes passed the filter 0.75 < target_specificity < inf.\nPRDM2\n\tENST00000343137.8: 2806 / 2863 probes passed the filter 0.75 < target_specificity < inf.\nLUZP1\n\tENST00000418342.5: 5478 / 5478 probes passed the filter 0.75 < target_specificity < inf.\nCNR2\n\tENST00000374472.4: 2494 / 3168 probes passed the filter 0.75 < target_specificity < inf.\nAHDC1\n\tENST00000374011.6: 1804 / 1815 probes passed the filter 0.75 < target_specificity < inf.\nAGO3\n\tENST00000373191.8: 4273 / 6143 probes passed the filter 0.75 < target_specificity < inf.\nRAB3B\n\tENST00000371655.3: 6083 / 6774 probes passed the filter 0.75 < target_specificity < inf.\nUSP24\n\tENST00000294383.6: 5109 / 5117 probes passed the filter 0.75 < target_specificity < inf.\nMAN1A2\n\tENST00000356554.7: 2528 / 2547 probes passed the filter 0.75 < target_specificity < inf.\nNOTCH2\n\tENST00000256646.6: 6462 / 6915 probes passed the filter 0.75 < target_specificity < inf.\nTPR\n\tENST00000367478.8: 3694 / 3724 probes passed the filter 0.75 < target_specificity < inf.\nPLXNA2\n\tENST00000367033.3: 6633 / 6898 probes passed the filter 0.75 < target_specificity < inf.\nDIEXF\n\tENST00000491415.6: 4101 / 4282 probes passed the filter 0.75 < target_specificity < inf.\nPTPN14\n\tENST00000366956.9: 5967 / 6360 probes passed the filter 0.75 < target_specificity < inf.\nCENPF\n\tENST00000366955.7: 4296 / 4307 probes passed the filter 0.75 < target_specificity < inf.\nFAM208B\n\tENST00000328090.9: 3820 / 3820 probes passed the filter 0.75 < target_specificity < inf.\nKIAA1462\n\tENST00000375377.1: 4251 / 4354 probes passed the filter 0.75 < target_specificity < inf.\nCHST3\n\tENST00000373115.4: 3527 / 3588 probes passed the filter 0.75 < target_specificity < inf.\nRBM20\n\tENST00000369519.3: 4052 / 4107 probes passed the filter 0.75 < target_specificity < inf.\nBUB3\n\tENST00000368865.8: 3076 / 3351 probes passed the filter 0.75 < target_specificity < 
inf.\nCKAP5\n\tENST00000312055.9: 3829 / 3829 probes passed the filter 0.75 < target_specificity < inf.\nMALAT1\n\tENST00000534336.1: 3060 / 3060 probes passed the filter 0.75 < target_specificity < inf.\nRNF169\n\tENST00000299563.4: 3555 / 3558 probes passed the filter 0.75 < target_specificity < inf.\nFZD4\n\tENST00000531380.1: 3433 / 3617 probes passed the filter 0.75 < target_specificity < inf.\nAMOTL1\n\tENST00000433060.2: 4794 / 4845 probes passed the filter 0.75 < target_specificity < inf.\nCBL\n\tENST00000264033.4: 5726 / 5858 probes passed the filter 0.75 < target_specificity < inf.\nITPR2\n\tENST00000381340.7: 4547 / 4600 probes passed the filter 0.75 < target_specificity < inf.\nSLC38A1\n\tENST00000398637.9: 3195 / 3204 probes passed the filter 0.75 < target_specificity < inf.\nDIP2B\n\tENST00000301180.9: 4711 / 4726 probes passed the filter 0.75 < target_specificity < inf.\nCBX5\n\tENST00000209875.8: 4410 / 4910 probes passed the filter 0.75 < target_specificity < inf.\nANKRD52\n\tENST00000267116.7: 4848 / 4848 probes passed the filter 0.75 < target_specificity < inf.\nLRP1\n\tENST00000243077.7: 7496 / 7513 probes passed the filter 0.75 < target_specificity < inf.\nSSH1\n\tENST00000326495.9: 6130 / 7020 probes passed the filter 0.75 < target_specificity < inf.\nBRCA2\n\tENST00000380152.7: 2442 / 2690 probes passed the filter 0.75 < target_specificity < inf.\nPROSER1\n\tENST00000352251.7: 2355 / 2355 probes passed the filter 0.75 < target_specificity < inf.\nAKAP11\n\tENST00000025301.3: 2840 / 2894 probes passed the filter 0.75 < target_specificity < inf.\nMCF2L\n\tENST00000480321.1: 2818 / 3027 probes passed the filter 0.75 < target_specificity < inf.\nC14orf132\n\tENST00000555004.1: 4489 / 4648 probes passed the filter 0.75 < target_specificity < inf.\nDYNC1H1\n\tENST00000360184.8: 8938 / 8939 probes passed the filter 0.75 < target_specificity < inf.\nHERC2\n\tENST00000261609.11: 5584 / 8584 probes passed the filter 0.75 < target_specificity < 
inf.\nTHBS1\n\tENST00000260356.5: 3527 / 3609 probes passed the filter 0.75 < target_specificity < inf.\nSTARD9\n\tENST00000290607.11: 11137 / 11184 probes passed the filter 0.75 < target_specificity < inf.\nFBN1\n\tENST00000316623.9: 6413 / 6487 probes passed the filter 0.75 < target_specificity < inf.\nUSP8\n\tENST00000396444.7: 3653 / 6366 probes passed the filter 0.75 < target_specificity < inf.\nTMOD2\n\tENST00000249700.8: 3265 / 3484 probes passed the filter 0.75 < target_specificity < inf.\nTHSD4\n\tENST00000355327.7: 5079 / 5112 probes passed the filter 0.75 < target_specificity < inf.\nTSPAN3\n\tENST00000267970.8: 3271 / 3469 probes passed the filter 0.75 < target_specificity < inf.\nCEMIP\n\tENST00000220244.7: 4452 / 4487 probes passed the filter 0.75 < target_specificity < inf.\nZNF592\n\tENST00000559607.1: 2140 / 2408 probes passed the filter 0.75 < target_specificity < inf.\nSRRM2\n\tENST00000301740.12: 5497 / 5542 probes passed the filter 0.75 < target_specificity < inf.\nCREBBP\n\tENST00000382070.7: 3545 / 3580 probes passed the filter 0.75 < target_specificity < inf.\nTNRC6A\n\tENST00000315183.11: 4248 / 4265 probes passed the filter 0.75 < target_specificity < inf.\nCCDC113\n\tENST00000219299.8: 2550 / 2847 probes passed the filter 0.75 < target_specificity < inf.\nCDYL2\n\tENST00000570137.6: 3838 / 3843 probes passed the filter 0.75 < target_specificity < inf.\nPRPF8\n\tENST00000304992.10: 5475 / 5475 probes passed the filter 0.75 < target_specificity < inf.\nMYH10\n\tENST00000269243.8: 4214 / 4285 probes passed the filter 0.75 < target_specificity < inf.\nC17orf51\n\tENST00000391411.9: 3013 / 4471 probes passed the filter 0.75 < target_specificity < inf.\nRAD51D\n\tENST00000345365.10: 4052 / 5306 probes passed the filter 0.75 < target_specificity < inf.\nPRKCA\n\tENST00000413366.7: 4927 / 5120 probes passed the filter 0.75 < target_specificity < inf.\nSMIM5\n\tENST00000537494.1: 2110 / 2295 probes passed the filter 0.75 < target_specificity < 
inf.\nFASN\n\tENST00000306749.2: 2343 / 2343 probes passed the filter 0.75 < target_specificity < inf.\nALPK2\n\tENST00000361673.3: 4643 / 4646 probes passed the filter 0.75 < target_specificity < inf.\nRNF152\n\tENST00000312828.3: 3096 / 3365 probes passed the filter 0.75 < target_specificity < inf.\nDSEL\n\tENST00000310045.7: 2594 / 2902 probes passed the filter 0.75 < target_specificity < inf.\nSIPA1L3\n\tENST00000222345.10: 2823 / 2841 probes passed the filter 0.75 < target_specificity < inf.\nXDH\n\tENST00000379416.3: 3309 / 3515 probes passed the filter 0.75 < target_specificity < inf.\nYIPF4\n\tENST00000238831.8: 1701 / 3453 probes passed the filter 0.75 < target_specificity < inf.\nHEATR5B\n\tENST00000233099.5: 3769 / 3769 probes passed the filter 0.75 < target_specificity < inf.\nSPTBN1\n\tENST00000356805.8: 5152 / 5187 probes passed the filter 0.75 < target_specificity < inf.\nUSP34\n\tENST00000398571.6: 4348 / 4442 probes passed the filter 0.75 < target_specificity < inf.\nSLC9A2\n\tENST00000233969.2: 2199 / 2236 probes passed the filter 0.75 < target_specificity < inf.\nSULT1C2\n\tENST00000495441.1: 3297 / 3541 probes passed the filter 0.75 < target_specificity < inf.\nAGPS\n\tENST00000264167.8: 1833 / 1833 probes passed the filter 0.75 < target_specificity < inf.\nFZD5\n\tENST00000295417.3: 2170 / 2241 probes passed the filter 0.75 < target_specificity < inf.\nAGAP1\n\tENST00000614409.4: 4871 / 4987 probes passed the filter 0.75 < target_specificity < inf.\nCEP250\n\tENST00000397527.5: 9444 / 9988 probes passed the filter 0.75 < target_specificity < inf.\nTTPAL\n\tENST00000262605.8: 2618 / 3018 probes passed the filter 0.75 < target_specificity < inf.\nNRIP1\n\tENST00000400199.5: 2358 / 2367 probes passed the filter 0.75 < target_specificity < inf.\nSLC5A3\n\tENST00000381151.3: 3903 / 3947 probes passed the filter 0.75 < target_specificity < inf.\nIL17RA\n\tENST00000319363.10: 4091 / 4559 probes passed the filter 0.75 < target_specificity < 
inf.\nRP4-671O14.6\n\tENST00000624919.1: 6549 / 8042 probes passed the filter 0.75 < target_specificity < inf.\nCRTAP\n\tENST00000320954.10: 3652 / 3775 probes passed the filter 0.75 < target_specificity < inf.\nNKTR\n\tENST00000429888.5: 3121 / 3158 probes passed the filter 0.75 < target_specificity < inf.\nFYCO1\n\tENST00000296137.6: 4818 / 4824 probes passed the filter 0.75 < target_specificity < inf.\nBSN\n\tENST00000296452.4: 6879 / 7001 probes passed the filter 0.75 < target_specificity < inf.\nPOLQ\n\tENST00000264233.5: 3828 / 3828 probes passed the filter 0.75 < target_specificity < inf.\nUMPS\n\tENST00000232607.6: 3657 / 4038 probes passed the filter 0.75 < target_specificity < inf.\nPLXNA1\n\tENST00000393409.2: 3488 / 3488 probes passed the filter 0.75 < target_specificity < inf.\nDNAJC13\n\tENST00000260818.10: 3292 / 3292 probes passed the filter 0.75 < target_specificity < inf.\nKPNA4\n\tENST00000334256.8: 2389 / 2873 probes passed the filter 0.75 < target_specificity < inf.\nPIK3CA\n\tENST00000263967.3: 1827 / 1967 probes passed the filter 0.75 < target_specificity < inf.\nAFAP1\n\tENST00000358461.6: 3402 / 3443 probes passed the filter 0.75 < target_specificity < inf.\nFAM184B\n\tENST00000265018.3: 2541 / 2857 probes passed the filter 0.75 < target_specificity < inf.\nSLC7A11\n\tENST00000280612.9: 2148 / 2448 probes passed the filter 0.75 < target_specificity < inf.\nSMARCA5\n\tENST00000283131.3: 2136 / 2287 probes passed the filter 0.75 < target_specificity < inf.\nANKH\n\tENST00000284268.6: 3782 / 3996 probes passed the filter 0.75 < target_specificity < inf.\nFBN2\n\tENST00000619499.4: 6233 / 6233 probes passed the filter 0.75 < target_specificity < inf.\nAFF4\n\tENST00000265343.9: 3560 / 3560 probes passed the filter 0.75 < target_specificity < inf.\nSKP1\n\tENST00000353411.10: 3470 / 4066 probes passed the filter 0.75 < target_specificity < inf.\nARL10\n\tENST00000310389.5: 4381 / 5285 probes passed the filter 0.75 < target_specificity < 
inf.\nFAF2\n\tENST00000261942.6: 2704 / 2704 probes passed the filter 0.75 < target_specificity < inf.\nSCUBE3\n"
]
],
[
[
"# Design readout sequences",
"_____no_output_____"
]
],
[
[
"# Load the readout sequences into a data frame\nreadout_seqs = fio.load_fasta_into_df(readout_fasta_file)\nrs.append_on_bit_ids_to_readout_sequences(readout_seqs, bit_names)\nreadout_seqs ",
"_____no_output_____"
],
[
"# Add the readout sequences. Here we randomly add 3 readout sequences to each probe.\n# The sequences are concatenated directly (spacer='').\nrs.add_readout_seqs_to_probes_random(probe_dict, readout_seqs, barcode_table, 3, \n spacer='', gene_id_key='name', n_threads=8)",
"_____no_output_____"
],
[
"# Filter out probes that have off-targets to rRNA/tRNAs\not.calc_OTs(probe_dict, ottable_rtRNAs, 'target_readout_sequence', 'target_readout_OT_rtRNA', 15)\nplot.plot_hist(probe_dict, 'target_readout_OT_rtRNA', y_max=400)\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_readout_OT_rtRNA', upper_bound=0.5)\nplot.plot_hist(probe_dict, 'target_readout_OT_rtRNA')",
"_____no_output_____"
],
[
"# NOTE: This step is optional since JM's design did not include it.\n# Calculate how many more off-targets to the transcriptome are introduced due to the readout sequences.\n# The off-target counts are weighted down by the FPKMs of the on-target transcripts.\not.calc_OT_diffs(probe_dict, ottable_transcriptome, gene_ottable_dict, transcript_fpkms, \n 'target_sequence', 'target_readout_sequence', 'readout_OT_increase', 17)\nplot.plot_hist(probe_dict, 'readout_OT_increase', y_max=400)\n\n# Filter out the probes with extra off-targets due to the readouts.\n# Require the new weighted off-targets to be minor compared to the on-target weight.\nfilters.filter_probe_dict_by_metric(probe_dict, 'readout_OT_increase', upper_bound=0.25 * (30 - 17 + 1))\nplot.plot_hist(probe_dict, 'readout_OT_increase')",
"_____no_output_____"
]
],
[
[
"# Select probes",
"_____no_output_____"
]
],
[
[
"%%time\n# Select probes by a stochastic greedy algorithm that optimizes the on-bit coverage\n# and minimizes the overlap between probes.\nps.select_probes_greedy_stochastic(probe_dict, N_probes_per_transcript=92, N_on_bits=4, N_threads=16)",
"Wall time: 20min 27s\n"
],
[
"# Let's plot the probe coverage of an example transcript\nseq_len = len(transcriptome[transcriptome['transcript_id'] == 'ENST00000380152.7'].iloc[0]['sequence'])\nplot.plot_sequence_coverage(probe_dict['BRCA2']['ENST00000380152.7'], seq_len)",
"_____no_output_____"
],
[
"ps",
"_____no_output_____"
]
],
[
[
"# Primer design",
"_____no_output_____"
]
],
[
[
"# Load the primer candidates into data frames\nforward_primers, reverse_primers = fio.load_primers(forward_primer_file, reverse_primer_file)\ndisplay(forward_primers)\ndisplay(reverse_primers)",
"_____no_output_____"
],
[
"# Select primers\n# Make an OTTable from the current probe sequences.\nottable_target_readout = ot.get_OTTable_for_probe_dictionary(probe_dict, 'target_readout_sequence', 15)\n\n# Calculate the off-targets for the primer sequences and their reverse-complements.\n# Usually, there shouldn't be any off-targets.\not.calc_OTs_df(forward_primers, ottable_target_readout, 'sequence', 'sequence_OT', 15)\not.calc_OTs_df(forward_primers, ottable_target_readout, 'sequence_rc', 'sequence_rc_OT', 15)\not.calc_OTs_df(reverse_primers, ottable_target_readout, 'sequence', 'sequence_OT', 15)\not.calc_OTs_df(reverse_primers, ottable_target_readout, 'sequence', 'sequence_OT', 15)\not.calc_OTs_df(reverse_primers, ottable_target_readout, 'sequence_rc', 'sequence_rc_OT', 15)\n\n# Select the primers with the lowest OTs\nforward_primers = primer_design.randomly_select_primers_with_lowest_OT(forward_primers)\nreverse_primers = primer_design.randomly_select_primers_with_lowest_OT(reverse_primers)\n\n# Now each primer table should contain only a single row with the selected primer\ndisplay(forward_primers)\ndisplay(reverse_primers)\n\n# Save the selected primers\nforward_primers.append(reverse_primers, ignore_index=True).to_csv(selected_primers_file)",
"_____no_output_____"
],
[
"# Add the primer sequences\n# NOTE: the sequence after primer addition should be (reverse_primer)-(target_readouts)-(forward_primer_rc)\nprimer_design.add_primer_sequences(probe_dict, \n reverse_primers.iloc[0]['sequence'], forward_primers.iloc[0]['sequence_rc'],\n input_column='target_readout_sequence', output_column='target_readout_primer_sequence')\n\n# Notice that the T7 promoter (the first 17 bases of the reverse primer) will be lost after in vitro transcription.\n# Create a column of the T7-transcribed sequences for the subsequent quality check.\nprimer_design.add_primer_sequences(probe_dict, \n reverse_primers.iloc[0]['sequence'][17:], forward_primers.iloc[0]['sequence_rc'],\n input_column='target_readout_sequence', output_column='target_readout_primer_sequence_t7_transcribed')",
"_____no_output_____"
]
],
[
[
"# Quality check",
"_____no_output_____"
]
],
[
[
"# Filter out probes that have off-targets to rRNA/tRNAs\not.calc_OTs(probe_dict, ottable_rtRNAs, 'target_readout_primer_sequence_t7_transcribed', 'target_readout_primer_t7_transcribed_OT_rtRNA', 15)\nplot.plot_hist(probe_dict, 'target_readout_primer_t7_transcribed_OT_rtRNA', y_max=400)\nfilters.filter_probe_dict_by_metric(probe_dict, 'target_readout_primer_t7_transcribed_OT_rtRNA', upper_bound=0.5)\nplot.plot_hist(probe_dict, 'target_readout_primer_t7_transcribed_OT_rtRNA')",
"_____no_output_____"
],
[
"# Calculate how many more off-targets to the transcriptome are introduced due to the primer sequences.\n# The off-target counts are weighted down by the FPKMs of the on-target transcripts\not.calc_OT_diffs(probe_dict, ottable_transcriptome, gene_ottable_dict, transcript_fpkms, \n 'target_readout_sequence', 'target_readout_primer_sequence_t7_transcribed', 'primer_OT_increase', 17)\nplot.plot_hist(probe_dict, 'primer_OT_increase', y_max=400)\n\n# Filter out the probes with extra off-targets due to the primers\n# Require the new weighted off-targets to be minor compared to the on-target weight.\nfilters.filter_probe_dict_by_metric(probe_dict, 'primer_OT_increase', upper_bound=0.25 * (30 - 17 + 1))\nplot.plot_hist(probe_dict, 'primer_OT_increase')",
"_____no_output_____"
],
[
"%%time\n# Filter out the probes that are self-complementary or complementary to other probes.\n\n# Iteratively remove the probes with high numbers of cis/trans-complementary hits.\n# This filtering strategy is a compromise between speed and the number of probes to keep.\nwhile True:\n    # Make an OTTable from the reverse-complement sequences of the probes.\n    ottable_probes_rc = ot.get_OTTable_for_probe_dictionary(probe_dict, 'target_readout_primer_sequence', 15, rc=True)\n    \n    # The off-targets in this table indicate cis/trans-complementarity\n    ot.calc_OTs(probe_dict, ottable_probes_rc, 'target_readout_primer_sequence', 'probe_cis_trans_OT', 15)\n    max_ot = max(plot.get_values_from_probe_dict(probe_dict, 'probe_cis_trans_OT'))\n    if max_ot == 0:\n        break\n    \n    # Remove the probes with the highest number of cis/trans-complementary hits\n    filters.filter_probe_dict_by_metric(probe_dict, 'probe_cis_trans_OT', upper_bound=max_ot - 0.5)\n    \nplot.plot_hist(probe_dict, 'probe_cis_trans_OT')",
"VPS13D\n\tENST00000613099.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPRDM2\n\tENST00000343137.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nLUZP1\n\tENST00000418342.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCNR2\n\tENST00000374472.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAHDC1\n\tENST00000374011.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAGO3\n\tENST00000373191.8: 35 / 35 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nRAB3B\n\tENST00000371655.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUSP24\n\tENST00000294383.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nMAN1A2\n\tENST00000356554.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nNOTCH2\n\tENST00000256646.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTPR\n\tENST00000367478.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPLXNA2\n\tENST00000367033.3: 62 / 62 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDIEXF\n\tENST00000491415.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPTPN14\n\tENST00000366956.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCENPF\n\tENST00000366955.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFAM208B\n\tENST00000328090.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nKIAA1462\n\tENST00000375377.1: 58 / 58 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCHST3\n\tENST00000373115.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nRBM20\n\tENST00000369519.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nBUB3\n\tENST00000368865.8: 44 / 44 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCKAP5\n\tENST00000312055.9: 92 / 92 probes passed the filter -inf < 
probe_cis_trans_OT < 74.5.\nMALAT1\n\tENST00000534336.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nRNF169\n\tENST00000299563.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFZD4\n\tENST00000531380.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAMOTL1\n\tENST00000433060.2: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCBL\n\tENST00000264033.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nITPR2\n\tENST00000381340.7: 69 / 69 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSLC38A1\n\tENST00000398637.9: 91 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDIP2B\n\tENST00000301180.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCBX5\n\tENST00000209875.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nANKRD52\n\tENST00000267116.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nLRP1\n\tENST00000243077.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSSH1\n\tENST00000326495.9: 68 / 68 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nBRCA2\n\tENST00000380152.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPROSER1\n\tENST00000352251.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAKAP11\n\tENST00000025301.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nMCF2L\n\tENST00000480321.1: 73 / 73 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nC14orf132\n\tENST00000555004.1: 66 / 66 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDYNC1H1\n\tENST00000360184.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nHERC2\n\tENST00000261609.11: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTHBS1\n\tENST00000260356.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSTARD9\n\tENST00000290607.11: 22 / 22 
probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFBN1\n\tENST00000316623.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUSP8\n\tENST00000396444.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTMOD2\n\tENST00000249700.8: 62 / 62 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTHSD4\n\tENST00000355327.7: 54 / 54 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTSPAN3\n\tENST00000267970.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCEMIP\n\tENST00000220244.7: 68 / 68 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nZNF592\n\tENST00000559607.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSRRM2\n\tENST00000301740.12: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCREBBP\n\tENST00000382070.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTNRC6A\n\tENST00000315183.11: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCCDC113\n\tENST00000219299.8: 72 / 72 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCDYL2\n\tENST00000570137.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPRPF8\n\tENST00000304992.10: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nMYH10\n\tENST00000269243.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nC17orf51\n\tENST00000391411.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nRAD51D\n\tENST00000345365.10: 66 / 66 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPRKCA\n\tENST00000413366.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSMIM5\n\tENST00000537494.1: 72 / 72 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFASN\n\tENST00000306749.2: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nALPK2\n\tENST00000361673.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 
74.5.\nRNF152\n\tENST00000312828.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDSEL\n\tENST00000310045.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSIPA1L3\n\tENST00000222345.10: 69 / 69 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nXDH\n\tENST00000379416.3: 49 / 49 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nYIPF4\n\tENST00000238831.8: 38 / 38 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nHEATR5B\n\tENST00000233099.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSPTBN1\n\tENST00000356805.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUSP34\n\tENST00000398571.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSLC9A2\n\tENST00000233969.2: 73 / 73 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSULT1C2\n\tENST00000495441.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAGPS\n\tENST00000264167.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFZD5\n\tENST00000295417.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAGAP1\n\tENST00000614409.4: 71 / 71 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCEP250\n\tENST00000397527.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTTPAL\n\tENST00000262605.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nNRIP1\n\tENST00000400199.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSLC5A3\n\tENST00000381151.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nIL17RA\n\tENST00000319363.10: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nRP4-671O14.6\n\tENST00000624919.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCRTAP\n\tENST00000320954.10: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nNKTR\n\tENST00000429888.5: 92 / 92 probes passed the 
filter -inf < probe_cis_trans_OT < 74.5.\nFYCO1\n\tENST00000296137.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nBSN\n\tENST00000296452.4: 63 / 63 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPOLQ\n\tENST00000264233.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUMPS\n\tENST00000232607.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPLXNA1\n\tENST00000393409.2: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDNAJC13\n\tENST00000260818.10: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nKPNA4\n\tENST00000334256.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPIK3CA\n\tENST00000263967.3: 66 / 66 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAFAP1\n\tENST00000358461.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFAM184B\n\tENST00000265018.3: 63 / 63 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSLC7A11\n\tENST00000280612.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSMARCA5\n\tENST00000283131.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nANKH\n\tENST00000284268.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFBN2\n\tENST00000619499.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nAFF4\n\tENST00000265343.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSKP1\n\tENST00000353411.10: 71 / 71 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nARL10\n\tENST00000310389.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nFAF2\n\tENST00000261942.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSCUBE3\n\tENST00000274938.7: 45 / 45 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUBR2\n\tENST00000372901.1: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPHIP\n\tENST00000275034.4: 92 / 92 
probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nDOPEY1\n\tENST00000369739.7: 63 / 63 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nASCC3\n\tENST00000369162.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSOD2\n\tENST00000538183.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nIGF2R\n\tENST00000356956.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nEGFR\n\tENST00000275493.6: 63 / 63 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nLMTK2\n\tENST00000297293.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSLC35B4\n\tENST00000378509.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nKIAA1147\n\tENST00000536163.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nXKR5\n\tENST00000618990.4: 40 / 40 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nKIF13B\n\tENST00000524189.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPRKDC\n\tENST00000314191.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUBR5\n\tENST00000521922.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nSAMD12\n\tENST00000409003.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nKLHL9\n\tENST00000359039.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTLN1\n\tENST00000314888.9: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nZCCHC6\n\tENST00000375963.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nTSTD2\n\tENST00000341170.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nPAPPA\n\tENST00000328252.3: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nZBTB43\n\tENST00000373464.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nGPR107\n\tENST00000372410.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 
74.5.\nPRRC2B\n\tENST00000357304.8: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nGTF3C4\n\tENST00000372146.4: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nCOL5A1\n\tENST00000371817.7: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nMED14\n\tENST00000324817.5: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nUSP9X\n\tENST00000378308.6: 92 / 92 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\nNHSL2\n\tENST00000633930.1: 64 / 64 probes passed the filter -inf < probe_cis_trans_OT < 74.5.\n"
],
[
"# Also get the reverse-complementary sequences of the designed probes\np_d.get_rc_sequences(probe_dict, 'target_readout_primer_sequence', 'target_readout_primer_sequence_rc')\n# Write the designed probes\np_d.probe_dict_to_df(probe_dict).to_csv(probe_output_file, index=False)",
"_____no_output_____"
],
[
"# Write the transcript level report\ntranscript_level_report = qc.generate_transcript_level_report(probe_dict, transcriptome)\ndisplay(transcript_level_report)\ntranscript_level_report.to_csv(transcript_level_report_file, index=False)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
] |
d07d7cd401ffddde08f5101f06c2ede72ea79c25 | 1,151 | ipynb | Jupyter Notebook | coursera/Python-Data-Science-and-Machine-Learning-Bootcamp/Aula 1.ipynb | kenjiyamamoto/data-scientist | 8d93dfa66ee27c47b208ce36c5b0378912e4ca55 | [
"MIT"
] | null | null | null | coursera/Python-Data-Science-and-Machine-Learning-Bootcamp/Aula 1.ipynb | kenjiyamamoto/data-scientist | 8d93dfa66ee27c47b208ce36c5b0378912e4ca55 | [
"MIT"
] | null | null | null | coursera/Python-Data-Science-and-Machine-Learning-Bootcamp/Aula 1.ipynb | kenjiyamamoto/data-scientist | 8d93dfa66ee27c47b208ce36c5b0378912e4ca55 | [
"MIT"
] | null | null | null | 15.767123 | 36 | 0.464813 | [
[
[
"type(8)",
"_____no_output_____"
],
[
"type(0.1)",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code"
]
] |
d07d7e2ee23c0b289af6c27be0af8bc1654d560d | 130,174 | ipynb | Jupyter Notebook | testing/mumax_projection-Copy1.ipynb | Skoricius/xmcd-projection | 3c48c9ae1522956b257548a1b2775aa877bca649 | [
"MIT"
] | null | null | null | testing/mumax_projection-Copy1.ipynb | Skoricius/xmcd-projection | 3c48c9ae1522956b257548a1b2775aa877bca649 | [
"MIT"
] | null | null | null | testing/mumax_projection-Copy1.ipynb | Skoricius/xmcd-projection | 3c48c9ae1522956b257548a1b2775aa877bca649 | [
"MIT"
] | null | null | null | 296.523918 | 38,448 | 0.919454 | [
[
[
"%gui qt\n%matplotlib qt\nfrom glob import glob\nimport os, sys\n# only necessary if using without installing\nsys.path.append(\"..\")\nfrom xmcd_projection import *\nfrom skimage.io import imread, imsave\nfrom PIL import Image\nimport meshio\nimport trimesh",
"c:\\users\\lukas\\miniconda3\\lib\\site-packages\\numpy\\_distributor_init.py:30: UserWarning: loaded more than 1 DLL from .libs:\nc:\\users\\lukas\\miniconda3\\lib\\site-packages\\numpy\\.libs\\libopenblas.GK7GX5KEQ4F6UYO3P26ULGBQYHGQO7J4.gfortran-win_amd64.dll\nc:\\users\\lukas\\miniconda3\\lib\\site-packages\\numpy\\.libs\\libopenblas.NOIJJG62EMASZI6NYURL6JBKM4EVBGM7.gfortran-win_amd64.dll\n warnings.warn(\"loaded more than 1 DLL from .libs:\"\n"
]
],
[
[
"### Get file paths",
"_____no_output_____"
]
],
[
[
"msh_file = \"DW_MeshConv4nm_25nmSep.vtu\"\nmag_file = \"DW_MeshConv4nm_25nmSep.csv\"\n# scale for the points\nscale = 1e9",
"_____no_output_____"
]
],
[
[
"## Generate raytracing - skip if generated",
"_____no_output_____"
]
],
[
[
"# get the mesh, scale the points to nm\nmsh = Mesh.from_file(msh_file, scale=scale)",
"_____no_output_____"
]
],
[
[
"#### Make sure that the projection vector is correct and that the structure is oriented well",
"_____no_output_____"
]
],
[
[
"# get the projection vector\np = get_projection_vector(90, 0) # direction of xrays\nn = [0, 1, 1] # normal to the projection plane\nx0 = [-100, 0, 0] # point on the projection plane\n\n# prepare raytracing object\nraytr = RayTracing(msh, p, n=n, x0=x0)\nstruct = raytr.struct\nstruct_projected = raytr.struct_projected",
"_____no_output_____"
],
[
"vis = MeshVisualizer(struct, struct_projected)\nvis.set_camera(dist=2e5)\nvis.show()",
"_____no_output_____"
]
],
[
[
"## If raytracing file generated - skip if not",
"_____no_output_____"
]
],
[
[
"# load raytracing if exists\nraytr = np.load(\"raytracing.npy\", allow_pickle=True).item()\nstruct = raytr.struct\nstruct_projected = raytr.struct_projected",
"_____no_output_____"
]
],
[
[
"## Generate and save raytracing",
"_____no_output_____"
]
],
[
[
"raytr.get_piercings()\n# np.save(\"raytracing.npy\", raytr, allow_pickle=True)",
"100%|███████████████████████████████████████████████████████████████████████████████| 56592/56592 [07:29<00:00, 125.99it/s]\n"
]
],
[
[
"## Get the xmcd",
"_____no_output_____"
],
[
"#### Get magnetisation, fix vertex shuffling\nNote: sometimes if the mesh file has multiple parts, the paraview export and the original mesh coordinates are not in the same order. I add a function to fix that when necessary",
"_____no_output_____"
]
],
[
[
"magnetisation, mag_points = load_mesh_magnetisation(mag_file, scale=scale)\nshuffle_file = \"shuffle_indx.npy\"\ntry:\n shuffle_indx = np.load(shuffle_file)\nexcept FileNotFoundError:\n print('File not found. Generating shuffle indx')\n shuffle_indx = msh.get_shuffle_indx(mag_points)\n np.save(shuffle_file, shuffle_indx)\nmagnetisation = magnetisation[shuffle_indx, :]",
"File not found. Generating shuffle indx\n"
],
[
"magnetisation, mag_points = load_mesh_magnetisation(mag_file, scale=scale)\nshuffle_indx = msh.get_shuffle_indx(mag_points)\nmagnetisation = magnetisation[shuffle_indx, :]",
"_____no_output_____"
]
],
[
[
"### Get the colours and XMCD values",
"_____no_output_____"
]
],
[
[
"xmcd_value = raytr.get_xmcd(magnetisation)\nmag_colors = get_struct_face_mag_color(struct, magnetisation)",
"_____no_output_____"
],
[
"azi=90\ncenter_struct = [0, 0, 0]\ndist_struct = 1e4\ncenter_peem = [100, -200, 0]\ndist_peem = 8e4\n\nvis = MeshVisualizer(struct, struct_projected, projected_xmcd=xmcd_value, struct_colors=mag_colors)\nvis.show(azi=azi, center=center_peem, dist=dist_peem)\nImage.fromarray(vis.get_image_np())",
"_____no_output_____"
]
],
[
[
"#### View different parts of the image separately",
"_____no_output_____"
],
[
"#### Both",
"_____no_output_____"
]
],
[
[
"\nvis.update_colors(xmcd_value, mag_colors)\nvis.view_both(azi=azi, center=center_peem, dist=dist_peem)\nImage.fromarray(vis.get_image_np())",
"_____no_output_____"
]
],
[
[
"#### Projection",
"_____no_output_____"
]
],
[
[
"vis.view_projection(azi=azi, center=center_peem, dist=dist_peem)\nImage.fromarray(vis.get_image_np())",
"_____no_output_____"
]
],
[
[
"#### Structure",
"_____no_output_____"
]
],
[
[
"\ncenter_struct = [75, 50, 0]\ndist_struct = 1e4\nvis.view_struct(azi=azi, center=center_struct, dist=dist_struct)\nImage.fromarray(vis.get_image_np())",
"_____no_output_____"
]
],
[
[
"#### Blurred image",
"_____no_output_____"
]
],
[
[
"vis.view_projection(azi=azi, center=center_peem, dist=dist_peem)\nImage.fromarray((vis.get_blurred_image(desired_background=0.7)*255).astype(np.uint8))",
"_____no_output_____"
]
],
[
[
"#### Saving one render",
"_____no_output_____"
]
],
[
[
"vis.view_both(azi=azi, center=center_peem, dist=dist_peem)\nvis.save_render('mumax_shadow.png')\nvis.view_projection()\nblurred = vis.get_blurred_image(desired_background=0.7)\nimsave('mumax_shadow_blurred.png', (blurred*255).astype(np.uint8), check_contrast=False)\n\nvis.view_struct(azi=azi, center=center_struct, dist=dist_struct)\nvis.save_render('mumax_structure_view.png')",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07d8966822f9ea96a0b881b786adbfb7f1dd763 | 29,287 | ipynb | Jupyter Notebook | EmployeeSQL/sql-challenge-bonus.ipynb | CraftyJack/sql-challenge | fdfc196793c1c1477b071afc5e1e85f5d07a6967 | [
"ADSL"
] | null | null | null | EmployeeSQL/sql-challenge-bonus.ipynb | CraftyJack/sql-challenge | fdfc196793c1c1477b071afc5e1e85f5d07a6967 | [
"ADSL"
] | null | null | null | EmployeeSQL/sql-challenge-bonus.ipynb | CraftyJack/sql-challenge | fdfc196793c1c1477b071afc5e1e85f5d07a6967 | [
"ADSL"
] | null | null | null | 115.30315 | 13,096 | 0.862328 | [
[
[
"# do our imports and set up the engine and connection to the database.\n\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom config import postgresql_username, postgresql_password\nfrom sqlalchemy import create_engine\nengine = create_engine(f'postgresql://{postgresql_username}:{postgresql_password}@localhost:5432/EmployeeSQL')\nconnection = engine.connect()",
"_____no_output_____"
],
[
"# read in the data we're going to need as a dataframe\ndata_df = pd.read_sql(\"SELECT employees.emp_no, salaries.salary, titles.title FROM employees INNER JOIN salaries ON employees.emp_no = salaries.emp_no INNER JOIN titles ON employees.emp_title = titles.title_id\", connection)",
"_____no_output_____"
],
[
"# take a peek at the dataframe\ndata_df.head()",
"_____no_output_____"
],
[
"# first, a histogram of the salaries.\nplt.hist(data_df.salary)\nplt.xlabel('Salary')\nplt.ylabel('Frequency')\nplt.show()",
"_____no_output_____"
]
],
[
[
"The histogram looks a little fishy. The general shape of the histogram is not surprising, but the limits are. We don't even make it to $150K here, so c-suite salaries are likely excluded.",
"_____no_output_____"
]
],
[
[
"# now, a bar chart of average salary by title.\nhelper_df = data_df.groupby(['title'])['salary'].mean()\nhelper_df.plot.bar()",
"_____no_output_____"
]
],
[
[
"This definitely looks fishy. The average salaries are all really tightly clustered. There also doesn't seem to be much point to advancement. For example, Assistant Engineer, Engineer, and Senior Engineer all make about the same on average. \n\nWith that, let's take a look behind the curtain.",
"_____no_output_____"
]
],
[
[
"punchline = pd.read_sql(\"SELECT * FROM employees INNER JOIN salaries ON employees.emp_no = salaries.emp_no INNER JOIN titles ON employees.emp_title = titles.title_id WHERE employees.emp_no = 499942\", connection)\nprint(punchline)",
" emp_no emp_title birth_date first_name last_name sex hire_date emp_no \\\n0 499942 e0004 1963-01-10 April Foolsday F 1997-02-10 499942 \n\n salary title_id title \n0 40000 e0004 Technique Leader \n"
]
],
[
[
"The name is \"April Foolsday\".",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07d8ecaf1329e559a1f1ff371cdf5c5182f9529 | 19,398 | ipynb | Jupyter Notebook | Notebooks/03-Intro_to_Dask.ipynb | quasiben/rapids-dask-tutorial-2021 | 493ce39d057292f49ba61106d94d41e2dbc49ed9 | [
"BSD-3-Clause"
] | 7 | 2021-05-20T18:55:27.000Z | 2022-02-08T16:39:24.000Z | Notebooks/03-Intro_to_Dask.ipynb | quasiben/rapids-dask-tutorial-2021 | 493ce39d057292f49ba61106d94d41e2dbc49ed9 | [
"BSD-3-Clause"
] | 2 | 2021-05-05T19:53:04.000Z | 2021-05-20T13:34:19.000Z | Notebooks/03-Intro_to_Dask.ipynb | quasiben/rapids-dask-tutorial-2021 | 493ce39d057292f49ba61106d94d41e2dbc49ed9 | [
"BSD-3-Clause"
] | 2 | 2021-05-04T15:59:39.000Z | 2021-05-13T04:17:47.000Z | 30.64455 | 513 | 0.610785 | [
[
[
"# Dask Overview",
"_____no_output_____"
],
[
"Dask is a flexible library for parallel computing in Python that makes scaling out your workflow smooth and simple. On the CPU, Dask uses Pandas (NumPy) to execute operations in parallel on DataFrame (array) partitions.\n\nDask-cuDF extends Dask where necessary to allow its DataFrame partitions to be processed by cuDF GPU DataFrames as opposed to Pandas DataFrames. For instance, when you call dask_cudf.read_csv(…), your cluster’s GPUs do the work of parsing the CSV file(s) with underlying cudf.read_csv(). Dask also supports array based workflows using CuPy.\n\n## When to use Dask\nIf your workflow is fast enough on a single GPU or your data comfortably fits in memory on a single GPU, you would want to use cuDF or CuPy. If you want to distribute your workflow across multiple GPUs, have more data than you can fit in memory on a single GPU, or want to analyze data spread across many files at once, you would want to use Dask.\n\nOne additional benefit Dask provides is that it lets us easily spill data between device and host memory. This can be very useful when we need to do work that would otherwise cause out of memory errors.\n\nIn this brief notebook, you'll walk through an example of using Dask on a single GPU. Because we're using Dask, the same code in this notebook would work on two, eight, 16, or 100s of GPUs.",
"_____no_output_____"
],
[
"# Creating a Local Cluster\n\nThe easiest way to scale workflows on a single node is to use the \`LocalCUDACluster\` API. This lets us create a GPU cluster, using one worker per GPU by default.\n\nIn this case, we'll pass the following arguments. \n\n- \`CUDA_VISIBLE_DEVICES\`, to limit our cluster to specific GPUs (for demonstration purposes).\n- \`device_memory_limit\`, to illustrate how we can spill data between GPU and CPU memory. Artificial memory limits like this reduce our performance if we don't actually need them, but can let us accomplish much larger tasks when we do.\n- \`rmm_pool_size\`, to use the RAPIDS Memory Manager to allocate one big chunk of memory upfront rather than having our operations call \`cudaMalloc\` all the time under the hood. This improves performance, and is generally a best practice.",
"_____no_output_____"
]
],
[
[
"from dask.distributed import Client, fire_and_forget, wait\nfrom dask_cuda import LocalCUDACluster\nfrom dask.utils import parse_bytes\nimport dask\n\n\ncluster = LocalCUDACluster(\n CUDA_VISIBLE_DEVICES=\"0,1\",\n device_memory_limit=parse_bytes(\"3GB\"),\n rmm_pool_size=parse_bytes(\"16GB\"),\n) \n\nclient = Client(cluster)\nclient",
"_____no_output_____"
]
],
[
[
"Click the **Dashboard** link above to view your Dask dashboard. ",
"_____no_output_____"
],
[
"## cuDF DataFrames to Dask DataFrames",
"_____no_output_____"
],
[
"Dask lets us scale our cuDF workflows. We'll walk through a couple of examples below, and then also highlight how Dask lets us spill data from GPU to CPU memory.\n\nFirst, we'll create a dataframe with CPU Dask and then send it to the GPU",
"_____no_output_____"
]
],
[
[
"import cudf\nimport dask_cudf",
"_____no_output_____"
],
[
"ddf = dask_cudf.from_dask_dataframe(dask.datasets.timeseries())\nddf.head()",
"_____no_output_____"
]
],
[
[
"### Example One: Groupby-Aggregations",
"_____no_output_____"
]
],
[
[
"ddf.groupby([\"id\", \"name\"]).agg({\"x\":['sum', 'mean']}).head()",
"_____no_output_____"
]
],
[
[
"Run the code above again.\n\nIf you look at the task stream in the dashboard, you'll notice that we're creating the data every time. That's because Dask is lazy. We need to `persist` the data if we want to cache it in memory.",
"_____no_output_____"
]
],
[
[
"ddf = ddf.persist()\nwait(ddf);",
"_____no_output_____"
],
[
"ddf.groupby([\"id\", \"name\"]).agg({\"x\":['sum', 'mean']}).head()",
"_____no_output_____"
]
],
[
[
"This is the same API as cuDF, except it works across many GPUs.",
"_____no_output_____"
],
[
"### Example Two: Rolling Windows\n\nWe can also do things like rolling window calculations with Dask and GPUs.",
"_____no_output_____"
]
],
[
[
"ddf.head()",
"_____no_output_____"
],
[
"rolling = ddf[['x','y']].rolling(window=3)\ntype(rolling)",
"_____no_output_____"
],
[
"rolling.mean().head()",
"_____no_output_____"
]
],
[
[
"## Larger than GPU Memory Workflows\n\nWhat if we needed to scale up even more, but didn't have enough GPU memory? Dask handles spilling for us, so we don't need to worry about it. The \`device_memory_limit\` parameter we used while creating the LocalCluster determines when we should start spilling. In this case, we'll start spilling when we've used about 3GB of GPU memory.\n\nLet's create a larger dataframe to use as an example.",
"_____no_output_____"
]
],
[
[
"ddf = dask_cudf.from_dask_dataframe(dask.datasets.timeseries(start=\"2000-01-01\", end=\"2003-12-31\", partition_freq='60d'))\n\nddf = ddf.persist()\nlen(ddf)",
"_____no_output_____"
],
[
"print(f\"{ddf.memory_usage(deep=True).sum().compute() / 1e9} GB of data\")",
"_____no_output_____"
],
[
"ddf.head()",
"_____no_output_____"
]
],
[
[
"Let's imagine we have some downstream operations that require all the data from a given unique identifier in the same partition. We can repartition our data based on the \`id\` column using the \`shuffle\` API.\n\nRepartitioning our large dataframe will spike GPU memory higher than 3GB, so we'll need to spill to CPU memory.",
"_____no_output_____"
]
],
[
[
"ddf = ddf.shuffle(on=\"id\")\nddf = ddf.persist()\n\nlen(ddf)",
"_____no_output_____"
]
],
[
[
"Watch the Dask Dashboard while this runs. You should see a lot of tasks in the stream like `disk-read` and `disk-write`. Setting a `device_memory_limit` tells dask to spill to CPU memory and potentially disk (if we overwhelm CPU memory). This lets us do these large computations even when we're almost out of memory (though in this case, we faked it).",
"_____no_output_____"
],
[
"# Dask Custom Functions\n\nDask DataFrames also provide a \`map_partitions\` API, which is very useful for parallelizing custom logic that doesn't quite fit perfectly or doesn't need to be used with the Dask dataframe API. Dask will \`map\` the function to every partition of the distributed dataframe.\n\nNow that we have all the rows of each \`id\` collected in the same partitions, what if we just wanted to sort **within each partition**? Avoiding global sorts is usually a good idea if possible, since they're very expensive operations.",
"_____no_output_____"
]
],
[
[
"sorted_ddf = ddf.map_partitions(lambda x: x.sort_values(\"id\"))\nlen(sorted_ddf)",
"_____no_output_____"
]
],
[
[
"We could also do something more complicated and wrap it into a function. Let's do a rolling window on the two value columns after sorting by the id column.",
"_____no_output_____"
]
],
[
[
"def sort_and_rolling_mean(df):\n df = df.sort_values(\"id\")\n df = df.rolling(3)[[\"x\", \"y\"]].mean()\n return df",
"_____no_output_____"
],
[
"result = ddf.map_partitions(sort_and_rolling_mean)\nresult = result.persist()\nwait(result);",
"_____no_output_____"
],
[
"# let's look at a random partition\nresult.partitions[12].head()",
"_____no_output_____"
]
],
[
[
"Pretty cool. When we're using `map_partitions`, the function is executing on the individual cuDF DataFrames that make up our Dask DataFrame. This means we can do any cuDF operation, run CuPy array manipulations, or anything else we want.",
"_____no_output_____"
],
[
"# Dask Delayed\n\nDask also provides a `delayed` API, which is useful for parallelizing custom logic that doesn't quite fit into the DataFrame API.\n\nLet's imagine we wanted to run thousands of regressions models on different combinations of two features. We can do this experiment super easily with dask.delayed.",
"_____no_output_____"
]
],
[
[
"from cuml.linear_model import LinearRegression\nfrom dask import delayed\nimport dask\nimport numpy as np\nfrom itertools import combinations",
"_____no_output_____"
],
[
"# Setup data\nnp.random.seed(12)\n\nnrows = 1000000\nncols = 50\ndf = cudf.DataFrame({f\"x{i}\": np.random.randn(nrows) for i in range(ncols)})\ndf['y'] = np.random.randn(nrows)",
"_____no_output_____"
],
[
"feature_combinations = list(combinations(df.columns.drop(\"y\"), 2))\nfeature_combinations[:10]",
"_____no_output_____"
],
[
"len(feature_combinations)",
"_____no_output_____"
],
[
"# Many calls to linear regression, parallelized with Dask\n@delayed\ndef fit_ols(df, feature_cols, target_col=\"y\"):\n clf = LinearRegression()\n clf.fit(df[list(feature_cols)], df[target_col])\n return feature_cols, clf.coef_, clf.intercept_",
"_____no_output_____"
],
[
"# scatter the data to the workers beforehand\ndata_future = client.scatter(df, broadcast=True)",
"_____no_output_____"
],
[
"results = []\n\nfor features in feature_combinations:\n # note how i'm passing the scattered data future\n res = fit_ols(data_future, features)\n results.append(res)\n\nres = dask.compute(results)\nres = res[0]\n\nprint(\"Features\\t\\tCoefficients\\t\\t\\tIntercept\")\nfor i in range(5):\n print(res[i][0], res[i][1].values, res[i][2], sep=\"\\t\")",
"_____no_output_____"
]
],
[
[
"# Handling Parquet Files\n\nDask and cuDF provide accelerated Parquet readers and writers, and it's useful to take advantage of these tools.\n\nTo start, let's write out our DataFrame `ddf` to Parquet files using the `to_parquet` API and delete it from memory.",
"_____no_output_____"
]
],
[
[
"print(ddf.npartitions)",
"_____no_output_____"
],
[
"ddf.to_parquet(\"ddf.parquet\")",
"_____no_output_____"
],
[
"del ddf",
"_____no_output_____"
]
],
[
[
"Let's take a look at what happened.",
"_____no_output_____"
]
],
[
[
"!ls ddf.parquet | head",
"_____no_output_____"
]
],
[
[
"We end up with many parquet files, and one metadata file. Dask will write one file per partition.\n\nLet's read the data back in with `dask_cudf.read_parquet`.",
"_____no_output_____"
]
],
[
[
"ddf = dask_cudf.read_parquet(\"ddf.parquet/\")\nddf",
"_____no_output_____"
]
],
[
[
"Why do we have more partitions than files? It turns out, Dask's readers do things like chunk our data by default. Additionally, the \`_metadata\` file helps provide guidelines for reading the data. But, we can still read them on a per-file basis if we want by using a \`*\` wildcard in the filepath and ignoring the metadata.",
"_____no_output_____"
]
],
[
[
"ddf = dask_cudf.read_parquet(\"ddf.parquet/*.parquet\")\nddf",
"_____no_output_____"
]
],
[
[
"Let's now write one big parquet file and then read it back in. We can `repartition` our dataset down to a single partition.",
"_____no_output_____"
]
],
[
[
"ddf.repartition(npartitions=1).to_parquet(\"big_ddf.parquet\")",
"_____no_output_____"
],
[
"dask_cudf.read_parquet(\"big_ddf.parquet/\")",
"_____no_output_____"
]
],
[
[
"We still get lots of partitions? We can control the splitting behavior using the `split_row_groups` parameter.",
"_____no_output_____"
]
],
[
[
"dask_cudf.read_parquet(\"big_ddf.parquet/\", split_row_groups=False)",
"_____no_output_____"
]
],
[
[
"In general, we want to avoid massive partitions. The sweet spot is probably around 2-3 GB of data per partition for a 32GB V100.",
"_____no_output_____"
],
[
"# Understanding Persist and Compute\n\nBefore we close, it's worth coming back to the concepts of `persist` and `compute`. We've seen them several times, but haven't gone into depth.\n\nMost Dask operations are lazy. This is a common pattern in distributed computing, but is likely unfamiliar to those who primarily use single-machine libraries like pandas and cuDF. As a result, you'll usually need to call an **eager** operation like `len` or `persist` to actually trigger work.\n\nIn general, you should avoid calling `compute` except when collecting small datasets or scalars. When we spin up a cluster, we're interacting with our cluster in what we call the `Client` Python process. When we created a `Client` object above, this is what we did. Calling `compute` brings all of the results back to a single GPU cuDF DataFrame in the client process, not in any of the worker processes. This means we're not using the same memory pool, so we could go out of memory if we're not careful.\n\nFor those of you with Spark experience, you can think of `persist` as triggering work and caching the dataframe in distributed memory and `compute` as collecting the data or results into a single GPU dataframe (cuDF) on the driver.\n\n\n### Should I Persist My Data?\n\nPersisting is generally a good idea if the data needs to be accessed multiple times, to avoid repeated computation. However, if the size of your data would lead to memory pressure, this could cause spilling, which hurts performance. As a best practice, we recommend persisting only when necessary or when you're using an eager operation in the middle of your workflow (to avoid repeating computation).\n\nNote that calling `df.head` is an eager operation, which will trigger some computation. If you're going to be doing exploratory data analysis or visually inspecting the data, you would want to persist beforehand.",
"_____no_output_____"
],
[
"# Summary",
"_____no_output_____"
],
[
"RAPIDS lets us scale up and take advantage of GPU acceleration. Dask lets us scale out to multiple machines. Dask supports both cuDF DataFrames and CuPy arrays, with generally the same APIs as the single-machine libraries.\n\nWe encourage you to read the Dask [documentation](https://docs.dask.org/en/latest/) to learn more, and also look at our [10 Minute Guide to cuDF and Dask cuDF](https://docs.rapids.ai/api/cudf/nightly/10min.html)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
d07db2686799ceb24de1dd22ce784fc624233876 | 5,237 | ipynb | Jupyter Notebook | FeatureCollection/distance.ipynb | OIEIEIO/earthengine-py-notebooks | 5d6c5cdec0c73bf02020ee17d42c9e30d633349f | [
"MIT"
] | 1,008 | 2020-01-27T02:03:18.000Z | 2022-03-24T10:42:14.000Z | FeatureCollection/distance.ipynb | rafatieppo/earthengine-py-notebooks | 99fbc4abd1fb6ba41e3d8a55f8911217353a3237 | [
"MIT"
] | 8 | 2020-02-01T20:18:18.000Z | 2021-11-23T01:48:02.000Z | FeatureCollection/distance.ipynb | rafatieppo/earthengine-py-notebooks | 99fbc4abd1fb6ba41e3d8a55f8911217353a3237 | [
"MIT"
] | 325 | 2020-01-27T02:03:36.000Z | 2022-03-25T20:33:33.000Z | 36.880282 | 470 | 0.558526 | [
[
[
"<table class=\"ee-notebook-buttons\" align=\"left\">\n <td><a target=\"_blank\" href=\"https://github.com/giswqs/earthengine-py-notebooks/tree/master/FeatureCollection/distance.ipynb\"><img width=32px src=\"https://www.tensorflow.org/images/GitHub-Mark-32px.png\" /> View source on GitHub</a></td>\n <td><a target=\"_blank\" href=\"https://nbviewer.jupyter.org/github/giswqs/earthengine-py-notebooks/blob/master/FeatureCollection/distance.ipynb\"><img width=26px src=\"https://upload.wikimedia.org/wikipedia/commons/thumb/3/38/Jupyter_logo.svg/883px-Jupyter_logo.svg.png\" />Notebook Viewer</a></td>\n <td><a target=\"_blank\" href=\"https://colab.research.google.com/github/giswqs/earthengine-py-notebooks/blob/master/FeatureCollection/distance.ipynb\"><img src=\"https://www.tensorflow.org/images/colab_logo_32px.png\" /> Run in Google Colab</a></td>\n</table>",
"_____no_output_____"
],
[
"## Install Earth Engine API and geemap\nInstall the [Earth Engine Python API](https://developers.google.com/earth-engine/python_install) and [geemap](https://geemap.org). The **geemap** Python package is built upon the [ipyleaflet](https://github.com/jupyter-widgets/ipyleaflet) and [folium](https://github.com/python-visualization/folium) packages and implements several methods for interacting with Earth Engine data layers, such as `Map.addLayer()`, `Map.setCenter()`, and `Map.centerObject()`.\nThe following script checks if the geemap package has been installed. If not, it will install geemap, which automatically installs its [dependencies](https://github.com/giswqs/geemap#dependencies), including earthengine-api, folium, and ipyleaflet.",
"_____no_output_____"
]
],
[
[
"# Installs geemap package\nimport subprocess\n\ntry:\n import geemap\nexcept ImportError:\n print('Installing geemap ...')\n subprocess.check_call([\"python\", '-m', 'pip', 'install', 'geemap'])",
"_____no_output_____"
],
[
"import ee\nimport geemap",
"_____no_output_____"
]
],
[
[
"## Create an interactive map \nThe default basemap is `Google Maps`. [Additional basemaps](https://github.com/giswqs/geemap/blob/master/geemap/basemaps.py) can be added using the `Map.add_basemap()` function. ",
"_____no_output_____"
]
],
[
[
"Map = geemap.Map(center=[40,-100], zoom=4)\nMap",
"_____no_output_____"
]
],
[
[
"## Add Earth Engine Python script ",
"_____no_output_____"
]
],
[
[
"# Add Earth Engine dataset\n# Collection.distance example.\n# Computes the distance to the nearest feature in a collection.\n\n# Construct a FeatureCollection from a list of geometries.\nfc = ee.FeatureCollection([\n ee.Geometry.Point(-72.94411, 41.32902),\n ee.Geometry.Point(-72.94411, 41.33402),\n ee.Geometry.Point(-72.94411, 41.33902),\n # The geometries do not need to be the same type.\n ee.Geometry.LineString(\n -72.93411, 41.30902, -72.93411, 41.31902, -72.94411, 41.31902)\n])\n\n# Compute distance from the features, to a max of 1000 meters.\ndistance = fc.distance(1000, 100)\n\nMap.setCenter(-72.94, 41.32, 13)\nMap.addLayer(distance, {'min': 0, 'max': 1000, 'palette': ['yellow', 'red']}, 'distance')\nMap.addLayer(fc, {}, 'Features')\n",
"_____no_output_____"
]
],
[
[
"## Display Earth Engine data layers ",
"_____no_output_____"
]
],
[
[
"Map.addLayerControl() # This line is not needed for ipyleaflet-based Map.\nMap",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07dc7aa88086391c1fd897f8efd003fd5392d7f | 427,489 | ipynb | Jupyter Notebook | notebooks/66-PRMT-2332--lloyd-george-digitisation-ccgs-failure-rates.ipynb | nhsconnect/prm-gp2gp-data-sandbox | ba2f7bcf7fb3371b2358c2386a265fbeb262a994 | [
"Apache-2.0"
] | 1 | 2022-02-02T15:43:10.000Z | 2022-02-02T15:43:10.000Z | notebooks/66-PRMT-2332--lloyd-george-digitisation-ccgs-failure-rates.ipynb | nhsconnect/prm-gp2gp-data-sandbox | ba2f7bcf7fb3371b2358c2386a265fbeb262a994 | [
"Apache-2.0"
] | null | null | null | notebooks/66-PRMT-2332--lloyd-george-digitisation-ccgs-failure-rates.ipynb | nhsconnect/prm-gp2gp-data-sandbox | ba2f7bcf7fb3371b2358c2386a265fbeb262a994 | [
"Apache-2.0"
] | 1 | 2021-04-11T07:23:49.000Z | 2021-04-11T07:23:49.000Z | 189.238158 | 39,956 | 0.775676 | [
[
[
"# Hypothesis: Are digitised practices causing more failures?\n\n## Hypothesis\n\nWe believe that practices undergoing Lloyd George digitisation have an increased failure rate. \nWe will know this to be true when we look at their data for the last three months, and see that either their failures have increased, or that in general their failures are higher than average.\n\n## Context\n\nFrom the months of May-Aug 2021, we see a steady increase of TPP-->EMIS Large message general failures. A general hypothesis is that this is due to record sizes increasing, which could be due to Lloyd George digitisation. This has prompted a more general hypothesis to identify whether digitisation is impacting failure rates. \n\n## Scope\n\n- Generate a transfer outcomes table for each of the below CCGs split down for May, June, July:\n - Sunderland\n - Fylde and Wyre\n - Chorley and South Ribble \n - Blackpool\n - Birmingham and Solihull \n- Show technical failure rate for each month, for each practice in the CCG\n- Separate out outcomes for transfers in, and transfers out\n- Do this for practices as a sender and as a requester",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nimport paths\nfrom data.practice_metadata import read_asid_metadata",
"_____no_output_____"
],
[
"asid_lookup=read_asid_metadata(\"prm-gp2gp-ods-metadata-preprod\", \"v2/2021/8/organisationMetadata.json\")\n\ntransfer_file_location = \"s3://prm-gp2gp-transfer-data-preprod/v4/\"\n\ntransfer_files = [\n \"2021/5/transfers.parquet\",\n \"2021/6/transfers.parquet\",\n \"2021/7/transfers.parquet\"\n]\ntransfer_input_files = [transfer_file_location + f for f in transfer_files]\n\ntransfers_raw = pd.concat((\n pd.read_parquet(f)\n for f in transfer_input_files\n))\n\ntransfers = transfers_raw\\\n .join(asid_lookup.add_prefix(\"requesting_\"), on=\"requesting_practice_asid\", how=\"left\")\\\n .join(asid_lookup.add_prefix(\"sending_\"), on=\"sending_practice_asid\", how=\"left\")\\\n\ntransfers['month']=transfers['date_requested'].dt.to_period('M')",
"_____no_output_____"
],
[
"def generate_monthly_outcome_breakdown(transfers, columns):\n total_transfers = (\n transfers\n .groupby(columns)\n .size()\n .to_frame(\"Total Transfers\")\n )\n \n transfer_outcomes=pd.pivot_table(\n transfers,\n index=columns,\n columns=[\"status\"],\n aggfunc='size'\n )\n \n\n transfer_outcomes_pc = (\n transfer_outcomes\n .div(total_transfers[\"Total Transfers\"],axis=0)\n .multiply(100)\n .round(2)\n .add_suffix(\" %\")\n )\n \n failed_transfers = (\n transfers\n .assign(failed_transfer=transfers[\"status\"] != \"INTEGRATED_ON_TIME\")\n .groupby(columns)\n .agg({'failed_transfer': 'sum'})\n .rename(columns={'failed_transfer': 'ALL_FAILURE'})\n )\n \n failed_transfers_pc = (\n failed_transfers\n .div(total_transfers[\"Total Transfers\"],axis=0)\n .multiply(100)\n .round(2)\n .add_suffix(\" %\")\n )\n \n \n\n return pd.concat([\n total_transfers,\n transfer_outcomes,\n failed_transfers,\n transfer_outcomes_pc,\n failed_transfers_pc,\n ],axis=1).fillna(0)\n",
"_____no_output_____"
]
],
[
[
"## Generate national transfer outcomes",
"_____no_output_____"
]
],
[
[
"national_metrics_monthly=generate_monthly_outcome_breakdown(transfers, [\"month\"])\nnational_metrics_monthly",
"_____no_output_____"
]
],
[
[
"## Generate digitised CCG transfer outcomes",
"_____no_output_____"
]
],
[
[
"ccgs_to_investigate = [\n \"NHS SUNDERLAND CCG\",\n 'NHS FYLDE AND WYRE CCG',\n 'NHS CHORLEY AND SOUTH RIBBLE CCG',\n 'NHS BLACKPOOL CCG',\n 'NHS BIRMINGHAM AND SOLIHULL CCG'\n]\nis_requesting_ccg_of_interest = transfers.requesting_ccg_name.isin(ccgs_to_investigate)\nis_sending_ccg_of_interest = transfers.sending_ccg_name.isin(ccgs_to_investigate)\n\nrequesting_transfers_of_interest = transfers[is_requesting_ccg_of_interest]\nsending_transfers_of_interest = transfers[is_sending_ccg_of_interest]",
"_____no_output_____"
]
],
[
[
"### Requesting CCGs (Digitised)",
"_____no_output_____"
]
],
[
[
"requesting_ccgs_monthly=generate_monthly_outcome_breakdown(\n transfers=requesting_transfers_of_interest,\n columns=[\"requesting_ccg_name\", \"month\"]\n)\nrequesting_ccgs_monthly",
"_____no_output_____"
]
],
[
[
"### Sending CCGs (Digitised)",
"_____no_output_____"
]
],
[
[
"sending_ccgs_monthly=generate_monthly_outcome_breakdown(\n transfers=sending_transfers_of_interest,\n columns=[\"sending_ccg_name\", \"month\"]\n)\nsending_ccgs_monthly",
"_____no_output_____"
]
],
[
[
"### Requesting practices (digitised)",
"_____no_output_____"
]
],
[
[
"requesting_practices_monthly=generate_monthly_outcome_breakdown(\n transfers=requesting_transfers_of_interest,\n columns=[\"requesting_ccg_name\", \"requesting_practice_name\", \"requesting_practice_ods_code\", \"requesting_supplier\", \"month\"]\n)\nrequesting_practices_monthly",
"_____no_output_____"
]
],
[
[
"### Sending practices (digitised)",
"_____no_output_____"
]
],
[
[
"sending_practices_monthly=generate_monthly_outcome_breakdown(\n transfers=sending_transfers_of_interest,\n columns=[\"sending_ccg_name\", \"sending_practice_name\", \"sending_practice_ods_code\", \"sending_supplier\", \"month\"]\n)\nsending_practices_monthly",
"_____no_output_____"
]
],
[
[
"## Looking at failure rate trends by CCG when requesting a record",
"_____no_output_____"
]
],
[
[
"barplot_config = {\n 'color': ['lightsteelblue', 'cornflowerblue', 'royalblue'],\n 'edgecolor':'black',\n 'kind':'bar',\n 'figsize': (15,6),\n 'rot': 30\n}",
"_____no_output_____"
],
[
"def requesting_ccg_barplot(column_name, title):\n (\n pd\n .concat({'All CCGs': national_metrics_monthly}, names=['requesting_ccg_name'])\n .append(requesting_ccgs_monthly)\n .unstack()\n .plot(\n y=column_name,\n title=title,\n **barplot_config\n )\n )\n\nrequesting_ccg_barplot('ALL_FAILURE %', 'Total Failure Percentage (Digitised CCGs - Requesting)')\nrequesting_ccg_barplot('TECHNICAL_FAILURE %', 'Technical Failure Percentage (Digitised CCGs - Requesting)')\nrequesting_ccg_barplot('PROCESS_FAILURE %', 'Process Failure Percentage (Digitised CCGs - Requesting)')\nrequesting_ccg_barplot('UNCLASSIFIED_FAILURE %', 'Unclassified Failure Percentage (Digitised CCGs - Requesting)')",
"_____no_output_____"
]
],
[
[
"## Looking at failure rate trends by CCG when sending a record",
"_____no_output_____"
]
],
[
[
"def sending_ccg_barplot(column_name, title):\n (\n pd\n .concat({'All CCGs': national_metrics_monthly}, names=['sending_ccg_name'])\n .append(sending_ccgs_monthly)\n .unstack()\n .plot(\n y=column_name,\n title=title,\n **barplot_config\n )\n )\n\nsending_ccg_barplot('ALL_FAILURE %', 'Total Failure Percentage (Digitised CCGs - Sending)')\nsending_ccg_barplot('TECHNICAL_FAILURE %', 'Technical Failure Percentage (Digitised CCGs - Sending)')\nsending_ccg_barplot('PROCESS_FAILURE %', 'Process Failure Percentage (Digitised CCGs - Sending)')\nsending_ccg_barplot('UNCLASSIFIED_FAILURE %', 'Unclassified Failure Percentage (Digitised CCGs - Sending)')",
"_____no_output_____"
]
],
[
[
"## Write CCG transfer outcomes by sending and requesting practice to Excel",
"_____no_output_____"
]
],
[
[
"with pd.ExcelWriter('PRMT-2332-Digitisation-Failure-Rates-May-July-2021.xlsx') as writer:\n national_metrics_monthly.to_excel(writer, sheet_name=\"National Baseline\")\n requesting_ccgs_monthly.to_excel(writer, sheet_name=\"Digitised CCGs (Req)\")\n sending_ccgs_monthly.to_excel(writer, sheet_name=\"Digitised CCGs (Send)\")\n requesting_practices_monthly.to_excel(writer, sheet_name=\"Digitised Practices (Req)\")\n sending_practices_monthly.to_excel(writer, sheet_name=\"Digitised Practices (Send)\")",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07df20a544cbaa649df02be8162459ba8ed5fd1 | 28,031 | ipynb | Jupyter Notebook | Developement/Data_Collection_Web_scraping.ipynb | Vivek1258/IBM-data-science-capstone | 8da917b62f032ad033825d1d7f7f11f4c9d90a47 | [
"FTL"
] | 1 | 2021-01-15T09:30:14.000Z | 2021-01-15T09:30:14.000Z | Developement/Data_Collection_Web_scraping.ipynb | Vivek1258/IBM-data-science-capstone | 8da917b62f032ad033825d1d7f7f11f4c9d90a47 | [
"FTL"
] | null | null | null | Developement/Data_Collection_Web_scraping.ipynb | Vivek1258/IBM-data-science-capstone | 8da917b62f032ad033825d1d7f7f11f4c9d90a47 | [
"FTL"
] | null | null | null | 35.799489 | 275 | 0.414577 | [
[
[
"# Data Collection Using Web Scraping \n\n## To solve this problem we will need the following data:\n\n● List of neighborhoods in Pune.\n\n● Latitude and longitude coordinates of those neighborhoods.\n\n● Venue data for each neighborhood.\n\n## Sources\n● For the list of neighborhoods, I used\n(https://en.wikipedia.org/wiki/Category:Neighbourhoods_in_Pune)\n\n● For latitude and longitude coordinates: Python Geocoder Package\n(https://geocoder.readthedocs.io/)\n\n● For venue data: Foursquare API (https://foursquare.com/)\n",
"_____no_output_____"
],
[
"## Methods to extract data from Sources\n\nTo extract the data we will use Python packages such as Requests, BeautifulSoup and Geocoder.\n\nWe will use the Requests and BeautifulSoup packages for web scraping (https://en.wikipedia.org/wiki/Category:Neighbourhoods_in_Pune) to get the list of neighborhoods in Pune, and the Geocoder package to get the latitude and longitude coordinates of each neighborhood.\n\nThen we will use Folium to plot these neighborhoods on the map. \n\nAfter that, we will use the Foursquare API to get the venue data for those neighborhoods. The Foursquare API provides many categories of venue data, but we are particularly interested in the supermarket category in order to help us solve the business problem.",
"_____no_output_____"
],
[
"## Imports ",
"_____no_output_____"
]
],
[
[
"\nimport numpy as np # library to handle data in a vectorized manner\n\nimport pandas as pd # library for data analysis\npd.set_option(\"display.max_columns\", None)\npd.set_option(\"display.max_rows\", None)\n\nimport json # library to handle JSON files\n\nfrom geopy.geocoders import Nominatim # convert an address into latitude and longitude values\n!pip install geocoder\nimport geocoder # to get coordinates\n!pip install requests \nimport requests # library to handle requests\nfrom bs4 import BeautifulSoup # library to parse HTML and XML documents\n\nfrom pandas.io.json import json_normalize # transform JSON file into a pandas dataframe\n\nprint(\"Libraries imported.\")",
"_____no_output_____"
]
],
[
[
"## Collecting the neighbourhood data using the Requests, BeautifulSoup, and Geocoder libraries",
"_____no_output_____"
]
],
[
[
"\ndata = requests.get(\"https://en.wikipedia.org/wiki/Category:Neighbourhoods_in_Pune\").text\n# parse data from the html into a beautifulsoup object\nsoup = BeautifulSoup(data, 'html.parser')\n# create a list to store neighborhood data\nneighborhood_List = []\n# append the data into the list\nfor row in soup.find_all(\"div\", class_=\"mw-category\")[0].findAll(\"li\"):\n neighborhood_List.append(row.text)\n\n# create a new DataFrame from the list\nPune_df = pd.DataFrame({\"Neighborhood\": neighborhood_List})\n\nPune_df.tail()",
"_____no_output_____"
],
[
"\n# define a function to get coordinates\ndef get_cord(neighborhood):\n \n coords = None\n # loop until you get the coordinates\n while(coords is None):\n g = geocoder.arcgis('{}, Pune, Maharashtra'.format(neighborhood))\n coords = g.latlng\n return coords",
"_____no_output_____"
],
[
"\n# create a list and store the coordinates \ncoords = [ get_cord(neighborhood) for neighborhood in Pune_df[\"Neighborhood\"].tolist() ]",
"_____no_output_____"
],
[
"coords[:10]",
"_____no_output_____"
],
[
"df_coords = pd.DataFrame(coords, columns=['Latitude', 'Longitude'])",
"_____no_output_____"
],
[
"\n# merge the coordinates into the original dataframe\nPune_df['Latitude'] = df_coords['Latitude']\nPune_df['Longitude'] = df_coords['Longitude']\n\n# check the neighborhoods and the coordinates\nprint(Pune_df.shape)\nPune_df.head(10)",
"(55, 3)\n"
],
[
"# save the DataFrame as CSV file\nPune_df.to_csv(\"Pune_df.csv\", index=False)",
"_____no_output_____"
]
],
[
[
"## Collecting the neighbourhood venue data using the Foursquare API",
"_____no_output_____"
]
],
[
[
"\n# define Foursquare Credentials and Version\nCLIENT_ID = '5HUDVH14DMECWUAFI2MICONBTTDPW1CCL1C4TFGE3FEHEUHJ' # your Foursquare ID\nCLIENT_SECRET = 'R0WIH5UIW2SADKBUW4B4WMY2QWBBT0Q02IURAXQXVJZMTDIV' # your Foursquare Secret\nVERSION = '20180605' # Foursquare API version\n\nprint('Your credentials:')\nprint('CLIENT_ID: ' + CLIENT_ID)\nprint('CLIENT_SECRET:' + CLIENT_SECRET)",
"Your credentials:\nCLIENT_ID: 5HUDVH14DMECWUAFI2MICONBTTDPW1CCL1C4TFGE3FEHEUHJ\nCLIENT_SECRET:R0WIH5UIW2SADKBUW4B4WMY2QWBBT0Q02IURAXQXVJZMTDIV\n"
],
[
"\nradius = 3000\nLIMIT = 150\n\nvenues = []\n\nfor lat, long, neighborhood in zip(Pune_df['Latitude'], Pune_df['Longitude'], Pune_df['Neighborhood']):\n \n # create the API request URL\n url = \"https://api.foursquare.com/v2/venues/explore?client_id={}&client_secret={}&v={}&ll={},{}&radius={}&limit={}\".format(\n CLIENT_ID,\n CLIENT_SECRET,\n VERSION,\n lat,\n long,\n radius, \n LIMIT)\n \n # make the GET request\n results = requests.get(url).json()[\"response\"]['groups'][0]['items']\n \n # return only relevant information for each nearby venue\n for venue in results:\n venues.append((\n neighborhood,\n lat, \n long, \n venue['venue']['name'], \n venue['venue']['location']['lat'], \n venue['venue']['location']['lng'], \n venue['venue']['categories'][0]['name']))\n",
"_____no_output_____"
],
[
"# convert the venues list into a new DataFrame\nvenues_df = pd.DataFrame(venues)\n\n# define the column names\nvenues_df.columns = ['Neighborhood', 'Latitude', 'Longitude', 'VenueName', 'VenueLatitude', 'VenueLongitude', 'VenueCategory']\n\nprint(venues_df.shape)",
"(4313, 7)\n"
],
[
"venues_df.head()",
"_____no_output_____"
],
[
"print('There are {} uniques categories.'.format(len(venues_df['VenueCategory'].unique())))\n",
"There are 142 uniques categories.\n"
],
[
"# print out the list of categories\nvenues_df['VenueCategory'].unique()",
"_____no_output_____"
],
[
"venues_df.to_csv(\"venues_df.csv\")",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07df49eb874d22d29bea17fe99cf158072bab03 | 993,126 | ipynb | Jupyter Notebook | Monte_Carlo.ipynb | HrantDavtyan/BA_BA | d2499812e5148e16558a70a6889fdf8b3e5a417e | [
"Apache-2.0"
] | null | null | null | Monte_Carlo.ipynb | HrantDavtyan/BA_BA | d2499812e5148e16558a70a6889fdf8b3e5a417e | [
"Apache-2.0"
] | null | null | null | Monte_Carlo.ipynb | HrantDavtyan/BA_BA | d2499812e5148e16558a70a6889fdf8b3e5a417e | [
"Apache-2.0"
] | 2 | 2018-02-27T11:08:49.000Z | 2020-02-14T06:29:50.000Z | 3,377.979592 | 896,794 | 0.950321 | [
[
[
"# Monte Carlo Simulation\n\nThis notebook provides an introduction to Monte Carlo simulation using a toy example. The example is built upon the famous casino game known as **Roulette**. During the game, players (bettors) make a bet on an integer, colour or a range, and win if their bet was correct. Roulettes usually have 37 slots including one green (0), eighteen red (1-18) and eighteen black (19-36) slots.\n\nIn our case, we assume the bettor is betting on the range 1-18 (which is equivalent to betting on red) and wins if the randomly tossed ball ends up on a slot in that range. Otherwise, the bettor loses. If the former (a win), the budget is increased by the bet amount; if the latter (a loss), the budget is reduced by that amount.\n\nIn this example we will compare two bettors:\n- **Simple bettor** - has a fixed budget, has decided how many periods to play beforehand, and always bets the same amount independent of any other factor.\n- **Smart bettor** - has a fixed budget, has decided how many periods to play beforehand, bets the initial amount after winning, but doubles the bet after losing.\n\nThe simulation is expected to show that the **Simple bettor** is a clear loser, while the **Smart bettor**, if the budget is large enough, can be a clear winner (in average terms).\n\nAs the ball is tossed randomly, we will need a random integer generator. We will also plot the results, so plotting libraries will be handy. We will start by developing the **spinner** function and then write separate functions for the simple and smart bettors. We will simulate both multiple times (e.g. 100 runs) to see what happens.\n\nNote: In both functions there is a component to make sure the budget does not become negative. While it is active for the simple bettor, for the smart one it is commented out so you can see what happens if no condition is set on the budget (i.e. the budget can even become negative). 
To make it more realistic, you are encouraged to remove the **#** in **smart bettor** to uncomment the following component and see what happens:\n```\n# if budget<=0:\n#     break\n```",
"_____no_output_____"
]
],
[
[
"import random # to generate random inetegers\nimport matplotlib.pyplot as plt # to plot simulation\nimport seaborn as sns # to plot distribution\nplt.rc('figure', figsize=(20.0, 10.0)) # make the default plots bigger",
"_____no_output_____"
],
[
"# the random spinner\ndef spinner():\n    # 37 slots: one green (0), red 1-18, black 19-36; randint bounds are inclusive\n    slot = random.randint(0, 36)\n    if slot == 0:\n        return \"lost\"\n    elif 1 <= slot <= 18:\n        return \"won\"\n    elif 19 <= slot <= 36:\n        return \"lost\"",
"_____no_output_____"
],
[
"# the simple bettor\ndef simple_bettor(budget,bet,periods):\n X_axis = []\n Y_axis = []\n\n currentPeriod = 1\n while currentPeriod <= periods:\n result = spinner()\n if result == \"won\":\n budget = budget + bet\n elif result == \"lost\":\n budget = budget - bet\n if budget<=0:\n break\n X_axis.append(currentPeriod)\n Y_axis.append(budget)\n currentPeriod = currentPeriod + 1\n \n plt.plot(X_axis,Y_axis)\n return Y_axis[-1]",
"_____no_output_____"
],
[
"# the smart/doubler bettor\ndef smart_bettor(budget,bet,periods):\n X_axis = []\n Y_axis = []\n\n currentPeriod = 1\n initial_bet = bet\n while currentPeriod <= periods:\n result = spinner()\n if result == \"won\":\n budget = budget + bet\n bet = initial_bet\n elif result == \"lost\":\n budget = budget - bet\n bet = bet*2\n \n #if budget<=0:\n # break\n X_axis.append(currentPeriod)\n Y_axis.append(budget)\n currentPeriod = currentPeriod + 1\n \n plt.subplot(121) \n plt.plot(X_axis,Y_axis)\n return Y_axis[-1]",
"_____no_output_____"
],
[
"# the simulation of multiple possible futures (for simple)\nfutures = 1\nwhile futures < 101:\n simple_bettor(10000,100,1000)\n futures = futures + 1\n\nplt.title('Simple bettor')\nplt.ylabel('Budget')\nplt.xlabel('Periods')\nplt.show()",
"_____no_output_____"
],
[
"# the simulation of multiple possible futures (for smart)\nfutures = 1\noutcomes = []\nwhile futures < 101:\n outcomes.append(smart_bettor(10000,100,1000))\n futures = futures + 1\n\nplt.title('Smart bettor')\nplt.ylabel('Budget')\nplt.xlabel('Periods')\nplt.subplot(122)\nsns.distplot(outcomes,bins=25,vertical=True)\n#plt.subplots_adjust(wspace=0.5)\nplt.show()",
"/usr/local/lib/python3.6/dist-packages/matplotlib/cbook/deprecation.py:106: MatplotlibDeprecationWarning: Adding an axes using the same arguments as a previous axes currently reuses the earlier instance. In a future version, a new instance will always be created and returned. Meanwhile, this warning can be suppressed, and the future behavior ensured, by passing a unique label to each axes instance.\n warnings.warn(message, mplDeprecation, stacklevel=1)\n"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07e161507f29bc4c16ec06e72682520f85c8f99 | 13,351 | ipynb | Jupyter Notebook | Chapter07/Exercise7.07/Exercise7.07.ipynb | khieunguyen/The-Data-Science-Workshop | 52cab305e6e2e8bb6820cf488ddb6e16b5567ac9 | [
"MIT"
] | 1 | 2020-05-08T08:59:30.000Z | 2020-05-08T08:59:30.000Z | Chapter07/Exercise7.07/Exercise7.07.ipynb | khieunguyen/The-Data-Science-Workshop | 52cab305e6e2e8bb6820cf488ddb6e16b5567ac9 | [
"MIT"
] | 1 | 2022-03-12T01:03:16.000Z | 2022-03-12T01:03:16.000Z | Chapter07/Exercise7.07/Exercise7.07.ipynb | khieunguyen/The-Data-Science-Workshop | 52cab305e6e2e8bb6820cf488ddb6e16b5567ac9 | [
"MIT"
] | null | null | null | 30.137698 | 169 | 0.356677 | [
[
[
"# Grid Search with Cross Validation",
"_____no_output_____"
]
],
[
[
"# import libraries\nimport pandas as pd",
"_____no_output_____"
],
[
"# data doesn't have headers, so let's create headers\n_headers = ['buying', 'maint', 'doors', 'persons', 'lug_boot', 'safety', 'car']\n# read in cars dataset\ndf = pd.read_csv('https://raw.githubusercontent.com/PacktWorkshops/The-Data-Science-Workshop/master/Chapter07/Dataset/car.data', names=_headers, index_col=None)\ndf.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 1728 entries, 0 to 1727\nData columns (total 7 columns):\nbuying 1728 non-null object\nmaint 1728 non-null object\ndoors 1728 non-null object\npersons 1728 non-null object\nlug_boot 1728 non-null object\nsafety 1728 non-null object\ncar 1728 non-null object\ndtypes: object(7)\nmemory usage: 94.6+ KB\n"
],
[
"# encode categorical variables\n_df = pd.get_dummies(df, columns=['buying', 'maint', 'doors', 'persons', 'lug_boot', 'safety'])\n_df.head()",
"_____no_output_____"
],
[
"# separate features and labels DataFrames\nfeatures = _df.drop(['car'], axis=1).values\nlabels = _df[['car']].values",
"_____no_output_____"
],
[
"import numpy as np\nfrom sklearn.tree import DecisionTreeClassifier\nfrom sklearn.model_selection import GridSearchCV",
"_____no_output_____"
],
[
"clf = DecisionTreeClassifier()",
"_____no_output_____"
],
[
"params = {'max_depth': np.arange(1, 8)}",
"_____no_output_____"
],
[
"clf_cv = GridSearchCV(clf, param_grid=params, cv=5)",
"_____no_output_____"
],
[
"clf_cv.fit(features, labels)",
"_____no_output_____"
],
[
"print(\"Tuned Decision Tree Parameters: {}\".format(clf_cv.best_params_))",
"Tuned Decision Tree Parameters: {'max_depth': 2}\n"
],
[
"print(\"Best score is {}\".format(clf_cv.best_score_))",
"Best score is 0.7777777777777778\n"
],
[
"model = clf_cv.best_estimator_\nmodel",
"_____no_output_____"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07e2020c243eb6b36cb44943f740eab0f1a0cd6 | 85,559 | ipynb | Jupyter Notebook | notebooks/tabulate_results_v1.ipynb | subhadarship/nlp4if-2021 | f1a62d96bff1cefcbf3ec61186aa1ac9b2739482 | [
"MIT"
] | 1 | 2021-09-12T10:11:39.000Z | 2021-09-12T10:11:39.000Z | notebooks/tabulate_results_v1.ipynb | subhadarship/nlp4if-2021 | f1a62d96bff1cefcbf3ec61186aa1ac9b2739482 | [
"MIT"
] | null | null | null | notebooks/tabulate_results_v1.ipynb | subhadarship/nlp4if-2021 | f1a62d96bff1cefcbf3ec61186aa1ac9b2739482 | [
"MIT"
] | 1 | 2022-01-17T15:36:44.000Z | 2022-01-17T15:36:44.000Z | 110.256443 | 6,424 | 0.578022 | [
[
[
"# Tabulate results\n",
"_____no_output_____"
]
],
[
[
"import os\nimport sys\nfrom typing import Tuple\nimport pandas as pd\nfrom tabulate import tabulate\nfrom tqdm import tqdm\nsys.path.append('../src')\nfrom read_log_file import read_log_file",
"_____no_output_____"
],
[
"LOG_HOME_DIR = os.path.join('../logs_v1/')\nassert os.path.isdir(LOG_HOME_DIR)",
"_____no_output_____"
],
[
"MODEL_NAMES = ['logistic_regression', 'transformer_encoder', 'bert-base-uncased', 'bert-base-multilingual-cased']",
"_____no_output_____"
],
[
"SETUPS = ['zero', 'few50', 'few100', 'few150', 'few200', 'full', 'trg']",
"_____no_output_____"
],
[
"def get_best_score_from_dict(di: dict) -> dict:\n \"\"\"Get max value from a dict\"\"\"\n keys_with_max_val = []\n # find max value\n max_val = -float('inf')\n for k, v in di.items():\n if v > max_val:\n max_val = v\n # find all keys with max value\n for k, v in di.items():\n if v == max_val:\n keys_with_max_val.append(k)\n return {\n 'k': keys_with_max_val,\n 'v': max_val,\n }",
"_____no_output_____"
],
[
"def create_best_results_df(langs: str) -> Tuple[pd.DataFrame, pd.DataFrame]:\n results_dict = {}\n for model_name in MODEL_NAMES:\n results_dict[model_name] = {}\n log_dir = os.path.join(LOG_HOME_DIR, langs, model_name)\n log_filenames = os.listdir(log_dir)\n for fname in log_filenames:\n results_dict[model_name][fname] = read_log_file(\n log_file_path=os.path.join(log_dir, fname),\n plot=False,\n verbose=False,\n )['best_val_metrics']['f1']\n\n best_results_dict = {'Setup': SETUPS}\n best_hparams_dict = {'Setup': SETUPS}\n best_results_dict.update({model_name: [] for model_name in MODEL_NAMES})\n best_hparams_dict.update({model_name: [] for model_name in MODEL_NAMES})\n for model_name in MODEL_NAMES:\n for setup in SETUPS:\n best_score = get_best_score_from_dict(\n {k: v for k, v in results_dict[model_name].items() if k.startswith(f'{setup}_')}\n )\n best_results_dict[model_name].append(\n best_score['v']\n )\n best_hparams_dict[model_name].append(\n best_score['k']\n )\n\n\n best_results_df = pd.DataFrame(best_results_dict)\n best_hparams_df = pd.DataFrame(best_hparams_dict)\n return best_results_df, best_hparams_df",
"_____no_output_____"
],
[
"def highlight_best_score(df: pd.DataFrame) -> pd.DataFrame:\n \"\"\"Highlight best score in each row\"\"\"\n return df.style.apply(lambda x: ['background: red' if isinstance(v, float) and v == max(x.iloc[1:]) else '' for v in x], axis=1)",
"_____no_output_____"
],
[
"def tabulate_markdown(df: pd.DataFrame) -> str:\n \"\"\"Tabulate in markdown format and bold best scores in each row\"\"\"\n df = df.round(4)\n for model_name in MODEL_NAMES:\n df[model_name] = df[model_name].astype(str)\n for idx in range(len(df)):\n max_val = max(float(df.iloc[idx][model_name]) for model_name in MODEL_NAMES)\n for model_name in MODEL_NAMES:\n cell_val = float(df.iloc[idx][model_name])\n if cell_val == max_val:\n df.at[idx, model_name] = f'**{cell_val}**'\n else:\n df.at[idx, model_name] = f'{cell_val}'\n\n return tabulate(df, headers='keys', showindex=False, tablefmt='github')\n",
"_____no_output_____"
],
[
"best_results_dfs_dict = {}\nbest_hparams_dfs_dict = {}\nfor langs in tqdm(['enbg', 'enar', 'bgen', 'bgar', 'aren', 'arbg']):\n best_results_dfs_dict[langs], best_hparams_dfs_dict[langs] = create_best_results_df(langs)",
"100%|██████████| 6/6 [00:19<00:00, 3.30s/it]\n"
]
],
[
[
"## en-bg",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['enbg'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['enbg']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.5083 | 0.7916 | 0.8032 | **0.8042** |\n| few50 | 0.669 | 0.8059 | **0.8113** | 0.8106 |\n| few100 | 0.7891 | 0.8085 | 0.8123 | **0.8157** |\n| few150 | 0.8014 | 0.8129 | **0.8208** | 0.815 |\n| few200 | 0.8067 | 0.8119 | 0.813 | **0.816** |\n| full | 0.8171 | 0.8122 | 0.8199 | **0.8268** |\n| trg | 0.8138 | 0.8096 | 0.8206 | **0.8252** |\n"
],
[
"best_hparams_dfs_dict['enbg']",
"_____no_output_____"
]
],
[
[
"## en-ar",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['enar'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['enar']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.4596 | 0.5436 | 0.5417 | **0.627** |\n| few50 | 0.6026 | 0.6284 | 0.5884 | **0.64** |\n| few100 | 0.5997 | 0.656 | 0.6316 | **0.6999** |\n| few150 | 0.6201 | 0.7086 | 0.6557 | **0.7272** |\n| few200 | 0.6222 | **0.7141** | 0.5805 | 0.7061 |\n| full | 0.6222 | **0.7141** | 0.5805 | 0.7061 |\n| trg | 0.6445 | 0.6953 | 0.5817 | **0.7126** |\n"
],
[
"best_hparams_dfs_dict['enar']",
"_____no_output_____"
]
],
[
[
"## bg-en",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['bgen'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['bgen']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.4391 | **0.5281** | 0.4761 | 0.5082 |\n| few50 | 0.5871 | **0.605** | 0.6045 | 0.5959 |\n| few100 | 0.5901 | 0.6001 | 0.6339 | **0.6484** |\n| few150 | 0.5929 | 0.5944 | 0.6394 | **0.6488** |\n| few200 | 0.5925 | 0.5997 | 0.6472 | **0.6597** |\n| full | 0.5891 | 0.616 | **0.6828** | 0.6706 |\n| trg | 0.5689 | 0.6245 | **0.6776** | 0.6497 |\n"
],
[
"best_hparams_dfs_dict['bgen']",
"_____no_output_____"
]
],
[
[
"## bg-ar",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['bgar'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['bgar']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.4962 | **0.583** | 0.5184 | 0.5771 |\n| few50 | 0.5657 | **0.6747** | 0.5571 | 0.6064 |\n| few100 | 0.6453 | 0.6461 | 0.6543 | **0.6666** |\n| few150 | 0.6422 | 0.6882 | 0.6025 | **0.6981** |\n| few200 | 0.6543 | **0.7094** | 0.6089 | 0.6761 |\n| full | 0.6543 | **0.7094** | 0.6089 | 0.6761 |\n| trg | 0.4806 | 0.6645 | 0.5817 | **0.7126** |\n"
],
[
"best_hparams_dfs_dict['bgar']",
"_____no_output_____"
]
],
[
[
"## ar-en",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['aren'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['aren']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.2009 | 0.4899 | 0.4996 | **0.573** |\n| few50 | 0.5232 | 0.5878 | **0.6092** | 0.5978 |\n| few100 | 0.511 | 0.5826 | **0.6529** | 0.601 |\n| few150 | 0.5377 | 0.6074 | **0.6454** | 0.621 |\n| few200 | 0.5715 | 0.6032 | 0.6355 | **0.6418** |\n| full | 0.5651 | 0.5894 | 0.6644 | **0.6813** |\n| trg | 0.5689 | 0.6245 | **0.6776** | 0.6497 |\n"
],
[
"best_hparams_dfs_dict['aren']",
"_____no_output_____"
]
],
[
[
"## ar-bg",
"_____no_output_____"
]
],
[
[
"highlight_best_score(best_results_dfs_dict['arbg'])",
"_____no_output_____"
],
[
"print(tabulate_markdown(best_results_dfs_dict['arbg']))",
"| Setup | logistic_regression | transformer_encoder | bert-base-uncased | bert-base-multilingual-cased |\n|---------|-----------------------|-----------------------|---------------------|--------------------------------|\n| zero | 0.4588 | 0.7928 | **0.8079** | 0.8027 |\n| few50 | 0.7946 | 0.7911 | 0.8099 | **0.8145** |\n| few100 | 0.7972 | 0.8149 | 0.8137 | **0.8194** |\n| few150 | 0.8046 | 0.8144 | 0.818 | **0.8226** |\n| few200 | 0.8023 | 0.8095 | 0.8119 | **0.8208** |\n| full | 0.8136 | 0.8153 | 0.8218 | **0.8264** |\n| trg | 0.8138 | 0.8096 | 0.8206 | **0.8252** |\n"
],
[
"best_hparams_dfs_dict['arbg']",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
d07e29c2981daa75bb7f5c7aedb8d4355ac310ee | 41,182 | ipynb | Jupyter Notebook | Project4/Project4_sul_part4 (1).ipynb | wiggs555/cse7324project | 6bc6e51ccbdbf0b80abbb0e7f0a64ae150831abd | [
"Unlicense"
] | null | null | null | Project4/Project4_sul_part4 (1).ipynb | wiggs555/cse7324project | 6bc6e51ccbdbf0b80abbb0e7f0a64ae150831abd | [
"Unlicense"
] | null | null | null | Project4/Project4_sul_part4 (1).ipynb | wiggs555/cse7324project | 6bc6e51ccbdbf0b80abbb0e7f0a64ae150831abd | [
"Unlicense"
] | 1 | 2019-02-05T07:45:51.000Z | 2019-02-05T07:45:51.000Z | 60.740413 | 14,704 | 0.630397 | [
[
[
"# Preparation ",
"_____no_output_____"
]
],
[
[
"# dependencies\nimport pandas as pd\nimport numpy as np\nimport missingno as msno \nimport matplotlib.pyplot as plt\nimport re\nfrom sklearn.model_selection import train_test_split\n\nfrom textwrap import wrap\nfrom sklearn.preprocessing import StandardScaler\nimport warnings\nwarnings.filterwarnings(\"ignore\")\nimport math\n%matplotlib inline",
"_____no_output_____"
],
[
"# import data\nshelter_outcomes = pd.read_csv(\"C:/Users/sulem/OneDrive/Desktop/machin learnign/Project3/aac_shelter_outcomes.csv\")\n# filter animal type for just cats\ncats = shelter_outcomes[shelter_outcomes['animal_type'] == 'Cat']\n#print(cats.head())\n\n# remove age_upon_outcome and recalculate to standard units (days)\nage = cats.loc[:,['datetime', 'date_of_birth']]\n# convert to datetime\nage.loc[:,'datetime'] = pd.to_datetime(age['datetime'])\nage.loc[:,'date_of_birth'] = pd.to_datetime(age['date_of_birth'])\n# calculate cat age in days\ncats.loc[:,'age'] = (age.loc[:,'datetime'] - age.loc[:,'date_of_birth']).dt.days\n# get dob info\ncats['dob_month'] = age.loc[:, 'date_of_birth'].dt.month\ncats['dob_day'] = age.loc[:, 'date_of_birth'].dt.day\ncats['dob_dayofweek'] = age.loc[:, 'date_of_birth'].dt.dayofweek\n# get month from datetime\ncats['month'] = age.loc[:,'datetime'].dt.month\n# get day of month\ncats['day'] = age.loc[:,'datetime'].dt.day\n# get day of week\ncats['dayofweek'] = age.loc[:, 'datetime'].dt.dayofweek\n# get hour of day\ncats['hour'] = age.loc[:, 'datetime'].dt.hour\n# get quarter\ncats['quarter'] = age.loc[:, 'datetime'].dt.quarter\n\n# clean up breed attribute\n# get breed attribute for processing\n# convert to lowercase, remove mix and strip whitespace\n# remove space in 'medium hair' to match 'longhair' and 'shorthair'\n# split on either space or '/'\nbreed = cats.loc[:, 'breed'].str.lower().str.replace('mix', '').str.replace('medium hair', 'mediumhair').str.strip().str.split('/', expand=True)\ncats['breed'] = breed[0]\ncats['breed1'] = breed[1]\n\n# clean up color attribute\n# convert to lowercase\n# strip spaces\n# split on '/'\ncolor = cats.loc[:, 'color'].str.lower().str.strip().str.split('/', expand=True)\ncats['color'] = color[0]\ncats['color1'] = color[1]\n\n# clean up sex_upon_outcome\nsex = cats['sex_upon_outcome'].str.lower().str.strip().str.split(' ', expand=True)\nsex[0].replace('spayed', True, 
inplace=True)\nsex[0].replace('neutered', True, inplace=True)\nsex[0].replace('intact', False, inplace=True)\nsex[1].replace(np.nan, 'unknown', inplace=True)\ncats['spayed_neutered'] = sex[0]\ncats['sex'] = sex[1]\n\n# add in domesticated attribute\ncats['domestic'] = np.where(cats['breed'].str.contains('domestic'), 1, 0)\n\n# combine outcome and outcome subtype into a single attribute\ncats['outcome_subtype'] = cats['outcome_subtype'].str.lower().str.replace(' ', '-').fillna('unknown')\ncats['outcome_type'] = cats['outcome_type'].str.lower().str.replace(' ', '-').fillna('unknown')\ncats['outcome'] = cats['outcome_type'] + '_' + cats['outcome_subtype']\n\n# drop unnecessary columns\ncats.drop(columns=['animal_id', 'name', 'animal_type', 'age_upon_outcome', 'date_of_birth', 'datetime', 'monthyear', 'sex_upon_outcome', 'outcome_subtype', 'outcome_type'], inplace=True)\n#print(cats['outcome'].value_counts())\n\ncats.head()\n",
"_____no_output_____"
],
[
"cats.drop(columns=['breed1'], inplace=True)\n# Breed, Color, Color1, Spayed_Neutered and Sex attributes need to be one hot encoded\ncats_ohe = pd.get_dummies(cats, columns=['breed', 'color', 'color1', 'spayed_neutered', 'sex'])\ncats_ohe.head()\nout_t={'euthanasia_suffering' : 0, 'died_in-kennel' : 0, 'return-to-owner_unknown' : 0, 'transfer_partner' : 1, 'euthanasia_at-vet' : 2, 'adoption_foster' : 3, 'died_in-foster' : 0, 'transfer_scrp' : 4, 'euthanasia_medical' : 0, 'transfer_snr' : 0, 'died_enroute' : 0, 'rto-adopt_unknown' : 0, 'missing_in-foster' : 0, 'adoption_offsite' : 0, 'adoption_unknown' :5,'euthanasia_rabies-risk' : 0, 'unknown_unknown' : 0, 'adoption_barn' : 0, 'died_unknown' : 0, 'died_in-surgery' : 0, 'euthanasia_aggressive' : 0, 'euthanasia_unknown' : 0, 'missing_unknown' : 0, 'missing_in-kennel' : 0, 'missing_possible-theft' : 0, 'died_at-vet' : 0, 'disposal_unknown' : 0, 'euthanasia_underage' : 0, 'transfer_barn' : 0}\n# output is converted from string to categories; 0 to 5 represent each outcome class\n# separate outcome from data\noutcome = cats_ohe['outcome']\ncats_ohe.drop(columns=['outcome'])\n\nprint(cats_ohe.head())\n\n# split the data\nX_train, X_test, y_train, y_test = train_test_split(cats_ohe, outcome, test_size=0.2, random_state=0)\nX_train.drop(columns=['outcome'], inplace=True)\ny_train = [out_t[item] for item in y_train]\n#print(X_train.shape, X_test.shape, y_train.shape, y_test.shape)",
" age dob_month dob_day dob_dayofweek month day dayofweek hour \\\n0 15 7 7 0 7 22 1 16 \n8 59 6 16 0 8 14 3 18 \n9 95 3 26 2 6 29 6 17 \n10 366 3 27 2 3 28 4 14 \n17 24 12 16 0 1 9 3 19 \n\n quarter domestic ... color1_tortie point color1_tricolor \\\n0 3 1 ... 0 0 \n8 3 1 ... 0 0 \n9 2 1 ... 0 0 \n10 1 1 ... 0 0 \n17 1 1 ... 0 0 \n\n color1_white color1_yellow spayed_neutered_False spayed_neutered_True \\\n0 0 0 1 0 \n8 1 0 1 0 \n9 0 0 0 1 \n10 1 0 0 1 \n17 1 0 1 0 \n\n spayed_neutered_unknown sex_female sex_male sex_unknown \n0 0 0 1 0 \n8 0 1 0 0 \n9 0 1 0 0 \n10 0 1 0 0 \n17 0 0 1 0 \n\n[5 rows x 141 columns]\n"
],
[
"x_train_ar=X_train.values\ny_target_ar=np.asarray(y_train)\nx_train_ar = StandardScaler().fit(x_train_ar).transform(x_train_ar)\nprint(x_train_ar.shape)\nprint(y_target_ar.shape)\nunique, counts = np.unique(y_target_ar, return_counts=True)\nnp.asarray((unique, counts))\nplt.pie(np.asarray(( counts)), labels=np.unique(y_target_ar), startangle=90, autopct='%.1f%%')\nplt.show()",
"(23537, 140)\n(23537,)\n"
]
],
[
[
"# Evaluation ",
"_____no_output_____"
],
[
"# Modeling ",
"_____no_output_____"
],
[
"# Exceptional Work ",
"_____no_output_____"
]
],
[
[
"# Example adapted from https://github.com/rasbt/python-machine-learning-book/blob/master/code/ch12/ch12.ipynb\n# Original Author: Sebastian Raschka\n\n# This is the optional book we use in the course, excellent intuitions and straightforward programming examples\n# please note, however, that this code has been manipulated to reflect our assumptions and notation.\nimport numpy as np\nfrom scipy.special import expit\nimport pandas as pd\nimport sys\n\n# start with a simple base classifier, which can't be fit or predicted\n# it only has internal classes to be used by classes that will subclass it\nclass TwoLayerPerceptronBase(object):\n def __init__(self, n_hidden=30,\n C=0.0, epochs=500, eta=0.001, random_state=None,phi='sig'):\n np.random.seed(random_state)\n self.n_hidden = n_hidden\n self.l2_C = C\n self.epochs = epochs\n self.eta = eta\n self.phi=phi\n \n @staticmethod\n def _encode_labels(y):\n \"\"\"Encode labels into one-hot representation\"\"\"\n onehot = pd.get_dummies(y).values.T\n \n return onehot\n\n def _initialize_weights(self):\n \"\"\"Initialize weights with small random numbers.\"\"\"\n W1_num_elems = (self.n_features_ + 1)*self.n_hidden\n W1 = np.random.uniform(-1.0, 1.0,size=W1_num_elems)\n W1 = W1.reshape(self.n_hidden, self.n_features_ + 1) # reshape to be W\n \n W2_num_elems = (self.n_hidden + 1)*self.n_output_\n W2 = np.random.uniform(-1.0, 1.0, size=W2_num_elems)\n W2 = W2.reshape(self.n_output_, self.n_hidden + 1)\n return W1, W2\n \n @staticmethod\n def _sigmoid(z,phi):\n \"\"\"Use scipy.special.expit to avoid overflow\"\"\"\n # 1.0 / (1.0 + np.exp(-z))\n if phi=='sig': \n return expit(z)\n if phi=='lin': \n return z\n if phi=='silu': \n return expit(z)*z\n if phi=='relu': \n bol= z>=0 \n #z=bol*z\n return np.maximum(0,z.copy())\n \n @staticmethod\n def _add_bias_unit(X, how='column'):\n \"\"\"Add bias unit (column or row of 1s) to array at index 0\"\"\"\n if how == 'column':\n ones = np.ones((X.shape[0], 1))\n X_new = np.hstack((ones, 
X))\n elif how == 'row':\n ones = np.ones((1, X.shape[1]))\n X_new = np.vstack((ones, X))\n return X_new\n \n @staticmethod\n def _L2_reg(lambda_, W1, W2):\n \"\"\"Compute L2-regularization cost\"\"\"\n # only compute for non-bias terms\n return (lambda_/2.0) * np.sqrt(np.mean(W1[:, 1:] ** 2) + np.mean(W2[:, 1:] ** 2))\n \n def _cost(self,A3,Y_enc,W1,W2):\n '''Get the objective function value'''\n cost = np.mean((Y_enc-A3)**2)\n L2_term = self._L2_reg(self.l2_C, W1, W2)\n return cost + L2_term\n \n def _feedforward(self, X, W1, W2):\n \"\"\"Compute feedforward step\n \"\"\"\n A1 = self._add_bias_unit(X, how='column')\n A1 = A1.T\n Z1 = W1 @ A1\n A2 = self._sigmoid(Z1,self.phi)\n A2 = self._add_bias_unit(A2, how='row')\n Z2 = W2 @ A2\n A3 = self._sigmoid(Z2,'sig')\n return A1, Z1, A2, Z2, A3\n def _div(b,A_,phi):\n \n if phi=='sig': \n return A_*(1-A_)\n if phi=='lin': \n return 1\n if phi=='silu':\n return (expit(A_)*A_)+(expit(A_)*(1-expit(A_)*A_))\n if phi=='relu': \n bol= A_>=0 \n return 1 \n \n def _get_gradient(self, A1, A2, A3, Z1, Z2, Y_enc, W1, W2):\n \"\"\" Compute gradient step using backpropagation.\n \"\"\"\n # vectorized backpropagation\n Z1_with_bias = self._add_bias_unit(Z1,how='row')\n Z2_with_bias = self._add_bias_unit(Z2,how='row')\n V2 = -2*(Y_enc-A3)*self._div(A3,self.phi) # last layer sensitivity\n V1 = self._div(A2,self.phi)*(W2.T @ V2) # back prop the sensitivity \n if self.phi=='relu':\n #print(Z2_with_bias.shape)\n #print(V2.shape)\n V1[Z1_with_bias<=0] = 0\n V2[Z2<=0] = 0\n grad2 = V2 @ A2.T # no bias on final layer\n grad1 = V1[1:,:] @ A1.T # dont back prop sensitivity of bias\n \n \n # regularize weights that are not bias terms\n grad1[:, 1:] += W1[:, 1:] * self.l2_C\n grad2[:, 1:] += W2[:, 1:] * self.l2_C\n\n return grad1, grad2\n \n def predict(self, X):\n \"\"\"Predict class labels\"\"\"\n _, _, _, _, A3 = self._feedforward(X, self.W1, self.W2)\n y_pred = np.argmax(A3, axis=0)\n return y_pred",
"_____no_output_____"
],
[
"from sklearn.metrics import accuracy_score\n# just start with the vectorized version and minibatch\nclass TLPMiniBatch(TwoLayerPerceptronBase):\n def __init__(self, alpha=0.0, decrease_const=0.0, shuffle=True, \n minibatches=1, **kwds): \n # need to add to the original initializer \n self.alpha = alpha\n self.decrease_const = decrease_const\n self.shuffle = shuffle\n self.minibatches = minibatches\n # but keep other keywords\n super().__init__(**kwds)\n \n \n def fit(self, X, y, print_progress=False):\n \"\"\" Learn weights from training data. With mini-batch\"\"\"\n X_data, y_data = X.copy(), y.copy()\n Y_enc = self._encode_labels(y)\n \n # init weights and setup matrices\n self.n_features_ = X_data.shape[1]\n self.n_output_ = Y_enc.shape[0]\n self.W1, self.W2 = self._initialize_weights()\n\n delta_W1_prev = np.zeros(self.W1.shape)\n delta_W2_prev = np.zeros(self.W2.shape)\n\n self.cost_ = []\n self.score_ = []\n # get starting acc\n self.score_.append(accuracy_score(y_data,self.predict(X_data)))\n for i in range(self.epochs):\n\n # adaptive learning rate\n self.eta /= (1 + self.decrease_const*i)\n\n if print_progress>0 and (i+1)%print_progress==0:\n sys.stderr.write('\\rEpoch: %d/%d' % (i+1, self.epochs))\n sys.stderr.flush()\n\n if self.shuffle:\n idx_shuffle = np.random.permutation(y_data.shape[0])\n X_data, Y_enc, y_data = X_data[idx_shuffle], Y_enc[:, idx_shuffle], y_data[idx_shuffle]\n\n mini = np.array_split(range(y_data.shape[0]), self.minibatches)\n mini_cost = []\n for idx in mini:\n\n # feedforward\n A1, Z1, A2, Z2, A3 = self._feedforward(X_data[idx],\n self.W1,\n self.W2)\n \n cost = self._cost(A3,Y_enc[:, idx],self.W1,self.W2)\n mini_cost.append(cost) # this appends cost of mini-batch only\n\n # compute gradient via backpropagation\n grad1, grad2 = self._get_gradient(A1=A1, A2=A2, A3=A3, Z1=Z1, Z2=Z2, \n Y_enc=Y_enc[:, idx],\n W1=self.W1,W2=self.W2)\n\n # momentum calculations\n delta_W1, delta_W2 = self.eta * grad1, self.eta * grad2\n self.W1 -= 
(delta_W1 + (self.alpha * delta_W1_prev))\n self.W2 -= (delta_W2 + (self.alpha * delta_W2_prev))\n delta_W1_prev, delta_W2_prev = delta_W1, delta_W2\n\n self.cost_.append(mini_cost)\n self.score_.append(accuracy_score(y_data,self.predict(X_data)))\n \n return self",
"_____no_output_____"
],
[
"%%time\nparams = dict(n_hidden=100, \n C=.0001, # tradeoff L2 regularizer\n epochs=200, # iterations\n eta=0.001, # learning rate\n random_state=1,\n phi='lin')\nnn_mini = TLPMiniBatch(**params,\n alpha=0.001,# momentum calculation\n decrease_const=0.0001, # decreasing eta\n minibatches=50, # minibatch size\n shuffle=True)\n\n \nnn_mini.fit(x_train_ar, y_target_ar, print_progress=50)\nyhat = nn_mini.predict(x_train_ar)\nprint('Accuracy:',accuracy_score(y_target_ar,yhat))",
"Epoch: 200/200"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
]
] |
d07e2e474a71093c74abb5a975d394b6de790ee4 | 82,756 | ipynb | Jupyter Notebook | Chapter15/Handwriting_transcription.ipynb | arifmudi/Modern-Computer-Vision-with-PyTorch | de9d4896922b4eacae8307b0e178ab1422354414 | [
"MIT"
] | 269 | 2020-11-18T07:26:16.000Z | 2022-03-31T16:51:13.000Z | Chapter15/Handwriting_transcription.ipynb | Jrodmath/Modern-Computer-Vision-with-PyTorch | 08a1f57b96e1de0a1c71c4e4167ede4901928577 | [
"MIT"
] | 22 | 2021-01-20T17:08:17.000Z | 2022-03-18T13:17:31.000Z | Chapter15/Handwriting_transcription.ipynb | Jrodmath/Modern-Computer-Vision-with-PyTorch | 08a1f57b96e1de0a1c71c4e4167ede4901928577 | [
"MIT"
] | 140 | 2020-11-22T16:27:46.000Z | 2022-03-26T11:26:39.000Z | 81.212954 | 29,930 | 0.663819 | [
[
[
"<a href=\"https://colab.research.google.com/github/PacktPublishing/Hands-On-Computer-Vision-with-PyTorch/blob/master/Chapter15/Handwriting_transcription.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"!wget https://www.dropbox.com/s/l2ul3upj7dkv4ou/synthetic-data.zip\n!unzip -qq synthetic-data.zip",
"--2020-10-02 13:47:42-- https://www.dropbox.com/s/l2ul3upj7dkv4ou/synthetic-data.zip\nResolving www.dropbox.com (www.dropbox.com)... 162.125.1.1, 2620:100:6016:1::a27d:101\nConnecting to www.dropbox.com (www.dropbox.com)|162.125.1.1|:443... connected.\nHTTP request sent, awaiting response... 301 Moved Permanently\nLocation: /s/raw/l2ul3upj7dkv4ou/synthetic-data.zip [following]\n--2020-10-02 13:47:42-- https://www.dropbox.com/s/raw/l2ul3upj7dkv4ou/synthetic-data.zip\nReusing existing connection to www.dropbox.com:443.\nHTTP request sent, awaiting response... 302 Found\nLocation: https://ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com/cd/0/inline/BAjxFW_dLaTkMmOdGoh3My4N7da_3j-VT2DkBl2EnrrW4KdMpt8jv6OPY_xFZ0gqcx-O4CWaSAnemKsH7OI6Y_DHDkXdn7fT49-ykIqG5asdSzLJdqv5NhQdEgJR8E8Kx1w/file# [following]\n--2020-10-02 13:47:42-- https://ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com/cd/0/inline/BAjxFW_dLaTkMmOdGoh3My4N7da_3j-VT2DkBl2EnrrW4KdMpt8jv6OPY_xFZ0gqcx-O4CWaSAnemKsH7OI6Y_DHDkXdn7fT49-ykIqG5asdSzLJdqv5NhQdEgJR8E8Kx1w/file\nResolving ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com (ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com)... 162.125.1.15, 2620:100:6016:15::a27d:10f\nConnecting to ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com (ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com)|162.125.1.15|:443... connected.\nHTTP request sent, awaiting response... 
302 Found\nLocation: /cd/0/inline2/BAhvUkv9AycqRiPHpNkzR07RaD_6ZXBCqMgxjLIz4uwHAlpfR_9FTZlU_2jevSRfK5jLowo9i24CHVRMJOkflJxeTY0WgysV2_he60y8h9YmgSbSnqzFMXQ0ARbL1Var1-6SSBevFSyDc-fpvgqxQ_WoCgkCN5VIasAKwTAlyX5PZW2Fn3MA82_U_TDexbxc9uthzuGYDW9Yg3lMT5c46vzujY5ZJCJW-S4GAj-eUFhz0EmdTOZa40T_Sf1d-GgE35MKf-ShlwgT_r7HzIfz23omAKBeRxB7FGZ7akdb6lviIfJac1jtoCJO5IjaRx3T21H3nruWNMQlgBQa9_g-P7PmxV7YNmaOPxLUWggLIrCysg/file [following]\n--2020-10-02 13:47:43-- https://ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com/cd/0/inline2/BAhvUkv9AycqRiPHpNkzR07RaD_6ZXBCqMgxjLIz4uwHAlpfR_9FTZlU_2jevSRfK5jLowo9i24CHVRMJOkflJxeTY0WgysV2_he60y8h9YmgSbSnqzFMXQ0ARbL1Var1-6SSBevFSyDc-fpvgqxQ_WoCgkCN5VIasAKwTAlyX5PZW2Fn3MA82_U_TDexbxc9uthzuGYDW9Yg3lMT5c46vzujY5ZJCJW-S4GAj-eUFhz0EmdTOZa40T_Sf1d-GgE35MKf-ShlwgT_r7HzIfz23omAKBeRxB7FGZ7akdb6lviIfJac1jtoCJO5IjaRx3T21H3nruWNMQlgBQa9_g-P7PmxV7YNmaOPxLUWggLIrCysg/file\nReusing existing connection to ucb0ff9836865991a703b67acc3e.dl.dropboxusercontent.com:443.\nHTTP request sent, awaiting response... 200 OK\nLength: 39876999 (38M) [application/zip]\nSaving to: ‘synthetic-data.zip’\n\nsynthetic-data.zip 100%[===================>] 38.03M 49.7MB/s in 0.8s \n\n2020-10-02 13:47:44 (49.7 MB/s) - ‘synthetic-data.zip’ saved [39876999/39876999]\n\n"
],
[
"!pip install torch_snippets torch_summary editdistance",
"Collecting torch_snippets\n Downloading https://files.pythonhosted.org/packages/99/59/4792a235f188a7777110ef99b6d3667883e14b1c96aeb25ade738812b73d/torch_snippets-0.224-py3-none-any.whl\nCollecting torch_summary\n Downloading https://files.pythonhosted.org/packages/83/49/f9db57bcad7246591b93519fd8e5166c719548c45945ef7d2fc9fcba46fb/torch_summary-1.4.3-py3-none-any.whl\nRequirement already satisfied: editdistance in /usr/local/lib/python3.6/dist-packages (0.5.3)\nRequirement already satisfied: Pillow in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (7.0.0)\nRequirement already satisfied: dill in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (0.3.2)\nCollecting opencv-python-headless\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/e2/e2/6670da2b12544858657058a5db2f088a18c56d0144bef8d178ad4734b7a3/opencv_python_headless-4.4.0.44-cp36-cp36m-manylinux2014_x86_64.whl (36.7MB)\n\u001b[K |████████████████████████████████| 36.7MB 83kB/s \n\u001b[?25hRequirement already satisfied: tqdm in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (4.41.1)\nRequirement already satisfied: pandas in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (1.1.2)\nCollecting loguru\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/6d/48/0a7d5847e3de329f1d0134baf707b689700b53bd3066a5a8cfd94b3c9fc8/loguru-0.5.3-py3-none-any.whl (57kB)\n\u001b[K |████████████████████████████████| 61kB 10.0MB/s \n\u001b[?25hRequirement already satisfied: matplotlib in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (3.2.2)\nRequirement already satisfied: numpy in /usr/local/lib/python3.6/dist-packages (from torch_snippets) (1.18.5)\nRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->torch_snippets) (2018.9)\nRequirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.6/dist-packages (from pandas->torch_snippets) (2.8.1)\nCollecting 
aiocontextvars>=0.2.0; python_version < \"3.7\"\n Downloading https://files.pythonhosted.org/packages/db/c1/7a723e8d988de0a2e623927396e54b6831b68cb80dce468c945b849a9385/aiocontextvars-0.2.2-py2.py3-none-any.whl\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->torch_snippets) (1.2.0)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from matplotlib->torch_snippets) (0.10.0)\nRequirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib->torch_snippets) (2.4.7)\nRequirement already satisfied: six>=1.5 in /usr/local/lib/python3.6/dist-packages (from python-dateutil>=2.7.3->pandas->torch_snippets) (1.15.0)\nCollecting contextvars==2.4; python_version < \"3.7\"\n Downloading https://files.pythonhosted.org/packages/83/96/55b82d9f13763be9d672622e1b8106c85acb83edd7cc2fa5bc67cd9877e9/contextvars-2.4.tar.gz\nCollecting immutables>=0.9\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/99/e0/ea6fd4697120327d26773b5a84853f897a68e33d3f9376b00a8ff96e4f63/immutables-0.14-cp36-cp36m-manylinux1_x86_64.whl (98kB)\n\u001b[K |████████████████████████████████| 102kB 15.8MB/s \n\u001b[?25hBuilding wheels for collected packages: contextvars\n Building wheel for contextvars (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for contextvars: filename=contextvars-2.4-cp36-none-any.whl size=7666 sha256=6993b683b673658d18613ee481dad0b50e74bf19a111f022adc3d07807a68013\n Stored in directory: /root/.cache/pip/wheels/a5/7d/68/1ebae2668bda2228686e3c1cf16f2c2384cea6e9334ad5f6de\nSuccessfully built contextvars\nInstalling collected packages: opencv-python-headless, immutables, contextvars, aiocontextvars, loguru, torch-snippets, torch-summary\nSuccessfully installed aiocontextvars-0.2.2 contextvars-2.4 immutables-0.14 loguru-0.5.3 opencv-python-headless-4.4.0.44 torch-snippets-0.224 torch-summary-1.4.3\n"
],
[
"from torch_snippets import *\nfrom torchsummary import summary\nimport editdistance",
"_____no_output_____"
],
[
"device = 'cuda' if torch.cuda.is_available() else 'cpu'\nfname2label = lambda fname: stem(fname).split('@')[0]\nimages = Glob('synthetic-data')",
"2020-10-02 13:47:59.724 | INFO | torch_snippets.loader:Glob:160 - 25132 files found at synthetic-data\n"
],
[
"vocab = 'QWERTYUIOPASDFGHJKLZXCVBNMqwertyuiopasdfghjklzxcvbnm'\nB,T,V = 64, 32, len(vocab) \nH,W = 32, 128 ",
"_____no_output_____"
],
[
"class OCRDataset(Dataset):\n def __init__(self, items, vocab=vocab, preprocess_shape=(H,W), timesteps=T):\n super().__init__()\n self.items = items\n self.charList = {ix+1:ch for ix,ch in enumerate(vocab)}\n self.charList.update({0: '`'})\n self.invCharList = {v:k for k,v in self.charList.items()}\n self.ts = timesteps\n def __len__(self):\n return len(self.items)\n def sample(self):\n return self[randint(len(self))]\n def __getitem__(self, ix):\n item = self.items[ix]\n image = cv2.imread(item, 0)\n label = fname2label(item)\n return image, label\n def collate_fn(self, batch):\n images, labels, label_lengths, label_vectors, input_lengths = [], [], [], [], []\n for image, label in batch:\n images.append(torch.Tensor(self.preprocess(image))[None,None])\n label_lengths.append(len(label))\n labels.append(label)\n label_vectors.append(self.str2vec(label))\n input_lengths.append(self.ts)\n images = torch.cat(images).float().to(device)\n label_lengths = torch.Tensor(label_lengths).long().to(device)\n label_vectors = torch.Tensor(label_vectors).long().to(device)\n input_lengths = torch.Tensor(input_lengths).long().to(device)\n return images, label_vectors, label_lengths, input_lengths, labels\n def str2vec(self, string, pad=True):\n string = ''.join([s for s in string if s in self.invCharList])\n val = list(map(lambda x: self.invCharList[x], string)) \n if pad:\n while len(val) < self.ts:\n val.append(0)\n return val\n def preprocess(self, img, shape=(32,128)):\n target = np.ones(shape)*255\n try:\n H, W = shape\n h, w = img.shape\n fx = H/h\n fy = W/w\n f = min(fx, fy)\n _h = int(h*f)\n _w = int(w*f)\n _img = cv2.resize(img, (_w,_h))\n target[:_h,:_w] = _img\n except:\n ...\n return (255-target)/255\n def decoder_chars(self, pred):\n decoded = \"\"\n last = \"\"\n pred = pred.cpu().detach().numpy()\n for i in range(len(pred)):\n k = np.argmax(pred[i])\n if k > 0 and self.charList[k] != last:\n last = self.charList[k]\n decoded = decoded + last\n elif k > 0 and 
self.charList[k] == last:\n continue\n else:\n last = \"\"\n return decoded.replace(\" \",\" \")\n def wer(self, preds, labels):\n c = 0\n for p, l in zip(preds, labels):\n c += p.lower().strip() != l.lower().strip()\n return round(c/len(preds), 4)\n def cer(self, preds, labels):\n c, d = [], []\n for p, l in zip(preds, labels):\n c.append(editdistance.eval(p, l) / len(l))\n return round(np.mean(c), 4)\n def evaluate(self, model, ims, labels, lower=False):\n model.eval()\n preds = model(ims).permute(1,0,2) # B, T, V+1\n preds = [self.decoder_chars(pred) for pred in preds]\n return {'char-error-rate': self.cer(preds, labels),\n 'word-error-rate': self.wer(preds, labels),\n 'char-accuracy' : 1 - self.cer(preds, labels),\n 'word-accuracy' : 1 - self.wer(preds, labels)}\n",
"_____no_output_____"
],
[
"from sklearn.model_selection import train_test_split\ntrn_items, val_items = train_test_split(Glob('synthetic-data'), test_size=0.2, random_state=22)\ntrn_ds = OCRDataset(trn_items)\nval_ds = OCRDataset(val_items)\n\ntrn_dl = DataLoader(trn_ds, batch_size=B, collate_fn=trn_ds.collate_fn, drop_last=True, shuffle=True)\nval_dl = DataLoader(val_ds, batch_size=B, collate_fn=val_ds.collate_fn, drop_last=True)",
"2020-10-02 13:49:48.111 | INFO | torch_snippets.loader:Glob:160 - 25132 files found at synthetic-data\n"
],
[
"from torch_snippets import Reshape, Permute\nclass BasicBlock(nn.Module):\n def __init__(self, ni, no, ks=3, st=1, padding=1, pool=2, drop=0.2):\n super().__init__()\n self.ks = ks\n self.block = nn.Sequential(\n nn.Conv2d(ni, no, kernel_size=ks, stride=st, padding=padding),\n nn.BatchNorm2d(no, momentum=0.3),\n nn.ReLU(inplace=True),\n nn.MaxPool2d(pool),\n nn.Dropout2d(drop)\n )\n def forward(self, x):\n return self.block(x)",
"_____no_output_____"
],
[
"class Ocr(nn.Module):\n def __init__(self, vocab):\n super().__init__()\n self.model = nn.Sequential(\n BasicBlock( 1, 128),\n BasicBlock(128, 128),\n BasicBlock(128, 256, pool=(4,2)),\n Reshape(-1, 256, 32),\n Permute(2, 0, 1) # T, B, D\n )\n self.rnn = nn.Sequential(\n nn.LSTM(256, 256, num_layers=2, dropout=0.2, bidirectional=True),\n )\n self.classification = nn.Sequential(\n nn.Linear(512, vocab+1),\n nn.LogSoftmax(-1),\n )\n def forward(self, x):\n x = self.model(x)\n x, lstm_states = self.rnn(x)\n y = self.classification(x)\n return y",
"_____no_output_____"
],
[
"def ctc(log_probs, target, input_lengths, target_lengths, blank=0):\n loss = nn.CTCLoss(blank=blank, zero_infinity=True)\n ctc_loss = loss(log_probs, target, input_lengths, target_lengths)\n return ctc_loss",
"_____no_output_____"
],
[
"model = Ocr(len(vocab)).to(device)\n!pip install torch_summary\nfrom torchsummary import summary\nsummary(model, torch.zeros((1,1,32,128)).to(device))",
"Requirement already satisfied: torch_summary in /usr/local/lib/python3.6/dist-packages (1.4.3)\n==========================================================================================\nLayer (type:depth-idx) Output Shape Param #\n==========================================================================================\n├─Sequential: 1-1 [-1, 1, 256] --\n| └─BasicBlock: 2-1 [-1, 128, 16, 64] --\n| | └─Sequential: 3-1 [-1, 128, 16, 64] 1,536\n| └─BasicBlock: 2-2 [-1, 128, 8, 32] --\n| | └─Sequential: 3-2 [-1, 128, 8, 32] 147,840\n| └─BasicBlock: 2-3 [-1, 256, 2, 16] --\n| | └─Sequential: 3-3 [-1, 256, 2, 16] 295,680\n| └─Reshape: 2-4 [-1, 256, 32] --\n| └─Permute: 2-5 [-1, 1, 256] --\n├─Sequential: 1-2 [-1, 1, 512] --\n| └─LSTM: 2-6 [-1, 1, 512] 2,629,632\n├─Sequential: 1-3 [-1, 1, 53] --\n| └─Linear: 2-7 [-1, 1, 53] 27,189\n| └─LogSoftmax: 2-8 [-1, 1, 53] --\n==========================================================================================\nTotal params: 3,101,877\nTrainable params: 3,101,877\nNon-trainable params: 0\nTotal mult-adds (M): 237.84\n==========================================================================================\nInput size (MB): 0.02\nForward/backward pass size (MB): 11.00\nParams size (MB): 11.83\nEstimated Total Size (MB): 22.85\n==========================================================================================\n"
],
[
"def train_batch(data, model, optimizer, criterion):\n model.train()\n imgs, targets, label_lens, input_lens, labels = data\n optimizer.zero_grad()\n preds = model(imgs)\n loss = criterion(preds, targets, input_lens, label_lens)\n loss.backward()\n optimizer.step()\n results = trn_ds.evaluate(model, imgs.to(device), labels)\n return loss, results",
"_____no_output_____"
],
[
"@torch.no_grad()\ndef validate_batch(data, model):\n model.eval()\n imgs, targets, label_lens, input_lens, labels = data\n preds = model(imgs)\n loss = criterion(preds, targets, input_lens, label_lens)\n return loss, val_ds.evaluate(model, imgs.to(device), labels)",
"_____no_output_____"
],
[
"model = Ocr(len(vocab)).to(device)\ncriterion = ctc\n\noptimizer = optim.AdamW(model.parameters(), lr=3e-3)\n\nn_epochs = 50\nlog = Report(n_epochs)",
"_____no_output_____"
],
[
"for ep in range( n_epochs):\n # if ep in lr_schedule: optimizer = AdamW(ocr.parameters(), lr=lr_schedule[ep])\n N = len(trn_dl)\n for ix, data in enumerate(trn_dl):\n pos = ep + (ix+1)/N\n loss, results = train_batch(data, model, optimizer, criterion)\n # scheduler.step()\n ca, wa = results['char-accuracy'], results['word-accuracy']\n log.record(pos=pos, trn_loss=loss, trn_char_acc=ca, trn_word_acc=wa, end='\\r')\n val_results = []\n N = len(val_dl)\n for ix, data in enumerate(val_dl):\n pos = ep + (ix+1)/N\n loss, results = validate_batch(data, model)\n ca, wa = results['char-accuracy'], results['word-accuracy']\n log.record(pos=pos, val_loss=loss, val_char_acc=ca, val_word_acc=wa, end='\\r')\n\n log.report_avgs(ep+1)\n print()\n for jx in range(5):\n img, label = val_ds.sample()\n _img = torch.Tensor(val_ds.preprocess(img)[None,None]).to(device)\n pred = model(_img)[:,0,:]\n pred = trn_ds.decoder_chars(pred)\n print(f'Pred: `{pred}` :: Truth: `{label}`')\n print()",
"EPOCH: 1.000\ttrn_loss: 3.327\ttrn_char_acc: 0.012\ttrn_word_acc: 0.000\tval_loss: 3.033\tval_char_acc: 0.053\tval_word_acc: 0.000\t(56.00s - 2744.13s remaining)\n\nPred: `b` :: Truth: `finish`\nPred: `s` :: Truth: `attack`\nPred: `b` :: Truth: `beautiful`\nPred: `m` :: Truth: `represent`\nPred: `m` :: Truth: `what`\n\nEPOCH: 2.000\ttrn_loss: 2.633\ttrn_char_acc: 0.214\ttrn_word_acc: 0.005\tval_loss: 1.957\tval_char_acc: 0.423\tval_word_acc: 0.031\t(87.98s - 2111.44s remaining)\n\nPred: `huaa` :: Truth: `know`\nPred: `mrd` :: Truth: `really`\nPred: `sentrer` :: Truth: `control`\nPred: `wet` :: Truth: `west`\nPred: `pareene` :: Truth: `someone`\n\nEPOCH: 3.000\ttrn_loss: 1.615\ttrn_char_acc: 0.590\ttrn_word_acc: 0.124\tval_loss: 1.164\tval_char_acc: 0.667\tval_word_acc: 0.204\t(119.94s - 1879.07s remaining)\n\nPred: `theony` :: Truth: `theory`\nPred: `t` :: Truth: `I`\nPred: `whin` :: Truth: `worker`\nPred: `liod` :: Truth: `word`\nPred: `dinectod` :: Truth: `director`\n\nEPOCH: 4.000\ttrn_loss: 1.095\ttrn_char_acc: 0.747\ttrn_word_acc: 0.314\tval_loss: 0.801\tval_char_acc: 0.777\tval_word_acc: 0.371\t(151.87s - 1746.51s remaining)\n\nPred: `evening` :: Truth: `evening`\nPred: `plauer` :: Truth: `player`\nPred: `wide` :: Truth: `wide`\nPred: `shale` :: Truth: `shake`\nPred: `forocess` :: Truth: `process`\n\nEPOCH: 5.000\ttrn_loss: 0.832\ttrn_char_acc: 0.826\ttrn_word_acc: 0.469\tval_loss: 0.616\tval_char_acc: 0.830\tval_word_acc: 0.492\t(183.95s - 1655.54s remaining)\n\nPred: `raise` :: Truth: `raise`\nPred: `commercial` :: Truth: `commercial`\nPred: `note` :: Truth: `note`\nPred: `bring` :: Truth: `bring`\nPred: `assume` :: Truth: `assume`\n\nEPOCH: 6.000\ttrn_loss: 0.656\ttrn_char_acc: 0.874\ttrn_word_acc: 0.580\tval_loss: 0.499\tval_char_acc: 0.864\tval_word_acc: 0.571\t(216.05s - 1584.37s remaining)\n\nPred: `nlion` :: Truth: `million`\nPred: `actiuity` :: Truth: `activity`\nPred: `imgine` :: Truth: `imagine`\nPred: `tax` :: Truth: `tax`\nPred: `dshernt` :: 
Truth: `different`\n\nEPOCH: 7.000\ttrn_loss: 0.531\ttrn_char_acc: 0.906\ttrn_word_acc: 0.671\tval_loss: 0.428\tval_char_acc: 0.880\tval_word_acc: 0.605\t(248.11s - 1524.09s remaining)\n\nPred: `region` :: Truth: `region`\nPred: `to` :: Truth: `to`\nPred: `senvice` :: Truth: `service`\nPred: `individual` :: Truth: `individual`\nPred: `activity` :: Truth: `activity`\n\nEPOCH: 8.000\ttrn_loss: 0.439\ttrn_char_acc: 0.926\ttrn_word_acc: 0.729\tval_loss: 0.359\tval_char_acc: 0.900\tval_word_acc: 0.665\t(280.34s - 1471.77s remaining)\n\nPred: `berubucan` :: Truth: `Republican`\nPred: `inpotaut` :: Truth: `important`\nPred: `suggest` :: Truth: `suggest`\nPred: `TV` :: Truth: `TV`\nPred: `not` :: Truth: `not`\n\nEPOCH: 9.000\ttrn_loss: 0.375\ttrn_char_acc: 0.942\ttrn_word_acc: 0.778\tval_loss: 0.318\tval_char_acc: 0.910\tval_word_acc: 0.692\t(312.57s - 1423.91s remaining)\n\nPred: `political` :: Truth: `political`\nPred: `thue` :: Truth: `thus`\nPred: `allo` :: Truth: `allow`\nPred: `experaly` :: Truth: `especially`\nPred: `admit` :: Truth: `admit`\n\nEPOCH: 10.000\ttrn_loss: 0.325\ttrn_char_acc: 0.952\ttrn_word_acc: 0.812\tval_loss: 0.291\tval_char_acc: 0.920\tval_word_acc: 0.720\t(344.94s - 1379.75s remaining)\n\nPred: `choam` :: Truth: `clear`\nPred: `just` :: Truth: `just`\nPred: `southern` :: Truth: `southern`\nPred: `add` :: Truth: `add`\nPred: `family` :: Truth: `family`\n\nEPOCH: 11.000\ttrn_loss: 0.282\ttrn_char_acc: 0.961\ttrn_word_acc: 0.844\tval_loss: 0.278\tval_char_acc: 0.923\tval_word_acc: 0.732\t(377.24s - 1337.47s remaining)\n\nPred: `green` :: Truth: `green`\nPred: `check` :: Truth: `check`\nPred: `proride` :: Truth: `provide`\nPred: `expert` :: Truth: `expert`\nPred: `leter` :: Truth: `letter`\n\nEPOCH: 12.000\ttrn_loss: 0.249\ttrn_char_acc: 0.967\ttrn_word_acc: 0.863\tval_loss: 0.244\tval_char_acc: 0.931\tval_word_acc: 0.762\t(409.68s - 1297.32s remaining)\n\nPred: `particwlarly` :: Truth: `particularly`\nPred: `watch` :: Truth: `watch`\nPred: `try` :: 
Truth: `try`\nPred: `now` :: Truth: `now`\nPred: `tatal` :: Truth: `total`\n\nEPOCH: 13.000\ttrn_loss: 0.223\ttrn_char_acc: 0.972\ttrn_word_acc: 0.885\tval_loss: 0.234\tval_char_acc: 0.937\tval_word_acc: 0.779\t(441.96s - 1257.89s remaining)\n\nPred: `improve` :: Truth: `improve`\nPred: `require` :: Truth: `require`\nPred: `from` :: Truth: `from`\nPred: `various` :: Truth: `various`\nPred: `high` :: Truth: `high`\n\nEPOCH: 14.000\ttrn_loss: 0.207\ttrn_char_acc: 0.976\ttrn_word_acc: 0.901\tval_loss: 0.216\tval_char_acc: 0.938\tval_word_acc: 0.791\t(474.01s - 1218.89s remaining)\n\nPred: `wide` :: Truth: `wide`\nPred: `clearly` :: Truth: `clearly`\nPred: `reduce` :: Truth: `reduce`\nPred: `hot` :: Truth: `hot`\nPred: `into` :: Truth: `into`\n\nEPOCH: 15.000\ttrn_loss: 0.184\ttrn_char_acc: 0.980\ttrn_word_acc: 0.911\tval_loss: 0.215\tval_char_acc: 0.941\tval_word_acc: 0.792\t(506.08s - 1180.84s remaining)\n\nPred: `item` :: Truth: `item`\nPred: `bank` :: Truth: `bank`\nPred: `decide` :: Truth: `decide`\nPred: `apainst` :: Truth: `against`\nPred: `none` :: Truth: `none`\n\nEPOCH: 16.000\ttrn_loss: 0.165\ttrn_char_acc: 0.983\ttrn_word_acc: 0.924\tval_loss: 0.208\tval_char_acc: 0.944\tval_word_acc: 0.799\t(538.20s - 1143.68s remaining)\n\nPred: `especially` :: Truth: `especially`\nPred: `oder` :: Truth: `order`\nPred: `main` :: Truth: `main`\nPred: `husband` :: Truth: `husband`\nPred: `power` :: Truth: `power`\n\nEPOCH: 17.000\ttrn_loss: 0.156\ttrn_char_acc: 0.985\ttrn_word_acc: 0.931\tval_loss: 0.198\tval_char_acc: 0.946\tval_word_acc: 0.812\t(570.16s - 1106.78s remaining)\n\nPred: `us` :: Truth: `us`\nPred: `program` :: Truth: `program`\nPred: `case` :: Truth: `case`\nPred: `change` :: Truth: `change`\nPred: `customer` :: Truth: `customer`\n\nEPOCH: 18.000\ttrn_loss: 0.142\ttrn_char_acc: 0.987\ttrn_word_acc: 0.941\tval_loss: 0.192\tval_char_acc: 0.947\tval_word_acc: 0.815\t(602.21s - 1070.60s remaining)\n\nPred: `father` :: Truth: `father`\nPred: `bank` :: Truth: 
`bank`\nPred: `property` :: Truth: `property`\nPred: `respond` :: Truth: `respond`\nPred: `reason` :: Truth: `reason`\n\nEPOCH: 19.000\ttrn_loss: 0.133\ttrn_char_acc: 0.988\ttrn_word_acc: 0.946\tval_loss: 0.195\tval_char_acc: 0.946\tval_word_acc: 0.816\t(634.27s - 1034.85s remaining)\n\nPred: `middle` :: Truth: `middle`\nPred: `heatt` :: Truth: `health`\nPred: `finally` :: Truth: `finally`\nPred: `artist` :: Truth: `artist`\nPred: `bad` :: Truth: `bad`\n\nEPOCH: 20.000\ttrn_loss: 0.126\ttrn_char_acc: 0.990\ttrn_word_acc: 0.951\tval_loss: 0.182\tval_char_acc: 0.950\tval_word_acc: 0.827\t(666.38s - 999.57s remaining)\n\nPred: `station` :: Truth: `station`\nPred: `research` :: Truth: `research`\nPred: `thing` :: Truth: `thing`\nPred: `or` :: Truth: `or`\nPred: `name` :: Truth: `name`\n\nEPOCH: 21.000\ttrn_loss: 0.120\ttrn_char_acc: 0.991\ttrn_word_acc: 0.957\tval_loss: 0.180\tval_char_acc: 0.950\tval_word_acc: 0.829\t(698.30s - 964.32s remaining)\n\nPred: `woman` :: Truth: `woman`\nPred: `run` :: Truth: `run`\nPred: `many` :: Truth: `many`\nPred: `another` :: Truth: `another`\nPred: `reduce` :: Truth: `reduce`\n\nEPOCH: 22.000\ttrn_loss: 0.112\ttrn_char_acc: 0.992\ttrn_word_acc: 0.960\tval_loss: 0.195\tval_char_acc: 0.948\tval_word_acc: 0.816\t(730.23s - 929.39s remaining)\n\nPred: `fear` :: Truth: `fear`\nPred: `less` :: Truth: `less`\nPred: `raise` :: Truth: `raise`\nPred: `commercial` :: Truth: `commercial`\nPred: `history` :: Truth: `history`\n\nEPOCH: 23.000\ttrn_loss: 0.106\ttrn_char_acc: 0.993\ttrn_word_acc: 0.965\tval_loss: 0.174\tval_char_acc: 0.953\tval_word_acc: 0.835\t(762.23s - 894.79s remaining)\n\nPred: `decade` :: Truth: `decade`\nPred: `occur` :: Truth: `occur`\nPred: `worber` :: Truth: `worker`\nPred: `south` :: Truth: `south`\nPred: `bill` :: Truth: `bill`\n\nEPOCH: 24.000\ttrn_loss: 0.098\ttrn_char_acc: 0.994\ttrn_word_acc: 0.969\tval_loss: 0.193\tval_char_acc: 0.951\tval_word_acc: 0.828\t(794.16s - 860.34s remaining)\n\nPred: `force` :: Truth: 
`force`\nPred: `alone` :: Truth: `alone`\nPred: `significant` :: Truth: `significant`\nPred: `posible` :: Truth: `possible`\nPred: `woman` :: Truth: `woman`\n\nEPOCH: 25.000\ttrn_loss: 0.097\ttrn_char_acc: 0.994\ttrn_word_acc: 0.971\tval_loss: 0.194\tval_char_acc: 0.951\tval_word_acc: 0.823\t(826.14s - 826.14s remaining)\n\nPred: `deep` :: Truth: `deep`\nPred: `table` :: Truth: `table`\nPred: `across` :: Truth: `across`\nPred: `production` :: Truth: `production`\nPred: `ife` :: Truth: `life`\n\nEPOCH: 26.000\ttrn_loss: 0.087\ttrn_char_acc: 0.995\ttrn_word_acc: 0.975\tval_loss: 0.187\tval_char_acc: 0.953\tval_word_acc: 0.839\t(858.06s - 792.05s remaining)\n\nPred: `sign` :: Truth: `sign`\nPred: `local` :: Truth: `local`\nPred: `nove` :: Truth: `move`\nPred: `sense` :: Truth: `sense`\nPred: `pesple` :: Truth: `people`\n\nEPOCH: 27.000\ttrn_loss: 0.087\ttrn_char_acc: 0.996\ttrn_word_acc: 0.976\tval_loss: 0.177\tval_char_acc: 0.955\tval_word_acc: 0.844\t(889.98s - 758.13s remaining)\n\nPred: `than` :: Truth: `than`\nPred: `reveal` :: Truth: `reveal`\nPred: `culture` :: Truth: `culture`\nPred: `neturk` :: Truth: `network`\nPred: `could` :: Truth: `could`\n\nEPOCH: 28.000\ttrn_loss: 0.079\ttrn_char_acc: 0.996\ttrn_word_acc: 0.980\tval_loss: 0.188\tval_char_acc: 0.953\tval_word_acc: 0.836\t(921.99s - 724.42s remaining)\n\nPred: `heavy` :: Truth: `heavy`\nPred: `along` :: Truth: `along`\nPred: `loss` :: Truth: `loss`\nPred: `law` :: Truth: `law`\nPred: `wst` :: Truth: `just`\n\nEPOCH: 29.000\ttrn_loss: 0.083\ttrn_char_acc: 0.996\ttrn_word_acc: 0.980\tval_loss: 0.175\tval_char_acc: 0.955\tval_word_acc: 0.845\t(954.07s - 690.88s remaining)\n\nPred: `wall` :: Truth: `wall`\nPred: `capital` :: Truth: `capital`\nPred: `method` :: Truth: `method`\nPred: `stratiagy` :: Truth: `strategy`\nPred: `moden` :: Truth: `modern`\n\nEPOCH: 30.000\ttrn_loss: 0.075\ttrn_char_acc: 0.997\ttrn_word_acc: 0.982\tval_loss: 0.182\tval_char_acc: 0.955\tval_word_acc: 0.835\t(986.04s - 657.36s 
remaining)\n\nPred: `national` :: Truth: `national`\nPred: `campaign` :: Truth: `campaign`\nPred: `school` :: Truth: `school`\nPred: `deacly` :: Truth: `clearly`\nPred: `imagine` :: Truth: `imagine`\n\nEPOCH: 31.000\ttrn_loss: 0.079\ttrn_char_acc: 0.997\ttrn_word_acc: 0.983\tval_loss: 0.183\tval_char_acc: 0.956\tval_word_acc: 0.840\t(1018.02s - 623.95s remaining)\n\nPred: `pretty` :: Truth: `pretty`\nPred: `until` :: Truth: `until`\nPred: `plan` :: Truth: `plan`\nPred: `attack` :: Truth: `attack`\nPred: `plan` :: Truth: `plan`\n\nEPOCH: 32.000\ttrn_loss: 0.069\ttrn_char_acc: 0.997\ttrn_word_acc: 0.986\tval_loss: 0.173\tval_char_acc: 0.957\tval_word_acc: 0.845\t(1050.00s - 590.62s remaining)\n\nPred: `imporlitant` :: Truth: `important`\nPred: `establish` :: Truth: `establish`\nPred: `certain` :: Truth: `certain`\nPred: `lot` :: Truth: `lot`\nPred: `it` :: Truth: `it`\n\nEPOCH: 33.000\ttrn_loss: 0.067\ttrn_char_acc: 0.997\ttrn_word_acc: 0.985\tval_loss: 0.171\tval_char_acc: 0.958\tval_word_acc: 0.853\t(1082.03s - 557.41s remaining)\n\nPred: `strong` :: Truth: `strong`\nPred: `apply` :: Truth: `apply`\nPred: `hotel` :: Truth: `hotel`\nPred: `hluge` :: Truth: `huge`\nPred: `afect` :: Truth: `affect`\n\nEPOCH: 34.000\ttrn_loss: 0.066\ttrn_char_acc: 0.998\ttrn_word_acc: 0.988\tval_loss: 0.172\tval_char_acc: 0.956\tval_word_acc: 0.846\t(1114.13s - 524.30s remaining)\n\nPred: `local` :: Truth: `local`\nPred: `page` :: Truth: `page`\nPred: `agreement` :: Truth: `agreement`\nPred: `line` :: Truth: `line`\nPred: `ask` :: Truth: `ask`\n\nEPOCH: 35.000\ttrn_loss: 0.064\ttrn_char_acc: 0.998\ttrn_word_acc: 0.988\tval_loss: 0.188\tval_char_acc: 0.956\tval_word_acc: 0.843\t(1146.25s - 491.25s remaining)\n\nPred: `debate` :: Truth: `debate`\nPred: `plone` :: Truth: `phone`\nPred: `huge` :: Truth: `huge`\nPred: `role` :: Truth: `role`\nPred: `check` :: Truth: `check`\n\nEPOCH: 36.000\ttrn_loss: 0.063\ttrn_char_acc: 0.998\ttrn_word_acc: 0.988\tval_loss: 0.166\tval_char_acc: 
0.958\tval_word_acc: 0.850\t(1178.41s - 458.27s remaining)\n\nPred: `science` :: Truth: `science`\nPred: `naolue` :: Truth: `realize`\nPred: `mother` :: Truth: `mother`\nPred: `onto` :: Truth: `onto`\nPred: `standerd` :: Truth: `standard`\n\nEPOCH: 37.000\ttrn_loss: 0.060\ttrn_char_acc: 0.998\ttrn_word_acc: 0.989\tval_loss: 0.164\tval_char_acc: 0.958\tval_word_acc: 0.853\t(1210.56s - 425.33s remaining)\n\nPred: `form` :: Truth: `form`\nPred: `management` :: Truth: `management`\nPred: `focus` :: Truth: `focus`\nPred: `security` :: Truth: `security`\nPred: `girl` :: Truth: `goal`\n\nEPOCH: 38.000\ttrn_loss: 0.056\ttrn_char_acc: 0.999\ttrn_word_acc: 0.991\tval_loss: 0.164\tval_char_acc: 0.959\tval_word_acc: 0.856\t(1242.82s - 392.47s remaining)\n\nPred: `cuch` :: Truth: `such`\nPred: `increase` :: Truth: `increase`\nPred: `big` :: Truth: `big`\nPred: `indeer` :: Truth: `indeed`\nPred: `their` :: Truth: `their`\n\nEPOCH: 39.000\ttrn_loss: 0.057\ttrn_char_acc: 0.999\ttrn_word_acc: 0.991\tval_loss: 0.167\tval_char_acc: 0.959\tval_word_acc: 0.855\t(1275.05s - 359.63s remaining)\n\nPred: `attention` :: Truth: `attention`\nPred: `law` :: Truth: `law`\nPred: `blood` :: Truth: `blood`\nPred: `simply` :: Truth: `simply`\nPred: `find` :: Truth: `find`\n\nEPOCH: 40.000\ttrn_loss: 0.052\ttrn_char_acc: 0.999\ttrn_word_acc: 0.992\tval_loss: 0.166\tval_char_acc: 0.959\tval_word_acc: 0.860\t(1307.23s - 326.81s remaining)\n\nPred: `result` :: Truth: `result`\nPred: `mission` :: Truth: `mission`\nPred: `authority` :: Truth: `authority`\nPred: `lawyer` :: Truth: `lawyer`\nPred: `interest` :: Truth: `interest`\n\nEPOCH: 41.000\ttrn_loss: 0.057\ttrn_char_acc: 0.998\ttrn_word_acc: 0.990\tval_loss: 0.170\tval_char_acc: 0.959\tval_word_acc: 0.862\t(1339.29s - 293.99s remaining)\n\nPred: `can` :: Truth: `can`\nPred: `forward` :: Truth: `forward`\nPred: `camera` :: Truth: `camera`\nPred: `pretty` :: Truth: `pretty`\nPred: `collection` :: Truth: `collection`\n\nEPOCH: 42.000\ttrn_loss: 
0.052\ttrn_char_acc: 0.999\ttrn_word_acc: 0.992\tval_loss: 0.174\tval_char_acc: 0.959\tval_word_acc: 0.856\t(1371.56s - 261.25s remaining)\n\nPred: `mission` :: Truth: `mission`\nPred: `relationstip` :: Truth: `relationship`\nPred: `at` :: Truth: `at`\nPred: `adult` :: Truth: `adult`\nPred: `property` :: Truth: `property`\n\nEPOCH: 43.000\ttrn_loss: 0.055\ttrn_char_acc: 0.999\ttrn_word_acc: 0.992\tval_loss: 0.164\tval_char_acc: 0.960\tval_word_acc: 0.856\t(1403.68s - 228.51s remaining)\n\nPred: `article` :: Truth: `article`\nPred: `appear` :: Truth: `appear`\nPred: `also` :: Truth: `also`\nPred: `beat` :: Truth: `beat`\nPred: `expect` :: Truth: `expect`\n\nEPOCH: 44.000\ttrn_loss: 0.048\ttrn_char_acc: 0.999\ttrn_word_acc: 0.993\tval_loss: 0.164\tval_char_acc: 0.961\tval_word_acc: 0.864\t(1435.79s - 195.79s remaining)\n\nPred: `successful` :: Truth: `successful`\nPred: `oushide` :: Truth: `outside`\nPred: `identify` :: Truth: `identify`\nPred: `indicate` :: Truth: `indicate`\nPred: `usually` :: Truth: `usually`\n\nEPOCH: 45.000\ttrn_loss: 0.046\ttrn_char_acc: 0.999\ttrn_word_acc: 0.994\tval_loss: 0.192\tval_char_acc: 0.957\tval_word_acc: 0.849\t(1467.94s - 163.10s remaining)\n\nPred: `though` :: Truth: `though`\nPred: `blue` :: Truth: `blue`\nPred: `country` :: Truth: `country`\nPred: `produce` :: Truth: `produce`\nPred: `reseut` :: Truth: `result`\n\nEPOCH: 46.000\ttrn_loss: 0.050\ttrn_char_acc: 0.999\ttrn_word_acc: 0.993\tval_loss: 0.172\tval_char_acc: 0.959\tval_word_acc: 0.858\t(1500.16s - 130.45s remaining)\n\nPred: `such` :: Truth: `such`\nPred: `director` :: Truth: `director`\nPred: `land` :: Truth: `land`\nPred: `some` :: Truth: `some`\nPred: `civil` :: Truth: `civil`\n\nEPOCH: 47.000\ttrn_loss: 0.045\ttrn_char_acc: 0.999\ttrn_word_acc: 0.994\tval_loss: 0.160\tval_char_acc: 0.961\tval_word_acc: 0.865\t(1532.37s - 97.81s remaining)\n\nPred: `wife` :: Truth: `wife`\nPred: `environmental` :: Truth: `environmental`\nPred: `opportunity` :: Truth: 
`opportunity`\nPred: `movie` :: Truth: `movie`\nPred: `identify` :: Truth: `identify`\n\nEPOCH: 48.000\ttrn_loss: 0.045\ttrn_char_acc: 0.999\ttrn_word_acc: 0.993\tval_loss: 0.182\tval_char_acc: 0.958\tval_word_acc: 0.851\t(1564.78s - 65.20s remaining)\n\nPred: `help` :: Truth: `help`\nPred: `brother` :: Truth: `brother`\nPred: `run` :: Truth: `run`\nPred: `accept` :: Truth: `accept`\nPred: `mouth` :: Truth: `mouth`\n\nEPOCH: 49.000\ttrn_loss: 0.049\ttrn_char_acc: 0.999\ttrn_word_acc: 0.994\tval_loss: 0.165\tval_char_acc: 0.961\tval_word_acc: 0.861\t(1597.20s - 32.60s remaining)\n\nPred: `catch` :: Truth: `catch`\nPred: `deep` :: Truth: `deep`\nPred: `I` :: Truth: `I`\nPred: `lawyer` :: Truth: `lawyer`\nPred: `both` :: Truth: `both`\n\nEPOCH: 50.000\ttrn_loss: 0.048\ttrn_char_acc: 0.999\ttrn_word_acc: 0.994\tval_loss: 0.168\tval_char_acc: 0.960\tval_word_acc: 0.863\t(1629.53s - 0.00s remaining)\n\nPred: `nor` :: Truth: `nor`\nPred: `esrcialy` :: Truth: `especially`\nPred: `born` :: Truth: `born`\nPred: `few` :: Truth: `few`\nPred: `western` :: Truth: `western`\n\n"
],
[
"log.plot_epochs(['trn_word_acc','val_word_acc'], title='Training and validation word accuracy')",
" 0%| | 0/50 [00:00<?, ?it/s]/usr/local/lib/python3.6/dist-packages/numpy/core/fromnumeric.py:3335: RuntimeWarning: Mean of empty slice.\n out=out, **kwargs)\n/usr/local/lib/python3.6/dist-packages/numpy/core/_methods.py:161: RuntimeWarning: invalid value encountered in double_scalars\n ret = ret.dtype.type(ret / rcount)\n100%|██████████| 50/50 [00:00<00:00, 499.58it/s]\n"
]
]
] | [
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07e2f2013671efa4acfb6ebee94cc1303e13819 | 32,285 | ipynb | Jupyter Notebook | Model backlog/Inference/276-tweet-inference-5fold-roberta-avg-last.ipynb | dimitreOliveira/Tweet-Sentiment-Extraction | 0a775abe9a92c4bc2db957519c523be7655df8d8 | [
"MIT"
] | 11 | 2020-06-17T07:30:20.000Z | 2022-03-25T16:56:01.000Z | Model backlog/Inference/276-tweet-inference-5fold-roberta-avg-last.ipynb | dimitreOliveira/Tweet-Sentiment-Extraction | 0a775abe9a92c4bc2db957519c523be7655df8d8 | [
"MIT"
] | null | null | null | Model backlog/Inference/276-tweet-inference-5fold-roberta-avg-last.ipynb | dimitreOliveira/Tweet-Sentiment-Extraction | 0a775abe9a92c4bc2db957519c523be7655df8d8 | [
"MIT"
] | null | null | null | 34.055907 | 149 | 0.415549 | [
[
[
"## Dependencies",
"_____no_output_____"
]
],
[
[
"import json, glob\nfrom tweet_utility_scripts import *\nfrom tweet_utility_preprocess_roberta_scripts_aux import *\nfrom transformers import TFRobertaModel, RobertaConfig\nfrom tokenizers import ByteLevelBPETokenizer\nfrom tensorflow.keras import layers\nfrom tensorflow.keras.models import Model",
"_____no_output_____"
]
],
[
[
"# Load data",
"_____no_output_____"
]
],
[
[
"test = pd.read_csv('/kaggle/input/tweet-sentiment-extraction/test.csv')\n\nprint('Test samples: %s' % len(test))\ndisplay(test.head())",
"Test samples: 3534\n"
]
],
[
[
"# Model parameters",
"_____no_output_____"
]
],
[
[
"input_base_path = '/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/'\nwith open(input_base_path + 'config.json') as json_file:\n config = json.load(json_file)\n\nconfig",
"_____no_output_____"
],
[
"vocab_path = input_base_path + 'vocab.json'\nmerges_path = input_base_path + 'merges.txt'\nbase_path = '/kaggle/input/qa-transformers/roberta/'\n\n# vocab_path = base_path + 'roberta-base-vocab.json'\n# merges_path = base_path + 'roberta-base-merges.txt'\nconfig['base_model_path'] = base_path + 'roberta-base-tf_model.h5'\nconfig['config_path'] = base_path + 'roberta-base-config.json'\n\nmodel_path_list = glob.glob(input_base_path + '*.h5')\nmodel_path_list.sort()\n\nprint('Models to predict:')\nprint(*model_path_list, sep='\\n')",
"Models to predict:\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_1.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_2.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_3.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_4.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_5.h5\n"
]
],
[
[
"# Tokenizer",
"_____no_output_____"
]
],
[
[
"tokenizer = ByteLevelBPETokenizer(vocab_file=vocab_path, merges_file=merges_path, \n lowercase=True, add_prefix_space=True)",
"_____no_output_____"
]
],
[
[
"# Pre process",
"_____no_output_____"
]
],
[
[
"test['text'].fillna('', inplace=True)\ntest['text'] = test['text'].apply(lambda x: x.lower())\ntest['text'] = test['text'].apply(lambda x: x.strip())\n\nx_test, x_test_aux, x_test_aux_2 = get_data_test(test, tokenizer, config['MAX_LEN'], preprocess_fn=preprocess_roberta_test)",
"_____no_output_____"
]
],
[
[
"# Model",
"_____no_output_____"
]
],
[
[
"module_config = RobertaConfig.from_pretrained(config['config_path'], output_hidden_states=True)\n\ndef model_fn(MAX_LEN):\n input_ids = layers.Input(shape=(MAX_LEN,), dtype=tf.int32, name='input_ids')\n attention_mask = layers.Input(shape=(MAX_LEN,), dtype=tf.int32, name='attention_mask')\n \n base_model = TFRobertaModel.from_pretrained(config['base_model_path'], config=module_config, name='base_model')\n _, _, hidden_states = base_model({'input_ids': input_ids, 'attention_mask': attention_mask})\n \n h12 = hidden_states[-1]\n h11 = hidden_states[-2]\n h10 = hidden_states[-3]\n h09 = hidden_states[-4]\n \n avg_hidden = layers.Average()([h12, h11, h10, h09])\n\n logits = layers.Dense(2, use_bias=False, name='qa_outputs')(avg_hidden)\n \n start_logits, end_logits = tf.split(logits, 2, axis=-1)\n start_logits = tf.squeeze(start_logits, axis=-1, name='y_start')\n end_logits = tf.squeeze(end_logits, axis=-1, name='y_end')\n \n model = Model(inputs=[input_ids, attention_mask], outputs=[start_logits, end_logits])\n \n return model",
"_____no_output_____"
]
],
[
[
"# Make predictions",
"_____no_output_____"
]
],
[
[
"NUM_TEST_IMAGES = len(test)\ntest_start_preds = np.zeros((NUM_TEST_IMAGES, config['MAX_LEN']))\ntest_end_preds = np.zeros((NUM_TEST_IMAGES, config['MAX_LEN']))\n\nfor model_path in model_path_list:\n print(model_path)\n model = model_fn(config['MAX_LEN'])\n model.load_weights(model_path)\n \n test_preds = model.predict(get_test_dataset(x_test, config['BATCH_SIZE'])) \n test_start_preds += test_preds[0]\n test_end_preds += test_preds[1]",
"/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_1.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_2.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_3.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_4.h5\n/kaggle/input/276-tweet-train-5fold-roberta-avg-last4-onecy-exp3/model_fold_5.h5\n"
]
],
[
[
"# Post process",
"_____no_output_____"
]
],
[
[
"test['start'] = test_start_preds.argmax(axis=-1)\ntest['end'] = test_end_preds.argmax(axis=-1)\n\ntest['selected_text'] = test.apply(lambda x: decode(x['start'], x['end'], x['text'], config['question_size'], tokenizer), axis=1)\n\n# Post-process\ntest[\"selected_text\"] = test.apply(lambda x: ' '.join([word for word in x['selected_text'].split() if word in x['text'].split()]), axis=1)\ntest['selected_text'] = test.apply(lambda x: x['text'] if (x['selected_text'] == '') else x['selected_text'], axis=1)\ntest['selected_text'].fillna(test['text'], inplace=True)",
"_____no_output_____"
]
],
[
[
"# Visualize predictions",
"_____no_output_____"
]
],
[
[
"test['text_len'] = test['text'].apply(lambda x : len(x))\ntest['label_len'] = test['selected_text'].apply(lambda x : len(x))\ntest['text_wordCnt'] = test['text'].apply(lambda x : len(x.split(' ')))\ntest['label_wordCnt'] = test['selected_text'].apply(lambda x : len(x.split(' ')))\ntest['text_tokenCnt'] = test['text'].apply(lambda x : len(tokenizer.encode(x).ids))\ntest['label_tokenCnt'] = test['selected_text'].apply(lambda x : len(tokenizer.encode(x).ids))\ntest['jaccard'] = test.apply(lambda x: jaccard(x['text'], x['selected_text']), axis=1)\n\ndisplay(test.head(10))\ndisplay(test.describe())",
"_____no_output_____"
]
],
[
[
"# Test set predictions",
"_____no_output_____"
]
],
[
[
"submission = pd.read_csv('/kaggle/input/tweet-sentiment-extraction/sample_submission.csv')\nsubmission['selected_text'] = test['selected_text']\nsubmission.to_csv('submission.csv', index=False)\nsubmission.head(10)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07e3430681069600215759516efd851bb5ca7a6 | 663,753 | ipynb | Jupyter Notebook | Assignment 5 ( Q no-1) Multi Linear Regression (1).ipynb | shraddhaghadage/Multi-Linear-Regression | 68cb14c424b35e73fcdbbf9dd7166b76b76205c9 | [
"Apache-2.0"
] | 2 | 2021-12-10T03:37:22.000Z | 2022-01-23T18:56:46.000Z | Assignment 5 ( Q no-1) Multi Linear Regression (1).ipynb | shraddhaghadage/Multi-Linear-Regression | 68cb14c424b35e73fcdbbf9dd7166b76b76205c9 | [
"Apache-2.0"
] | null | null | null | Assignment 5 ( Q no-1) Multi Linear Regression (1).ipynb | shraddhaghadage/Multi-Linear-Regression | 68cb14c424b35e73fcdbbf9dd7166b76b76205c9 | [
"Apache-2.0"
] | null | null | null | 172.492983 | 98,068 | 0.856755 | [
[
[
"import pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport numpy as np\nimport statsmodels.formula.api as smf\nimport statsmodels.api as sm\nfrom statsmodels.graphics.regressionplots import influence_plot\nimport sklearn",
"_____no_output_____"
],
[
"startup=pd.read_csv(\"50_Startups.csv\")\nstartup",
"_____no_output_____"
],
[
"startup.describe()",
"_____no_output_____"
],
[
"startup.head()",
"_____no_output_____"
],
[
"startup.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 50 entries, 0 to 49\nData columns (total 5 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 R&D Spend 50 non-null float64\n 1 Administration 50 non-null float64\n 2 Marketing Spend 50 non-null float64\n 3 State 50 non-null object \n 4 Profit 50 non-null float64\ndtypes: float64(4), object(1)\nmemory usage: 2.1+ KB\n"
],
[
"startup1=startup.rename({'R&D Spend':'RDS','Administration':'ADMS','Marketing Spend':'MKTS'},axis=1)\nstartup1",
"_____no_output_____"
],
[
"startup1[startup1.duplicated()]",
"_____no_output_____"
],
[
"startup.corr()",
"_____no_output_____"
],
[
"sns.set_style(style='darkgrid')\nsns.pairplot(startup1)",
"_____no_output_____"
],
[
"model=smf.ols(\"Profit~RDS+ADMS+MKTS\",data=startup1).fit()",
"_____no_output_____"
],
[
"model.params",
"_____no_output_____"
],
[
"model.tvalues , np.round(model.pvalues,5)",
"_____no_output_____"
],
[
"(model.rsquared , model.rsquared_adj)",
"_____no_output_____"
],
[
"slr_a=smf.ols(\"Profit~ADMS\",data=startup1).fit()\nslr_a.tvalues , slr_a.pvalues",
"_____no_output_____"
],
[
"slr_m=smf.ols(\"Profit~MKTS\",data=startup1).fit()\nslr_m.tvalues , slr_m.pvalues",
"_____no_output_____"
],
[
"mlr_am=smf.ols(\"Profit~ADMS+MKTS\",data=startup1).fit()\nmlr_am.tvalues , mlr_am.pvalues",
"_____no_output_____"
],
[
"rsq_r=smf.ols(\"RDS~ADMS+MKTS\",data=startup1).fit().rsquared\nvif_r=1/(1-rsq_r)\n\nrsq_a=smf.ols(\"ADMS~RDS+MKTS\",data=startup1).fit().rsquared\nvif_a=1/(1-rsq_a)\n\nrsq_m=smf.ols(\"MKTS~RDS+ADMS\",data=startup1).fit().rsquared\nvif_m=1/(1-rsq_m)\n\n# Putting the values in Dataframe format\nd1={'Variables':['RDS','ADMS','MKTS'],'VIF':[vif_r,vif_a,vif_m]}\nVif_df=pd.DataFrame(d1)\nVif_df",
"_____no_output_____"
],
[
"sm.qqplot(model.resid,line='q')\nplt.title(\"Normal Q-Q plot of residuals\")\nplt.show()",
"_____no_output_____"
],
[
"sm.qqplot(model.resid,line='q')\nplt.title(\"Normal Q-Q plot of residuals\")\nplt.show()",
"_____no_output_____"
],
[
"list(np.where(model.resid<-30000))",
"_____no_output_____"
],
[
"def standard_values(vals) : return (vals-vals.mean())/vals.std()",
"_____no_output_____"
],
[
"plt.scatter(standard_values(model.fittedvalues),standard_values(model.resid))\nplt.title('Residual Plot')\nplt.xlabel('standardized fitted values')\nplt.ylabel('standardized residual values')\nplt.show()",
"_____no_output_____"
],
[
"fig=plt.figure(figsize=(15,8))\nsm.graphics.plot_regress_exog(model,'RDS',fig=fig)\nplt.show()",
"_____no_output_____"
],
[
"fig=plt.figure(figsize=(15,8))\nsm.graphics.plot_regress_exog(model,'ADMS',fig=fig)\nplt.show()",
"_____no_output_____"
],
[
"fig=plt.figure(figsize=(15,8))\nsm.graphics.plot_regress_exog(model,'MKTS',fig=fig)\nplt.show()",
"_____no_output_____"
],
[
"(c,_)=model.get_influence().cooks_distance\nc",
"_____no_output_____"
],
[
"fig=plt.figure(figsize=(20,7))\nplt.stem(np.arange(len(startup1)),np.round(c,5))\nplt.xlabel('Row Index')\nplt.ylabel('Cooks Distance')\nplt.show()",
"_____no_output_____"
],
[
"np.argmax(c) , np.max(c)",
"_____no_output_____"
],
[
"influence_plot(model)\nplt.show()",
"_____no_output_____"
],
[
"k=startup1.shape[1]\nn=startup1.shape[0]\nleverage_cutoff = (3*(k+1))/n\nleverage_cutoff",
"_____no_output_____"
],
[
"startup1[startup1.index.isin([49])]",
"_____no_output_____"
],
[
"startup2=startup1.drop(startup1.index[[49]],axis=0).reset_index(drop=True)\nstartup2",
"_____no_output_____"
],
[
"model2 = smf.ols(\"Profit~RDS+ADMS+MKTS\",data=startup2).fit()",
"_____no_output_____"
],
[
"sm.graphics.plot_partregress_grid(model)",
"_____no_output_____"
],
[
"model2=smf.ols(\"Profit~RDS+ADMS+MKTS\",data=startup2).fit()\n(c,_)=model2.get_influence().cooks_distance\nc\nnp.argmax(c) , np.max(c)\nstartup2=startup2.drop(startup2.index[[np.argmax(c)]],axis=0).reset_index(drop=True)\nstartup2",
"_____no_output_____"
],
[
"final_model=smf.ols(\"Profit~RDS+ADMS+MKTS\",data=startup2).fit()\nfinal_model.rsquared , final_model.aic\nprint(\"model accuracy is improved to\",final_model.rsquared)",
"model accuracy is improved to 0.9626766170294073\n"
],
[
"final_model.rsquared",
"_____no_output_____"
],
[
"startup2",
"_____no_output_____"
],
[
"new_data=pd.DataFrame({'RDS':70000,\"ADMS\":90000,\"MKTS\":140000},index=[0])\nnew_data",
"_____no_output_____"
],
[
"final_model.predict(new_data)",
"_____no_output_____"
],
[
"pred_y=final_model.predict(startup2)\npred_y",
"_____no_output_____"
],
[
"df={'Prep_Models':['Model','Final_Model'],'Rsquared':[model.rsquared,final_model.rsquared]}\ntable=pd.DataFrame(df)\nprint('FINAL MODEL :-')\ntable",
"FINAL MODEL :-\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07e370898d9a160769d630ced41aa2be48d8889 | 225,540 | ipynb | Jupyter Notebook | Jupyter Notebook files/Analysis of Twitter.ipynb | ugis22/analysing_twitter | 126ccd27383dc5aee813f6872a1d8ce1ca425ec5 | [
"MIT"
] | 104 | 2019-01-10T16:03:47.000Z | 2022-03-19T19:40:33.000Z | Jupyter Notebook files/Analysis of Twitter.ipynb | cindyss/analysing_twitter | 0c9f527d8fc43dd0b34ca6ea19437c7da5969ecc | [
"MIT"
] | 7 | 2019-12-18T14:33:34.000Z | 2020-11-16T20:36:16.000Z | Jupyter Notebook files/Analysis of Twitter.ipynb | cindyss/analysing_twitter | 0c9f527d8fc43dd0b34ca6ea19437c7da5969ecc | [
"MIT"
] | 50 | 2019-06-18T18:12:14.000Z | 2022-01-22T12:58:32.000Z | 171.122914 | 41,120 | 0.878886 | [
[
[
"## Importing Modules",
"_____no_output_____"
]
],
[
[
"#%matplotlib notebook\nfrom tqdm import tqdm\n%matplotlib inline\n#Module to handle regular expressions\nimport re\n#manage files\nimport os\n#Library for emoji\nimport emoji\n#Import pandas and numpy to handle data\nimport pandas as pd\nimport numpy as np\n\n#import libraries for accessing the database\nimport psycopg2\nfrom sqlalchemy import create_engine\nfrom postgres_credentials import *\n\n#import libraries for visualization\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nfrom wordcloud import WordCloud\nfrom PIL import Image\n\n#Import nltk to check english lexicon\nimport nltk\nfrom nltk.tokenize import word_tokenize\nfrom nltk.corpus import (\n wordnet,\n stopwords\n)\n\n#import libraries for tokenization and ML\nimport json;\nimport keras;\nimport keras.preprocessing.text as kpt;\n#from keras.preprocessing.text import Tokenizer;\n\nimport sklearn\nfrom sklearn.preprocessing import Normalizer\nfrom sklearn.feature_extraction.text import (\n CountVectorizer,\n TfidfVectorizer\n)\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import accuracy_score\n\n#Import all libraries for creating a deep neural network\n#Sequential is the standard type of neural network with stackable layers\nfrom keras.models import (\n Sequential,\n model_from_json\n)\n#Dense: Standard layers with every node connected, dropout: avoids overfitting\nfrom keras.layers import Dense, Dropout, Activation;\n\n#To anotate database\nfrom pycorenlp import StanfordCoreNLP",
"Using TensorFlow backend.\n"
],
[
"#Querying the database\ndef query_database(tabletweets):\n engine = create_engine(\"postgresql+psycopg2://%s:%s@%s:%d/%s\" %(usertwitter, passwordtwitter, hosttwitter, porttwitter, dbnametwitter))\n table = pd.read_sql_query(\"select * from %s\" %tabletweets,con=engine, index_col=\"id\")\n return table",
"_____no_output_____"
]
],
[
[
"## Preprocessing the text",
"_____no_output_____"
],
[
"Before we dig into analyzing the public opinion on 'Avengers', there is an important step that we need to take: preprocessing the tweet text. But what does this mean? Text preprocessing includes a basic text cleaning following a set of simple rules commonly used but also, advanced techniques that takes into account syntactic and lexical information.",
"_____no_output_____"
]
],
[
[
"#preprocess text in tweets by removing links, @UserNames, blank spaces, etc.\ndef preprocessing_text(table):\n #put everythin in lowercase\n table['tweet'] = table['tweet'].str.lower()\n #Replace rt indicating that was a retweet\n table['tweet'] = table['tweet'].str.replace('rt', '')\n #Replace occurences of mentioning @UserNames\n table['tweet'] = table['tweet'].replace(r'@\\w+', '', regex=True)\n #Replace links contained in the tweet\n table['tweet'] = table['tweet'].replace(r'http\\S+', '', regex=True)\n table['tweet'] = table['tweet'].replace(r'www.[^ ]+', '', regex=True)\n #remove numbers\n table['tweet'] = table['tweet'].replace(r'[0-9]+', '', regex=True)\n #replace special characters and puntuation marks\n table['tweet'] = table['tweet'].replace(r'[!\"#$%&()*+,-./:;<=>?@[\\]^_`{|}~]', '', regex=True)\n return table ",
"_____no_output_____"
],
[
"#Replace elongated words by identifying those repeated characters and then remove them and compare the new word with the english lexicon\ndef in_dict(word):\n if wordnet.synsets(word):\n #if the word is in the dictionary, we'll return True\n return True\n\ndef replace_elongated_word(word):\n regex = r'(\\w*)(\\w+)\\2(\\w*)'\n repl = r'\\1\\2\\3' \n if in_dict(word):\n return word\n new_word = re.sub(regex, repl, word)\n if new_word != word:\n return replace_elongated_word(new_word)\n else:\n return new_word\n\ndef detect_elongated_words(row):\n regexrep = r'(\\w*)(\\w+)(\\2)(\\w*)'\n words = [''.join(i) for i in re.findall(regexrep, row)]\n for word in words:\n if not in_dict(word):\n row = re.sub(word, replace_elongated_word(word), row)\n return row ",
"_____no_output_____"
],
[
"def stop_words(table):\n #We need to remove the stop words\n stop_words_list = stopwords.words('english')\n table['tweet'] = table['tweet'].str.lower()\n table['tweet'] = table['tweet'].apply(lambda x: ' '.join([word for word in x.split() if word not in (stop_words_list)]))\n return table",
"_____no_output_____"
],
[
"def replace_antonyms(word):\n #We get all the lemma for the word\n for syn in wordnet.synsets(word): \n for lemma in syn.lemmas(): \n #if the lemma is an antonyms of the word\n if lemma.antonyms(): \n #we return the antonym\n return lemma.antonyms()[0].name()\n return word\n \ndef handling_negation(row):\n #Tokenize the row\n words = word_tokenize(row)\n speach_tags = ['JJ', 'JJR', 'JJS', 'NN', 'VB', 'VBD', 'VBG', 'VBN', 'VBP']\n #We obtain the type of words that we have in the text, we use the pos_tag function\n tags = nltk.pos_tag(words)\n #Now we ask if we found a negation in the words\n tags_2 = ''\n if \"n't\" in words and \"not\" in words:\n tags_2 = tags[min(words.index(\"n't\"), words.index(\"not\")):]\n words_2 = words[min(words.index(\"n't\"), words.index(\"not\")):]\n words = words[:(min(words.index(\"n't\"), words.index(\"not\")))+1]\n elif \"n't\" in words:\n tags_2 = tags[words.index(\"n't\"):]\n words_2 = words[words.index(\"n't\"):] \n words = words[:words.index(\"n't\")+1]\n elif \"not\" in words:\n tags_2 = tags[words.index(\"not\"):]\n words_2 = words[words.index(\"not\"):]\n words = words[:words.index(\"not\")+1] \n \n for index, word_tag in enumerate(tags_2):\n if word_tag[1] in speach_tags:\n words = words+[replace_antonyms(word_tag[0])]+words_2[index+2:]\n break\n \n return ' '.join(words) ",
"_____no_output_____"
],
[
"def cleaning_table(table):\n #This function will process all the required cleaning for the text in our tweets\n table = preprocessing_text(table)\n table['tweet'] = table['tweet'].apply(lambda x: detect_elongated_words(x))\n table['tweet'] = table['tweet'].apply(lambda x: handling_negation(x))\n table = stop_words(table)\n return table",
"_____no_output_____"
]
],
[
[
"## Data Visualization",
"_____no_output_____"
],
[
"After we have cleaned our data but before we start building our model for sentiment analysis, we can perform an exploratory data analysis to see what are the most frequent words that appear in our 'Avengers' tweets. For this part, we will show graphs regarding tweets labelled as positive separated from those labelled as negative.",
"_____no_output_____"
]
],
[
[
"#Vectorization for Data Visualization\ndef vectorization(table):\n #CountVectorizer will convert a collection of text documents to a matrix of token counts\n #Produces a sparse representation of the counts \n #Initialize\n vector = CountVectorizer()\n #We fit and transform the vector created\n frequency_matrix = vector.fit_transform(table.tweet)\n #Sum all the frequencies for each word\n sum_frequencies = np.sum(frequency_matrix, axis=0)\n #Now we use squeeze to remove single-dimensional entries from the shape of an array that we got from applying np.asarray to\n #the sum of frequencies.\n frequency = np.squeeze(np.asarray(sum_frequencies))\n #Now we get into a dataframe all the frequencies and the words that they correspond to\n frequency_df = pd.DataFrame([frequency], columns=vector.get_feature_names()).transpose()\n return frequency_df",
"_____no_output_____"
],
[
"def word_cloud(tweets):\n \n #We get the directory that we are working on\n file = os.getcwd()\n #We read the mask image into a numpy array\n avengers_mask = np.array(Image.open(os.path.join(file, \"avengers.png\")))\n #Now we store the tweets into a series to be able to process \n #tweets_list = pd.Series([t for t in tweet_table.tweet]).str.cat(sep=' ') \n #We generate the wordcloud using the series created and the mask \n word_cloud = WordCloud(width=2000, height=1000, max_font_size=200, background_color=\"black\", max_words=2000, mask=avengers_mask, contour_width=1, \n contour_color=\"steelblue\", colormap=\"nipy_spectral\", stopwords=[\"avengers\"])\n word_cloud.generate(tweets)\n \n #wordcloud = WordCloud(width=1600, height=800,max_font_size=200).generate(tweets_list)\n \n #Now we plot both figures, the wordcloud and the mask\n #plt.figure(figsize=(15,15))\n plt.figure(figsize=(10,10))\n plt.imshow(word_cloud, interpolation=\"hermite\")\n plt.axis(\"off\")\n #plt.imshow(avengers_mask, cmap=plt.cm.gray, interpolation=\"bilinear\")\n #plt.axis(\"off\") \n plt.show() ",
"_____no_output_____"
],
[
"def graph(word_frequency, sent):\n labels = word_frequency[0][1:51].index\n title = \"Word Frequency for %s\" %sent\n #Plot the figures\n plt.figure(figsize=(10,5))\n plt.bar(np.arange(50), word_frequency[0][1:51], width = 0.8, color = sns.color_palette(\"bwr\"), alpha=0.5, \n edgecolor = \"black\", capsize=8, linewidth=1);\n plt.xticks(np.arange(50), labels, rotation=90, size=14);\n plt.xlabel(\"50 more frequent words\", size=14);\n plt.ylabel(\"Frequency\", size=14);\n #plt.title('Word Frequency for %s', size=18) %sent;\n plt.title(title, size=18)\n plt.grid(False);\n plt.gca().spines[\"top\"].set_visible(False);\n plt.gca().spines[\"right\"].set_visible(False);\n plt.show() ",
"_____no_output_____"
],
[
"def regression_graph(table):\n table = table[1:]\n #We set the style of seaborn\n sns.set_style(\"whitegrid\") \n #Initialize the figure\n plt.figure(figsize=(6,6))\n \n #we obtain the points from matplotlib scatter\n points = plt.scatter(table[\"Positive\"], table[\"Negative\"], c=table[\"Positive\"], s=75, cmap=\"bwr\")\n #graph the colorbar\n plt.colorbar(points)\n #we graph the regplot from seaborn\n sns.regplot(x=\"Positive\", y=\"Negative\",fit_reg=False, scatter=False, color=\".1\", data=table)\n plt.xlabel(\"Frequency for Positive Tweets\", size=14)\n plt.ylabel(\"Frequency for Negative Tweets\", size=14)\n plt.title(\"Word frequency in Positive vs. Negative Tweets\", size=14)\n plt.grid(False)\n sns.despine()",
"_____no_output_____"
]
],
[
[
"## Preparing data for model",
"_____no_output_____"
],
[
"After visualizing our data, the next step is to split our dataset into training and test sets. For doing so, we'll take advantage of the train_test_split functionality of sklearn package. We will take 20% of the dataset for testing following the 20–80% rule. From the remaining 80% used for the training set, we'll save a part for validation of our model.",
"_____no_output_____"
]
],
[
[
"#Split Data into training and test dataset\ndef splitting(table):\n X_train, X_test, y_train, y_test = train_test_split(table.tweet, table.sentiment, test_size=0.2, shuffle=True)\n return X_train, X_test, y_train, y_test",
"_____no_output_____"
]
],
[
[
"m",
"_____no_output_____"
]
],
[
[
"#Tokenization for analysis\ndef tokenization_tweets(dataset, features):\n tokenization = TfidfVectorizer(max_features=features)\n tokenization.fit(dataset)\n dataset_transformed = tokenization.transform(dataset).toarray()\n return dataset_transformed",
"_____no_output_____"
]
],
[
[
"## Train model",
"_____no_output_____"
]
],
[
[
"#Create a Neural Network\n#Create the model\ndef train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation):\n model_nn = Sequential()\n model_nn.add(Dense(layer1, input_shape=(features,), activation='relu'))\n model_nn.add(Dropout(drop))\n model_nn.add(Dense(layer2, activation='sigmoid'))\n model_nn.add(Dropout(drop))\n model_nn.add(Dense(3, activation='softmax'))\n \n optimizer = keras.optimizers.Adam(lr=lr, beta_1=0.9, beta_2=0.999, epsilon=epsilon, decay=0.0, amsgrad=False)\n model_nn.compile(loss='sparse_categorical_crossentropy',\n optimizer=optimizer,\n metrics=['accuracy'])\n model_nn.fit(np.array(X_train_mod), y_train,\n batch_size=32,\n epochs=epoch,\n verbose=1,\n validation_split=validation,\n shuffle=shuffle)\n return model_nn",
"_____no_output_____"
]
],
[
[
"## Test model",
"_____no_output_____"
]
],
[
[
"def test(X_test, model_nn):\n prediction = model_nn.predict(X_test)\n return prediction",
"_____no_output_____"
]
],
[
[
"## Main code",
"_____no_output_____"
]
],
[
[
"if __name__ == \"__main__\":\n tabletweets = \"tweets_avengers\"\n tweet_table = query_database(tabletweets)\n tweet_table = cleaning_table(tweet_table)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\": \n #First we draw a word cloud\n #For All tweets\n word_cloud(pd.Series([t for t in tweet_table.tweet]).str.cat(sep=' ')) \n #For positive tweets \n word_cloud(pd.Series([t for t in tweet_table[tweet_table.sentiment == \"Positive\"].tweet]).str.cat(sep=' ')) \n #For negative tweets\n word_cloud(pd.Series([t for t in tweet_table[tweet_table.sentiment == \"Negative\"].tweet]).str.cat(sep=' '))",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n #Get the frequency\n word_frequency = vectorization(tweet_table).sort_values(0, ascending = False)\n word_frequency_pos = vectorization(tweet_table[tweet_table['sentiment'] == 'Positive']).sort_values(0, ascending = False)\n word_frequency_neg = vectorization(tweet_table[tweet_table['sentiment'] == 'Negative']).sort_values(0, ascending = False)\n\n #Graph with frequency words all, positive and negative tweets and get the frequency\n graph(word_frequency, 'all')\n graph(word_frequency_pos, 'positive')\n graph(word_frequency_neg, 'negative')",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n #Concatenate word frequency for positive and negative\n table_regression = pd.concat([word_frequency_pos, word_frequency_neg], axis=1, sort=False)\n table_regression.columns = [\"Positive\", \"Negative\"]\n regression_graph(table_regression)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n tabletweets = \"tweets_avengers_labeled\"\n tweet_table = query_database(tabletweets)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n tweet_table['sentiment'] = tweet_table['sentiment'].apply(lambda x: 2 if x == 'Positive' else (0 if x == 'Negative' else 1))",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n X_train, X_test, y_train, y_test = splitting(tweet_table)",
"_____no_output_____"
],
[
"def model1(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.001\n epsilon = None\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel1(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 31s 618us/step - loss: 0.7807 - acc: 0.6350 - val_loss: 0.6618 - val_acc: 0.7043\nEpoch 2/5\n50712/50712 [==============================] - 31s 621us/step - loss: 0.5936 - acc: 0.7435 - val_loss: 0.6160 - val_acc: 0.7366\nEpoch 3/5\n50712/50712 [==============================] - 31s 602us/step - loss: 0.5211 - acc: 0.7807 - val_loss: 0.6024 - val_acc: 0.7462\nEpoch 4/5\n50712/50712 [==============================] - 32s 624us/step - loss: 0.4506 - acc: 0.8161 - val_loss: 0.6097 - val_acc: 0.7531\nEpoch 5/5\n50712/50712 [==============================] - 30s 597us/step - loss: 0.3870 - acc: 0.8419 - val_loss: 0.6486 - val_acc: 0.7542\n"
],
[
"def model2(X_train, y_train): \n features = 3000\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.001\n epsilon = None\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel2(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 26s 513us/step - loss: 0.7911 - acc: 0.6289 - val_loss: 0.6542 - val_acc: 0.7150\nEpoch 2/5\n50712/50712 [==============================] - 26s 512us/step - loss: 0.6067 - acc: 0.7405 - val_loss: 0.6259 - val_acc: 0.7315\nEpoch 3/5\n50712/50712 [==============================] - 27s 528us/step - loss: 0.5324 - acc: 0.7780 - val_loss: 0.6173 - val_acc: 0.7429\nEpoch 4/5\n50712/50712 [==============================] - 27s 528us/step - loss: 0.4668 - acc: 0.8118 - val_loss: 0.6237 - val_acc: 0.7496\nEpoch 5/5\n50712/50712 [==============================] - 26s 507us/step - loss: 0.4003 - acc: 0.8408 - val_loss: 0.6544 - val_acc: 0.7478\n"
],
[
"def model3(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = None\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel_final = model3(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 32s 631us/step - loss: 0.7493 - acc: 0.6528 - val_loss: 0.6321 - val_acc: 0.7244\nEpoch 2/5\n50712/50712 [==============================] - 32s 626us/step - loss: 0.5775 - acc: 0.7510 - val_loss: 0.5998 - val_acc: 0.7400\nEpoch 3/5\n50712/50712 [==============================] - 30s 587us/step - loss: 0.4820 - acc: 0.7993 - val_loss: 0.5909 - val_acc: 0.7571\nEpoch 4/5\n50712/50712 [==============================] - 29s 568us/step - loss: 0.3842 - acc: 0.8428 - val_loss: 0.6090 - val_acc: 0.7588\nEpoch 5/5\n50712/50712 [==============================] - 28s 561us/step - loss: 0.3088 - acc: 0.8778 - val_loss: 0.6868 - val_acc: 0.7567\n"
],
[
"def model4(X_train, y_train): \n features = 5000\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 2\n lr = 0.005\n epsilon = None\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel4(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/2\n50712/50712 [==============================] - 48s 940us/step - loss: 0.7415 - acc: 0.6609 - val_loss: 0.6593 - val_acc: 0.6978\nEpoch 2/2\n50712/50712 [==============================] - 46s 916us/step - loss: 0.5386 - acc: 0.7729 - val_loss: 0.6040 - val_acc: 0.7464\n"
],
[
"def model5(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-5\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel5(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 35s 682us/step - loss: 0.7565 - acc: 0.6494 - val_loss: 0.6422 - val_acc: 0.7177\nEpoch 2/5\n50712/50712 [==============================] - 33s 641us/step - loss: 0.5799 - acc: 0.7501 - val_loss: 0.6225 - val_acc: 0.7390\nEpoch 3/5\n50712/50712 [==============================] - 31s 617us/step - loss: 0.4789 - acc: 0.7993 - val_loss: 0.6002 - val_acc: 0.7528\nEpoch 4/5\n50712/50712 [==============================] - 30s 585us/step - loss: 0.3826 - acc: 0.8435 - val_loss: 0.6432 - val_acc: 0.7555\nEpoch 5/5\n50712/50712 [==============================] - 29s 579us/step - loss: 0.3044 - acc: 0.8781 - val_loss: 0.7099 - val_acc: 0.7567\n"
],
[
"def model6(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-8\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel6(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 31s 611us/step - loss: 0.7512 - acc: 0.6506 - val_loss: 0.6474 - val_acc: 0.7154\nEpoch 2/5\n50712/50712 [==============================] - 30s 588us/step - loss: 0.5740 - acc: 0.7541 - val_loss: 0.6135 - val_acc: 0.7429\nEpoch 3/5\n50712/50712 [==============================] - 30s 584us/step - loss: 0.4728 - acc: 0.8011 - val_loss: 0.6122 - val_acc: 0.7555\nEpoch 4/5\n50712/50712 [==============================] - 30s 583us/step - loss: 0.3772 - acc: 0.8446 - val_loss: 0.6569 - val_acc: 0.7601\nEpoch 5/5\n50712/50712 [==============================] - 29s 575us/step - loss: 0.2982 - acc: 0.8799 - val_loss: 0.7419 - val_acc: 0.7622\n"
],
[
"def model7(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 6\n lr = 0.002\n epsilon = 1e-8\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\n#model7(X_train, y_train)",
"_____no_output_____"
],
[
"def model8(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-9\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel8(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 31s 604us/step - loss: 0.7501 - acc: 0.6550 - val_loss: 0.6384 - val_acc: 0.7169\nEpoch 2/5\n50712/50712 [==============================] - 30s 585us/step - loss: 0.5578 - acc: 0.7635 - val_loss: 0.6035 - val_acc: 0.7446\nEpoch 3/5\n50712/50712 [==============================] - 31s 612us/step - loss: 0.4481 - acc: 0.8158 - val_loss: 0.6171 - val_acc: 0.7535\nEpoch 4/5\n50712/50712 [==============================] - 30s 596us/step - loss: 0.3473 - acc: 0.8606 - val_loss: 0.6674 - val_acc: 0.7544\nEpoch 5/5\n50712/50712 [==============================] - 29s 579us/step - loss: 0.2718 - acc: 0.8933 - val_loss: 0.7278 - val_acc: 0.7583\n"
],
[
"def model9(X_train, y_train): \n features = 3500\n shuffle = False\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-9\n validation = 0.1\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel9(X_train, y_train)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/5\n50712/50712 [==============================] - 32s 627us/step - loss: 0.7528 - acc: 0.6510 - val_loss: 0.6414 - val_acc: 0.7235\nEpoch 2/5\n50712/50712 [==============================] - 31s 604us/step - loss: 0.5797 - acc: 0.7487 - val_loss: 0.6163 - val_acc: 0.7363\nEpoch 3/5\n50712/50712 [==============================] - 30s 594us/step - loss: 0.4815 - acc: 0.7966 - val_loss: 0.6467 - val_acc: 0.7448\nEpoch 4/5\n50712/50712 [==============================] - 30s 584us/step - loss: 0.3849 - acc: 0.8397 - val_loss: 0.7066 - val_acc: 0.7491\nEpoch 5/5\n50712/50712 [==============================] - 29s 572us/step - loss: 0.3099 - acc: 0.8756 - val_loss: 0.7812 - val_acc: 0.7516\n"
],
[
"def model10(X_train, y_train): \n features = 3500\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-9\n validation = 0.2\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel10(X_train, y_train)",
"Train on 45077 samples, validate on 11270 samples\nEpoch 1/5\n45077/45077 [==============================] - 29s 638us/step - loss: 0.7670 - acc: 0.6466 - val_loss: 0.6403 - val_acc: 0.7220\nEpoch 2/5\n45077/45077 [==============================] - 28s 613us/step - loss: 0.5729 - acc: 0.7542 - val_loss: 0.6180 - val_acc: 0.7405\nEpoch 3/5\n45077/45077 [==============================] - 27s 603us/step - loss: 0.4638 - acc: 0.8089 - val_loss: 0.6058 - val_acc: 0.7507\nEpoch 4/5\n45077/45077 [==============================] - 27s 594us/step - loss: 0.3630 - acc: 0.8544 - val_loss: 0.6533 - val_acc: 0.7577\nEpoch 5/5\n45077/45077 [==============================] - 26s 587us/step - loss: 0.2856 - acc: 0.8895 - val_loss: 0.7454 - val_acc: 0.7541\n"
],
[
"def model11(X_train, y_train): \n features = 3000\n shuffle = True\n drop = 0.5\n layer1 = 512\n layer2 = 256\n epoch = 5\n lr = 0.002\n epsilon = 1e-9\n validation = 0.2\n X_train_mod = tokenization_tweets(X_train, features)\n model = train(X_train_mod, y_train, features, shuffle, drop, layer1, layer2, epoch, lr, epsilon, validation)\n return model\nmodel11(X_train, y_train)",
"Train on 45077 samples, validate on 11270 samples\nEpoch 1/5\n45077/45077 [==============================] - 24s 536us/step - loss: 0.7652 - acc: 0.6451 - val_loss: 0.6463 - val_acc: 0.7149\nEpoch 2/5\n45077/45077 [==============================] - 23s 520us/step - loss: 0.5918 - acc: 0.7443 - val_loss: 0.6155 - val_acc: 0.7323\nEpoch 3/5\n45077/45077 [==============================] - 23s 509us/step - loss: 0.4941 - acc: 0.7930 - val_loss: 0.6149 - val_acc: 0.7403\nEpoch 4/5\n45077/45077 [==============================] - 23s 505us/step - loss: 0.4027 - acc: 0.8348 - val_loss: 0.6474 - val_acc: 0.7509\nEpoch 5/5\n45077/45077 [==============================] - 23s 499us/step - loss: 0.3196 - acc: 0.8720 - val_loss: 0.7078 - val_acc: 0.7446\n"
],
[
"def save_model(model):\n model_json = model.to_json()\n with open('model.json', 'w') as json_file:\n json_file.write(model_json)\n\n model.save_weights('model.h5')\n\nmodel_final = model7(X_train, y_train)\nsave_model(model_final)",
"Train on 50712 samples, validate on 5635 samples\nEpoch 1/6\n50712/50712 [==============================] - 31s 605us/step - loss: 0.7494 - acc: 0.6563 - val_loss: 0.6383 - val_acc: 0.7184\nEpoch 2/6\n50712/50712 [==============================] - 30s 592us/step - loss: 0.5682 - acc: 0.7579 - val_loss: 0.6104 - val_acc: 0.7372\nEpoch 3/6\n50712/50712 [==============================] - 30s 585us/step - loss: 0.4699 - acc: 0.8040 - val_loss: 0.6132 - val_acc: 0.7473\nEpoch 4/6\n50712/50712 [==============================] - 29s 577us/step - loss: 0.3716 - acc: 0.8485 - val_loss: 0.6809 - val_acc: 0.7500\nEpoch 5/6\n50712/50712 [==============================] - 29s 571us/step - loss: 0.2952 - acc: 0.8828 - val_loss: 0.7199 - val_acc: 0.7595\nEpoch 6/6\n50712/50712 [==============================] - 29s 571us/step - loss: 0.2402 - acc: 0.9057 - val_loss: 0.8126 - val_acc: 0.7618\n"
],
[
"if __name__ == \"__main__\":\n tabletweetsnew = \"tweets_predict_avengers\"\n tweet_table_new = query_database(tabletweetsnew)\n tweet_table_new = cleaning_table(tweet_table_new)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n X_new = tokenization_tweets(tweet_table_new.tweet, 3500)\n new_prediction = model_final.predict(X_new)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n labels = ['Negative', 'Neutral', 'Positive']\n sentiments = [labels[np.argmax(pred)] for pred in new_prediction]\n tweet_table_new[\"sentiment\"] = sentiments",
"_____no_output_____"
],
[
"sizes = [sentiments.count('Negative'), sentiments.count('Neutral'), sentiments.count('Positive')]\nexplode = (0, 0, 0.1)\nlabels = 'Negative', 'Neutral', 'Positive'\nplt.figure(figsize=(5,5))\nplt.pie(sizes, explode=explode, colors=\"bwr\", labels=labels, autopct='%1.1f%%',\n shadow=True, startangle=90, wedgeprops={'alpha':0.8})\nplt.axis('equal') \nplt.show()",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n engine = create_engine(\"postgresql+psycopg2://%s:%s@%s:%d/%s\" %(usertwitter, passwordtwitter, hosttwitter, porttwitter, dbnametwitter))\n tweet_table_new.to_sql(\"tweets_avengers_new_labeled\", con=engine, if_exists=\"append\") ",
"_____no_output_____"
]
],
[
[
" ",
"_____no_output_____"
],
[
" ",
"_____no_output_____"
],
[
"### Extra analysis for interaction network",
"_____no_output_____"
]
],
[
[
"if __name__ == \"__main__\":\n tweet_table_interaction = pd.read_csv(\"tweets_final.csv\")\n tweet_table_interaction.rename(columns = {\"text\": \"tweet\"}, inplace=True)\n tweet_table_interaction = cleaning_table(tweet_table_interaction) \n X_interaction = tokenization_tweets(tweet_table_interaction.tweet, 3500)",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n # Open json file of saved model\n json_file = open('model.json', 'r')\n loaded_model_json = json_file.read()\n json_file.close()\n # Create a model \n model = model_from_json(loaded_model_json)\n # Weight nodes with saved values\n model.load_weights('model.h5')",
"_____no_output_____"
],
[
"if __name__ == \"__main__\":\n int_prediction = model.predict(X_interaction)\n labels = ['Negative', 'Neutral', 'Positive']\n sentiments = [labels[np.argmax(pred)] for pred in int_prediction]\n tweet_table_interaction[\"sentiment\"] = sentiments",
"_____no_output_____"
],
[
"tweet_table_interaction.to_csv(\"tweets_final_sentiment.csv\")",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
]
] |
d07e38cb7c6d2c5a86cc84301763e50226b7c038 | 2,096 | ipynb | Jupyter Notebook | pull.ipynb | EHubert/PythonBacktesting | d2178ee5b66efa35dacf3c0eff131db1b3028991 | [
"MIT"
] | 1 | 2022-02-12T05:19:49.000Z | 2022-02-12T05:19:49.000Z | pull.ipynb | EHubert/PythonBacktesting | d2178ee5b66efa35dacf3c0eff131db1b3028991 | [
"MIT"
] | 5 | 2020-03-31T11:21:31.000Z | 2021-12-13T20:35:46.000Z | pull.ipynb | EHubert/PythonBacktesting | d2178ee5b66efa35dacf3c0eff131db1b3028991 | [
"MIT"
] | 2 | 2021-10-16T16:57:39.000Z | 2022-02-17T22:50:53.000Z | 26.2 | 244 | 0.538168 | [
[
[
"import pandas as pd\nfrom alpha_vantage.timeseries import TimeSeries\nimport random\n\nticker = \"AAPL\"\n\n#lines = open(\"keys.txt\").read().splitlines()\n#keys = random.choice(lines)\n\n# Avoid naming the TimeSeries instance 'time', which shadows the stdlib module\nts = TimeSeries(key=\"SUMO25VBT2SJ5HQA\", output_format=\"pandas\")\ndata = ts.get_intraday(symbol=ticker, interval=\"1min\", outputsize=\"full\")\n\n# print(ticker)\n# print(data)\ndf = pd.DataFrame(data[0])\nprint(data[1])\n",
"{'1. Information': 'Intraday (1min) open, high, low, close prices and volume', '2. Symbol': 'AAPL', '3. Last Refreshed': '2020-02-25 10:49:00', '4. Interval': '1min', '5. Output Size': 'Full size', '6. Time Zone': 'US/Eastern'}\n"
],
[
"df[\"4. close\"].value_counts()",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code"
]
] |
d07e414e22f32e2ed3706ca2cffffe0523377ba7 | 52,024 | ipynb | Jupyter Notebook | NaiveBayes/NB_9_2_2_QA_Sentiment_Analysis.ipynb | jeffreyablack1/QAEvaluationNLP | 0c5297fd29cf51316d04fdc50510741a754c0401 | [
"MIT"
] | null | null | null | NaiveBayes/NB_9_2_2_QA_Sentiment_Analysis.ipynb | jeffreyablack1/QAEvaluationNLP | 0c5297fd29cf51316d04fdc50510741a754c0401 | [
"MIT"
] | null | null | null | NaiveBayes/NB_9_2_2_QA_Sentiment_Analysis.ipynb | jeffreyablack1/QAEvaluationNLP | 0c5297fd29cf51316d04fdc50510741a754c0401 | [
"MIT"
] | null | null | null | 55.051852 | 8,410 | 0.592265 | [
[
[
"# QA Sentiment Analysis: Critical Thinking 9.2.2\n# Naive Bayes\n### By JEFFREY BLACK",
"_____no_output_____"
],
[
"# Introduction",
"_____no_output_____"
],
[
"Question: \"How does increasing the sample size affect a t test? Why does it affect a t test in this manner?\"",
"_____no_output_____"
],
[
"Answer: \"In the long run, it means that the obtained t is more likely to be significant. In terms of the formula used to calculate t, increasing the sample size will decrease the standard error of the difference between means. This , in turn, will increase the size of the obtained t. A larger obtained t means that the obtained value is more likely to exceed the critical value and be significant.\"",
"_____no_output_____"
],
[
"***",
"_____no_output_____"
],
[
"# Importing Packages",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport os\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nfrom sklearn.naive_bayes import MultinomialNB\nfrom sklearn.metrics import (f1_score,precision_score,recall_score, confusion_matrix)\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.model_selection import RandomizedSearchCV, train_test_split # for hyperparameter tuning",
"_____no_output_____"
]
],
[
[
"***",
"_____no_output_____"
],
[
"# Loading and Preprocessing the Data",
"_____no_output_____"
]
],
[
[
"CTC_9_2_2 = pd.read_excel(\"/Users/jeffreyblack/Desktop/NLPProject/QA_CTC.xlsx\", sheet_name = 'CTC_9_2_2')\nCTC_9_2_2",
"_____no_output_____"
],
[
"X_train, X_test, y_train, y_test = train_test_split(CTC_9_2_2['Answers'] , CTC_9_2_2['Grade'], test_size=0.20, random_state=42)\n",
"_____no_output_____"
]
],
[
[
"***",
"_____no_output_____"
],
[
"# Feature Extraction\n### Convert answers into vectors using a TF-IDF bag-of-words model\nNote: I did not remove stop-words",
"_____no_output_____"
]
],
[
[
"def extract_features(x_train, x_test):\n # This function extracts TF-IDF document features for the training and test documents\n # Source:\n # https://scikit-learn.org/stable/modules/generated/sklearn.feature_extraction.text.TfidfVectorizer.html\n\n vectorizer = TfidfVectorizer(max_features=10000, ngram_range=(1, 3))\n train = vectorizer.fit_transform(x_train)\n test = vectorizer.transform(x_test)\n print(vectorizer.get_feature_names())\n return train, test",
"_____no_output_____"
]
],
[
[
"Calling the TF-IDF Vectorizer to extract the features for the training and test predictors.",
"_____no_output_____"
]
],
[
[
"feats_train, feats_test = extract_features(X_train, X_test) # training and test set features",
"['affect', 'affect the', 'affect the denominator', 'affects', 'affects the', 'affects the or', 'affects the standard', 'affects the test', 'and', 'and be', 'and be significant', 'and increase', 'and increase the', 'and the', 'and the more', 'and when', 'and when you', 'are', 'are and', 'are and when', 'as', 'as the', 'as the sample', 'bar1', 'bar1 bar2', 'bar1 bar2 this', 'bar2', 'bar2 this', 'bar2 this would', 'be', 'be significant', 'be significant in', 'be significant when', 'because', 'because it', 'because it will', 'become', 'become more', 'become more significant', 'being', 'being significantly', 'being significantly larger', 'between', 'between means', 'between means in', 'between means this', 'by', 'by either', 'by either increasing', 'calculate', 'calculate increasing', 'calculate increasing the', 'can', 'can increase', 'can increase power', 'chances', 'chances of', 'chances of the', 'critical', 'critical score', 'critical score and', 'critical value', 'critical value and', 'decrease', 'decrease sx', 'decrease sx bar1', 'decrease the', 'decrease the denominator', 'decrease the standard', 'decreases', 'decreases it', 'decreases it which', 'decreasing', 'decreasing the', 'decreasing the number', 'degrees', 'degrees of', 'degrees of freedom', 'denominator', 'denominator in', 'denominator in the', 'denominator which', 'denominator which would', 'determines', 'determines the', 'determines the degrees', 'difference', 'difference between', 'difference between means', 'distribution', 'either', 'either increasing', 'either increasing or', 'equation', 'equation for', 'equation for the', 'error', 'error and', 'error and increase', 'error in', 'error in that', 'error of', 'error of the', 'exceed', 'exceed the', 'exceed the critical', 'fact', 'fact it', 'fact it would', 'fail', 'fail to', 'fail to reject', 'for', 'for test', 'for test determines', 'for that', 'for that test', 'for the', 'for the score', 'formula', 'formula used', 'formula used to', 'freedom', 
'freedom for', 'freedom for that', 'gets', 'gets larger', 'gets larger the', 'going', 'going to', 'going to be', 'happens', 'happens it', 'happens it is', 'hypothesis', 'hypothesis less', 'hypothesis less likely', 'hypothesis thus', 'hypothesis thus the', 'if', 'if you', 'if you increase', 'in', 'in fact', 'in fact it', 'in terms', 'in terms of', 'in that', 'in that it', 'in the', 'in the equation', 'in the long', 'in turn', 'in turn affects', 'in turn will', 'increase', 'increase power', 'increase power because', 'increase the', 'increase the chances', 'increase the same', 'increase the sample', 'increase the size', 'increases', 'increases the', 'increases the obtained', 'increases therefore', 'increases therefore we', 'increasing', 'increasing or', 'increasing or decreasing', 'increasing the', 'increasing the sample', 'is', 'is larger', 'is larger it', 'is more', 'is more likely', 'is most', 'is most likely', 'it', 'it affects', 'it affects the', 'it decreases', 'it decreases it', 'it increases', 'it increases the', 'it is', 'it is most', 'it means', 'it means that', 'it which', 'it which in', 'it will', 'it will become', 'it will decrease', 'it will make', 'it would', 'it would decrease', 'larger', 'larger it', 'larger it will', 'larger means', 'larger means that', 'larger obtained', 'larger obtained means', 'larger the', 'larger the critical', 'larger the value', 'less', 'less likely', 'less likely to', 'likely', 'likely going', 'likely going to', 'likely to', 'likely to be', 'likely to exceed', 'likely to fail', 'likely to reject', 'long', 'long run', 'long run it', 'make', 'make the', 'make the sample', 'make the sd', 'means', 'means in', 'means in turn', 'means that', 'means that the', 'means this', 'means this in', 'more', 'more likely', 'more likely to', 'more significant', 'more significant the', 'most', 'most likely', 'most likely going', 'narrows', 'narrows the', 'narrows the distribution', 'null', 'null hypothesis', 'null hypothesis less', 'null 
hypothesis thus', 'number', 'obtained', 'obtained is', 'obtained is more', 'obtained larger', 'obtained larger means', 'obtained larger obtained', 'obtained means', 'obtained means that', 'obtained value', 'obtained value is', 'obtained value when', 'of', 'of freedom', 'of freedom for', 'of the', 'of the difference', 'of the formula', 'of the obtained', 'of the score', 'of the test', 'or', 'or decreasing', 'or decreasing the', 'or value', 'power', 'power because', 'power because it', 'power of', 'power of the', 'reject', 'reject the', 'reject the null', 'results', 'results are', 'results are and', 'run', 'run it', 'run it means', 'same', 'same size', 'same size it', 'sample', 'sample size', 'sample size can', 'sample size for', 'sample size gets', 'sample size it', 'sample size narrows', 'sample size will', 'sample statistic', 'sample statistic smaller', 'score', 'score and', 'score and the', 'score being', 'score being significantly', 'score in', 'score in fact', 'sd', 'sd increase', 'sd increase the', 'significant', 'significant in', 'significant in terms', 'significant the', 'significant the results', 'significant when', 'significant when the', 'significantly', 'significantly larger', 'size', 'size can', 'size can increase', 'size for', 'size for test', 'size gets', 'size gets larger', 'size it', 'size it affects', 'size it increases', 'size narrows', 'size narrows the', 'size of', 'size of the', 'size will', 'size will decrease', 'smaller', 'smaller it', 'smaller it will', 'specifies', 'specifies the', 'specifies the distribution', 'standard', 'standard error', 'standard error and', 'standard error in', 'standard error of', 'statistic', 'statistic smaller', 'statistic smaller it', 'sx', 'sx bar1', 'sx bar1 bar2', 'terms', 'terms of', 'terms of the', 'test', 'test by', 'test by either', 'test determines', 'test determines the', 'test increases', 'test which', 'test which specifies', 'test will', 'test will increase', 'that', 'that it', 'that it decreases', 'that 
test', 'that test which', 'that the', 'that the obtained', 'the', 'the chances', 'the chances of', 'the critical', 'the critical score', 'the critical value', 'the degrees', 'the degrees of', 'the denominator', 'the denominator in', 'the denominator which', 'the difference', 'the difference between', 'the distribution', 'the equation', 'the equation for', 'the formula', 'the formula used', 'the larger', 'the larger the', 'the long', 'the long run', 'the more', 'the more significant', 'the null', 'the null hypothesis', 'the number', 'the obtained', 'the obtained is', 'the obtained larger', 'the obtained value', 'the or', 'the or value', 'the power', 'the power of', 'the results', 'the results are', 'the same', 'the same size', 'the sample', 'the sample size', 'the sample statistic', 'the score', 'the score being', 'the score in', 'the sd', 'the sd increase', 'the size', 'the size of', 'the standard', 'the standard error', 'the test', 'the test by', 'the test increases', 'the test will', 'the value', 'the value increases', 'the value the', 'the will', 'the will be', 'therefore', 'therefore we', 'therefore we will', 'this', 'this happens', 'this happens it', 'this in', 'this in turn', 'this would', 'this would affect', 'thus', 'thus the', 'thus the power', 'to', 'to be', 'to be significant', 'to calculate', 'to calculate increasing', 'to exceed', 'to exceed the', 'to fail', 'to fail to', 'to reject', 'to reject the', 'turn', 'turn affects', 'turn affects the', 'turn will', 'turn will increase', 'used', 'used to', 'used to calculate', 'value', 'value and', 'value and be', 'value increases', 'value increases therefore', 'value is', 'value is larger', 'value is more', 'value the', 'value the larger', 'value when', 'value when this', 'we', 'we will', 'we will more', 'when', 'when the', 'when the size', 'when this', 'when this happens', 'when you', 'when you increase', 'which', 'which in', 'which in turn', 'which specifies', 'which specifies the', 'which would', 'which 
would increase', 'will', 'will be', 'will be significant', 'will become', 'will become more', 'will decrease', 'will decrease sx', 'will decrease the', 'will increase', 'will increase the', 'will make', 'will make the', 'will more', 'will more likely', 'would', 'would affect', 'would affect the', 'would decrease', 'would decrease the', 'would increase', 'would increase the', 'you', 'you increase', 'you increase the']\n"
]
],
[
[
"***",
"_____no_output_____"
],
[
"# Model Training: Naive Bayes\n### Fit the training data using Multinomial Naive Bayes classifier ",
"_____no_output_____"
]
],
[
[
"def build_NB_classifier(x, y):\n # This function builds a Multinomial Naive Bayes classifier with input (x,y):\n # Source: \n # https://scikit-learn.org/stable/modules/generated/sklearn.naive_bayes.MultinomialNB.html\n\n clf = MultinomialNB()\n clf.fit(x, y)\n return clf",
"_____no_output_____"
],
[
"nb_clf = build_NB_classifier(feats_train, y_train)",
"_____no_output_____"
]
],
[
[
"## Hyperparameter Tuning\nI decided to use Random Search Cross Validation in Scikit-Learn to determine the best hyperparameters for tuning the Naive Bayes classifier model. RandomizedSearchCV let me define a grid of hyperparameter ranges and randomly sample combinations from that grid, performing K-fold cross validation on each sampled combination of values.",
"_____no_output_____"
]
],
[
[
"# Additive (Laplace/Lidstone) smoothing parameter (0 for no smoothing).\nalpha = [0, 1.0]\n# Whether to learn class prior probabilities or not. If false, a uniform prior will be used.\nfit_prior = [True, False]\n# Prior probabilities of the classes. If specified the priors are not adjusted according to the data.\nclass_prior = [None, [0.05, 0.95],[0.1, 0.9],[0.2, 0.8],[0.25, 0.85], [0.3, 0.7],[0.35, 0.75], [0.4, 0.6],[0.45, 0.65]]\n# Create the random grid\nrandom_grid = {'alpha': alpha,\n 'fit_prior': fit_prior,\n 'class_prior': class_prior}\nprint(random_grid)",
"{'alpha': [0, 1.0], 'fit_prior': [True, False], 'class_prior': [None, [0.05, 0.95], [0.1, 0.9], [0.2, 0.8], [0.25, 0.85], [0.3, 0.7], [0.35, 0.75], [0.4, 0.6], [0.45, 0.65]]}\n"
],
[
"# Use the random grid to search for best hyperparameters\n# First create the base model to tune\nnb = MultinomialNB()\n# Random search of parameters, using 3 fold cross validation, \n# search across 100 different combinations, and use all available cores\nnb_random = RandomizedSearchCV(estimator = nb, param_distributions = random_grid, cv=3, \n scoring='f1_weighted', n_iter=1000, return_train_score = True)\n# Fit the random search model\nnb_random.fit(feats_train, y_train)\n",
"/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/model_selection/_search.py:278: UserWarning: The total space of parameters 36 is smaller than n_iter=1000. Running 36 iterations. For exhaustive searches, use GridSearchCV.\n warnings.warn(\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, 
'\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n/Users/jeffreyblack/opt/anaconda3/lib/python3.8/site-packages/sklearn/naive_bayes.py:511: UserWarning: alpha too small will result in numeric errors, setting alpha = 1.0e-10\n warnings.warn('alpha too small will result in numeric errors, '\n"
],
[
"# finding the best parameters\nnb_random.best_params_",
"_____no_output_____"
]
],
[
[
"Using the output above, I tuned the Multinomial Naive Bayes classifier below. ",
"_____no_output_____"
]
],
[
[
"def build_NB_classifier_tuned(x, y):\n # This function builds a Multinomial Naive Bayes classifier with input (x,y):\n # Source: \n # https://scikit-learn.org/stable/modules/generated/sklearn.naive_bayes.MultinomialNB.html\n\n clf = MultinomialNB(fit_prior = False, class_prior = None, alpha = 1.0)\n clf.fit(x, y)\n return clf",
"_____no_output_____"
],
[
"nb_clf_tuned = build_NB_classifier_tuned(feats_train, y_train)",
"_____no_output_____"
]
],
[
[
"***",
"_____no_output_____"
],
[
"# Model Evaluation Functions \nI used 3 evaluation metrics: recall, precision, and F1-score. I also used a confusion matrix to visualize false positives, false negatives, true positives, and true negatives.",
"_____no_output_____"
]
],
[
[
"def recall_evaluator(x, y_truth, clf):\n # Function to evaluate model performance, using recall:\n # Source: \n # https://scikit-learn.org/stable/modules/generated/sklearn.metrics.recall_score.html#sklearn.metrics.recall_score\n result = 0.0\n result = recall_score(y_true = y_truth, y_pred = clf.predict(x), average='weighted')\n \n return result",
"_____no_output_____"
],
[
"def precision_evaluator(x, y_truth, clf):\n # Function to evaluate model performance, using precision:\n # Source: \n # https://scikit-learn.org/stable/modules/generated/sklearn.metrics.precision_score.html#sklearn.metrics.precision_score\n result = 0.0\n result = precision_score(y_true = y_truth, y_pred = clf.predict(x), average='weighted')\n \n return result",
"_____no_output_____"
],
[
"def f1_evaluator(x, y_truth, clf):\n # Function to evaluate model performance, using F1-score: \n # Source: \n # https://scikit-learn.org/stable/modules/generated/sklearn.metrics.f1_score.html#sklearn.metrics.f1_score\n result = 0.0\n result = f1_score(y_true = y_truth, y_pred = clf.predict(x), average='weighted')\n \n return result",
"_____no_output_____"
]
],
[
[
"***",
"_____no_output_____"
],
[
"## Summary Results of Naive Bayes",
"_____no_output_____"
],
[
"### Original model evaluation: ",
"_____no_output_____"
]
],
[
[
"recall_nb_score = recall_evaluator(feats_test, y_test, nb_clf)\nprecision_nb_score = precision_evaluator(feats_test, y_test, nb_clf)\nf1_nb_score = f1_evaluator(feats_test, y_test, nb_clf)\npred_nb = nb_clf.predict(feats_test)",
"_____no_output_____"
],
[
"print('Naive Bayes Recall: ', recall_nb_score)\nprint('Naive Bayes Precision: ', precision_nb_score)\nprint('Naive Bayes F1: ', f1_nb_score)\nprint(\"Confusion Matrix for Naive Bayes Classifier:\")\nprint(confusion_matrix(y_test, pred_nb))",
"Naive Bayes Recall: 1.0\nNaive Bayes Precision: 1.0\nNaive Bayes F1: 1.0\nConfusion Matrix for Naive Bayes Classifier:\n[[4 0]\n [0 2]]\n"
]
],
[
[
"### After hyperparameter tuning: ",
"_____no_output_____"
]
],
[
[
"recall_nb_tuned_score = recall_evaluator(feats_test, y_test, nb_clf_tuned)\nprecision_nb_tuned_score = precision_evaluator(feats_test, y_test, nb_clf_tuned)\nf1_nb_tuned_score = f1_evaluator(feats_test, y_test, nb_clf_tuned)\npred_nb_tuned = nb_clf_tuned.predict(feats_test)",
"_____no_output_____"
],
[
"print('Naive Bayes Recall: ', recall_nb_tuned_score)\nprint('Naive Bayes Precision: ', precision_nb_tuned_score)\nprint('Naive Bayes F1: ', f1_nb_tuned_score)\nprint(\"Confusion Matrix for Naive Bayes Classifier:\")\nprint(confusion_matrix(y_test, pred_nb_tuned))",
"Naive Bayes Recall: 1.0\nNaive Bayes Precision: 1.0\nNaive Bayes F1: 1.0\nConfusion Matrix for Naive Bayes Classifier:\n[[4 0]\n [0 2]]\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
d07e431822c0c8b1ea76171d8375f8ffc235b331 | 792,931 | ipynb | Jupyter Notebook | examples/notebooks/atomai_atomstat.ipynb | aghosh92/atomai | 9a9fd7a4cff7dc9b5af9fcc5b1a4b3894c9df685 | [
"MIT"
] | null | null | null | examples/notebooks/atomai_atomstat.ipynb | aghosh92/atomai | 9a9fd7a4cff7dc9b5af9fcc5b1a4b3894c9df685 | [
"MIT"
] | null | null | null | examples/notebooks/atomai_atomstat.ipynb | aghosh92/atomai | 9a9fd7a4cff7dc9b5af9fcc5b1a4b3894c9df685 | [
"MIT"
] | null | null | null | 1,648.505198 | 145,510 | 0.957764 | [
[
[
"<a href=\"https://colab.research.google.com/github/ziatdinovmax/atomai/blob/master/atomai_atomstat.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# Multivariate analysis of ferroic distortions with *atomstat* module\n\nPrepared by Maxim Ziatdinov\n\nE-mail: [email protected]",
"_____no_output_____"
],
[
"In this notebook we show how the atomic coordinates derived via a pre-trained neural network from the atom-resolved image can be used to explore the extant atomic displacement patterns in the material and build the collection of the building blocks for the distorted lattice. For more details see our paper in Appl. Phys. Lett. 115, 052902 (2019).",
"_____no_output_____"
],
[
"## Install AtomAI",
"_____no_output_____"
],
[
"Installation:",
"_____no_output_____"
]
],
[
[
"!pip install atomai",
"Collecting atomai\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/ec/96/052c840e2bf8a28f3efca99853ce719ab40daad5163ea08df45330bacbfc/atomai-0.6.0-py3-none-any.whl (104kB)\n\u001b[K |████████████████████████████████| 112kB 6.0MB/s \n\u001b[?25hRequirement already satisfied: torch>=1.0.0 in /usr/local/lib/python3.6/dist-packages (from atomai) (1.7.0+cu101)\nRequirement already satisfied: numpy>=1.18.5 in /usr/local/lib/python3.6/dist-packages (from atomai) (1.18.5)\nRequirement already satisfied: scipy>=1.3.0 in /usr/local/lib/python3.6/dist-packages (from atomai) (1.4.1)\nCollecting mendeleev>=0.6.0\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/f0/75/5863bb298aa1390cb9ecb0548a62b8213ef085273f9d3c73e513b9c36214/mendeleev-0.6.1.tar.gz (193kB)\n\u001b[K |████████████████████████████████| 194kB 18.3MB/s \n\u001b[?25hRequirement already satisfied: scikit-learn>=0.22.1 in /usr/local/lib/python3.6/dist-packages (from atomai) (0.22.2.post1)\nRequirement already satisfied: networkx>=2.5 in /usr/local/lib/python3.6/dist-packages (from atomai) (2.5)\nRequirement already satisfied: opencv-python>=4.1.0 in /usr/local/lib/python3.6/dist-packages (from atomai) (4.1.2.30)\nRequirement already satisfied: scikit-image==0.16.2 in /usr/local/lib/python3.6/dist-packages (from atomai) (0.16.2)\nRequirement already satisfied: typing-extensions in /usr/local/lib/python3.6/dist-packages (from torch>=1.0.0->atomai) (3.7.4.3)\nRequirement already satisfied: dataclasses in /usr/local/lib/python3.6/dist-packages (from torch>=1.0.0->atomai) (0.8)\nRequirement already satisfied: future in /usr/local/lib/python3.6/dist-packages (from torch>=1.0.0->atomai) (0.16.0)\nRequirement already satisfied: pandas in /usr/local/lib/python3.6/dist-packages (from mendeleev>=0.6.0->atomai) (1.1.4)\nRequirement already satisfied: sqlalchemy>=1.3.0 in /usr/local/lib/python3.6/dist-packages (from mendeleev>=0.6.0->atomai) (1.3.20)\nCollecting colorama\n Downloading 
https://files.pythonhosted.org/packages/44/98/5b86278fbbf250d239ae0ecb724f8572af1c91f4a11edf4d36a206189440/colorama-0.4.4-py2.py3-none-any.whl\nCollecting pyfiglet\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/33/07/fcfdd7a2872f5b348953de35acce1544dab0c1e8368dca54279b1cde5c15/pyfiglet-0.8.post1-py2.py3-none-any.whl (865kB)\n\u001b[K |████████████████████████████████| 870kB 26.2MB/s \n\u001b[?25hRequirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.6/dist-packages (from scikit-learn>=0.22.1->atomai) (0.17.0)\nRequirement already satisfied: decorator>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from networkx>=2.5->atomai) (4.4.2)\nRequirement already satisfied: imageio>=2.3.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image==0.16.2->atomai) (2.4.1)\nRequirement already satisfied: matplotlib!=3.0.0,>=2.0.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image==0.16.2->atomai) (3.2.2)\nRequirement already satisfied: pillow>=4.3.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image==0.16.2->atomai) (7.0.0)\nRequirement already satisfied: PyWavelets>=0.4.0 in /usr/local/lib/python3.6/dist-packages (from scikit-image==0.16.2->atomai) (1.1.1)\nRequirement already satisfied: pytz>=2017.2 in /usr/local/lib/python3.6/dist-packages (from pandas->mendeleev>=0.6.0->atomai) (2018.9)\nRequirement already satisfied: python-dateutil>=2.7.3 in /usr/local/lib/python3.6/dist-packages (from pandas->mendeleev>=0.6.0->atomai) (2.8.1)\nRequirement already satisfied: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.16.2->atomai) (2.4.7)\nRequirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.6/dist-packages (from matplotlib!=3.0.0,>=2.0.0->scikit-image==0.16.2->atomai) (1.3.1)\nRequirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.6/dist-packages (from 
matplotlib!=3.0.0,>=2.0.0->scikit-image==0.16.2->atomai) (0.10.0)\nRequirement already satisfied: six>=1.5 in /usr/local/lib/python3.6/dist-packages (from python-dateutil>=2.7.3->pandas->mendeleev>=0.6.0->atomai) (1.15.0)\nBuilding wheels for collected packages: mendeleev\n Building wheel for mendeleev (setup.py) ... \u001b[?25l\u001b[?25hdone\n Created wheel for mendeleev: filename=mendeleev-0.6.1-py2.py3-none-any.whl size=174964 sha256=eeeb152ae6757fc3b1633b727533ed6881e07af5722f8c5e460cb86e38b74c58\n Stored in directory: /root/.cache/pip/wheels/fb/28/5d/95e69a718b35dd00169889b0139a692f6c265d399cab3aa097\nSuccessfully built mendeleev\nInstalling collected packages: colorama, pyfiglet, mendeleev, atomai\nSuccessfully installed atomai-0.6.0 colorama-0.4.4 mendeleev-0.6.1 pyfiglet-0.8.post1\n"
]
],
[
[
"Import modules:",
"_____no_output_____"
]
],
[
[
"import atomai as aoi\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"Download the trained weights and test image:",
"_____no_output_____"
]
],
[
[
"download_link_model = 'https://drive.google.com/uc?id=18hXcw0tZ_fALtI2Fir1fHirAt27tRqj4'\ndownload_link_img = 'https://drive.google.com/uc?id=1peHF1lvpOKlOSMjREB2aSscyolrQQhoh'\n!gdown -q $download_link_model -O 'simple_model.tar'\n!gdown -q $download_link_img -O 'test_img.npy'",
"_____no_output_____"
]
],
[
[
"## Ferroic blocks analysis with atomstat",
"_____no_output_____"
],
[
"First we need to load the trained model. To do this, we specify a path to the file with the trained weights and model specifics. We are going to use the weights trained in the [atomai-atomnet notebook](https://colab.research.google.com/github/ziatdinovmax/atomai/blob/master/examples/notebooks/atomai_atomnet.ipynb#scrollTo=XGxhL7ha1Y3R).",
"_____no_output_____"
]
],
[
[
"# Path to file with trained weights\nmodel_dict_path = '/content/simple_model.tar'\n# load the weights into the model skeleton\nmodel = aoi.load_model(model_dict_path)",
"_____no_output_____"
]
],
[
[
"Make a prediction with the loaded model:",
"_____no_output_____"
]
],
[
[
"# Load experimental data\nexpdata = np.load('test_img.npy')\n# Get NN output with coordinates and classes\nnn_output, coordinates = model.predict(expdata)",
"\rBatch 1/1\n1 image was decoded in approximately 3.1576 seconds\n"
]
],
[
[
"Here we are going to use *atomstat* module to get local image descriptors first (i.e. stack of subimages around one of the atom types) and then perform different types of statistical analysis on them. This is similar to what we did in *Applied Physics Letters 115, 052902 (2019)* (although here we are going to use a different model and the image was downsized by a factor of 2 to allow faster inference, without using a GPU).\n\nGet local descriptors, which are subimages centered on one of the sublattices:",
"_____no_output_____"
]
],
[
[
"imstack = aoi.stat.imlocal(nn_output, coordinates, window_size=32, coord_class=1)",
"_____no_output_____"
]
],
[
[
"Compute PCA scree plot to estimate the number of components/sources for the multivariate analysis below:",
"_____no_output_____"
]
],
[
[
"imstack.pca_scree_plot(plot_results=True);",
"_____no_output_____"
]
],
[
[
"Do PCA analysis and plot results:",
"_____no_output_____"
]
],
[
[
"pca_results = imstack.imblock_pca(4, plot_results=True)",
"_____no_output_____"
]
],
[
[
"Do ICA analysis and plot results:",
"_____no_output_____"
]
],
[
[
"ica_results = imstack.imblock_ica(4, plot_results=True)",
"/usr/local/lib/python3.6/dist-packages/sklearn/decomposition/_fastica.py:119: ConvergenceWarning: FastICA did not converge. Consider increasing tolerance or the maximum number of iterations.\n ConvergenceWarning)\n"
]
],
[
[
"Do NMF analysis and plot results:",
"_____no_output_____"
]
],
[
[
"nmf_results = imstack.imblock_nmf(4, plot_results=True)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07e59faea15f15832c5e76001f2f99eaeba0c77 | 258,109 | ipynb | Jupyter Notebook | interpol_precip.ipynb | allixender/py_interpol_demo | 7ae52f94a4ebe2b883a23d2d9fa8d4c437d346b1 | [
"MIT"
] | 5 | 2022-03-13T16:04:38.000Z | 2022-03-16T19:25:08.000Z | interpol_precip.ipynb | allixender/py_interpol_demo | 7ae52f94a4ebe2b883a23d2d9fa8d4c437d346b1 | [
"MIT"
] | null | null | null | interpol_precip.ipynb | allixender/py_interpol_demo | 7ae52f94a4ebe2b883a23d2d9fa8d4c437d346b1 | [
"MIT"
] | null | null | null | 175.943422 | 37,736 | 0.888597 | [
[
[
"## Load Estonian weather service\n\n- https://www.ilmateenistus.ee/teenused/ilmainfo/ilmatikker/",
"_____no_output_____"
]
],
[
[
"import requests\nimport datetime\nimport xml.etree.ElementTree as ET\n\nimport pandas as pd\n\nfrom pandas.api.types import is_string_dtype\nfrom pandas.api.types import is_numeric_dtype\n\nimport geopandas as gpd\nimport fiona\nfrom fiona.crs import from_epsg\n\nimport numpy as np\nfrom shapely.geometry import Point\n\nimport matplotlib.pyplot as plt\n\n%matplotlib inline\n\nreq = requests.get(\"http://www.ilmateenistus.ee/ilma_andmed/xml/observations.php\")\n\nprint(req.encoding)\nprint(req.headers['content-type'])\n\ntree = ET.fromstring(req.content.decode(req.encoding) )\n\nprint(tree.tag)\nprint(tree.attrib)\n\nts = tree.attrib['timestamp']\nprint(datetime.datetime.fromtimestamp(int(ts)))",
"ISO-8859-1\ntext/xml\nobservations\n{'timestamp': '1590421855'}\n2020-05-25 18:50:55\n"
],
[
"data = {'stations' : [],\n 'wmocode': [],\n 'precipitations': [],\n 'airtemperature': [],\n 'windspeed': [],\n 'waterlevel': [],\n 'watertemperature': [],\n 'geometry': []\n }\n\n\ncounter = 0\nfor station in tree.findall('station'):\n counter += 1\n \n # print(station.tag, child.attrib)\n \n # < name > Virtsu < /name > – jaama nimi.\n name = station.find('name').text\n data['stations'].append(name)\n # < wmocode > 26128 < /wmocode > – jaama WMO kood.\n wmocode = station.find('wmocode').text\n data['wmocode'].append(wmocode)\n \n try:\n # < longitude > 23.51355555534363 < /longitude > – jaama asukoha koordinaat.\n lon = station.find('longitude').text\n # < latitude > 58.572674999100215 < /latitude > – jaama asukoha koordinaat.\n lat = station.find('latitude').text\n coords = Point(float(lon), float(lat))\n data['geometry'].append(coords)\n except ValueError as ve:\n pass\n \n # < phenomenon > Light snowfall < /phenomenon > – jaamas esinev ilmastikunähtus, selle puudumisel pilvisuse aste (kui jaamas tehakse manuaalseid pilvisuse mõõtmisi). Täielik nimekiri nähtustest on allpool olevas tabelis.\n # < visibility > 34.0 < /visibility > – nähtavus (km).\n # < precipitations > 0 < /precipitations > – sademed (mm) viimase tunni jooksul. Lume, lörtsi, rahe ja teiste taoliste sademete hulk on samuti esitatud vee millimeetritena. 1 cm lund ~ 1 mm vett.\n precip = station.find('precipitations').text\n data['precipitations'].append(precip)\n # < airpressure > 1005.4 < /airpressure > – õhurõhk (hPa). 
Normaalrõhk on 1013.25 hPa.\n # < relativehumidity > 57 < /relativehumidity > – suhteline õhuniiskus (%).\n # < airtemperature > -3.6 < /airtemperature > – õhutemperatuur (°C).\n temp = station.find('airtemperature').text\n data['airtemperature'].append(temp)\n # < winddirection > 101 < /winddirection > – tuule suund (°).\n # < windspeed > 3.2 < /windspeed > – keskmine tuule kiirus (m/s).\n wind = station.find('windspeed').text\n data['windspeed'].append(wind)\n # < windspeedmax > 5.1 < /windspeedmax > – maksimaalne tuule kiirus ehk puhangud (m/s).\n # < waterlevel > -49 < /waterlevel > – veetase (cm Kroonlinna nulli suhtes)\n waterlevel = station.find('waterlevel').text\n data['waterlevel'].append(waterlevel)\n # < waterlevel_eh2000 > -28 < waterlevel_eh2000/ > – veetase (cm Amsterdami nulli suhtes)\n # waterlevel_eh2000 = station.find('waterlevel_eh2000').text\n # < watertemperature > -0.2 < /watertemperature > – veetemperatuur (°C)\n watertemp = station.find('watertemperature').text\n data['watertemperature'].append(watertemp)\n\nprint(counter)\n\ndf = pd.DataFrame(data)\n\nfor field in ['precipitations','airtemperature','windspeed','waterlevel','watertemperature']:\n if field in df.columns:\n if is_string_dtype(df[field]):\n df[field] = df[field].astype(float)\n \ndisplay(df.head(5))\n\ngeo_df = gpd.GeoDataFrame(df, crs=from_epsg(4326), geometry='geometry')\n\ngeo_df.plot()\n\nwater_df = geo_df.dropna(subset=['precipitations'])\n\nwater_df.plot(column='precipitations', legend=True)",
"104\n"
],
[
"geo_df_3301 = geo_df.dropna(subset=['precipitations']).to_crs(epsg=3301)\ngeo_df_3301['x'] = geo_df_3301['geometry'].apply(lambda p: p.x)\ngeo_df_3301['y'] = geo_df_3301['geometry'].apply(lambda p: p.y)\ndisplay(geo_df_3301.head(5))\n\ngeo_df_3301.to_file('ilmateenistus_precip_stations.shp', encoding='utf-8')",
"_____no_output_____"
]
],
[
[
"## IDW in Python from scratch blogpost\n\nhttps://www.geodose.com/2019/09/creating-idw-interpolation-from-scratch-python.html\n\n- IDW Algorithm Implementation in Python\n\n - IDW Interpolation Algorithm Based on Block Radius Sampling Point\n - IDW Interpolation based on Minimum Number of Sampling Point",
"_____no_output_____"
]
],
[
[
"geo_df_3301.dtypes",
"_____no_output_____"
],
[
"from idw_basic import idw_rblock, idw_npoint",
"_____no_output_____"
],
[
"x_idw_list1, y_idw_list1, z_head1 = idw_rblock(x=geo_df_3301['x'].astype(float).values.tolist(),\n y=geo_df_3301['y'].astype(float).values.tolist(),\n z=geo_df_3301['precipitations'].values.tolist(),\n grid_side_length=200,\n search_radius=50000,\n p=1.5)\ndisplay(len(x_idw_list1))\ndisplay(len(y_idw_list1))\ndisplay(len(z_head1))\ndisplay(np.array(z_head1).shape)",
"c:\\dev\\build\\py_interpol_demo\\idw_basic.py:46: RuntimeWarning: invalid value encountered in double_scalars\n z_idw=np.dot(z_block,wt)/sum(w_list) # idw calculation using dot product\n"
],
[
"plt.matshow(z_head1, origin='lower')\nplt.colorbar()\nplt.show()",
"_____no_output_____"
]
],
[
[
"_idw_npoint_ might take very long, due to the iterative search radius increase needed to find at least n nearest neighbours",
"_____no_output_____"
]
],
[
[
"x_idw_list2, y_idw_list2, z_head2 = idw_npoint(x=geo_df_3301['x'].astype(float).values.tolist(),\n y=geo_df_3301['y'].astype(float).values.tolist(),\n z=geo_df_3301['airtemperature'].values.tolist(),\n grid_side_length=100,\n n_points=3,\n p=1.5,\n rblock_iter_distance=50000)\ndisplay(len(x_idw_list2))\ndisplay(len(y_idw_list2))\ndisplay(len(z_head2))\ndisplay(np.array(z_head2).shape)",
"_____no_output_____"
],
[
"plt.matshow(z_head2, origin='lower')\nplt.colorbar()\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Inverse distance weighting (IDW) in Python with a KDTree\n\nBy Copyright (C) 2016 Paul Brodersen <[email protected]> under GPL-3.0\n\ncode: https://github.com/paulbrodersen/inverse_distance_weighting\n\nInverse distance weighting is an interpolation method that computes the score of query points based on the scores of their k-nearest neighbours, weighted by the inverse of their distances.\n\nAs each query point is evaluated using the same number of data points, this method allows for strong gradient changes in regions of high sample density while imposing smoothness in data sparse regions.\n\nuses:\n\n- numpy\n- scipy.spatial (for cKDTree)",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\nimport matplotlib.pyplot as plt\n%matplotlib inline\n\nimport idw_knn",
"_____no_output_____"
],
[
"XY_obs_coords = np.vstack([geo_df_3301['x'].values, geo_df_3301['y'].values]).T\nz_arr = geo_df_3301['precipitations'].values\n\ndisplay(XY_obs_coords.shape)\ndisplay(z_arr.shape)\n\n# returns a function that is trained (the tree setup) for the interpolation on the grid\nidw_tree = idw_knn.tree(XY_obs_coords, z_arr)",
"_____no_output_____"
],
[
"all_dist_m = geo_df_3301['x'].max() - geo_df_3301['x'].min()\ndist_km_x = all_dist_m / 1000\ndisplay(dist_km_x)\n\nall_dist_m_y = geo_df_3301['y'].max() - geo_df_3301['y'].min()\ndist_km_y = all_dist_m_y / 1000\ndisplay(dist_km_y)",
"_____no_output_____"
],
[
"# prepare grids\n# number of target interpolation grid shape along x and y axis, e.g. 150*100 raster pixels\nnx=int(dist_km_x)\nny=int(dist_km_y)\n\n# preparing the \"output\" grid\nx_spacing = np.linspace(geo_df_3301['x'].min(), geo_df_3301['x'].max(), nx)\ny_spacing = np.linspace(geo_df_3301['y'].min(), geo_df_3301['y'].max(), ny)",
"_____no_output_____"
],
[
"# preparing the target grid\nx_y_grid_pairs = np.meshgrid(x_spacing, y_spacing)\n\nx_y_grid_pairs_list = np.reshape(x_y_grid_pairs, (2, -1)).T\n\ndisplay(f\"x_y_grid_pairs {len(x_y_grid_pairs)}\")\ndisplay(f\"x_y_grid_pairs_list reshaped {x_y_grid_pairs_list.shape}\")",
"_____no_output_____"
],
[
"# now interpolating onto the target grid\nz_arr_interp = idw_tree(x_y_grid_pairs_list)\n\ndisplay(f\"z_arr_interp {z_arr_interp.shape}\")",
"_____no_output_____"
],
[
"# plot\nfig, (ax1, ax2) = plt.subplots(1,2, sharex=True, sharey=True, figsize=(10,3))\n\nax1.scatter(XY_obs_coords[:,0], XY_obs_coords[:,1], c=geo_df_3301['precipitations'], linewidths=0)\nax1.set_title('Observation samples')\nax2.contourf(x_spacing, y_spacing, z_arr_interp.reshape((ny,nx)))\nax2.set_title('Interpolation')\nplt.show()",
"_____no_output_____"
],
[
"z_arr_interp.shape",
"_____no_output_____"
],
[
"plt.matshow(z_arr_interp.reshape((ny,nx)), origin='lower')\nplt.colorbar()\nplt.show()",
"_____no_output_____"
],
[
"display(f\"x_spacing {x_spacing.shape}\")\ndisplay(f\"y_spacing {y_spacing.shape}\")\n\n# an x_y_grid_pair is a list of two ndarrays; each is a full spatial 100x150 field, one holding the x coords, the other the y coords\nx_mg = np.meshgrid(x_spacing, y_spacing)\n\ndisplay(f\"x_mg {type(x_mg)} {len(x_mg)} len0 {type(x_mg[0])} {len(x_mg[0])} {x_mg[0].shape} len1 {type(x_mg[1])} {len(x_mg[1])} {x_mg[0].shape}\")\n\n# they get reshaped into two long flattened arrays: the joint full list of target x/y pairs representing all grid locations\nx_mg_interp_prep = np.reshape(x_mg, (2, -1)).T\n\ndisplay(f\"x_mg_interp_prep {type(x_mg_interp_prep)} {len(x_mg_interp_prep)} {x_mg_interp_prep.shape}\")\n",
"_____no_output_____"
]
],
[
[
"## Interpolation in Python with Radial Basis Function \n\n- https://stackoverflow.com/a/3114117",
"_____no_output_____"
]
],
[
[
"from scipy.interpolate import Rbf\n\ndef scipy_idw(x, y, z, xi, yi):\n    interp = Rbf(x, y, z, function='linear')\n    return interp(xi, yi)\n\n\ndef plot(x,y,z,grid):\n    plt.figure()\n    plt.imshow(grid, extent=(x.min(), x.max(), y.min(), y.max()), origin='lower')\n    plt.scatter(x,y,c=z)\n    plt.colorbar()",
"_____no_output_____"
],
[
"# nx, ny = 50, 50\n\nx=geo_df_3301['x'].astype(float).values\ny=geo_df_3301['y'].astype(float).values\nz=geo_df_3301['precipitations'].values\n\nxi = np.linspace(x.min(), x.max(), nx)\nyi = np.linspace(y.min(), y.max(), ny)\nxi, yi = np.meshgrid(xi, yi)\nxi, yi = xi.flatten(), yi.flatten()\n\ngrid2 = scipy_idw(x,y,z,xi,yi)\ngrid2 = grid2.reshape((ny, nx))",
"_____no_output_____"
],
[
"plot(x,y,z,grid2)\nplt.title(\"Scipy's Rbf with function=linear\")",
"_____no_output_____"
],
[
"# plot\nfig, (ax1, ax2, ax3) = plt.subplots(1,3, sharex=True, sharey=True, figsize=(10,3))\n\nax1.scatter(x,y, c=z, linewidths=0)\nax1.set_title('Observation samples')\nax2.contourf(np.linspace(x.min(), x.max(), nx), np.linspace(y.min(), y.max(), ny), grid2)\nax2.set_title('Interpolation contours')\n\nax3.imshow(np.flipud(grid2), extent=(x.min(), x.max(), y.min(), y.max()))\nax3.set_title('RBF pixels')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"## surface/contour/mesh plotting of interpolated grids\n\nhttps://matplotlib.org/3.1.0/gallery/images_contours_and_fields/pcolormesh_levels.html#sphx-glr-gallery-images-contours-and-fields-pcolormesh-levels-py",
"_____no_output_____"
]
],
[
[
"from matplotlib.colors import BoundaryNorm\nfrom matplotlib.ticker import MaxNLocator\nfrom matplotlib import cm",
"_____no_output_____"
],
[
"nbins=15\n\nlevels = MaxNLocator(nbins=nbins).tick_values(z_arr_interp.min(), z_arr_interp.max())\n\n# pick the desired colormap, sensible levels, and define a normalization\n# instance which takes data values and translates those into levels.\ncmap = plt.get_cmap('viridis')\nnorm = BoundaryNorm(levels, ncolors=cmap.N, clip=True)",
"_____no_output_____"
],
[
"# plot\nfig, (ax1, ax2) = plt.subplots(1,2, sharex=True, sharey=True, figsize=(10,3))\n\nim = ax1.pcolormesh(x_idw_list1, y_idw_list1, np.array(z_head1), cmap=cmap, norm=norm)\nfig.colorbar(im, ax=ax1)\nax1.set_title('pcolormesh with normalisation (nbins={})'.format(nbins))\n\nim2 = ax2.pcolormesh(x_idw_list1, y_idw_list1, np.array(z_head1), cmap=cm.viridis)\nfig.colorbar(im2, ax=ax2)\nax2.set_title('pcolormesh without explicit normalisation')\nplt.show()",
"_____no_output_____"
],
[
"# plot\nfig, (ax1, ax2) = plt.subplots(1,2, sharex=True, sharey=True, figsize=(10,3))\n\ncf = ax1.contourf(x_spacing, y_spacing, z_arr_interp.reshape((ny,nx)), levels=levels, cmap=cmap)\nfig.colorbar(cf, ax=ax1)\nax1.set_title('contourf with {} levels'.format(nbins))\n\ncf2 = ax2.contourf(x_spacing, y_spacing, z_arr_interp.reshape((ny,nx)), cmap=cm.viridis)\nfig.colorbar(cf2, ax=ax2)\nax2.set_title('contourf with default levels')\nplt.show()",
"_____no_output_____"
],
[
"z_arr_interp.reshape((ny,nx)).shape",
"_____no_output_____"
]
],
[
[
"## Writing interpolated array to a raster file\n\n- GeoTiff raster with GDAL Python\n",
"_____no_output_____"
]
],
[
[
"from fiona.crs import from_epsg\nimport pyproj\nimport osgeo.osr\n\nimport gdal\ngdal.UseExceptions()\n\n# wkt_projection = CRS(\"EPSG:3301\") -> technically this should take the crs from the geodataframe\ncrs = pyproj.Proj(from_epsg(3301))\n\nsrs = osgeo.osr.SpatialReference()\nsrs.ImportFromProj4(crs.srs)\nwkt_projection = srs.ExportToWkt()",
"_____no_output_____"
],
[
"#\n# KDTree z_arr_interp\n#\nncols = nx\nnrows = ny\n\ncell_unit_sizeX = (geo_df_3301['x'].max() - geo_df_3301['x'].min()) / ncols\ncell_unit_sizeY = (geo_df_3301['y'].max() - geo_df_3301['y'].min()) / nrows\n\ntestnp = z_arr_interp.reshape((ny,nx))\n\nxllcorner = geo_df_3301['x'].min()\nxulcorner = geo_df_3301['x'].min()\n\nyllcorner = geo_df_3301['y'].min()\nyulcorner = geo_df_3301['y'].max()\n\nnodata_value = -9999\n\ndriver = gdal.GetDriverByName(\"GTiff\")\n\ndataset = driver.Create(\"kdtree_precip_rasterout1.tif\", ncols, nrows, 1, gdal.GDT_Float32 )\n\ndataset.SetProjection(wkt_projection)\ndataset.SetGeoTransform((xulcorner,cell_unit_sizeX,0,yulcorner,0,-cell_unit_sizeY))\n\ndataset.GetRasterBand(1).WriteArray(np.flipud(testnp))\n\nband = dataset.GetRasterBand(1)\nband.SetNoDataValue(nodata_value)\n\ndataset.FlushCache()\n\n# dereference band to avoid gotcha described previously\nband = None\ndataset = None",
"_____no_output_____"
],
[
"#\n# RBF grid2\n#\ntestnp = grid2.reshape((ny,nx))\n\nncols = nx\nnrows = ny\n\ncell_unit_sizeX = (geo_df_3301['x'].max() - geo_df_3301['x'].min()) / ncols\ncell_unit_sizeY = (geo_df_3301['y'].max() - geo_df_3301['y'].min()) / nrows\n\nxllcorner = geo_df_3301['x'].min()\nxulcorner = geo_df_3301['x'].min()\n\nyllcorner = geo_df_3301['y'].min()\nyulcorner = geo_df_3301['y'].max()\n\nnodata_value = -9999\n\ndriver = gdal.GetDriverByName(\"GTiff\")\n\ndataset = driver.Create(\"rbf_precip_rasterout1.tif\", ncols, nrows, 1, gdal.GDT_Float32 )\n\ndataset.SetProjection(wkt_projection)\ndataset.SetGeoTransform((xulcorner,cell_unit_sizeX,0,yulcorner,0,-cell_unit_sizeY))\n\ndataset.GetRasterBand(1).WriteArray(np.flipud(testnp))\n\nband = dataset.GetRasterBand(1)\nband.SetNoDataValue(nodata_value)\n\ndataset.FlushCache()\n\n# dereference band to avoid gotcha described previously\nband = None\ndataset = None",
"_____no_output_____"
],
[
"ncols = 200\nnrows = 200\n\ncell_unit_sizeX = (geo_df_3301['x'].max() - geo_df_3301['x'].min()) / ncols\ncell_unit_sizeY = (geo_df_3301['y'].max() - geo_df_3301['y'].min()) / nrows\n\nxllcorner = geo_df_3301['x'].min()\nxulcorner = geo_df_3301['x'].min()\n\nyllcorner = geo_df_3301['y'].min()\nyulcorner = geo_df_3301['y'].max()\n\nnodata_value = -9999\n\ndriver = gdal.GetDriverByName(\"GTiff\")\n\n# dataset = driver.Create(\"%s\"%(OutputFile), NROWS, NCOLS, 1, gdal.GDT_Float32 )\ndataset = driver.Create(\"idw_basic_precip_rasterout1.tif\", ncols, nrows, 1, gdal.GDT_Float32 )\n\ndataset.SetProjection(wkt_projection)\ndataset.SetGeoTransform((xulcorner,cell_unit_sizeX,0,yulcorner,0,-cell_unit_sizeY))\n\ndataset.GetRasterBand(1).WriteArray(np.flipud(np.array(z_head1)))\n\nband = dataset.GetRasterBand(1)\nband.SetNoDataValue(nodata_value)\n\ndataset.FlushCache()\n\n# dereference band to avoid gotcha described previously\nband = None\ndataset = None",
"_____no_output_____"
]
],
[
[
"## Point Query RasterStats\n\n- https://pythonhosted.org/rasterstats/manual.html#basic-example",
"_____no_output_____"
]
],
[
[
"from rasterstats import point_query\n\nxm = gpd.read_file('ilmateenistus_precip_stations.shp', encoding=\"utf-8\")\n\npts_kd = point_query('ilmateenistus_precip_stations.shp', \"kdtree_precip_rasterout1.tif\")\npts_rbf = point_query('ilmateenistus_precip_stations.shp', \"rbf_precip_rasterout1.tif\")\npts_idw = point_query('ilmateenistus_precip_stations.shp', \"idw_basic_precip_rasterout1.tif\")",
"_____no_output_____"
],
[
"xm['pcp_kdtree'] = pts_kd\nxm['pcp_rbf'] = pts_rbf\nxm['pcp_idw'] = pts_idw\n\nxm = xm[['precipitat','pcp_kdtree','pcp_rbf','pcp_idw']].dropna()",
"_____no_output_____"
],
[
"from sklearn.metrics import mean_squared_error, r2_score\n\nx_l = []\nfor rst in ['pcp_kdtree', 'pcp_rbf', 'pcp_idw']:\n rmse = np.sqrt(mean_squared_error(xm['precipitat'], xm[rst]))\n r2 = r2_score(xm['precipitat'], xm[rst])\n x_l.append({ 'name': rst, 'rmse': rmse, 'r2': r2})\n\npd.DataFrame(x_l)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
d07e636a179f70426e8cfe016940e955ccdb25b4 | 30,835 | ipynb | Jupyter Notebook | advanced_functionality/distributed_tensorflow_mask_rcnn/mask-rcnn-efs.ipynb | fhirschmann/amazon-sagemaker-examples | bb4a4ed78cd4f3673bd6894f0b92ab08aa7f8f29 | [
"Apache-2.0"
] | null | null | null | advanced_functionality/distributed_tensorflow_mask_rcnn/mask-rcnn-efs.ipynb | fhirschmann/amazon-sagemaker-examples | bb4a4ed78cd4f3673bd6894f0b92ab08aa7f8f29 | [
"Apache-2.0"
] | null | null | null | advanced_functionality/distributed_tensorflow_mask_rcnn/mask-rcnn-efs.ipynb | fhirschmann/amazon-sagemaker-examples | bb4a4ed78cd4f3673bd6894f0b92ab08aa7f8f29 | [
"Apache-2.0"
] | null | null | null | 40.097529 | 529 | 0.536825 | [
[
[
"# Distributed Training of Mask-RCNN in Amazon SageMaker using EFS\n\nThis notebook is a step-by-step tutorial on distributed training of [Mask R-CNN](https://arxiv.org/abs/1703.06870) implemented in the [TensorFlow](https://www.tensorflow.org/) framework. Mask R-CNN is also referred to as a heavyweight object detection model, and it is part of [MLPerf](https://www.mlperf.org/training-results-0-6/).\n\nConcretely, we will describe the steps for training [TensorPack Faster-RCNN/Mask-RCNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN) and [AWS Samples Mask R-CNN](https://github.com/aws-samples/mask-rcnn-tensorflow) in [Amazon SageMaker](https://aws.amazon.com/sagemaker/) using [Amazon EFS](https://aws.amazon.com/efs/) file-system as data source.\n\nThe outline of steps is as follows:\n\n1. Stage COCO 2017 dataset in [Amazon S3](https://aws.amazon.com/s3/)\n2. Copy COCO 2017 dataset from S3 to Amazon EFS file-system mounted on this notebook instance\n3. Build Docker training image and push it to [Amazon ECR](https://aws.amazon.com/ecr/)\n4. Configure data input channels\n5. Configure hyper-parameters\n6. Define training metrics\n7. Define training job and start training\n\nBefore we get started, let us initialize two Python variables ```aws_region``` and ```s3_bucket``` that we will use throughout the notebook:",
"_____no_output_____"
]
],
[
[
"aws_region = # aws-region-code e.g. us-east-1\ns3_bucket = # your-s3-bucket-name",
"_____no_output_____"
]
],
[
[
"## Stage COCO 2017 dataset in Amazon S3\n\nWe use [COCO 2017 dataset](http://cocodataset.org/#home) for training. We download COCO 2017 training and validation dataset to this notebook instance, extract the files from the dataset archives, and upload the extracted files to your Amazon [S3 bucket](https://docs.aws.amazon.com/AmazonS3/latest/gsg/CreatingABucket.html). The ```prepare-s3-bucket.sh``` script executes this step. ",
"_____no_output_____"
]
],
[
[
"!cat ./prepare-s3-bucket.sh",
"_____no_output_____"
]
],
[
[
"Using your *Amazon S3 bucket* as argument, run the cell below. If you have already uploaded COCO 2017 dataset to your Amazon S3 bucket, you may skip this step. ",
"_____no_output_____"
]
],
[
[
"%%time\n!./prepare-s3-bucket.sh {s3_bucket}",
"_____no_output_____"
]
],
[
[
"## Copy COCO 2017 dataset from S3 to Amazon EFS\n\nNext, we copy COCO 2017 dataset from S3 to EFS file-system. The ```prepare-efs.sh``` script executes this step.",
"_____no_output_____"
]
],
[
[
"!cat ./prepare-efs.sh",
"_____no_output_____"
]
],
[
[
"If you have already copied COCO 2017 dataset from S3 to your EFS file-system, skip this step.",
"_____no_output_____"
]
],
[
[
"%%time\n!./prepare-efs.sh {s3_bucket}",
"_____no_output_____"
]
],
[
[
"## Build and push SageMaker training images\n\nFor this step, the [IAM Role](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles.html) attached to this notebook instance needs full access to Amazon ECR service. If you created this notebook instance using the ```./stack-sm.sh``` script in this repository, the IAM Role attached to this notebook instance is already setup with full access to ECR service. \n\nBelow, we have a choice of two different implementations:\n\n1. [TensorPack Faster-RCNN/Mask-RCNN](https://github.com/tensorpack/tensorpack/tree/master/examples/FasterRCNN) implementation supports a maximum per-GPU batch size of 1, and does not support mixed precision. It can be used with mainstream TensorFlow releases.\n\n2. [AWS Samples Mask R-CNN](https://github.com/aws-samples/mask-rcnn-tensorflow) is an optimized implementation that supports a maximum batch size of 4 and supports mixed precision. This implementation uses custom TensorFlow ops. The required custom TensorFlow ops are available in [AWS Deep Learning Container](https://github.com/aws/deep-learning-containers/blob/master/available_images.md) images in ```tensorflow-training``` repository with image tag ```1.15.2-gpu-py36-cu100-ubuntu18.04```, or later.\n\nIt is recommended that you build and push both SageMaker training images and use either image for training later.",
"_____no_output_____"
],
[
"### TensorPack Faster-RCNN/Mask-RCNN\n\nUse ```./container/build_tools/build_and_push.sh``` script to build and push the TensorPack Faster-RCNN/Mask-RCNN training image to Amazon ECR.",
"_____no_output_____"
]
],
[
[
"!cat ./container/build_tools/build_and_push.sh",
"_____no_output_____"
]
],
[
[
"Using your *AWS region* as argument, run the cell below. ",
"_____no_output_____"
]
],
[
[
"%%time\n! ./container/build_tools/build_and_push.sh {aws_region}",
"_____no_output_____"
]
],
[
[
"Set ```tensorpack_image``` below to Amazon ECR URI of the image you pushed above.",
"_____no_output_____"
]
],
[
[
"tensorpack_image = # mask-rcnn-tensorpack-sagemaker ECR URI",
"_____no_output_____"
]
],
[
[
"### AWS Samples Mask R-CNN\nUse ```./container-optimized/build_tools/build_and_push.sh``` script to build and push the AWS Samples Mask R-CNN training image to Amazon ECR.",
"_____no_output_____"
]
],
[
[
"!cat ./container-optimized/build_tools/build_and_push.sh",
"_____no_output_____"
]
],
[
[
"Using your *AWS region* as argument, run the cell below. ",
"_____no_output_____"
]
],
[
[
"%%time\n! ./container-optimized/build_tools/build_and_push.sh {aws_region}",
"_____no_output_____"
]
],
[
[
"Set ```aws_samples_image``` below to Amazon ECR URI of the image you pushed above.",
"_____no_output_____"
]
],
[
[
"aws_samples_image = # mask-rcnn-tensorflow-sagemaker ECR URI",
"_____no_output_____"
]
],
[
[
"## SageMaker Initialization \n\nFirst we upgrade SageMaker to 2.3.0 API. If your notebook is already using latest Sagemaker 2.x API, you may skip the next cell.",
"_____no_output_____"
]
],
[
[
"! pip install --upgrade pip\n! pip install sagemaker==2.3.0",
"_____no_output_____"
]
],
[
[
"We have staged the data and we have built and pushed the training docker image to Amazon ECR. Now we are ready to start using Amazon SageMaker. \n",
"_____no_output_____"
]
],
[
[
"%%time\nimport os\nimport time\nimport boto3\nimport sagemaker\nfrom sagemaker import get_execution_role\nfrom sagemaker.estimator import Estimator\n\nrole = get_execution_role() # provide a pre-existing role ARN as an alternative to creating a new role\nprint(f'SageMaker Execution Role:{role}')\n\nclient = boto3.client('sts')\naccount = client.get_caller_identity()['Account']\nprint(f'AWS account:{account}')\n\nsession = boto3.session.Session()\nregion = session.region_name\nprint(f'AWS region:{region}')",
"_____no_output_____"
]
],
[
[
"Next, we set the Amazon ECR image URI used for training. You saved this URI in a previous step.",
"_____no_output_____"
]
],
[
[
"training_image = # set to tensorpack_image or aws_samples_image \nprint(f'Training image: {training_image}')",
"_____no_output_____"
]
],
[
[
"## Define SageMaker Data Channels\n\nNext, we define the *train* and *log* data channels using EFS file-system. To do so, we need to specify the EFS file-system id, which is shown in the output of the command below.",
"_____no_output_____"
]
],
[
[
"!df -kh | grep 'fs-' | sed 's/\\(fs-[0-9a-z]*\\).*/\\1/'",
"_____no_output_____"
]
],
[
[
"Set the EFS ```file_system_id``` below to the output of the command shown above. In the cell below, we define the `train` data input channel.",
"_____no_output_____"
]
],
[
[
"from sagemaker.inputs import FileSystemInput\n\n# Specify EFS file system id.\nfile_system_id = # 'fs-xxxxxxxx'\nprint(f\"EFS file-system-id: {file_system_id}\")\n\n# Specify directory path for input data on the file system. \n# You need to provide normalized and absolute path below.\nfile_system_directory_path = '/mask-rcnn/sagemaker/input/train'\nprint(f'EFS file-system data input path: {file_system_directory_path}')\n\n# Specify the access mode of the mount of the directory associated with the file system. \n# Directory must be mounted 'ro'(read-only).\nfile_system_access_mode = 'ro'\n\n# Specify your file system type\nfile_system_type = 'EFS'\n\ntrain = FileSystemInput(file_system_id=file_system_id,\n                        file_system_type=file_system_type,\n                        directory_path=file_system_directory_path,\n                        file_system_access_mode=file_system_access_mode)",
"_____no_output_____"
]
],
[
[
"Below we create the log output directory and define the `log` data output channel.",
"_____no_output_____"
]
],
[
[
"# Specify directory path for log output on the EFS file system.\n# You need to provide normalized and absolute path below.\n# For example, '/mask-rcnn/sagemaker/output/log'\n# Log output directory must not exist\nfile_system_directory_path = f'/mask-rcnn/sagemaker/output/log-{int(time.time())}'\n\n# Create the log output directory. \n# EFS file-system is mounted on '$HOME/efs' mount point for this notebook.\nhome_dir=os.environ['HOME']\nlocal_efs_path = os.path.join(home_dir,'efs', file_system_directory_path[1:])\nprint(f\"Creating log directory on EFS: {local_efs_path}\")\n\nassert not os.path.isdir(local_efs_path)\n! sudo mkdir -p -m a=rw {local_efs_path}\nassert os.path.isdir(local_efs_path)\n\n# Specify the access mode of the mount of the directory associated with the file system. \n# Directory must be mounted 'rw'(read-write).\nfile_system_access_mode = 'rw'\n\n\nlog = FileSystemInput(file_system_id=file_system_id,\n file_system_type=file_system_type,\n directory_path=file_system_directory_path,\n file_system_access_mode=file_system_access_mode)\n\ndata_channels = {'train': train, 'log': log}",
"_____no_output_____"
]
],
[
[
"Next, we define the model output location in S3. Set ```s3_bucket``` to your S3 bucket name prior to running the cell below. \n\nThe model checkpoints, logs and Tensorboard events will be written to the log output directory on the EFS file system you created above. At the end of the model training, they will be copied from the log output directory to the `s3_output_location` defined below.",
"_____no_output_____"
]
],
[
[
"prefix = \"mask-rcnn/sagemaker\" #prefix in your bucket\ns3_output_location = f's3://{s3_bucket}/{prefix}/output'\nprint(f'S3 model output location: {s3_output_location}')",
"_____no_output_____"
]
],
[
[
"## Configure Hyper-parameters\nNext we define the hyper-parameters. \n\nNote, some hyper-parameters are different between the two implementations. The batch size per GPU in TensorPack Faster-RCNN/Mask-RCNN is fixed at 1, but is configurable in AWS Samples Mask-RCNN. The learning rate schedule is specified in units of steps in TensorPack Faster-RCNN/Mask-RCNN, but in epochs in AWS Samples Mask-RCNN.\n\nThe default learning rate schedule values shown below correspond to training for a total of 24 epochs, at 120,000 images per epoch.\n\n<table align='left'>\n    <caption>TensorPack Faster-RCNN/Mask-RCNN Hyper-parameters</caption>\n    <tr>\n        <th style=\"text-align:center\">Hyper-parameter</th>\n        <th style=\"text-align:center\">Description</th>\n        <th style=\"text-align:center\">Default</th>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">mode_fpn</td>\n        <td style=\"text-align:left\">Flag to indicate use of Feature Pyramid Network (FPN) in the Mask R-CNN model backbone</td>\n        <td style=\"text-align:center\">\"True\"</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">mode_mask</td>\n        <td style=\"text-align:left\">A value of \"False\" means Faster-RCNN model, \"True\" means Mask R-CNN model</td>\n        <td style=\"text-align:center\">\"True\"</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">eval_period</td>\n        <td style=\"text-align:left\">Number of epochs period for evaluation during training</td>\n        <td style=\"text-align:center\">1</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">lr_schedule</td>\n        <td style=\"text-align:left\">Learning rate schedule in training steps</td>\n        <td style=\"text-align:center\">'[240000, 320000, 360000]'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">batch_norm</td>\n        <td style=\"text-align:left\">Batch normalization option ('FreezeBN', 'SyncBN', 'GN', 'None') </td>\n        <td style=\"text-align:center\">'FreezeBN'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">images_per_epoch</td>\n        <td style=\"text-align:left\">Images per epoch </td>\n        <td style=\"text-align:center\">120000</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">data_train</td>\n        <td style=\"text-align:left\">Training data under data directory</td>\n        <td style=\"text-align:center\">'coco_train2017'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">data_val</td>\n        <td style=\"text-align:left\">Validation data under data directory</td>\n        <td style=\"text-align:center\">'coco_val2017'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">resnet_arch</td>\n        <td style=\"text-align:left\">Must be 'resnet50' or 'resnet101'</td>\n        <td style=\"text-align:center\">'resnet50'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">backbone_weights</td>\n        <td style=\"text-align:left\">ResNet backbone weights</td>\n        <td style=\"text-align:center\">'ImageNet-R50-AlignPadding.npz'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">load_model</td>\n        <td style=\"text-align:left\">Pre-trained model to load</td>\n        <td style=\"text-align:center\"></td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">config:</td>\n        <td style=\"text-align:left\">Any hyperparameter prefixed with <b>config:</b> is set as a model config parameter</td>\n        <td style=\"text-align:center\"></td>\n    </tr>\n</table>\n\n \n<table align='left'>\n    <caption>AWS Samples Mask-RCNN Hyper-parameters</caption>\n    <tr>\n        <th style=\"text-align:center\">Hyper-parameter</th>\n        <th style=\"text-align:center\">Description</th>\n        <th style=\"text-align:center\">Default</th>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">mode_fpn</td>\n        <td style=\"text-align:left\">Flag to indicate use of Feature Pyramid Network (FPN) in the Mask R-CNN model backbone</td>\n        <td style=\"text-align:center\">\"True\"</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">mode_mask</td>\n        <td style=\"text-align:left\">A value of \"False\" means Faster-RCNN model, \"True\" means Mask R-CNN model</td>\n        <td style=\"text-align:center\">\"True\"</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">eval_period</td>\n        <td style=\"text-align:left\">Number of epochs period for evaluation during training</td>\n        <td style=\"text-align:center\">1</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">lr_epoch_schedule</td>\n        <td style=\"text-align:left\">Learning rate schedule in epochs</td>\n        <td style=\"text-align:center\">'[(16, 0.1), (20, 0.01), (24, None)]'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">batch_size_per_gpu</td>\n        <td style=\"text-align:left\">Batch size per gpu ( Minimum 1, Maximum 4)</td>\n        <td style=\"text-align:center\">4</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">batch_norm</td>\n        <td style=\"text-align:left\">Batch normalization option ('FreezeBN', 'SyncBN', 'GN', 'None') </td>\n        <td style=\"text-align:center\">'FreezeBN'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">images_per_epoch</td>\n        <td style=\"text-align:left\">Images per epoch </td>\n        <td style=\"text-align:center\">120000</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">data_train</td>\n        <td style=\"text-align:left\">Training data under data directory</td>\n        <td style=\"text-align:center\">'train2017'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">backbone_weights</td>\n        <td style=\"text-align:left\">ResNet backbone weights</td>\n        <td style=\"text-align:center\">'ImageNet-R50-AlignPadding.npz'</td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">load_model</td>\n        <td style=\"text-align:left\">Pre-trained model to load</td>\n        <td style=\"text-align:center\"></td>\n    </tr>\n    <tr>\n        <td style=\"text-align:center\">config:</td>\n        <td style=\"text-align:left\">Any hyperparameter prefixed with <b>config:</b> is set as a model config parameter</td>\n        <td style=\"text-align:center\"></td>\n    </tr>\n</table>",
"_____no_output_____"
]
],
[
[
"hyperparameters = {\n \"mode_fpn\": \"True\",\n \"mode_mask\": \"True\",\n \"eval_period\": 1,\n \"batch_norm\": \"FreezeBN\"\n }",
"_____no_output_____"
]
],
[
[
"## Define Training Metrics\nNext, we define the regular expressions that SageMaker uses to extract algorithm metrics from training logs and send them to [AWS CloudWatch metrics](https://docs.aws.amazon.com/en_pv/AmazonCloudWatch/latest/monitoring/working_with_metrics.html). These algorithm metrics are visualized in SageMaker console.",
"_____no_output_____"
]
],
[
[
"metric_definitions=[\n {\n \"Name\": \"fastrcnn_losses/box_loss\",\n \"Regex\": \".*fastrcnn_losses/box_loss:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"fastrcnn_losses/label_loss\",\n \"Regex\": \".*fastrcnn_losses/label_loss:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"fastrcnn_losses/label_metrics/accuracy\",\n \"Regex\": \".*fastrcnn_losses/label_metrics/accuracy:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"fastrcnn_losses/label_metrics/false_negative\",\n \"Regex\": \".*fastrcnn_losses/label_metrics/false_negative:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"fastrcnn_losses/label_metrics/fg_accuracy\",\n \"Regex\": \".*fastrcnn_losses/label_metrics/fg_accuracy:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"fastrcnn_losses/num_fg_label\",\n \"Regex\": \".*fastrcnn_losses/num_fg_label:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"maskrcnn_loss/accuracy\",\n \"Regex\": \".*maskrcnn_loss/accuracy:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"maskrcnn_loss/fg_pixel_ratio\",\n \"Regex\": \".*maskrcnn_loss/fg_pixel_ratio:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"maskrcnn_loss/maskrcnn_loss\",\n \"Regex\": \".*maskrcnn_loss/maskrcnn_loss:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"maskrcnn_loss/pos_accuracy\",\n \"Regex\": \".*maskrcnn_loss/pos_accuracy:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/IoU=0.5\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/IoU=0\\\\.5:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/IoU=0.5:0.95\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/IoU=0\\\\.5:0\\\\.95:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/IoU=0.75\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/IoU=0\\\\.75:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/large\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/large:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/medium\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/medium:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(bbox)/small\",\n \"Regex\": \".*mAP\\\\(bbox\\\\)/small:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/IoU=0.5\",\n \"Regex\": 
\".*mAP\\\\(segm\\\\)/IoU=0\\\\.5:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/IoU=0.5:0.95\",\n \"Regex\": \".*mAP\\\\(segm\\\\)/IoU=0\\\\.5:0\\\\.95:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/IoU=0.75\",\n \"Regex\": \".*mAP\\\\(segm\\\\)/IoU=0\\\\.75:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/large\",\n \"Regex\": \".*mAP\\\\(segm\\\\)/large:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/medium\",\n \"Regex\": \".*mAP\\\\(segm\\\\)/medium:\\\\s*(\\\\S+).*\"\n },\n {\n \"Name\": \"mAP(segm)/small\",\n \"Regex\": \".*mAP\\\\(segm\\\\)/small:\\\\s*(\\\\S+).*\"\n } \n \n ]",
"_____no_output_____"
]
],
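Each ```Regex``` above is matched against lines of the training log stream to extract metric values. A quick local sanity check of one of these patterns (the log line below is a made-up example, not actual training output) might look like:

```python
import re

# Hypothetical log line in the format the training script emits
line = "fastrcnn_losses/box_loss: 0.0123 step 100"
pattern = r".*fastrcnn_losses/box_loss:\s*(\S+).*"

match = re.match(pattern, line)
value = float(match.group(1))  # the captured metric value
print(value)
```

Checking patterns this way before launching a job avoids silently missing metrics in the SageMaker console.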
[
[
"## Define SageMaker Training Job\n\nNext, we use SageMaker [Estimator](https://sagemaker.readthedocs.io/en/stable/estimators.html) API to define a SageMaker Training Job. \n\nWe recommend using 32 GPUs, so we set ```instance_count=4``` and ```instance_type='ml.p3.16xlarge'```, because there are 8 Tesla V100 GPUs per ```ml.p3.16xlarge``` instance. We recommend using 100 GB [Amazon EBS](https://aws.amazon.com/ebs/) storage volume with each training instance, so we set ```volume_size = 100```. \n\nWe run the training job in your private VPC, so we need to set the ```subnets``` and ```security_group_ids``` prior to running the cell below. You may specify multiple subnet ids in the ```subnets``` list. The subnets included in the ```subnets``` list must be part of the output of ```./stack-sm.sh``` CloudFormation stack script used to create this notebook instance. Specify only one security group id in the ```security_group_ids``` list. The security group id must be part of the output of ```./stack-sm.sh``` script.\n\nFor ```instance_type``` below, you have the option to use ```ml.p3.16xlarge``` with 16 GB per-GPU memory and 25 Gbps network interconnectivity, or ```ml.p3dn.24xlarge``` with 32 GB per-GPU memory and 100 Gbps network interconnectivity. The ```ml.p3dn.24xlarge``` instance type offers significantly better performance than ```ml.p3.16xlarge``` for Mask R-CNN distributed TensorFlow training.",
"_____no_output_____"
]
],
[
[
"# Give Amazon SageMaker Training Jobs Access to FileSystem Resources in Your Amazon VPC.\nsecurity_group_ids = # ['sg-xxxxxxxx'] \nsubnets = # [ 'subnet-xxxxxxx', 'subnet-xxxxxxx', 'subnet-xxxxxxx' ]\nsagemaker_session = sagemaker.session.Session(boto_session=session)\n\nmask_rcnn_estimator = Estimator(image_uri=training_image,\n role=role, \n instance_count=4, \n instance_type='ml.p3.16xlarge',\n volume_size = 100,\n max_run = 400000,\n output_path=s3_output_location,\n sagemaker_session=sagemaker_session, \n hyperparameters = hyperparameters,\n metric_definitions = metric_definitions,\n subnets=subnets,\n security_group_ids=security_group_ids)\n\n",
"_____no_output_____"
]
],
[
[
"Finally, we launch the SageMaker training job. See ```Training Jobs``` in the SageMaker console to monitor the training job. ",
"_____no_output_____"
]
],
[
[
"import time\n\njob_name=f'mask-rcnn-efs-{int(time.time())}'\nprint(f\"Launching Training Job: {job_name}\")\n\n# set wait=True below if you want to print logs in cell output\nmask_rcnn_estimator.fit(inputs=data_channels, job_name=job_name, logs=\"All\", wait=False)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07e6853e85923a3eae86dd4487ccb3439e626f3 | 276,969 | ipynb | Jupyter Notebook | notebooks/Plotting_Tutorial/Tutorial_pt2.ipynb | chazbethelbrescia/pei2020 | a031da26fbbca9fcf06a17ac15a54e1e83569661 | [
"MIT"
] | null | null | null | notebooks/Plotting_Tutorial/Tutorial_pt2.ipynb | chazbethelbrescia/pei2020 | a031da26fbbca9fcf06a17ac15a54e1e83569661 | [
"MIT"
] | null | null | null | notebooks/Plotting_Tutorial/Tutorial_pt2.ipynb | chazbethelbrescia/pei2020 | a031da26fbbca9fcf06a17ac15a54e1e83569661 | [
"MIT"
] | null | null | null | 499.944043 | 32,032 | 0.944929 | [
[
[
"# Plotting and Programming in Python (Continued)",
"_____no_output_____"
],
[
"## Plotting",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"time = [0, 1, 2, 3]\nposition = [0, 100, 200, 300]\n\nplt.plot(time, position)\nplt.xlabel('Time (hr)')\nplt.ylabel('Position (km)')",
"_____no_output_____"
]
],
[
[
"## Plot directly from Pandas DataFrame",
"_____no_output_____"
]
],
[
[
"import pandas as pd\n\ndata = pd.read_csv('./gapminder_gdp_oceania.csv', \n index_col='country')\n\n\n# Column names look like 'gdpPercap_1952'; we keep only the year part for clarity when plotting GDP vs. years\n# To do this we use strip(), which removes from the string the characters stated in the argument\n# This method works on strings, so we call str before strip()\nyears = data.columns.str.strip('gdpPercap_')\n\n# Convert year values to integers, saving results back to dataframe\ndata.columns = years.astype(int) # note astype() --> casting function\n\ndata.loc['Australia'].plot()",
"_____no_output_____"
],
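One caveat with the ```strip()``` trick above: ```strip``` removes leading and trailing characters drawn from a *set*, not a literal prefix. It works here only because the year digits are not in that set; ```replace``` is the safer choice. A small sketch (the second label is hypothetical, just to show the failure mode):

```python
# str.strip treats its argument as a SET of characters, not a literal prefix
ok = 'gdpPercap_1952'.strip('gdpPercap_')            # happens to work
bad = 'gdpPercap_accra'.strip('gdpPercap_')          # every character is in the set
safe = 'gdpPercap_accra'.replace('gdpPercap_', '')   # literal substring removal

print(ok, repr(bad), safe)
```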
[
"# More examples:\n\n# GDP Per Capita \ndata.T.plot() # line by default\nplt.ylabel('GDP per capita')\nplt.xlabel('Year')\nplt.title('GDP per Capita in Oceania')",
"_____no_output_____"
],
[
"# MANY styles of plots are available\nplt.style.use('ggplot')\ndata.T.plot(kind='bar') # line, bar, barh, hist, box, area, pie, scatter, hexbin\nplt.ylabel('GDP per capita')",
"_____no_output_____"
],
[
"# Plotting data using the matplotlib.plot() function directly\nyears = data.columns\ngdp_australia = data.loc['Australia']\n\nplt.plot(years, gdp_australia, 'g--') # 'g--' = green dashed line (format string sets colour and style)\nplt.title('Annual GDP in Australia', fontsize=15)\nplt.ylabel('GDP')\nplt.xlabel('Year')",
"_____no_output_____"
]
],
[
[
"# Can plot many sets of data together",
"_____no_output_____"
]
],
[
[
"# Select two countries' worth of data.\ngdp_australia = data.loc['Australia']\ngdp_nz = data.loc['New Zealand']\n\n# Plot with differently-colored markers.\nplt.plot(years, gdp_australia, 'b-', label='Australia')\nplt.plot(years, gdp_nz, 'y-', label='New Zealand')\n\n# Create legend.\nplt.legend(loc='upper left') # location parameter\nplt.xlabel('Year')\nplt.ylabel('GDP per capita ($)')\nplt.title('GDP per capita ($) in Oceania')",
"_____no_output_____"
],
[
"# Scatterplot examples:\nplt.scatter(gdp_australia, gdp_nz)",
"_____no_output_____"
],
[
"data.T.plot.scatter(x = 'Australia', y = 'New Zealand') \n# Transpose --> so country indices are now values",
"_____no_output_____"
],
[
"# Minima and Maxima\ndata_europe = pd.read_csv('./gapminder_gdp_europe.csv', index_col='country')\n\n# Note: use of strip technique to clean up labels\nyears = data_europe.columns.str.strip('gdpPercap_')\ndata_europe.columns = years;\ndata_europe.min().plot(label='min')\ndata_europe.max().plot(label='max')\nplt.legend(loc='best')\nplt.xticks(rotation=50) # rotate tick labels",
"_____no_output_____"
],
[
"# Correlations\ndata_asia = pd.read_csv('./gapminder_gdp_asia.csv', index_col='country')\ndata_asia.describe().T.plot(kind='scatter', x='min', y='max')",
"_____no_output_____"
],
[
"# Variability of Max is much higher than Min --> take a look at Max variable\ndata_asia = pd.read_csv('./gapminder_gdp_asia.csv', \n index_col='country')\nyears = data_asia.columns.str.strip('gdpPercap_')\ndata_asia.columns = years\n\ndata_asia.max().plot()\nplt.xticks(rotation=80)\nprint(data_asia.idxmax()) # Remember idxmax function (max value for each index)",
"1952 Kuwait\n1957 Kuwait\n1962 Kuwait\n1967 Kuwait\n1972 Kuwait\n1977 Kuwait\n1982 Saudi Arabia\n1987 Kuwait\n1992 Kuwait\n1997 Kuwait\n2002 Singapore\n2007 Kuwait\ndtype: object\n"
],
[
"# More Correlations\n# Create a plot showing correlation between GDP and life expectancy for 2007\ndata_all = pd.read_csv('./gapminder_all.csv', index_col='country')\ndata_all.plot(kind='scatter', x='gdpPercap_2007', y='lifeExp_2007',\n s=data_all['pop_2007']/1e6) # change size of plotted points\nplt.title('Life Expectancy vs. GDP in 2007', fontsize=16)",
"_____no_output_____"
]
],
[
[
"## Save your plot to a file",
"_____no_output_____"
]
],
[
[
"# fig = plt.gcf() --> get current figure\n\ndata.T.plot(kind='line')\n# must get the current figure AFTER it has been plotted\nfig = plt.gcf() \nplt.legend(loc='upper left')\nplt.xlabel('Year')\nplt.ylabel('GDP per capita')\n\nfig.savefig('my_figure.png')",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07e6965a96e3144402ef5122043ae007c856193 | 16,876 | ipynb | Jupyter Notebook | python_pd/04_functions/07_decorators.ipynb | AsakoKabe/python-bp | 4eb0d9e3836c083ab93031acb4e105807b323f61 | [
"MIT"
] | null | null | null | python_pd/04_functions/07_decorators.ipynb | AsakoKabe/python-bp | 4eb0d9e3836c083ab93031acb4e105807b323f61 | [
"MIT"
] | null | null | null | python_pd/04_functions/07_decorators.ipynb | AsakoKabe/python-bp | 4eb0d9e3836c083ab93031acb4e105807b323f61 | [
"MIT"
] | null | null | null | 27.986733 | 392 | 0.535435 | [
[
[
"# Decorators\n\nA decorator is a function that takes an object as one of its arguments and returns something. In Python, decorators can be applied to everything: functions, classes, and methods. The main purpose of decorators is to change the behaviour of an object without changing the object itself. This is a very flexible feature of the language.\n\nFunctions are decorated using the following syntax\n\n```Python\n@decorator\ndef function():\n ...\n```\n\nThis notation is equivalent to the following definition:\n\n```Python\ndef function():\n ...\n\nfunction = decorator(function)\n```\n\nHere the result of calling ```decorator``` is bound back to the name ```function```.\n\nDecorators can be used, for example, to measure function execution time, count calls, cache results, warn about the use of deprecated functions, trace execution, or support design by contract.\n\nLet's look at an example that measures the execution time of a function.",
"_____no_output_____"
]
],
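One of the use cases mentioned above is caching. A minimal sketch of a caching (memoising) decorator, ignoring for now the attribute-copying issues discussed below:

```python
def memoize(f):
    """A minimal caching decorator (hashable positional arguments only)."""
    cache = {}
    def inner(*args):
        if args not in cache:        # compute each argument tuple only once
            cache[args] = f(*args)
        return cache[args]
    return inner

@memoize
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(30))  # 832040, linear rather than exponential time thanks to the cache
```

The standard library ships a production-quality version of this idea as ```functools.lru_cache```.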
[
[
"import time\n\ndef timeit(f):\n def inner(*args, **kwargs):\n start = time.time()\n res = f(*args, **kwargs)\n end = time.time()\n print(f'{end - start} seconds')\n return res\n return inner\n\n\n@timeit\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\n\nres = my_sum([i for i in range(int(1e5))])",
"0.0019989013671875 seconds\n"
]
],
[
[
"This implementation has several problems:\n- there is no way to switch the timing off;\n- output always goes to the standard output stream (```sys.stdout```);\n- the docstring and attributes of the decorated function are lost.",
"_____no_output_____"
]
],
[
[
"print(f'{my_sum.__name__ = }')\nprint(f'{my_sum.__doc__ = }')\nhelp(my_sum)",
"my_sum.__name__ = 'inner'\nmy_sum.__doc__ = None\nHelp on function inner in module __main__:\n\ninner(*args, **kwargs)\n\n"
]
],
[
[
"Since functions in Python are objects, they can be modified at run time, and this is the key to solving the problem: we can copy the required attributes of the decorated function onto the wrapper.\n\nTo avoid copying every attribute by hand, the standard library module ```functools``` provides a ready-made implementation of this functionality.",
"_____no_output_____"
]
],
[
[
"from functools import wraps\n\ndef timeit(f):\n @wraps(f)\n def inner(*args, **kwargs):\n start = time.time()\n res = f(*args, **kwargs)\n end = time.time()\n print(f'{end - start} seconds')\n return res\n return inner\n\n\n@timeit\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\n\nprint(f'{my_sum.__name__ = }')\nprint(f'{my_sum.__doc__ = }')\nhelp(my_sum)",
"my_sum.__name__ = 'my_sum'\nmy_sum.__doc__ = 'Функция суммы'\nHelp on function my_sum in module __main__:\n\nmy_sum(*args, **kwargs)\n Функция суммы\n\n"
]
],
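For interval timing, ```time.perf_counter()``` is a better clock than ```time.time()```: it is monotonic and has higher resolution. A variant of the same decorator (a sketch, not part of the original notebook):

```python
import time
from functools import wraps

def timeit(f):
    @wraps(f)
    def inner(*args, **kwargs):
        start = time.perf_counter()              # monotonic, high-resolution clock
        res = f(*args, **kwargs)
        elapsed = time.perf_counter() - start
        print(f'{f.__name__}: {elapsed:.6f} seconds')
        return res
    return inner

@timeit
def my_sum(values):
    """Sum of the given values."""
    return sum(values)

print(my_sum(range(100000)))
```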
[
[
"# Parameterised decorators\n\nThe decorator we implemented has a very limited range of use, so let's try to extend it.\n\nDisabling the decorator can be implemented with a global variable, for example ```dec_enabled```, which is ```True``` when the decorator is active and ```False``` otherwise.\n\nWriting not only to the standard output stream (```sys.stdout```) but also to the error stream (```sys.stderr```) or a file can be achieved by passing arguments. Adding arguments to decorators complicates things a little.\n\n```python\n@decorator(arg)\ndef foo():\n ...\n```\n\nIn this case an extra step appears, namely evaluating the decorator itself.\n\n```python\ndef foo():\n ...\n\ndec = decorator(arg) # new step\nfoo = dec(foo)\n```\n\nThe argument-passing problem can be solved in several ways. The first of them, and not the best one, is to add one more nested function.",
"_____no_output_____"
]
],
[
[
"import sys\n\ndec_enabled = True\n\ndef timeit(file):\n def dec(func):\n @wraps(func)\n def inner(*args, **kwargs):\n start = time.time()\n res = func(*args, **kwargs)\n end = time.time()\n print(f'{end - start} seconds', file=file)\n return res\n return inner if dec_enabled else func\n return dec\n\n@timeit(sys.stderr)\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\nres = my_sum([i for i in range(int(1e5))])\nprint(res)",
"4999950000\n0.0009996891021728516 seconds\n"
]
],
[
[
"This variant works when the function is decorated as ```@timeit(sys.stderr)```. However, writing triply nested decorators all the time is not the Pythonic way. Instead, we can write, once, a decorator for decorators that enables argument passing (yes, a decorator for a decorator).",
"_____no_output_____"
]
],
[
[
"from functools import update_wrapper\n\ndef with_args(dec):\n @wraps(dec)\n def wrapper(*args, **kwargs):\n def decorator(func):\n res = dec(func, *args, **kwargs)\n update_wrapper(res, func)\n return res\n return decorator\n return wrapper",
"_____no_output_____"
]
],
[
[
"The ```with_args``` function takes a decorator and wraps it in the ```wrapper``` function, inside which a new decorator is created. The original decorator itself is left unchanged.",
"_____no_output_____"
]
],
[
[
"dec_enabled = True\n\n@with_args\ndef timeit(func, file):\n def inner(*args, **kwargs):\n start = time.time()\n res = func(*args, **kwargs)\n end = time.time()\n print(f'{end - start} seconds', file=file)\n return res\n return inner if dec_enabled else func\n\n@timeit(sys.stderr)\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\nres = my_sum([i for i in range(int(1e5))])\nprint(res)",
"4999950000\n0.001997709274291992 seconds\n"
]
],
[
[
"However, this is still too complicated. It is much more convenient to make the decorator callable without arguments as well. Let's try using keyword-only arguments.",
"_____no_output_____"
]
],
[
[
"dec_enabled = True\n\ndef timeit(func=None, *, file=sys.stderr):\n if func is None:\n def dec(func):\n return timeit(func, file=file)\n return dec if dec_enabled else func\n @wraps(func)\n def inner(*args, **kwargs):\n start = time.time()\n res = func(*args, **kwargs)\n end = time.time()\n print(f'{end - start} seconds', file=file)\n return res\n return inner if dec_enabled else func",
"_____no_output_____"
]
],
[
[
"Now the ```timeit``` decorator can be invoked in two ways. First, without passing any arguments; the output then goes to the default stream. Remembering that the decorator expands to ```f = timeit(f)```, we can see that the ```func``` argument receives the function ```f```, so the first condition is not met and the ```inner``` wrapper is created.",
"_____no_output_____"
]
],
[
[
"dec_enabled = True\n\n@timeit\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\nres = my_sum([i for i in range(int(1e5))])\nprint(res)",
"4999950000\n0.0009999275207519531 seconds\n"
]
],
[
[
"Second, by passing ```sys.stderr``` or a file object as the keyword argument ```file```. In this case the decorator is called explicitly as ```timeit(file=sys.stderr)``` without the ```func``` argument, so ```func``` takes the value ```None```; the first condition is therefore met and the ```dec``` wrapper is created.",
"_____no_output_____"
]
],
[
[
"dec_enabled = True\n\n@timeit(file=sys.stderr)\ndef my_sum(*args, **kwargs):\n \"\"\"Функция суммы\"\"\"\n return sum(*args, **kwargs)\n\nres = my_sum([i for i in range(int(1e5))])\nprint(res)",
"4999950000\n0.000997304916381836 seconds\n"
]
],
[
[
"Thanks to the ```dec_enabled``` variable, the timing can be switched off entirely; in that case there is no overhead from calling extra functions.\n\nSeveral decorators can be applied to a single function at once, and the order in which they run depends on the order in which they are applied. Let's look at a hamburger example.",
"_____no_output_____"
]
],
[
[
"def with_bun(f):\n @wraps(f)\n def inner():\n print('-' * 8)\n f()\n print('-' * 8)\n return inner\n\ndef with_vegetables(f):\n @wraps(f)\n def inner():\n print(' onion')\n f()\n print(' tomato')\n return inner\n\ndef with_sauce(f):\n @wraps(f)\n def inner():\n print(' sauce')\n f()\n return inner",
"_____no_output_____"
]
],
[
[
"Let's define the main function and decorate it.",
"_____no_output_____"
]
],
[
[
"@with_bun\n@with_vegetables\n@with_sauce\ndef burger():\n print(' cutlet')\n\nburger()",
"--------\n onion\n sauce\n cutlet\n tomato\n--------\n"
]
],
[
[
"If we write this decoration out explicitly, we get the following sequence of calls:",
"_____no_output_____"
]
],
[
[
"def burger():\n print(' cutlet')\n\nburger = with_sauce(burger)\nburger = with_vegetables(burger)\nburger = with_bun(burger)\n\nburger()",
"--------\n onion\n sauce\n cutlet\n tomato\n--------\n"
]
],
[
[
"The bottom (innermost) decorator is applied first. If the decoration order is changed, the result changes accordingly.\n\nHere are a couple more decorator examples. A function-call tracing decorator:",
"_____no_output_____"
]
],
[
[
"def trace(function=None, *, file=sys.stderr):\n if function is None:\n def dec(function):\n return trace(function, file=file)\n return dec if dec_enabled else function\n \n @wraps(function)\n def inner(*args, **kwargs):\n print(f'{function.__name__}, {args}, {kwargs}')\n return function(*args, **kwargs)\n return inner if dec_enabled else function\n\n@trace\ndef foo():\n print('Nothing')\n\nfoo()",
"foo, (), {}\nNothing\n"
]
],
[
[
"A decorator that checks whether a user is logged in (in simplified form).",
"_____no_output_____"
]
],
[
[
"def is_authenticated(user):\n return user in ('monty', 'guido')\n\ndef login_required(function=None, login_url=''):\n def user_passes_test(view_func):\n @wraps(view_func)\n def wrapped(user, *args, **kwargs):\n if is_authenticated(user):\n return view_func(user, *args, **kwargs)\n print(f'Пользователь {user} перенаправлен на страницу логина: {login_url}')\n return wrapped\n\n if function:\n return user_passes_test(function)\n return user_passes_test\n\n@login_required(login_url='localhost/login')\ndef foo(user):\n print(f'{user = }')\n\n\nfoo('monty')\nfoo('guido')\nfoo('pyuty')",
"user = 'monty'\nuser = 'guido'\nПользователь pyuty перенаправлен на страницу логина: localhost/login\n"
]
],
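The introduction also mentioned warning about deprecated functions as a decorator use case; a parameterised sketch using the standard ```warnings``` module (function names here are illustrative):

```python
import warnings
from functools import wraps

def deprecated(message):
    """Parameterised decorator that warns when an obsolete function is called."""
    def dec(func):
        @wraps(func)
        def inner(*args, **kwargs):
            warnings.warn(f'{func.__name__} is deprecated: {message}',
                          DeprecationWarning, stacklevel=2)
            return func(*args, **kwargs)
        return inner
    return dec

@deprecated('use new_api() instead')
def old_api(x):
    return x * 2

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    result = old_api(21)

print(result, len(caught))
```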
[
[
"# Useful links\n\n- [Decorators with parameters?](https://stackoverflow.com/questions/5929107/decorators-with-parameters)\n- [Reuven M. Lerner - Practical decorators - PyCon 2019](https://www.youtube.com/watch?v=MjHpMCIvwsY&feature=youtu.be)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07e6d395c304073dcca623724dce3e753ec9782 | 3,595 | ipynb | Jupyter Notebook | Decomposition.ipynb | VladimirsHisamutdinovs/data-mining | 42d8e20f14f740c783d8ad4d1edbbd51e292fa61 | [
"Apache-2.0"
] | null | null | null | Decomposition.ipynb | VladimirsHisamutdinovs/data-mining | 42d8e20f14f740c783d8ad4d1edbbd51e292fa61 | [
"Apache-2.0"
] | null | null | null | Decomposition.ipynb | VladimirsHisamutdinovs/data-mining | 42d8e20f14f740c783d8ad4d1edbbd51e292fa61 | [
"Apache-2.0"
] | null | null | null | 23.496732 | 107 | 0.567177 | [
[
[
"# TIME-SERIES DECOMPOSITION\n\n**File:** Decomposition.ipynb\n\n**Course:** Data Science Foundations: Data Mining in Python",
"_____no_output_____"
],
[
"# IMPORT LIBRARIES",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np\nfrom matplotlib import pyplot as plt\nfrom matplotlib.dates import DateFormatter\nfrom statsmodels.tsa.seasonal import seasonal_decompose",
"_____no_output_____"
]
],
[
[
"# LOAD AND PREPARE DATA",
"_____no_output_____"
]
],
[
[
"df = pd.read_csv('data/AirPassengers.csv', parse_dates=['Month'], index_col=['Month'])",
"_____no_output_____"
]
],
[
[
"# PLOT DATA",
"_____no_output_____"
]
],
[
[
"fig, ax = plt.subplots()\nplt.xlabel('Year: 1949-1960')\nplt.ylabel('Monthly Passengers (1000s)')\nplt.title('Monthly Intl Air Passengers')\nplt.plot(df, color='black')\nax.xaxis.set_major_formatter(DateFormatter('%Y'))",
"_____no_output_____"
]
],
[
[
"# DECOMPOSE TIME SERIES",
"_____no_output_____"
],
[
"- Decompose the time series into three components: trend, seasonal, and residuals or noise.\n- This command also plots the components. \n- The argument `period` specifies that there are 12 observations (i.e., months) in the cycle.\n- By default, `seasonal_decompose` performs an additive (as opposed to multiplicative) decomposition.",
"_____no_output_____"
]
],
[
[
"# Set the figure size\nplt.rcParams['figure.figsize'] = [7, 8]\n\n# Plot the decomposition components\nsd = seasonal_decompose(df, period=12).plot()",
"_____no_output_____"
]
],
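Under the hood, an additive decomposition like the one `seasonal_decompose` produces can be approximated with a centred moving average for the trend and per-phase means for the seasonal component. A rough numpy-only sketch on synthetic data (all numbers here are made up, not the AirPassengers series):

```python
import numpy as np

# Synthetic monthly series: linear trend + 12-month seasonality + noise
rng = np.random.default_rng(0)
n, period = 48, 12
t = np.arange(n)
series = 10 + 0.5 * t + 3 * np.sin(2 * np.pi * t / period) + rng.normal(0, 0.1, n)

# Trend: centred moving average over one full period
kernel = np.ones(period) / period
trend = np.convolve(series, kernel, mode='valid')        # length n - period + 1
aligned = series[period // 2 : period // 2 + trend.size]
t_aligned = t[period // 2 : period // 2 + trend.size]

# Seasonal: mean of the detrended values at each phase of the cycle
detrended = aligned - trend
seasonal = np.array([detrended[t_aligned % period == p].mean() for p in range(period)])

# Residual: what is left after removing trend and seasonality
resid = detrended - seasonal[t_aligned % period]
print(float(np.abs(resid).max()))   # small: close to the injected noise level
```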
[
[
"- For growth over time, it may be more appropriate to use a multiplicative trend.\n- The approach can show consistent changes by percentage.\n- In this approach, the residuals should be centered on 1 instead of 0.",
"_____no_output_____"
]
],
[
[
"sd = seasonal_decompose(df, model='multiplicative').plot()",
"_____no_output_____"
]
],
[
[
"# CLEAN UP\n\n- If desired, clear the results with Cell > All Output > Clear. \n- Save your work by selecting File > Save and Checkpoint.\n- Shut down the Python kernel and close the file by selecting File > Close and Halt.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
d07e73a1e2c771bcdbf7d3ad4790b65549c2744f | 177,374 | ipynb | Jupyter Notebook | Chapter 08 - Heat Equations/801_Heat Equation- FTCS.ipynb | jjcrofts77/Numerical-Analysis-Python | 97e4b9274397f969810581ff95f4026f361a56a2 | [
"MIT"
] | 69 | 2019-09-05T21:39:12.000Z | 2022-03-26T14:00:25.000Z | Chapter 08 - Heat Equations/801_Heat Equation- FTCS.ipynb | Zak2020/Numerical-Analysis-Python | edd89141efc6f46de303b7ccc6e78df68b528a91 | [
"MIT"
] | null | null | null | Chapter 08 - Heat Equations/801_Heat Equation- FTCS.ipynb | Zak2020/Numerical-Analysis-Python | edd89141efc6f46de303b7ccc6e78df68b528a91 | [
"MIT"
] | 13 | 2021-06-17T15:34:04.000Z | 2022-01-14T14:53:43.000Z | 324.860806 | 95,690 | 0.906649 | [
[
[
"<a href=\"https://colab.research.google.com/github/john-s-butler-dit/Numerical-Analysis-Python/blob/master/Chapter%2008%20-%20Heat%20Equations/801_Heat%20Equation-%20FTCS.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# The Explicit Forward Time Centered Space (FTCS) Difference Equation for the Heat Equation\n\n#### John S Butler [email protected] [Course Notes](https://johnsbutler.netlify.com/files/Teaching/Numerical_Analysis_for_Differential_Equations.pdf) [Github](https://github.com/john-s-butler-dit/Numerical-Analysis-Python)\n## Overview\nThis notebook will implement the explicit Forward Time Centered Space (FTCS) Difference method for the Heat Equation.\n\n## The Heat Equation\nThe Heat Equation is the first order in time ($t$) and second order in space ($x$) Partial Differential Equation [1-3]: \n\\begin{equation} \\frac{\\partial u}{\\partial t} = \\frac{\\partial^2 u}{\\partial x^2}.\\end{equation}\nThe equation describes heat transfer on the domain\n\\begin{equation} \\Omega = \\{ (x,t) : 0 \\leq x \\leq 1, \\ t \\geq 0\\}, \\end{equation}\nwith an initial condition at time $t=0$ for all $x$ and boundary conditions on the left ($x=0$) and right ($x=1$) sides.\n\n## Forward Time Centered Space (FTCS) Difference method\nThis notebook will illustrate the Forward Time Centered Space (FTCS) Difference method for the Heat Equation with the __initial conditions__ \n\\begin{equation} u(x,0)=2x, \\ \\ 0 \\leq x \\leq \\frac{1}{2}, \\end{equation}\n\\begin{equation} u(x,0)=2(1-x), \\ \\ \\frac{1}{2} \\leq x \\leq 1, \\end{equation}\nand __boundary conditions__\n\\begin{equation}u(0,t)=0, u(1,t)=0. \\end{equation}\n",
"_____no_output_____"
]
],
[
[
"# LIBRARY\n# vector manipulation\nimport numpy as np\n# math functions\nimport math \n\n# THIS IS FOR PLOTTING\n%matplotlib inline\nimport matplotlib.pyplot as plt # side-stepping mpl backend\nimport warnings\nwarnings.filterwarnings(\"ignore\")",
"_____no_output_____"
]
],
[
[
"## Discrete Grid\nThe region $\\Omega$ is discretised into a uniform mesh $\\Omega_h$. The space $x$ direction is divided into $N$ steps giving a stepsize of\n\\begin{equation}h=\\frac{1-0}{N},\\end{equation}\nresulting in \n\\begin{equation}x[i]=0+ih, \\ \\ \\ i=0,1,...,N,\\end{equation}\nand the time $t$ direction into $N_t$ steps giving a stepsize of \n\\begin{equation} k=\\frac{1-0}{N_t},\\end{equation}\nresulting in \n\\begin{equation}t[j]=0+jk, \\ \\ \\ j=0,...,15.\\end{equation}\nThe Figure below shows the first 15 time steps of the discrete grid for $N=10$ and $N_t=1000$, the known boundary conditions (green), initial conditions (blue) and the unknown values (red) of the Heat Equation.",
"_____no_output_____"
]
],
[
[
"N=10\nNt=1000\nh=1/N\nk=1/Nt\nr=k/(h*h)\ntime_steps=15\ntime=np.arange(0,(time_steps+.5)*k,k)\nx=np.arange(0,1.0001,h)\nX, Y = np.meshgrid(x, time)\nfig = plt.figure()\nplt.plot(X,Y,'ro');\nplt.plot(x,0*x,'bo',label='Initial Condition');\nplt.plot(np.ones(time_steps+1),time,'go',label='Boundary Condition');\nplt.plot(x,0*x,'bo');\nplt.plot(0*time,time,'go');\nplt.xlim((-0.02,1.02))\nplt.xlabel('x')\nplt.ylabel('time (ms)')\nplt.legend(loc='center left', bbox_to_anchor=(1, 0.5))\nplt.title(r'Discrete Grid $\\Omega_h,$ h= %s, k=%s'%(h,k),fontsize=24,y=1.08)\nplt.show();",
"_____no_output_____"
]
],
[
[
"## Discrete Initial and Boundary Conditions\n\nThe discrete initial conditions are \n\\begin{equation} w[i,0]=2x[i], \\ \\ 0 \\leq x[i] \\leq \\frac{1}{2}, \\end{equation}\n\\begin{equation}w[i,0]=2(1-x[i]), \\ \\ \\frac{1}{2} \\leq x[i] \\leq 1, \\end{equation}\nand the discrete boundary conditions are \n\\begin{equation} w[0,j]=0, w[10,j]=0, \\end{equation}\nwhere $w[i,j]$ is the numerical approximation of $U(x[i],t[j])$.\n\nThe Figure below plots values of $w[i,0]$ for the initial (blue) and boundary (green) conditions for $t[0]=0.$",
"_____no_output_____"
]
],
[
[
"w=np.zeros((N+1,time_steps+1))\nb=np.zeros(N-1)\n# Initial Condition\nfor i in range (1,N):\n w[i,0]=2*x[i]\n if x[i]>0.5:\n w[i,0]=2*(1-x[i])\n \n\n# Boundary Condition (loop variable j avoids shadowing the stepsize k)\nfor j in range (0,time_steps):\n w[0,j]=0\n w[N,j]=0\n\nfig = plt.figure(figsize=(8,4))\nplt.plot(x,w[:,0],'o:',label='Initial Condition')\nplt.plot(x[[0,N]],w[[0,N],0],'go',label='Boundary Condition t[0]=0')\n#plt.plot(x[N],w[N,0],'go')\nplt.xlim([-0.1,1.1])\nplt.ylim([-0.1,1.1])\nplt.title('Initial and Boundary Condition',fontsize=24)\nplt.xlabel('x')\nplt.ylabel('w')\nplt.legend(loc='best')\nplt.show()",
"_____no_output_____"
]
],
[
[
"## The Explicit Forward Time Centered Space (FTCS) Difference Equation\nThe explicit Forward Time Centered Space (FTCS) difference equation of the Heat Equation\nis derived by discretising \n\\begin{equation} \\frac{\\partial u_{ij}}{\\partial t} = \\frac{\\partial^2 u_{ij}}{\\partial x^2},\\end{equation}\naround $(x_i,t_{j})$ giving the difference equation\n\\begin{equation}\n\\frac{w_{ij+1}-w_{ij}}{k}=\\frac{w_{i+1j}-2w_{ij}+w_{i-1j}}{h^2},\n\\end{equation}\nrearranging the equation we get\n\\begin{equation}\nw_{ij+1}=rw_{i-1j}+(1-2r)w_{ij}+rw_{i+1j},\n\\end{equation}\nfor $i=1,...,9$, where $r=\\frac{k}{h^2}$.\n\nThis gives the formula for the unknown term $w_{ij+1}$ at the $(ij+1)$ mesh points\nin terms of $x[i]$ along the jth time row.\n\nHence we can calculate the unknown pivotal values of $w$ along the first row of $j=1$ in terms of the known boundary conditions.\n\nThis can be written in matrix form \n\\begin{equation}\\mathbf{w}_{j+1}=A\\mathbf{w}_{j} +\\mathbf{b}_{j} \\end{equation}\nfor which $A$ is a $9\\times9$ matrix:\n\\begin{equation}\n\\left(\\begin{array}{c}\nw_{1j+1}\\\\\nw_{2j+1}\\\\\nw_{3j+1}\\\\\nw_{4j+1}\\\\\nw_{5j+1}\\\\\nw_{6j+1}\\\\\nw_{7j+1}\\\\\nw_{8j+1}\\\\\nw_{9j+1}\\\\\n\\end{array}\\right)\n=\\left(\\begin{array}{ccccccccc}\n1-2r&r&0&0&0&0&0&0&0\\\\\nr&1-2r&r&0&0&0&0&0&0\\\\\n0&r&1-2r&r&0&0&0&0&0\\\\\n0&0&r&1-2r&r&0&0&0&0\\\\\n0&0&0&r&1-2r&r&0&0&0\\\\\n0&0&0&0&r&1-2r&r&0&0\\\\\n0&0&0&0&0&r&1-2r&r&0\\\\\n0&0&0&0&0&0&r&1-2r&r\\\\\n0&0&0&0&0&0&0&r&1-2r\\\\\n\\end{array}\\right)\n\\left(\\begin{array}{c}\nw_{1j}\\\\\nw_{2j}\\\\\nw_{3j}\\\\\nw_{4j}\\\\\nw_{5j}\\\\\nw_{6j}\\\\\nw_{7j}\\\\\nw_{8j}\\\\\nw_{9j}\\\\\n\\end{array}\\right)+\n\\left(\\begin{array}{c}\nrw_{0j}\\\\\n0\\\\\n0\\\\\n0\\\\\n0\\\\\n0\\\\\n0\\\\\n0\\\\\nrw_{10j}\\\\\n\\end{array}\\right).\n\\end{equation}\nIt is assumed that the boundary values $w_{0j}$ and $w_{10j}$ are known for $j=1,2,...$, and $w_{i0}$ for $i=0,...,10$ is the initial condition.\n\nThe Figure below shows the values of the $9\\times 9$ matrix in colour plot form for $r=\\frac{k}{h^2}$.",
"_____no_output_____"
]
],
[
[
"A=np.zeros((N-1,N-1))\nfor i in range (0,N-1):\n A[i,i]=1-2*r # DIAGONAL\n\nfor i in range (0,N-2): \n A[i+1,i]=r # UPPER DIAGONAL\n A[i,i+1]=r # LOWER DIAGONAL\n \nfig = plt.figure(figsize=(6,4));\n#plt.matshow(A);\nplt.imshow(A,interpolation='none');\nplt.xticks(np.arange(N-1), np.arange(1,N-0.9,1));\nplt.yticks(np.arange(N-1), np.arange(1,N-0.9,1));\nclb=plt.colorbar();\nclb.set_label('Matrix elements values');\n#clb.set_clim((-1,1));\nplt.title('Matrix r=%s'%(np.round(r,3)),fontsize=24)\nfig.tight_layout()\nplt.show();",
"_____no_output_____"
]
],
[
[
"## Results\nTo numerically approximate the solution at $t[1]$ the matrix equation becomes \n\\begin{equation} \\mathbf{w}_{1}=A\\mathbf{w}_{0} +\\mathbf{b}_{0} \\end{equation}\nwhere all the right hand side is known. \nTo approximate solution at time $t[2]$ we use the matrix equation\n\\begin{equation} \\mathbf{w}_{2}=A\\mathbf{w}_{1} +\\mathbf{b}_{1}. \\end{equation}\nEach set of numerical solutions $w[i,j]$ for all $i$ at the previous time step is used to approximate the solution $w[i,j+1]$. \n\nThe Figure below shows the numerical approximation $w[i,j]$ of the Heat Equation using the FTCS method at $x[i]$ for $i=0,...,10$ and time steps $t[j]$ for $j=1,...,15$. The left plot shows the numerical approximation $w[i,j]$ as a function of $x[i]$ with each color representing the different time steps $t[j]$. The right plot shows the numerical approximation $w[i,j]$ as colour plot as a function of $x[i]$, on the $x[i]$ axis and time $t[j]$ on the $y$ axis. \n\nFor $r>\\frac{1}{2}$ the method is unstable resulting a solution that oscillates unnaturally between positive and negative values for each time step.",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(figsize=(12,6))\n\nplt.subplot(121)\nfor j in range (1,time_steps+1):\n b[0]=r*w[0,j-1]\n b[N-2]=r*w[N,j-1]\n w[1:(N),j]=np.dot(A,w[1:(N),j-1])\n plt.plot(x,w[:,j],'o:',label='t[%s]=%s'%(j,np.round(time[j],4)))\nplt.xlabel('x')\nplt.ylabel('w')\n#plt.legend(loc='bottom', bbox_to_anchor=(0.5, -0.1))\nplt.legend(bbox_to_anchor=(-.4, 1), loc=2, borderaxespad=0.)\n\nplt.subplot(122)\nplt.imshow(w.transpose())\nplt.xticks(np.arange(len(x)), x)\nplt.yticks(np.arange(len(time)), np.round(time,4))\nplt.xlabel('x')\nplt.ylabel('time')\nclb=plt.colorbar()\nclb.set_label('Temperature (w)')\nplt.suptitle('Numerical Solution of the Heat Equation r=%s'%(np.round(r,3)),fontsize=24,y=1.08)\nfig.tight_layout()\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Local Trunction Error\nThe local truncation error of the classical explicit difference approach to \n\\begin{equation}\n\\frac{\\partial U}{\\partial t} - \\frac{\\partial^2 U}{\\partial x^2}=0,\n\\end{equation} \nwith\n\n\\begin{equation}\nF_{ij}(w)=\\frac{w_{ij+1}-w_{ij}}{k}-\\frac{w_{i+1j}-2w_{ij}+w_{i-1j}}{h^2}=0,\n\\end{equation} \nis \n\\begin{equation}\nT_{ij}=F_{ij}(U)=\\frac{U_{ij+1}-U_{ij}}{k}-\\frac{U_{i+1j}-2U_{ij}+U_{i-1j}}{h^2},\n\\end{equation} \nBy Taylors expansions we have\n\\begin{eqnarray*}\nU_{i+1j}&=&U((i+1)h,jk)=U(x_i+h,t_j)\\\\\n&=&U_{ij}+h\\left(\\frac{\\partial U}{\\partial x} \\right)_{ij}+\\frac{h^2}{2}\\left(\\frac{\\partial^2 U}{\\partial x^2} \\right)_{ij}+\\frac{h^3}{6}\\left(\\frac{\\partial^3 U}{\\partial x^3} \\right)_{ij} +...\\\\\nU_{i-1j}&=&U((i-1)h,jk)=U(x_i-h,t_j)\\\\\n&=&U_{ij}-h\\left(\\frac{\\partial U}{\\partial x} \\right)_{ij}+\\frac{h^2}{2}\\left(\\frac{\\partial^2 U}{\\partial x^2} \\right)_{ij}-\\frac{h^3}{6}\\left(\\frac{\\partial^3 U}{\\partial x^3} \\right)_{ij} +...\\\\\nU_{ij+1}&=&U(ih,(j+1)k)=U(x_i,t_j+k)\\\\\n&=&U_{ij}+k\\left(\\frac{\\partial U}{\\partial t} \\right)_{ij}+\\frac{k^2}{2}\\left(\\frac{\\partial^2 U}{\\partial t^2} \\right)_{ij}+\\frac{k^3}{6}\\left(\\frac{\\partial^3 U}{\\partial t^3} \\right)_{ij} +...\n\\end{eqnarray*}\n\nsubstitution into the expression for $T_{ij}$ then gives\n\n\\begin{eqnarray*}\nT_{ij}&=&\\left(\\frac{\\partial U}{\\partial t} - \\frac{\\partial^2 U}{\\partial x^2} \\right)_{ij}+\\frac{k}{2}\\left(\\frac{\\partial^2 U}{\\partial t^2} \\right)_{ij}\n-\\frac{h^2}{12}\\left(\\frac{\\partial^4 U}{\\partial x^4} \\right)_{ij}\\\\\n& &\t+\\frac{k^2}{6}\\left(\\frac{\\partial^3 U}{\\partial t^3} \\right)_{ij}\n-\\frac{h^4}{360}\\left(\\frac{\\partial^6 U}{\\partial x^6} \\right)_{ij}+ ...\n\\end{eqnarray*}\nBut $U$ is the solution to the differential equation so\n\\begin{equation} \\left(\\frac{\\partial U}{\\partial t} - \\frac{\\partial^2 U}{\\partial x^2} 
\\right)_{ij}=0,\\end{equation} \n\nthe principal part of the local truncation error is \n\n\\begin{equation}\n\\frac{k}{2}\\left(\\frac{\\partial^2 U}{\\partial t^2} \\right)_{ij}-\\frac{h^2}{12}\\left(\\frac{\\partial^4 U}{\\partial x^4} \\right)_{ij}.\n\\end{equation} \n\n\n\nHence the truncation error is\n\\begin{equation} \nT_{ij}=O(k)+O(h^2).\n\\end{equation} \n\n",
"_____no_output_____"
],
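[
"The conclusion $T_{ij}=O(k)+O(h^2)$ can be checked numerically. The sketch below is an addition for illustration (the helper name `ftcs_residual` is ours, not part of the original notebook): it evaluates $T_{ij}=F_{ij}(U)$ for the exact heat-equation solution $U(x,t)=e^{-\\pi^2 t}\\sin(\\pi x)$ and shows the error shrinking by roughly a factor of four each time $h$ is halved with $k=h^2$.

```python
import numpy as np

def ftcs_residual(h, k, x=0.5, t=0.1):
    # Exact solution U(x,t) = exp(-pi^2 t) sin(pi x) of U_t = U_xx
    U = lambda x, t: np.exp(-np.pi ** 2 * t) * np.sin(np.pi * x)
    # T_ij = (U_{ij+1} - U_{ij})/k - (U_{i+1j} - 2 U_{ij} + U_{i-1j})/h^2
    return ((U(x, t + k) - U(x, t)) / k
            - (U(x + h, t) - 2 * U(x, t) + U(x - h, t)) / h ** 2)

errors = [abs(ftcs_residual(h, h ** 2)) for h in (0.1, 0.05, 0.025)]
print(errors)                                        # decreasing sequence
print(errors[0] / errors[1], errors[1] / errors[2])  # ratios close to 4
```",
"_____no_output_____"
],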
[
"\n## Stability Analysis \n\nTo investigating the stability of the fully explicit FTCS difference method of the Heat Equation, we will use the von Neumann method.\nThe FTCS difference equation is:\n\\begin{equation}\\frac{1}{k}(w_{pq+1}-w_{pq})=\\frac{1}{h_x^2}(w_{p-1q}-2w_{pq}+w_{p+1q}),\\end{equation}\napproximating \n\\begin{equation}\\frac{\\partial U}{\\partial t}=\\frac{\\partial^2 U}{\\partial x^2}\\end{equation}\n\nat $(ph,qk)$. Substituting $w_{pq}=e^{i\\beta x}\\xi^{q}$ into the difference equation gives: \n\\begin{equation}e^{i\\beta ph}\\xi^{q+1}-e^{i\\beta ph}\\xi^{q}=r\\{e^{i\\beta (p-1)h}\\xi^{q}-2e^{i\\beta ph}\\xi^{q}+e^{i\\beta (p+1)h}\\xi^{q} \\}\n\\end{equation}\n\nwhere $r=\\frac{k}{h_x^2}$. Divide across by $e^{i\\beta (p)h}\\xi^{q}$ leads to\n\n\\begin{equation} \\xi-1=r(e^{i\\beta (-1)h} -2+e^{i\\beta h}),\\end{equation}\n\n\\begin{equation}\\xi= 1+r (2\\cos(\\beta h)-2),\\end{equation}\n\n\\begin{equation}\\xi=1-4r(\\sin^2(\\beta\\frac{h}{2})).\\end{equation}\nHence \n\\begin{equation}\\left| 1-4r(\\sin^2(\\beta\\frac{h}{2}) )\\right|\\leq 1\\end{equation}\nfor this to hold \n\\begin{equation} 4r(\\sin^2(\\beta\\frac{h}{2}) )\\leq 2 \\end{equation}\nwhich means \n\\begin{equation} r\\leq \\frac{1}{2}.\\end{equation}\ntherefore the equation is conditionally stable as $0 < \\xi \\leq 1$ for $r<\\frac{1}{2}$ and all $\\beta$ .\n\n",
"_____no_output_____"
],
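[
"The amplification factor derived above can also be examined numerically. This sketch is an addition for illustration (the helper name `max_amplification` is ours): it evaluates $\\xi(\\beta)=1-4r\\sin^2(\\beta h/2)$ over a full range of $\\beta$ and confirms that $\\max_\\beta|\\xi|\\leq 1$ exactly when $r\\leq\\frac{1}{2}$.

```python
import numpy as np

def max_amplification(r, h=0.1):
    # xi(beta) = 1 - 4 r sin^2(beta h / 2), scanned over one full period of beta*h
    beta = np.linspace(0, 2 * np.pi / h, 2001)
    xi = 1 - 4 * r * np.sin(beta * h / 2) ** 2
    return np.max(np.abs(xi))

for r in (0.25, 0.5, 0.75):
    print(r, max_amplification(r))  # exceeds 1 only for r > 1/2
```",
"_____no_output_____"
],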
[
"## References\n[1] G D Smith Numerical Solution of Partial Differential Equations: Finite Difference Method Oxford 1992\n\n[2] Butler, J. (2019). John S Butler Numerical Methods for Differential Equations. [online] Maths.dit.ie. Available at: http://www.maths.dit.ie/~johnbutler/Teaching_NumericalMethods.html [Accessed 14 Mar. 2019].\n\n[3] Wikipedia contributors. (2019, February 22). Heat equation. In Wikipedia, The Free Encyclopedia. Available at: https://en.wikipedia.org/w/index.php?title=Heat_equation&oldid=884580138 [Accessed 14 Mar. 2019].\n\n\n",
"_____no_output_____"
]
],
[
[
"",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
]
] |
d07e83ae8f366afc172df8b17c68abfe5af94515 | 78,448 | ipynb | Jupyter Notebook | 10. Thermometer Encoding and Cluster Evaluation/kMeans_OHE_TE_Eval.ipynb | yedkk/spark-data-mining | c2fc9075f766b5dafb9f52f6a0b92bb22718ff3c | [
"MIT"
] | 1 | 2021-06-01T02:39:47.000Z | 2021-06-01T02:39:47.000Z | 10. Thermometer Encoding and Cluster Evaluation/kMeans_OHE_TE_Eval.ipynb | yedkk/spark-data-mining | c2fc9075f766b5dafb9f52f6a0b92bb22718ff3c | [
"MIT"
] | null | null | null | 10. Thermometer Encoding and Cluster Evaluation/kMeans_OHE_TE_Eval.ipynb | yedkk/spark-data-mining | c2fc9075f766b5dafb9f52f6a0b92bb22718ff3c | [
"MIT"
] | null | null | null | 43.222039 | 1,464 | 0.593272 | [
[
[
"## DS/CMPSC 410 MiniProject #3\n\n### Spring 2021\n### Instructor: John Yen\n### TA: Rupesh Prajapati and Dongkuan Xu\n### Learning Objectives\n- Be able to apply thermometer encoding to encode numerical variables into binary variable format.\n- Be able to apply k-means clustering to the Darknet dataset based on both thermometer encoding and one-hot encoding.\n- Be able to use external labels (e.g., mirai, zmap, and masscan) to evaluate the result of k-means clustering.\n- Be able to investigate characteristics of a cluster using one-hot encoded feature.\n\n### Total points: 100 \n- Exercise 1: 5 points\n- Exercise 2: 5 points \n- Exercise 3: 5 points \n- Exercise 4: 15 points\n- Exercise 5: 5 points\n- Exercise 6: 10 points\n- Exercise 7: 5 points\n- Exercise 8: 5 points\n- Exercise 9: 10 points\n- Exercise 10: 5 points\n- Exercise 11: 10 points\n- Exercise 12: 20 points\n \n### Due: 5 pm, April 23, 2021",
"_____no_output_____"
]
],
[
[
"import pyspark\nimport csv",
"_____no_output_____"
],
[
"from pyspark import SparkContext\nfrom pyspark.sql import SparkSession\nfrom pyspark.sql.types import StructField, StructType, StringType, LongType\nfrom pyspark.sql.functions import col, column\nfrom pyspark.sql.functions import expr\nfrom pyspark.sql.functions import split\nfrom pyspark.sql.functions import array_contains\nfrom pyspark.sql import Row\nfrom pyspark.ml import Pipeline\nfrom pyspark.ml.feature import OneHotEncoder, StringIndexer, VectorAssembler, IndexToString, PCA\nfrom pyspark.ml.clustering import KMeans\nfrom pyspark.ml.evaluation import ClusteringEvaluator",
"_____no_output_____"
],
[
"import pandas as pd\nimport numpy as np\nimport math",
"_____no_output_____"
],
[
"ss = SparkSession.builder.master(\"local\").appName(\"ClusteringTE\").getOrCreate()",
"_____no_output_____"
]
],
[
[
"## Exercise 1 (5 points)\nComplete the path for input file in the code below and enter your name in this Markdown cell:\n- Name: Kangdong Yuan",
"_____no_output_____"
]
],
[
[
"Scanners_df = ss.read.csv(\"/storage/home/kky5082/ds410/Lab10/sampled_profile.csv\", header= True, inferSchema=True )",
"_____no_output_____"
]
],
[
[
"## We can use printSchema() to display the schema of the DataFrame Scanners_df to see whether it was inferred correctly.",
"_____no_output_____"
]
],
[
[
"Scanners_df.printSchema()",
"root\n |-- _c0: integer (nullable = true)\n |-- id: integer (nullable = true)\n |-- numports: integer (nullable = true)\n |-- lifetime: double (nullable = true)\n |-- Bytes: integer (nullable = true)\n |-- Packets: integer (nullable = true)\n |-- average_packetsize: integer (nullable = true)\n |-- MinUniqueDests: integer (nullable = true)\n |-- MaxUniqueDests: integer (nullable = true)\n |-- MinUniqueDest24s: integer (nullable = true)\n |-- MaxUniqueDest24s: integer (nullable = true)\n |-- average_lifetime: double (nullable = true)\n |-- mirai: boolean (nullable = true)\n |-- zmap: boolean (nullable = true)\n |-- masscan: boolean (nullable = true)\n |-- country: string (nullable = true)\n |-- traffic_types_scanned_str: string (nullable = true)\n |-- ports_scanned_str: string (nullable = true)\n |-- host_tags_per_censys: string (nullable = true)\n |-- host_services_per_censys: string (nullable = true)\n\n"
],
[
"Scanners_df.where(col('mirai')).count()",
"_____no_output_____"
]
],
[
[
"# Part A: One Hot Encoding \n## This part is identical to that of Miniproject Deliverable #2\nWe want to apply one hot encoding to the set of ports scanned by scanners. \n- A.1 Like Mini Project deliverable 1 and 2, we first convert the feature \"ports_scanned_str\" to a feature that is an Array of ports\n- A.2 We then calculate the total number of scanners for each port\n- A.3 We identify the top n port to use for one-hot encoding (You choose the number n).\n- A.4 Generate one-hot encoded feature for these top n ports.",
"_____no_output_____"
]
],
[
[
"# Scanners_df.select(\"ports_scanned_str\").show(30)",
"_____no_output_____"
],
[
"Scanners_df2=Scanners_df.withColumn(\"Ports_Array\", split(col(\"ports_scanned_str\"), \"-\") )\n# Scanners_df2.persist().show(10)",
"_____no_output_____"
]
],
[
[
"## A.1 We only need the column ```Ports_Array``` to calculate the top ports being scanned",
"_____no_output_____"
]
],
[
[
"Ports_Scanned_RDD = Scanners_df2.select(\"Ports_Array\").rdd",
"_____no_output_____"
],
[
"# Ports_Scanned_RDD.persist().take(5)",
"_____no_output_____"
]
],
[
[
"## Because each port number in the Ports_Array column for each row occurs only once, we can count the total occurance of each port number through flatMap.",
"_____no_output_____"
]
],
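[
[
"As a plain-Python illustration of the flatMap / map / reduceByKey / sortByKey pipeline used below (the sample port lists here are made up for illustration, not taken from the dataset):

```python
from collections import Counter

# Each element plays the role of one scanner's Ports_Array
ports_arrays = [["23", "2323"], ["445"], ["23", "80", "8080"]]

# flatMap: flatten all port lists into one sequence,
# then count occurrences (map to (port, 1) followed by reduceByKey(add))
flat = [p for row in ports_arrays for p in row]
counts = Counter(flat)

# sortByKey(ascending=False) after swapping each pair to (count, port)
top = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
print(top)  # [('23', 2), ('2323', 1), ('445', 1), ('80', 1), ('8080', 1)]
```",
"_____no_output_____"
]
],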
[
[
"Ports_list_RDD = Ports_Scanned_RDD.map(lambda row: row[0] )",
"_____no_output_____"
],
[
"# Ports_list_RDD.persist()",
"_____no_output_____"
],
[
"Ports_list2_RDD = Ports_Scanned_RDD.flatMap(lambda row: row[0] )",
"_____no_output_____"
],
[
"Port_count_RDD = Ports_list2_RDD.map(lambda x: (x, 1))\n# Port_count_RDD.take(2)",
"_____no_output_____"
],
[
"Port_count_total_RDD = Port_count_RDD.reduceByKey(lambda x,y: x+y, 1)\n# Port_count_total_RDD.persist().take(5)",
"_____no_output_____"
],
[
"Sorted_Count_Port_RDD = Port_count_total_RDD.map(lambda x: (x[1], x[0])).sortByKey( ascending = False)",
"_____no_output_____"
],
[
"# Sorted_Count_Port_RDD.persist().take(50)",
"_____no_output_____"
]
],
[
[
"## Exercise 2 (5%)\nSelect top_ports to be the number of top ports you want to use for one-hot encoding. I recommend a number between 20 and 40.",
"_____no_output_____"
]
],
[
[
"top_ports=30\nSorted_Ports_RDD= Sorted_Count_Port_RDD.map(lambda x: x[1])\nTop_Ports_list = Sorted_Ports_RDD.take(top_ports)",
"_____no_output_____"
],
[
"# Top_Ports_list",
"_____no_output_____"
],
[
"# Scanners_df3=Scanners_df2.withColumn(FeatureName, array_contains(\"Ports_Array\", Top_Ports_list[0]))",
"_____no_output_____"
],
[
"# Scanners_df3.show(10)",
"_____no_output_____"
]
],
[
[
"## A.4 Generate Hot-One Encoded Feature for each of the top ports in the Top_Ports_list\n\n- Iterate through the Top_Ports_list so that each top port is one-hot encoded.",
"_____no_output_____"
],
[
"## Exercise 3 (5 %)\nComplete the following PySpark code for encoding the n ports using One Hot Encoding, where n is specified by the variable ```top_ports```",
"_____no_output_____"
]
],
[
[
"for i in range(0, top_ports - 1):\n # \"Port\" + Top_Ports_list[i] is the name of each new feature created through One Hot Encoding\n Scanners_df3 = Scanners_df2.withColumn(\"Port\" + Top_Ports_list[i], array_contains(\"Ports_Array\", Top_Ports_list[i]))\n Scanners_df2 = Scanners_df3",
"_____no_output_____"
],
[
"Scanners_df2.printSchema()",
"root\n |-- _c0: integer (nullable = true)\n |-- id: integer (nullable = true)\n |-- numports: integer (nullable = true)\n |-- lifetime: double (nullable = true)\n |-- Bytes: integer (nullable = true)\n |-- Packets: integer (nullable = true)\n |-- average_packetsize: integer (nullable = true)\n |-- MinUniqueDests: integer (nullable = true)\n |-- MaxUniqueDests: integer (nullable = true)\n |-- MinUniqueDest24s: integer (nullable = true)\n |-- MaxUniqueDest24s: integer (nullable = true)\n |-- average_lifetime: double (nullable = true)\n |-- mirai: boolean (nullable = true)\n |-- zmap: boolean (nullable = true)\n |-- masscan: boolean (nullable = true)\n |-- country: string (nullable = true)\n |-- traffic_types_scanned_str: string (nullable = true)\n |-- ports_scanned_str: string (nullable = true)\n |-- host_tags_per_censys: string (nullable = true)\n |-- host_services_per_censys: string (nullable = true)\n |-- Ports_Array: array (nullable = true)\n | |-- element: string (containsNull = true)\n |-- Port17132: boolean (nullable = true)\n |-- Port17140: boolean (nullable = true)\n |-- Port17128: boolean (nullable = true)\n |-- Port17138: boolean (nullable = true)\n |-- Port17130: boolean (nullable = true)\n |-- Port17136: boolean (nullable = true)\n |-- Port23: boolean (nullable = true)\n |-- Port445: boolean (nullable = true)\n |-- Port54594: boolean (nullable = true)\n |-- Port17142: boolean (nullable = true)\n |-- Port17134: boolean (nullable = true)\n |-- Port80: boolean (nullable = true)\n |-- Port8080: boolean (nullable = true)\n |-- Port0: boolean (nullable = true)\n |-- Port2323: boolean (nullable = true)\n |-- Port5555: boolean (nullable = true)\n |-- Port81: boolean (nullable = true)\n |-- Port1023: boolean (nullable = true)\n |-- Port52869: boolean (nullable = true)\n |-- Port8443: boolean (nullable = true)\n |-- Port49152: boolean (nullable = true)\n |-- Port7574: boolean (nullable = true)\n |-- Port37215: boolean (nullable = true)\n |-- Port34218: boolean 
(nullable = true)\n |-- Port34220: boolean (nullable = true)\n |-- Port33968: boolean (nullable = true)\n |-- Port34224: boolean (nullable = true)\n |-- Port34228: boolean (nullable = true)\n |-- Port33962: boolean (nullable = true)\n\n"
]
],
[
[
"# Part B Thermometer Encoding of Numerical Variables",
"_____no_output_____"
],
[
"## We encode the numerical variable numports (number of ports being scanned) using thermometer encoding",
"_____no_output_____"
]
],
[
[
"pow(2,15)",
"_____no_output_____"
],
[
"Scanners_df3=Scanners_df2.withColumn(\"TE_numports_0\", col(\"numports\") > 0) \nScanners_df2 = Scanners_df3",
"_____no_output_____"
],
[
"Scanners_df3.count()",
"_____no_output_____"
],
[
"Scanners_df3.where(col('TE_numports_0')).count()",
"_____no_output_____"
]
],
[
[
"# Exercise 4 (15%)\nComplete the following pyspark code to use the column \"numports\" to create 16 additional columns as follows:\n- TE_numports_0 : True, if the scanner scans more than 0 ports, otherwise False.\n- TE_numports_1 : True, if the scanner scans more than 2**0 (1) port, otherwise False.\n- TE_numports_2 : True, if the scanner scans more than 2**1 (2) ports, otherwise False.\n- TE_numports_3 : True, if the scanner scans more than 2**2 (4) ports, otherwise False\n ...\n- TE_numports_15 : True, if the scanner scans more than 2**14 ports, otherwise False\n- TE_numports_16 : True, if the scanner scans more than 2**15 (32768) ports, otherwise False",
"_____no_output_____"
]
],
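[
[
"The bin structure described in Exercise 4 can be sketched in plain Python (an illustration we added; the helper name `thermometer` is ours):

```python
# Thermometer encoding of numports: bit i is True when numports exceeds a threshold.
# Thresholds 0, 2**0, 2**1, ..., 2**15 give the 17 flags TE_numports_0 .. TE_numports_16.
def thermometer(numports):
    thresholds = [0] + [2 ** i for i in range(16)]
    return [numports > t for t in thresholds]

print(thermometer(5))           # True for thresholds 0, 1, 2, 4; False from 8 upward
print(sum(thermometer(40000)))  # 17: a scanner of 40000 ports exceeds every threshold
```",
"_____no_output_____"
]
],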
[
[
"for i in range(0, 16):\n # \"TE_numports_\" + str(i+1) is the name of each new feature created for each Bin in Thermometer Encoding\n Scanners_df3 = Scanners_df2.withColumn(\"TE_numports_\" + str(i+1), col(\"numports\") > pow(2,i))\n Scanners_df2 = Scanners_df3",
"_____no_output_____"
],
[
"Scanners_df2.printSchema()",
"root\n |-- _c0: integer (nullable = true)\n |-- id: integer (nullable = true)\n |-- numports: integer (nullable = true)\n |-- lifetime: double (nullable = true)\n |-- Bytes: integer (nullable = true)\n |-- Packets: integer (nullable = true)\n |-- average_packetsize: integer (nullable = true)\n |-- MinUniqueDests: integer (nullable = true)\n |-- MaxUniqueDests: integer (nullable = true)\n |-- MinUniqueDest24s: integer (nullable = true)\n |-- MaxUniqueDest24s: integer (nullable = true)\n |-- average_lifetime: double (nullable = true)\n |-- mirai: boolean (nullable = true)\n |-- zmap: boolean (nullable = true)\n |-- masscan: boolean (nullable = true)\n |-- country: string (nullable = true)\n |-- traffic_types_scanned_str: string (nullable = true)\n |-- ports_scanned_str: string (nullable = true)\n |-- host_tags_per_censys: string (nullable = true)\n |-- host_services_per_censys: string (nullable = true)\n |-- Ports_Array: array (nullable = true)\n | |-- element: string (containsNull = true)\n |-- Port17132: boolean (nullable = true)\n |-- Port17140: boolean (nullable = true)\n |-- Port17128: boolean (nullable = true)\n |-- Port17138: boolean (nullable = true)\n |-- Port17130: boolean (nullable = true)\n |-- Port17136: boolean (nullable = true)\n |-- Port23: boolean (nullable = true)\n |-- Port445: boolean (nullable = true)\n |-- Port54594: boolean (nullable = true)\n |-- Port17142: boolean (nullable = true)\n |-- Port17134: boolean (nullable = true)\n |-- Port80: boolean (nullable = true)\n |-- Port8080: boolean (nullable = true)\n |-- Port0: boolean (nullable = true)\n |-- Port2323: boolean (nullable = true)\n |-- Port5555: boolean (nullable = true)\n |-- Port81: boolean (nullable = true)\n |-- Port1023: boolean (nullable = true)\n |-- Port52869: boolean (nullable = true)\n |-- Port8443: boolean (nullable = true)\n |-- Port49152: boolean (nullable = true)\n |-- Port7574: boolean (nullable = true)\n |-- Port37215: boolean (nullable = true)\n |-- Port34218: boolean 
(nullable = true)\n |-- Port34220: boolean (nullable = true)\n |-- Port33968: boolean (nullable = true)\n |-- Port34224: boolean (nullable = true)\n |-- Port34228: boolean (nullable = true)\n |-- Port33962: boolean (nullable = true)\n |-- TE_numports_0: boolean (nullable = true)\n |-- TE_numports_1: boolean (nullable = true)\n |-- TE_numports_2: boolean (nullable = true)\n |-- TE_numports_3: boolean (nullable = true)\n |-- TE_numports_4: boolean (nullable = true)\n |-- TE_numports_5: boolean (nullable = true)\n |-- TE_numports_6: boolean (nullable = true)\n |-- TE_numports_7: boolean (nullable = true)\n |-- TE_numports_8: boolean (nullable = true)\n |-- TE_numports_9: boolean (nullable = true)\n |-- TE_numports_10: boolean (nullable = true)\n |-- TE_numports_11: boolean (nullable = true)\n |-- TE_numports_12: boolean (nullable = true)\n |-- TE_numports_13: boolean (nullable = true)\n |-- TE_numports_14: boolean (nullable = true)\n |-- TE_numports_15: boolean (nullable = true)\n |-- TE_numports_16: boolean (nullable = true)\n\n"
]
],
[
[
"# Exercise 5 (5 points)\nWhat is the total number of scanners that scan more than 2^15 (i.e., 32768) ports? Complete the code below using Scanners_df2 to find out the answer.",
"_____no_output_____"
]
],
[
[
"HFScanners_df2 = Scanners_df2.where(col('TE_numports_15'))",
"_____no_output_____"
],
[
"HFScanners_df2.count()",
"_____no_output_____"
]
],
[
[
"# Exercise 6 (10 points)\nComplete the following code to use k-means to cluster the scanners using the following \n- thermometer encoding of 'numports' numerical feature\n- one-hot encoding of top k ports (k chosen by you in Exercise 2).",
"_____no_output_____"
],
[
"## Specify Parameters for k Means Clustering",
"_____no_output_____"
]
],
[
[
"km = KMeans(featuresCol=\"features\", predictionCol=\"prediction\").setK(50).setSeed(123)\nkm.explainParams()",
"_____no_output_____"
],
[
"input_features = []\nfor i in range(0, top_ports - 1):\n input_features.append( \"Port\"+Top_Ports_list[i] )\nfor i in range(0, 15):\n input_features.append( \"TE_numports_\" + str(i))",
"_____no_output_____"
],
[
"print(input_features)",
"['Port17132', 'Port17140', 'Port17128', 'Port17138', 'Port17130', 'Port17136', 'Port23', 'Port445', 'Port54594', 'Port17142', 'Port17134', 'Port80', 'Port8080', 'Port0', 'Port2323', 'Port5555', 'Port81', 'Port1023', 'Port52869', 'Port8443', 'Port49152', 'Port7574', 'Port37215', 'Port34218', 'Port34220', 'Port33968', 'Port34224', 'Port34228', 'Port33962', 'TE_numports_0', 'TE_numports_1', 'TE_numports_2', 'TE_numports_3', 'TE_numports_4', 'TE_numports_5', 'TE_numports_6', 'TE_numports_7', 'TE_numports_8', 'TE_numports_9', 'TE_numports_10', 'TE_numports_11', 'TE_numports_12', 'TE_numports_13', 'TE_numports_14']\n"
],
[
"va = VectorAssembler().setInputCols(input_features).setOutputCol(\"features\")",
"_____no_output_____"
],
[
"data= va.transform(Scanners_df2)",
"_____no_output_____"
],
[
"data.persist()",
"_____no_output_____"
],
[
"kmModel=km.fit(data)",
"_____no_output_____"
],
[
"kmModel",
"_____no_output_____"
],
[
"predictions = kmModel.transform(data)",
"_____no_output_____"
],
[
"predictions.persist()",
"_____no_output_____"
],
[
"Cluster1_df=predictions.where(col(\"prediction\")==0)",
"_____no_output_____"
],
[
"Cluster1_df.persist().count()",
"_____no_output_____"
]
],
[
[
"## Exercise 7 (5 points)\nComplete the following code to find the size of all of the clusters generated.",
"_____no_output_____"
]
],
[
[
"summary = kmModel.summary",
"_____no_output_____"
],
[
"summary.clusterSizes",
"_____no_output_____"
]
],
[
[
"# Exercise 8 (5 points)\nComplete the following code to find the Silhouette Score of the clustering result.",
"_____no_output_____"
]
],
[
[
"evaluator = ClusteringEvaluator()\nsilhouette = evaluator.evaluate(predictions)",
"_____no_output_____"
],
[
"print('Silhouette Score of the Clustering Result is ', silhouette)",
"Silhouette Score of the Clustering Result is 0.7291143984546943\n"
],
[
"centers = kmModel.clusterCenters()",
"_____no_output_____"
],
[
"centers[0]",
"_____no_output_____"
],
[
"print(\"Cluster Centers:\")\ni=0\nfor center in centers:\n print(\"Cluster \", str(i+1), center)\n i = i+1",
"Cluster Centers:\nCluster 1 [9.87079646e-01 9.83893805e-01 9.85663717e-01 9.87256637e-01\n 9.84424779e-01 9.82477876e-01 7.07964602e-04 8.84955752e-04\n 1.94690265e-03 1.00000000e+00 9.40707965e-01 5.30973451e-04\n 3.53982301e-04 6.37168142e-03 1.76991150e-04 1.59292035e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.05486726e-01\n 1.99115044e-01 1.97345133e-01 2.03362832e-01 2.00000000e-01\n 1.99469027e-01 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.00000000e+00 1.52212389e-02 1.59292035e-03\n 5.30973451e-04 1.76991150e-04 1.76991150e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 2 [0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.22139891e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 8.14265939e-05\n 0.00000000e+00 4.07132970e-05 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.00000000e+00 7.93909291e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 3 [1.41242938e-02 1.66352793e-02 1.06716886e-02 9.41619586e-03\n 1.12994350e-02 1.16133082e-02 1.06716886e-02 1.13622097e-01\n 1.00000000e+00 5.96359071e-03 6.59133710e-03 6.27746390e-03\n 6.59133710e-03 7.21908349e-03 0.00000000e+00 3.76647834e-03\n 6.27746390e-04 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 3.13873195e-04 6.27746390e-04 3.13873195e-04\n 1.25549278e-03 3.13873195e-04 3.13873195e-04 9.41619586e-04\n 6.27746390e-04 1.00000000e+00 1.00000000e+00 4.33145009e-02\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 
0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 4 [0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.04040404e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.00000000e+00 0.00000000e+00 6.31313131e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.26262626e-04 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.00000000e+00 2.97979798e-02 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 5 [7.07714952e-02 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 6.58653256e-02 1.00000000e+00 0.00000000e+00 2.08512204e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.45308475e-04\n 0.00000000e+00 8.58579664e-04 0.00000000e+00 2.45308475e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.45308475e-03\n 2.45308475e-03 2.69839323e-03 2.94370170e-03 2.69839323e-03\n 2.33043052e-03 1.00000000e+00 1.81896235e-01 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 6 [0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 7 [0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
0.]\nCluster 8 [1.72777778e-01 1.72777778e-01 0.00000000e+00 1.48888889e-01\n 1.56111111e-01 1.75000000e-01 0.00000000e+00 2.77777778e-03\n 0.00000000e+00 1.00000000e+00 7.55555556e-02 0.00000000e+00\n 0.00000000e+00 1.66666667e-03 0.00000000e+00 5.55555556e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.11111111e-03\n 5.55555556e-03 5.55555556e-03 5.00000000e-03 3.88888889e-03\n 6.66666667e-03 1.00000000e+00 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 9 [0.00000000e+00 1.35969142e-01 2.14079074e-01 1.90935391e-01\n 1.00000000e+00 2.32401157e-01 0.00000000e+00 9.64320154e-04\n 9.64320154e-04 1.13789778e-01 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 6.75024108e-03\n 9.64320154e-04 6.75024108e-03 8.67888139e-03 7.71456123e-03\n 1.25361620e-02 1.00000000e+00 1.00000000e+00 6.95274831e-01\n 1.44648023e-02 2.89296046e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 10 [1.03385888e-03 2.06771776e-03 1.03385888e-03 7.75394159e-04\n 1.29232360e-03 1.03385888e-03 6.70199018e-01 1.55078832e-02\n 2.58464720e-03 2.58464720e-04 7.75394159e-04 9.98190747e-01\n 9.95864564e-01 4.47143965e-02 9.53476350e-01 9.11863531e-01\n 9.24786767e-01 9.22202119e-01 9.20909796e-01 9.25820625e-01\n 9.20392866e-01 9.05918842e-01 9.07728095e-01 2.58464720e-04\n 2.58464720e-04 2.58464720e-04 2.58464720e-04 2.58464720e-04\n 2.58464720e-04 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.00000000e+00 4.62651848e-02 1.39570949e-02\n 1.39570949e-02 1.39570949e-02 
1.29232360e-02 1.21478418e-02\n 2.58464720e-04 2.58464720e-04 2.58464720e-04 2.58464720e-04]\nCluster 11 [1.29948625e-02 1.32970686e-02 1.84345724e-02 1.87367785e-02\n 1.78301602e-02 1.45058930e-02 7.25294651e-03 1.32970686e-02\n 5.13750378e-03 1.08794198e-02 9.97280145e-03 3.47537020e-02\n 4.29132668e-02 5.65125416e-02 8.46177093e-03 1.60169235e-02\n 4.53309157e-03 3.02206105e-04 0.00000000e+00 9.06618314e-04\n 0.00000000e+00 0.00000000e+00 9.06618314e-04 1.20882442e-03\n 1.51103052e-03 9.06618314e-04 1.20882442e-03 9.06618314e-04\n 2.11544273e-03 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 5.40646721e-01 1.83439105e-01 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 12 [0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 13 [6.86719637e-02 1.00000000e+00 0.00000000e+00 6.51532350e-02\n 6.65153235e-02 0.00000000e+00 0.00000000e+00 1.24858116e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.13507378e-04\n 0.00000000e+00 1.02156640e-03 0.00000000e+00 1.13507378e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.58910329e-03\n 2.27014756e-03 1.92962543e-03 1.70261067e-03 1.81611805e-03\n 1.36208854e-03 1.00000000e+00 2.35641317e-01 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 14 [0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 1.00000000e+00 0.00000000e+00 0.00000000e+00 2.35155791e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.91064080e-03 0.00000000e+00 2.93944738e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 
3.52733686e-03\n 2.79247501e-03 3.08641975e-03 1.91064080e-03 2.79247501e-03\n 2.64550265e-03 1.00000000e+00 4.71781305e-02 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 15 [7.61729531e-01 7.69089236e-01 7.61269549e-01 1.00000000e+00\n 7.52529899e-01 7.53449862e-01 0.00000000e+00 9.19963201e-04\n 4.59981601e-04 1.00000000e+00 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.37994480e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 2.52989880e-02 2.16191352e-02 2.71389144e-02 2.94388224e-02\n 2.62189512e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 4.59981601e-04 4.59981601e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 16 [8.48601736e-01 8.37994214e-01 8.30279653e-01 0.00000000e+00\n 8.26422372e-01 8.18707811e-01 9.64320154e-04 4.82160077e-03\n 4.82160077e-03 0.00000000e+00 6.22950820e-01 0.00000000e+00\n 0.00000000e+00 4.82160077e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 5.78592093e-03\n 5.88235294e-02 5.40019286e-02 6.17164899e-02 6.17164899e-02\n 5.49662488e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.15718419e-02 2.89296046e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 17 [0.99838057 0.99595142 0.99595142 0.99919028 0.99595142 0.99757085\n 0.0048583 0.00566802 0.00566802 1. 
0.9951417 0.00647773\n 0.0048583 0.00890688 0.0048583 0.00404858 0.0048583 0.00404858\n 0.00647773 0.00323887 0.00566802 0.00323887 0.00404858 0.78866397\n 0.78461538 0.80809717 0.79838057 0.77489879 0.78785425 1.\n 1. 1. 1. 1. 0.78704453 0.01376518\n 0.00809717 0.00809717 0.00728745 0.00728745 0.00728745 0.00728745\n 0.00728745 0.00728745]\nCluster 18 [8.41737781e-02 0.00000000e+00 0.00000000e+00 1.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.06878717e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.03439359e-03 0.00000000e+00 2.58598397e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.45668477e-03\n 2.45668477e-03 3.36177916e-03 2.71528317e-03 2.19808637e-03\n 3.49107836e-03 1.00000000e+00 1.27359710e-01 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 19 [0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 4.40311774e-02\n 1.17872282e-02 0.00000000e+00 1.16778340e-02 4.03391221e-02\n 1.80500479e-03 8.20456721e-05 3.82879803e-04 1.91439902e-04\n 1.64091344e-04 1.91439902e-04 1.36742787e-04 5.87993983e-03\n 5.90728839e-03 5.77054560e-03 5.74319705e-03 5.93463695e-03\n 5.57910570e-03 1.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 20 [1.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.84606646e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 9.94035785e-04 0.00000000e+00 0.00000000e+00\n 
0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.84606646e-03\n 2.84010224e-03 1.98807157e-03 2.13007668e-03 3.12411247e-03\n 2.27208179e-03 1.00000000e+00 4.44476001e-02 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 21 [0.0010917 0. 0. 0. 0. 0.\n 0.7128821 0.00436681 0.00436681 0. 0. 0.86790393\n 0.8558952 0.04585153 0.819869 0.36899563 0.40283843 0.51310044\n 0.40283843 0.38427948 0.36462882 0.34279476 0.33187773 0.\n 0. 0. 0. 0. 0. 1.\n 1. 1. 0.99344978 0.09606987 0.05131004 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. ]\nCluster 22 [7.75981524e-01 9.76135489e-01 0.00000000e+00 4.71131640e-01\n 3.77213241e-02 2.31716705e-01 7.69822941e-04 6.15858353e-03\n 9.23787529e-03 3.84911470e-01 1.64742109e-01 1.53964588e-03\n 0.00000000e+00 4.61893764e-03 0.00000000e+00 2.30946882e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.38568129e-02\n 2.30946882e-02 1.61662818e-02 2.61739800e-02 2.07852194e-02\n 2.07852194e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 3.84911470e-02 2.30946882e-03 7.69822941e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 23 [0.01491366 0.01295133 0.01255887 0.01844584 0.01805338 0.01726845\n 0.01805338 0.03375196 0.00313972 0.00824176 0.00824176 0.01491366\n 0.00706436 0.06161695 0.01138148 0.01138148 0.0188383 0.00510204\n 0.00470958 0.00431711 0.00470958 0.00549451 0.00392465 0.0066719\n 0.00431711 0.00627943 0.00431711 0.00510204 0.00588697 1.\n 1. 1. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
]\nCluster 24 [7.58937198e-01 7.71980676e-01 7.68115942e-01 1.00000000e+00\n 7.66183575e-01 7.40579710e-01 4.83091787e-04 9.66183575e-04\n 9.66183575e-04 1.00000000e+00 0.00000000e+00 4.83091787e-04\n 4.83091787e-04 3.38164251e-03 0.00000000e+00 9.66183575e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 4.20289855e-02 3.86473430e-02 4.39613527e-02 4.58937198e-02\n 4.87922705e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.78743961e-02 2.41545894e-03 9.66183575e-04\n 4.83091787e-04 4.83091787e-04 4.83091787e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 25 [0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 26 [0.00000000e+00 1.95599862e-01 1.00000000e+00 1.93193537e-01\n 1.87005844e-01 2.08662771e-01 6.87521485e-04 4.12512891e-03\n 0.00000000e+00 0.00000000e+00 9.93468546e-02 0.00000000e+00\n 0.00000000e+00 3.43760743e-03 0.00000000e+00 6.87521485e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 9.62530079e-03\n 4.12512891e-03 4.81265040e-03 3.43760743e-03 8.59401856e-03\n 6.53145411e-03 1.00000000e+00 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 27 [0.00000000e+00 5.98334401e-01 1.67200512e-01 7.24535554e-01\n 1.00000000e+00 3.51057015e-01 1.92184497e-03 6.40614990e-04\n 7.04676489e-03 2.01793722e-01 2.88276746e-02 0.00000000e+00\n 1.92184497e-03 5.12491992e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.53747598e-02\n 2.04996797e-02 1.85778347e-02 1.28122998e-02 2.04996797e-02\n 1.85778347e-02 
1.00000000e+00 1.00000000e+00 1.00000000e+00\n 5.25304292e-02 2.56245996e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 28 [7.91181364e-01 7.66777593e-01 7.78702163e-01 1.00000000e+00\n 7.83693844e-01 7.84248475e-01 0.00000000e+00 1.38657793e-03\n 1.38657793e-03 0.00000000e+00 5.70160843e-01 0.00000000e+00\n 0.00000000e+00 3.88241819e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 4.24292845e-02 5.04714365e-02 3.77149196e-02 4.35385469e-02\n 4.18746534e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.94120910e-03 2.77315585e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 29 [1. 0. 0. 0. 1. 0.\n 0. 0.00144509 0. 0. 0. 0.\n 0. 0.00144509 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.00722543\n 0.01156069 0.00867052 0.00578035 0.00433526 0.00867052 1.\n 1. 0.1300578 0.00289017 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. ]\nCluster 30 [0. 0. 1. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 31 [0.22232472 0.18265683 0.19741697 0.21678967 0. 1.\n 0. 0.00184502 0. 0.1097786 1. 0.\n 0. 0.00276753 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.01199262\n 0.00922509 0.00461255 0.01107011 0.00645756 0.00553506 1.\n 1. 0.72416974 0.01107011 0.00184502 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
]\nCluster 32 [3.98006135e-01 6.97852761e-02 1.00000000e+00 2.87576687e-01\n 3.90337423e-01 9.27914110e-01 0.00000000e+00 3.83435583e-03\n 5.36809816e-03 1.29601227e-01 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 7.66871166e-04 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.76380368e-02\n 1.91717791e-02 1.45705521e-02 2.45398773e-02 1.45705521e-02\n 1.99386503e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 2.30061350e-02 3.06748466e-03 1.53374233e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 33 [0.69957983 0.71218487 0.74369748 0.73529412 0.76260504 0.67016807\n 0. 0.00210084 0.00210084 0.45798319 0.45168067 0.\n 0.00210084 0.00420168 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1.\n 0.03151261 0.03781513 0.04621849 0.03991597 0.05252101 1.\n 1. 1. 1. 0.02310924 0.00210084 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
]\nCluster 34 [1.51327555e-03 9.62993534e-04 1.23813454e-03 6.87852524e-04\n 1.23813454e-03 8.25423029e-04 9.52813317e-01 1.08680699e-02\n 1.96725822e-02 8.25423029e-04 4.12711515e-04 9.93534186e-01\n 9.92571193e-01 8.25423029e-04 4.26468565e-03 2.88898060e-03\n 1.78841656e-03 9.62993534e-04 2.47626909e-03 2.20112808e-03\n 8.25423029e-04 1.92598707e-03 1.78841656e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.37570505e-02 2.20112808e-03 1.23813454e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 35 [7.58097864e-01 7.74638181e-01 7.60854583e-01 0.00000000e+00\n 7.58097864e-01 7.56030324e-01 0.00000000e+00 2.06753963e-03\n 4.13507926e-03 1.00000000e+00 5.36871123e-01 0.00000000e+00\n 6.89179876e-04 5.51343901e-03 0.00000000e+00 6.89179876e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 8.27015851e-03\n 5.09993108e-02 5.72019297e-02 4.34183322e-02 5.23776706e-02\n 5.44452102e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 1.00000000e+00 1.58511371e-02 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 36 [0.15212766 0.03829787 0.85851064 0.62446809 0.16170213 0.04361702\n 0. 0.00531915 0.00957447 0.75638298 0.35531915 0.\n 0. 0.00638298 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.0212766\n 0.01808511 0.02234043 0.0212766 0.0212766 0.01808511 1.\n 1. 1. 0.03085106 0.00319149 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
]\nCluster 37 [1.83835182e-01 2.04437401e-01 0.00000000e+00 1.00000000e+00\n 0.00000000e+00 1.00000000e+00 0.00000000e+00 3.16957211e-03\n 1.58478605e-03 1.51347068e-01 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 6.33914422e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 9.50871632e-03\n 1.66402536e-02 7.92393027e-03 1.74326466e-02 1.34706815e-02\n 6.33914422e-03 1.00000000e+00 1.00000000e+00 5.40412044e-01\n 1.50554675e-02 7.92393027e-04 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 38 [0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 9.90033223e-01 8.30564784e-04\n 8.30564784e-04 0.00000000e+00 0.00000000e+00 2.49169435e-02\n 2.82392027e-02 8.30564784e-04 1.00000000e+00 1.41196013e-02\n 1.16279070e-02 1.82724252e-02 1.41196013e-02 9.96677741e-03\n 1.41196013e-02 9.96677741e-03 7.47508306e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 1.00000000e+00 1.00000000e+00 1.10465116e-01\n 8.30564784e-04 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 39 [1. 0.48055556 0.86319444 0.25763889 0.48125 0.\n 0. 0.00208333 0.00486111 0.05555556 0.11736111 0.\n 0. 0.00347222 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.01180556\n 0.01666667 0.01041667 0.00902778 0.00972222 0.01180556 1.\n 1. 1. 0.01527778 0.00138889 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 
]\nCluster 40 [0.00000000e+00 0.00000000e+00 0.00000000e+00 2.09380235e-04\n 2.09380235e-04 2.09380235e-04 3.24329983e-01 7.53768844e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 2.50837521e-01\n 2.25711893e-01 1.46566164e-03 1.67504188e-03 1.88442211e-03\n 1.67294807e-01 1.25628141e-03 7.74706868e-03 6.28140704e-04\n 8.37520938e-04 1.46566164e-03 1.67504188e-03 2.09380235e-03\n 1.46566164e-03 1.67504188e-03 1.88442211e-03 1.46566164e-03\n 1.25628141e-03 1.00000000e+00 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 41 [1. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 42 [7.39088264e-01 1.29000970e-01 0.00000000e+00 8.82638215e-02\n 7.65276431e-01 9.17555771e-01 0.00000000e+00 4.84966052e-03\n 7.75945684e-03 4.06401552e-01 3.78273521e-02 0.00000000e+00\n 0.00000000e+00 7.75945684e-03 0.00000000e+00 9.69932105e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.55189137e-02\n 3.10378274e-02 1.45489816e-02 2.32783705e-02 2.13385063e-02\n 2.03685742e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 4.07371484e-02 3.87972842e-03 9.69932105e-04 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 43 [0. 1. 0. 0. 0. 1.\n 0. 0.00135501 0.00271003 0.11382114 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 0.00271003\n 0.00542005 0.00948509 0.00406504 0.00948509 0.01084011 1.\n 1. 0.21815718 0.00406504 0. 0. 0.\n 0. 0. 0. 0. 0. 0.\n 0. 0. ]\nCluster 44 [0.96055227 0.94871795 0.97238659 0.95857988 0.95463511 0.95463511\n 0. 0.01183432 0.00394477 0. 0.87376726 0.\n 0. 0.02169625 0. 0.00394477 0. 0.\n 0. 0. 0. 0. 0. 
0.20118343\n 0.21893491 0.18145957 0.17751479 0.20118343 0.18343195 1.\n 1. 1. 1. 1. 0.01775148 0.00197239\n 0. 0. 0. 0. 0. 0.\n 0. 0. ]\nCluster 45 [0.00000000e+00 1.00000000e+00 1.00000000e+00 2.40797546e-01\n 3.00613497e-01 3.67331288e-01 7.66871166e-04 1.53374233e-03\n 5.36809816e-03 1.58742331e-01 1.74846626e-01 0.00000000e+00\n 1.53374233e-03 1.53374233e-03 0.00000000e+00 1.53374233e-03\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.07361963e-02\n 1.61042945e-02 1.30368098e-02 1.68711656e-02 1.45705521e-02\n 1.61042945e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 2.60736196e-02 3.83435583e-03 1.53374233e-03 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 46 [1.00000000e+00 1.59156280e-01 5.08149569e-02 7.66059444e-01\n 5.71428571e-01 0.00000000e+00 9.58772771e-04 1.91754554e-03\n 5.75263663e-03 2.34899329e-01 4.59252157e-01 0.00000000e+00\n 9.58772771e-04 2.87631831e-03 0.00000000e+00 9.58772771e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.43815916e-02\n 1.15052733e-02 2.30105465e-02 1.15052733e-02 8.62895494e-03\n 1.24640460e-02 1.00000000e+00 1.00000000e+00 1.00000000e+00\n 3.54745925e-02 3.83509108e-03 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\nCluster 47 [0. 0. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 
0.]\nCluster 48 [0.03733766 0.04383117 0.03733766 0.04545455 0.04220779 0.03409091\n 0.18993506 0.03733766 0.01298701 0.02272727 0.0211039 0.16720779\n 0.17045455 0.07954545 0.14772727 0.04058442 0.03246753 0.01298701\n 0.03246753 0.03571429 0.01136364 0.01136364 0.00487013 0.01136364\n 0.01136364 0.00811688 0.00974026 0.00811688 0.00487013 1.\n 1. 1. 1. 1. 1. 0.98214286\n 0.52922078 0.25487013 0.15097403 0.10064935 0.06655844 0.05032468\n 0.03571429 0.0211039 ]\nCluster 49 [0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0. 0. 0. 0. 0. 1. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]\nCluster 50 [2.34848485e-01 3.42592593e-01 0.00000000e+00 3.67003367e-01\n 0.00000000e+00 0.00000000e+00 8.41750842e-04 9.25925926e-03\n 4.20875421e-03 0.00000000e+00 1.00000000e+00 0.00000000e+00\n 0.00000000e+00 6.73400673e-03 0.00000000e+00 8.41750842e-04\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 1.26262626e-02\n 1.76767677e-02 1.68350168e-02 1.76767677e-02 6.73400673e-03\n 1.34680135e-02 1.00000000e+00 1.00000000e+00 1.54882155e-01\n 2.52525253e-03 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00\n 0.00000000e+00 0.00000000e+00 0.00000000e+00 0.00000000e+00]\n"
]
],
[
[
"# Part C: Percentage of Mirai Malware in Each Cluster\n",
"_____no_output_____"
],
[
"# Exercise 9 (10 points)\nComplete the following code to compute the percentage of Mirai malware, Zmap, and Masscan scanners in each cluster.",
"_____no_output_____"
]
],
[
[
"cluster_eval_df = pd.DataFrame( columns = ['cluster ID', 'size', 'cluster center', 'mirai_ratio', 'zmap_ratio', 'masscan_ratio'] )\n\nfor i in range(0, 50):\n cluster_i = predictions.where(col('prediction')==i)\n cluster_i_size = cluster_i.count()\n cluster_i_mirai_count = cluster_i.where(col('mirai')).count()\n cluster_i_mirai_ratio = cluster_i_mirai_count/cluster_i_size\n if cluster_i_mirai_count > 0:\n print(\"Cluster \", i, \"; Mirai Ratio:\", cluster_i_mirai_ratio, \"; Cluster Size: \", cluster_i_size)\n cluster_i_zmap_ratio = (cluster_i.where(col('zmap')).count())/cluster_i_size\n cluster_i_masscan_ratio = (cluster_i.where(col('masscan')).count())/cluster_i_size\n cluster_eval_df.loc[i]=[i, cluster_i_size, centers[i], cluster_i_mirai_ratio, cluster_i_zmap_ratio, cluster_i_masscan_ratio ]\n ",
"Cluster 5 ; Mirai Ratio: 0.8424333084018948 ; Cluster Size: 16044\nCluster 10 ; Mirai Ratio: 0.009066183136899365 ; Cluster Size: 3309\nCluster 18 ; Mirai Ratio: 0.06232736223164228 ; Cluster Size: 36565\nCluster 20 ; Mirai Ratio: 0.07641921397379912 ; Cluster Size: 916\nCluster 22 ; Mirai Ratio: 0.00706436420722135 ; Cluster Size: 2548\nCluster 33 ; Mirai Ratio: 0.001513275553721282 ; Cluster Size: 7269\nCluster 37 ; Mirai Ratio: 0.8878737541528239 ; Cluster Size: 1204\nCluster 39 ; Mirai Ratio: 0.027219430485762145 ; Cluster Size: 4776\nCluster 47 ; Mirai Ratio: 0.01461038961038961 ; Cluster Size: 616\n"
]
],
[
[
"# Exercise 10 (5 points) \nIdentify all of the clusters that have a large percentage of Mirai malware. For example, you can choose clusters with a Mirai ratio of at least 80%. If you use a different threshold (other than 80%), describe the threshold you used and the rationale for your choice.",
"_____no_output_____"
],
[
"## Answer to Exercise 10:\n## If I choose 80% as the threshold\n- Cluster 5 ; Mirai Ratio: 0.8424333084018948 ; Cluster Size: 16044\n- Cluster 37 ; Mirai Ratio: 0.8878737541528239 ; Cluster Size: 1204\n...",
"_____no_output_____"
]
],
[
[
"# You can filter predictions DataFrame (Spark) to get all scanners in a cluster. \n# For example, the code below selects scanners in cluster 5. However, you should\n# replace 5 with the ID of the cluster you want to investigate.\ncluster_selected = predictions.where((col('prediction')==5) | (col('prediction')==37))",
"_____no_output_____"
],
[
"# If you prefer to use Pandas dataframe, you can use the following to convert a cluster to a Pandas dataframe\ncluster_selected_df = cluster_selected.select(\"*\").toPandas()",
"_____no_output_____"
],
[
"cluster_selected.printSchema()",
"root\n |-- _c0: integer (nullable = true)\n |-- id: integer (nullable = true)\n |-- numports: integer (nullable = true)\n |-- lifetime: double (nullable = true)\n |-- Bytes: integer (nullable = true)\n |-- Packets: integer (nullable = true)\n |-- average_packetsize: integer (nullable = true)\n |-- MinUniqueDests: integer (nullable = true)\n |-- MaxUniqueDests: integer (nullable = true)\n |-- MinUniqueDest24s: integer (nullable = true)\n |-- MaxUniqueDest24s: integer (nullable = true)\n |-- average_lifetime: double (nullable = true)\n |-- mirai: boolean (nullable = true)\n |-- zmap: boolean (nullable = true)\n |-- masscan: boolean (nullable = true)\n |-- country: string (nullable = true)\n |-- traffic_types_scanned_str: string (nullable = true)\n |-- ports_scanned_str: string (nullable = true)\n |-- host_tags_per_censys: string (nullable = true)\n |-- host_services_per_censys: string (nullable = true)\n |-- Ports_Array: array (nullable = true)\n | |-- element: string (containsNull = true)\n |-- Port17132: boolean (nullable = true)\n |-- Port17140: boolean (nullable = true)\n |-- Port17128: boolean (nullable = true)\n |-- Port17138: boolean (nullable = true)\n |-- Port17130: boolean (nullable = true)\n |-- Port17136: boolean (nullable = true)\n |-- Port23: boolean (nullable = true)\n |-- Port445: boolean (nullable = true)\n |-- Port54594: boolean (nullable = true)\n |-- Port17142: boolean (nullable = true)\n |-- Port17134: boolean (nullable = true)\n |-- Port80: boolean (nullable = true)\n |-- Port8080: boolean (nullable = true)\n |-- Port0: boolean (nullable = true)\n |-- Port2323: boolean (nullable = true)\n |-- Port5555: boolean (nullable = true)\n |-- Port81: boolean (nullable = true)\n |-- Port1023: boolean (nullable = true)\n |-- Port52869: boolean (nullable = true)\n |-- Port8443: boolean (nullable = true)\n |-- Port49152: boolean (nullable = true)\n |-- Port7574: boolean (nullable = true)\n |-- Port37215: boolean (nullable = true)\n |-- Port34218: boolean 
(nullable = true)\n |-- Port34220: boolean (nullable = true)\n |-- Port33968: boolean (nullable = true)\n |-- Port34224: boolean (nullable = true)\n |-- Port34228: boolean (nullable = true)\n |-- Port33962: boolean (nullable = true)\n |-- TE_numports_0: boolean (nullable = true)\n |-- TE_numports_1: boolean (nullable = true)\n |-- TE_numports_2: boolean (nullable = true)\n |-- TE_numports_3: boolean (nullable = true)\n |-- TE_numports_4: boolean (nullable = true)\n |-- TE_numports_5: boolean (nullable = true)\n |-- TE_numports_6: boolean (nullable = true)\n |-- TE_numports_7: boolean (nullable = true)\n |-- TE_numports_8: boolean (nullable = true)\n |-- TE_numports_9: boolean (nullable = true)\n |-- TE_numports_10: boolean (nullable = true)\n |-- TE_numports_11: boolean (nullable = true)\n |-- TE_numports_12: boolean (nullable = true)\n |-- TE_numports_13: boolean (nullable = true)\n |-- TE_numports_14: boolean (nullable = true)\n |-- TE_numports_15: boolean (nullable = true)\n |-- TE_numports_16: boolean (nullable = true)\n |-- features: vector (nullable = true)\n |-- prediction: integer (nullable = false)\n\n"
]
],
[
[
"# Exercise 11 (10 points)\nComplete the following code to find out, for each of the clusters you identified in Exercise 10, \n- (1) (5 points) whether they scan a common port, and \n- (2) (5 points) what the port number is if most scanners in a cluster scan a common port. \nYou can use the code below to find out what top port is scanned by the scanners in a cluster.",
"_____no_output_____"
]
],
[
[
"# You fill in the ??? based on the cluster you want to investigate.\ncluster_5= predictions.where(col('prediction')==5)\ncluster_37= predictions.where(col('prediction')==37)",
"_____no_output_____"
],
[
"for i in range(0, top_ports -1):\n port_num = \"Port\" + Top_Ports_list[i]\n port_i_count = cluster_5.where(col(port_num)).count()\n if port_i_count > 0:\n print(\"Scanners of Port \", Top_Ports_list[i], \" = \", port_i_count)",
"Scanners of Port 23 = 16044\n"
],
[
"for i in range(0, top_ports -1):\n port_num = \"Port\" + Top_Ports_list[i]\n port_i_count = cluster_37.where(col(port_num)).count()\n if port_i_count > 0:\n print(\"Scanners of Port \", Top_Ports_list[i], \" = \", port_i_count)",
"Scanners of Port 23 = 1192\nScanners of Port 445 = 1\nScanners of Port 54594 = 1\nScanners of Port 80 = 30\nScanners of Port 8080 = 34\nScanners of Port 0 = 1\nScanners of Port 2323 = 1204\nScanners of Port 5555 = 17\nScanners of Port 81 = 14\nScanners of Port 1023 = 22\nScanners of Port 52869 = 17\nScanners of Port 8443 = 12\nScanners of Port 49152 = 17\nScanners of Port 7574 = 12\nScanners of Port 37215 = 9\n"
]
],
[
[
"# Answer to Exercise 11\n- (1) (5 points) Yes, they scan a common port: cluster 5 scans port 23, and cluster 37 also scans port 23, so port 23 is the common port. \n- (2) (5 points) The top port in cluster 5 is port 23, scanned 16044 times. The top port in cluster 37 is port 2323, scanned 1204 times. ",
"_____no_output_____"
],
[
"# Exercise 12 (20 points)\nBased on the results above and those of mini project deliverable #2, answer the following questions:\n- (a) Why is the clustering result of mini project #3 better than that of #2? (5 points)\n- (b) Based on your answer to (a), what is the general lesson you learned for solving clustering problems? (5 points)\n- (c) Did you find anything interesting and/or surprising when using Mirai labels to evaluate the clustering result? (5 points)\n- (d) Based on your answer to (c), what is the general lesson you learned regarding evaluating clustering? (5 points)",
"_____no_output_____"
],
[
"# Answer to Exercise 12: \n- (a) Because mini project #3 uses thermometer encoding, while mini project #2 contained a mixture of numerical variables and one-hot-encoded categorical variables; thermometer encoding addresses that problem. However, in the actual runs project #3 got a worse score than project #2; I think hyperparameter tuning could improve the performance of k-means.\n- (b) To improve clustering performance, I need to avoid mixing numerical variables with one-hot-encoded categorical variables, leaving numerical variables unnormalized, and using a high-dimensional feature space. Moreover, I can use thermometer encoding to improve my k-means clustering.\n- (c) I found that the Mirai labels give a better way to validate the clustering: they return an accurate ratio that tells me the validation score.\n- (d) External labels are used when we propose a new clustering technique and want to validate it, or when we want to compare it to existing techniques. In these cases, we take datasets for which we know the ground truth and check whether our clustering technique produces solutions similar to it. So we can use external validity to evaluate our clustering.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
d07e86c074b087e6464eb7d701d9da0eb3153491 | 23,748 | ipynb | Jupyter Notebook | Heaps-PriorityQueues.ipynb | rambasnet/CS3Notebooks | 7df413644743fe84b81915c96fc1ae093172721a | [
"MIT"
] | 4 | 2019-08-19T20:01:56.000Z | 2019-10-15T16:47:20.000Z | Heaps-PriorityQueues.ipynb | rambasnet/IntroToAlgorithms | 7df413644743fe84b81915c96fc1ae093172721a | [
"MIT"
] | null | null | null | Heaps-PriorityQueues.ipynb | rambasnet/IntroToAlgorithms | 7df413644743fe84b81915c96fc1ae093172721a | [
"MIT"
] | 2 | 2020-04-28T18:48:28.000Z | 2020-05-07T17:21:01.000Z | 30.022756 | 207 | 0.499495 | [
[
[
"empty"
]
]
] | [
"empty"
] | [
[
"empty"
]
] |
d07e88a50ea0d3223f057d5f9d0e68c8a9a57306 | 22,271 | ipynb | Jupyter Notebook | Python for DS, AI & Development/PY0101EN-2-3-Dictionaries.ipynb | mesbahiba/IBM_Professional_Data_Analyst | 24be273defb507f9748b30fc95b746c35d26b2c3 | [
"RSA-MD"
] | 2 | 2021-07-29T17:39:30.000Z | 2021-12-29T18:14:54.000Z | Python for DS, AI & Development/PY0101EN-2-3-Dictionaries.ipynb | mesbahiba/IBM_Professional_Data_Analyst | 24be273defb507f9748b30fc95b746c35d26b2c3 | [
"RSA-MD"
] | null | null | null | Python for DS, AI & Development/PY0101EN-2-3-Dictionaries.ipynb | mesbahiba/IBM_Professional_Data_Analyst | 24be273defb507f9748b30fc95b746c35d26b2c3 | [
"RSA-MD"
] | 2 | 2021-07-31T23:09:31.000Z | 2022-02-07T13:20:14.000Z | 23.793803 | 716 | 0.526649 | [
[
[
"<a href=\"https://cognitiveclass.ai/\">\n <img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Ad/CCLog.png\" width=\"200\" align=\"center\">\n</a>",
"_____no_output_____"
],
[
"<h1>Dictionaries in Python</h1>",
"_____no_output_____"
],
[
"<p><strong>Welcome!</strong> This notebook will teach you about the dictionaries in the Python Programming Language. By the end of this lab, you'll know the basics dictionary operations in Python, including what it is, and the operations on it.</p>",
"_____no_output_____"
],
[
"<div class=\"alert alert-block alert-info\" style=\"margin-top: 20px\">\n <a href=\"http://cocl.us/topNotebooksPython101Coursera\">\n <img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Ad/TopAd.png\" width=\"750\" align=\"center\">\n </a>\n</div>",
"_____no_output_____"
],
[
"<h2>Table of Contents</h2>\n<div class=\"alert alert-block alert-info\" style=\"margin-top: 20px\">\n <ul>\n <li>\n <a href=\"#dic\">Dictionaries</a>\n <ul>\n <li><a href=\"content\">What are Dictionaries?</a></li>\n <li><a href=\"key\">Keys</a></li>\n </ul>\n </li>\n <li>\n <a href=\"#quiz\">Quiz on Dictionaries</a>\n </li>\n </ul>\n <p>\n Estimated time needed: <strong>20 min</strong>\n </p>\n</div>\n\n<hr>",
"_____no_output_____"
],
[
"<h2 id=\"Dic\">Dictionaries</h2>",
"_____no_output_____"
],
[
"<h3 id=\"content\">What are Dictionaries?</h3>",
"_____no_output_____"
],
[
"A dictionary consists of keys and values. It is helpful to compare a dictionary to a list: instead of the numerical indexes used by a list, dictionaries have keys, which are used to access the values within the dictionary.",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%202/Images/DictsList.png\" width=\"650\" />",
"_____no_output_____"
],
[
"An example of a Dictionary <code>Dict</code>:",
"_____no_output_____"
]
],
[
[
"# Create the dictionary\n\nDict = {\"key1\": 1, \"key2\": \"2\", \"key3\": [3, 3, 3], \"key4\": (4, 4, 4), ('key5'): 5, (0, 1): 6}\nDict",
"_____no_output_____"
]
],
[
[
"The keys can be strings:",
"_____no_output_____"
]
],
[
[
"# Access to the value by the key\n\nDict[\"key1\"]",
"_____no_output_____"
]
],
[
[
"Keys can also be any immutable object such as a tuple: ",
"_____no_output_____"
]
],
[
[
"# Access to the value by the key\n\nDict[(0, 1)]",
"_____no_output_____"
]
],
[
[
" Each key is separated from its value by a colon \"<code>:</code>\". Commas separate the items, and the whole dictionary is enclosed in curly braces. An empty dictionary without any items is written with just two curly braces, like this \"<code>{}</code>\".",
"_____no_output_____"
]
],
[
[
"# Create a sample dictionary\n\nrelease_year_dict = {\"Thriller\": \"1982\", \"Back in Black\": \"1980\", \\\n \"The Dark Side of the Moon\": \"1973\", \"The Bodyguard\": \"1992\", \\\n \"Bat Out of Hell\": \"1977\", \"Their Greatest Hits (1971-1975)\": \"1976\", \\\n \"Saturday Night Fever\": \"1977\", \"Rumours\": \"1977\"}\nrelease_year_dict",
"_____no_output_____"
]
],
[
[
"In summary, like a list, a dictionary holds a sequence of elements. Each element is represented by a key and its corresponding value. Dictionaries are created with two curly braces containing keys and values separated by a colon. For every key, there can only be one single value, however, multiple keys can hold the same value. Keys can only be strings, numbers, or tuples, but values can be any data type.",
"_____no_output_____"
],
[
"It is helpful to visualize the dictionary as a table, as in the following image. The first column represents the keys, the second column represents the values.",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%202/Images/DictsStructure.png\" width=\"650\" />",
"_____no_output_____"
],
[
"<h3 id=\"key\">Keys</h3>",
"_____no_output_____"
],
[
"You can retrieve the values based on the names:",
"_____no_output_____"
]
],
[
[
"# Get value by keys\n\nrelease_year_dict['Thriller'] ",
"_____no_output_____"
]
],
[
[
"This corresponds to: \n",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%202/Images/DictsKeyOne.png\" width=\"500\" />",
"_____no_output_____"
],
[
"Similarly for <b>The Bodyguard</b>",
"_____no_output_____"
]
],
[
[
"# Get value by key\n\nrelease_year_dict['The Bodyguard'] ",
"_____no_output_____"
]
],
[
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%202/Images/DictsKeyTwo.png\" width=\"500\" />",
"_____no_output_____"
],
[
"Now let you retrieve the keys of the dictionary using the method <code>release_year_dict()</code>:",
"_____no_output_____"
]
],
[
[
"# Get all the keys in dictionary\n\nrelease_year_dict.keys() ",
"_____no_output_____"
]
],
[
[
"You can retrieve the values using the method <code>values()</code>:",
"_____no_output_____"
]
],
[
[
"# Get all the values in dictionary\n\nrelease_year_dict.values() ",
"_____no_output_____"
]
],
[
[
"We can add an entry:",
"_____no_output_____"
]
],
[
[
"# Append value with key into dictionary\n\nrelease_year_dict['Graduation'] = '2007'\nrelease_year_dict",
"_____no_output_____"
]
],
[
[
"We can delete an entry: ",
"_____no_output_____"
]
],
[
[
"# Delete entries by key\n\ndel(release_year_dict['Thriller'])\ndel(release_year_dict['Graduation'])\nrelease_year_dict",
"_____no_output_____"
]
],
[
[
" We can verify if an element is in the dictionary: ",
"_____no_output_____"
]
],
[
[
"# Verify the key is in the dictionary\n\n'The Bodyguard' in release_year_dict",
"_____no_output_____"
]
],
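A compact sketch pulling together the operations shown above (retrieving keys and values, adding and deleting entries, and membership testing); the miniature dictionary here is illustrative:

```python
release = {"Thriller": "1982", "Back in Black": "1980"}

release["Graduation"] = "2007"        # add an entry
del release["Back in Black"]          # delete an entry by key

has_thriller = "Thriller" in release  # "in" tests against the keys
keys = sorted(release.keys())         # all keys
values = sorted(release.values())     # all values
```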
[
[
"<hr>",
"_____no_output_____"
],
[
"<h2 id=\"quiz\">Quiz on Dictionaries</h2>",
"_____no_output_____"
],
[
"<b>You will need this dictionary for the next two questions:</b>",
"_____no_output_____"
]
],
[
[
"# Question sample dictionary\n\nsoundtrack_dic = {\"The Bodyguard\":\"1992\", \"Saturday Night Fever\":\"1977\"}\nsoundtrack_dic ",
"_____no_output_____"
]
],
[
[
"a) In the dictionary <code>soundtrack_dict</code> what are the keys ?",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\nsoundtrack_dic.keys()",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nsoundtrack_dic.keys() # The Keys \"The Bodyguard\" and \"Saturday Night Fever\" \n-->",
"_____no_output_____"
],
[
"b) In the dictionary <code>soundtrack_dict</code> what are the values ?",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\nsoundtrack_dic.values()",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nsoundtrack_dic.values() # The values are \"1992\" and \"1977\"\n-->",
"_____no_output_____"
],
[
"<hr>",
"_____no_output_____"
],
[
"<b>You will need this dictionary for the following questions:</b>",
"_____no_output_____"
],
[
"The Albums <b>Back in Black</b>, <b>The Bodyguard</b> and <b>Thriller</b> have the following music recording sales in millions 50, 50 and 65 respectively:",
"_____no_output_____"
],
[
"a) Create a dictionary <code>album_sales_dict</code> where the keys are the album name and the sales in millions are the values. ",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\n\nalbum_sales_dict = {\"Back in Black\":50, \"The Bodyguard\":50, \"Thriller\":65}\nalbum_sales_dict",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nalbum_sales_dict = {\"The Bodyguard\":50, \"Back in Black\":50, \"Thriller\":65}\n-->",
"_____no_output_____"
],
[
"b) Use the dictionary to find the total sales of <b>Thriller</b>:",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\nalbum_sales_dict[\"Thriller\"]",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nalbum_sales_dict[\"Thriller\"]\n-->",
"_____no_output_____"
],
[
"c) Find the names of the albums from the dictionary using the method <code>keys</code>:",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\nalbum_sales_dict.keys()",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nalbum_sales_dict.keys()\n-->",
"_____no_output_____"
],
[
"d) Find the names of the recording sales from the dictionary using the method <code>values</code>:",
"_____no_output_____"
]
],
[
[
"# Write your code below and press Shift+Enter to execute\nalbum_sales_dict.values()",
"_____no_output_____"
]
],
[
[
"Double-click __here__ for the solution.\n\n<!-- Your answer is below:\nalbum_sales_dict.values()\n-->",
"_____no_output_____"
],
[
"<hr>\n<h2>The last exercise!</h2>\n<p>Congratulations, you have completed your first lesson and hands-on lab in Python. However, there is one more thing you need to do. The Data Science community encourages sharing work. The best way to share and showcase your work is to share it on GitHub. By sharing your notebook on GitHub you are not only building your reputation with fellow data scientists, but you can also show it off when applying for a job. Even though this was your first piece of work, it is never too early to start building good habits. So, please read and follow <a href=\"https://cognitiveclass.ai/blog/data-scientists-stand-out-by-sharing-your-notebooks/\" target=\"_blank\">this article</a> to learn how to share your work.\n<hr>",
"_____no_output_____"
],
[
"<div class=\"alert alert-block alert-info\" style=\"margin-top: 20px\">\n<h2>Get IBM Watson Studio free of charge!</h2>\n <p><a href=\"https://cocl.us/bottemNotebooksPython101Coursera\"><img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Ad/BottomAd.png\" width=\"750\" align=\"center\"></a></p>\n</div>",
"_____no_output_____"
],
[
"<h3>About the Authors:</h3> \n<p><a href=\"https://www.linkedin.com/in/joseph-s-50398b136/\" target=\"_blank\">Joseph Santarcangelo</a> is a Data Scientist at IBM, and holds a PhD in Electrical Engineering. His research focused on using Machine Learning, Signal Processing, and Computer Vision to determine how videos impact human cognition. Joseph has been working for IBM since he completed his PhD.</p>",
"_____no_output_____"
],
[
"Other contributors: <a href=\"www.linkedin.com/in/jiahui-mavis-zhou-a4537814a\">Mavis Zhou</a>",
"_____no_output_____"
],
[
"<hr>",
"_____no_output_____"
],
[
"<p>Copyright © 2018 IBM Developer Skills Network. This notebook and its source code are released under the terms of the <a href=\"https://cognitiveclass.ai/mit-license/\">MIT License</a>.</p>",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
d07e9333be38267ab5e19e2bb258f73565525685 | 75,773 | ipynb | Jupyter Notebook | Data_cleaning_Outlier_EDA_Practice.ipynb | sgf-afk/Class_Assignments | 8852a825975e4d5040d28ebe705ff7198d59a1c8 | [
"MIT"
] | null | null | null | Data_cleaning_Outlier_EDA_Practice.ipynb | sgf-afk/Class_Assignments | 8852a825975e4d5040d28ebe705ff7198d59a1c8 | [
"MIT"
] | null | null | null | Data_cleaning_Outlier_EDA_Practice.ipynb | sgf-afk/Class_Assignments | 8852a825975e4d5040d28ebe705ff7198d59a1c8 | [
"MIT"
] | null | null | null | 59.616837 | 8,464 | 0.665013 | [
[
[
"In this assignment, you'll continue working with the U.S. Education Dataset from Kaggle. The data gives detailed state level information on several facets of education on an annual basis. To learn more about the data and the column descriptions, you can view the Kaggle link above.\n\nAccess this data using the Thinkful database using these credentials:\n\n* postgres_user = 'dsbc_student'\n* postgres_pw = '7*.8G9QH21'\n* postgres_host = '142.93.121.174'\n* postgres_port = '5432'\n* postgres_db = 'useducation'\n\nDon't forget to apply the most suitable missing value filling techniques from the previous checkpoint to the data. Provide the answers to the following only after you've addressed missing values!\n\nTo complete this assignment, submit a link to a Jupyter notebook containing your solutions to the following tasks:\n\n1. Consider the two variables: TOTAL_REVENUE and TOTAL_EXPENDITURE. Do these variables have outlier values?\n2. If you detect outliers in the TOTAL_REVENUE and TOTAL_EXPENDITURE variables, apply the techniques you learned in this checkpoint to eliminate them and validate that there's no outlier values after you handled them.\n3. Create another variable by subtracting the original TOTAL_EXPENDITURE from TOTAL_REVENUE (before you eliminated the outliers). You can think of it as a kind of budget deficit in education. Do you find any outlier values in this new variable? 4. If so, eliminate them using the technique you think most suitable.\n5. Now create another variable by subtracting the TOTAL_EXPENDITURE from TOTAL_REVENUE. This time, use the outlier eliminated versions of TOTAL_EXPENDITURE from TOTAL_REVENUE. In this newly created variable, can you find any outliers? If so, eliminate them.\n6. Compare some basic descriptive statistics of the budget variables you end up with in the 3rd and the 4th questions. Do you see any differences?\n7. 
If our variable of interest is the budget deficit variable, which method do you think is the appropriate in dealing with the outliers in this variable: the method in the 3rd question or the one in the 4th question?",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nimport numpy as np\nimport pandas as pd\nfrom sqlalchemy import create_engine\nimport warnings\n\nwarnings.filterwarnings('ignore')\n\npostgres_user = 'dsbc_student'\npostgres_pw = '7*.8G9QH21'\npostgres_host = '142.93.121.174'\npostgres_port = '5432'\npostgres_db = 'useducation'\n\nengine = create_engine('postgresql://{}:{}@{}:{}/{}'.format(\n postgres_user, postgres_pw, postgres_host, postgres_port, postgres_db))\n\neducation_df = pd.read_sql_query('select * from useducation',con=engine)\n\n# no need for an open connection, \n# as we're only doing a single query\nengine.dispose()",
"_____no_output_____"
],
[
"fill_list = [\"STATE_REVENUE\", \"LOCAL_REVENUE\", \"TOTAL_EXPENDITURE\",\n \"INSTRUCTION_EXPENDITURE\", \"SUPPORT_SERVICES_EXPENDITURE\",\n \"OTHER_EXPENDITURE\", \"CAPITAL_OUTLAY_EXPENDITURE\", \"GRADES_PK_G\",\n \"GRADES_KG_G\", \"GRADES_4_G\", \"GRADES_8_G\", \"GRADES_12_G\", \"GRADES_1_8_G\",\n \"GRADES_9_12_G\", \"GRADES_ALL_G\"]\n\nstates = education_df[\"STATE\"].unique()\n\nfor state in states:\n education_df.loc[education_df[\"STATE\"] == state, fill_list] = education_df.loc[education_df[\"STATE\"] == state, fill_list].interpolate()\n\n# we drop the null values after interpolation\neducation_df.dropna(inplace=True)",
"_____no_output_____"
]
],
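The per-state fill above relies on pandas' `interpolate()`, which by default fills missing values linearly between the known neighbouring points; a tiny illustration on made-up numbers:

```python
import pandas as pd

# Gaps between observed values are filled on a straight line.
s = pd.Series([1.0, None, 3.0, None, 5.0])
filled = s.interpolate()
```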
[
[
"## 1. Consider the two variables: TOTAL_REVENUE and TOTAL_EXPENDITURE. Do these variables have outlier values?",
"_____no_output_____"
]
],
[
[
"education_df.info()",
"<class 'pandas.core.frame.DataFrame'>\nInt64Index: 415 entries, 209 to 1249\nData columns (total 25 columns):\nPRIMARY_KEY 415 non-null object\nSTATE 415 non-null object\nYEAR 415 non-null int64\nENROLL 415 non-null float64\nTOTAL_REVENUE 415 non-null float64\nFEDERAL_REVENUE 415 non-null float64\nSTATE_REVENUE 415 non-null float64\nLOCAL_REVENUE 415 non-null float64\nTOTAL_EXPENDITURE 415 non-null float64\nINSTRUCTION_EXPENDITURE 415 non-null float64\nSUPPORT_SERVICES_EXPENDITURE 415 non-null float64\nOTHER_EXPENDITURE 415 non-null float64\nCAPITAL_OUTLAY_EXPENDITURE 415 non-null float64\nGRADES_PK_G 415 non-null float64\nGRADES_KG_G 415 non-null float64\nGRADES_4_G 415 non-null float64\nGRADES_8_G 415 non-null float64\nGRADES_12_G 415 non-null float64\nGRADES_1_8_G 415 non-null float64\nGRADES_9_12_G 415 non-null float64\nGRADES_ALL_G 415 non-null float64\nAVG_MATH_4_SCORE 415 non-null float64\nAVG_MATH_8_SCORE 415 non-null float64\nAVG_READING_4_SCORE 415 non-null float64\nAVG_READING_8_SCORE 415 non-null float64\ndtypes: float64(22), int64(1), object(2)\nmemory usage: 84.3+ KB\n"
],
[
"education_df.head()",
"_____no_output_____"
]
],
[
[
"__Time series data, I can interpolate the missing values__",
"_____no_output_____"
],
[
"Z-Score Test",
"_____no_output_____"
]
],
[
[
"from scipy.stats import zscore\n\nz_scores = zscore(education_df['TOTAL_REVENUE'])\nfor threshold in range(1,10):\n print(\"The score threshold is: {}\".format(threshold))\n print(\"The indices of the outliers:\")\n print(np.where(z_scores > threshold))\n print(\"Number of outliers is: {}\".format(len((np.where(z_scores > threshold)[0]))))",
"The score threshold is: 1\nThe indices of the outliers:\n(array([ 3, 26, 52, 61, 89, 100, 112, 140, 151, 163, 168, 172, 189,\n 191, 197, 202, 214, 220, 224, 241, 243, 249, 254, 266, 271, 275,\n 292, 294, 300, 305, 317, 322, 326, 343, 345, 351, 356, 368, 373,\n 377, 394, 396, 399, 402, 407], dtype=int64),)\nNumber of outliers is: 45\nThe score threshold is: 2\nThe indices of the outliers:\n(array([ 26, 61, 89, 112, 140, 151, 163, 191, 202, 214, 243, 254, 266,\n 294, 305, 317, 345, 356, 368, 396, 407], dtype=int64),)\nNumber of outliers is: 21\nThe score threshold is: 3\nThe indices of the outliers:\n(array([ 61, 112, 163, 191, 214, 243, 266, 294, 305, 317, 345, 356, 368,\n 396, 407], dtype=int64),)\nNumber of outliers is: 15\nThe score threshold is: 4\nThe indices of the outliers:\n(array([163, 214, 266, 317, 368, 396], dtype=int64),)\nNumber of outliers is: 6\nThe score threshold is: 5\nThe indices of the outliers:\n(array([368], dtype=int64),)\nNumber of outliers is: 1\nThe score threshold is: 6\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 7\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 8\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 9\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\n"
],
[
"z_scores = zscore(education_df['TOTAL_EXPENDITURE'])\nfor threshold in range(1,10):\n print(\"The score threshold is: {}\".format(threshold))\n print(\"The indices of the outliers:\")\n print(np.where(z_scores > threshold))\n print(\"Number of outliers is: {}\".format(len((np.where(z_scores > threshold)[0]))))",
"The score threshold is: 1\nThe indices of the outliers:\n(array([ 3, 26, 52, 61, 89, 100, 112, 140, 151, 163, 168, 189, 191,\n 197, 202, 214, 220, 224, 241, 243, 249, 254, 266, 271, 275, 292,\n 294, 300, 305, 317, 322, 326, 343, 345, 351, 356, 368, 373, 377,\n 394, 396, 402, 407], dtype=int64),)\nNumber of outliers is: 43\nThe score threshold is: 2\nThe indices of the outliers:\n(array([ 26, 61, 89, 100, 112, 140, 151, 163, 191, 202, 214, 243, 254,\n 266, 294, 305, 317, 345, 356, 368, 396, 407], dtype=int64),)\nNumber of outliers is: 22\nThe score threshold is: 3\nThe indices of the outliers:\n(array([ 61, 112, 163, 191, 214, 243, 254, 266, 294, 305, 317, 345, 356,\n 368, 396, 407], dtype=int64),)\nNumber of outliers is: 16\nThe score threshold is: 4\nThe indices of the outliers:\n(array([112, 163, 214, 266, 317, 368, 396], dtype=int64),)\nNumber of outliers is: 7\nThe score threshold is: 5\nThe indices of the outliers:\n(array([368], dtype=int64),)\nNumber of outliers is: 1\nThe score threshold is: 6\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 7\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 8\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 9\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\n"
]
],
[
[
"According to Zscores both have outliers",
"_____no_output_____"
],
[
"## 2. If you detect outliers in the TOTAL_REVENUE and TOTAL_EXPENDITURE variables, apply the techniques you learned in this checkpoint to eliminate them and validate that there's no outlier values after you handled them.",
"_____no_output_____"
]
],
[
[
"from scipy.stats.mstats import winsorize\n\nwinsorized_revenue = winsorize(education_df[\"TOTAL_REVENUE\"], (0, 0.05))\n\nwinsorized_expenditure = winsorize(education_df[\"TOTAL_EXPENDITURE\"], (0, 0.05))",
"_____no_output_____"
],
[
"z_scores = zscore(winsorized_revenue)\nfor threshold in range(1,10):\n print(\"The score threshold is: {}\".format(threshold))\n print(\"The indices of the outliers:\")\n print(np.where(z_scores > threshold))\n print(\"Number of outliers is: {}\".format(len((np.where(z_scores > threshold)[0]))))",
"The score threshold is: 1\nThe indices of the outliers:\n(array([ 3, 19, 26, 52, 61, 66, 70, 87, 89, 95, 100, 112, 117,\n 121, 138, 140, 143, 146, 151, 163, 168, 172, 181, 189, 191, 194,\n 197, 202, 214, 220, 224, 233, 241, 243, 246, 249, 254, 266, 271,\n 275, 284, 292, 294, 297, 300, 305, 317, 322, 326, 343, 345, 348,\n 351, 356, 368, 373, 377, 394, 396, 399, 402, 407], dtype=int64),)\nNumber of outliers is: 62\nThe score threshold is: 2\nThe indices of the outliers:\n(array([ 3, 26, 52, 61, 89, 100, 112, 140, 151, 163, 168, 191, 202,\n 214, 243, 254, 266, 275, 294, 305, 317, 326, 345, 356, 368, 377,\n 394, 396, 402, 407], dtype=int64),)\nNumber of outliers is: 30\nThe score threshold is: 3\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 4\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 5\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 6\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 7\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 8\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 9\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\n"
],
[
"z_scores = zscore(winsorized_expenditure)\nfor threshold in range(1,10):\n print(\"The score threshold is: {}\".format(threshold))\n print(\"The indices of the outliers:\")\n print(np.where(z_scores > threshold))\n print(\"Number of outliers is: {}\".format(len((np.where(z_scores > threshold)[0]))))",
"The score threshold is: 1\nThe indices of the outliers:\n(array([ 3, 19, 26, 52, 61, 66, 70, 87, 89, 95, 100, 112, 117,\n 121, 130, 138, 140, 143, 146, 151, 163, 168, 172, 181, 189, 191,\n 194, 197, 202, 214, 220, 224, 233, 241, 243, 246, 249, 254, 266,\n 271, 275, 292, 294, 297, 300, 305, 317, 322, 326, 343, 345, 348,\n 351, 356, 368, 373, 377, 394, 396, 399, 402, 407], dtype=int64),)\nNumber of outliers is: 62\nThe score threshold is: 2\nThe indices of the outliers:\n(array([ 3, 26, 52, 61, 89, 100, 112, 140, 151, 163, 168, 191, 202,\n 214, 243, 254, 266, 294, 305, 317, 345, 356, 368, 377, 396, 407],\n dtype=int64),)\nNumber of outliers is: 26\nThe score threshold is: 3\nThe indices of the outliers:\n(array([ 26, 61, 89, 112, 140, 151, 163, 191, 202, 214, 243, 254, 266,\n 294, 305, 317, 345, 356, 368, 396, 407], dtype=int64),)\nNumber of outliers is: 21\nThe score threshold is: 4\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 5\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 6\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 7\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 8\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\nThe score threshold is: 9\nThe indices of the outliers:\n(array([], dtype=int64),)\nNumber of outliers is: 0\n"
]
],
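The same two steps, flagging z-score outliers and then capping the top tail, can be sketched on synthetic data with plain NumPy. This is an illustrative equivalent of the one-sided winsorizing at the 95th percentile used above, run on invented numbers rather than the notebook's data:

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0, 1, 100), [15.0, 20.0]])  # two planted outliers

# z-score test: flag points more than 3 standard deviations from the mean
z = (data - data.mean()) / data.std()
n_before = int((np.abs(z) > 3).sum())

# cap the top 5% of values at the 95th percentile (one-sided winsorization)
wins = np.minimum(data, np.percentile(data, 95))
z_w = (wins - wins.mean()) / wins.std()
n_after = int((np.abs(z_w) > 3).sum())
```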
[
[
"After the outlier threshold of 3 (75%) we lose our outliers, Winsorization worked.",
"_____no_output_____"
],
[
"## 3. Create another variable by subtracting the original TOTAL_EXPENDITURE from TOTAL_REVENUE (before you eliminated the outliers). You can think of it as a kind of budget deficit in education. Do you find any outlier values in this new variable? ",
"_____no_output_____"
]
],
[
[
"education_df['Deficit'] = education_df['TOTAL_REVENUE'] - education_df['TOTAL_EXPENDITURE']",
"_____no_output_____"
],
[
"plt.boxplot(education_df['Deficit'], whis = 5)",
"_____no_output_____"
]
],
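The boxplot above flags points beyond the whiskers; the underlying rule can be sketched explicitly with the interquartile range. The numbers below are invented for illustration:

```python
import numpy as np

revenue = np.array([100.0, 120.0, 90.0, 110.0, 400.0])
expenditure = np.array([95.0, 118.0, 85.0, 100.0, 150.0])
deficit = revenue - expenditure            # element-wise budget deficit

# Tukey's rule: a point is an outlier if it lies below Q1 - w*IQR or above Q3 + w*IQR
q1, q3 = np.percentile(deficit, [25, 75])
iqr = q3 - q1
w = 1.5
is_outlier = (deficit < q1 - w * iqr) | (deficit > q3 + w * iqr)
n_outliers = int(is_outlier.sum())
```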
[
[
"appears so!",
"_____no_output_____"
],
[
"## 4. If so, eliminate them using the technique you think most suitable.",
"_____no_output_____"
]
],
[
[
"winsorized_budget = winsorize(education_df['Deficit'], (0.05, 0.05))",
"_____no_output_____"
],
[
"plt.boxplot(winsorized_budget, whis = 5)",
"_____no_output_____"
]
],
[
[
"Looks like outliers were taken care of",
"_____no_output_____"
],
[
"## 5. Now create another variable by subtracting the TOTAL_EXPENDITURE from TOTAL_REVENUE. This time, use the outlier eliminated versions of TOTAL_EXPENDITURE from TOTAL_REVENUE. In this newly created variable, can you find any outliers? If so, eliminate them.",
"_____no_output_____"
]
],
[
[
"education_df['winsordeficit'] = winsorized_revenue - winsorized_expenditure",
"_____no_output_____"
],
[
"plt.boxplot(education_df['winsordeficit'], whis=5)",
"_____no_output_____"
],
[
"winsorizedbudget2 = winsorize(education_df['winsordeficit'], (0.05, 0.05))",
"_____no_output_____"
],
[
"plt.boxplot(winsorizedbudget2, whis=5)",
"_____no_output_____"
]
],
[
[
"## 6. Compare some basic descriptive statistics of the budget variables you end up with in the 3rd and the 4th questions. Do you see any differences?",
"_____no_output_____"
]
],
[
[
"education_df.describe()",
"_____no_output_____"
]
],
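To make the comparison concrete, `describe()` can be called on both deficit versions side by side. A toy illustration with invented numbers, showing how a single extreme value drags the mean of the raw deficit upward:

```python
import pandas as pd

df = pd.DataFrame({"deficit_raw": [5, 2, 5, 10, 250],
                   "deficit_winsorized": [5, 2, 5, 10, 12]})
stats = df.describe()

mean_raw = stats.loc["mean", "deficit_raw"]          # inflated by the 250
mean_wins = stats.loc["mean", "deficit_winsorized"]
```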
[
[
"There are some pretty substantial differences between the two",
"_____no_output_____"
],
[
"## 7. If our variable of interest is the budget deficit variable, which method do you think is the appropriate in dealing with the outliers in this variable: the method in the 3rd question or the one in the 4th question?",
"_____no_output_____"
],
[
"for a more accurate representation, using the data from the original variables is best. The method in the 3rd question.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
d07e96036ba563bd487f450dd6979de18dd25778 | 750,930 | ipynb | Jupyter Notebook | Image_Classifier_Project.ipynb | Ayanlola2002/deep-learning-flower-identifier | 11f9a2516bfae01d2c70cd9bb08851420ae5c0e8 | [
"MIT"
] | null | null | null | Image_Classifier_Project.ipynb | Ayanlola2002/deep-learning-flower-identifier | 11f9a2516bfae01d2c70cd9bb08851420ae5c0e8 | [
"MIT"
] | null | null | null | Image_Classifier_Project.ipynb | Ayanlola2002/deep-learning-flower-identifier | 11f9a2516bfae01d2c70cd9bb08851420ae5c0e8 | [
"MIT"
] | null | null | null | 85.791157 | 181,428 | 0.685678 | [
[
[
"<a href=\"https://colab.research.google.com/github/Ayanlola2002/deep-learning-flower-identifier/blob/master/Image_Classifier_Project.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# Developing an AI application\n\nGoing forward, AI algorithms will be incorporated into more and more everyday applications. For example, you might want to include an image classifier in a smart phone app. To do this, you'd use a deep learning model trained on hundreds of thousands of images as part of the overall application architecture. A large part of software development in the future will be using these types of models as common parts of applications. \n\nIn this project, you'll train an image classifier to recognize different species of flowers. You can imagine using something like this in a phone app that tells you the name of the flower your camera is looking at. In practice you'd train this classifier, then export it for use in your application. We'll be using [this dataset](http://www.robots.ox.ac.uk/~vgg/data/flowers/102/index.html) of 102 flower categories, you can see a few examples below. \n\n<img src='https://i.imgur.com/6n64KAw.png' width=500px>\n\nThe project is broken down into multiple steps:\n\n* Load and preprocess the image dataset\n* Train the image classifier on your dataset\n* Use the trained classifier to predict image content\n\nWe'll lead you through each part which you'll implement in Python.\n\nWhen you've completed this project, you'll have an application that can be trained on any set of labeled images. Here your network will be learning about flowers and end up as a command line application. But, what you do with your new skills depends on your imagination and effort in building a dataset. For example, imagine an app where you take a picture of a car, it tells you what the make and model is, then looks up information about it. Go build your own dataset and make something new.\n\nFirst up is importing the packages you'll need. It's good practice to keep all the imports at the beginning of your code. As you work through this notebook and find you need to import a package, make sure to add the import up here.",
"_____no_output_____"
],
[
"## Install PyTorch",
"_____no_output_____"
]
],
[
[
"# http://pytorch.org/\nfrom os.path import exists\nfrom wheel.pep425tags import get_abbr_impl, get_impl_ver, get_abi_tag\nplatform = '{}{}-{}'.format(get_abbr_impl(), get_impl_ver(), get_abi_tag())\ncuda_output = !ldconfig -p|grep cudart.so|sed -e 's/.*\\.\\([0-9]*\\)\\.\\([0-9]*\\)$/cu\\1\\2/'\naccelerator = cuda_output[0] if exists('/dev/nvidia0') else 'cpu'\n\n!pip install -q http://download.pytorch.org/whl/{accelerator}/torch-0.4.1-{platform}-linux_x86_64.whl torchvision\nimport torch",
"_____no_output_____"
]
],
[
[
"## Download the dataset",
"_____no_output_____"
]
],
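This section fetches, among other things, `cat_to_name.json`, a mapping from category folder numbers to flower names, which is read with a plain `json.load`. A sketch on a miniature stand-in file (the two entries below are hypothetical):

```python
import json
import os
import tempfile

# Write a tiny stand-in for cat_to_name.json
sample = {"1": "pink primrose", "21": "fire lily"}
path = os.path.join(tempfile.mkdtemp(), "cat_to_name.json")
with open(path, "w") as f:
    json.dump(sample, f)

# Load it the way the real category-to-name mapping would be loaded
with open(path) as f:
    cat_to_name = json.load(f)

name = cat_to_name["21"]
```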
[
[
"!wget -O cat_to_name.json \"https://raw.githubusercontent.com/GabrielePicco/deep-learning-flower-identifier/master/cat_to_name.json\"\n!wget \"https://s3.amazonaws.com/content.udacity-data.com/courses/nd188/flower_data.zip\" \n!unzip flower_data.zip",
--2018-12-10 15:16:17--  https://raw.githubusercontent.com/GabrielePicco/deep-learning-flower-identifier/master/cat_to_name.json
Resolving raw.githubusercontent.com (raw.githubusercontent.com)... 151.101.0.133, 151.101.64.133, 151.101.128.133, ...
Connecting to raw.githubusercontent.com (raw.githubusercontent.com)|151.101.0.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2218 (2.2K) [text/plain]
Saving to: ‘cat_to_name.json’

cat_to_name.json    100%[===================>]   2.17K  --.-KB/s    in 0s

2018-12-10 15:16:18 (31.5 MB/s) - ‘cat_to_name.json’ saved [2218/2218]

--2018-12-10 15:16:20--  https://s3.amazonaws.com/content.udacity-data.com/courses/nd188/flower_data.zip
Resolving s3.amazonaws.com (s3.amazonaws.com)... 52.216.170.37
Connecting to s3.amazonaws.com (s3.amazonaws.com)|52.216.170.37|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 311442766 (297M) [application/zip]
Saving to: ‘flower_data.zip’

flower_data.zip     100%[===================>] 297.01M  16.3MB/s    in 47s

2018-12-10 15:17:08 (6.29 MB/s) - ‘flower_data.zip’ saved [311442766/311442766]

Archive:  flower_data.zip
  (per-file unzip listing of flower_data/valid/... omitted)
\n inflating: flower_data/valid/76/image_02456.jpg \n inflating: flower_data/valid/76/image_02481.jpg \n inflating: flower_data/valid/76/image_02496.jpg \n inflating: flower_data/valid/76/image_02525.jpg \n inflating: flower_data/valid/76/image_02530.jpg \n inflating: flower_data/valid/76/image_02485.jpg \n inflating: flower_data/valid/76/image_02446.jpg \n inflating: flower_data/valid/76/image_02475.jpg \n inflating: flower_data/valid/76/image_02501.jpg \n inflating: flower_data/valid/76/image_02529.jpg \n inflating: flower_data/valid/76/image_02514.jpg \n inflating: flower_data/valid/76/image_02458.jpg \n inflating: flower_data/valid/76/image_02504.jpg \n inflating: flower_data/valid/76/image_02467.jpg \n inflating: flower_data/valid/76/image_02513.jpg \n inflating: flower_data/valid/76/image_02512.jpg \n creating: flower_data/valid/82/\n inflating: flower_data/valid/82/image_01658.jpg \n inflating: flower_data/valid/82/image_01671.jpg \n inflating: flower_data/valid/82/image_01659.jpg \n inflating: flower_data/valid/82/image_01649.jpg \n inflating: flower_data/valid/82/image_01691.jpg \n inflating: flower_data/valid/82/image_01690.jpg \n inflating: flower_data/valid/82/image_01693.jpg \n inflating: flower_data/valid/82/image_01683.jpg \n inflating: flower_data/valid/82/image_01640.jpg \n inflating: flower_data/valid/82/image_01643.jpg \n inflating: flower_data/valid/82/image_01625.jpg \n inflating: flower_data/valid/82/image_01592.jpg \n inflating: flower_data/valid/82/image_01596.jpg \n creating: flower_data/valid/49/\n inflating: flower_data/valid/49/image_06235.jpg \n inflating: flower_data/valid/49/image_06209.jpg \n inflating: flower_data/valid/49/image_06222.jpg \n inflating: flower_data/valid/49/image_06230.jpg \n inflating: flower_data/valid/49/image_06240.jpg \n inflating: flower_data/valid/49/image_06228.jpg \n inflating: flower_data/valid/49/image_06216.jpg \n inflating: flower_data/valid/49/image_06210.jpg \n creating: flower_data/valid/40/\n 
inflating: flower_data/valid/40/image_04608.jpg \n inflating: flower_data/valid/40/image_04596.jpg \n inflating: flower_data/valid/40/image_04578.jpg \n inflating: flower_data/valid/40/image_04579.jpg \n inflating: flower_data/valid/40/image_04584.jpg \n creating: flower_data/valid/47/\n inflating: flower_data/valid/47/image_04989.jpg \n inflating: flower_data/valid/47/image_05007.jpg \n inflating: flower_data/valid/47/image_04957.jpg \n creating: flower_data/valid/78/\n inflating: flower_data/valid/78/image_01944.jpg \n inflating: flower_data/valid/78/image_01894.jpg \n inflating: flower_data/valid/78/image_01937.jpg \n inflating: flower_data/valid/78/image_01922.jpg \n inflating: flower_data/valid/78/image_01844.jpg \n inflating: flower_data/valid/78/image_01929.jpg \n inflating: flower_data/valid/78/image_01914.jpg \n inflating: flower_data/valid/78/image_01862.jpg \n inflating: flower_data/valid/78/image_01938.jpg \n inflating: flower_data/valid/78/image_01910.jpg \n inflating: flower_data/valid/78/image_01962.jpg \n creating: flower_data/valid/2/\n inflating: flower_data/valid/2/image_05101.jpg \n inflating: flower_data/valid/2/image_05142.jpg \n inflating: flower_data/valid/2/image_05124.jpg \n inflating: flower_data/valid/2/image_05136.jpg \n inflating: flower_data/valid/2/image_05094.jpg \n inflating: flower_data/valid/2/image_05137.jpg \n creating: flower_data/valid/13/\n inflating: flower_data/valid/13/image_05763.jpg \n inflating: flower_data/valid/13/image_05749.jpg \n inflating: flower_data/valid/13/image_05759.jpg \n inflating: flower_data/valid/13/image_05772.jpg \n inflating: flower_data/valid/13/image_05783.jpg \n creating: flower_data/valid/5/\n inflating: flower_data/valid/5/image_05164.jpg \n inflating: flower_data/valid/5/image_05199.jpg \n inflating: flower_data/valid/5/image_05188.jpg \n inflating: flower_data/valid/5/image_05192.jpg \n inflating: flower_data/valid/5/image_05209.jpg \n inflating: flower_data/valid/5/image_05196.jpg \n 
inflating: flower_data/valid/5/image_05168.jpg \n creating: flower_data/valid/14/\n inflating: flower_data/valid/14/image_06082.jpg \n creating: flower_data/valid/22/\n inflating: flower_data/valid/22/image_05398.jpg \n inflating: flower_data/valid/22/image_05364.jpg \n inflating: flower_data/valid/22/image_05361.jpg \n inflating: flower_data/valid/22/image_05376.jpg \n inflating: flower_data/valid/22/image_05388.jpg \n inflating: flower_data/valid/22/image_05352.jpg \n inflating: flower_data/valid/22/image_05368.jpg \n inflating: flower_data/valid/22/image_05381.jpg \n creating: flower_data/valid/25/\n inflating: flower_data/valid/25/image_06584.jpg \n inflating: flower_data/valid/25/image_06572.jpg \n creating: flower_data/train/\n creating: flower_data/train/61/\n inflating: flower_data/train/61/image_06281.jpg \n inflating: flower_data/train/61/image_06295.jpg \n inflating: flower_data/train/61/image_06256.jpg \n inflating: flower_data/train/61/image_06257.jpg \n inflating: flower_data/train/61/image_06294.jpg \n inflating: flower_data/train/61/image_06280.jpg \n inflating: flower_data/train/61/image_06282.jpg \n inflating: flower_data/train/61/image_06255.jpg \n inflating: flower_data/train/61/image_06269.jpg \n inflating: flower_data/train/61/image_06268.jpg \n inflating: flower_data/train/61/image_06254.jpg \n inflating: flower_data/train/61/image_06283.jpg \n inflating: flower_data/train/61/image_06287.jpg \n inflating: flower_data/train/61/image_06278.jpg \n inflating: flower_data/train/61/image_06250.jpg \n inflating: flower_data/train/61/image_06286.jpg \n inflating: flower_data/train/61/image_06247.jpg \n inflating: flower_data/train/61/image_06253.jpg \n inflating: flower_data/train/61/image_06252.jpg \n inflating: flower_data/train/61/image_06291.jpg \n inflating: flower_data/train/61/image_06285.jpg \n inflating: flower_data/train/61/image_06288.jpg \n inflating: flower_data/train/61/image_06263.jpg \n inflating: flower_data/train/61/image_06277.jpg 
\n inflating: flower_data/train/61/image_06276.jpg \n inflating: flower_data/train/61/image_06262.jpg \n inflating: flower_data/train/61/image_06289.jpg \n inflating: flower_data/train/61/image_06274.jpg \n inflating: flower_data/train/61/image_06260.jpg \n inflating: flower_data/train/61/image_06275.jpg \n inflating: flower_data/train/61/image_06265.jpg \n inflating: flower_data/train/61/image_06264.jpg \n inflating: flower_data/train/61/image_06270.jpg \n inflating: flower_data/train/61/image_06258.jpg \n inflating: flower_data/train/61/image_06272.jpg \n inflating: flower_data/train/61/image_06267.jpg \n creating: flower_data/train/95/\n inflating: flower_data/train/95/image_07507.jpg \n inflating: flower_data/train/95/image_07498.jpg \n inflating: flower_data/train/95/image_07467.jpg \n inflating: flower_data/train/95/image_07473.jpg \n inflating: flower_data/train/95/image_07466.jpg \n inflating: flower_data/train/95/image_07499.jpg \n inflating: flower_data/train/95/image_07506.jpg \n inflating: flower_data/train/95/image_07538.jpg \n inflating: flower_data/train/95/image_07504.jpg \n inflating: flower_data/train/95/image_07511.jpg \n inflating: flower_data/train/95/image_07539.jpg \n inflating: flower_data/train/95/image_07501.jpg \n inflating: flower_data/train/95/image_07515.jpg \n inflating: flower_data/train/95/image_07529.jpg \n inflating: flower_data/train/95/image_07474.jpg \n inflating: flower_data/train/95/image_07528.jpg \n inflating: flower_data/train/95/image_07500.jpg \n inflating: flower_data/train/95/image_07516.jpg \n inflating: flower_data/train/95/image_07489.jpg \n inflating: flower_data/train/95/image_07476.jpg \n inflating: flower_data/train/95/image_07477.jpg \n inflating: flower_data/train/95/image_07488.jpg \n inflating: flower_data/train/95/image_07503.jpg \n inflating: flower_data/train/95/image_07517.jpg \n inflating: flower_data/train/95/image_07558.jpg \n inflating: flower_data/train/95/image_07570.jpg \n inflating: 
flower_data/train/95/image_07564.jpg \n inflating: flower_data/train/95/image_07565.jpg \n inflating: flower_data/train/95/image_07571.jpg \n inflating: flower_data/train/95/image_07559.jpg \n inflating: flower_data/train/95/image_07567.jpg \n inflating: flower_data/train/95/image_07572.jpg \n inflating: flower_data/train/95/image_07566.jpg \n inflating: flower_data/train/95/image_07589.jpg \n inflating: flower_data/train/95/image_07562.jpg \n inflating: flower_data/train/95/image_07576.jpg \n inflating: flower_data/train/95/image_07577.jpg \n inflating: flower_data/train/95/image_07563.jpg \n inflating: flower_data/train/95/image_07549.jpg \n inflating: flower_data/train/95/image_07548.jpg \n inflating: flower_data/train/95/image_07560.jpg \n inflating: flower_data/train/95/image_07574.jpg \n inflating: flower_data/train/95/image_07592.jpg \n inflating: flower_data/train/95/image_07586.jpg \n inflating: flower_data/train/95/image_07579.jpg \n inflating: flower_data/train/95/image_07545.jpg \n inflating: flower_data/train/95/image_07544.jpg \n inflating: flower_data/train/95/image_07578.jpg \n inflating: flower_data/train/95/image_07587.jpg \n inflating: flower_data/train/95/image_07593.jpg \n inflating: flower_data/train/95/image_07591.jpg \n inflating: flower_data/train/95/image_07546.jpg \n inflating: flower_data/train/95/image_07552.jpg \n inflating: flower_data/train/95/image_07553.jpg \n inflating: flower_data/train/95/image_07547.jpg \n inflating: flower_data/train/95/image_07590.jpg \n inflating: flower_data/train/95/image_07543.jpg \n inflating: flower_data/train/95/image_07557.jpg \n inflating: flower_data/train/95/image_07556.jpg \n inflating: flower_data/train/95/image_07542.jpg \n inflating: flower_data/train/95/image_07583.jpg \n inflating: flower_data/train/95/image_07554.jpg \n inflating: flower_data/train/95/image_07540.jpg \n inflating: flower_data/train/95/image_07568.jpg \n inflating: flower_data/train/95/image_07569.jpg \n inflating: 
flower_data/train/95/image_07541.jpg \n inflating: flower_data/train/95/image_07555.jpg \n inflating: flower_data/train/95/image_07582.jpg \n inflating: flower_data/train/95/image_07532.jpg \n inflating: flower_data/train/95/image_07526.jpg \n inflating: flower_data/train/95/image_07491.jpg \n inflating: flower_data/train/95/image_07490.jpg \n inflating: flower_data/train/95/image_07527.jpg \n inflating: flower_data/train/95/image_07533.jpg \n inflating: flower_data/train/95/image_07525.jpg \n inflating: flower_data/train/95/image_07531.jpg \n inflating: flower_data/train/95/image_07492.jpg \n inflating: flower_data/train/95/image_07486.jpg \n inflating: flower_data/train/95/image_07479.jpg \n inflating: flower_data/train/95/image_07478.jpg \n inflating: flower_data/train/95/image_07493.jpg \n inflating: flower_data/train/95/image_07530.jpg \n inflating: flower_data/train/95/image_07524.jpg \n inflating: flower_data/train/95/image_07518.jpg \n inflating: flower_data/train/95/image_07520.jpg \n inflating: flower_data/train/95/image_07534.jpg \n inflating: flower_data/train/95/image_07497.jpg \n inflating: flower_data/train/95/image_07483.jpg \n inflating: flower_data/train/95/image_07469.jpg \n inflating: flower_data/train/95/image_07482.jpg \n inflating: flower_data/train/95/image_07496.jpg \n inflating: flower_data/train/95/image_07509.jpg \n inflating: flower_data/train/95/image_07535.jpg \n inflating: flower_data/train/95/image_07521.jpg \n inflating: flower_data/train/95/image_07537.jpg \n inflating: flower_data/train/95/image_07523.jpg \n inflating: flower_data/train/95/image_07480.jpg \n inflating: flower_data/train/95/image_07494.jpg \n inflating: flower_data/train/95/image_07495.jpg \n inflating: flower_data/train/95/image_07481.jpg \n inflating: flower_data/train/95/image_07522.jpg \n creating: flower_data/train/59/\n inflating: flower_data/train/59/image_05072.jpg \n inflating: flower_data/train/59/image_05066.jpg \n inflating: 
flower_data/train/59/image_05073.jpg \n inflating: flower_data/train/59/image_05059.jpg \n inflating: flower_data/train/59/image_05065.jpg \n inflating: flower_data/train/59/image_05071.jpg \n inflating: flower_data/train/59/image_05070.jpg \n inflating: flower_data/train/59/image_05058.jpg \n inflating: flower_data/train/59/image_05060.jpg \n inflating: flower_data/train/59/image_05074.jpg \n inflating: flower_data/train/59/image_05048.jpg \n inflating: flower_data/train/59/image_05049.jpg \n inflating: flower_data/train/59/image_05075.jpg \n inflating: flower_data/train/59/image_05061.jpg \n inflating: flower_data/train/59/image_05063.jpg \n inflating: flower_data/train/59/image_05062.jpg \n inflating: flower_data/train/59/image_05076.jpg \n inflating: flower_data/train/59/image_05028.jpg \n inflating: flower_data/train/59/image_05029.jpg \n inflating: flower_data/train/59/image_05030.jpg \n inflating: flower_data/train/59/image_05024.jpg \n inflating: flower_data/train/59/image_05025.jpg \n inflating: flower_data/train/59/image_05031.jpg \n inflating: flower_data/train/59/image_05027.jpg \n inflating: flower_data/train/59/image_05032.jpg \n inflating: flower_data/train/59/image_05026.jpg \n inflating: flower_data/train/59/image_05022.jpg \n inflating: flower_data/train/59/image_05036.jpg \n inflating: flower_data/train/59/image_05037.jpg \n inflating: flower_data/train/59/image_05023.jpg \n inflating: flower_data/train/59/image_05035.jpg \n inflating: flower_data/train/59/image_05021.jpg \n inflating: flower_data/train/59/image_05084.jpg \n inflating: flower_data/train/59/image_05053.jpg \n inflating: flower_data/train/59/image_05047.jpg \n inflating: flower_data/train/59/image_05046.jpg \n inflating: flower_data/train/59/image_05085.jpg \n inflating: flower_data/train/59/image_05078.jpg \n inflating: flower_data/train/59/image_05044.jpg \n inflating: flower_data/train/59/image_05050.jpg \n inflating: flower_data/train/59/image_05051.jpg \n inflating: 
flower_data/train/59/image_05045.jpg \n inflating: flower_data/train/59/image_05079.jpg \n inflating: flower_data/train/59/image_05082.jpg \n inflating: flower_data/train/59/image_05041.jpg \n inflating: flower_data/train/59/image_05055.jpg \n inflating: flower_data/train/59/image_05069.jpg \n inflating: flower_data/train/59/image_05068.jpg \n inflating: flower_data/train/59/image_05054.jpg \n inflating: flower_data/train/59/image_05040.jpg \n inflating: flower_data/train/59/image_05083.jpg \n inflating: flower_data/train/59/image_05081.jpg \n inflating: flower_data/train/59/image_05042.jpg \n inflating: flower_data/train/59/image_05043.jpg \n inflating: flower_data/train/59/image_05057.jpg \n inflating: flower_data/train/59/image_05080.jpg \n creating: flower_data/train/92/\n inflating: flower_data/train/92/image_03038.jpg \n inflating: flower_data/train/92/image_03029.jpg \n inflating: flower_data/train/92/image_03070.jpg \n inflating: flower_data/train/92/image_03064.jpg \n inflating: flower_data/train/92/image_03071.jpg \n inflating: flower_data/train/92/image_03059.jpg \n inflating: flower_data/train/92/image_03067.jpg \n inflating: flower_data/train/92/image_03073.jpg \n inflating: flower_data/train/92/image_03072.jpg \n inflating: flower_data/train/92/image_03066.jpg \n inflating: flower_data/train/92/image_03089.jpg \n inflating: flower_data/train/92/image_03062.jpg \n inflating: flower_data/train/92/image_03076.jpg \n inflating: flower_data/train/92/image_03077.jpg \n inflating: flower_data/train/92/image_03088.jpg \n inflating: flower_data/train/92/image_03061.jpg \n inflating: flower_data/train/92/image_03060.jpg \n inflating: flower_data/train/92/image_03074.jpg \n inflating: flower_data/train/92/image_03092.jpg \n inflating: flower_data/train/92/image_03086.jpg \n inflating: flower_data/train/92/image_03079.jpg \n inflating: flower_data/train/92/image_03051.jpg \n inflating: flower_data/train/92/image_03045.jpg \n inflating: 
flower_data/train/92/image_03050.jpg \n inflating: flower_data/train/92/image_03078.jpg \n inflating: flower_data/train/92/image_03087.jpg \n inflating: flower_data/train/92/image_03093.jpg \n inflating: flower_data/train/92/image_03085.jpg \n inflating: flower_data/train/92/image_03091.jpg \n inflating: flower_data/train/92/image_03046.jpg \n inflating: flower_data/train/92/image_03053.jpg \n inflating: flower_data/train/92/image_03047.jpg \n inflating: flower_data/train/92/image_03090.jpg \n inflating: flower_data/train/92/image_03084.jpg \n inflating: flower_data/train/92/image_03080.jpg \n inflating: flower_data/train/92/image_03094.jpg \n inflating: flower_data/train/92/image_03043.jpg \n inflating: flower_data/train/92/image_03057.jpg \n inflating: flower_data/train/92/image_03042.jpg \n inflating: flower_data/train/92/image_03081.jpg \n inflating: flower_data/train/92/image_03083.jpg \n inflating: flower_data/train/92/image_03040.jpg \n inflating: flower_data/train/92/image_03068.jpg \n inflating: flower_data/train/92/image_03069.jpg \n inflating: flower_data/train/92/image_03055.jpg \n inflating: flower_data/train/92/image_03082.jpg \n inflating: flower_data/train/92/image_03033.jpg \n inflating: flower_data/train/92/image_03031.jpg \n inflating: flower_data/train/92/image_03030.jpg \n inflating: flower_data/train/92/image_03034.jpg \n inflating: flower_data/train/92/image_03035.jpg \n inflating: flower_data/train/92/image_03037.jpg \n inflating: flower_data/train/92/image_03036.jpg \n creating: flower_data/train/66/\n inflating: flower_data/train/66/image_05529.jpg \n inflating: flower_data/train/66/image_05528.jpg \n inflating: flower_data/train/66/image_05538.jpg \n inflating: flower_data/train/66/image_05539.jpg \n inflating: flower_data/train/66/image_05575.jpg \n inflating: flower_data/train/66/image_05561.jpg \n inflating: flower_data/train/66/image_05574.jpg \n inflating: flower_data/train/66/image_05548.jpg \n inflating: 
flower_data/train/66/image_05576.jpg \n inflating: flower_data/train/66/image_05567.jpg \n inflating: flower_data/train/66/image_05573.jpg \n inflating: flower_data/train/66/image_05572.jpg \n inflating: flower_data/train/66/image_05566.jpg \n inflating: flower_data/train/66/image_05570.jpg \n inflating: flower_data/train/66/image_05564.jpg \n inflating: flower_data/train/66/image_05558.jpg \n inflating: flower_data/train/66/image_05571.jpg \n inflating: flower_data/train/66/image_05583.jpg \n inflating: flower_data/train/66/image_05568.jpg \n inflating: flower_data/train/66/image_05554.jpg \n inflating: flower_data/train/66/image_05540.jpg \n inflating: flower_data/train/66/image_05541.jpg \n inflating: flower_data/train/66/image_05555.jpg \n inflating: flower_data/train/66/image_05569.jpg \n inflating: flower_data/train/66/image_05580.jpg \n inflating: flower_data/train/66/image_05543.jpg \n inflating: flower_data/train/66/image_05557.jpg \n inflating: flower_data/train/66/image_05556.jpg \n inflating: flower_data/train/66/image_05542.jpg \n inflating: flower_data/train/66/image_05581.jpg \n inflating: flower_data/train/66/image_05546.jpg \n inflating: flower_data/train/66/image_05552.jpg \n inflating: flower_data/train/66/image_05553.jpg \n inflating: flower_data/train/66/image_05547.jpg \n inflating: flower_data/train/66/image_05551.jpg \n inflating: flower_data/train/66/image_05545.jpg \n inflating: flower_data/train/66/image_05578.jpg \n inflating: flower_data/train/66/image_05544.jpg \n inflating: flower_data/train/66/image_05550.jpg \n inflating: flower_data/train/66/image_05523.jpg \n inflating: flower_data/train/66/image_05536.jpg \n inflating: flower_data/train/66/image_05534.jpg \n inflating: flower_data/train/66/image_05535.jpg \n inflating: flower_data/train/66/image_05525.jpg \n inflating: flower_data/train/66/image_05531.jpg \n inflating: flower_data/train/66/image_05530.jpg \n inflating: flower_data/train/66/image_05524.jpg \n inflating: 
flower_data/train/66/image_05532.jpg \n inflating: flower_data/train/66/image_05526.jpg \n inflating: flower_data/train/66/image_05527.jpg \n inflating: flower_data/train/66/image_05533.jpg \n creating: flower_data/train/50/\n inflating: flower_data/train/50/image_06341.jpg \n inflating: flower_data/train/50/image_06553.jpg \n inflating: flower_data/train/50/image_06340.jpg \n inflating: flower_data/train/50/image_06342.jpg \n inflating: flower_data/train/50/image_06545.jpg \n inflating: flower_data/train/50/image_06551.jpg \n inflating: flower_data/train/50/image_06343.jpg \n inflating: flower_data/train/50/image_06569.jpg \n inflating: flower_data/train/50/image_06555.jpg \n inflating: flower_data/train/50/image_06540.jpg \n inflating: flower_data/train/50/image_06554.jpg \n inflating: flower_data/train/50/image_06568.jpg \n inflating: flower_data/train/50/image_06344.jpg \n inflating: flower_data/train/50/image_06542.jpg \n inflating: flower_data/train/50/image_06556.jpg \n inflating: flower_data/train/50/image_06557.jpg \n inflating: flower_data/train/50/image_06543.jpg \n inflating: flower_data/train/50/image_06345.jpg \n inflating: flower_data/train/50/image_06336.jpg \n inflating: flower_data/train/50/image_06322.jpg \n inflating: flower_data/train/50/image_06530.jpg \n inflating: flower_data/train/50/image_06531.jpg \n inflating: flower_data/train/50/image_06323.jpg \n inflating: flower_data/train/50/image_06337.jpg \n inflating: flower_data/train/50/image_06335.jpg \n inflating: flower_data/train/50/image_06309.jpg \n inflating: flower_data/train/50/image_06533.jpg \n inflating: flower_data/train/50/image_06532.jpg \n inflating: flower_data/train/50/image_06308.jpg \n inflating: flower_data/train/50/image_06334.jpg \n inflating: flower_data/train/50/image_06330.jpg \n inflating: flower_data/train/50/image_06536.jpg \n inflating: flower_data/train/50/image_06537.jpg \n inflating: flower_data/train/50/image_06325.jpg \n inflating: 
flower_data/train/50/image_06333.jpg \n inflating: flower_data/train/50/image_06327.jpg \n inflating: flower_data/train/50/image_06535.jpg \n inflating: flower_data/train/50/image_06326.jpg \n inflating: flower_data/train/50/image_06332.jpg \n inflating: flower_data/train/50/image_06317.jpg \n inflating: flower_data/train/50/image_06303.jpg \n inflating: flower_data/train/50/image_06539.jpg \n inflating: flower_data/train/50/image_06538.jpg \n inflating: flower_data/train/50/image_06302.jpg \n inflating: flower_data/train/50/image_06316.jpg \n inflating: flower_data/train/50/image_06300.jpg \n inflating: flower_data/train/50/image_06314.jpg \n inflating: flower_data/train/50/image_06328.jpg \n inflating: flower_data/train/50/image_06329.jpg \n inflating: flower_data/train/50/image_06315.jpg \n inflating: flower_data/train/50/image_06301.jpg \n inflating: flower_data/train/50/image_06339.jpg \n inflating: flower_data/train/50/image_06310.jpg \n inflating: flower_data/train/50/image_06304.jpg \n inflating: flower_data/train/50/image_06338.jpg \n inflating: flower_data/train/50/image_06312.jpg \n inflating: flower_data/train/50/image_06306.jpg \n inflating: flower_data/train/50/image_06299.jpg \n inflating: flower_data/train/50/image_06529.jpg \n inflating: flower_data/train/50/image_06298.jpg \n inflating: flower_data/train/50/image_06307.jpg \n inflating: flower_data/train/50/image_06313.jpg \n inflating: flower_data/train/50/image_06566.jpg \n inflating: flower_data/train/50/image_06567.jpg \n inflating: flower_data/train/50/image_06565.jpg \n inflating: flower_data/train/50/image_06559.jpg \n inflating: flower_data/train/50/image_06564.jpg \n inflating: flower_data/train/50/image_06570.jpg \n inflating: flower_data/train/50/image_06548.jpg \n inflating: flower_data/train/50/image_06560.jpg \n inflating: flower_data/train/50/image_06561.jpg \n inflating: flower_data/train/50/image_06549.jpg \n inflating: flower_data/train/50/image_06563.jpg \n creating: 
flower_data/train/68/\n inflating: flower_data/train/68/image_05933.jpg \n inflating: flower_data/train/68/image_05930.jpg \n inflating: flower_data/train/68/image_05918.jpg \n inflating: flower_data/train/68/image_05931.jpg \n inflating: flower_data/train/68/image_05925.jpg \n inflating: flower_data/train/68/image_05909.jpg \n inflating: flower_data/train/68/image_05921.jpg \n inflating: flower_data/train/68/image_05935.jpg \n inflating: flower_data/train/68/image_05934.jpg \n inflating: flower_data/train/68/image_05920.jpg \n inflating: flower_data/train/68/image_05936.jpg \n inflating: flower_data/train/68/image_05922.jpg \n inflating: flower_data/train/68/image_05923.jpg \n inflating: flower_data/train/68/image_05937.jpg \n inflating: flower_data/train/68/image_05950.jpg \n inflating: flower_data/train/68/image_05944.jpg \n inflating: flower_data/train/68/image_05945.jpg \n inflating: flower_data/train/68/image_05951.jpg \n inflating: flower_data/train/68/image_05947.jpg \n inflating: flower_data/train/68/image_05953.jpg \n inflating: flower_data/train/68/image_05952.jpg \n inflating: flower_data/train/68/image_05946.jpg \n inflating: flower_data/train/68/image_05942.jpg \n inflating: flower_data/train/68/image_05943.jpg \n inflating: flower_data/train/68/image_05955.jpg \n inflating: flower_data/train/68/image_05941.jpg \n inflating: flower_data/train/68/image_05940.jpg \n inflating: flower_data/train/68/image_05954.jpg \n inflating: flower_data/train/68/image_05948.jpg \n inflating: flower_data/train/68/image_05949.jpg \n inflating: flower_data/train/68/image_05912.jpg \n inflating: flower_data/train/68/image_05906.jpg \n inflating: flower_data/train/68/image_05907.jpg \n inflating: flower_data/train/68/image_05913.jpg \n inflating: flower_data/train/68/image_05905.jpg \n inflating: flower_data/train/68/image_05911.jpg \n inflating: flower_data/train/68/image_05939.jpg \n inflating: flower_data/train/68/image_05938.jpg \n inflating: 
flower_data/train/60/image_03002.jpg \n inflating: flower_data/train/60/image_03017.jpg \n inflating: flower_data/train/60/image_02965.jpg \n inflating: flower_data/train/60/image_02970.jpg \n inflating: flower_data/train/60/image_02964.jpg \n inflating: flower_data/train/60/image_02972.jpg \n inflating: flower_data/train/60/image_02966.jpg \n inflating: flower_data/train/60/image_02999.jpg \n inflating: flower_data/train/60/image_02998.jpg \n inflating: flower_data/train/60/image_02967.jpg \n inflating: flower_data/train/60/image_02973.jpg \n inflating: flower_data/train/60/image_02963.jpg \n inflating: flower_data/train/60/image_02988.jpg \n inflating: flower_data/train/60/image_02989.jpg \n inflating: flower_data/train/60/image_02962.jpg \n inflating: flower_data/train/60/image_02976.jpg \n inflating: flower_data/train/60/image_02960.jpg \n inflating: flower_data/train/60/image_02949.jpg \n inflating: flower_data/train/60/image_02975.jpg \n inflating: flower_data/train/60/image_02944.jpg \n inflating: flower_data/train/60/image_02950.jpg \n inflating: flower_data/train/60/image_02993.jpg \n inflating: flower_data/train/60/image_02986.jpg \n inflating: flower_data/train/60/image_02951.jpg \n inflating: flower_data/train/60/image_02945.jpg \n inflating: flower_data/train/60/image_02979.jpg \n inflating: flower_data/train/60/image_02953.jpg \n inflating: flower_data/train/60/image_02947.jpg \n inflating: flower_data/train/60/image_02990.jpg \n inflating: flower_data/train/60/image_02984.jpg \n inflating: flower_data/train/60/image_02985.jpg \n inflating: flower_data/train/60/image_02991.jpg \n inflating: flower_data/train/60/image_02946.jpg \n inflating: flower_data/train/60/image_02952.jpg \n inflating: flower_data/train/60/image_02956.jpg \n inflating: flower_data/train/60/image_02942.jpg \n inflating: flower_data/train/60/image_02995.jpg \n inflating: flower_data/train/60/image_02980.jpg \n inflating: flower_data/train/60/image_02994.jpg \n inflating: 
flower_data/train/60/image_02943.jpg \n inflating: flower_data/train/60/image_02957.jpg \n inflating: flower_data/train/60/image_02941.jpg \n inflating: flower_data/train/60/image_02955.jpg \n inflating: flower_data/train/60/image_02969.jpg \n inflating: flower_data/train/60/image_02982.jpg \n inflating: flower_data/train/60/image_02983.jpg \n inflating: flower_data/train/60/image_02968.jpg \n inflating: flower_data/train/60/image_02954.jpg \n inflating: flower_data/train/60/image_02940.jpg \n inflating: flower_data/train/60/image_02927.jpg \n inflating: flower_data/train/60/image_02926.jpg \n inflating: flower_data/train/60/image_03027.jpg \n inflating: flower_data/train/60/image_03019.jpg \n inflating: flower_data/train/60/image_03025.jpg \n inflating: flower_data/train/60/image_02930.jpg \n inflating: flower_data/train/60/image_02924.jpg \n inflating: flower_data/train/60/image_02925.jpg \n inflating: flower_data/train/60/image_02931.jpg \n inflating: flower_data/train/60/image_03024.jpg \n inflating: flower_data/train/60/image_03018.jpg \n inflating: flower_data/train/60/image_03020.jpg \n inflating: flower_data/train/60/image_02935.jpg \n inflating: flower_data/train/60/image_02921.jpg \n inflating: flower_data/train/60/image_02920.jpg \n inflating: flower_data/train/60/image_02934.jpg \n inflating: flower_data/train/60/image_03009.jpg \n inflating: flower_data/train/60/image_03021.jpg \n inflating: flower_data/train/60/image_03023.jpg \n inflating: flower_data/train/60/image_02922.jpg \n inflating: flower_data/train/60/image_03022.jpg \n creating: flower_data/train/34/\n inflating: flower_data/train/34/image_06962.jpg \n inflating: flower_data/train/34/image_06960.jpg \n inflating: flower_data/train/34/image_06948.jpg \n inflating: flower_data/train/34/image_06958.jpg \n inflating: flower_data/train/34/image_06965.jpg \n inflating: flower_data/train/34/image_06967.jpg \n inflating: flower_data/train/34/image_06966.jpg \n inflating: 
flower_data/train/34/image_06938.jpg \n inflating: flower_data/train/34/image_06934.jpg \n inflating: flower_data/train/34/image_06935.jpg \n inflating: flower_data/train/34/image_06937.jpg \n inflating: flower_data/train/34/image_06936.jpg \n inflating: flower_data/train/34/image_06932.jpg \n inflating: flower_data/train/34/image_06931.jpg \n inflating: flower_data/train/34/image_06930.jpg \n inflating: flower_data/train/34/image_06943.jpg \n inflating: flower_data/train/34/image_06957.jpg \n inflating: flower_data/train/34/image_06956.jpg \n inflating: flower_data/train/34/image_06942.jpg \n inflating: flower_data/train/34/image_06968.jpg \n inflating: flower_data/train/34/image_06940.jpg \n inflating: flower_data/train/34/image_06951.jpg \n inflating: flower_data/train/34/image_06945.jpg \n inflating: flower_data/train/34/image_06944.jpg \n inflating: flower_data/train/34/image_06946.jpg \n inflating: flower_data/train/34/image_06952.jpg \n inflating: flower_data/train/34/image_06953.jpg \n inflating: flower_data/train/34/image_06947.jpg \n creating: flower_data/train/33/\n inflating: flower_data/train/33/image_06450.jpg \n inflating: flower_data/train/33/image_06445.jpg \n inflating: flower_data/train/33/image_06451.jpg \n inflating: flower_data/train/33/image_06484.jpg \n inflating: flower_data/train/33/image_06447.jpg \n inflating: flower_data/train/33/image_06453.jpg \n inflating: flower_data/train/33/image_06452.jpg \n inflating: flower_data/train/33/image_06446.jpg \n inflating: flower_data/train/33/image_06442.jpg \n inflating: flower_data/train/33/image_06480.jpg \n inflating: flower_data/train/33/image_06482.jpg \n inflating: flower_data/train/33/image_06469.jpg \n inflating: flower_data/train/33/image_06483.jpg \n inflating: flower_data/train/33/image_06471.jpg \n inflating: flower_data/train/33/image_06465.jpg \n inflating: flower_data/train/33/image_06459.jpg \n inflating: flower_data/train/33/image_06458.jpg \n inflating: 
flower_data/train/33/image_06464.jpg \n inflating: flower_data/train/33/image_06466.jpg \n inflating: flower_data/train/33/image_06472.jpg \n inflating: flower_data/train/33/image_06473.jpg \n inflating: flower_data/train/33/image_06467.jpg \n inflating: flower_data/train/33/image_06463.jpg \n inflating: flower_data/train/33/image_06477.jpg \n inflating: flower_data/train/33/image_06476.jpg \n inflating: flower_data/train/33/image_06462.jpg \n inflating: flower_data/train/33/image_06448.jpg \n inflating: flower_data/train/33/image_06474.jpg \n inflating: flower_data/train/33/image_06461.jpg \n inflating: flower_data/train/33/image_06475.jpg \n inflating: flower_data/train/33/image_06449.jpg \n creating: flower_data/train/20/\n inflating: flower_data/train/20/image_04949.jpg \n inflating: flower_data/train/20/image_04948.jpg \n inflating: flower_data/train/20/image_04904.jpg \n inflating: flower_data/train/20/image_04911.jpg \n inflating: flower_data/train/20/image_04905.jpg \n inflating: flower_data/train/20/image_04939.jpg \n inflating: flower_data/train/20/image_04907.jpg \n inflating: flower_data/train/20/image_04898.jpg \n inflating: flower_data/train/20/image_04899.jpg \n inflating: flower_data/train/20/image_04906.jpg \n inflating: flower_data/train/20/image_04916.jpg \n inflating: flower_data/train/20/image_04902.jpg \n inflating: flower_data/train/20/image_04917.jpg \n inflating: flower_data/train/20/image_04901.jpg \n inflating: flower_data/train/20/image_04915.jpg \n inflating: flower_data/train/20/image_04929.jpg \n inflating: flower_data/train/20/image_04928.jpg \n inflating: flower_data/train/20/image_04914.jpg \n inflating: flower_data/train/20/image_04900.jpg \n inflating: flower_data/train/20/image_04919.jpg \n inflating: flower_data/train/20/image_04925.jpg \n inflating: flower_data/train/20/image_04931.jpg \n inflating: flower_data/train/20/image_04930.jpg \n inflating: flower_data/train/20/image_04924.jpg \n inflating: 
flower_data/train/20/image_04932.jpg \n inflating: flower_data/train/20/image_04926.jpg \n inflating: flower_data/train/20/image_04933.jpg \n inflating: flower_data/train/20/image_04937.jpg \n inflating: flower_data/train/20/image_04923.jpg \n inflating: flower_data/train/20/image_04922.jpg \n inflating: flower_data/train/20/image_04936.jpg \n inflating: flower_data/train/20/image_04934.jpg \n inflating: flower_data/train/20/image_04908.jpg \n inflating: flower_data/train/20/image_04897.jpg \n inflating: flower_data/train/20/image_04909.jpg \n inflating: flower_data/train/20/image_04935.jpg \n inflating: flower_data/train/20/image_04921.jpg \n inflating: flower_data/train/20/image_04946.jpg \n inflating: flower_data/train/20/image_04952.jpg \n inflating: flower_data/train/20/image_04947.jpg \n inflating: flower_data/train/20/image_04951.jpg \n inflating: flower_data/train/20/image_04945.jpg \n inflating: flower_data/train/20/image_04944.jpg \n inflating: flower_data/train/20/image_04950.jpg \n inflating: flower_data/train/20/image_04941.jpg \n inflating: flower_data/train/20/image_04943.jpg \n creating: flower_data/train/18/\n inflating: flower_data/train/18/image_04284.jpg \n inflating: flower_data/train/18/image_04247.jpg \n inflating: flower_data/train/18/image_04253.jpg \n inflating: flower_data/train/18/image_04246.jpg \n inflating: flower_data/train/18/image_04291.jpg \n inflating: flower_data/train/18/image_04285.jpg \n inflating: flower_data/train/18/image_04293.jpg \n inflating: flower_data/train/18/image_04287.jpg \n inflating: flower_data/train/18/image_04250.jpg \n inflating: flower_data/train/18/image_04244.jpg \n inflating: flower_data/train/18/image_04324.jpg \n inflating: flower_data/train/18/image_04318.jpg \n inflating: flower_data/train/18/image_04319.jpg \n inflating: flower_data/train/18/image_04325.jpg \n inflating: flower_data/train/18/image_04279.jpg \n inflating: flower_data/train/18/image_04245.jpg \n inflating: 
flower_data/train/18/image_04251.jpg \n inflating: flower_data/train/18/image_04286.jpg \n inflating: flower_data/train/18/image_04296.jpg \n inflating: flower_data/train/18/image_04282.jpg \n inflating: flower_data/train/18/image_04269.jpg \n inflating: flower_data/train/18/image_04255.jpg \n inflating: flower_data/train/18/image_04309.jpg \n inflating: flower_data/train/18/image_04321.jpg \n inflating: flower_data/train/18/image_04320.jpg \n inflating: flower_data/train/18/image_04308.jpg \n inflating: flower_data/train/18/image_04268.jpg \n inflating: flower_data/train/18/image_04283.jpg \n inflating: flower_data/train/18/image_04297.jpg \n inflating: flower_data/train/18/image_04281.jpg \n inflating: flower_data/train/18/image_04295.jpg \n inflating: flower_data/train/18/image_04323.jpg \n inflating: flower_data/train/18/image_04257.jpg \n inflating: flower_data/train/18/image_04294.jpg \n inflating: flower_data/train/18/image_04280.jpg \n inflating: flower_data/train/18/image_04266.jpg \n inflating: flower_data/train/18/image_04306.jpg \n inflating: flower_data/train/18/image_04307.jpg \n inflating: flower_data/train/18/image_04313.jpg \n inflating: flower_data/train/18/image_04273.jpg \n inflating: flower_data/train/18/image_04267.jpg \n inflating: flower_data/train/18/image_04298.jpg \n inflating: flower_data/train/18/image_04271.jpg \n inflating: flower_data/train/18/image_04265.jpg \n inflating: flower_data/train/18/image_04259.jpg \n inflating: flower_data/train/18/image_04305.jpg \n inflating: flower_data/train/18/image_04310.jpg \n inflating: flower_data/train/18/image_04304.jpg \n inflating: flower_data/train/18/image_04258.jpg \n inflating: flower_data/train/18/image_04264.jpg \n inflating: flower_data/train/18/image_04270.jpg \n inflating: flower_data/train/18/image_04274.jpg \n inflating: flower_data/train/18/image_04260.jpg \n inflating: flower_data/train/18/image_04300.jpg \n inflating: flower_data/train/18/image_04314.jpg \n inflating: 
flower_data/train/18/image_04301.jpg \n inflating: flower_data/train/18/image_04275.jpg \n inflating: flower_data/train/18/image_04249.jpg \n inflating: flower_data/train/18/image_04288.jpg \n inflating: flower_data/train/18/image_04303.jpg \n inflating: flower_data/train/18/image_04302.jpg \n inflating: flower_data/train/18/image_04316.jpg \n inflating: flower_data/train/18/image_04276.jpg \n inflating: flower_data/train/18/image_04262.jpg \n inflating: flower_data/train/18/image_04289.jpg \n creating: flower_data/train/27/\n inflating: flower_data/train/27/image_06875.jpg \n inflating: flower_data/train/27/image_06861.jpg \n inflating: flower_data/train/27/image_06860.jpg \n inflating: flower_data/train/27/image_06874.jpg \n inflating: flower_data/train/27/image_06862.jpg \n inflating: flower_data/train/27/image_06876.jpg \n inflating: flower_data/train/27/image_06889.jpg \n inflating: flower_data/train/27/image_06888.jpg \n inflating: flower_data/train/27/image_06877.jpg \n inflating: flower_data/train/27/image_06863.jpg \n inflating: flower_data/train/27/image_06867.jpg \n inflating: flower_data/train/27/image_06873.jpg \n inflating: flower_data/train/27/image_06872.jpg \n inflating: flower_data/train/27/image_06866.jpg \n inflating: flower_data/train/27/image_06870.jpg \n inflating: flower_data/train/27/image_06858.jpg \n inflating: flower_data/train/27/image_06859.jpg \n inflating: flower_data/train/27/image_06865.jpg \n inflating: flower_data/train/27/image_06854.jpg \n inflating: flower_data/train/27/image_06883.jpg \n inflating: flower_data/train/27/image_06882.jpg \n inflating: flower_data/train/27/image_06855.jpg \n inflating: flower_data/train/27/image_06869.jpg \n inflating: flower_data/train/27/image_06857.jpg \n inflating: flower_data/train/27/image_06880.jpg \n inflating: flower_data/train/27/image_06881.jpg \n inflating: flower_data/train/27/image_06856.jpg \n inflating: flower_data/train/27/image_06852.jpg \n inflating: 
flower_data/train/27/image_06885.jpg \n inflating: flower_data/train/27/image_06884.jpg \n inflating: flower_data/train/27/image_06853.jpg \n inflating: flower_data/train/27/image_06851.jpg \n inflating: flower_data/train/27/image_06879.jpg \n inflating: flower_data/train/27/image_06886.jpg \n inflating: flower_data/train/27/image_06878.jpg \n inflating: flower_data/train/27/image_06850.jpg \n creating: flower_data/train/9/\n inflating: flower_data/train/9/image_06396.jpg \n inflating: flower_data/train/9/image_06433.jpg \n inflating: flower_data/train/9/image_06427.jpg \n inflating: flower_data/train/9/image_06426.jpg \n inflating: flower_data/train/9/image_06432.jpg \n inflating: flower_data/train/9/image_06397.jpg \n inflating: flower_data/train/9/image_06395.jpg \n inflating: flower_data/train/9/image_06424.jpg \n inflating: flower_data/train/9/image_06430.jpg \n inflating: flower_data/train/9/image_06418.jpg \n inflating: flower_data/train/9/image_06419.jpg \n inflating: flower_data/train/9/image_06431.jpg \n inflating: flower_data/train/9/image_06425.jpg \n inflating: flower_data/train/9/image_06409.jpg \n inflating: flower_data/train/9/image_06421.jpg \n inflating: flower_data/train/9/image_06435.jpg \n inflating: flower_data/train/9/image_06434.jpg \n inflating: flower_data/train/9/image_06408.jpg \n inflating: flower_data/train/9/image_06436.jpg \n inflating: flower_data/train/9/image_06422.jpg \n inflating: flower_data/train/9/image_06423.jpg \n inflating: flower_data/train/9/image_06437.jpg \n inflating: flower_data/train/9/image_06440.jpg \n inflating: flower_data/train/9/image_06412.jpg \n inflating: flower_data/train/9/image_06406.jpg \n inflating: flower_data/train/9/image_06407.jpg \n inflating: flower_data/train/9/image_06405.jpg \n inflating: flower_data/train/9/image_06411.jpg \n inflating: flower_data/train/9/image_06439.jpg \n inflating: flower_data/train/9/image_06438.jpg \n inflating: flower_data/train/9/image_06404.jpg \n inflating: 
flower_data/train/9/image_06399.jpg \n inflating: flower_data/train/9/image_06428.jpg \n inflating: flower_data/train/9/image_06400.jpg \n inflating: flower_data/train/9/image_06415.jpg \n inflating: flower_data/train/9/image_06401.jpg \n inflating: flower_data/train/9/image_06429.jpg \n inflating: flower_data/train/9/image_06417.jpg \n inflating: flower_data/train/9/image_06403.jpg \n inflating: flower_data/train/9/image_06402.jpg \n inflating: flower_data/train/9/image_06416.jpg \n creating: flower_data/train/11/\n inflating: flower_data/train/11/image_03167.jpg \n inflating: flower_data/train/11/image_03172.jpg \n inflating: flower_data/train/11/image_03166.jpg \n inflating: flower_data/train/11/image_03158.jpg \n inflating: flower_data/train/11/image_03170.jpg \n inflating: flower_data/train/11/image_03164.jpg \n inflating: flower_data/train/11/image_03171.jpg \n inflating: flower_data/train/11/image_03159.jpg \n inflating: flower_data/train/11/image_03175.jpg \n inflating: flower_data/train/11/image_03161.jpg \n inflating: flower_data/train/11/image_03149.jpg \n inflating: flower_data/train/11/image_03148.jpg \n inflating: flower_data/train/11/image_03160.jpg \n inflating: flower_data/train/11/image_03174.jpg \n inflating: flower_data/train/11/image_03162.jpg \n inflating: flower_data/train/11/image_03163.jpg \n inflating: flower_data/train/11/image_03138.jpg \n inflating: flower_data/train/11/image_03104.jpg \n inflating: flower_data/train/11/image_03110.jpg \n inflating: flower_data/train/11/image_03105.jpg \n inflating: flower_data/train/11/image_03113.jpg \n inflating: flower_data/train/11/image_03107.jpg \n inflating: flower_data/train/11/image_03106.jpg \n inflating: flower_data/train/11/image_03112.jpg \n inflating: flower_data/train/11/image_03099.jpg \n inflating: flower_data/train/11/image_03102.jpg \n inflating: flower_data/train/11/image_03103.jpg \n inflating: flower_data/train/11/image_03117.jpg \n inflating: flower_data/train/11/image_03101.jpg 
\n inflating: flower_data/train/11/image_03129.jpg \n inflating: flower_data/train/11/image_03114.jpg \n inflating: flower_data/train/11/image_03119.jpg \n inflating: flower_data/train/11/image_03131.jpg \n inflating: flower_data/train/11/image_03124.jpg \n inflating: flower_data/train/11/image_03118.jpg \n inflating: flower_data/train/11/image_03132.jpg \n inflating: flower_data/train/11/image_03126.jpg \n inflating: flower_data/train/11/image_03127.jpg \n inflating: flower_data/train/11/image_03133.jpg \n inflating: flower_data/train/11/image_03137.jpg \n inflating: flower_data/train/11/image_03123.jpg \n inflating: flower_data/train/11/image_03122.jpg \n inflating: flower_data/train/11/image_03136.jpg \n inflating: flower_data/train/11/image_03095.jpg \n inflating: flower_data/train/11/image_03097.jpg \n inflating: flower_data/train/11/image_03120.jpg \n inflating: flower_data/train/11/image_03134.jpg \n inflating: flower_data/train/11/image_03109.jpg \n inflating: flower_data/train/11/image_03135.jpg \n inflating: flower_data/train/11/image_03121.jpg \n inflating: flower_data/train/11/image_03096.jpg \n inflating: flower_data/train/11/image_03146.jpg \n inflating: flower_data/train/11/image_03152.jpg \n inflating: flower_data/train/11/image_03153.jpg \n inflating: flower_data/train/11/image_03179.jpg \n inflating: flower_data/train/11/image_03144.jpg \n inflating: flower_data/train/11/image_03150.jpg \n inflating: flower_data/train/11/image_03178.jpg \n inflating: flower_data/train/11/image_03154.jpg \n inflating: flower_data/train/11/image_03140.jpg \n inflating: flower_data/train/11/image_03168.jpg \n inflating: flower_data/train/11/image_03169.jpg \n inflating: flower_data/train/11/image_03155.jpg \n inflating: flower_data/train/11/image_03180.jpg \n inflating: flower_data/train/11/image_03143.jpg \n inflating: flower_data/train/11/image_03156.jpg \n inflating: flower_data/train/11/image_03142.jpg \n inflating: flower_data/train/11/image_03181.jpg \n 
creating: flower_data/train/7/\n inflating: flower_data/train/7/image_07202.jpg \n inflating: flower_data/train/7/image_07203.jpg \n inflating: flower_data/train/7/image_07217.jpg \n inflating: flower_data/train/7/image_07229.jpg \n inflating: flower_data/train/7/image_07201.jpg \n inflating: flower_data/train/7/image_07214.jpg \n inflating: flower_data/train/7/image_07200.jpg \n inflating: flower_data/train/7/image_07228.jpg \n inflating: flower_data/train/7/image_07204.jpg \n inflating: flower_data/train/7/image_07210.jpg \n inflating: flower_data/train/7/image_07205.jpg \n inflating: flower_data/train/7/image_07213.jpg \n inflating: flower_data/train/7/image_07207.jpg \n inflating: flower_data/train/7/image_07206.jpg \n inflating: flower_data/train/7/image_07212.jpg \n inflating: flower_data/train/7/image_07223.jpg \n inflating: flower_data/train/7/image_07222.jpg \n inflating: flower_data/train/7/image_07208.jpg \n inflating: flower_data/train/7/image_07220.jpg \n inflating: flower_data/train/7/image_07221.jpg \n inflating: flower_data/train/7/image_07209.jpg \n inflating: flower_data/train/7/image_07225.jpg \n inflating: flower_data/train/7/image_08102.jpg \n inflating: flower_data/train/7/image_07231.jpg \n inflating: flower_data/train/7/image_07230.jpg \n inflating: flower_data/train/7/image_08103.jpg \n inflating: flower_data/train/7/image_07224.jpg \n inflating: flower_data/train/7/image_07232.jpg \n inflating: flower_data/train/7/image_08101.jpg \n inflating: flower_data/train/7/image_07226.jpg \n inflating: flower_data/train/7/image_07227.jpg \n inflating: flower_data/train/7/image_08100.jpg \n inflating: flower_data/train/7/image_07233.jpg \n creating: flower_data/train/29/\n inflating: flower_data/train/29/image_04152.jpg \n inflating: flower_data/train/29/image_04147.jpg \n inflating: flower_data/train/29/image_04153.jpg \n inflating: flower_data/train/29/image_04151.jpg \n inflating: flower_data/train/29/image_04150.jpg \n inflating: 
flower_data/train/29/image_04144.jpg \n inflating: flower_data/train/29/image_04140.jpg \n inflating: flower_data/train/29/image_04154.jpg \n inflating: flower_data/train/29/image_04155.jpg \n inflating: flower_data/train/29/image_04141.jpg \n inflating: flower_data/train/29/image_04157.jpg \n inflating: flower_data/train/29/image_04156.jpg \n inflating: flower_data/train/29/image_04086.jpg \n inflating: flower_data/train/29/image_04092.jpg \n inflating: flower_data/train/29/image_04131.jpg \n inflating: flower_data/train/29/image_04125.jpg \n inflating: flower_data/train/29/image_04119.jpg \n inflating: flower_data/train/29/image_04118.jpg \n inflating: flower_data/train/29/image_04130.jpg \n inflating: flower_data/train/29/image_04093.jpg \n inflating: flower_data/train/29/image_04087.jpg \n inflating: flower_data/train/29/image_04091.jpg \n inflating: flower_data/train/29/image_04085.jpg \n inflating: flower_data/train/29/image_04126.jpg \n inflating: flower_data/train/29/image_04132.jpg \n inflating: flower_data/train/29/image_04133.jpg \n inflating: flower_data/train/29/image_04127.jpg \n inflating: flower_data/train/29/image_04084.jpg \n inflating: flower_data/train/29/image_04094.jpg \n inflating: flower_data/train/29/image_04123.jpg \n inflating: flower_data/train/29/image_04136.jpg \n inflating: flower_data/train/29/image_04122.jpg \n inflating: flower_data/train/29/image_04081.jpg \n inflating: flower_data/train/29/image_04134.jpg \n inflating: flower_data/train/29/image_04120.jpg \n inflating: flower_data/train/29/image_04121.jpg \n inflating: flower_data/train/29/image_04135.jpg \n inflating: flower_data/train/29/image_04109.jpg \n inflating: flower_data/train/29/image_04096.jpg \n inflating: flower_data/train/29/image_04082.jpg \n inflating: flower_data/train/29/image_04110.jpg \n inflating: flower_data/train/29/image_04138.jpg \n inflating: flower_data/train/29/image_04139.jpg \n inflating: flower_data/train/29/image_04105.jpg \n inflating: 
flower_data/train/29/image_04111.jpg \n inflating: flower_data/train/29/image_04098.jpg \n inflating: flower_data/train/29/image_04107.jpg \n inflating: flower_data/train/29/image_04113.jpg \n inflating: flower_data/train/29/image_04106.jpg \n inflating: flower_data/train/29/image_04089.jpg \n inflating: flower_data/train/29/image_04102.jpg \n inflating: flower_data/train/29/image_04117.jpg \n inflating: flower_data/train/29/image_04088.jpg \n inflating: flower_data/train/29/image_04129.jpg \n inflating: flower_data/train/29/image_04115.jpg \n inflating: flower_data/train/29/image_04101.jpg \n inflating: flower_data/train/29/image_04100.jpg \n inflating: flower_data/train/29/image_04114.jpg \n inflating: flower_data/train/29/image_04128.jpg \n inflating: flower_data/train/29/image_04158.jpg \n inflating: flower_data/train/29/image_04149.jpg \n inflating: flower_data/train/29/image_04148.jpg \n creating: flower_data/train/16/\n inflating: flower_data/train/16/image_06685.jpg \n inflating: flower_data/train/16/image_06652.jpg \n inflating: flower_data/train/16/image_06653.jpg \n inflating: flower_data/train/16/image_06684.jpg \n inflating: flower_data/train/16/image_06690.jpg \n inflating: flower_data/train/16/image_06686.jpg \n inflating: flower_data/train/16/image_06692.jpg \n inflating: flower_data/train/16/image_06679.jpg \n inflating: flower_data/train/16/image_06678.jpg \n inflating: flower_data/train/16/image_06687.jpg \n inflating: flower_data/train/16/image_06683.jpg \n inflating: flower_data/train/16/image_06668.jpg \n inflating: flower_data/train/16/image_06654.jpg \n inflating: flower_data/train/16/image_06655.jpg \n inflating: flower_data/train/16/image_06669.jpg \n inflating: flower_data/train/16/image_06682.jpg \n inflating: flower_data/train/16/image_06680.jpg \n inflating: flower_data/train/16/image_06656.jpg \n inflating: flower_data/train/16/image_06681.jpg \n inflating: flower_data/train/16/image_06667.jpg \n inflating: 
flower_data/train/17/image_03863.jpg \n inflating: flower_data/train/17/image_03877.jpg \n inflating: flower_data/train/17/image_03903.jpg \n inflating: flower_data/train/17/image_03901.jpg \n inflating: flower_data/train/17/image_03849.jpg \n inflating: flower_data/train/17/image_03875.jpg \n inflating: flower_data/train/17/image_03874.jpg \n inflating: flower_data/train/17/image_03860.jpg \n inflating: flower_data/train/17/image_03848.jpg \n inflating: flower_data/train/17/image_03900.jpg \n inflating: flower_data/train/17/image_03838.jpg \n inflating: flower_data/train/17/image_03828.jpg \n creating: flower_data/train/1/\n inflating: flower_data/train/1/image_06745.jpg \n inflating: flower_data/train/1/image_06751.jpg \n inflating: flower_data/train/1/image_06750.jpg \n inflating: flower_data/train/1/image_06744.jpg \n inflating: flower_data/train/1/image_06746.jpg \n inflating: flower_data/train/1/image_06747.jpg \n inflating: flower_data/train/1/image_06753.jpg \n inflating: flower_data/train/1/image_06757.jpg \n inflating: flower_data/train/1/image_06742.jpg \n inflating: flower_data/train/1/image_06768.jpg \n inflating: flower_data/train/1/image_06740.jpg \n inflating: flower_data/train/1/image_06741.jpg \n inflating: flower_data/train/1/image_06734.jpg \n inflating: flower_data/train/1/image_06735.jpg \n inflating: flower_data/train/1/image_06737.jpg \n inflating: flower_data/train/1/image_06736.jpg \n inflating: flower_data/train/1/image_06738.jpg \n inflating: flower_data/train/1/image_06770.jpg \n inflating: flower_data/train/1/image_06759.jpg \n inflating: flower_data/train/1/image_06771.jpg \n inflating: flower_data/train/1/image_06773.jpg \n inflating: flower_data/train/1/image_06767.jpg \n inflating: flower_data/train/1/image_06766.jpg \n inflating: flower_data/train/1/image_06772.jpg \n inflating: flower_data/train/1/image_06762.jpg \n inflating: flower_data/train/1/image_06761.jpg \n inflating: flower_data/train/1/image_06748.jpg \n creating: 
flower_data/train/10/\n inflating: flower_data/train/10/image_07088.jpg \n inflating: flower_data/train/10/image_07103.jpg \n inflating: flower_data/train/10/image_07116.jpg \n inflating: flower_data/train/10/image_07089.jpg \n inflating: flower_data/train/10/image_07114.jpg \n inflating: flower_data/train/10/image_07100.jpg \n inflating: flower_data/train/10/image_07115.jpg \n inflating: flower_data/train/10/image_07111.jpg \n inflating: flower_data/train/10/image_07105.jpg \n inflating: flower_data/train/10/image_07110.jpg \n inflating: flower_data/train/10/image_07099.jpg \n inflating: flower_data/train/10/image_07106.jpg \n inflating: flower_data/train/10/image_07112.jpg \n inflating: flower_data/train/10/image_07113.jpg \n inflating: flower_data/train/10/image_07098.jpg \n inflating: flower_data/train/10/image_08092.jpg \n inflating: flower_data/train/10/image_08093.jpg \n inflating: flower_data/train/10/image_08091.jpg \n inflating: flower_data/train/10/image_08090.jpg \n inflating: flower_data/train/10/image_08094.jpg \n inflating: flower_data/train/10/image_08095.jpg \n inflating: flower_data/train/10/image_08097.jpg \n inflating: flower_data/train/10/image_08096.jpg \n inflating: flower_data/train/10/image_07095.jpg \n inflating: flower_data/train/10/image_07122.jpg \n inflating: flower_data/train/10/image_07096.jpg \n inflating: flower_data/train/10/image_07109.jpg \n inflating: flower_data/train/10/image_07121.jpg \n inflating: flower_data/train/10/image_07120.jpg \n inflating: flower_data/train/10/image_07108.jpg \n inflating: flower_data/train/10/image_07097.jpg \n inflating: flower_data/train/10/image_07087.jpg \n inflating: flower_data/train/10/image_07093.jpg \n inflating: flower_data/train/10/image_07118.jpg \n inflating: flower_data/train/10/image_07119.jpg \n inflating: flower_data/train/10/image_07092.jpg \n inflating: flower_data/train/10/image_07086.jpg \n inflating: flower_data/train/10/image_07091.jpg \n creating: flower_data/train/19/\n 
inflating: flower_data/train/19/image_06194.jpg \n inflating: flower_data/train/19/image_06180.jpg \n inflating: flower_data/train/19/image_06157.jpg \n inflating: flower_data/train/19/image_06156.jpg \n inflating: flower_data/train/19/image_06181.jpg \n inflating: flower_data/train/19/image_06183.jpg \n inflating: flower_data/train/19/image_06154.jpg \n inflating: flower_data/train/19/image_06168.jpg \n inflating: flower_data/train/19/image_06169.jpg \n inflating: flower_data/train/19/image_06182.jpg \n inflating: flower_data/train/19/image_06192.jpg \n inflating: flower_data/train/19/image_06179.jpg \n inflating: flower_data/train/19/image_06151.jpg \n inflating: flower_data/train/19/image_06150.jpg \n inflating: flower_data/train/19/image_06178.jpg \n inflating: flower_data/train/19/image_06193.jpg \n inflating: flower_data/train/19/image_06191.jpg \n inflating: flower_data/train/19/image_06185.jpg \n inflating: flower_data/train/19/image_06152.jpg \n inflating: flower_data/train/19/image_06153.jpg \n inflating: flower_data/train/19/image_06184.jpg \n inflating: flower_data/train/19/image_06190.jpg \n inflating: flower_data/train/19/image_06189.jpg \n inflating: flower_data/train/19/image_06176.jpg \n inflating: flower_data/train/19/image_06162.jpg \n inflating: flower_data/train/19/image_06163.jpg \n inflating: flower_data/train/19/image_06177.jpg \n inflating: flower_data/train/19/image_06188.jpg \n inflating: flower_data/train/19/image_06161.jpg \n inflating: flower_data/train/19/image_06149.jpg \n inflating: flower_data/train/19/image_06174.jpg \n inflating: flower_data/train/19/image_06160.jpg \n inflating: flower_data/train/19/image_06164.jpg \n inflating: flower_data/train/19/image_06171.jpg \n inflating: flower_data/train/19/image_06173.jpg \n inflating: flower_data/train/19/image_06167.jpg \n inflating: flower_data/train/19/image_06166.jpg \n inflating: flower_data/train/19/image_06172.jpg \n creating: flower_data/train/26/\n inflating: 
flower_data/train/26/image_06493.jpg \n inflating: flower_data/train/26/image_06487.jpg \n inflating: flower_data/train/26/image_06524.jpg \n inflating: flower_data/train/26/image_06518.jpg \n inflating: flower_data/train/26/image_06519.jpg \n inflating: flower_data/train/26/image_06525.jpg \n inflating: flower_data/train/26/image_06492.jpg \n inflating: flower_data/train/26/image_06490.jpg \n inflating: flower_data/train/26/image_06527.jpg \n inflating: flower_data/train/26/image_06491.jpg \n inflating: flower_data/train/26/image_06495.jpg \n inflating: flower_data/train/26/image_06522.jpg \n inflating: flower_data/train/26/image_06523.jpg \n inflating: flower_data/train/26/image_06494.jpg \n inflating: flower_data/train/26/image_06496.jpg \n inflating: flower_data/train/26/image_06509.jpg \n inflating: flower_data/train/26/image_06521.jpg \n inflating: flower_data/train/26/image_06520.jpg \n inflating: flower_data/train/26/image_06508.jpg \n inflating: flower_data/train/26/image_06505.jpg \n inflating: flower_data/train/26/image_06511.jpg \n inflating: flower_data/train/26/image_06510.jpg \n inflating: flower_data/train/26/image_06504.jpg \n inflating: flower_data/train/26/image_06499.jpg \n inflating: flower_data/train/26/image_06512.jpg \n inflating: flower_data/train/26/image_06507.jpg \n inflating: flower_data/train/26/image_06513.jpg \n inflating: flower_data/train/26/image_06488.jpg \n inflating: flower_data/train/26/image_06517.jpg \n inflating: flower_data/train/26/image_06503.jpg \n inflating: flower_data/train/26/image_06502.jpg \n inflating: flower_data/train/26/image_06489.jpg \n inflating: flower_data/train/26/image_06515.jpg \n creating: flower_data/train/8/\n inflating: flower_data/train/8/image_03365.jpg \n inflating: flower_data/train/8/image_03358.jpg \n inflating: flower_data/train/8/image_03367.jpg \n inflating: flower_data/train/8/image_03363.jpg \n inflating: flower_data/train/8/image_03362.jpg \n inflating: 
flower_data/train/8/image_03360.jpg \n inflating: flower_data/train/8/image_03348.jpg \n inflating: flower_data/train/8/image_03361.jpg \n inflating: flower_data/train/8/image_03306.jpg \n inflating: flower_data/train/8/image_03312.jpg \n inflating: flower_data/train/8/image_03307.jpg \n inflating: flower_data/train/8/image_03298.jpg \n inflating: flower_data/train/8/image_03339.jpg \n inflating: flower_data/train/8/image_03311.jpg \n inflating: flower_data/train/8/image_03305.jpg \n inflating: flower_data/train/8/image_03304.jpg \n inflating: flower_data/train/8/image_03310.jpg \n inflating: flower_data/train/8/image_03338.jpg \n inflating: flower_data/train/8/image_03300.jpg \n inflating: flower_data/train/8/image_03328.jpg \n inflating: flower_data/train/8/image_03329.jpg \n inflating: flower_data/train/8/image_03301.jpg \n inflating: flower_data/train/8/image_03315.jpg \n inflating: flower_data/train/8/image_03288.jpg \n inflating: flower_data/train/8/image_03303.jpg \n inflating: flower_data/train/8/image_03317.jpg \n inflating: flower_data/train/8/image_03316.jpg \n inflating: flower_data/train/8/image_03302.jpg \n inflating: flower_data/train/8/image_03289.jpg \n inflating: flower_data/train/8/image_03290.jpg \n inflating: flower_data/train/8/image_03284.jpg \n inflating: flower_data/train/8/image_03327.jpg \n inflating: flower_data/train/8/image_03333.jpg \n inflating: flower_data/train/8/image_03332.jpg \n inflating: flower_data/train/8/image_03326.jpg \n inflating: flower_data/train/8/image_03285.jpg \n inflating: flower_data/train/8/image_03287.jpg \n inflating: flower_data/train/8/image_03293.jpg \n inflating: flower_data/train/8/image_03318.jpg \n inflating: flower_data/train/8/image_03324.jpg \n inflating: flower_data/train/8/image_03325.jpg \n inflating: flower_data/train/8/image_03331.jpg \n inflating: flower_data/train/8/image_03292.jpg \n inflating: flower_data/train/8/image_03286.jpg \n inflating: flower_data/train/8/image_03296.jpg \n inflating: 
flower_data/train/8/image_03335.jpg \n inflating: flower_data/train/8/image_03321.jpg \n inflating: flower_data/train/8/image_03309.jpg \n inflating: flower_data/train/8/image_03308.jpg \n inflating: flower_data/train/8/image_03334.jpg \n inflating: flower_data/train/8/image_03297.jpg \n inflating: flower_data/train/8/image_03322.jpg \n inflating: flower_data/train/8/image_03336.jpg \n inflating: flower_data/train/8/image_03337.jpg \n inflating: flower_data/train/8/image_03323.jpg \n inflating: flower_data/train/8/image_03294.jpg \n inflating: flower_data/train/8/image_03344.jpg \n inflating: flower_data/train/8/image_03350.jpg \n inflating: flower_data/train/8/image_03351.jpg \n inflating: flower_data/train/8/image_03345.jpg \n inflating: flower_data/train/8/image_03353.jpg \n inflating: flower_data/train/8/image_03346.jpg \n inflating: flower_data/train/8/image_03352.jpg \n inflating: flower_data/train/8/image_03356.jpg \n inflating: flower_data/train/8/image_03343.jpg \n inflating: flower_data/train/8/image_03341.jpg \n inflating: flower_data/train/8/image_03355.jpg \n inflating: flower_data/train/8/image_03368.jpg \n inflating: flower_data/train/8/image_03354.jpg \n inflating: flower_data/train/8/image_03340.jpg \n creating: flower_data/train/21/\n inflating: flower_data/train/21/image_06802.jpg \n inflating: flower_data/train/21/image_06786.jpg \n inflating: flower_data/train/21/image_06792.jpg \n inflating: flower_data/train/21/image_06779.jpg \n inflating: flower_data/train/21/image_06793.jpg \n inflating: flower_data/train/21/image_06787.jpg \n inflating: flower_data/train/21/image_06803.jpg \n inflating: flower_data/train/21/image_06801.jpg \n inflating: flower_data/train/21/image_06791.jpg \n inflating: flower_data/train/21/image_06785.jpg \n inflating: flower_data/train/21/image_06790.jpg \n inflating: flower_data/train/21/image_06800.jpg \n inflating: flower_data/train/21/image_06810.jpg \n inflating: flower_data/train/21/image_06794.jpg \n inflating: 
flower_data/train/21/image_06780.jpg \n inflating: flower_data/train/21/image_06781.jpg \n inflating: flower_data/train/21/image_06795.jpg \n inflating: flower_data/train/21/image_06811.jpg \n inflating: flower_data/train/21/image_06813.jpg \n inflating: flower_data/train/21/image_06783.jpg \n inflating: flower_data/train/21/image_06797.jpg \n inflating: flower_data/train/21/image_06796.jpg \n inflating: flower_data/train/21/image_06782.jpg \n inflating: flower_data/train/21/image_06806.jpg \n inflating: flower_data/train/21/image_06812.jpg \n inflating: flower_data/train/21/image_06808.jpg \n inflating: flower_data/train/21/image_06798.jpg \n inflating: flower_data/train/21/image_06799.jpg \n inflating: flower_data/train/21/image_06809.jpg \n inflating: flower_data/train/21/image_06789.jpg \n inflating: flower_data/train/21/image_06776.jpg \n inflating: flower_data/train/21/image_06777.jpg \n inflating: flower_data/train/21/image_06775.jpg \n inflating: flower_data/train/21/image_06774.jpg \n creating: flower_data/train/75/\n inflating: flower_data/train/75/image_02127.jpg \n inflating: flower_data/train/75/image_02084.jpg \n inflating: flower_data/train/75/image_02090.jpg \n inflating: flower_data/train/75/image_02091.jpg \n inflating: flower_data/train/75/image_02126.jpg \n inflating: flower_data/train/75/image_02132.jpg \n inflating: flower_data/train/75/image_02124.jpg \n inflating: flower_data/train/75/image_02118.jpg \n inflating: flower_data/train/75/image_02093.jpg \n inflating: flower_data/train/75/image_02087.jpg \n inflating: flower_data/train/75/image_02079.jpg \n inflating: flower_data/train/75/image_02086.jpg \n inflating: flower_data/train/75/image_02092.jpg \n inflating: flower_data/train/75/image_02131.jpg \n inflating: flower_data/train/75/image_02125.jpg \n inflating: flower_data/train/75/image_02109.jpg \n inflating: flower_data/train/75/image_02121.jpg \n inflating: flower_data/train/75/image_02096.jpg \n inflating: 
flower_data/train/75/image_02082.jpg \n inflating: flower_data/train/75/image_02069.jpg \n inflating: flower_data/train/75/image_02083.jpg \n inflating: flower_data/train/75/image_02097.jpg \n inflating: flower_data/train/75/image_02134.jpg \n inflating: flower_data/train/75/image_02120.jpg \n inflating: flower_data/train/75/image_02108.jpg \n inflating: flower_data/train/75/image_02136.jpg \n inflating: flower_data/train/75/image_02122.jpg \n inflating: flower_data/train/75/image_02095.jpg \n inflating: flower_data/train/75/image_02094.jpg \n inflating: flower_data/train/75/image_02080.jpg \n inflating: flower_data/train/75/image_02123.jpg \n inflating: flower_data/train/75/image_02137.jpg \n inflating: flower_data/train/75/image_02187.jpg \n inflating: flower_data/train/75/image_02150.jpg \n inflating: flower_data/train/75/image_02179.jpg \n inflating: flower_data/train/75/image_02145.jpg \n inflating: flower_data/train/75/image_02151.jpg \n inflating: flower_data/train/75/image_02186.jpg \n inflating: flower_data/train/75/image_02184.jpg \n inflating: flower_data/train/75/image_02153.jpg \n inflating: flower_data/train/75/image_02146.jpg \n inflating: flower_data/train/75/image_02185.jpg \n inflating: flower_data/train/75/image_02181.jpg \n inflating: flower_data/train/75/image_02142.jpg \n inflating: flower_data/train/75/image_02156.jpg \n inflating: flower_data/train/75/image_02143.jpg \n inflating: flower_data/train/75/image_02180.jpg \n inflating: flower_data/train/75/image_02169.jpg \n inflating: flower_data/train/75/image_02140.jpg \n inflating: flower_data/train/75/image_02168.jpg \n inflating: flower_data/train/75/image_02183.jpg \n inflating: flower_data/train/75/image_02171.jpg \n inflating: flower_data/train/75/image_02165.jpg \n inflating: flower_data/train/75/image_02159.jpg \n inflating: flower_data/train/75/image_02158.jpg \n inflating: flower_data/train/75/image_02164.jpg \n inflating: flower_data/train/75/image_02170.jpg \n inflating: 
flower_data/train/75/image_02166.jpg \n inflating: flower_data/train/75/image_02172.jpg \n inflating: flower_data/train/75/image_02173.jpg \n inflating: flower_data/train/75/image_02163.jpg \n inflating: flower_data/train/75/image_02177.jpg \n inflating: flower_data/train/75/image_02176.jpg \n inflating: flower_data/train/75/image_02162.jpg \n inflating: flower_data/train/75/image_02148.jpg \n inflating: flower_data/train/75/image_02174.jpg \n inflating: flower_data/train/75/image_02160.jpg \n inflating: flower_data/train/75/image_02161.jpg \n inflating: flower_data/train/75/image_02175.jpg \n inflating: flower_data/train/75/image_02149.jpg \n inflating: flower_data/train/75/image_02112.jpg \n inflating: flower_data/train/75/image_02106.jpg \n inflating: flower_data/train/75/image_02099.jpg \n inflating: flower_data/train/75/image_02072.jpg \n inflating: flower_data/train/75/image_02073.jpg \n inflating: flower_data/train/75/image_02098.jpg \n inflating: flower_data/train/75/image_02107.jpg \n inflating: flower_data/train/75/image_02113.jpg \n inflating: flower_data/train/75/image_02105.jpg \n inflating: flower_data/train/75/image_02139.jpg \n inflating: flower_data/train/75/image_02071.jpg \n inflating: flower_data/train/75/image_02070.jpg \n inflating: flower_data/train/75/image_02138.jpg \n inflating: flower_data/train/75/image_02110.jpg \n inflating: flower_data/train/75/image_02128.jpg \n inflating: flower_data/train/75/image_02114.jpg \n inflating: flower_data/train/75/image_02101.jpg \n inflating: flower_data/train/75/image_02129.jpg \n inflating: flower_data/train/75/image_02117.jpg \n inflating: flower_data/train/75/image_02103.jpg \n inflating: flower_data/train/75/image_02077.jpg \n inflating: flower_data/train/75/image_02076.jpg \n inflating: flower_data/train/75/image_02089.jpg \n inflating: flower_data/train/75/image_02102.jpg \n inflating: flower_data/train/75/image_02116.jpg \n creating: flower_data/train/81/\n inflating: 
flower_data/train/81/image_00917.jpg \n inflating: flower_data/train/81/image_00877.jpg \n inflating: flower_data/train/81/image_00889.jpg \n inflating: flower_data/train/81/image_00862.jpg \n inflating: flower_data/train/81/image_00876.jpg \n inflating: flower_data/train/81/image_00916.jpg \n inflating: flower_data/train/81/image_00902.jpg \n inflating: flower_data/train/81/image_00928.jpg \n inflating: flower_data/train/81/image_00900.jpg \n inflating: flower_data/train/81/image_00874.jpg \n inflating: flower_data/train/81/image_00875.jpg \n inflating: flower_data/train/81/image_00861.jpg \n inflating: flower_data/train/81/image_00849.jpg \n inflating: flower_data/train/81/image_00901.jpg \n inflating: flower_data/train/81/image_00915.jpg \n inflating: flower_data/train/81/image_00929.jpg \n inflating: flower_data/train/81/image_00911.jpg \n inflating: flower_data/train/81/image_00905.jpg \n inflating: flower_data/train/81/image_00939.jpg \n inflating: flower_data/train/81/image_00865.jpg \n inflating: flower_data/train/81/image_00871.jpg \n inflating: flower_data/train/81/image_00858.jpg \n inflating: flower_data/train/81/image_00870.jpg \n inflating: flower_data/train/81/image_00864.jpg \n inflating: flower_data/train/81/image_00938.jpg \n inflating: flower_data/train/81/image_00910.jpg \n inflating: flower_data/train/81/image_00906.jpg \n inflating: flower_data/train/81/image_00912.jpg \n inflating: flower_data/train/81/image_00872.jpg \n inflating: flower_data/train/81/image_00866.jpg \n inflating: flower_data/train/81/image_00899.jpg \n inflating: flower_data/train/81/image_00867.jpg \n inflating: flower_data/train/81/image_00873.jpg \n inflating: flower_data/train/81/image_00913.jpg \n inflating: flower_data/train/81/image_00907.jpg \n inflating: flower_data/train/81/image_00790.jpg \n inflating: flower_data/train/81/image_00800.jpg \n inflating: flower_data/train/81/image_00801.jpg \n inflating: flower_data/train/81/image_00829.jpg \n inflating: 
flower_data/train/81/image_00791.jpg \n inflating: flower_data/train/81/image_00785.jpg \n inflating: flower_data/train/81/image_00793.jpg \n inflating: flower_data/train/81/image_00787.jpg \n inflating: flower_data/train/81/image_00803.jpg \n inflating: flower_data/train/81/image_00817.jpg \n inflating: flower_data/train/81/image_00816.jpg \n inflating: flower_data/train/81/image_00802.jpg \n inflating: flower_data/train/81/image_00786.jpg \n inflating: flower_data/train/81/image_00792.jpg \n inflating: flower_data/train/81/image_00796.jpg \n inflating: flower_data/train/81/image_00782.jpg \n inflating: flower_data/train/81/image_00806.jpg \n inflating: flower_data/train/81/image_00812.jpg \n inflating: flower_data/train/81/image_00813.jpg \n inflating: flower_data/train/81/image_00807.jpg \n inflating: flower_data/train/81/image_00783.jpg \n inflating: flower_data/train/81/image_00797.jpg \n inflating: flower_data/train/81/image_00781.jpg \n inflating: flower_data/train/81/image_00795.jpg \n inflating: flower_data/train/81/image_00811.jpg \n inflating: flower_data/train/81/image_00805.jpg \n inflating: flower_data/train/81/image_00839.jpg \n inflating: flower_data/train/81/image_00838.jpg \n inflating: flower_data/train/81/image_00804.jpg \n inflating: flower_data/train/81/image_00810.jpg \n inflating: flower_data/train/81/image_00794.jpg \n inflating: flower_data/train/81/image_00799.jpg \n inflating: flower_data/train/81/image_00941.jpg \n inflating: flower_data/train/81/image_00835.jpg \n inflating: flower_data/train/81/image_00821.jpg \n inflating: flower_data/train/81/image_00834.jpg \n inflating: flower_data/train/81/image_00808.jpg \n inflating: flower_data/train/81/image_00940.jpg \n inflating: flower_data/train/81/image_00942.jpg \n inflating: flower_data/train/81/image_00822.jpg \n inflating: flower_data/train/81/image_00836.jpg \n inflating: flower_data/train/81/image_00837.jpg \n inflating: flower_data/train/81/image_00823.jpg \n inflating: 
flower_data/train/81/image_00943.jpg \n inflating: flower_data/train/81/image_00827.jpg \n inflating: flower_data/train/81/image_00832.jpg \n inflating: flower_data/train/81/image_00944.jpg \n inflating: flower_data/train/81/image_00788.jpg \n inflating: flower_data/train/81/image_00830.jpg \n inflating: flower_data/train/81/image_00824.jpg \n inflating: flower_data/train/81/image_00819.jpg \n inflating: flower_data/train/81/image_00831.jpg \n inflating: flower_data/train/81/image_00789.jpg \n inflating: flower_data/train/81/image_00945.jpg \n inflating: flower_data/train/81/image_00922.jpg \n inflating: flower_data/train/81/image_00936.jpg \n inflating: flower_data/train/81/image_00856.jpg \n inflating: flower_data/train/81/image_00842.jpg \n inflating: flower_data/train/81/image_00881.jpg \n inflating: flower_data/train/81/image_00880.jpg \n inflating: flower_data/train/81/image_00894.jpg \n inflating: flower_data/train/81/image_00843.jpg \n inflating: flower_data/train/81/image_00857.jpg \n inflating: flower_data/train/81/image_00937.jpg \n inflating: flower_data/train/81/image_00923.jpg \n inflating: flower_data/train/81/image_00935.jpg \n inflating: flower_data/train/81/image_00921.jpg \n inflating: flower_data/train/81/image_00841.jpg \n inflating: flower_data/train/81/image_00855.jpg \n inflating: flower_data/train/81/image_00882.jpg \n inflating: flower_data/train/81/image_00897.jpg \n inflating: flower_data/train/81/image_00883.jpg \n inflating: flower_data/train/81/image_00854.jpg \n inflating: flower_data/train/81/image_00840.jpg \n inflating: flower_data/train/81/image_00868.jpg \n inflating: flower_data/train/81/image_00934.jpg \n inflating: flower_data/train/81/image_00908.jpg \n inflating: flower_data/train/81/image_00930.jpg \n inflating: flower_data/train/81/image_00924.jpg \n inflating: flower_data/train/81/image_00918.jpg \n inflating: flower_data/train/81/image_00844.jpg \n inflating: flower_data/train/81/image_00850.jpg \n inflating: 
flower_data/train/81/image_00878.jpg \n inflating: flower_data/train/81/image_00887.jpg \n inflating: flower_data/train/81/image_00886.jpg \n inflating: flower_data/train/81/image_00879.jpg \n inflating: flower_data/train/81/image_00851.jpg \n inflating: flower_data/train/81/image_00845.jpg \n inflating: flower_data/train/81/image_00925.jpg \n inflating: flower_data/train/81/image_00931.jpg \n inflating: flower_data/train/81/image_00927.jpg \n inflating: flower_data/train/81/image_00933.jpg \n inflating: flower_data/train/81/image_00847.jpg \n inflating: flower_data/train/81/image_00890.jpg \n inflating: flower_data/train/81/image_00884.jpg \n inflating: flower_data/train/81/image_00885.jpg \n inflating: flower_data/train/81/image_00891.jpg \n inflating: flower_data/train/81/image_00846.jpg \n inflating: flower_data/train/81/image_00852.jpg \n inflating: flower_data/train/81/image_00932.jpg \n creating: flower_data/train/86/\n inflating: flower_data/train/86/image_02872.jpg \n inflating: flower_data/train/86/image_02866.jpg \n inflating: flower_data/train/86/image_02899.jpg \n inflating: flower_data/train/86/image_02913.jpg \n inflating: flower_data/train/86/image_02907.jpg \n inflating: flower_data/train/86/image_02898.jpg \n inflating: flower_data/train/86/image_02867.jpg \n inflating: flower_data/train/86/image_02871.jpg \n inflating: flower_data/train/86/image_02911.jpg \n inflating: flower_data/train/86/image_02905.jpg \n inflating: flower_data/train/86/image_02904.jpg \n inflating: flower_data/train/86/image_02870.jpg \n inflating: flower_data/train/86/image_02864.jpg \n inflating: flower_data/train/86/image_02874.jpg \n inflating: flower_data/train/86/image_02914.jpg \n inflating: flower_data/train/86/image_02900.jpg \n inflating: flower_data/train/86/image_02915.jpg \n inflating: flower_data/train/86/image_02875.jpg \n inflating: flower_data/train/86/image_02877.jpg \n inflating: flower_data/train/86/image_02863.jpg \n inflating: 
[unzip output truncated: inflating flower_data/train/{86, 72, 44, 43, 88, 38, 36, 31, 91, 65, 62, 96}/image_*.jpg ...]
flower_data/train/96/image_07657.jpg \n inflating: flower_data/train/96/image_07680.jpg \n inflating: flower_data/train/96/image_07682.jpg \n inflating: flower_data/train/96/image_07641.jpg \n inflating: flower_data/train/96/image_07655.jpg \n inflating: flower_data/train/96/image_07654.jpg \n inflating: flower_data/train/96/image_07640.jpg \n creating: flower_data/train/100/\n inflating: flower_data/train/100/image_07922.jpg \n inflating: flower_data/train/100/image_07894.jpg \n inflating: flower_data/train/100/image_07923.jpg \n inflating: flower_data/train/100/image_07937.jpg \n inflating: flower_data/train/100/image_07921.jpg \n inflating: flower_data/train/100/image_07935.jpg \n inflating: flower_data/train/100/image_07909.jpg \n inflating: flower_data/train/100/image_07908.jpg \n inflating: flower_data/train/100/image_07934.jpg \n inflating: flower_data/train/100/image_07920.jpg \n inflating: flower_data/train/100/image_07918.jpg \n inflating: flower_data/train/100/image_07924.jpg \n inflating: flower_data/train/100/image_07930.jpg \n inflating: flower_data/train/100/image_07893.jpg \n inflating: flower_data/train/100/image_07925.jpg \n inflating: flower_data/train/100/image_07919.jpg \n inflating: flower_data/train/100/image_07933.jpg \n inflating: flower_data/train/100/image_07927.jpg \n inflating: flower_data/train/100/image_07932.jpg \n inflating: flower_data/train/100/image_07941.jpg \n inflating: flower_data/train/100/image_07940.jpg \n inflating: flower_data/train/100/image_07903.jpg \n inflating: flower_data/train/100/image_07916.jpg \n inflating: flower_data/train/100/image_07900.jpg \n inflating: flower_data/train/100/image_07914.jpg \n inflating: flower_data/train/100/image_07928.jpg \n inflating: flower_data/train/100/image_07915.jpg \n inflating: flower_data/train/100/image_07901.jpg \n inflating: flower_data/train/100/image_07911.jpg \n inflating: flower_data/train/100/image_07910.jpg \n inflating: flower_data/train/100/image_07912.jpg \n 
inflating: flower_data/train/100/image_07906.jpg \n inflating: flower_data/train/100/image_07898.jpg \n inflating: flower_data/train/100/image_07907.jpg \n inflating: flower_data/train/100/image_07913.jpg \n creating: flower_data/train/54/\n inflating: flower_data/train/54/image_05448.jpg \n inflating: flower_data/train/54/image_05458.jpg \n inflating: flower_data/train/54/image_05459.jpg \n inflating: flower_data/train/54/image_05429.jpg \n inflating: flower_data/train/54/image_05401.jpg \n inflating: flower_data/train/54/image_05414.jpg \n inflating: flower_data/train/54/image_05400.jpg \n inflating: flower_data/train/54/image_05428.jpg \n inflating: flower_data/train/54/image_05399.jpg \n inflating: flower_data/train/54/image_05416.jpg \n inflating: flower_data/train/54/image_05403.jpg \n inflating: flower_data/train/54/image_05407.jpg \n inflating: flower_data/train/54/image_05412.jpg \n inflating: flower_data/train/54/image_05404.jpg \n inflating: flower_data/train/54/image_05410.jpg \n inflating: flower_data/train/54/image_05438.jpg \n inflating: flower_data/train/54/image_05439.jpg \n inflating: flower_data/train/54/image_05405.jpg \n inflating: flower_data/train/54/image_05408.jpg \n inflating: flower_data/train/54/image_05420.jpg \n inflating: flower_data/train/54/image_05434.jpg \n inflating: flower_data/train/54/image_05435.jpg \n inflating: flower_data/train/54/image_05421.jpg \n inflating: flower_data/train/54/image_05409.jpg \n inflating: flower_data/train/54/image_05437.jpg \n inflating: flower_data/train/54/image_05423.jpg \n inflating: flower_data/train/54/image_05422.jpg \n inflating: flower_data/train/54/image_05436.jpg \n inflating: flower_data/train/54/image_05432.jpg \n inflating: flower_data/train/54/image_05427.jpg \n inflating: flower_data/train/54/image_05433.jpg \n inflating: flower_data/train/54/image_05431.jpg \n inflating: flower_data/train/54/image_05419.jpg \n inflating: flower_data/train/54/image_05418.jpg \n inflating: 
flower_data/train/54/image_05424.jpg \n inflating: flower_data/train/54/image_05443.jpg \n inflating: flower_data/train/54/image_05457.jpg \n inflating: flower_data/train/54/image_05456.jpg \n inflating: flower_data/train/54/image_05442.jpg \n inflating: flower_data/train/54/image_05454.jpg \n inflating: flower_data/train/54/image_05441.jpg \n inflating: flower_data/train/54/image_05455.jpg \n inflating: flower_data/train/54/image_05445.jpg \n inflating: flower_data/train/54/image_05444.jpg \n inflating: flower_data/train/54/image_05450.jpg \n inflating: flower_data/train/54/image_05446.jpg \n inflating: flower_data/train/54/image_05447.jpg \n creating: flower_data/train/98/\n inflating: flower_data/train/98/image_07799.jpg \n inflating: flower_data/train/98/image_07772.jpg \n inflating: flower_data/train/98/image_07821.jpg \n inflating: flower_data/train/98/image_07809.jpg \n inflating: flower_data/train/98/image_07808.jpg \n inflating: flower_data/train/98/image_07767.jpg \n inflating: flower_data/train/98/image_07773.jpg \n inflating: flower_data/train/98/image_07798.jpg \n inflating: flower_data/train/98/image_07759.jpg \n inflating: flower_data/train/98/image_07765.jpg \n inflating: flower_data/train/98/image_07771.jpg \n inflating: flower_data/train/98/image_07822.jpg \n inflating: flower_data/train/98/image_07823.jpg \n inflating: flower_data/train/98/image_07770.jpg \n inflating: flower_data/train/98/image_07760.jpg \n inflating: flower_data/train/98/image_07774.jpg \n inflating: flower_data/train/98/image_07826.jpg \n inflating: flower_data/train/98/image_07832.jpg \n inflating: flower_data/train/98/image_07775.jpg \n inflating: flower_data/train/98/image_07761.jpg \n inflating: flower_data/train/98/image_07788.jpg \n inflating: flower_data/train/98/image_07763.jpg \n inflating: flower_data/train/98/image_07818.jpg \n inflating: flower_data/train/98/image_07824.jpg \n inflating: flower_data/train/98/image_07830.jpg \n inflating: 
flower_data/train/98/image_07831.jpg \n inflating: flower_data/train/98/image_07825.jpg \n inflating: flower_data/train/98/image_07819.jpg \n inflating: flower_data/train/98/image_07762.jpg \n inflating: flower_data/train/98/image_07776.jpg \n inflating: flower_data/train/98/image_07789.jpg \n inflating: flower_data/train/98/image_07790.jpg \n inflating: flower_data/train/98/image_07784.jpg \n inflating: flower_data/train/98/image_07800.jpg \n inflating: flower_data/train/98/image_07814.jpg \n inflating: flower_data/train/98/image_07828.jpg \n inflating: flower_data/train/98/image_07829.jpg \n inflating: flower_data/train/98/image_07815.jpg \n inflating: flower_data/train/98/image_07801.jpg \n inflating: flower_data/train/98/image_07752.jpg \n inflating: flower_data/train/98/image_07785.jpg \n inflating: flower_data/train/98/image_07791.jpg \n inflating: flower_data/train/98/image_07793.jpg \n inflating: flower_data/train/98/image_07817.jpg \n inflating: flower_data/train/98/image_07803.jpg \n inflating: flower_data/train/98/image_07802.jpg \n inflating: flower_data/train/98/image_07816.jpg \n inflating: flower_data/train/98/image_07751.jpg \n inflating: flower_data/train/98/image_07779.jpg \n inflating: flower_data/train/98/image_07786.jpg \n inflating: flower_data/train/98/image_07782.jpg \n inflating: flower_data/train/98/image_07755.jpg \n inflating: flower_data/train/98/image_07812.jpg \n inflating: flower_data/train/98/image_07806.jpg \n inflating: flower_data/train/98/image_07807.jpg \n inflating: flower_data/train/98/image_07813.jpg \n inflating: flower_data/train/98/image_07768.jpg \n inflating: flower_data/train/98/image_07754.jpg \n inflating: flower_data/train/98/image_07797.jpg \n inflating: flower_data/train/98/image_07783.jpg \n inflating: flower_data/train/98/image_07795.jpg \n inflating: flower_data/train/98/image_07781.jpg \n inflating: flower_data/train/98/image_07756.jpg \n inflating: flower_data/train/98/image_07805.jpg \n inflating: 
flower_data/train/98/image_07811.jpg \n inflating: flower_data/train/98/image_07804.jpg \n inflating: flower_data/train/98/image_07780.jpg \n inflating: flower_data/train/98/image_07794.jpg \n creating: flower_data/train/53/\n inflating: flower_data/train/53/image_03716.jpg \n inflating: flower_data/train/53/image_03676.jpg \n inflating: flower_data/train/53/image_03663.jpg \n inflating: flower_data/train/53/image_03688.jpg \n inflating: flower_data/train/53/image_03703.jpg \n inflating: flower_data/train/53/image_03729.jpg \n inflating: flower_data/train/53/image_03701.jpg \n inflating: flower_data/train/53/image_03715.jpg \n inflating: flower_data/train/53/image_03675.jpg \n inflating: flower_data/train/53/image_03661.jpg \n inflating: flower_data/train/53/image_03660.jpg \n inflating: flower_data/train/53/image_03674.jpg \n inflating: flower_data/train/53/image_03648.jpg \n inflating: flower_data/train/53/image_03700.jpg \n inflating: flower_data/train/53/image_03728.jpg \n inflating: flower_data/train/53/image_03704.jpg \n inflating: flower_data/train/53/image_03710.jpg \n inflating: flower_data/train/53/image_03664.jpg \n inflating: flower_data/train/53/image_03658.jpg \n inflating: flower_data/train/53/image_03659.jpg \n inflating: flower_data/train/53/image_03665.jpg \n inflating: flower_data/train/53/image_03671.jpg \n inflating: flower_data/train/53/image_03711.jpg \n inflating: flower_data/train/53/image_03705.jpg \n inflating: flower_data/train/53/image_03713.jpg \n inflating: flower_data/train/53/image_03707.jpg \n inflating: flower_data/train/53/image_03698.jpg \n inflating: flower_data/train/53/image_03673.jpg \n inflating: flower_data/train/53/image_03666.jpg \n inflating: flower_data/train/53/image_03699.jpg \n inflating: flower_data/train/53/image_03706.jpg \n inflating: flower_data/train/53/image_03723.jpg \n inflating: flower_data/train/53/image_03680.jpg \n inflating: flower_data/train/53/image_03694.jpg \n inflating: 
flower_data/train/53/image_03643.jpg \n inflating: flower_data/train/53/image_03657.jpg \n inflating: flower_data/train/53/image_03642.jpg \n inflating: flower_data/train/53/image_03695.jpg \n inflating: flower_data/train/53/image_03681.jpg \n inflating: flower_data/train/53/image_03722.jpg \n inflating: flower_data/train/53/image_03708.jpg \n inflating: flower_data/train/53/image_03720.jpg \n inflating: flower_data/train/53/image_03697.jpg \n inflating: flower_data/train/53/image_03683.jpg \n inflating: flower_data/train/53/image_03641.jpg \n inflating: flower_data/train/53/image_03682.jpg \n inflating: flower_data/train/53/image_03696.jpg \n inflating: flower_data/train/53/image_03721.jpg \n inflating: flower_data/train/53/image_03709.jpg \n inflating: flower_data/train/53/image_03725.jpg \n inflating: flower_data/train/53/image_03731.jpg \n inflating: flower_data/train/53/image_03719.jpg \n inflating: flower_data/train/53/image_03692.jpg \n inflating: flower_data/train/53/image_03686.jpg \n inflating: flower_data/train/53/image_03651.jpg \n inflating: flower_data/train/53/image_03679.jpg \n inflating: flower_data/train/53/image_03678.jpg \n inflating: flower_data/train/53/image_03644.jpg \n inflating: flower_data/train/53/image_03650.jpg \n inflating: flower_data/train/53/image_03687.jpg \n inflating: flower_data/train/53/image_03730.jpg \n inflating: flower_data/train/53/image_03732.jpg \n inflating: flower_data/train/53/image_03726.jpg \n inflating: flower_data/train/53/image_03685.jpg \n inflating: flower_data/train/53/image_03691.jpg \n inflating: flower_data/train/53/image_03646.jpg \n inflating: flower_data/train/53/image_03647.jpg \n inflating: flower_data/train/53/image_03690.jpg \n inflating: flower_data/train/53/image_03684.jpg \n inflating: flower_data/train/53/image_03727.jpg \n creating: flower_data/train/30/\n inflating: flower_data/train/30/image_03514.jpg \n inflating: flower_data/train/30/image_03500.jpg \n inflating: 
flower_data/train/30/image_03460.jpg \n inflating: flower_data/train/30/image_03474.jpg \n inflating: flower_data/train/30/image_03461.jpg \n inflating: flower_data/train/30/image_03501.jpg \n inflating: flower_data/train/30/image_03515.jpg \n inflating: flower_data/train/30/image_03529.jpg \n inflating: flower_data/train/30/image_03503.jpg \n inflating: flower_data/train/30/image_03517.jpg \n inflating: flower_data/train/30/image_03477.jpg \n inflating: flower_data/train/30/image_03463.jpg \n inflating: flower_data/train/30/image_03462.jpg \n inflating: flower_data/train/30/image_03476.jpg \n inflating: flower_data/train/30/image_03516.jpg \n inflating: flower_data/train/30/image_03502.jpg \n inflating: flower_data/train/30/image_03499.jpg \n inflating: flower_data/train/30/image_03472.jpg \n inflating: flower_data/train/30/image_03473.jpg \n inflating: flower_data/train/30/image_03498.jpg \n inflating: flower_data/train/30/image_03513.jpg \n inflating: flower_data/train/30/image_03507.jpg \n inflating: flower_data/train/30/image_03511.jpg \n inflating: flower_data/train/30/image_03505.jpg \n inflating: flower_data/train/30/image_03539.jpg \n inflating: flower_data/train/30/image_03465.jpg \n inflating: flower_data/train/30/image_03470.jpg \n inflating: flower_data/train/30/image_03504.jpg \n inflating: flower_data/train/30/image_03535.jpg \n inflating: flower_data/train/30/image_03521.jpg \n inflating: flower_data/train/30/image_03496.jpg \n inflating: flower_data/train/30/image_03468.jpg \n inflating: flower_data/train/30/image_03497.jpg \n inflating: flower_data/train/30/image_03483.jpg \n inflating: flower_data/train/30/image_03520.jpg \n inflating: flower_data/train/30/image_03534.jpg \n inflating: flower_data/train/30/image_03508.jpg \n inflating: flower_data/train/30/image_03522.jpg \n inflating: flower_data/train/30/image_03536.jpg \n inflating: flower_data/train/30/image_03495.jpg \n inflating: flower_data/train/30/image_03480.jpg \n inflating: 
flower_data/train/30/image_03494.jpg \n inflating: flower_data/train/30/image_03537.jpg \n inflating: flower_data/train/30/image_03523.jpg \n inflating: flower_data/train/30/image_03527.jpg \n inflating: flower_data/train/30/image_03533.jpg \n inflating: flower_data/train/30/image_03490.jpg \n inflating: flower_data/train/30/image_03491.jpg \n inflating: flower_data/train/30/image_03532.jpg \n inflating: flower_data/train/30/image_03526.jpg \n inflating: flower_data/train/30/image_03524.jpg \n inflating: flower_data/train/30/image_03518.jpg \n inflating: flower_data/train/30/image_03493.jpg \n inflating: flower_data/train/30/image_03478.jpg \n inflating: flower_data/train/30/image_03479.jpg \n inflating: flower_data/train/30/image_03492.jpg \n inflating: flower_data/train/30/image_03486.jpg \n inflating: flower_data/train/30/image_03519.jpg \n inflating: flower_data/train/30/image_03525.jpg \n inflating: flower_data/train/30/image_03543.jpg \n inflating: flower_data/train/30/image_03540.jpg \n creating: flower_data/train/37/\n inflating: flower_data/train/37/image_03749.jpg \n inflating: flower_data/train/37/image_03775.jpg \n inflating: flower_data/train/37/image_03761.jpg \n inflating: flower_data/train/37/image_03826.jpg \n inflating: flower_data/train/37/image_03827.jpg \n inflating: flower_data/train/37/image_03760.jpg \n inflating: flower_data/train/37/image_03774.jpg \n inflating: flower_data/train/37/image_03748.jpg \n inflating: flower_data/train/37/image_07289.jpg \n inflating: flower_data/train/37/image_03762.jpg \n inflating: flower_data/train/37/image_03776.jpg \n inflating: flower_data/train/37/image_03825.jpg \n inflating: flower_data/train/37/image_03819.jpg \n inflating: flower_data/train/37/image_03818.jpg \n inflating: flower_data/train/37/image_03824.jpg \n inflating: flower_data/train/37/image_03777.jpg \n inflating: flower_data/train/37/image_03763.jpg \n inflating: flower_data/train/37/image_07288.jpg \n inflating: 
flower_data/train/37/image_03788.jpg \n inflating: flower_data/train/37/image_07298.jpg \n inflating: flower_data/train/37/image_03798.jpg \n inflating: flower_data/train/37/image_03767.jpg \n inflating: flower_data/train/37/image_03773.jpg \n inflating: flower_data/train/37/image_03808.jpg \n inflating: flower_data/train/37/image_03809.jpg \n inflating: flower_data/train/37/image_03772.jpg \n inflating: flower_data/train/37/image_03766.jpg \n inflating: flower_data/train/37/image_03799.jpg \n inflating: flower_data/train/37/image_03770.jpg \n inflating: flower_data/train/37/image_03764.jpg \n inflating: flower_data/train/37/image_03758.jpg \n inflating: flower_data/train/37/image_03823.jpg \n inflating: flower_data/train/37/image_03759.jpg \n inflating: flower_data/train/37/image_03771.jpg \n inflating: flower_data/train/37/image_03738.jpg \n inflating: flower_data/train/37/image_03739.jpg \n inflating: flower_data/train/37/image_03737.jpg \n inflating: flower_data/train/37/image_03736.jpg \n inflating: flower_data/train/37/image_03735.jpg \n inflating: flower_data/train/37/image_03797.jpg \n inflating: flower_data/train/37/image_07297.jpg \n inflating: flower_data/train/37/image_03754.jpg \n inflating: flower_data/train/37/image_03740.jpg \n inflating: flower_data/train/37/image_03812.jpg \n inflating: flower_data/train/37/image_03806.jpg \n inflating: flower_data/train/37/image_03755.jpg \n inflating: flower_data/train/37/image_03769.jpg \n inflating: flower_data/train/37/image_03782.jpg \n inflating: flower_data/train/37/image_03796.jpg \n inflating: flower_data/train/37/image_07296.jpg \n inflating: flower_data/train/37/image_03780.jpg \n inflating: flower_data/train/37/image_07294.jpg \n inflating: flower_data/train/37/image_03794.jpg \n inflating: flower_data/train/37/image_03743.jpg \n inflating: flower_data/train/37/image_03757.jpg \n inflating: flower_data/train/37/image_03804.jpg \n inflating: flower_data/train/37/image_03805.jpg \n inflating: 
flower_data/train/37/image_03742.jpg \n inflating: flower_data/train/37/image_07295.jpg \n inflating: flower_data/train/37/image_03795.jpg \n inflating: flower_data/train/37/image_03781.jpg \n inflating: flower_data/train/37/image_07285.jpg \n inflating: flower_data/train/37/image_03791.jpg \n inflating: flower_data/train/37/image_07291.jpg \n inflating: flower_data/train/37/image_03746.jpg \n inflating: flower_data/train/37/image_03752.jpg \n inflating: flower_data/train/37/image_03801.jpg \n inflating: flower_data/train/37/image_03800.jpg \n inflating: flower_data/train/37/image_03814.jpg \n inflating: flower_data/train/37/image_03753.jpg \n inflating: flower_data/train/37/image_03747.jpg \n inflating: flower_data/train/37/image_03790.jpg \n inflating: flower_data/train/37/image_07290.jpg \n inflating: flower_data/train/37/image_03784.jpg \n inflating: flower_data/train/37/image_07292.jpg \n inflating: flower_data/train/37/image_03792.jpg \n inflating: flower_data/train/37/image_03786.jpg \n inflating: flower_data/train/37/image_07286.jpg \n inflating: flower_data/train/37/image_03751.jpg \n inflating: flower_data/train/37/image_03745.jpg \n inflating: flower_data/train/37/image_03779.jpg \n inflating: flower_data/train/37/image_03802.jpg \n inflating: flower_data/train/37/image_03816.jpg \n inflating: flower_data/train/37/image_03817.jpg \n inflating: flower_data/train/37/image_03803.jpg \n inflating: flower_data/train/37/image_03778.jpg \n inflating: flower_data/train/37/image_03744.jpg \n inflating: flower_data/train/37/image_03750.jpg \n inflating: flower_data/train/37/image_03787.jpg \n inflating: flower_data/train/37/image_07287.jpg \n inflating: flower_data/train/37/image_07293.jpg \n inflating: flower_data/train/37/image_03793.jpg \n creating: flower_data/train/39/\n inflating: flower_data/train/39/image_07028.jpg \n inflating: flower_data/train/39/image_07014.jpg \n inflating: flower_data/train/39/image_07015.jpg \n inflating: 
flower_data/train/39/image_07029.jpg \n inflating: flower_data/train/39/image_07017.jpg \n inflating: flower_data/train/39/image_07012.jpg \n inflating: flower_data/train/39/image_08080.jpg \n inflating: flower_data/train/39/image_08081.jpg \n inflating: flower_data/train/39/image_07007.jpg \n inflating: flower_data/train/39/image_07011.jpg \n inflating: flower_data/train/39/image_07038.jpg \n inflating: flower_data/train/39/image_07009.jpg \n inflating: flower_data/train/39/image_07021.jpg \n inflating: flower_data/train/39/image_07020.jpg \n inflating: flower_data/train/39/image_07034.jpg \n inflating: flower_data/train/39/image_07008.jpg \n inflating: flower_data/train/39/image_07022.jpg \n inflating: flower_data/train/39/image_07037.jpg \n inflating: flower_data/train/39/image_07023.jpg \n inflating: flower_data/train/39/image_07027.jpg \n inflating: flower_data/train/39/image_07033.jpg \n inflating: flower_data/train/39/image_07032.jpg \n inflating: flower_data/train/39/image_07026.jpg \n inflating: flower_data/train/39/image_07030.jpg \n inflating: flower_data/train/39/image_07024.jpg \n inflating: flower_data/train/39/image_07019.jpg \n inflating: flower_data/train/39/image_07025.jpg \n inflating: flower_data/train/39/image_07031.jpg \n inflating: flower_data/train/39/image_07042.jpg \n inflating: flower_data/train/39/image_07041.jpg \n inflating: flower_data/train/39/image_07040.jpg \n inflating: flower_data/train/39/image_07044.jpg \n inflating: flower_data/train/39/image_07045.jpg \n creating: flower_data/train/99/\n inflating: flower_data/train/99/image_07842.jpg \n inflating: flower_data/train/99/image_07856.jpg \n inflating: flower_data/train/99/image_07881.jpg \n inflating: flower_data/train/99/image_07880.jpg \n inflating: flower_data/train/99/image_07843.jpg \n inflating: flower_data/train/99/image_07855.jpg \n inflating: flower_data/train/99/image_07841.jpg \n inflating: flower_data/train/99/image_07882.jpg \n inflating: 
flower_data/train/99/image_07883.jpg \n inflating: flower_data/train/99/image_07868.jpg \n inflating: flower_data/train/99/image_07878.jpg \n inflating: flower_data/train/99/image_07850.jpg \n inflating: flower_data/train/99/image_07844.jpg \n inflating: flower_data/train/99/image_07887.jpg \n inflating: flower_data/train/99/image_07886.jpg \n inflating: flower_data/train/99/image_07892.jpg \n inflating: flower_data/train/99/image_07845.jpg \n inflating: flower_data/train/99/image_07851.jpg \n inflating: flower_data/train/99/image_07879.jpg \n inflating: flower_data/train/99/image_07847.jpg \n inflating: flower_data/train/99/image_07853.jpg \n inflating: flower_data/train/99/image_07884.jpg \n inflating: flower_data/train/99/image_07890.jpg \n inflating: flower_data/train/99/image_07891.jpg \n inflating: flower_data/train/99/image_07885.jpg \n inflating: flower_data/train/99/image_07852.jpg \n inflating: flower_data/train/99/image_07846.jpg \n inflating: flower_data/train/99/image_07835.jpg \n inflating: flower_data/train/99/image_07834.jpg \n inflating: flower_data/train/99/image_07836.jpg \n inflating: flower_data/train/99/image_07837.jpg \n inflating: flower_data/train/99/image_08064.jpg \n inflating: flower_data/train/99/image_08062.jpg \n inflating: flower_data/train/99/image_07839.jpg \n inflating: flower_data/train/99/image_07863.jpg \n inflating: flower_data/train/99/image_07877.jpg \n inflating: flower_data/train/99/image_07888.jpg \n inflating: flower_data/train/99/image_07889.jpg \n inflating: flower_data/train/99/image_07876.jpg \n inflating: flower_data/train/99/image_07848.jpg \n inflating: flower_data/train/99/image_07849.jpg \n inflating: flower_data/train/99/image_07861.jpg \n inflating: flower_data/train/99/image_07859.jpg \n inflating: flower_data/train/99/image_07865.jpg \n inflating: flower_data/train/99/image_07864.jpg \n inflating: flower_data/train/99/image_07870.jpg \n inflating: flower_data/train/99/image_07858.jpg \n inflating: 
flower_data/train/99/image_07866.jpg \n inflating: flower_data/train/99/image_07872.jpg \n inflating: flower_data/train/99/image_07867.jpg \n creating: flower_data/train/52/\n inflating: flower_data/train/52/image_04224.jpg \n inflating: flower_data/train/52/image_04230.jpg \n inflating: flower_data/train/52/image_04218.jpg \n inflating: flower_data/train/52/image_04185.jpg \n inflating: flower_data/train/52/image_04184.jpg \n inflating: flower_data/train/52/image_04190.jpg \n inflating: flower_data/train/52/image_04219.jpg \n inflating: flower_data/train/52/image_04231.jpg \n inflating: flower_data/train/52/image_04233.jpg \n inflating: flower_data/train/52/image_04227.jpg \n inflating: flower_data/train/52/image_04186.jpg \n inflating: flower_data/train/52/image_04192.jpg \n inflating: flower_data/train/52/image_04179.jpg \n inflating: flower_data/train/52/image_04178.jpg \n inflating: flower_data/train/52/image_04193.jpg \n inflating: flower_data/train/52/image_04187.jpg \n inflating: flower_data/train/52/image_04226.jpg \n inflating: flower_data/train/52/image_04232.jpg \n inflating: flower_data/train/52/image_04236.jpg \n inflating: flower_data/train/52/image_04222.jpg \n inflating: flower_data/train/52/image_04183.jpg \n inflating: flower_data/train/52/image_04197.jpg \n inflating: flower_data/train/52/image_04196.jpg \n inflating: flower_data/train/52/image_04223.jpg \n inflating: flower_data/train/52/image_04237.jpg \n inflating: flower_data/train/52/image_04209.jpg \n inflating: flower_data/train/52/image_04194.jpg \n inflating: flower_data/train/52/image_04180.jpg \n inflating: flower_data/train/52/image_04195.jpg \n inflating: flower_data/train/52/image_04234.jpg \n inflating: flower_data/train/52/image_04220.jpg \n inflating: flower_data/train/52/image_04208.jpg \n inflating: flower_data/train/52/image_04241.jpg \n inflating: flower_data/train/52/image_04242.jpg \n inflating: flower_data/train/52/image_04243.jpg \n inflating: 
[unzip output truncated: archive extraction of flower_data/train/* image files]
flower_data/train/77/image_00034.jpg \n inflating: flower_data/train/77/image_00020.jpg \n inflating: flower_data/train/77/image_00036.jpg \n inflating: flower_data/train/77/image_00022.jpg \n inflating: flower_data/train/77/image_00234.jpg \n inflating: flower_data/train/77/image_00220.jpg \n inflating: flower_data/train/77/image_00208.jpg \n inflating: flower_data/train/77/image_00181.jpg \n inflating: flower_data/train/77/image_00195.jpg \n inflating: flower_data/train/77/image_00142.jpg \n inflating: flower_data/train/77/image_00156.jpg \n inflating: flower_data/train/77/image_00157.jpg \n inflating: flower_data/train/77/image_00194.jpg \n inflating: flower_data/train/77/image_00180.jpg \n inflating: flower_data/train/77/image_00209.jpg \n inflating: flower_data/train/77/image_00221.jpg \n inflating: flower_data/train/77/image_00235.jpg \n inflating: flower_data/train/77/image_00023.jpg \n inflating: flower_data/train/77/image_00037.jpg \n inflating: flower_data/train/77/image_00033.jpg \n inflating: flower_data/train/77/image_00027.jpg \n inflating: flower_data/train/77/image_00219.jpg \n inflating: flower_data/train/77/image_00225.jpg \n inflating: flower_data/train/77/image_00184.jpg \n inflating: flower_data/train/77/image_00190.jpg \n inflating: flower_data/train/77/image_00147.jpg \n inflating: flower_data/train/77/image_00153.jpg \n inflating: flower_data/train/77/image_00152.jpg \n inflating: flower_data/train/77/image_00146.jpg \n inflating: flower_data/train/77/image_00185.jpg \n inflating: flower_data/train/77/image_00224.jpg \n inflating: flower_data/train/77/image_00230.jpg \n inflating: flower_data/train/77/image_00026.jpg \n inflating: flower_data/train/77/image_00032.jpg \n inflating: flower_data/train/77/image_00018.jpg \n inflating: flower_data/train/77/image_00030.jpg \n inflating: flower_data/train/77/image_00226.jpg \n inflating: flower_data/train/77/image_00232.jpg \n inflating: flower_data/train/77/image_00193.jpg \n inflating: 
flower_data/train/77/image_00150.jpg \n inflating: flower_data/train/77/image_00144.jpg \n inflating: flower_data/train/77/image_00145.jpg \n inflating: flower_data/train/77/image_00151.jpg \n inflating: flower_data/train/77/image_00179.jpg \n inflating: flower_data/train/77/image_00186.jpg \n inflating: flower_data/train/77/image_00192.jpg \n inflating: flower_data/train/77/image_00233.jpg \n inflating: flower_data/train/77/image_00227.jpg \n inflating: flower_data/train/77/image_00031.jpg \n inflating: flower_data/train/77/image_00019.jpg \n inflating: flower_data/train/77/image_00014.jpg \n inflating: flower_data/train/77/image_00216.jpg \n inflating: flower_data/train/77/image_00148.jpg \n inflating: flower_data/train/77/image_00149.jpg \n inflating: flower_data/train/77/image_00161.jpg \n inflating: flower_data/train/77/image_00175.jpg \n inflating: flower_data/train/77/image_00217.jpg \n inflating: flower_data/train/77/image_00015.jpg \n inflating: flower_data/train/77/image_00001.jpg \n inflating: flower_data/train/77/image_00017.jpg \n inflating: flower_data/train/77/image_00003.jpg \n inflating: flower_data/train/77/image_00215.jpg \n inflating: flower_data/train/77/image_00201.jpg \n inflating: flower_data/train/77/image_00229.jpg \n inflating: flower_data/train/77/image_00163.jpg \n inflating: flower_data/train/77/image_00162.jpg \n inflating: flower_data/train/77/image_00189.jpg \n inflating: flower_data/train/77/image_00228.jpg \n inflating: flower_data/train/77/image_00214.jpg \n inflating: flower_data/train/77/image_00002.jpg \n inflating: flower_data/train/77/image_00016.jpg \n inflating: flower_data/train/77/image_00012.jpg \n inflating: flower_data/train/77/image_00238.jpg \n inflating: flower_data/train/77/image_00210.jpg \n inflating: flower_data/train/77/image_00199.jpg \n inflating: flower_data/train/77/image_00166.jpg \n inflating: flower_data/train/77/image_00172.jpg \n inflating: flower_data/train/77/image_00173.jpg \n inflating: 
flower_data/train/77/image_00167.jpg \n inflating: flower_data/train/77/image_00198.jpg \n inflating: flower_data/train/77/image_00205.jpg \n inflating: flower_data/train/77/image_00211.jpg \n inflating: flower_data/train/77/image_00007.jpg \n inflating: flower_data/train/77/image_00013.jpg \n inflating: flower_data/train/77/image_00039.jpg \n inflating: flower_data/train/77/image_00011.jpg \n inflating: flower_data/train/77/image_00207.jpg \n inflating: flower_data/train/77/image_00213.jpg \n inflating: flower_data/train/77/image_00159.jpg \n inflating: flower_data/train/77/image_00171.jpg \n inflating: flower_data/train/77/image_00165.jpg \n inflating: flower_data/train/77/image_00164.jpg \n inflating: flower_data/train/77/image_00170.jpg \n inflating: flower_data/train/77/image_00158.jpg \n inflating: flower_data/train/77/image_00206.jpg \n inflating: flower_data/train/77/image_00010.jpg \n inflating: flower_data/train/77/image_00004.jpg \n inflating: flower_data/train/77/image_00038.jpg \n inflating: flower_data/train/77/image_00088.jpg \n inflating: flower_data/train/77/image_00077.jpg \n inflating: flower_data/train/77/image_00249.jpg \n inflating: flower_data/train/77/image_00117.jpg \n inflating: flower_data/train/77/image_00103.jpg \n inflating: flower_data/train/77/image_00102.jpg \n inflating: flower_data/train/77/image_00076.jpg \n inflating: flower_data/train/77/image_00062.jpg \n inflating: flower_data/train/77/image_00089.jpg \n inflating: flower_data/train/77/image_00074.jpg \n inflating: flower_data/train/77/image_00048.jpg \n inflating: flower_data/train/77/image_00100.jpg \n inflating: flower_data/train/77/image_00128.jpg \n inflating: flower_data/train/77/image_00129.jpg \n inflating: flower_data/train/77/image_00115.jpg \n inflating: flower_data/train/77/image_00101.jpg \n inflating: flower_data/train/77/image_00049.jpg \n inflating: flower_data/train/77/image_00075.jpg \n inflating: flower_data/train/77/image_00139.jpg \n inflating: 
flower_data/train/77/image_00105.jpg \n inflating: flower_data/train/77/image_00111.jpg \n inflating: flower_data/train/77/image_00110.jpg \n inflating: flower_data/train/77/image_00138.jpg \n inflating: flower_data/train/77/image_00064.jpg \n inflating: flower_data/train/77/image_00070.jpg \n inflating: flower_data/train/77/image_00058.jpg \n inflating: flower_data/train/77/image_00066.jpg \n inflating: flower_data/train/77/image_00106.jpg \n inflating: flower_data/train/77/image_00107.jpg \n inflating: flower_data/train/77/image_00113.jpg \n inflating: flower_data/train/77/image_00073.jpg \n inflating: flower_data/train/77/image_00067.jpg \n inflating: flower_data/train/77/image_00098.jpg \n creating: flower_data/train/48/\n inflating: flower_data/train/48/image_04634.jpg \n inflating: flower_data/train/48/image_04635.jpg \n inflating: flower_data/train/48/image_04637.jpg \n inflating: flower_data/train/48/image_04636.jpg \n inflating: flower_data/train/48/image_04626.jpg \n inflating: flower_data/train/48/image_04632.jpg \n inflating: flower_data/train/48/image_04633.jpg \n inflating: flower_data/train/48/image_04631.jpg \n inflating: flower_data/train/48/image_04625.jpg \n inflating: flower_data/train/48/image_04630.jpg \n inflating: flower_data/train/48/image_04694.jpg \n inflating: flower_data/train/48/image_04680.jpg \n inflating: flower_data/train/48/image_04657.jpg \n inflating: flower_data/train/48/image_04643.jpg \n inflating: flower_data/train/48/image_04642.jpg \n inflating: flower_data/train/48/image_04656.jpg \n inflating: flower_data/train/48/image_04681.jpg \n inflating: flower_data/train/48/image_04695.jpg \n inflating: flower_data/train/48/image_04683.jpg \n inflating: flower_data/train/48/image_04640.jpg \n inflating: flower_data/train/48/image_04654.jpg \n inflating: flower_data/train/48/image_04668.jpg \n inflating: flower_data/train/48/image_04669.jpg \n inflating: flower_data/train/48/image_04655.jpg \n inflating: 
flower_data/train/48/image_04682.jpg \n inflating: flower_data/train/48/image_04686.jpg \n inflating: flower_data/train/48/image_04692.jpg \n inflating: flower_data/train/48/image_04679.jpg \n inflating: flower_data/train/48/image_04645.jpg \n inflating: flower_data/train/48/image_04651.jpg \n inflating: flower_data/train/48/image_04650.jpg \n inflating: flower_data/train/48/image_04644.jpg \n inflating: flower_data/train/48/image_04685.jpg \n inflating: flower_data/train/48/image_04646.jpg \n inflating: flower_data/train/48/image_04647.jpg \n inflating: flower_data/train/48/image_04653.jpg \n inflating: flower_data/train/48/image_04689.jpg \n inflating: flower_data/train/48/image_04676.jpg \n inflating: flower_data/train/48/image_04662.jpg \n inflating: flower_data/train/48/image_04677.jpg \n inflating: flower_data/train/48/image_04661.jpg \n inflating: flower_data/train/48/image_04675.jpg \n inflating: flower_data/train/48/image_04649.jpg \n inflating: flower_data/train/48/image_04648.jpg \n inflating: flower_data/train/48/image_04674.jpg \n inflating: flower_data/train/48/image_04660.jpg \n inflating: flower_data/train/48/image_04658.jpg \n inflating: flower_data/train/48/image_04664.jpg \n inflating: flower_data/train/48/image_04670.jpg \n inflating: flower_data/train/48/image_04659.jpg \n inflating: flower_data/train/48/image_04673.jpg \n inflating: flower_data/train/48/image_04666.jpg \n inflating: flower_data/train/48/image_04672.jpg \n inflating: flower_data/train/48/image_04629.jpg \n inflating: flower_data/train/48/image_04628.jpg \n inflating: flower_data/train/48/image_04638.jpg \n inflating: flower_data/train/48/image_04639.jpg \n creating: flower_data/train/70/\n inflating: flower_data/train/70/image_05338.jpg \n inflating: flower_data/train/70/image_05304.jpg \n inflating: flower_data/train/70/image_05310.jpg \n inflating: flower_data/train/70/image_05311.jpg \n inflating: flower_data/train/70/image_05305.jpg \n inflating: 
flower_data/train/70/image_05339.jpg \n inflating: flower_data/train/70/image_05313.jpg \n inflating: flower_data/train/70/image_05307.jpg \n inflating: flower_data/train/70/image_05298.jpg \n inflating: flower_data/train/70/image_05299.jpg \n inflating: flower_data/train/70/image_05306.jpg \n inflating: flower_data/train/70/image_05312.jpg \n inflating: flower_data/train/70/image_05316.jpg \n inflating: flower_data/train/70/image_05302.jpg \n inflating: flower_data/train/70/image_05288.jpg \n inflating: flower_data/train/70/image_05317.jpg \n inflating: flower_data/train/70/image_05301.jpg \n inflating: flower_data/train/70/image_05315.jpg \n inflating: flower_data/train/70/image_05329.jpg \n inflating: flower_data/train/70/image_05328.jpg \n inflating: flower_data/train/70/image_05314.jpg \n inflating: flower_data/train/70/image_05300.jpg \n inflating: flower_data/train/70/image_05319.jpg \n inflating: flower_data/train/70/image_05325.jpg \n inflating: flower_data/train/70/image_05292.jpg \n inflating: flower_data/train/70/image_05286.jpg \n inflating: flower_data/train/70/image_05279.jpg \n inflating: flower_data/train/70/image_05278.jpg \n inflating: flower_data/train/70/image_05287.jpg \n inflating: flower_data/train/70/image_05293.jpg \n inflating: flower_data/train/70/image_05318.jpg \n inflating: flower_data/train/70/image_05332.jpg \n inflating: flower_data/train/70/image_05285.jpg \n inflating: flower_data/train/70/image_05291.jpg \n inflating: flower_data/train/70/image_05290.jpg \n inflating: flower_data/train/70/image_05327.jpg \n inflating: flower_data/train/70/image_05333.jpg \n inflating: flower_data/train/70/image_05337.jpg \n inflating: flower_data/train/70/image_05323.jpg \n inflating: flower_data/train/70/image_05294.jpg \n inflating: flower_data/train/70/image_05295.jpg \n inflating: flower_data/train/70/image_05281.jpg \n inflating: flower_data/train/70/image_05322.jpg \n inflating: flower_data/train/70/image_05336.jpg \n inflating: 
flower_data/train/70/image_05320.jpg \n inflating: flower_data/train/70/image_05334.jpg \n inflating: flower_data/train/70/image_05297.jpg \n inflating: flower_data/train/70/image_05283.jpg \n inflating: flower_data/train/70/image_05282.jpg \n inflating: flower_data/train/70/image_05309.jpg \n inflating: flower_data/train/70/image_05335.jpg \n creating: flower_data/train/84/\n inflating: flower_data/train/84/image_02636.jpg \n inflating: flower_data/train/84/image_02622.jpg \n inflating: flower_data/train/84/image_02597.jpg \n inflating: flower_data/train/84/image_02554.jpg \n inflating: flower_data/train/84/image_02569.jpg \n inflating: flower_data/train/84/image_02555.jpg \n inflating: flower_data/train/84/image_02596.jpg \n inflating: flower_data/train/84/image_02623.jpg \n inflating: flower_data/train/84/image_02609.jpg \n inflating: flower_data/train/84/image_02594.jpg \n inflating: flower_data/train/84/image_02580.jpg \n inflating: flower_data/train/84/image_02556.jpg \n inflating: flower_data/train/84/image_02595.jpg \n inflating: flower_data/train/84/image_02608.jpg \n inflating: flower_data/train/84/image_02634.jpg \n inflating: flower_data/train/84/image_02620.jpg \n inflating: flower_data/train/84/image_02618.jpg \n inflating: flower_data/train/84/image_02624.jpg \n inflating: flower_data/train/84/image_02630.jpg \n inflating: flower_data/train/84/image_02585.jpg \n inflating: flower_data/train/84/image_02553.jpg \n inflating: flower_data/train/84/image_02584.jpg \n inflating: flower_data/train/84/image_02590.jpg \n inflating: flower_data/train/84/image_02631.jpg \n inflating: flower_data/train/84/image_02625.jpg \n inflating: flower_data/train/84/image_02619.jpg \n inflating: flower_data/train/84/image_02633.jpg \n inflating: flower_data/train/84/image_02627.jpg \n inflating: flower_data/train/84/image_02592.jpg \n inflating: flower_data/train/84/image_02579.jpg \n inflating: flower_data/train/84/image_02578.jpg \n inflating: 
flower_data/train/84/image_02593.jpg \n inflating: flower_data/train/84/image_02587.jpg \n inflating: flower_data/train/84/image_02626.jpg \n inflating: flower_data/train/84/image_02632.jpg \n inflating: flower_data/train/84/image_02617.jpg \n inflating: flower_data/train/84/image_02603.jpg \n inflating: flower_data/train/84/image_02561.jpg \n inflating: flower_data/train/84/image_02575.jpg \n inflating: flower_data/train/84/image_02560.jpg \n inflating: flower_data/train/84/image_02602.jpg \n inflating: flower_data/train/84/image_02616.jpg \n inflating: flower_data/train/84/image_02600.jpg \n inflating: flower_data/train/84/image_02614.jpg \n inflating: flower_data/train/84/image_02628.jpg \n inflating: flower_data/train/84/image_02589.jpg \n inflating: flower_data/train/84/image_02562.jpg \n inflating: flower_data/train/84/image_02577.jpg \n inflating: flower_data/train/84/image_02588.jpg \n inflating: flower_data/train/84/image_02629.jpg \n inflating: flower_data/train/84/image_02601.jpg \n inflating: flower_data/train/84/image_02605.jpg \n inflating: flower_data/train/84/image_02611.jpg \n inflating: flower_data/train/84/image_02573.jpg \n inflating: flower_data/train/84/image_02566.jpg \n inflating: flower_data/train/84/image_02599.jpg \n inflating: flower_data/train/84/image_02638.jpg \n inflating: flower_data/train/84/image_02612.jpg \n inflating: flower_data/train/84/image_02606.jpg \n inflating: flower_data/train/84/image_02558.jpg \n inflating: flower_data/train/84/image_02564.jpg \n inflating: flower_data/train/84/image_02570.jpg \n inflating: flower_data/train/84/image_02571.jpg \n inflating: flower_data/train/84/image_02565.jpg \n inflating: flower_data/train/84/image_02559.jpg \n inflating: flower_data/train/84/image_02607.jpg \n creating: flower_data/train/24/\n inflating: flower_data/train/24/image_06816.jpg \n inflating: flower_data/train/24/image_06817.jpg \n inflating: flower_data/train/24/image_06829.jpg \n inflating: 
flower_data/train/24/image_06828.jpg \n inflating: flower_data/train/24/image_06838.jpg \n inflating: flower_data/train/24/image_06839.jpg \n inflating: flower_data/train/24/image_08051.jpg \n inflating: flower_data/train/24/image_08050.jpg \n inflating: flower_data/train/24/image_06848.jpg \n inflating: flower_data/train/24/image_08052.jpg \n inflating: flower_data/train/24/image_08053.jpg \n inflating: flower_data/train/24/image_06840.jpg \n inflating: flower_data/train/24/image_06841.jpg \n inflating: flower_data/train/24/image_06843.jpg \n inflating: flower_data/train/24/image_06842.jpg \n inflating: flower_data/train/24/image_06846.jpg \n inflating: flower_data/train/24/image_06845.jpg \n inflating: flower_data/train/24/image_08049.jpg \n inflating: flower_data/train/24/image_08048.jpg \n inflating: flower_data/train/24/image_06844.jpg \n inflating: flower_data/train/24/image_06837.jpg \n inflating: flower_data/train/24/image_06823.jpg \n inflating: flower_data/train/24/image_06822.jpg \n inflating: flower_data/train/24/image_06820.jpg \n inflating: flower_data/train/24/image_06834.jpg \n inflating: flower_data/train/24/image_06835.jpg \n inflating: flower_data/train/24/image_06821.jpg \n inflating: flower_data/train/24/image_06825.jpg \n inflating: flower_data/train/24/image_06831.jpg \n inflating: flower_data/train/24/image_06830.jpg \n inflating: flower_data/train/24/image_06824.jpg \n inflating: flower_data/train/24/image_06832.jpg \n inflating: flower_data/train/24/image_06826.jpg \n inflating: flower_data/train/24/image_06827.jpg \n inflating: flower_data/train/24/image_06833.jpg \n creating: flower_data/train/23/\n inflating: flower_data/train/23/image_03403.jpg \n inflating: flower_data/train/23/image_03417.jpg \n inflating: flower_data/train/23/image_03370.jpg \n inflating: flower_data/train/23/image_03402.jpg \n inflating: flower_data/train/23/image_03399.jpg \n inflating: flower_data/train/23/image_03428.jpg \n inflating: 
flower_data/train/23/image_03372.jpg \n inflating: flower_data/train/23/image_03401.jpg \n inflating: flower_data/train/23/image_03373.jpg \n inflating: flower_data/train/23/image_03415.jpg \n inflating: flower_data/train/23/image_03429.jpg \n inflating: flower_data/train/23/image_03388.jpg \n inflating: flower_data/train/23/image_03411.jpg \n inflating: flower_data/train/23/image_03377.jpg \n inflating: flower_data/train/23/image_03405.jpg \n inflating: flower_data/train/23/image_03439.jpg \n inflating: flower_data/train/23/image_03438.jpg \n inflating: flower_data/train/23/image_03404.jpg \n inflating: flower_data/train/23/image_03410.jpg \n inflating: flower_data/train/23/image_03376.jpg \n inflating: flower_data/train/23/image_03389.jpg \n inflating: flower_data/train/23/image_03406.jpg \n inflating: flower_data/train/23/image_03374.jpg \n inflating: flower_data/train/23/image_03375.jpg \n inflating: flower_data/train/23/image_03413.jpg \n inflating: flower_data/train/23/image_03407.jpg \n inflating: flower_data/train/23/image_03448.jpg \n inflating: flower_data/train/23/image_03449.jpg \n inflating: flower_data/train/23/image_03459.jpg \n inflating: flower_data/train/23/image_03458.jpg \n inflating: flower_data/train/23/image_03441.jpg \n inflating: flower_data/train/23/image_03455.jpg \n inflating: flower_data/train/23/image_03456.jpg \n inflating: flower_data/train/23/image_03443.jpg \n inflating: flower_data/train/23/image_03457.jpg \n inflating: flower_data/train/23/image_03453.jpg \n inflating: flower_data/train/23/image_03447.jpg \n inflating: flower_data/train/23/image_03446.jpg \n inflating: flower_data/train/23/image_03452.jpg \n inflating: flower_data/train/23/image_03450.jpg \n inflating: flower_data/train/23/image_03451.jpg \n inflating: flower_data/train/23/image_03445.jpg \n inflating: flower_data/train/23/image_03387.jpg \n inflating: flower_data/train/23/image_03393.jpg \n inflating: flower_data/train/23/image_03378.jpg \n inflating: 
flower_data/train/23/image_03422.jpg \n inflating: flower_data/train/23/image_03423.jpg \n inflating: flower_data/train/23/image_03379.jpg \n inflating: flower_data/train/23/image_03392.jpg \n inflating: flower_data/train/23/image_03384.jpg \n inflating: flower_data/train/23/image_03435.jpg \n inflating: flower_data/train/23/image_03421.jpg \n inflating: flower_data/train/23/image_03434.jpg \n inflating: flower_data/train/23/image_03408.jpg \n inflating: flower_data/train/23/image_03385.jpg \n inflating: flower_data/train/23/image_03391.jpg \n inflating: flower_data/train/23/image_03395.jpg \n inflating: flower_data/train/23/image_03381.jpg \n inflating: flower_data/train/23/image_03430.jpg \n inflating: flower_data/train/23/image_03424.jpg \n inflating: flower_data/train/23/image_03418.jpg \n inflating: flower_data/train/23/image_03419.jpg \n inflating: flower_data/train/23/image_03431.jpg \n inflating: flower_data/train/23/image_03380.jpg \n inflating: flower_data/train/23/image_03394.jpg \n inflating: flower_data/train/23/image_03396.jpg \n inflating: flower_data/train/23/image_03427.jpg \n inflating: flower_data/train/23/image_03433.jpg \n inflating: flower_data/train/23/image_03369.jpg \n inflating: flower_data/train/23/image_03432.jpg \n inflating: flower_data/train/23/image_03426.jpg \n inflating: flower_data/train/23/image_03397.jpg \n creating: flower_data/train/4/\n inflating: flower_data/train/4/image_05648.jpg \n inflating: flower_data/train/4/image_05674.jpg \n inflating: flower_data/train/4/image_05675.jpg \n inflating: flower_data/train/4/image_05661.jpg \n inflating: flower_data/train/4/image_05649.jpg \n inflating: flower_data/train/4/image_05663.jpg \n inflating: flower_data/train/4/image_05662.jpg \n inflating: flower_data/train/4/image_05676.jpg \n inflating: flower_data/train/4/image_05672.jpg \n inflating: flower_data/train/4/image_05666.jpg \n inflating: flower_data/train/4/image_05667.jpg \n inflating: flower_data/train/4/image_05673.jpg \n 
inflating: flower_data/train/4/image_05665.jpg \n inflating: flower_data/train/4/image_05671.jpg \n inflating: flower_data/train/4/image_05659.jpg \n inflating: flower_data/train/4/image_05670.jpg \n inflating: flower_data/train/4/image_05664.jpg \n inflating: flower_data/train/4/image_05629.jpg \n inflating: flower_data/train/4/image_05639.jpg \n inflating: flower_data/train/4/image_05635.jpg \n inflating: flower_data/train/4/image_05634.jpg \n inflating: flower_data/train/4/image_05630.jpg \n inflating: flower_data/train/4/image_05631.jpg \n inflating: flower_data/train/4/image_05633.jpg \n inflating: flower_data/train/4/image_05632.jpg \n inflating: flower_data/train/4/image_05682.jpg \n inflating: flower_data/train/4/image_05669.jpg \n inflating: flower_data/train/4/image_05641.jpg \n inflating: flower_data/train/4/image_05655.jpg \n inflating: flower_data/train/4/image_05654.jpg \n inflating: flower_data/train/4/image_05668.jpg \n inflating: flower_data/train/4/image_05683.jpg \n inflating: flower_data/train/4/image_05656.jpg \n inflating: flower_data/train/4/image_05642.jpg \n inflating: flower_data/train/4/image_05643.jpg \n inflating: flower_data/train/4/image_05684.jpg \n inflating: flower_data/train/4/image_05647.jpg \n inflating: flower_data/train/4/image_05646.jpg \n inflating: flower_data/train/4/image_05652.jpg \n inflating: flower_data/train/4/image_05644.jpg \n inflating: flower_data/train/4/image_05650.jpg \n inflating: flower_data/train/4/image_05679.jpg \n inflating: flower_data/train/4/image_05651.jpg \n inflating: flower_data/train/4/image_05645.jpg \n creating: flower_data/train/15/\n inflating: flower_data/train/15/image_06382.jpg \n inflating: flower_data/train/15/image_06355.jpg \n inflating: flower_data/train/15/image_06368.jpg \n inflating: flower_data/train/15/image_06383.jpg \n inflating: flower_data/train/15/image_06381.jpg \n inflating: flower_data/train/15/image_06356.jpg \n inflating: flower_data/train/15/image_06357.jpg \n 
inflating: flower_data/train/15/image_06394.jpg \n inflating: flower_data/train/15/image_06380.jpg \n inflating: flower_data/train/15/image_06384.jpg \n inflating: flower_data/train/15/image_06390.jpg \n inflating: flower_data/train/15/image_06347.jpg \n inflating: flower_data/train/15/image_06352.jpg \n inflating: flower_data/train/15/image_06391.jpg \n inflating: flower_data/train/15/image_06393.jpg \n inflating: flower_data/train/15/image_06387.jpg \n inflating: flower_data/train/15/image_06378.jpg \n inflating: flower_data/train/15/image_06350.jpg \n inflating: flower_data/train/15/image_06379.jpg \n inflating: flower_data/train/15/image_06392.jpg \n inflating: flower_data/train/15/image_06348.jpg \n inflating: flower_data/train/15/image_06349.jpg \n inflating: flower_data/train/15/image_06361.jpg \n inflating: flower_data/train/15/image_06375.jpg \n inflating: flower_data/train/15/image_06388.jpg \n inflating: flower_data/train/15/image_06363.jpg \n inflating: flower_data/train/15/image_06377.jpg \n inflating: flower_data/train/15/image_06376.jpg \n inflating: flower_data/train/15/image_06362.jpg \n inflating: flower_data/train/15/image_06366.jpg \n inflating: flower_data/train/15/image_06372.jpg \n inflating: flower_data/train/15/image_06373.jpg \n inflating: flower_data/train/15/image_06367.jpg \n inflating: flower_data/train/15/image_06359.jpg \n inflating: flower_data/train/15/image_06371.jpg \n inflating: flower_data/train/15/image_06365.jpg \n inflating: flower_data/train/15/image_06364.jpg \n inflating: flower_data/train/15/image_06370.jpg \n creating: flower_data/train/3/\n inflating: flower_data/train/3/image_06625.jpg \n inflating: flower_data/train/3/image_06619.jpg \n inflating: flower_data/train/3/image_06618.jpg \n inflating: flower_data/train/3/image_06624.jpg \n inflating: flower_data/train/3/image_06630.jpg \n inflating: flower_data/train/3/image_06626.jpg \n inflating: flower_data/train/3/image_06632.jpg \n inflating: 
[unzip output truncated: extracting flower_data/train/<class>/image_*.jpg for class folders 3, 12, 85, 71, 76, 82, 49, 40, 47, 78, 2, 13, 5, …]
flower_data/train/5/image_05191.jpg \n inflating: flower_data/train/5/image_05195.jpg \n inflating: flower_data/train/5/image_05181.jpg \n inflating: flower_data/train/5/image_05156.jpg \n inflating: flower_data/train/5/image_05208.jpg \n inflating: flower_data/train/5/image_05157.jpg \n inflating: flower_data/train/5/image_05180.jpg \n inflating: flower_data/train/5/image_05194.jpg \n inflating: flower_data/train/5/image_05182.jpg \n inflating: flower_data/train/5/image_05155.jpg \n inflating: flower_data/train/5/image_05154.jpg \n inflating: flower_data/train/5/image_05197.jpg \n inflating: flower_data/train/5/image_05183.jpg \n creating: flower_data/train/14/\n inflating: flower_data/train/14/image_06054.jpg \n inflating: flower_data/train/14/image_06068.jpg \n inflating: flower_data/train/14/image_06069.jpg \n inflating: flower_data/train/14/image_06055.jpg \n inflating: flower_data/train/14/image_06096.jpg \n inflating: flower_data/train/14/image_06094.jpg \n inflating: flower_data/train/14/image_06080.jpg \n inflating: flower_data/train/14/image_06057.jpg \n inflating: flower_data/train/14/image_06056.jpg \n inflating: flower_data/train/14/image_06081.jpg \n inflating: flower_data/train/14/image_06095.jpg \n inflating: flower_data/train/14/image_06085.jpg \n inflating: flower_data/train/14/image_06053.jpg \n inflating: flower_data/train/14/image_06084.jpg \n inflating: flower_data/train/14/image_06090.jpg \n inflating: flower_data/train/14/image_06086.jpg \n inflating: flower_data/train/14/image_06092.jpg \n inflating: flower_data/train/14/image_06079.jpg \n inflating: flower_data/train/14/image_06051.jpg \n inflating: flower_data/train/14/image_06050.jpg \n inflating: flower_data/train/14/image_06078.jpg \n inflating: flower_data/train/14/image_06093.jpg \n inflating: flower_data/train/14/image_06087.jpg \n inflating: flower_data/train/14/image_06061.jpg \n inflating: flower_data/train/14/image_06075.jpg \n inflating: flower_data/train/14/image_06049.jpg \n 
inflating: flower_data/train/14/image_06074.jpg \n inflating: flower_data/train/14/image_06060.jpg \n inflating: flower_data/train/14/image_06089.jpg \n inflating: flower_data/train/14/image_06076.jpg \n inflating: flower_data/train/14/image_06062.jpg \n inflating: flower_data/train/14/image_06063.jpg \n inflating: flower_data/train/14/image_06077.jpg \n inflating: flower_data/train/14/image_06088.jpg \n inflating: flower_data/train/14/image_06073.jpg \n inflating: flower_data/train/14/image_06067.jpg \n inflating: flower_data/train/14/image_06066.jpg \n inflating: flower_data/train/14/image_06072.jpg \n inflating: flower_data/train/14/image_06058.jpg \n inflating: flower_data/train/14/image_06064.jpg \n inflating: flower_data/train/14/image_06070.jpg \n inflating: flower_data/train/14/image_06071.jpg \n inflating: flower_data/train/14/image_06065.jpg \n inflating: flower_data/train/14/image_06059.jpg \n creating: flower_data/train/22/\n inflating: flower_data/train/22/image_05367.jpg \n inflating: flower_data/train/22/image_05373.jpg \n inflating: flower_data/train/22/image_05372.jpg \n inflating: flower_data/train/22/image_05358.jpg \n inflating: flower_data/train/22/image_05370.jpg \n inflating: flower_data/train/22/image_05365.jpg \n inflating: flower_data/train/22/image_05371.jpg \n inflating: flower_data/train/22/image_05359.jpg \n inflating: flower_data/train/22/image_05375.jpg \n inflating: flower_data/train/22/image_05349.jpg \n inflating: flower_data/train/22/image_05348.jpg \n inflating: flower_data/train/22/image_05374.jpg \n inflating: flower_data/train/22/image_05389.jpg \n inflating: flower_data/train/22/image_05377.jpg \n inflating: flower_data/train/22/image_05363.jpg \n inflating: flower_data/train/22/image_05385.jpg \n inflating: flower_data/train/22/image_05346.jpg \n inflating: flower_data/train/22/image_05353.jpg \n inflating: flower_data/train/22/image_05347.jpg \n inflating: flower_data/train/22/image_05390.jpg \n inflating: 
flower_data/train/22/image_05384.jpg \n inflating: flower_data/train/22/image_05392.jpg \n inflating: flower_data/train/22/image_05386.jpg \n inflating: flower_data/train/22/image_05379.jpg \n inflating: flower_data/train/22/image_05351.jpg \n inflating: flower_data/train/22/image_05345.jpg \n inflating: flower_data/train/22/image_05344.jpg \n inflating: flower_data/train/22/image_05350.jpg \n inflating: flower_data/train/22/image_05378.jpg \n inflating: flower_data/train/22/image_05387.jpg \n inflating: flower_data/train/22/image_05393.jpg \n inflating: flower_data/train/22/image_05397.jpg \n inflating: flower_data/train/22/image_05383.jpg \n inflating: flower_data/train/22/image_05354.jpg \n inflating: flower_data/train/22/image_05340.jpg \n inflating: flower_data/train/22/image_05369.jpg \n inflating: flower_data/train/22/image_05341.jpg \n inflating: flower_data/train/22/image_05355.jpg \n inflating: flower_data/train/22/image_05382.jpg \n inflating: flower_data/train/22/image_05396.jpg \n inflating: flower_data/train/22/image_05380.jpg \n inflating: flower_data/train/22/image_05394.jpg \n inflating: flower_data/train/22/image_05343.jpg \n inflating: flower_data/train/22/image_05357.jpg \n inflating: flower_data/train/22/image_05356.jpg \n inflating: flower_data/train/22/image_05342.jpg \n inflating: flower_data/train/22/image_05395.jpg \n creating: flower_data/train/25/\n inflating: flower_data/train/25/image_06590.jpg \n inflating: flower_data/train/25/image_06591.jpg \n inflating: flower_data/train/25/image_06585.jpg \n inflating: flower_data/train/25/image_06587.jpg \n inflating: flower_data/train/25/image_06578.jpg \n inflating: flower_data/train/25/image_06579.jpg \n inflating: flower_data/train/25/image_06586.jpg \n inflating: flower_data/train/25/image_06592.jpg \n inflating: flower_data/train/25/image_06596.jpg \n inflating: flower_data/train/25/image_06597.jpg \n inflating: flower_data/train/25/image_06608.jpg \n inflating: 
flower_data/train/25/image_06581.jpg \n inflating: flower_data/train/25/image_06595.jpg \n inflating: flower_data/train/25/image_06594.jpg \n inflating: flower_data/train/25/image_06609.jpg \n inflating: flower_data/train/25/image_06610.jpg \n inflating: flower_data/train/25/image_06604.jpg \n inflating: flower_data/train/25/image_06599.jpg \n inflating: flower_data/train/25/image_06573.jpg \n inflating: flower_data/train/25/image_06598.jpg \n inflating: flower_data/train/25/image_06605.jpg \n inflating: flower_data/train/25/image_06607.jpg \n inflating: flower_data/train/25/image_06571.jpg \n inflating: flower_data/train/25/image_06606.jpg \n inflating: flower_data/train/25/image_06602.jpg \n inflating: flower_data/train/25/image_06574.jpg \n inflating: flower_data/train/25/image_06575.jpg \n inflating: flower_data/train/25/image_06603.jpg \n inflating: flower_data/train/25/image_06601.jpg \n inflating: flower_data/train/25/image_06588.jpg \n inflating: flower_data/train/25/image_06577.jpg \n inflating: flower_data/train/25/image_06576.jpg \n inflating: flower_data/train/25/image_06589.jpg \n inflating: flower_data/train/25/image_06600.jpg \n"
]
],
[
[
"## Import the test environment",
"_____no_output_____"
]
],
[
[
"!git clone https://github.com/GabrielePicco/deep-learning-flower-identifier\n!pip install requests\n!pip install airtable\nimport sys\nsys.path.insert(0, 'deep-learning-flower-identifier')\nfrom test_model_pytorch_facebook_challenge import publish_evaluated_model, calc_accuracy",
"Cloning into 'deep-learning-flower-identifier'...\nremote: Enumerating objects: 31, done.\u001b[K\nremote: Counting objects: 100% (31/31), done.\u001b[K\nremote: Compressing objects: 100% (27/27), done.\u001b[K\nremote: Total 31 (delta 13), reused 18 (delta 2), pack-reused 0\u001b[K\nUnpacking objects: 100% (31/31), done.\nRequirement already satisfied: requests in /usr/local/lib/python3.6/dist-packages (2.18.4)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests) (2018.11.29)\nRequirement already satisfied: idna<2.7,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests) (2.6)\nRequirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests) (3.0.4)\nRequirement already satisfied: urllib3<1.23,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests) (1.22)\nCollecting airtable\n Downloading https://files.pythonhosted.org/packages/a8/ef/8b2bee11988bb1b4df41454a999f855bd568a74b8b58fd92279bfb50fb56/airtable-0.3.1.tar.gz\nRequirement already satisfied: requests>=2.5.3 in /usr/local/lib/python3.6/dist-packages (from airtable) (2.18.4)\nRequirement already satisfied: idna<2.7,>=2.5 in /usr/local/lib/python3.6/dist-packages (from requests>=2.5.3->airtable) (2.6)\nRequirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.6/dist-packages (from requests>=2.5.3->airtable) (2018.11.29)\nRequirement already satisfied: chardet<3.1.0,>=3.0.2 in /usr/local/lib/python3.6/dist-packages (from requests>=2.5.3->airtable) (3.0.4)\nRequirement already satisfied: urllib3<1.23,>=1.21.1 in /usr/local/lib/python3.6/dist-packages (from requests>=2.5.3->airtable) (1.22)\nBuilding wheels for collected packages: airtable\n Running setup.py bdist_wheel for airtable ... 
\u001b[?25l-\b \bdone\n\u001b[?25h Stored in directory: /root/.cache/pip/wheels/9b/ba/63/364c02fabcd50ef6e2f101a57feb727bd7a693697765a9df17\nSuccessfully built airtable\nInstalling collected packages: airtable\nSuccessfully installed airtable-0.3.1\n"
]
],
[
[
"## Import",
"_____no_output_____"
]
],
[
[
"!pip install --no-cache-dir -I pillow\n# Imports here\n%matplotlib inline\nimport time\nimport os\nimport json\nimport copy\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport numpy as np\nfrom PIL import Image\nfrom collections import OrderedDict\nimport torch\nfrom torch import nn, optim\nfrom torch.optim import lr_scheduler\nfrom torch.autograd import Variable\nfrom torchvision import datasets, models, transforms\nfrom google.colab import files",
"Collecting pillow\n\u001b[?25l Downloading https://files.pythonhosted.org/packages/62/94/5430ebaa83f91cc7a9f687ff5238e26164a779cca2ef9903232268b0a318/Pillow-5.3.0-cp36-cp36m-manylinux1_x86_64.whl (2.0MB)\n\u001b[K 100% |████████████████████████████████| 2.0MB 4.1MB/s \n\u001b[?25hInstalling collected packages: pillow\nSuccessfully installed pillow-5.3.0\n"
]
],
[
[
"## Load the data\n\nHere you'll use `torchvision` to load the data ([documentation](http://pytorch.org/docs/0.3.0/torchvision/index.html)). You can [download the data here](https://s3.amazonaws.com/content.udacity-data.com/courses/nd188/flower_data.zip). The dataset is split into two parts, training and validation. For the training, you'll want to apply transformations such as random scaling, cropping, and flipping. This will help the network generalize leading to better performance. If you use a pre-trained network, you'll also need to make sure the input data is resized to 224x224 pixels as required by the networks.\n\nThe validation set is used to measure the model's performance on data it hasn't seen yet. For this you don't want any scaling or rotation transformations, but you'll need to resize then crop the images to the appropriate size.\n\nThe pre-trained networks available from `torchvision` were trained on the ImageNet dataset where each color channel was normalized separately. For both sets you'll need to normalize the means and standard deviations of the images to what the network expects. For the means, it's `[0.485, 0.456, 0.406]` and for the standard deviations `[0.229, 0.224, 0.225]`, calculated from the ImageNet images. These values will shift each color channel to be centered at 0 and range from -1 to 1.",
"_____no_output_____"
]
],
[
[
"data_dir = './flower_data'\ntrain_dir = os.path.join(data_dir, 'train')\nvalid_dir = os.path.join(data_dir, 'valid')\n\ndirs = {'train': train_dir, \n 'valid': valid_dir}",
"_____no_output_____"
],
[
"size = 224\ndata_transforms = {\n 'train': transforms.Compose([\n transforms.RandomRotation(45),\n transforms.RandomResizedCrop(size),\n transforms.RandomHorizontalFlip(),\n transforms.ToTensor(),\n transforms.Normalize([0.485, 0.456, 0.406], \n [0.229, 0.224, 0.225])\n ]),\n 'valid': transforms.Compose([\n transforms.Resize(size + 32),\n transforms.CenterCrop(size),\n transforms.ToTensor(),\n transforms.Normalize([0.485, 0.456, 0.406], \n [0.229, 0.224, 0.225])\n ]),\n} \n\nimage_datasets = {x: datasets.ImageFolder(dirs[x], transform=data_transforms[x]) for x in ['train', 'valid']}\ndataloaders = {x: torch.utils.data.DataLoader(image_datasets[x], batch_size=32, shuffle=True) for x in ['train', 'valid']}\ndataset_sizes = {x: len(image_datasets[x]) \n for x in ['train', 'valid']}\nclass_names = image_datasets['train'].classes",
"_____no_output_____"
]
],
[
[
"### Label mapping\n\nYou'll also need to load in a mapping from category label to category name. You can find this in the file `cat_to_name.json`. It's a JSON object which you can read in with the [`json` module](https://docs.python.org/2/library/json.html). This will give you a dictionary mapping the integer encoded categories to the actual names of the flowers.",
"_____no_output_____"
]
],
[
[
"with open('cat_to_name.json', 'r') as f:\n cat_to_name = json.load(f)",
"_____no_output_____"
]
],
[
[
"# Building and training the classifier\n\nNow that the data is ready, it's time to build and train the classifier. As usual, you should use one of the pretrained models from `torchvision.models` to get the image features. Build and train a new feed-forward classifier using those features.\n\nWe're going to leave this part up to you. If you want to talk through it with someone, chat with your fellow students! You can also ask questions on the forums or join the instructors in office hours.\n\nRefer to [the rubric](https://review.udacity.com/#!/rubrics/1663/view) for guidance on successfully completing this section. Things you'll need to do:\n\n* Load a [pre-trained network](http://pytorch.org/docs/master/torchvision/models.html) (If you need a starting point, the VGG networks work great and are straightforward to use)\n* Define a new, untrained feed-forward network as a classifier, using ReLU activations and dropout\n* Train the classifier layers using backpropagation using the pre-trained network to get the features\n* Track the loss and accuracy on the validation set to determine the best hyperparameters\n\nWe've left a cell open for you below, but use as many as you need. Our advice is to break the problem up into smaller parts you can run separately. Check that each part is doing what you expect, then move on to the next. You'll likely find that as you work through each part, you'll need to go back and modify your previous code. This is totally normal!\n\nWhen training make sure you're updating only the weights of the feed-forward network. You should be able to get the validation accuracy above 70% if you build everything right. Make sure to try different hyperparameters (learning rate, units in the classifier, epochs, etc) to find the best model. Save those hyperparameters to use as default values in the next part of the project.",
"_____no_output_____"
]
],
[
[
"model = models.vgg19(pretrained=True)\n\n# freeze all pretrained model parameters \nfor param in model.parameters():\n param.requires_grad_(False)\n \nprint(model)",
"Downloading: \"https://download.pytorch.org/models/vgg19-dcbb9e9d.pth\" to /root/.torch/models/vgg19-dcbb9e9d.pth\n100%|██████████| 574673361/574673361 [00:09<00:00, 58146854.99it/s]\n"
],
[
"classifier = nn.Sequential(OrderedDict([\n ('fc1', nn.Linear(25088, 4096)),\n ('relu', nn.ReLU()),\n ('fc2', nn.Linear(4096, 102)),\n ('output', nn.LogSoftmax(dim=1))\n ]))\nmodel.classifier = classifier",
"_____no_output_____"
],
[
"def train_model(model, criteria, optimizer, scheduler, num_epochs=25, device='cuda'):\n model.to(device)\n since = time.time()\n\n best_model_wts = copy.deepcopy(model.state_dict())\n best_acc = 0.0\n\n for epoch in range(num_epochs):\n print('Epoch {}/{}'.format(epoch, num_epochs - 1))\n print('-' * 10)\n\n # Each epoch has a training and validation phase\n for phase in ['train', 'valid']:\n if phase == 'train':\n scheduler.step()\n model.train() # Set model to training mode\n else:\n model.eval() # Set model to evaluate mode\n\n running_loss = 0.0\n running_corrects = 0\n\n # Iterate over data.\n for inputs, labels in dataloaders[phase]:\n inputs = inputs.to(device)\n labels = labels.to(device)\n\n # zero the parameter gradients\n optimizer.zero_grad()\n\n # forward\n # track history if only in train\n with torch.set_grad_enabled(phase == 'train'):\n outputs = model(inputs)\n _, preds = torch.max(outputs, 1)\n loss = criteria(outputs, labels)\n\n # backward + optimize only if in training phase\n if phase == 'train':\n loss.backward()\n optimizer.step()\n\n # statistics\n running_loss += loss.item() * inputs.size(0)\n running_corrects += torch.sum(preds == labels.data)\n\n epoch_loss = running_loss / dataset_sizes[phase]\n epoch_acc = running_corrects.double() / dataset_sizes[phase]\n\n print('{} Loss: {:.4f} Acc: {:.4f}'.format(\n phase, epoch_loss, epoch_acc))\n\n # deep copy the model\n if phase == 'valid' and epoch_acc > best_acc:\n best_acc = epoch_acc\n best_model_wts = copy.deepcopy(model.state_dict())\n\n print()\n\n time_elapsed = time.time() - since\n print('Training complete in {:.0f}m {:.0f}s'.format(\n time_elapsed // 60, time_elapsed % 60))\n print('Best val Acc: {:4f}'.format(best_acc))\n\n # load best model weights\n model.load_state_dict(best_model_wts)\n return model",
"_____no_output_____"
],
[
"# Criterion: NLLLoss, which pairs with the LogSoftmax final layer\ncriteria = nn.NLLLoss()\n# Only the classifier parameters are being optimized\noptimizer = optim.Adam(model.classifier.parameters(), lr=0.001)\n# Decay LR by a factor of 0.1 every 4 epochs\nsched = lr_scheduler.StepLR(optimizer, step_size=4, gamma=0.1)\n# Number of epochs\neps = 5",
"_____no_output_____"
],
[
"device = \"cuda\" if torch.cuda.is_available() else \"cpu\"\nmodel_ft = train_model(model, criteria, optimizer, sched, eps, device)",
"Epoch 0/4\n----------\ntrain Loss: 2.4615 Acc: 0.4750\nvalid Loss: 0.9368 Acc: 0.7445\n\nEpoch 1/4\n----------\ntrain Loss: 1.1086 Acc: 0.6958\nvalid Loss: 0.6747 Acc: 0.8093\n\nEpoch 2/4\n----------\ntrain Loss: 0.9610 Acc: 0.7427\nvalid Loss: 0.6377 Acc: 0.8289\n\nEpoch 3/4\n----------\ntrain Loss: 0.8783 Acc: 0.7592\nvalid Loss: 0.5280 Acc: 0.8460\n\nEpoch 4/4\n----------\ntrain Loss: 0.6236 Acc: 0.8278\nvalid Loss: 0.4131 Acc: 0.8863\n\nTraining complete in 22m 36s\nBest val Acc: 0.886308\n"
]
],
[
[
"## Save the checkpoint\n\nNow that your network is trained, save the model so you can load it later for making predictions. You probably want to save other things such as the mapping of classes to indices which you get from one of the image datasets: `image_datasets['train'].class_to_idx`. You can attach this to the model as an attribute which makes inference easier later on.\n\n```model.class_to_idx = image_datasets['train'].class_to_idx```\n\nRemember that you'll want to completely rebuild the model later so you can use it for inference. Make sure to include any information you need in the checkpoint. If you want to load the model and keep training, you'll want to save the number of epochs as well as the optimizer state, `optimizer.state_dict`. You'll likely want to use this trained model in the next part of the project, so best to save it now.",
"_____no_output_____"
]
],
[
[
"model_file_name = 'classifier.pth'\nmodel.class_to_idx = image_datasets['train'].class_to_idx\nmodel.cpu()\ntorch.save({'arch': 'vgg19',\n 'state_dict': model.state_dict(), \n 'class_to_idx': model.class_to_idx}, \n model_file_name)",
"_____no_output_____"
]
],
[
[
"## Loading the checkpoint\n\nAt this point it's good to write a function that can load a checkpoint and rebuild the model. That way you can come back to this project and keep working on it without having to retrain the network.",
"_____no_output_____"
]
],
[
[
"def load_model(checkpoint_path):\n chpt = torch.load(checkpoint_path)\n pretrained_model = getattr(models, chpt['arch'], None)\n if callable(pretrained_model):\n model = pretrained_model(pretrained=True)\n for param in model.parameters():\n param.requires_grad = False\n else:\n # Fail fast instead of continuing with an undefined model\n raise ValueError(\"Sorry, base architecture not recognized\")\n \n model.class_to_idx = chpt['class_to_idx']\n \n # Create the classifier\n classifier = nn.Sequential(OrderedDict([\n ('fc1', nn.Linear(25088, 4096)),\n ('relu', nn.ReLU()),\n ('fc2', nn.Linear(4096, 102)),\n ('output', nn.LogSoftmax(dim=1))\n ]))\n # Put the classifier on the pretrained network\n model.classifier = classifier\n \n model.load_state_dict(chpt['state_dict'])\n \n return model",
"_____no_output_____"
],
[
"model = load_model('classifier.pth')\ncalc_accuracy(model, input_image_size=224, testset_path=valid_dir)",
"Batch accuracy (Size 32): 0.8125\nBatch accuracy (Size 32): 0.8125\nBatch accuracy (Size 32): 0.78125\nBatch accuracy (Size 32): 0.78125\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.9375\nBatch accuracy (Size 32): 0.96875\nBatch accuracy (Size 32): 0.9375\nBatch accuracy (Size 32): 0.90625\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.84375\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.71875\nBatch accuracy (Size 32): 0.9375\nBatch accuracy (Size 32): 0.90625\nBatch accuracy (Size 32): 0.9375\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.90625\nBatch accuracy (Size 32): 0.96875\nBatch accuracy (Size 32): 0.90625\nBatch accuracy (Size 32): 0.90625\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.875\nBatch accuracy (Size 32): 0.9375\nBatch accuracy (Size 32): 0.78125\nBatch accuracy (Size 32): 0.8888888955116272\nMean accuracy: 0.8779380321502686\n"
]
],
[
[
"# Inference for classification\n\nNow you'll write a function to use a trained network for inference. That is, you'll pass an image into the network and predict the class of the flower in the image. Write a function called `predict` that takes an image and a model, then returns the top $K$ most likely classes along with the probabilities. It should look like \n\n```python\nprobs, classes = predict(image_path, model)\nprint(probs)\nprint(classes)\n> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]\n> ['70', '3', '45', '62', '55']\n```\n\nFirst you'll need to handle processing the input image such that it can be used in your network. \n\n## Image Preprocessing\n\nYou'll want to use `PIL` to load the image ([documentation](https://pillow.readthedocs.io/en/latest/reference/Image.html)). It's best to write a function that preprocesses the image so it can be used as input for the model. This function should process the images in the same manner used for training. \n\nFirst, resize the images where the shortest side is 256 pixels, keeping the aspect ratio. This can be done with the [`thumbnail`](http://pillow.readthedocs.io/en/3.1.x/reference/Image.html#PIL.Image.Image.thumbnail) or [`resize`](http://pillow.readthedocs.io/en/3.1.x/reference/Image.html#PIL.Image.Image.thumbnail) methods. Then you'll need to crop out the center 224x224 portion of the image.\n\nColor channels of images are typically encoded as integers 0-255, but the model expected floats 0-1. You'll need to convert the values. It's easiest with a Numpy array, which you can get from a PIL image like so `np_image = np.array(pil_image)`.\n\nAs before, the network expects the images to be normalized in a specific way. For the means, it's `[0.485, 0.456, 0.406]` and for the standard deviations `[0.229, 0.224, 0.225]`. You'll want to subtract the means from each color channel, then divide by the standard deviation. 
\n\nAnd finally, PyTorch expects the color channel to be the first dimension but it's the third dimension in the PIL image and Numpy array. You can reorder dimensions using [`ndarray.transpose`](https://docs.scipy.org/doc/numpy-1.13.0/reference/generated/numpy.ndarray.transpose.html). The color channel needs to be first and retain the order of the other two dimensions.",
"_____no_output_____"
]
],
[
[
"def process_image(image_path):\n ''' \n Scales, crops, and normalizes a PIL image for a PyTorch \n model; returns a Numpy array\n '''\n # Open the image (PIL's Image is already imported at the top of the notebook)\n img = Image.open(image_path)\n # Resize so the shortest side is 256 pixels, keeping the aspect ratio\n if img.size[0] > img.size[1]:\n img.thumbnail((10000, 256))\n else:\n img.thumbnail((256, 10000))\n # Crop out the center 224x224 portion\n left_margin = (img.width-224)/2\n bottom_margin = (img.height-224)/2\n right_margin = left_margin + 224\n top_margin = bottom_margin + 224\n img = img.crop((left_margin, bottom_margin, right_margin, \n top_margin))\n # Normalize to 0-1, then standardize with the ImageNet statistics\n img = np.array(img)/255\n mean = np.array([0.485, 0.456, 0.406]) # provided mean\n std = np.array([0.229, 0.224, 0.225]) # provided std\n img = (img - mean)/std\n \n # Move color channels to first dimension as expected by PyTorch\n img = img.transpose((2, 0, 1))\n \n return img",
"_____no_output_____"
]
],
[
[
"To check your work, the function below converts a PyTorch tensor and displays it in the notebook. If your `process_image` function works, running the output through this function should return the original image (except for the cropped out portions).",
"_____no_output_____"
]
],
[
[
"def imshow(image, ax=None, title=None):\n if ax is None:\n fig, ax = plt.subplots()\n if title:\n plt.title(title)\n # PyTorch tensors assume the color channel is first,\n # but matplotlib assumes it is the third dimension\n image = image.transpose((1, 2, 0))\n \n # Undo preprocessing\n mean = np.array([0.485, 0.456, 0.406])\n std = np.array([0.229, 0.224, 0.225])\n image = std * image + mean\n \n # Image needs to be clipped between 0 and 1\n image = np.clip(image, 0, 1)\n \n ax.imshow(image)\n \n return ax",
"_____no_output_____"
]
],
[
[
"## Class Prediction\n\nOnce you can get images in the correct format, it's time to write a function for making predictions with your model. A common practice is to predict the top 5 or so (usually called top-$K$) most probable classes. You'll want to calculate the class probabilities then find the $K$ largest values.\n\nTo get the top $K$ largest values in a tensor use [`x.topk(k)`](http://pytorch.org/docs/master/torch.html#torch.topk). This method returns both the highest `k` probabilities and the indices of those probabilities corresponding to the classes. You need to convert from these indices to the actual class labels using `class_to_idx` which hopefully you added to the model or from an `ImageFolder` you used to load the data ([see here](#Save-the-checkpoint)). Make sure to invert the dictionary so you get a mapping from index to class as well.\n\nAgain, this method should take a path to an image and a model checkpoint, then return the probabilities and classes.\n\n```python\nprobs, classes = predict(image_path, model)\nprint(probs)\nprint(classes)\n> [ 0.01558163 0.01541934 0.01452626 0.01443549 0.01407339]\n> ['70', '3', '45', '62', '55']\n```",
"_____no_output_____"
]
],
[
[
"def predict(image_path, model, top_num=5):\n # Process image\n img = process_image(image_path)\n \n # Numpy -> Tensor\n image_tensor = torch.from_numpy(img).type(torch.FloatTensor)\n # Add batch of size 1 to image\n model_input = image_tensor.unsqueeze(0)\n \n # torch.Tensor.to is not in-place, so reassign the result;\n # nn.Module.to moves the model's parameters in place\n model_input = model_input.to('cpu')\n model.to('cpu')\n \n # Probs\n probs = torch.exp(model(model_input))\n \n # Top probs\n top_probs, top_labs = probs.topk(top_num)\n top_probs = top_probs.detach().numpy().tolist()[0] \n top_labs = top_labs.detach().numpy().tolist()[0]\n \n # Convert indices to classes\n idx_to_class = {val: key for key, val in \n model.class_to_idx.items()}\n top_labels = [idx_to_class[lab] for lab in top_labs]\n top_flowers = [cat_to_name[idx_to_class[lab]] for lab in top_labs]\n return top_probs, top_labels, top_flowers",
"_____no_output_____"
]
],
[
[
"## Sanity Checking\n\nNow that you can use a trained model for predictions, check to make sure it makes sense. Even if the validation accuracy is high, it's always good to check that there aren't obvious bugs. Use `matplotlib` to plot the probabilities for the top 5 classes as a bar graph, along with the input image. It should look like this:\n\n<img src='https://i.imgur.com/KvRmrUp.png' width=300px>\n\nYou can convert from the class integer encoding to actual flower names with the `cat_to_name.json` file (should have been loaded earlier in the notebook). To show a PyTorch tensor as an image, use the `imshow` function defined above.",
"_____no_output_____"
]
],
[
[
"def plot_solution(image_path, model):\n # Set up plot\n plt.figure(figsize = (6,10))\n ax = plt.subplot(2,1,1)\n # Set up title\n flower_num = image_path.split('/')[3]\n title_ = cat_to_name[flower_num]\n # Plot flower\n img = process_image(image_path)\n imshow(img, ax, title = title_);\n # Make prediction\n probs, labs, flowers = predict(image_path, model) \n # Plot bar chart\n plt.subplot(2,1,2)\n sns.barplot(x=probs, y=flowers, color=sns.color_palette()[0]);\n plt.show()",
"_____no_output_____"
],
[
"image_path = os.path.join(valid_dir, '28/image_05265.jpg')\nplot_solution(image_path, model)",
"/usr/local/lib/python3.6/dist-packages/seaborn/categorical.py:1428: FutureWarning: remove_na is deprecated and is a private function. Do not use.\n stat_data = remove_na(group_data)\n"
]
],
[
[
"## Download the model",
"_____no_output_____"
],
[
"Alternatively, download the file from the google colaboratory explorer menu",
"_____no_output_____"
]
],
[
[
"#files.download('classifier.pth')",
"_____no_output_____"
]
],
[
[
"## Publish the result on the Airtable shared leaderboard",
"_____no_output_____"
]
],
[
[
"publish_evaluated_model(model, input_image_size=224, username=\"@Slack.Username\", model_name=\"VGG19\", optim=\"Adam\", criteria=\"NLLLoss\", scheduler=\"StepLR\", epoch=5)",
"Downloading the dataset from: https://www.dropbox.com/s/3zmf1kq58o909rq/google_test_data.zip?dl=1\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07ea853de2616ed3d860d80e194ac02b4eaa47f | 710 | ipynb | Jupyter Notebook | git_tutorials/merge_link.ipynb | noallynoclan/gradient | ce39067c350e669a24ebfe2c8877ef8a5f533dd7 | [
"Apache-2.0"
] | null | null | null | git_tutorials/merge_link.ipynb | noallynoclan/gradient | ce39067c350e669a24ebfe2c8877ef8a5f533dd7 | [
"Apache-2.0"
] | null | null | null | git_tutorials/merge_link.ipynb | noallynoclan/gradient | ce39067c350e669a24ebfe2c8877ef8a5f533dd7 | [
"Apache-2.0"
] | 1 | 2020-09-17T13:22:29.000Z | 2020-09-17T13:22:29.000Z | 17.75 | 78 | 0.53662 | [
[
[
"https://git-scm.com/book/en/v2/Git-Branching-Basic-Branching-and-Merging",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown"
]
] |
d07eab95f6fa6e116ae49185a1a444f764ce3210 | 25,806 | ipynb | Jupyter Notebook | course work/Problem 3/.ipynb_checkpoints/Assignment_3-checkpoint.ipynb | taareek/machine_learning | e9e7cf3636a3adf8572e69346c08e65cfcdb1100 | [
"MIT"
] | null | null | null | course work/Problem 3/.ipynb_checkpoints/Assignment_3-checkpoint.ipynb | taareek/machine_learning | e9e7cf3636a3adf8572e69346c08e65cfcdb1100 | [
"MIT"
] | null | null | null | course work/Problem 3/.ipynb_checkpoints/Assignment_3-checkpoint.ipynb | taareek/machine_learning | e9e7cf3636a3adf8572e69346c08e65cfcdb1100 | [
"MIT"
] | null | null | null | 26.631579 | 102 | 0.460862 | [
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt",
"_____no_output_____"
],
[
"data = pd.read_csv(\"iris.data\")\ndata.head()",
"_____no_output_____"
],
[
"data = pd.read_csv(\"iris.data\", header= None)\ncolumn_name = ['sepal length', 'sepal width', 'petal length', 'petal width', 'class']\ndata.columns = column_name",
"_____no_output_____"
],
[
"data.head()",
"_____no_output_____"
]
],
[
[
"Some Rough",
"_____no_output_____"
]
],
[
[
"# Split the dataset by class values, returns a dictionary\ndef separate_by_class(dataset):\n separated = dict()\n for i in range(len(dataset)):\n vector = dataset[i]\n class_value = vector[-1]\n if (class_value not in separated):\n separated[class_value] = list()\n separated[class_value].append(vector)\n return separated",
"_____no_output_____"
],
[
"# Test separating data by class\ndataset = [[3.393533211,2.331273381,0],\n [3.110073483,1.781539638,0],\n [1.343808831,3.368360954,0],\n [3.582294042,4.67917911,0],\n [2.280362439,2.866990263,0],\n [7.423436942,4.696522875,1],\n [5.745051997,3.533989803,1],\n [9.172168622,2.511101045,1],\n [7.792783481,3.424088941,1],\n [7.939820817,0.791637231,1]]\nseparated = separate_by_class(dataset)\nfor label in separated:\n print(label)\n for row in separated[label]:\n print(row)",
"0\n[3.393533211, 2.331273381, 0]\n[3.110073483, 1.781539638, 0]\n[1.343808831, 3.368360954, 0]\n[3.582294042, 4.67917911, 0]\n[2.280362439, 2.866990263, 0]\n1\n[7.423436942, 4.696522875, 1]\n[5.745051997, 3.533989803, 1]\n[9.172168622, 2.511101045, 1]\n[7.792783481, 3.424088941, 1]\n[7.939820817, 0.791637231, 1]\n"
],
[
"x = data[['sepal length', 'sepal width']]",
"_____no_output_____"
],
[
"x.head()",
"_____no_output_____"
],
[
"# function to calculate average(mean)\n\ndef mean(data):\n return sum(data)/ float(len(data))",
"_____no_output_____"
],
[
"from math import sqrt\n# function to calculate the standard deviation\n\ndef stdev(data):\n avg = mean(data)\n variance = sum([(x-avg)**2 for x in data]) / float(len(data)-1)\n return sqrt(variance)",
"_____no_output_____"
],
[
"t = [1,2,3,5, 6.7,8, 9.3]\na = stdev(t)\nprint(a)",
"3.145896798476814\n"
],
[
"# Calculate the mean, stdev and count for each column in a dataset\ndef summarize_dataset(dataset):\n summaries = [(mean(column), stdev(column), len(column)) for column in zip(*dataset)]\n del(summaries[-1])\n return summaries",
"_____no_output_____"
],
[
"a = summarize_dataset(dataset)\na",
"_____no_output_____"
],
[
"\n# Split dataset by class then calculate statistics for each row\ndef summarize_by_class(dataset):\n separated = separate_by_class(dataset)\n summaries = dict()\n for class_value, rows in separated.items():\n summaries[class_value] = summarize_dataset(rows)\n return summaries",
"_____no_output_____"
],
[
"summary = summarize_by_class(dataset)\nfor label in summary:\n print(label)\n for row in summary[label]:\n print(row)",
"0\n(2.7420144012, 0.9265683289298018, 5)\n(3.0054686692, 1.1073295894898725, 5)\n1\n(7.6146523718, 1.2344321550313704, 5)\n(2.9914679790000003, 1.4541931384601618, 5)\n"
],
[
"from math import pi\nfrom math import exp\n# Calculate the Gaussian probability distribution function for x\ndef calculate_probability(x, mean, stdev):\n exponent = exp(-((x-mean)**2 / (2 * stdev**2 )))\n return (1 / (sqrt(2 * pi) * stdev)) * exponent",
"_____no_output_____"
],
[
"# Test Gaussian PDF\nprint(calculate_probability(1.0, 1.0, 1.0))\nprint(calculate_probability(2.0, 1.0, 1.0))\nprint(calculate_probability(0.0, 1.0, 1.0))",
"0.3989422804014327\n0.24197072451914337\n0.24197072451914337\n"
],
[
"# Calculate the probabilities of predicting each class for a given row\ndef calculate_class_probabilities(summaries, row):\n total_rows = sum([summaries[label][0][2] for label in summaries])\n probabilities = dict()\n for class_value, class_summaries in summaries.items():\n probabilities[class_value] = summaries[class_value][0][2]/float(total_rows)\n for i in range(len(class_summaries)):\n mean, stdev, _ = class_summaries[i]\n probabilities[class_value] *= calculate_probability(row[i], mean, stdev)\n return probabilities",
"_____no_output_____"
],
[
"summaries = summarize_by_class(dataset)\nprobabilities = calculate_class_probabilities(summaries, dataset[0])\nprint(probabilities)",
"{0: 0.05032427673372076, 1: 0.00011557718379945765}\n"
],
[
"def variance1(data, ddof=0):\n n = len(data)\n mean = sum(data) / n\n return sum((x - mean) ** 2 for x in data) / (n - ddof)",
"_____no_output_____"
]
],
[
[
"# Iris",
"_____no_output_____"
]
],
[
[
"# Convert string column to float\ndef str_column_to_float(dataset, column):\n for row in dataset:\n row[column] = float(row[column].strip())",
"_____no_output_____"
],
[
"# Convert string column to integer\ndef str_column_to_int(dataset, column):\n class_values = [row[column] for row in dataset]\n unique = set(class_values)\n lookup = dict()\n for i, value in enumerate(unique):\n lookup[value] = i\n for row in dataset:\n row[column] = lookup[row[column]]\n return lookup",
"_____no_output_____"
],
[
"# Split a dataset into k folds\ndef cross_validation_split(dataset, n_folds):\n dataset_split = list()\n dataset_copy = list(dataset)\n fold_size = int(len(dataset) / n_folds)\n for _ in range(n_folds):\n fold = list()\n while len(fold) < fold_size:\n index = randrange(len(dataset_copy))\n fold.append(dataset_copy.pop(index))\n dataset_split.append(fold)\n return dataset_split",
"_____no_output_____"
],
[
"# Calculate accuracy percentage\ndef accuracy_metric(actual, predicted):\n correct = 0\n for i in range(len(actual)):\n if actual[i] == predicted[i]:\n correct += 1\n return correct / float(len(actual)) * 100.0",
"_____no_output_____"
],
[
"# Evaluate an algorithm using a cross validation split\ndef evaluate_algorithm(dataset, algorithm, n_folds, *args):\n folds = cross_validation_split(dataset, n_folds)\n scores = list()\n for fold in folds:\n train_set = list(folds)\n train_set.remove(fold)\n train_set = sum(train_set, [])\n test_set = list()\n for row in fold:\n row_copy = list(row)\n test_set.append(row_copy)\n row_copy[-1] = None\n predicted = algorithm(train_set, test_set, *args)\n actual = [row[-1] for row in fold]\n accuracy = accuracy_metric(actual, predicted)\n scores.append(accuracy)\n return scores",
"_____no_output_____"
],
[
"# Split the dataset by class values, returns a dictionary\ndef separate_by_class(dataset):\n separated = dict()\n for i in range(len(dataset)):\n vector = dataset[i]\n class_value = vector[-1]\n if (class_value not in separated):\n separated[class_value] = list()\n separated[class_value].append(vector)\n return separated",
"_____no_output_____"
],
[
"\n# Calculate the mean of a list of numbers\ndef mean(numbers):\n return sum(numbers)/float(len(numbers))",
"_____no_output_____"
],
[
"# Calculate the standard deviation of a list of numbers\ndef stdev(numbers):\n avg = mean(numbers)\n variance = sum([(x-avg)**2 for x in numbers]) / float(len(numbers)-1)\n return sqrt(variance)",
"_____no_output_____"
],
[
"# Calculate the mean, stdev and count for each column in a dataset\ndef summarize_dataset(dataset):\n summaries = [(mean(column), stdev(column), len(column)) for column in zip(*dataset)]\n del(summaries[-1])\n return summaries",
"_____no_output_____"
],
[
"# Split dataset by class then calculate statistics for each row\ndef summarize_by_class(dataset):\n separated = separate_by_class(dataset)\n summaries = dict()\n for class_value, rows in separated.items():\n summaries[class_value] = summarize_dataset(rows)\n return summaries",
"_____no_output_____"
],
[
"# Calculate the Gaussian probability distribution function for x\ndef calculate_probability(x, mean, stdev):\n exponent = exp(-((x-mean)**2 / (2 * stdev**2 )))\n return (1 / (sqrt(2 * pi) * stdev)) * exponent",
"_____no_output_____"
],
[
"# Calculate the probabilities of predicting each class for a given row\ndef calculate_class_probabilities(summaries, row):\n total_rows = sum([summaries[label][0][2] for label in summaries])\n probabilities = dict()\n for class_value, class_summaries in summaries.items():\n probabilities[class_value] = summaries[class_value][0][2]/float(total_rows)\n for i in range(len(class_summaries)):\n mean, stdev, _ = class_summaries[i]\n probabilities[class_value] *= calculate_probability(row[i], mean, stdev)\n return probabilities",
"_____no_output_____"
],
[
"# Predict the class for a given row\ndef predict(summaries, row):\n probabilities = calculate_class_probabilities(summaries, row)\n best_label, best_prob = None, -1\n for class_value, probability in probabilities.items():\n if best_label is None or probability > best_prob:\n best_prob = probability\n best_label = class_value\n return best_label",
"_____no_output_____"
],
[
"# Naive Bayes Algorithm\ndef naive_bayes(train, test):\n summarize = summarize_by_class(train)\n predictions = list()\n for row in test:\n output = predict(summarize, row)\n predictions.append(output)\n return(predictions)",
"_____no_output_____"
],
[
"# Load a CSV file\ndef load_csv(filename):\n dataset = list()\n with open(filename, 'r') as file:\n csv_reader = reader(file)\n for row in csv_reader:\n if not row:\n continue\n dataset.append(row)\n return dataset",
"_____no_output_____"
],
[
"from csv import reader\nfrom random import seed\nfrom random import randrange\n",
"_____no_output_____"
],
[
"seed(1)\nfilename = 'iris.data'\ndataset = load_csv(filename)\nfor i in range(len(dataset[0])-1):\n str_column_to_float(dataset, i)\n# convert class column to integers\nstr_column_to_int(dataset, len(dataset[0])-1)\n# evaluate algorithm\nn_folds = 5\nscores = evaluate_algorithm(dataset, naive_bayes, n_folds)\nprint('Scores: %s' % scores)\nprint('Mean Accuracy: %.3f%%' % (sum(scores)/float(len(scores))))",
"Scores: [93.33333333333333, 96.66666666666667, 100.0, 93.33333333333333, 93.33333333333333]\nMean Accuracy: 95.333%\n"
],
[
"v = variance1(scores)",
"_____no_output_____"
],
[
"print('Variance of classifier: ', v)",
"Variance of classifier: 7.111111111111126\n"
]
]
] | [
"code",
"raw",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code"
],
[
"raw"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07eb3f54f0bdd60bb24349e4d0512c47bce3553 | 3,901 | ipynb | Jupyter Notebook | tests/notebooks/mirror/script_to_ipynb/knitr-spin.ipynb | Skylion007/jupytext | efd9bec9abc1091db145b411e794851760f0f2cf | [
"MIT"
] | 5,378 | 2018-09-01T22:03:43.000Z | 2022-03-31T06:51:42.000Z | tests/notebooks/mirror/script_to_ipynb/knitr-spin.ipynb | Skylion007/jupytext | efd9bec9abc1091db145b411e794851760f0f2cf | [
"MIT"
] | 812 | 2018-08-31T08:26:13.000Z | 2022-03-30T18:12:11.000Z | tests/notebooks/mirror/script_to_ipynb/knitr-spin.ipynb | Skylion007/jupytext | efd9bec9abc1091db145b411e794851760f0f2cf | [
"MIT"
] | 380 | 2018-09-02T01:40:07.000Z | 2022-03-25T13:57:23.000Z | 21.086486 | 89 | 0.53089 | [
[
[
"The below derives from\nhttps://github.com/yihui/knitr/blob/master/inst/examples/knitr-spin.R\n\nThis is a special R script which can be used to generate a report. You can\nwrite normal text in roxygen comments.\n\nFirst we set up some options (you do not have to do this):",
"_____no_output_____"
]
],
[
[
"library(knitr)\nopts_chunk$set(fig.path = 'figure/silk-')",
"_____no_output_____"
]
],
[
[
"The report begins here.",
"_____no_output_____"
]
],
[
[
"# boring examples as usual\nset.seed(123)\nx = rnorm(5)\nmean(x)",
"_____no_output_____"
]
],
[
[
"You can not use here the special syntax {{code}} to embed inline expressions, e.g.",
"_____no_output_____"
]
],
[
[
"{{mean(x) + 2}}",
"_____no_output_____"
]
],
[
[
"is the mean of x plus 2.\nThe code itself may contain braces, but these are not checked. Thus,\nperfectly valid (though very strange) R code such as `{{2 + 3}} - {{4 - 5}}`\ncan lead to errors because `2 + 3}} - {{4 - 5` will be treated as inline code.\n\nNow we continue writing the report. We can draw plots as well.",
"_____no_output_____"
]
],
[
[
"par(mar = c(4, 4, .1, .1)); plot(x)",
"_____no_output_____"
]
],
[
[
"Actually you do not have to write chunk options, in which case knitr will use\ndefault options. For example, the code below has no options attached:",
"_____no_output_____"
]
],
[
[
"var(x)\nquantile(x)",
"_____no_output_____"
]
],
[
[
"And you can also write two chunks successively like this:",
"_____no_output_____"
]
],
[
[
"sum(x^2) # chi-square distribution with df 5",
"_____no_output_____"
],
[
"sum((x - mean(x))^2) # df is 4 now",
"_____no_output_____"
]
],
[
[
"Done. Call spin('knitr-spin.R') to make silk from sow's ear now and knit a\nlovely purse.",
"_____no_output_____"
]
],
[
[
"# /* you can write comments between /* and */ like C comments (the preceding #\n# is optional)\nSys.sleep(60)\n# */",
"_____no_output_____"
],
[
"# /* there is no inline comment; you have to write block comments */",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
d07ed19ce47568d2ec50576b5d2eb551a60c85c9 | 13,717 | ipynb | Jupyter Notebook | examples/04_Variational_and_Approximate_GPs/SVGP_Regression_CUDA.ipynb | lrast/gpytorch | 2e0bbc9f59e4b4b54780c3e55db784c3d2c9a5bf | [
"MIT"
] | 2,673 | 2018-02-19T22:28:58.000Z | 2022-03-31T13:22:28.000Z | examples/04_Variational_and_Approximate_GPs/SVGP_Regression_CUDA.ipynb | sdaulton/gpytorch | f06004e888542289d19c155e4e2ad35aad0573b7 | [
"MIT"
] | 1,415 | 2018-02-19T20:38:20.000Z | 2022-03-30T12:53:13.000Z | examples/04_Variational_and_Approximate_GPs/SVGP_Regression_CUDA.ipynb | sdaulton/gpytorch | f06004e888542289d19c155e4e2ad35aad0573b7 | [
"MIT"
] | 467 | 2018-03-07T02:06:05.000Z | 2022-03-27T07:05:44.000Z | 37.173442 | 384 | 0.61843 | [
[
[
"# Stochastic Variational GP Regression\n\n## Overview\n\nIn this notebook, we'll give an overview of how to use SVGP stochastic variational regression ((https://arxiv.org/pdf/1411.2005.pdf)) to rapidly train using minibatches on the `3droad` UCI dataset with hundreds of thousands of training examples. This is one of the more common use-cases of variational inference for GPs.\n\nIf you are unfamiliar with variational inference, we recommend the following resources:\n- [Variational Inference: A Review for Statisticians](https://arxiv.org/abs/1601.00670) by David M. Blei, Alp Kucukelbir, Jon D. McAuliffe.\n- [Scalable Variational Gaussian Process Classification](https://arxiv.org/abs/1411.2005) by James Hensman, Alex Matthews, Zoubin Ghahramani.",
"_____no_output_____"
]
],
[
[
"import tqdm\nimport math\nimport torch\nimport gpytorch\nfrom matplotlib import pyplot as plt\n\n# Make plots inline\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"For this example notebook, we'll be using the `song` UCI dataset used in the paper. Running the next cell downloads a copy of the dataset that has already been scaled and normalized appropriately. For this notebook, we'll simply be splitting the data using the first 80% of the data as training and the last 20% as testing.\n\n**Note**: Running the next cell will attempt to download a **~136 MB** file to the current directory.",
"_____no_output_____"
]
],
[
[
"import urllib.request\nimport os\nfrom scipy.io import loadmat\nfrom math import floor\n\n\n# this is for running the notebook in our testing framework\nsmoke_test = ('CI' in os.environ)\n\n\nif not smoke_test and not os.path.isfile('../elevators.mat'):\n print('Downloading \\'elevators\\' UCI dataset...')\n urllib.request.urlretrieve('https://drive.google.com/uc?export=download&id=1jhWL3YUHvXIaftia4qeAyDwVxo6j1alk', '../elevators.mat')\n\n\nif smoke_test: # this is for running the notebook in our testing framework\n X, y = torch.randn(1000, 3), torch.randn(1000)\nelse:\n data = torch.Tensor(loadmat('../elevators.mat')['data'])\n X = data[:, :-1]\n X = X - X.min(0)[0]\n X = 2 * (X / X.max(0)[0]) - 1\n y = data[:, -1]\n\n\ntrain_n = int(floor(0.8 * len(X)))\ntrain_x = X[:train_n, :].contiguous()\ntrain_y = y[:train_n].contiguous()\n\ntest_x = X[train_n:, :].contiguous()\ntest_y = y[train_n:].contiguous()\n\nif torch.cuda.is_available():\n train_x, train_y, test_x, test_y = train_x.cuda(), train_y.cuda(), test_x.cuda(), test_y.cuda()",
"_____no_output_____"
]
],
[
[
"## Creating a DataLoader\n\nThe next step is to create a torch `DataLoader` that will handle getting us random minibatches of data. This involves using the standard `TensorDataset` and `DataLoader` modules provided by PyTorch.\n\nIn this notebook we'll be using a fairly large batch size of 1024 just to make optimization run faster, but you could of course change this as you so choose.",
"_____no_output_____"
]
],
[
[
"from torch.utils.data import TensorDataset, DataLoader\ntrain_dataset = TensorDataset(train_x, train_y)\ntrain_loader = DataLoader(train_dataset, batch_size=1024, shuffle=True)\n\ntest_dataset = TensorDataset(test_x, test_y)\ntest_loader = DataLoader(test_dataset, batch_size=1024, shuffle=False)",
"_____no_output_____"
]
],
[
[
"## Creating a SVGP Model\n\n\nFor most variational/approximate GP models, you will need to construct the following GPyTorch objects:\n\n1. A **GP Model** (`gpytorch.models.ApproximateGP`) - This handles basic variational inference.\n1. A **Variational distribution** (`gpytorch.variational._VariationalDistribution`) - This tells us what form the variational distribution q(u) should take.\n1. A **Variational strategy** (`gpytorch.variational._VariationalStrategy`) - This tells us how to transform a distribution q(u) over the inducing point values to a distribution q(f) over the latent function values for some input x.\n\nHere, we use a `VariationalStrategy` with `learn_inducing_points=True`, and a `CholeskyVariationalDistribution`. These are the most straightforward and common options.\n\n\n#### The GP Model\n \nThe `ApproximateGP` model is GPyTorch's simplest approximate inference model. It approximates the true posterior with a distribution specified by a `VariationalDistribution`, which is most commonly some form of MultivariateNormal distribution. The model defines all the variational parameters that are needed, and keeps all of this information under the hood.\n\nThe components of a user built `ApproximateGP` model in GPyTorch are:\n\n1. An `__init__` method that constructs a mean module, a kernel module, a variational distribution object and a variational strategy object. This method should also be responsible for construting whatever other modules might be necessary.\n\n2. A `forward` method that takes in some $n \\times d$ data `x` and returns a MultivariateNormal with the *prior* mean and covariance evaluated at `x`. In other words, we return the vector $\\mu(x)$ and the $n \\times n$ matrix $K_{xx}$ representing the prior mean and covariance matrix of the GP.",
"_____no_output_____"
]
],
[
[
"from gpytorch.models import ApproximateGP\nfrom gpytorch.variational import CholeskyVariationalDistribution\nfrom gpytorch.variational import VariationalStrategy\n\nclass GPModel(ApproximateGP):\n def __init__(self, inducing_points):\n variational_distribution = CholeskyVariationalDistribution(inducing_points.size(0))\n variational_strategy = VariationalStrategy(self, inducing_points, variational_distribution, learn_inducing_locations=True)\n super(GPModel, self).__init__(variational_strategy)\n self.mean_module = gpytorch.means.ConstantMean()\n self.covar_module = gpytorch.kernels.ScaleKernel(gpytorch.kernels.RBFKernel())\n \n def forward(self, x):\n mean_x = self.mean_module(x)\n covar_x = self.covar_module(x)\n return gpytorch.distributions.MultivariateNormal(mean_x, covar_x)\n\ninducing_points = train_x[:500, :]\nmodel = GPModel(inducing_points=inducing_points)\nlikelihood = gpytorch.likelihoods.GaussianLikelihood()\n\nif torch.cuda.is_available():\n model = model.cuda()\n likelihood = likelihood.cuda()",
"_____no_output_____"
]
],
[
[
"### Training the Model\n\nThe cell below trains the model above, learning both the hyperparameters of the Gaussian process **and** the parameters of the neural network in an end-to-end fashion using Type-II MLE.\n\nUnlike when using the exact GP marginal log likelihood, performing variational inference allows us to make use of stochastic optimization techniques. For this example, we'll do one epoch of training. Given the small size of the neural network relative to the size of the dataset, this should be sufficient to achieve comparable accuracy to what was observed in the DKL paper.\n\nThe optimization loop differs from the one seen in our more simple tutorials in that it involves looping over both a number of training iterations (epochs) *and* minibatches of the data. However, the basic process is the same: for each minibatch, we forward through the model, compute the loss (the `VariationalELBO` or ELBO), call backwards, and do a step of optimization.",
"_____no_output_____"
]
],
[
[
"num_epochs = 1 if smoke_test else 4\n\n\nmodel.train()\nlikelihood.train()\n\noptimizer = torch.optim.Adam([\n {'params': model.parameters()},\n {'params': likelihood.parameters()},\n], lr=0.01)\n\n# Our loss object. We're using the VariationalELBO\nmll = gpytorch.mlls.VariationalELBO(likelihood, model, num_data=train_y.size(0))\n\n\nepochs_iter = tqdm.notebook.tqdm(range(num_epochs), desc=\"Epoch\")\nfor i in epochs_iter:\n # Within each iteration, we will go over each minibatch of data\n minibatch_iter = tqdm.notebook.tqdm(train_loader, desc=\"Minibatch\", leave=False)\n for x_batch, y_batch in minibatch_iter:\n optimizer.zero_grad()\n output = model(x_batch)\n loss = -mll(output, y_batch)\n minibatch_iter.set_postfix(loss=loss.item())\n loss.backward()\n optimizer.step()",
"_____no_output_____"
]
],
[
[
"### Making Predictions\n\nThe next cell gets the predictive covariance for the test set (and also technically gets the predictive mean, stored in `preds.mean()`). Because the test set is substantially smaller than the training set, we don't need to make predictions in mini batches here, although this can be done by passing in minibatches of `test_x` rather than the full tensor.",
"_____no_output_____"
]
],
[
[
"model.eval()\nlikelihood.eval()\nmeans = torch.tensor([0.])\nwith torch.no_grad():\n for x_batch, y_batch in test_loader:\n preds = model(x_batch)\n means = torch.cat([means, preds.mean.cpu()])\nmeans = means[1:]",
"_____no_output_____"
],
[
"print('Test MAE: {}'.format(torch.mean(torch.abs(means - test_y.cpu()))))",
"Test MAE: 0.11148034781217575\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
d07ed6ac3d3f9b950dbc1e0b047a13577ac8ece1 | 9,001 | ipynb | Jupyter Notebook | Execution_of_the_bundling_layer_3.ipynb | inpicksys/Neural_networks_and_computer_vision_SamsungResearch_Open_Education | ab1f475659096c116c4d59558faf4e253ffdc89a | [
"MIT"
] | null | null | null | Execution_of_the_bundling_layer_3.ipynb | inpicksys/Neural_networks_and_computer_vision_SamsungResearch_Open_Education | ab1f475659096c116c4d59558faf4e253ffdc89a | [
"MIT"
] | null | null | null | Execution_of_the_bundling_layer_3.ipynb | inpicksys/Neural_networks_and_computer_vision_SamsungResearch_Open_Education | ab1f475659096c116c4d59558faf4e253ffdc89a | [
"MIT"
] | null | null | null | 37.978903 | 183 | 0.470948 | [
[
[
"import torch\nfrom abc import ABC, abstractmethod\nimport numpy as np",
"_____no_output_____"
],
[
"def calc_out_shape(input_matrix_shape, out_channels, kernel_size, stride, padding):\n batch_size, channels_count, input_height, input_width = input_matrix_shape\n output_height = (input_height + 2 * padding - (kernel_size - 1) - 1) // stride + 1\n output_width = (input_width + 2 * padding - (kernel_size - 1) - 1) // stride + 1\n\n return batch_size, out_channels, output_height, output_width",
"_____no_output_____"
],
[
"class ABCConv2d(ABC):\n def __init__(self, in_channels, out_channels, kernel_size, stride):\n self.in_channels = in_channels\n self.out_channels = out_channels\n self.kernel_size = kernel_size\n self.stride = stride\n\n def set_kernel(self, kernel):\n self.kernel = kernel\n\n @abstractmethod\n def __call__(self, input_tensor):\n pass\n",
"_____no_output_____"
],
[
"def create_and_call_conv2d_layer(conv2d_layer_class, stride, kernel, input_matrix):\n out_channels = kernel.shape[0]\n in_channels = kernel.shape[1]\n kernel_size = kernel.shape[2]\n\n layer = conv2d_layer_class(in_channels, out_channels, kernel_size, stride)\n layer.set_kernel(kernel)\n\n return layer(input_matrix)",
"_____no_output_____"
],
[
"class Conv2d(ABCConv2d):\n def __init__(self, in_channels, out_channels, kernel_size, stride):\n self.conv2d = torch.nn.Conv2d(in_channels, out_channels, kernel_size,\n stride, padding=0, bias=False)\n\n def set_kernel(self, kernel):\n self.conv2d.weight.data = kernel\n\n def __call__(self, input_tensor):\n return self.conv2d(input_tensor)",
"_____no_output_____"
],
[
"def test_conv2d_layer(conv2d_layer_class, batch_size=2,\n input_height=4, input_width=4, stride=2):\n kernel = torch.tensor(\n [[[[0., 1, 0],\n [1, 2, 1],\n [0, 1, 0]],\n\n [[1, 2, 1],\n [0, 3, 3],\n [0, 1, 10]],\n\n [[10, 11, 12],\n [13, 14, 15],\n [16, 17, 18]]]])\n\n in_channels = kernel.shape[1]\n\n input_tensor = torch.arange(0, batch_size * in_channels *\n input_height * input_width,\n out=torch.FloatTensor()) \\\n .reshape(batch_size, in_channels, input_height, input_width)\n\n custom_conv2d_out = create_and_call_conv2d_layer(\n conv2d_layer_class, stride, kernel, input_tensor)\n conv2d_out = create_and_call_conv2d_layer(\n Conv2d, stride, kernel, input_tensor)\n return torch.allclose(custom_conv2d_out, conv2d_out) \\\n and (custom_conv2d_out.shape == conv2d_out.shape)",
"_____no_output_____"
],
[
"class Conv2dMatrixV2(ABCConv2d):\n # Функция преобразования кернела в нужный формат.\n def _convert_kernel(self):\n matrix_zeros = np.zeros((self.kernel.size()[0],self.kernel.size()[1] * self.kernel.size()[2]*self.kernel.size()[3]))\n j, k = 0, 0\n for image in self.kernel[0].numpy():\n for i in range(len(image)):\n matrix_zeros[:, k:k+self.kernel.size()[2]] = image[i]\n k += self.kernel.size()[2]\n j+=1\n converted_kernel = torch.from_numpy(matrix_zeros).float()# Реализуйте преобразование кернела.\n return converted_kernel\n\n # Функция преобразования входа в нужный формат.\n def _convert_input(self, torch_input, output_height, output_width):\n matrix_zeros = np.zeros((torch_input.size()[1] *self.kernel.size()[2]*self.kernel.size()[3], torch_input.size()[0] ))\n for i in range(matrix_zeros.shape[1]):\n j = 0\n for core in torch_input[i].numpy():\n core_reshape = core.reshape(1, core.shape[0]*core.shape[1])\n k = 0\n for g in range(self.kernel.size()[2]):\n matrix_zeros[j:j+self.kernel.size()[2], i] = core_reshape[:,k:k+self.kernel.size()[2]] # np.flip(core_reshape, 1)\n j += self.kernel.size()[2]\n k += self.kernel.size()[2]+1\n converted_input = torch.from_numpy(matrix_zeros).float() # Реализуйте преобразование входа.\n return converted_input\n\n def __call__(self, torch_input):\n batch_size, out_channels, output_height, output_width\\\n = calc_out_shape(\n input_matrix_shape=torch_input.shape,\n out_channels=self.kernel.shape[0],\n kernel_size=self.kernel.shape[2],\n stride=self.stride,\n padding=0)\n\n converted_kernel = self._convert_kernel()\n converted_input = self._convert_input(torch_input, output_height, output_width)\n\n conv2d_out_alternative_matrix_v2 = converted_kernel @ converted_input\n \n return conv2d_out_alternative_matrix_v2.transpose(0, 1).view(torch_input.shape[0], self.out_channels, output_height, output_width).transpose(1, 3).transpose(2, 3)\n\n# Проверка происходит автоматически вызовом следующего кода\n# (раскомментируйте для 
самостоятельной проверки,\n# в коде для сдачи задания должно быть закомментировано):\nprint(test_conv2d_layer(Conv2dMatrixV2))",
"test tensor([[[[ 5252.]]],\n\n\n [[[12596.]]]]) tensor([[[[ 5252.]]],\n\n\n [[[12596.]]]], grad_fn=<MkldnnConvolutionBackward>)\nTrue\n"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07ed933b298b7959aa50443954ede7eb0a1fd19 | 151,525 | ipynb | Jupyter Notebook | genomics/3-MRCA/MRCA_Estimation_Notebook.ipynb | bwlang/SARS-CoV-2 | 865ff522120af26dd51f58f288b428246b65b9a7 | [
"MIT"
] | 132 | 2020-02-13T19:08:02.000Z | 2022-01-29T22:00:23.000Z | genomics/3-MRCA/MRCA_Estimation_Notebook.ipynb | bwlang/SARS-CoV-2 | 865ff522120af26dd51f58f288b428246b65b9a7 | [
"MIT"
] | 116 | 2020-02-17T09:21:38.000Z | 2022-02-27T01:28:59.000Z | genomics/3-MRCA/MRCA_Estimation_Notebook.ipynb | bwlang/SARS-CoV-2 | 865ff522120af26dd51f58f288b428246b65b9a7 | [
"MIT"
] | 82 | 2020-02-13T15:05:48.000Z | 2022-01-29T20:58:49.000Z | 257.258065 | 77,802 | 0.893727 | [
[
[
"# MRCA estimation",
"_____no_output_____"
],
[
"-------\nYou can access your data via the dataset number. For example, ``handle = open(get(42), 'r')``.\nTo save data, write your data to a file, and then call ``put('filename.txt')``. The dataset will then be available in your galaxy history.\nNotebooks can be saved to Galaxy by clicking the large green button at the top right of the IPython interface.<br>\nMore help and informations can be found on the project [website](https://github.com/bgruening/galaxy-ipython).",
"_____no_output_____"
],
[
"## Inputs\n\n------\n\nThis notebook expects two inputs from Galaxy history:\n\n 1. a comma separated list of accession numbers and corresponding collection dates\n 2. a phylogenetic tree (in newick format) in which OTU labels correspond to accession numbers from input 1\n \nHere is an example of input 1:\n```\nAccession,Collection_Date\nMT049951,2020-01-17\nMT019531,2019-12-30\nMT019529,2019-12-23\nMN975262,2020-01-11\nMN996528,2019-12-30\nMT019532,2019-12-30\nMT019530,2019-12-30\nMN994468,2020-01-22\n```",
"_____no_output_____"
]
],
[
[
"# Set history items for datasets containing accession/dates and a maximum likelihood tree:\n# These numbers correspond to numbers of Galaxy datasets\nacc_date = 1\ntree = 116",
"_____no_output_____"
],
[
"!pip install --upgrade pip==20.0.2",
"_____no_output_____"
],
[
"!pip install --upgrade statsmodels==0.11.0",
"_____no_output_____"
],
[
"!pip install --upgrade pandas==0.24.2",
"_____no_output_____"
],
[
"from Bio import Phylo as phylo\nfrom matplotlib import pyplot as plt\nimport pandas as pd\nimport datetime\n\nimport statsmodels.api as sm\nimport statsmodels.formula.api as smf\n%matplotlib inline",
"_____no_output_____"
],
[
"# Get accessions and dates\nacc_path = get(acc_date)\n# Get ML tree\ntree_path = get(tree)",
"_____no_output_____"
],
[
"!mv {acc_path} acc_date.csv\n!mv {tree_path} tree.nwk",
"_____no_output_____"
],
[
"col_dates = pd.read_csv('acc_date.csv')",
"_____no_output_____"
],
[
"col_dates",
"_____no_output_____"
],
[
"tree = next( phylo.parse( 'tree.nwk', \"newick\" ) )",
"_____no_output_____"
],
[
"plt.rcParams['figure.figsize'] = [15, 50] \nphylo.draw( tree )",
"_____no_output_____"
],
[
"def root_to_tip( tree, date_df ):\n accum = []\n def tree_walker( clade, total_branch_length ):\n for child in clade.clades:\n if child.is_terminal:\n if child.name is not None:\n date = date_df[date_df['Accession']==child.name]['Collection_Date'].to_string(index=False)\n accum.append( ( child.name, date, total_branch_length + child.branch_length ) )\n tree_walker( child, total_branch_length + child.branch_length )\n tree_walker( tree.clade, 0 )\n return pd.DataFrame( accum, columns=[\"name\",\"date\",\"distance_to_root\"] )",
"_____no_output_____"
],
[
"for clade in list( tree.find_clades() ):\n tree.root_with_outgroup( clade )\n df = root_to_tip( tree, col_dates )\n df['date'] = pd.to_datetime(df['date']) \n df['date_as_numeric'] = [d.year + (d.dayofyear-1)/365 for d in df['date']]",
"_____no_output_____"
],
[
"plt.rcParams['figure.figsize'] = [15, 10] \ndf.plot( x=\"date\", y=\"distance_to_root\" )",
"_____no_output_____"
],
[
"df['date_as_numeric'] = [d.year + (d.dayofyear-1)/365 for d in df['date']]",
"_____no_output_____"
]
],
[
[
"## MRCA timing is ...",
"_____no_output_____"
]
],
[
[
"import datetime\ndef decimal_to_calendar (decimal):\n years = int (decimal)\n d = datetime.datetime (years, 1,1) + datetime.timedelta (days = int ((decimal-years)*365))\n return d\n\nmodel = smf.ols(formula='distance_to_root ~ date_as_numeric ', data=df)\nresults = model.fit()\nprint( results.summary() )\nprint (\"Root predicted at {}\".format(decimal_to_calendar(-results.params.Intercept/results.params.date_as_numeric)))",
" OLS Regression Results \n==============================================================================\nDep. Variable: distance_to_root R-squared: 0.056\nModel: OLS Adj. R-squared: 0.023\nMethod: Least Squares F-statistic: 1.671\nDate: Thu, 13 Feb 2020 Prob (F-statistic): 0.207\nTime: 19:40:50 Log-Likelihood: 246.10\nNo. Observations: 30 AIC: -488.2\nDf Residuals: 28 BIC: -485.4\nDf Model: 1 \nCovariance Type: nonrobust \n===================================================================================\n coef std err t P>|t| [0.025 0.975]\n-----------------------------------------------------------------------------------\nIntercept -0.9146 0.708 -1.293 0.207 -2.364 0.535\ndate_as_numeric 0.0005 0.000 1.293 0.207 -0.000 0.001\n==============================================================================\nOmnibus: 3.224 Durbin-Watson: 1.763\nProb(Omnibus): 0.199 Jarque-Bera (JB): 2.585\nSkew: 0.716 Prob(JB): 0.275\nKurtosis: 2.868 Cond. No. 1.14e+08\n==============================================================================\n\nWarnings:\n[1] Standard Errors assume that the covariance matrix of the errors is correctly specified.\n[2] The condition number is large, 1.14e+08. This might indicate that there are\nstrong multicollinearity or other numerical problems.\nRoot predicted at 2019-11-14 00:00:00\n"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07edae8d2f7d87af6b44f3ffb4df7af8975c314 | 28,412 | ipynb | Jupyter Notebook | temas/IV.optimizacion_convexa_y_machine_learning/4.6.Metodo_de_BL_para_puntos_iniciales_no_factibles_Python_2nd_version.ipynb | Danahirmt/analisis-numerico-computo-cientifico | ec42457dc8f707ce8d481cb62441d29fc0a9ab52 | [
"Apache-2.0"
] | null | null | null | temas/IV.optimizacion_convexa_y_machine_learning/4.6.Metodo_de_BL_para_puntos_iniciales_no_factibles_Python_2nd_version.ipynb | Danahirmt/analisis-numerico-computo-cientifico | ec42457dc8f707ce8d481cb62441d29fc0a9ab52 | [
"Apache-2.0"
] | null | null | null | temas/IV.optimizacion_convexa_y_machine_learning/4.6.Metodo_de_BL_para_puntos_iniciales_no_factibles_Python_2nd_version.ipynb | Danahirmt/analisis-numerico-computo-cientifico | ec42457dc8f707ce8d481cb62441d29fc0a9ab52 | [
"Apache-2.0"
] | null | null | null | 34.026347 | 1,849 | 0.538258 | [
[
[
"**Notas para contenedor de docker:**",
"_____no_output_____"
],
[
"Comando de docker para ejecución de la nota de forma local:\n\nnota: cambiar `<ruta a mi directorio>` por la ruta de directorio que se desea mapear a `/datos` dentro del contenedor de docker.\n\n```\ndocker run --rm -v <ruta a mi directorio>:/datos --name jupyterlab_numerical -p 8888:8888 -d palmoreck/jupyterlab_numerical:1.1.0\n```\n\npassword para jupyterlab: `qwerty`\n\nDetener el contenedor de docker:\n\n```\ndocker stop jupyterlab_numerical\n```\n",
"_____no_output_____"
],
[
"Documentación de la imagen de docker `palmoreck/jupyterlab_numerical:1.1.0` en [liga](https://github.com/palmoreck/dockerfiles/tree/master/jupyterlab/numerical).",
"_____no_output_____"
],
[
"---",
"_____no_output_____"
],
[
"Nota generada a partir de [liga1](https://drive.google.com/file/d/1zCIHNAxe5Shc36Qo0XjehHgwrafKSJ_t/view), [liga2](https://drive.google.com/file/d/1RMwUXEN_SOHKue-J9Cx3Ldvj9bejLjiM/view).",
"_____no_output_____"
]
],
[
[
"!pip3 install --user -q cvxpy",
"^C\n\u001b[31mERROR: Operation cancelled by user\u001b[0m\n"
],
[
"import os",
"_____no_output_____"
],
[
"cur_directory = os.getcwd()",
"_____no_output_____"
],
[
"dir_alg_python = '/algoritmos/Python'",
"_____no_output_____"
],
[
"os.chdir(cur_directory + dir_alg_python)",
"_____no_output_____"
],
[
"import math\n\nimport numpy as np\n\nfrom utils_2nd_version import compute_error\n\nfrom algorithms_for_cieco_2nd_version import path_following_method_infeasible_init_point_2nd_version, \\\n path_following_method_infeasible_init_point\n",
"_____no_output_____"
]
],
[
[
"# Está lista la implementación para puntos iniciales no factibles en $Ax=b$ pero no para la parte de las desigualdades $f_i(x) <0 \\quad \\forall i=1,\\dots,m$",
"_____no_output_____"
],
[
"# Primer ejemplo",
"_____no_output_____"
],
[
"$$ \\min \\quad x_1^2 + x_2^2 + x_3^2 + x_4^2 -2x_1-3x_4$$",
"_____no_output_____"
],
[
"$$\\text{sujeto a: } $$",
"_____no_output_____"
],
[
"$$\n\\begin{array}{c}\n2x_1 + x_2 + x_3 + 4x_4 = 7 \\\\\nx_1 + x_2 + 2x_3 + x_4 = 6\n\\end{array}\n$$",
"_____no_output_____"
],
[
"$$x_1, x_2, x_3, x_4 \\geq 0$$",
"_____no_output_____"
],
[
"## Infeasible for Ax=b",
"_____no_output_____"
]
],
[
[
"from utils_2nd_version import constraint_inequalities_funcs_eval",
"_____no_output_____"
],
[
"fo = lambda x: x[0]**2 + x[1]**2 + x[2]**2 + x[3]**2-2*x[0]-3*x[3]",
"_____no_output_____"
],
[
"const = {0: lambda x: -x[0],\n 1: lambda x: -x[1],\n 2: lambda x: -x[2],\n 3: lambda x: -x[3]\n }",
"_____no_output_____"
],
[
"A= np.array([[2,1,1,4],\n [1,1,2,1]])",
"_____no_output_____"
],
[
"b=np.array([7,6])",
"_____no_output_____"
],
[
"x_ast=np.array([1.1232876712328763,0.6506849315068493,\n 1.8287671232876714,0.5684931506849317])",
"_____no_output_____"
],
[
"x_0 = np.array([-5,-5,-5,-5],dtype=float) #with -5,-5,-5,-5 return one entry of x negative",
"_____no_output_____"
],
[
"nu_0 = np.array([0,0], dtype=float)",
"_____no_output_____"
],
[
"p_ast=fo(x_ast)",
"_____no_output_____"
],
[
"p_ast",
"_____no_output_____"
],
[
"tol_outer_iter = 1e-6\ntol=1e-8\ntol_backtracking=1e-8\nmaxiter=30\nmu=10",
"_____no_output_____"
],
[
"x = path_following_method_infeasible_init_point_2nd_version(fo, A, b,\n const,\n x_0, nu_0, tol,\n tol_backtracking = tol_backtracking, \n x_ast = x_ast, \n p_ast=p_ast, \n maxiter=maxiter,\n mu=mu, tol_outer_iter = tol_outer_iter \n )",
"Outer iterations of path following infeasible method\nMu value: 1.00e+01\nLogBarrier \tt_log_barrier\t s_infeasible\n4.00e+10\t3.24e-02\t1.50e+01\n----------------------------------------------------------------------------------------\ngfeval\n[-0.48835165 -0.42362637 -0.42362637 -0.52071427 0. ]\nHfeval\n[[0.07484601 0.0004599 0.0004599 0. 0. ]\n [0.0004599 0.07530591 0.0004599 0. 0. ]\n [0.0004599 0.0004599 0.07530591 0. 0. ]\n [0. 0. 0. 0.07484601 0. ]\n [0. 0. 0. 0. 0. ]]\nI\t||res_primal||\t||res_dual|| \tNewton Decrement\tError x_ast\tError p_ast\tline search\tCondHf\n0\t5.83e+01\t1.03e+02\t1.12e+01\t8.34e+00\t8.82e+01\t---\t\tinf\n1\t1.26e-15\t6.75e-01\t1.44e-01\t9.95e-02\t3.78e-02\t1.00e+00\tinf\n2\t8.88e-16\t6.75e-01\t1.44e-01\t9.95e-02\t3.79e-02\t6.10e-05\tinf\n3\t0.00e+00\t6.75e-01\t1.44e-01\t9.95e-02\t3.79e-02\t3.05e-05\tinf\n4\t1.26e-15\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t7.63e-06\tinf\n5\t0.00e+00\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t3.81e-06\tinf\n6\t1.26e-15\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t4.77e-07\tinf\n7\t8.88e-16\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t2.38e-07\tinf\n8\t1.99e-15\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t1.19e-07\tinf\n9\t1.99e-15\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t5.96e-08\tinf\n10\t1.78e-15\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t2.98e-08\tinf\n11\t8.88e-16\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t1.49e-08\tinf\n12\t8.88e-16\t6.75e-01\t1.44e-01\t9.96e-02\t3.79e-02\t1.86e-09\tinf\nBacktracking value less than tol_backtracking, try other initial point\nx\n[1.17048255 0.84785972 1.73077939 0.52009895]\ns\n0.0\n----------------------------------------------------------------------------------------\nLogBarrier\n-6.57e-02\n"
],
[
"x",
"_____no_output_____"
],
[
"A@x",
"_____no_output_____"
]
],
[
[
"# Comparación con [cvxpy](https://github.com/cvxgrp/cvxpy)",
"_____no_output_____"
]
],
[
[
"import cvxpy as cp",
"_____no_output_____"
],
[
"x1 = cp.Variable()\nx2 = cp.Variable()\nx3 = cp.Variable()\nx4 = cp.Variable()\n",
"_____no_output_____"
],
[
"# Create two constraints.\nconstraints = [2*x1+x2+x3+4*x4-7 == 0,x1+x2+2*x3+x4-6 == 0,x1>=0,x2>=0,x3>=0,x4>=0]\n\n# Form objective.\n\nobj = cp.Minimize(x1**2+x2**2+x3**2+x4**2-2*x1-3*x4)",
"_____no_output_____"
],
[
"# Form and solve problem.\nprob = cp.Problem(obj, constraints)\nprob.solve() # Returns the optimal value.",
"_____no_output_____"
],
[
"print(\"status:\", prob.status)\nprint(\"optimal value\", prob.value)\nprint(\"optimal var\", x1.value, x2.value, x3.value,x4.value)",
"status: optimal\noptimal value 1.4006849315068515\noptimal var 1.1232876712328763 0.6506849315068494 1.8287671232876717 0.5684931506849316\n"
]
],
[
[
"# Segundo ejemplo",
"_____no_output_____"
],
[
"$$\\min 2x_1 + 5x_2$$",
"_____no_output_____"
],
[
"$$\\text{sujeto a: }$$",
"_____no_output_____"
],
[
"$$\n\\begin{array}{c}\n6-x_1-x_2 \\leq 0 \\\\\n-18 + x_1 +2x_2 \\leq 0\\\\\nx_1, x_2 \\geq 0\n\\end{array}\n$$",
"_____no_output_____"
]
],
[
[
"fo = lambda x: 2*x[0] + 5*x[1]",
"_____no_output_____"
],
[
"const = {0: lambda x: 6-x[0]-x[1],\n 1: lambda x: -18+x[0]+2*x[1],\n 2: lambda x: -x[0],\n 3: lambda x: -x[1]\n }",
"_____no_output_____"
],
[
"A=np.array([0,0],dtype=float)\nb = 0",
"_____no_output_____"
],
[
"x_ast = np.array([6,0], dtype=float)",
"_____no_output_____"
],
[
"#x_0 = np.array([-2,-2], dtype=float)\nx_0 = np.array([5,0], dtype=float)\n#x_0 = np.array([10,-1], dtype=float)\n#x_0 = np.array([6.5,0], dtype=float)",
"_____no_output_____"
],
[
"p_ast=fo(x_ast)",
"_____no_output_____"
],
[
"p_ast",
"_____no_output_____"
],
[
"nu_0 = np.array([0,0], dtype=float)",
"_____no_output_____"
],
[
"tol_outer_iter = 1e-3\ntol=1e-8\ntol_backtracking=1e-4\nmaxiter=30\nmu=10",
"_____no_output_____"
],
[
"fo(x_0)",
"_____no_output_____"
],
[
"x = path_following_method_infeasible_init_point_2nd_version(fo, A, b,\n const,\n x_0, nu_0, tol,\n tol_backtracking, x_ast, p_ast, maxiter,\n mu, tol_outer_iter = tol_outer_iter \n )",
"Outer iterations of path following infeasible method\nMu value: 1.00e+01\nLogBarrier \tt_log_barrier\t s_infeasible\n4.00e+10\t2.00e+00\t1.10e+01\n----------------------------------------------------------------------------------------\ngfeval\n[3.87916665 9.89242436 0. ]\nHfeval\n[[0.01556835 0.01347222 0. ]\n [0.01347222 0.02522001 0. ]\n [0. 0. 0. ]]\nI\t||res_primal||\t||res_dual|| \tNewton Decrement\tError x_ast\tError p_ast\tline search\tCondHf\n0\t1.10e+01\t1.53e+01\t4.12e+03\t1.84e+00\t1.67e-01\t---\t\tinf\n1\t0.00e+00\t1.08e+01\t1.56e+06\t8.50e+01\t1.73e+02\t1.00e+00\tinf\n2\t0.00e+00\t1.08e+01\t8.17e+11\t2.61e+04\t6.53e+04\t1.00e+00\tinf\n"
],
[
"from utils import constraint_inequalities_funcs_eval",
"_____no_output_____"
],
[
"constraint_inequalities_funcs_eval(x, const)",
"_____no_output_____"
]
],
[
[
"# Comparación con [cvxpy](https://github.com/cvxgrp/cvxpy)",
"_____no_output_____"
]
],
[
[
"x1 = cp.Variable()\nx2 = cp.Variable()\n",
"_____no_output_____"
],
[
"# Create two constraints.\nconstraints = [6-x1-x2 <= 0,-18+x1+2*x2<=0,x1>=0,x2>=0]\n\n# Form objective.\n\nobj = cp.Minimize(2*x1+5*x2)\n\n",
"_____no_output_____"
],
[
"# Form and solve problem.\nprob = cp.Problem(obj, constraints)\nprob.solve() # Returns the optimal value.\n",
"_____no_output_____"
],
[
"print(\"status:\", prob.status)\nprint(\"optimal value\", prob.value)\nprint(\"optimal var\", x1.value, x2.value)",
"status: optimal\noptimal value 12.0000000016275\noptimal var 6.000000000175689 2.552244387851183e-10\n"
]
],
[
[
"**Referencias:**\n\n* S. P. Boyd, L. Vandenberghe, Convex Optimization, Cambridge University Press, 2009.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
]
] |
d07f13d77ed070b5ad7cf0811f3ebe9a4b40bdb1 | 20,524 | ipynb | Jupyter Notebook | notebooks/Pandas/Pandas Introduction.ipynb | jthielen/unidata-python-workshop | 120e9c310443274465a87780f972956b76e2acb5 | [
"MIT"
] | 1 | 2021-08-16T03:12:07.000Z | 2021-08-16T03:12:07.000Z | notebooks/Pandas/Pandas Introduction.ipynb | jthielen/unidata-python-workshop | 120e9c310443274465a87780f972956b76e2acb5 | [
"MIT"
] | null | null | null | notebooks/Pandas/Pandas Introduction.ipynb | jthielen/unidata-python-workshop | 120e9c310443274465a87780f972956b76e2acb5 | [
"MIT"
] | null | null | null | 23.112613 | 399 | 0.534204 | [
[
[
"<div style=\"width:1000 px\">\n\n<div style=\"float:right; width:98 px; height:98px;\">\n<img src=\"https://raw.githubusercontent.com/Unidata/MetPy/master/metpy/plots/_static/unidata_150x150.png\" alt=\"Unidata Logo\" style=\"height: 98px;\">\n</div>\n\n<h1>Introduction to Pandas</h1>\n<h3>Unidata Python Workshop</h3>\n\n<div style=\"clear:both\"></div>\n</div>\n\n<hr style=\"height:2px;\">\n\n\n## Overview:\n\n* **Teaching:** 30 minutes\n* **Exercises:** 30 minutes\n\n### Questions\n1. What is Pandas?\n1. What are the basic Pandas data structures?\n1. How can I read data into Pandas?\n1. What are some of the data operations available in Pandas?\n\n### Objectives\n1. <a href=\"#series\">Data Series</a>\n1. <a href=\"#frames\">Data Frames</a>\n1. <a href=\"#loading\">Loading Data in Pandas</a>\n1. <a href=\"#missing\">Missing Data</a>\n1. <a href=\"#manipulating\">Manipulating Data</a>",
"_____no_output_____"
],
[
"<a name=\"series\"></a>\n## Data Series\nData series are one of the fundamental data structures in Pandas. You can think of them like a dictionary; they have a key (index) and value (data/values) like a dictionary, but also have some handy functionality attached to them.\n\nTo start out, let's create a series from scratch. We'll imagine these are temperature observations.",
"_____no_output_____"
]
],
[
[
"from pandas import Series\ntemperatures = Series([23, 20, 25, 18])\ntemperatures",
"_____no_output_____"
]
],
[
[
"The values on the left are the index (zero based integers by default) and on the right are the values. Notice that the data type is an integer. Any NumPy datatype is acceptable in a series.\n\nThat's great, but it'd be more useful if the station were associated with those values. In fact you could say we want the values *indexed* by station name.",
"_____no_output_____"
]
],
[
[
"temperatures = Series([23, 20, 25, 18], index=['TOP', 'OUN', 'DAL', 'DEN'])\ntemperatures",
"_____no_output_____"
]
],
[
[
"Now, very similar to a dictionary, we can use the index to access and modify elements.",
"_____no_output_____"
]
],
[
[
"temperatures['DAL']",
"_____no_output_____"
],
[
"temperatures[['DAL', 'OUN']]",
"_____no_output_____"
]
],
[
[
"We can also do basic filtering, math, etc.",
"_____no_output_____"
]
],
[
[
"temperatures[temperatures > 20]",
"_____no_output_____"
],
[
"temperatures + 2",
"_____no_output_____"
]
],
[
[
"Remember how I said that series are like dictionaries? We can create a series straight from a dictionary.",
"_____no_output_____"
]
],
[
[
"dps = {'TOP': 14,\n 'OUN': 18,\n 'DEN': 9,\n 'PHX': 11,\n 'DAL': 23}\n\ndewpoints = Series(dps)\ndewpoints",
"_____no_output_____"
]
],
[
[
"It's also easy to check and see if an index exists in a given series:",
"_____no_output_____"
]
],
[
[
"'PHX' in dewpoints",
"_____no_output_____"
],
[
"'PHX' in temperatures",
"_____no_output_____"
]
],
[
[
"Series have a name attribute and their index has a name attribute.",
"_____no_output_____"
]
],
[
[
"temperatures.name = 'temperature'\ntemperatures.index.name = 'station'",
"_____no_output_____"
],
[
"temperatures",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n <b>EXERCISE</b>:\n <ul>\n <li>Create a series of pressures for stations TOP, OUN, DEN, and DAL (assign any values you like).</li>\n <li>Set the series name and series index name.</li>\n <li>Print the pressures for all stations which have a dewpoint below 15.</li>\n </ul>\n</div>",
"_____no_output_____"
]
],
[
[
"# Your code goes here\n",
"_____no_output_____"
]
],
[
[
"<button data-toggle=\"collapse\" data-target=\"#sol1\" class='btn btn-primary'>View Solution</button>\n<div id=\"sol1\" class=\"collapse\">\n<code><pre>\npressures = Series([1012.1, 1010.6, 1008.8, 1011.2], index=['TOP', 'OUN', 'DEN', 'DAL'])\npressures.name = 'pressure'\npressures.index.name = 'station'\nprint(pressures[dewpoints < 15])\n</pre></code>\n</div>",
"_____no_output_____"
],
[
"<a href=\"#top\">Top</a>\n<hr style=\"height:2px;\">",
"_____no_output_____"
],
[
"<a name=\"frames\"></a>\n## Data Frames\nSeries are great, but what about a bunch of related series? Something like a table or a spreadsheet? Enter the data frame. A data frame can be thought of as a dictionary of data series. They have indexes for their rows and their columns. Each data series can be of a different type, but they will all share a common index.\n\nThe easiest way to create a data frame by hand is to use a dictionary.",
"_____no_output_____"
]
],
[
[
"from pandas import DataFrame\n\ndata = {'station': ['TOP', 'OUN', 'DEN', 'DAL'],\n 'temperature': [23, 20, 25, 18],\n 'dewpoint': [14, 18, 9, 23]}\n\ndf = DataFrame(data)\ndf",
"_____no_output_____"
]
],
[
[
"You can access columns (data series) using dictionary type notation or attribute type notation.",
"_____no_output_____"
]
],
[
[
"df['temperature']",
"_____no_output_____"
],
[
"df.dewpoint",
"_____no_output_____"
]
],
[
[
"Notice the index is shared and that the name of the column is attached as the series name.\n\nYou can also create a new column and assign values. If I only pass a scalar it is duplicated.",
"_____no_output_____"
]
],
[
[
"df['wspeed'] = 0.\ndf",
"_____no_output_____"
]
],
[
[
"Let's set the index to be the station.",
"_____no_output_____"
]
],
[
[
"df.index = df.station\ndf",
"_____no_output_____"
]
],
[
[
"Well, that's close, but we now have a redundant column, so let's get rid of it.",
"_____no_output_____"
]
],
[
[
"df.drop('station', 1, inplace=True)\ndf",
"_____no_output_____"
]
],
[
[
"Now let's get a row from the dataframe instead of a column.",
"_____no_output_____"
]
],
[
[
"df.loc['DEN']",
"_____no_output_____"
]
],
[
[
"We can even transpose the data easily if we needed that to make things easier to merge/munge later.",
"_____no_output_____"
]
],
[
[
"df.T",
"_____no_output_____"
]
],
[
[
"Look at the `values` attribute to access the data as a 1D or 2D array for series and data frames respectively.",
"_____no_output_____"
]
],
[
[
"df.values",
"_____no_output_____"
],
[
"df.temperature.values",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n <b>EXERCISE</b>:\n <ul>\n <li>Add a series of rain observations to the existing data frame.</li>\n <li>Apply an instrument correction of -2 to the dewpoint observations.</li>\n </ul>\n</div>",
"_____no_output_____"
]
],
[
[
"# Your code goes here\n",
"_____no_output_____"
]
],
[
[
"<button data-toggle=\"collapse\" data-target=\"#sol2\" class='btn btn-primary'>View Solution</button>\n<div id=\"sol2\" class=\"collapse\">\n<code><pre>\ndf['rain'] = [0, 0.4, 0.2, 0]\ndf.dewpoint = df.dewpoint - 2\ndf\n</pre></code>\n</div>",
"_____no_output_____"
],
[
"<a href=\"#top\">Top</a>\n<hr style=\"height:2px;\">",
"_____no_output_____"
],
[
"<a name=\"loading\"></a>\n## Loading Data in Pandas\nThe real power of pandas is in manipulating and summarizing large sets of tabular data. To do that, we'll need a large set of tabular data. We've included a file in this directory called `JAN17_CO_ASOS.txt` that has all of the ASOS observations for several stations in Colorado for January of 2017. It's a few hundred thousand rows of data in a tab delimited format. Let's load it into Pandas.",
"_____no_output_____"
]
],
[
[
"import pandas as pd",
"_____no_output_____"
],
[
"df = pd.read_table('Jan17_CO_ASOS.txt')",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
],
[
"df = pd.read_table('Jan17_CO_ASOS.txt', parse_dates=['valid'])",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
],
[
"df = pd.read_table('Jan17_CO_ASOS.txt', parse_dates=['valid'], na_values='M')",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
]
],
[
[
"Let's look in detail at those column names. Turns out we need to do some cleaning of this file. Welcome to real world data analysis.",
"_____no_output_____"
]
],
[
[
"df.columns",
"_____no_output_____"
],
[
"df.columns = ['station', 'time', 'temperature', 'dewpoint', 'pressure']",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
]
],
[
[
"For other formats of data (CSV, fixed width, etc.) there are tools to read them as well. You can even read Excel files straight into Pandas.",
"_____no_output_____"
],
[
"<a href=\"#top\">Top</a>\n<hr style=\"height:2px;\">",
"_____no_output_____"
],
[
"<a name=\"missing\"></a>\n## Missing Data\nWe've already dealt with some missing data by turning the 'M' string into actual NaNs while reading the file in. We can do one better though and delete any rows that have all values missing. There are similar operations that could be performed for columns. You can even drop rows if any values are missing, all are missing, or just those you specify are missing.",
"_____no_output_____"
]
],
[
[
"len(df)",
"_____no_output_____"
],
[
"df.dropna(axis=0, how='all', subset=['temperature', 'dewpoint', 'pressure'], inplace=True)",
"_____no_output_____"
],
[
"len(df)",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n    <b>EXERCISE</b>:\n     Create a new data frame called df2 that contains only the rows that have temperature, dewpoint and pressure observations.\n</div>",
"_____no_output_____"
]
],
[
[
"# Your code goes here\n# df2 = ",
"_____no_output_____"
]
],
[
[
"<button data-toggle=\"collapse\" data-target=\"#sol3\" class='btn btn-primary'>View Solution</button>\n<div id=\"sol3\" class=\"collapse\">\n<code><pre>\ndf2 = df.dropna(how='any')\ndf2\n</pre></code>\n</div>",
"_____no_output_____"
],
[
"Lastly, we still have the original index values. Let's reindex to a new zero-based index for only the rows that have valid data in them.",
"_____no_output_____"
]
],
[
[
"df.reset_index(drop=True, inplace=True)",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
]
],
[
[
"<a href=\"#top\">Top</a>\n<hr style=\"height:2px;\">",
"_____no_output_____"
],
[
"<a name=\"manipulating\"></a>\n## Manipulating Data\nWe can now take our data and do some interesting things with it. Let's start with a simple min/max.",
"_____no_output_____"
]
],
[
[
"print('Min: {}\\nMax: {}'.format(df.temperature.min(), df.temperature.max()))",
"_____no_output_____"
]
],
[
[
"You can also do some useful statistics on data with attached methods like corr for correlation coefficient.",
"_____no_output_____"
]
],
[
[
"df.temperature.corr(df.dewpoint)",
"_____no_output_____"
]
],
[
[
"We can also call a `groupby` on the data frame to start getting some summary information for each station.",
"_____no_output_____"
]
],
[
[
"df.groupby('station').mean()",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n <b>EXERCISE</b>:\n Calculate the min, max, and standard deviation of the temperature field grouped by each station.\n</div>",
"_____no_output_____"
]
],
[
[
"# Calculate min\n",
"_____no_output_____"
],
[
"# Calculate max\n",
"_____no_output_____"
],
[
"# Calculate standard deviation\n",
"_____no_output_____"
]
],
[
[
"<button data-toggle=\"collapse\" data-target=\"#sol4\" class='btn btn-primary'>View Solution</button>\n<div id=\"sol4\" class=\"collapse\">\n<code><pre>\nprint(df.groupby('station').min())\nprint(df.groupby('station').max())\nprint(df.groupby('station').std())\n</pre></code>\n</div>",
"_____no_output_____"
],
[
"Now, let me show you how to do all of that and more in a single call.",
"_____no_output_____"
]
],
[
[
"df.groupby('station').describe()",
"_____no_output_____"
]
],
[
[
"Now let's suppose we're going to make a meteogram or similar and want to get all of the data for a single station.",
"_____no_output_____"
]
],
[
[
"df.groupby('station').get_group('0CO').head().reset_index(drop=True)",
"_____no_output_____"
]
],
[
[
"<div class=\"alert alert-success\">\n    <b>EXERCISE</b>:\n     <ul>\n         <li>Round the temperature column to whole degrees.</li>\n         <li>Group the observations by temperature and use the count method to see how many instances of the rounded temperatures there are in the dataset.</li>\n</div>",
"_____no_output_____"
]
],
[
[
"# Your code goes here\n",
"_____no_output_____"
]
],
[
[
"<button data-toggle=\"collapse\" data-target=\"#sol5\" class='btn btn-primary'>View Solution</button>\n<div id=\"sol5\" class=\"collapse\">\n<code><pre>\ndf.temperature = df.temperature.round()\ndf.groupby('temperature').count()\n</pre></code>\n</div>",
"_____no_output_____"
],
[
"<a href=\"#top\">Top</a>\n<hr style=\"height:2px;\">",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
d07f15df2543327083452b89fb9de2a3d7cfc661 | 7,238 | ipynb | Jupyter Notebook | notebooks/test_train_val-split with class_weights.ipynb | skyimager/charni | bdce26811c9e05791d6713c3d40e06d37f2839ba | [
"MIT"
] | null | null | null | notebooks/test_train_val-split with class_weights.ipynb | skyimager/charni | bdce26811c9e05791d6713c3d40e06d37f2839ba | [
"MIT"
] | null | null | null | notebooks/test_train_val-split with class_weights.ipynb | skyimager/charni | bdce26811c9e05791d6713c3d40e06d37f2839ba | [
"MIT"
] | null | null | null | 27.007463 | 95 | 0.477894 | [
[
[
"## Generate test.txt, train.txt, validation.txt and class_weights to be used in training",
"_____no_output_____"
]
],
[
[
"%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"import os\nos.getcwd()",
"_____no_output_____"
],
[
"import sys, glob, shutil\nos.chdir(os.path.dirname(os.getcwd()))\nos.getcwd()",
"_____no_output_____"
]
],
[
[
"## Input params",
"_____no_output_____"
]
],
[
[
"import fnmatch\nimport numpy as np\nimport pickle\n\nbase = \"data/English\"\ndatasets = [\"GoodImg\",\"BadImg\"]\n\nconfig = {\"GoodImg\":{\"train\":0.8,\n \"validation\":0.10,\n \"test\":0.10},\n \"BadImg\":{\"train\":0.8,\n \"validation\":0.10,\n \"test\":0.10}\n }\n\nphases = [\"train\", \"test\", \"validation\"]",
"_____no_output_____"
],
[
"for dataset in datasets:\n \n if os.path.exists(os.path.join(base, dataset, \"train.txt\")):\n os.remove(os.path.join(base, dataset, \"train.txt\"))\n\n if os.path.exists(os.path.join(base, dataset, \"validation.txt\")):\n os.remove(os.path.join(base, dataset, \"validation.txt\"))\n\n if os.path.exists(os.path.join(base, dataset, \"test.txt\")):\n os.remove(os.path.join(base, dataset, \"test.txt\"))\n \n if os.path.exists(os.path.join(base, dataset, dataset+\".pkl\")):\n os.remove(os.path.join(base, dataset, dataset+\".pkl\"))\n\n files_per_klass = []\n no_of_files_per_klass = []\n class_weights = {}\n\n for klass in os.listdir(os.path.join(base, dataset)):\n if not klass.startswith(\".\"):\n dire = os.path.join(base, dataset, klass)\n files = []\n\n #glob recursive doesn't work below 3.5\n for root, dirnames, filenames in os.walk(dire):\n for filename in fnmatch.filter(filenames, '*.png'):\n files.append(os.path.join(root, filename))\n\n files_per_klass.append((klass, files))\n no_of_files_per_klass.append(len(files))\n \n maxo = max(no_of_files_per_klass)\n train_per = config[dataset][\"train\"] #no of files in training per class\n val_per = config[dataset][\"validation\"] #no of files in validation per class\n \n for klass, files in files_per_klass:\n \n images = {}\n train_size = int(train_per*len(files))\n val_size = int(val_per*len(files))\n \n images[\"train\"] = files[:train_size]\n images[\"validation\"] = files[train_size:(train_size+val_size)]\n images[\"test\"] = files[(train_size+val_size):]\n \n label = int(klass[-2:]) - 1\n class_weights[label] = maxo//len(files)\n\n for phase in phases:\n with open('{}/{}.txt'.format(os.path.join(base,dataset),phase), 'a') as f:\n for image in images[phase]:\n image = os.path.relpath(image, os.path.join(base,dataset))\n f.write('{} {}\\n'.format(image, label))\n \n with open('{}/{}.pkl'.format(os.path.join(base,dataset),dataset), 'wb') as f:\n pickle.dump(class_weights, f, pickle.HIGHEST_PROTOCOL)\n \n for phase in phases:\n content = []\n with open('{}/{}.txt'.format(os.path.join(base,dataset),phase), 'r') as f:\n content = f.readlines()\n np.random.shuffle(content)\n print(dataset, \" \", phase, \" \", len(content))\n\n with open('{}/{}.txt'.format(os.path.join(base,dataset), phase), 'w') as f:\n for line in content:\n f.write(line)\n \n print(\"file shuffling completed for {} \\n\".format(dataset))",
"GoodImg train 6140\nGoodImg test 824\nGoodImg validation 741\nfile shuffling completed for GoodImg \n\nBadImg train 3814\nBadImg test 531\nBadImg validation 453\nfile shuffling completed for BadImg \n\n"
]
],
[
[
"### Just checking the generated class_weights",
"_____no_output_____"
]
],
[
[
"f = open('data/English/GoodImg/GoodImg.pkl', 'rb')\nclass_weight = pickle.load(f)\nclass_weight[0]",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07f20a98fadf41c7c7fcab7ee20ed6d8ea3b26b | 288,145 | ipynb | Jupyter Notebook | projet2/projet2-solution.ipynb | tiombo/Projet-Final | 0964e042599bb41d2c523a3f05075775449b7d14 | [
"MIT"
] | null | null | null | projet2/projet2-solution.ipynb | tiombo/Projet-Final | 0964e042599bb41d2c523a3f05075775449b7d14 | [
"MIT"
] | 3 | 2020-10-27T22:31:01.000Z | 2021-08-23T20:43:45.000Z | projet2/projet2-solution.ipynb | tiombo/Projet-Final | 0964e042599bb41d2c523a3f05075775449b7d14 | [
"MIT"
] | null | null | null | 127.104102 | 62,480 | 0.850995 | [
[
[
"420-A52-SF - Supervised Learning Algorithms - Winter 2020 - Technical Specialization in Artificial Intelligence<br/>\nMIT License - Copyright (c) 2020 Mikaël Swawola\n<br/>\n\n<br/>",
"_____no_output_____"
]
],
[
[
"%reload_ext autoreload\n%autoreload 2\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"## 0 - Importing the libraries",
"_____no_output_____"
]
],
[
[
"from datetime import datetime\nfrom tqdm import tqdm\nfrom collections import defaultdict\n\nimport numpy as np\nimport pandas as pd\n\nfrom sklearn.model_selection import cross_val_score\nfrom sklearn.metrics import roc_auc_score, log_loss, f1_score, roc_curve, confusion_matrix\n\nfrom sklearn.preprocessing import StandardScaler\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.utils import resample\n\nfrom sklearn.utils.fixes import loguniform\nfrom scipy.stats import randint\nfrom sklearn.model_selection import RandomizedSearchCV\nfrom sklearn.model_selection import GridSearchCV\n\n\nfrom sklearn.dummy import DummyClassifier\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.neighbors import KNeighborsClassifier\nfrom sklearn.tree import DecisionTreeClassifier\nfrom sklearn.ensemble import BaggingClassifier\nfrom sklearn.ensemble import RandomForestClassifier\nfrom sklearn.ensemble import GradientBoostingClassifier\n\nimport xgboost as xgb\n\nfrom helpers import convertDir2Deg, plot_confusion_matrix",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt\nimport seaborn as sns\n\n# Visualization settings\nsns.set(style=\"darkgrid\")\nsns.set_context(\"notebook\", font_scale=1.5, rc={\"lines.linewidth\": 2.5, })\nsns.set(rc={'figure.figsize':(11.7,8.27)})",
"_____no_output_____"
]
],
[
[
"## 1 - Loading and brief exploration of the data",
"_____no_output_____"
]
],
[
[
"AUS = pd.read_csv('AUS_train.csv', index_col=['Date'], parse_dates=True)",
"_____no_output_____"
],
[
"AUS.head()",
"_____no_output_____"
],
[
"AUS = AUS.drop(columns=['Unnamed: 0'])",
"_____no_output_____"
],
[
"AUS.columns",
"_____no_output_____"
]
],
[
[
"## Date",
"_____no_output_____"
]
],
[
[
"AUS['Year'] = AUS.index.year\nAUS['Month'] = AUS.index.month",
"_____no_output_____"
]
],
[
[
"### Location",
"_____no_output_____"
]
],
[
[
"AUS['Location'].unique()",
"_____no_output_____"
],
[
"AUS['Location'].value_counts().plot(kind='barh')",
"_____no_output_____"
],
[
"AUS = pd.get_dummies(AUS, columns = ['Location'], prefix=\"loc\", drop_first=True)",
"_____no_output_____"
],
[
"AUS.columns",
"_____no_output_____"
]
],
[
[
"## Wind direction",
"_____no_output_____"
]
],
[
[
"AUS['WindGustDir'] = AUS['WindGustDir'].apply(lambda d : convertDir2Deg(d))\nAUS['WindDir9am'] = AUS['WindDir9am'].apply(lambda d : convertDir2Deg(d))\nAUS['WindDir3pm'] = AUS['WindDir3pm'].apply(lambda d : convertDir2Deg(d))",
"_____no_output_____"
],
[
"AUS['Wind1Cos'] = np.cos(AUS['WindGustDir']*2*np.pi/360)\nAUS['Wind1Sin'] = np.sin(AUS['WindGustDir']*2*np.pi/360)\nAUS['Wind2Cos'] = np.cos(AUS['WindDir9am']*2*np.pi/360)\nAUS['Wind2Sin'] = np.sin(AUS['WindDir9am']*2*np.pi/360)\nAUS['Wind3Cos'] = np.cos(AUS['WindDir3pm']*2*np.pi/360)\nAUS['Wind3Sin'] = np.sin(AUS['WindDir3pm']*2*np.pi/360)",
"_____no_output_____"
],
[
"AUS = AUS.drop(columns=['WindGustDir','WindDir9am','WindDir3pm'])",
"_____no_output_____"
]
],
[
[
"#### Checking the proportion of positive (Rain) and negative (No rain) classes",
"_____no_output_____"
]
],
[
[
"AUS['RainTomorrow'].value_counts().plot(kind='bar')",
"_____no_output_____"
],
[
"AUS['RainTomorrow'] = (AUS['RainTomorrow'] == 'Yes').astype(int)\nAUS['RainToday'] = (AUS['RainToday'] == 'Yes').astype(int)",
"_____no_output_____"
],
[
"y = AUS[['RainTomorrow']].values.ravel() # ravel avoids an annoying warning ...\n\nAUS = AUS.drop(columns=['RainTomorrow'])\nX = AUS.values\n\nm = len(y)",
"_____no_output_____"
]
],
[
[
"## 3 - Subsampling the dataset",
"_____no_output_____"
],
[
"Since the dataset is large, we will start this supervised learning study with only 20% of the data",
"_____no_output_____"
]
],
[
[
"X_sub, y_sub = resample(X, y, n_samples=int(0.2*m), stratify=y, random_state=2020) # n_samples must be an int",
"_____no_output_____"
]
],
[
[
"## 4 - Baseline model",
"_____no_output_____"
]
],
[
[
"clf_dummy = DummyClassifier(strategy=\"most_frequent\").fit(X_sub, y_sub)\ndummy_score = log_loss(y_sub, clf_dummy.predict_proba(X_sub)[:,1])",
"_____no_output_____"
],
[
"history = {}\nhistory['Baseline'] = dummy_score\nhistory",
"_____no_output_____"
]
],
[
[
"## 5 - Logistic regression",
"_____no_output_____"
],
[
"[class sklearn.linear_model.LogisticRegression(penalty='l2', dual=False, tol=0.0001, C=1.0, fit_intercept=True, intercept_scaling=1, class_weight=None, random_state=None, solver='lbfgs', max_iter=100, multi_class='auto', verbose=0, warm_start=False, n_jobs=None, l1_ratio=None)](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html)",
"_____no_output_____"
]
],
[
[
"# Standardisation\nscaler = StandardScaler().fit(X_sub)\nX_sub_scale = scaler.transform(X_sub)",
"_____no_output_____"
],
[
"# Search grid\nparameters = {'C':[0.01, 0.1, 1],\n 'l1_ratio':[0, 0.1, 0.2, 0.3, 0.4],\n 'penalty': ['none', 'elasticnet']}\n\n# Model\nclf_logreg = LogisticRegression(max_iter=10000,\n solver='saga',\n random_state=2020)\n\n# Grid search with cross-validation\nclf_logreg_grid = GridSearchCV(clf_logreg, parameters, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)",
"_____no_output_____"
],
[
"clf_logreg_grid.fit(X_sub_scale, y_sub)",
"Fitting 5 folds for each of 30 candidates, totalling 150 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_logreg_grid.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_logreg_grid.best_score_}')",
"Meilleurs paramètres: {'C': 0.1, 'l1_ratio': 0.2, 'penalty': 'elasticnet'}\nMeilleur score (mean CV): -0.3604876603983897\n"
],
[
"history['Logistic regression'] = -clf_logreg_grid.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 6 - K nearest neighbors",
"_____no_output_____"
],
[
"[class sklearn.neighbors.KNeighborsClassifier(n_neighbors=5, weights='uniform', algorithm='auto', leaf_size=30, p=2, metric='minkowski', metric_params=None, n_jobs=None, **kwargs)](https://scikit-learn.org/stable/modules/generated/sklearn.neighbors.KNeighborsClassifier.html)",
"_____no_output_____"
]
],
[
[
"# Search grid\nparameters = {'n_neighbors':[75, 100, 125, 150],\n 'p':[1, 2],\n 'weights':['uniform', 'distance']}\n\n# Model\nclf_knn = KNeighborsClassifier()\n\n# Grid search with cross-validation\nclf_knn_grid = GridSearchCV(clf_knn, parameters, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)",
"_____no_output_____"
],
[
"clf_knn_grid.fit(X_sub, y_sub)",
"Fitting 5 folds for each of 16 candidates, totalling 80 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_knn_grid.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_knn_grid.best_score_}')",
"Meilleurs paramètres: {'n_neighbors': 150, 'p': 1, 'weights': 'distance'}\nMeilleur score (mean CV): -0.32224499430471\n"
],
[
"history['KNN'] = -clf_knn_grid.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 7 - Decision trees",
"_____no_output_____"
],
[
"[class sklearn.tree.DecisionTreeClassifier(criterion='gini', splitter='best', max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features=None, random_state=None, max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, class_weight=None, presort='deprecated', ccp_alpha=0.0)](https://scikit-learn.org/stable/modules/generated/sklearn.tree.DecisionTreeClassifier.html)",
"_____no_output_____"
]
],
[
[
"# Hyperparameter distributions\ndistributions = dict(\n criterion=['gini', 'entropy'],\n ccp_alpha=loguniform(1e-4, 1e3),\n max_depth=randint(2, 128))\n\n# Model\nclf_tree = DecisionTreeClassifier(random_state=2020)\n\n \n# Randomized search with cross-validation\nclf_tree_rnd = RandomizedSearchCV(clf_tree, distributions, n_iter=1000, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1, random_state=2020)",
"_____no_output_____"
],
[
"clf_tree_rnd.fit(X_sub, y_sub)",
"Fitting 5 folds for each of 1000 candidates, totalling 5000 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_tree_rnd.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_tree_rnd.best_score_}')",
"Meilleurs paramètres: {'ccp_alpha': 0.001012142152271772, 'criterion': 'entropy', 'max_depth': 6}\nMeilleur score (mean CV): -0.39106069011934386\n"
],
[
"history['Decision Tree'] = -clf_tree_rnd.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 8 - Bagging (decision trees)",
"_____no_output_____"
],
[
"[class sklearn.ensemble.BaggingClassifier(base_estimator=None, n_estimators=10, max_samples=1.0, max_features=1.0, bootstrap=True, bootstrap_features=False, oob_score=False, warm_start=False, n_jobs=None, random_state=None, verbose=0)](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.BaggingClassifier.html)",
"_____no_output_____"
]
],
[
[
"clf_bag = BaggingClassifier(base_estimator=clf_tree_rnd.best_estimator_, n_estimators=1000, verbose=1, n_jobs=-1, random_state=2020)",
"_____no_output_____"
],
[
"clf_bag.fit(X_sub, y_sub)",
"[Parallel(n_jobs=8)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=8)]: Done 2 out of 8 | elapsed: 25.2s remaining: 1.3min\n[Parallel(n_jobs=8)]: Done 8 out of 8 | elapsed: 25.6s finished\n"
],
[
"# Cross-validation score\ncv_score = cross_val_score(clf_bag, X_sub, y_sub, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1).mean()",
"[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 2 out of 5 | elapsed: 1.5min remaining: 2.3min\n[Parallel(n_jobs=-1)]: Done 5 out of 5 | elapsed: 1.5min finished\n"
],
[
"print(f'Score (mean CV): {cv_score}')",
"Score (mean CV): -0.3671654438449209\n"
],
[
"history['Bagging'] = -cv_score\nhistory",
"_____no_output_____"
]
],
[
[
"## 9 - Random forests",
"_____no_output_____"
],
[
"[class sklearn.ensemble.RandomForestClassifier(n_estimators=100, criterion='gini', max_depth=None, min_samples_split=2, min_samples_leaf=1, min_weight_fraction_leaf=0.0, max_features='auto', max_leaf_nodes=None, min_impurity_decrease=0.0, min_impurity_split=None, bootstrap=True, oob_score=False, n_jobs=None, random_state=None, verbose=0, warm_start=False, class_weight=None, ccp_alpha=0.0, max_samples=None)](https://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html)",
"_____no_output_____"
]
],
[
[
"# Search grid\nparameters = {'ccp_alpha': [1e-3, 1e-2, 1e-1, 1],\n 'criterion':['gini','entropy'],\n 'max_features': [None, 'log2', 'sqrt']}\n\n# Model\nclf_rf = RandomForestClassifier(n_estimators=100, random_state=2020)\n\n# Grid search with cross-validation\nclf_rf_grid = GridSearchCV(clf_rf, parameters, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)",
"_____no_output_____"
],
[
"clf_rf_grid.fit(X_sub, y_sub)",
"Fitting 5 folds for each of 24 candidates, totalling 120 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_rf_grid.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_rf_grid.best_score_}')",
"Meilleurs paramètres: {'ccp_alpha': 0.001, 'criterion': 'entropy', 'max_features': None}\nMeilleur score (mean CV): -0.35768899900641327\n"
],
[
"history['Random Forests'] = -clf_rf_grid.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 10 - Gradient Boosting",
"_____no_output_____"
]
],
[
[
"# Search grid\nparameters = {\n 'learning_rate': [0.01, 0.1, 1],\n 'max_features': ['sqrt', None],\n 'loss': ['deviance', 'exponential'],\n 'ccp_alpha': [1e-5, 1e-4, 1e-3]}\n\n# Model\nclf_gb = GradientBoostingClassifier(n_estimators=100, random_state=2020)\n\n# Grid search with cross-validation\nclf_gb_grid = GridSearchCV(clf_gb, parameters, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)",
"_____no_output_____"
],
[
"clf_gb_grid.fit(X_sub, y_sub)",
"Fitting 5 folds for each of 36 candidates, totalling 180 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_gb_grid.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_gb_grid.best_score_}')",
"Meilleurs paramètres: {'ccp_alpha': 0.0001, 'learning_rate': 0.1, 'loss': 'exponential', 'max_features': None}\nMeilleur score (mean CV): -0.3520269128924259\n"
],
[
"history['Gradient Boosting'] = -clf_gb_grid.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 11 - XGBoost",
"_____no_output_____"
]
],
[
[
"# Search grid\nparameters = {\n 'learning_rate': [0.001, 0.01, 0.1],\n 'reg_alpha': [1e-4, 1e-3, 1e-2],\n 'reg_lambda': [1e-4, 1e-3, 1e-2]}\n\n# Model\nclf_xgb = xgb.XGBClassifier(objective='binary:logistic',\n colsample_bytree=0.3,\n max_depth=30,\n n_estimators=100,\n random_state=2020)\n\n\n# Grid search with cross-validation\nclf_xgb_grid = GridSearchCV(clf_xgb, parameters, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)",
"_____no_output_____"
],
[
"clf_xgb_grid.fit(X_sub, y_sub)",
"Fitting 5 folds for each of 27 candidates, totalling 135 fits\n"
],
[
"print(f'Meilleurs paramètres: {clf_xgb_grid.best_params_}')\nprint(f'Meilleur score (mean CV): {clf_xgb_grid.best_score_}')",
"Meilleurs paramètres: {'learning_rate': 0.1, 'reg_alpha': 0.001, 'reg_lambda': 0.01}\nMeilleur score (mean CV): -0.350077011659007\n"
],
[
"history['XGBoost'] = -clf_xgb_grid.best_score_\nhistory",
"_____no_output_____"
]
],
[
[
"## 12 - Learning curves for the best model",
"_____no_output_____"
],
[
"### XGBoost",
"_____no_output_____"
]
],
[
[
"lcurves = defaultdict(list)\n\nfor p in tqdm([0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5, 0.55, 0.6, 0.65, 0.7]):\n X_, y_ = resample(X, y, n_samples=int(p*m), stratify=y, random_state=2020)\n Xt, Xv, yt, yv = train_test_split(X_, y_, test_size=0.3, stratify=y_, random_state=2020)\n \n clf_xgb_grid.best_estimator_.fit(Xt, yt)\n \n lcurves['Train'].append(log_loss(yt, clf_xgb_grid.predict_proba(Xt)[:,1]))\n lcurves['Val'].append(log_loss(yv, clf_xgb_grid.predict_proba(Xv)[:,1]))",
"100%|██████████| 11/11 [00:56<00:00, 5.12s/it]\n"
]
],
[
[
"#### Plotting the learning curves",
"_____no_output_____"
]
],
[
[
"plt.plot(lcurves['Train'], label=\"Train\")\nplt.plot(lcurves['Val'], label=\"Validation\")\nplt.legend()",
"_____no_output_____"
]
],
[
[
"### Gradient Boosting",
"_____no_output_____"
]
],
[
[
"lcurves = defaultdict(list)\n\nfor p in tqdm([0.2, 0.25, 0.3, 0.35, 0.4, 0.45, 0.5, 0.55, 0.6, 0.65, 0.7]):\n X_, y_ = resample(X, y, n_samples=int(p*m), stratify=y, random_state=2020)\n Xt, Xv, yt, yv = train_test_split(X_, y_, test_size=0.3, stratify=y_, random_state=2020)\n \n clf_gb_grid.best_estimator_.fit(Xt, yt)\n \n lcurves['Train'].append(log_loss(yt, clf_gb_grid.predict_proba(Xt)[:,1]))\n lcurves['Val'].append(log_loss(yv, clf_gb_grid.predict_proba(Xv)[:,1]))",
"100%|██████████| 11/11 [02:23<00:00, 13.03s/it]\n"
],
[
"plt.plot(lcurves['Train'], label=\"Train\")\nplt.plot(lcurves['Val'], label=\"Validation\")\nplt.legend()",
"_____no_output_____"
]
],
[
[
"## 13 - Retraining the best model with the best hyperparameters",
"_____no_output_____"
]
],
[
[
"clf_best = GradientBoostingClassifier(\n n_estimators=100,\n ccp_alpha=0.0001,\n learning_rate=0.1,\n loss='exponential',\n max_features= None,\n random_state=2020)\n\nclf_best.fit(X, y)\n\ncv_score = cross_val_score(clf_best, X, y, cv=5, scoring=\"neg_log_loss\", verbose=1, n_jobs=-1)\ncv_score.mean()",
"[Parallel(n_jobs=-1)]: Using backend LokyBackend with 8 concurrent workers.\n[Parallel(n_jobs=-1)]: Done 2 out of 5 | elapsed: 51.5s remaining: 1.3min\n[Parallel(n_jobs=-1)]: Done 5 out of 5 | elapsed: 1.1min finished\n"
]
],
[
[
"## 14 - Metrics",
"_____no_output_____"
],
[
"#### Predictions",
"_____no_output_____"
]
],
[
[
"y_train_pred_proba_best = clf_best.predict_proba(X)[:,1]",
"_____no_output_____"
]
],
[
[
"#### Area under the curve",
"_____no_output_____"
]
],
[
[
"print(f'AUC = {roc_auc_score(y, y_train_pred_proba_best)}')",
"AUC = 0.8793425672328384\n"
]
],
[
[
"#### ROC curve",
"_____no_output_____"
]
],
[
[
"fpr_rf, tpr_rf, thresholds = roc_curve(y, y_train_pred_proba_best)\n\nfig = plt.figure(4, figsize=(6, 6))\n\nplt.plot([0, 1], [0, 1], 'k--')\nplt.plot(fpr_rf, tpr_rf, label='Meilleur modèle')\nplt.xlabel('False positive rate')\nplt.ylabel('True positive rate')\nplt.title('ROC curve')\nplt.show()",
"_____no_output_____"
]
],
[
[
"#### Finding the best threshold",
"_____no_output_____"
]
],
[
[
"selected_threshold = thresholds[np.argmax(-fpr_rf + tpr_rf)]\nselected_threshold",
"_____no_output_____"
]
],
[
[
"#### F1 score",
"_____no_output_____"
]
],
[
[
"f1_score(y, y_train_pred_proba_best > selected_threshold)",
"_____no_output_____"
]
],
[
[
"#### Confusion matrix",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(3, figsize=(6, 6))\n\ncnf_matrix = confusion_matrix(y, y_train_pred_proba_best > selected_threshold)\nnp.set_printoptions(precision=2)\nplot_confusion_matrix(cnf_matrix, classes=['0','1'], title='Matrice de confusion')",
"Confusion matrix, without normalization\n"
],
[
"# Accuracy\n(61331+17755)/(61331+17755+15846+4604)",
"_____no_output_____"
]
],
[
[
"## 15 - Predictions on the test set",
"_____no_output_____"
],
[
"#### Apply the same transformations as for the training set",
"_____no_output_____"
]
],
[
[
"AUS_test = pd.read_csv('AUS_test.csv', index_col=['Date'], parse_dates=True)\nAUS_test = AUS_test.drop(columns=['Unnamed: 0'])\nAUS_test['Year'] = AUS_test.index.year\nAUS_test['Month'] = AUS_test.index.month\nAUS_test = pd.get_dummies(AUS_test, columns = ['Location'], prefix=\"loc\", drop_first=True)\nAUS_test['WindGustDir'] = AUS_test['WindGustDir'].apply(lambda d : convertDir2Deg(d))\nAUS_test['WindDir9am'] = AUS_test['WindDir9am'].apply(lambda d : convertDir2Deg(d))\nAUS_test['WindDir3pm'] = AUS_test['WindDir3pm'].apply(lambda d : convertDir2Deg(d))\nAUS_test['Wind1Cos'] = np.cos(AUS_test['WindGustDir']*2*np.pi/360)\nAUS_test['Wind1Sin'] = np.sin(AUS_test['WindGustDir']*2*np.pi/360)\nAUS_test['Wind2Cos'] = np.cos(AUS_test['WindDir9am']*2*np.pi/360)\nAUS_test['Wind2Sin'] = np.sin(AUS_test['WindDir9am']*2*np.pi/360)\nAUS_test['Wind3Cos'] = np.cos(AUS_test['WindDir3pm']*2*np.pi/360)\nAUS_test['Wind3Sin'] = np.sin(AUS_test['WindDir3pm']*2*np.pi/360)\nAUS_test = AUS_test.drop(columns=['WindGustDir','WindDir9am','WindDir3pm'])\nAUS_test['RainToday'] = (AUS_test['RainToday'] == 'Yes').astype(int)\nX_test = AUS_test.values",
"_____no_output_____"
]
],
[
[
"#### Computing predictions on the test set",
"_____no_output_____"
]
],
[
[
"y_test_pred_proba_best = clf_best.predict_proba(X_test)[:,1]",
"_____no_output_____"
]
],
[
[
"#### Reading the true labels",
"_____no_output_____"
]
],
[
[
"AUS_response = pd.read_csv('AUS_test_Rain_tomorrow.csv', index_col=['Date'], parse_dates=True)",
"_____no_output_____"
],
[
"y_true = (AUS_response['RainTomorrow'] == 'Yes').astype(int)",
"_____no_output_____"
]
],
[
[
"#### Computing the log-loss",
"_____no_output_____"
]
],
[
[
"log_loss(y_true, y_test_pred_proba_best)",
"_____no_output_____"
]
],
[
[
"#### Area under the curve",
"_____no_output_____"
]
],
[
[
"print(f'AUC = {roc_auc_score(y_true, y_test_pred_proba_best)}')",
"AUC = 0.8737077575165082\n"
]
],
[
[
"#### ROC curve",
"_____no_output_____"
]
],
[
[
"fpr_rf, tpr_rf, thresholds = roc_curve(y_true, y_test_pred_proba_best)\n\nfig = plt.figure(4, figsize=(6, 6))\n\nplt.plot([0, 1], [0, 1], 'k--')\nplt.plot(fpr_rf, tpr_rf, label='Meilleur modèle')\nplt.xlabel('False positive rate')\nplt.ylabel('True positive rate')\nplt.title('ROC curve')\nplt.show()",
"_____no_output_____"
]
],
[
[
"#### F1 score",
"_____no_output_____"
]
],
[
[
"f1_score(y_true, y_test_pred_proba_best > selected_threshold) # Careful: use the threshold found by cross-validation!",
"_____no_output_____"
]
],
[
[
"#### Confusion matrix",
"_____no_output_____"
]
],
[
[
"fig = plt.figure(3, figsize=(6, 6))\n\ncnf_matrix = confusion_matrix(y_true, y_test_pred_proba_best > selected_threshold)\nnp.set_printoptions(precision=2)\nplot_confusion_matrix(cnf_matrix, classes=['0','1'], title='Matrice de confusion')",
"Confusion matrix, without normalization\n"
],
[
"# Accuracy\n(26094+7500)/(26094+7500+7045+2018)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
d07f29965eb0178ec669f8acab02d17351be4aaa | 161,069 | ipynb | Jupyter Notebook | Q1 PartD/Q1 PartD - Monthly/MiniProj_RNN_ADAgrad_MSE_Q1_PartD_Pytorch_Monthly.ipynb | parhamzm/Beijing-Weather-Prediction | 9971201a766d8ddce50b3fb737b6aa49eea55fc3 | [
"MIT"
] | null | null | null | Q1 PartD/Q1 PartD - Monthly/MiniProj_RNN_ADAgrad_MSE_Q1_PartD_Pytorch_Monthly.ipynb | parhamzm/Beijing-Weather-Prediction | 9971201a766d8ddce50b3fb737b6aa49eea55fc3 | [
"MIT"
] | null | null | null | Q1 PartD/Q1 PartD - Monthly/MiniProj_RNN_ADAgrad_MSE_Q1_PartD_Pytorch_Monthly.ipynb | parhamzm/Beijing-Weather-Prediction | 9971201a766d8ddce50b3fb737b6aa49eea55fc3 | [
"MIT"
] | null | null | null | 161,069 | 161,069 | 0.816662 | [
[
[
"!git clone https://github.com/parhamzm/Beijing-Pollution-DataSet",
"fatal: destination path 'Beijing-Pollution-DataSet' already exists and is not an empty directory.\n"
],
[
"!ls Beijing-Pollution-DataSet",
"pollution.csv polution_dataSet.npy README.md\n"
],
[
"import torch\nimport torchvision\nimport torch.nn as nn\nfrom torchvision import transforms\nimport pandas as pd\n\nimport matplotlib.pyplot as plt\nimport numpy as np\n\nfrom torch.utils.data import random_split\n\nfrom math import sqrt\nfrom numpy import concatenate\nfrom matplotlib import pyplot\nfrom pandas import read_csv\nfrom pandas import DataFrame\nfrom pandas import concat\nfrom sklearn.preprocessing import MinMaxScaler\nfrom sklearn.preprocessing import LabelEncoder\nfrom sklearn.metrics import mean_squared_error\n\nfrom numpy import array\nfrom numpy import hstack",
"_____no_output_____"
]
],
[
[
"# **Data Preprocessing**",
"_____no_output_____"
]
],
[
[
"DATA_DIR = \"Beijing-Pollution-DataSet/\"\nfrom pandas import read_csv\nfrom datetime import datetime\nfrom random import randint\n\ndef select_month(sequences, n_samples=250):\n X, y = list(), list()\n rand_hour = randint(0, 23) # randint is inclusive on both ends; hours of the day run 0-23\n rand_day = randint(0, 6) # days of the week run 0-6\n for i in range(0, n_samples):\n start_ix = rand_hour + rand_day*24 + 672 * i # 672 : hours in 4 weeks (168 h per week)\n idxs = []\n for j in range(0, 4):\n if j <=2:\n idx = start_ix + (j * 168) # Add different weeks\n idxs.append(idx)\n if j == 3: # Target\n idy = start_ix + (j * 168)\n seq_x = sequences[idxs, :]\n seq_y = sequences[idy, 0]\n y.append(seq_y)\n X.append(seq_x)\n\n return X, y\n\n\n\n# split a multivariate sequence into samples\ndef split_sequences(sequences, n_steps, n_samples=12000, start_from=0):\n\tX, y = list(), list()\n\tfor i in range(start_from, (start_from + n_samples)):\n # find the end of this pattern\n\t\tend_ix = i + n_steps\n # check if we are beyond the dataset\n # if end_ix > len(sequences):\n # break\n # gather input and output parts of the pattern\n\t\tseq_x = sequences[i:end_ix, :]\n\t\tseq_y = sequences[end_ix, 0]\n\t\ty.append(seq_y)\n\t\tX.append(seq_x)\n \n\n\t\n\treturn array(X), array(y)\n\n\n# load dataset\nDATA_DIR = \"Beijing-Pollution-DataSet/\"\n\ndata = np.load(DATA_DIR + 'polution_dataSet.npy')\nscaled_data = data\n\nx, y = select_month(data, n_samples=65)\nprint(\"X shape => \", np.array(x).shape)\nprint(\"y shape => \", np.array(y).shape)\nx = np.array(x)\ny = np.array(y)\ndataset = data\ntrain_X, train_y = x[0:50], y[0:50] #split_sequences(dataset, n_timesteps, n_samples=15000, start_from=0)\nvalid_X, valid_y = x[50:], y[50:] #split_sequences(dataset, n_timesteps, n_samples=3000, start_from=15000)\n\n\n",
"X shape => (65, 3, 8)\ny shape => (65,)\n"
],
[
"test_loader_X = torch.utils.data.DataLoader(dataset=(train_X), batch_size=20, shuffle=False)\n# train_X = torch.tensor(train_X, dtype=torch.float32) \n# train_y = torch.tensor(train_y, dtype=torch.float32)\nprint(\"Train X Shape :=> \", train_X.shape)\nprint(\"Train Y Shape :=> \", train_y.shape)\nprint(\"####################################\")\nprint(\"Test X Shape :=> \", valid_X.shape)\nprint(\"Test Y Shape :=> \", valid_y.shape)",
"Train X Shape :=> (50, 3, 8)\nTrain Y Shape :=> (50,)\n####################################\nTest X Shape :=> (15, 3, 8)\nTest Y Shape :=> (15,)\n"
],
[
"class RNN(torch.nn.Module):\n def __init__(self, n_features=8, n_output=1, seq_length=6, n_hidden_layers=233, n_layers=1):\n super(RNN, self).__init__()\n self.n_features = n_features\n self.seq_len = seq_length\n self.n_output = n_output\n\n self.n_hidden = n_hidden_layers # number of hidden states\n self.n_layers = n_layers # number of stacked RNN layers\n \n # define RNN with specified parameters\n # batch_first means that the first dim of the input and output will be the batch_size\n self.rnn = nn.RNN(input_size=self.n_features,\n hidden_size=self.n_hidden,\n num_layers=self.n_layers,\n batch_first=True)\n \n # last, fully connected layer\n self.l_linear = torch.nn.Linear(self.n_hidden*self.seq_len, self.n_output)\n\n\n def forward(self, x, hidden):\n # hidden_state = torch.zeros(self.n_layers, x.size(0), self.n_hidden).requires_grad_()\n # cell_state = torch.zeros(self.n_layers, x.size(0), self.n_hidden).requires_grad_()\n batch_size = x.size(0)\n\n rnn_out, hidden = self.rnn(x, hidden)\n # print(rnn_out.shape)\n rnn_out = rnn_out.contiguous().view(batch_size, -1)\n \n # rnn_out (with batch_first=True) is \n # (batch_size, seq_len, num_directions * hidden_size)\n # for the following linear layer we want to keep the batch_size dimension and merge the rest \n # .contiguous() -> solves tensor compatibility error\n out = self.l_linear(rnn_out)\n return out, hidden",
"_____no_output_____"
],
[
"torch.manual_seed(13)\nmodel = RNN(n_features=8, n_output=1, seq_length=3, n_hidden_layers=233, n_layers=1)\ncriterion = nn.MSELoss()\noptimizer = torch.optim.Adagrad(model.parameters(), lr=0.001)",
"_____no_output_____"
],
[
"model = model#.to(device)\ncriterion = criterion#.to(device)\nfor p in model.parameters():\n print(p.numel())",
"1864\n54289\n233\n233\n699\n1\n"
],
[
"import time\nstart_time = time.time()\nhidden = None\nhidden_test = None\nepochs = 200\nmodel.train()\nbatch_size = 5\nrunning_loss_history = []\nval_running_loss_history = []\nfor epoch in range(epochs):\n running_loss = 0.0\n val_running_loss = 0.0\n model.train()\n for b in range(0, len(train_X), batch_size):\n inpt = train_X[b:b+batch_size, :, :]\n target = train_y[b:b+batch_size]\n\n # print(\"Input Shape :=> \", inpt.shape)\n\n x_batch = torch.tensor(inpt, dtype=torch.float32) \n y_batch = torch.tensor(target, dtype=torch.float32)\n\n output, hidden = model(x_batch, hidden)\n\n hidden = hidden.data\n loss = criterion(output.view(-1), y_batch)\n\n running_loss += loss.item()\n\n loss.backward()\n optimizer.step() \n optimizer.zero_grad() \n\n else:\n with torch.no_grad(): # temporarily sets all the requires_grad flags to False\n model.eval()\n for b in range(0, len(valid_X), batch_size):\n inpt = valid_X[b:b+batch_size, :, :]\n target = valid_y[b:b+batch_size] \n\n x_batch_test = torch.tensor(inpt, dtype=torch.float32)\n y_batch_test = torch.tensor(target, dtype=torch.float32)\n\n # model.init_hidden(x_batch_test.size(0))\n\n output_test, hidden_test = model(x_batch_test, hidden_test)\n\n hidden_test = hidden_test.data\n loss_valid = criterion(output_test.view(-1), y_batch_test)\n\n val_running_loss += loss_valid.item()\n\n val_epoch_loss = val_running_loss / len(valid_X)\n val_running_loss_history.append(val_epoch_loss)\n epoch_loss = running_loss / len(train_X)\n running_loss_history.append(epoch_loss)\n print('step : ' , epoch , ' Train loss : ' , epoch_loss, ', Valid Loss : => ', val_epoch_loss)\n print(\"***->>>-----------------------------------------------<<<-***\")\n\ntotal_time = time.time() - start_time\nprint(\"===========================================================\")\nprint(\"*********************************************************\")\nprint(\"The total Training Time is equal to ==> : {0} Sec.\".format(total_time))\nprint(\"*********************************************************\")\nprint(\"===========================================================\")",
"step : 0 Train loss : 0.002059369618073106 , Valid Loss : => 0.0016567523901661236\n***->>>-----------------------------------------------<<<-***\nstep : 1 Train loss : 0.0016436615632846952 , Valid Loss : => 0.0011269247469802698\n***->>>-----------------------------------------------<<<-***\nstep : 2 Train loss : 0.001507067703641951 , Valid Loss : => 0.0010587704678376515\n***->>>-----------------------------------------------<<<-***\nstep : 3 Train loss : 0.001451271460391581 , Valid Loss : => 0.0010326983251919349\n***->>>-----------------------------------------------<<<-***\nstep : 4 Train loss : 0.0014128131046891213 , Valid Loss : => 0.001020789270599683\n***->>>-----------------------------------------------<<<-***\nstep : 5 Train loss : 0.001382403769530356 , Valid Loss : => 0.0010150961577892303\n***->>>-----------------------------------------------<<<-***\nstep : 6 Train loss : 0.0013568328041583299 , Valid Loss : => 0.0010126059409230948\n***->>>-----------------------------------------------<<<-***\nstep : 7 Train loss : 0.0013345721224322915 , Valid Loss : => 0.0010119682643562554\n***->>>-----------------------------------------------<<<-***\nstep : 8 Train loss : 0.00131475385511294 , Valid Loss : => 0.0010125029211242994\n***->>>-----------------------------------------------<<<-***\nstep : 9 Train loss : 0.0012968285358510912 , Valid Loss : => 0.0010138347434500853\n***->>>-----------------------------------------------<<<-***\nstep : 10 Train loss : 0.0012804210395552218 , Valid Loss : => 0.0010157436598092317\n***->>>-----------------------------------------------<<<-***\nstep : 11 Train loss : 0.001265262053348124 , Valid Loss : => 0.0010180907944838207\n***->>>-----------------------------------------------<<<-***\nstep : 12 Train loss : 0.0012511499598622322 , Valid Loss : => 0.0010207863369335731\n***->>>-----------------------------------------------<<<-***\nstep : 13 Train loss : 0.0012379297753795982 , Valid Loss : => 
0.0010237675004949173\n***->>>-----------------------------------------------<<<-***\nstep : 14 Train loss : 0.0012254791846498846 , Valid Loss : => 0.0010269906527052323\n***->>>-----------------------------------------------<<<-***\nstep : 15 Train loss : 0.0012136997282505035 , Valid Loss : => 0.0010304229954878489\n***->>>-----------------------------------------------<<<-***\nstep : 16 Train loss : 0.001202511244919151 , Valid Loss : => 0.0010340404075880846\n***->>>-----------------------------------------------<<<-***\nstep : 17 Train loss : 0.001191846679430455 , Valid Loss : => 0.0010378239676356316\n***->>>-----------------------------------------------<<<-***\nstep : 18 Train loss : 0.0011816500732675195 , Valid Loss : => 0.0010417587589472532\n***->>>-----------------------------------------------<<<-***\nstep : 19 Train loss : 0.00117187415715307 , Valid Loss : => 0.0010458327829837798\n***->>>-----------------------------------------------<<<-***\nstep : 20 Train loss : 0.0011624784045852722 , Valid Loss : => 0.0010500364160786072\n***->>>-----------------------------------------------<<<-***\nstep : 21 Train loss : 0.0011534272297285498 , Valid Loss : => 0.0010543610124538342\n***->>>-----------------------------------------------<<<-***\nstep : 22 Train loss : 0.0011446907836943866 , Valid Loss : => 0.0010588002546379964\n***->>>-----------------------------------------------<<<-***\nstep : 23 Train loss : 0.0011362422863021493 , Valid Loss : => 0.0010633479803800582\n***->>>-----------------------------------------------<<<-***\nstep : 24 Train loss : 0.001128058184403926 , Valid Loss : => 0.0010679994244128465\n***->>>-----------------------------------------------<<<-***\nstep : 25 Train loss : 0.001120117746759206 , Valid Loss : => 0.0010727504268288612\n***->>>-----------------------------------------------<<<-***\nstep : 26 Train loss : 0.0011124027520418168 , Valid Loss : => 
0.001077597215771675\n***->>>-----------------------------------------------<<<-***\nstep : 27 Train loss : 0.0011048966739326716 , Valid Loss : => 0.0010825359417746465\n***->>>-----------------------------------------------<<<-***\nstep : 28 Train loss : 0.0010975849325768649 , Valid Loss : => 0.0010875643386195103\n***->>>-----------------------------------------------<<<-***\nstep : 29 Train loss : 0.001090454226359725 , Valid Loss : => 0.001092679783081015\n***->>>-----------------------------------------------<<<-***\nstep : 30 Train loss : 0.001083492562174797 , Valid Loss : => 0.0010978798226763805\n***->>>-----------------------------------------------<<<-***\nstep : 31 Train loss : 0.0010766891157254577 , Valid Loss : => 0.0011031626878927152\n***->>>-----------------------------------------------<<<-***\nstep : 32 Train loss : 0.0010700340918265282 , Valid Loss : => 0.0011085260814676682\n***->>>-----------------------------------------------<<<-***\nstep : 33 Train loss : 0.0010635186382569372 , Valid Loss : => 0.0011139693204313516\n***->>>-----------------------------------------------<<<-***\nstep : 34 Train loss : 0.001057134356815368 , Valid Loss : => 0.001119490247219801\n***->>>-----------------------------------------------<<<-***\nstep : 35 Train loss : 0.0010508739668875933 , Valid Loss : => 0.0011250882254292569\n***->>>-----------------------------------------------<<<-***\nstep : 36 Train loss : 0.0010447306325659157 , Valid Loss : => 0.0011307613303263983\n***->>>-----------------------------------------------<<<-***\nstep : 37 Train loss : 0.0010386978462338448 , Valid Loss : => 0.0011365089565515518\n***->>>-----------------------------------------------<<<-***\nstep : 38 Train loss : 0.001032769507728517 , Valid Loss : => 0.0011423305297891299\n***->>>-----------------------------------------------<<<-***\nstep : 39 Train loss : 0.0010269406856969 , Valid Loss : => 
0.0011482248393197855\n***->>>-----------------------------------------------<<<-***\nstep : 40 Train loss : 0.001021206199657172 , Valid Loss : => 0.0011541911711295445\n***->>>-----------------------------------------------<<<-***\nstep : 41 Train loss : 0.0010155615070834756 , Valid Loss : => 0.0011602291526893775\n***->>>-----------------------------------------------<<<-***\nstep : 42 Train loss : 0.001010002053808421 , Valid Loss : => 0.0011663381786396105\n***->>>-----------------------------------------------<<<-***\nstep : 43 Train loss : 0.0010045241727493703 , Valid Loss : => 0.0011725172710915406\n***->>>-----------------------------------------------<<<-***\nstep : 44 Train loss : 0.0009991237544454633 , Valid Loss : => 0.001178766880184412\n***->>>-----------------------------------------------<<<-***\nstep : 45 Train loss : 0.0009937975858338177 , Valid Loss : => 0.001185085748632749\n***->>>-----------------------------------------------<<<-***\nstep : 46 Train loss : 0.0009885425330139696 , Valid Loss : => 0.0011914740627010664\n***->>>-----------------------------------------------<<<-***\nstep : 47 Train loss : 0.0009833554178476334 , Valid Loss : => 0.0011979311549415192\n***->>>-----------------------------------------------<<<-***\nstep : 48 Train loss : 0.0009782332414761186 , Valid Loss : => 0.0012044571340084076\n***->>>-----------------------------------------------<<<-***\nstep : 49 Train loss : 0.0009731737710535526 , Valid Loss : => 0.0012110513634979725\n***->>>-----------------------------------------------<<<-***\nstep : 50 Train loss : 0.0009681743173860013 , Valid Loss : => 0.0012177140141526857\n***->>>-----------------------------------------------<<<-***\nstep : 51 Train loss : 0.0009632325917482376 , Valid Loss : => 0.0012244443719585736\n***->>>-----------------------------------------------<<<-***\nstep : 52 Train loss : 0.0009583467780612409 , Valid Loss : => 
0.0012312424058715501\n***->>>-----------------------------------------------<<<-***\nstep : 53 Train loss : 0.0009535145410336554 , Valid Loss : => 0.0012381073087453843\n***->>>-----------------------------------------------<<<-***\nstep : 54 Train loss : 0.0009487342135980725 , Valid Loss : => 0.0012450399498144785\n***->>>-----------------------------------------------<<<-***\nstep : 55 Train loss : 0.0009440042660571635 , Valid Loss : => 0.001252038807918628\n***->>>-----------------------------------------------<<<-***\nstep : 56 Train loss : 0.0009393230662681163 , Valid Loss : => 0.0012591044418513776\n***->>>-----------------------------------------------<<<-***\nstep : 57 Train loss : 0.0009346891660243273 , Valid Loss : => 0.0012662357029815515\n***->>>-----------------------------------------------<<<-***\nstep : 58 Train loss : 0.0009301011683419347 , Valid Loss : => 0.0012734324671328067\n***->>>-----------------------------------------------<<<-***\nstep : 59 Train loss : 0.0009255582862533629 , Valid Loss : => 0.0012806943928201993\n***->>>-----------------------------------------------<<<-***\nstep : 60 Train loss : 0.0009210590505972505 , Valid Loss : => 0.0012880203935007255\n***->>>-----------------------------------------------<<<-***\nstep : 61 Train loss : 0.0009166030585765839 , Valid Loss : => 0.0012954098793367544\n***->>>-----------------------------------------------<<<-***\nstep : 62 Train loss : 0.0009121891413815319 , Valid Loss : => 0.0013028622604906559\n***->>>-----------------------------------------------<<<-***\nstep : 63 Train loss : 0.0009078167797997594 , Valid Loss : => 0.0013103763572871684\n***->>>-----------------------------------------------<<<-***\nstep : 64 Train loss : 0.0009034856059588492 , Valid Loss : => 0.0013179510521392028\n***->>>-----------------------------------------------<<<-***\nstep : 65 Train loss : 0.0008991949586197734 , Valid Loss : => 
0.0013255854758123557\n***->>>-----------------------------------------------<<<-***\nstep : 66 Train loss : 0.000894944672472775 , Valid Loss : => 0.0013332783865431944\n***->>>-----------------------------------------------<<<-***\nstep : 67 Train loss : 0.0008907343470491469 , Valid Loss : => 0.0013410276733338833\n***->>>-----------------------------------------------<<<-***\nstep : 68 Train loss : 0.0008865642757155001 , Valid Loss : => 0.0013488323738177618\n***->>>-----------------------------------------------<<<-***\nstep : 69 Train loss : 0.0008824341394938529 , Valid Loss : => 0.0013566899423797926\n***->>>-----------------------------------------------<<<-***\nstep : 70 Train loss : 0.0008783442783169449 , Valid Loss : => 0.0013645992614328862\n***->>>-----------------------------------------------<<<-***\nstep : 71 Train loss : 0.0008742948202416301 , Valid Loss : => 0.001372557661185662\n***->>>-----------------------------------------------<<<-***\nstep : 72 Train loss : 0.0008702862146310508 , Valid Loss : => 0.0013805628443757693\n***->>>-----------------------------------------------<<<-***\nstep : 73 Train loss : 0.000866318792104721 , Valid Loss : => 0.00138861204807957\n***->>>-----------------------------------------------<<<-***\nstep : 74 Train loss : 0.000862393241841346 , Valid Loss : => 0.0013967030371228853\n***->>>-----------------------------------------------<<<-***\nstep : 75 Train loss : 0.00085850995965302 , Valid Loss : => 0.0014048322724799316\n***->>>-----------------------------------------------<<<-***\nstep : 76 Train loss : 0.0008546698186546564 , Valid Loss : => 0.0014129963393012682\n***->>>-----------------------------------------------<<<-***\nstep : 77 Train loss : 0.0008508735336363316 , Valid Loss : => 0.0014211931886772315\n***->>>-----------------------------------------------<<<-***\nstep : 78 Train loss : 0.0008471217867918313 , Valid Loss : => 
0.0014294181329508623\n***->>>-----------------------------------------------<<<-***\nstep : 79 Train loss : 0.0008434158121235669 , Valid Loss : => 0.0014376676951845487\n***->>>-----------------------------------------------<<<-***\nstep : 80 Train loss : 0.0008397561917081475 , Valid Loss : => 0.0014459384605288506\n***->>>-----------------------------------------------<<<-***\nstep : 81 Train loss : 0.0008361440221779048 , Valid Loss : => 0.0014542258655031522\n***->>>-----------------------------------------------<<<-***\nstep : 82 Train loss : 0.0008325800811871887 , Valid Loss : => 0.0014625256260236104\n***->>>-----------------------------------------------<<<-***\nstep : 83 Train loss : 0.0008290655515156687 , Valid Loss : => 0.0014708340167999268\n***->>>-----------------------------------------------<<<-***\nstep : 84 Train loss : 0.0008256012131460011 , Valid Loss : => 0.0014791463191310564\n***->>>-----------------------------------------------<<<-***\nstep : 85 Train loss : 0.0008221878949552775 , Valid Loss : => 0.001487458124756813\n***->>>-----------------------------------------------<<<-***\nstep : 86 Train loss : 0.0008188266679644584 , Valid Loss : => 0.001495764870196581\n***->>>-----------------------------------------------<<<-***\nstep : 87 Train loss : 0.0008155181747861207 , Valid Loss : => 0.00150406276807189\n***->>>-----------------------------------------------<<<-***\nstep : 88 Train loss : 0.0008122629998251796 , Valid Loss : => 0.0015123457026978333\n***->>>-----------------------------------------------<<<-***\nstep : 89 Train loss : 0.0008090618345886469 , Valid Loss : => 0.0015206104144454002\n***->>>-----------------------------------------------<<<-***\nstep : 90 Train loss : 0.0008059154404327273 , Valid Loss : => 0.0015288522777458032\n***->>>-----------------------------------------------<<<-***\nstep : 91 Train loss : 0.0008028238406404853 , Valid Loss : => 
0.001537067008515199\n***->>>-----------------------------------------------<<<-***\nstep : 92 Train loss : 0.000799787393771112 , Valid Loss : => 0.0015452500432729722\n***->>>-----------------------------------------------<<<-***\nstep : 93 Train loss : 0.0007968061184510589 , Valid Loss : => 0.0015533976256847382\n***->>>-----------------------------------------------<<<-***\nstep : 94 Train loss : 0.0007938800973352045 , Valid Loss : => 0.001561506154636542\n***->>>-----------------------------------------------<<<-***\nstep : 95 Train loss : 0.0007910089904908091 , Valid Loss : => 0.0015695716875294844\n***->>>-----------------------------------------------<<<-***\nstep : 96 Train loss : 0.0007881924870889634 , Valid Loss : => 0.0015775907784700393\n***->>>-----------------------------------------------<<<-***\nstep : 97 Train loss : 0.000785430008545518 , Valid Loss : => 0.0015855609439313413\n***->>>-----------------------------------------------<<<-***\nstep : 98 Train loss : 0.0007827208447270096 , Valid Loss : => 0.001593478179226319\n***->>>-----------------------------------------------<<<-***\nstep : 99 Train loss : 0.0007800642761867494 , Valid Loss : => 0.0016013405906657378\n***->>>-----------------------------------------------<<<-***\nstep : 100 Train loss : 0.0007774589699693024 , Valid Loss : => 0.001609145508458217\n***->>>-----------------------------------------------<<<-***\nstep : 101 Train loss : 0.0007749039866030216 , Valid Loss : => 0.0016168913183112938\n***->>>-----------------------------------------------<<<-***\nstep : 102 Train loss : 0.0007723980175796896 , Valid Loss : => 0.0016245766232411067\n***->>>-----------------------------------------------<<<-***\nstep : 103 Train loss : 0.0007699399162083864 , Valid Loss : => 0.0016321990949412187\n***->>>-----------------------------------------------<<<-***\nstep : 104 Train loss : 0.0007675275905057788 , Valid Loss : => 
0.0016397583298385144\n***->>>-----------------------------------------------<<<-***\nstep : 105 Train loss : 0.0007651599973905832 , Valid Loss : => 0.0016472532724340757\n***->>>-----------------------------------------------<<<-***\nstep : 106 Train loss : 0.0007628352683968842 , Valid Loss : => 0.0016546828051408132\n***->>>-----------------------------------------------<<<-***\nstep : 107 Train loss : 0.0007605518237687647 , Valid Loss : => 0.0016620483870307605\n***->>>-----------------------------------------------<<<-***\nstep : 108 Train loss : 0.0007583077880553901 , Valid Loss : => 0.0016693483106791974\n***->>>-----------------------------------------------<<<-***\nstep : 109 Train loss : 0.0007561014243401587 , Valid Loss : => 0.001676583383232355\n***->>>-----------------------------------------------<<<-***\nstep : 110 Train loss : 0.0007539312553126365 , Valid Loss : => 0.0016837540082633496\n***->>>-----------------------------------------------<<<-***\nstep : 111 Train loss : 0.000751795006217435 , Valid Loss : => 0.0016908608687420687\n***->>>-----------------------------------------------<<<-***\nstep : 112 Train loss : 0.000749691363889724 , Valid Loss : => 0.0016979049270351729\n***->>>-----------------------------------------------<<<-***\nstep : 113 Train loss : 0.0007476185215637088 , Valid Loss : => 0.00170488686611255\n***->>>-----------------------------------------------<<<-***\nstep : 114 Train loss : 0.0007455745991319418 , Valid Loss : => 0.0017118084554870923\n***->>>-----------------------------------------------<<<-***\nstep : 115 Train loss : 0.0007435584417544305 , Valid Loss : => 0.0017186702229082585\n***->>>-----------------------------------------------<<<-***\nstep : 116 Train loss : 0.0007415679621044546 , Valid Loss : => 0.0017254732549190522\n***->>>-----------------------------------------------<<<-***\nstep : 117 Train loss : 0.000739602274261415 , Valid Loss : => 
0.0017322205317517121\n***->>>-----------------------------------------------<<<-***\nstep : 118 Train loss : 0.0007376592967193574 , Valid Loss : => 0.0017389126432438692\n***->>>-----------------------------------------------<<<-***\nstep : 119 Train loss : 0.0007357383030466736 , Valid Loss : => 0.0017455503654976686\n***->>>-----------------------------------------------<<<-***\nstep : 120 Train loss : 0.0007338377041742205 , Valid Loss : => 0.0017521371444066366\n***->>>-----------------------------------------------<<<-***\nstep : 121 Train loss : 0.000731956500094384 , Valid Loss : => 0.0017586731972793737\n***->>>-----------------------------------------------<<<-***\nstep : 122 Train loss : 0.000730093460297212 , Valid Loss : => 0.0017651612870395184\n***->>>-----------------------------------------------<<<-***\nstep : 123 Train loss : 0.0007282475428655743 , Valid Loss : => 0.001771602282921473\n***->>>-----------------------------------------------<<<-***\nstep : 124 Train loss : 0.0007264175615273416 , Valid Loss : => 0.0017779989168047905\n***->>>-----------------------------------------------<<<-***\nstep : 125 Train loss : 0.00072460314957425 , Valid Loss : => 0.001784351250777642\n***->>>-----------------------------------------------<<<-***\nstep : 126 Train loss : 0.0007228030601982027 , Valid Loss : => 0.001790663010130326\n***->>>-----------------------------------------------<<<-***\nstep : 127 Train loss : 0.0007210166344884784 , Valid Loss : => 0.0017969341638187568\n***->>>-----------------------------------------------<<<-***\nstep : 128 Train loss : 0.0007192431727889925 , Valid Loss : => 0.0018031672885020574\n***->>>-----------------------------------------------<<<-***\nstep : 129 Train loss : 0.0007174820534419269 , Valid Loss : => 0.0018093642157812914\n***->>>-----------------------------------------------<<<-***\nstep : 130 Train loss : 0.0007157326699234546 , Valid Loss : => 
0.0018155258769790331\n***->>>-----------------------------------------------<<<-***\nstep : 131 Train loss : 0.000713994357502088 , Valid Loss : => 0.001821654848754406\n***->>>-----------------------------------------------<<<-***\nstep : 132 Train loss : 0.0007122667296789587 , Valid Loss : => 0.001827751286327839\n***->>>-----------------------------------------------<<<-***\nstep : 133 Train loss : 0.0007105491310358047 , Valid Loss : => 0.0018338179215788841\n***->>>-----------------------------------------------<<<-***\nstep : 134 Train loss : 0.0007088411203585565 , Valid Loss : => 0.0018398552822570006\n***->>>-----------------------------------------------<<<-***\nstep : 135 Train loss : 0.0007071426545735449 , Valid Loss : => 0.0018458656035363675\n***->>>-----------------------------------------------<<<-***\nstep : 136 Train loss : 0.0007054529467131942 , Valid Loss : => 0.0018518500030040741\n***->>>-----------------------------------------------<<<-***\nstep : 137 Train loss : 0.0007037719525396824 , Valid Loss : => 0.0018578095361590386\n***->>>-----------------------------------------------<<<-***\nstep : 138 Train loss : 0.0007020993705373257 , Valid Loss : => 0.0018637457552055517\n***->>>-----------------------------------------------<<<-***\nstep : 139 Train loss : 0.0007004344265442342 , Valid Loss : => 0.001869660367568334\n***->>>-----------------------------------------------<<<-***\nstep : 140 Train loss : 0.000698777528014034 , Valid Loss : => 0.0018755539320409298\n***->>>-----------------------------------------------<<<-***\nstep : 141 Train loss : 0.000697127926396206 , Valid Loss : => 0.0018814278145631155\n***->>>-----------------------------------------------<<<-***\nstep : 142 Train loss : 0.0006954856286756694 , Valid Loss : => 0.001887283908824126\n***->>>-----------------------------------------------<<<-***\nstep : 143 Train loss : 0.000693850324023515 , Valid Loss : => 
0.0018931217802067599\n***->>>-----------------------------------------------<<<-***\nstep : 144 Train loss : 0.0006922218552790582 , Valid Loss : => 0.0018989438811937967\n***->>>-----------------------------------------------<<<-***\nstep : 145 Train loss : 0.0006905998568981885 , Valid Loss : => 0.001904751267284155\n***->>>-----------------------------------------------<<<-***\nstep : 146 Train loss : 0.0006889845058321953 , Valid Loss : => 0.0019105440626541773\n***->>>-----------------------------------------------<<<-***\nstep : 147 Train loss : 0.0006873753969557584 , Valid Loss : => 0.0019163243783017\n***->>>-----------------------------------------------<<<-***\nstep : 148 Train loss : 0.0006857724010478706 , Valid Loss : => 0.0019220922452708085\n***->>>-----------------------------------------------<<<-***\nstep : 149 Train loss : 0.0006841753283515573 , Valid Loss : => 0.001927848905324936\n***->>>-----------------------------------------------<<<-***\nstep : 150 Train loss : 0.0006825839588418602 , Valid Loss : => 0.0019335956623156866\n***->>>-----------------------------------------------<<<-***\nstep : 151 Train loss : 0.0006809986045118421 , Valid Loss : => 0.0019393328266839186\n***->>>-----------------------------------------------<<<-***\nstep : 152 Train loss : 0.000679418594809249 , Valid Loss : => 0.0019450624162952105\n***->>>-----------------------------------------------<<<-***\nstep : 153 Train loss : 0.0006778439658228308 , Valid Loss : => 0.0019507834066947302\n***->>>-----------------------------------------------<<<-***\nstep : 154 Train loss : 0.0006762750155758113 , Valid Loss : => 0.001956498312453429\n***->>>-----------------------------------------------<<<-***\nstep : 155 Train loss : 0.0006747109303250909 , Valid Loss : => 0.0019622071956594783\n***->>>-----------------------------------------------<<<-***\nstep : 156 Train loss : 0.0006731522257905453 , Valid Loss : => 
0.001967911080767711\n***->>>-----------------------------------------------<<<-***\nstep : 157 Train loss : 0.0006715984689071774 , Valid Loss : => 0.001973610588659843\n***->>>-----------------------------------------------<<<-***\nstep : 158 Train loss : 0.0006700495351105928 , Valid Loss : => 0.001979306029776732\n***->>>-----------------------------------------------<<<-***\nstep : 159 Train loss : 0.0006685055373236537 , Valid Loss : => 0.001984999484072129\n***->>>-----------------------------------------------<<<-***\nstep : 160 Train loss : 0.0006669663591310382 , Valid Loss : => 0.001990690330664317\n***->>>-----------------------------------------------<<<-***\nstep : 161 Train loss : 0.0006654319982044399 , Valid Loss : => 0.001996379594008128\n***->>>-----------------------------------------------<<<-***\nstep : 162 Train loss : 0.0006639019493013621 , Valid Loss : => 0.0020020682364702224\n***->>>-----------------------------------------------<<<-***\nstep : 163 Train loss : 0.0006623766524717212 , Valid Loss : => 0.0020077571272850035\n***->>>-----------------------------------------------<<<-***\nstep : 164 Train loss : 0.0006608556711580605 , Valid Loss : => 0.002013446390628815\n***->>>-----------------------------------------------<<<-***\nstep : 165 Train loss : 0.000659339209087193 , Valid Loss : => 0.002019137206176917\n***->>>-----------------------------------------------<<<-***\nstep : 166 Train loss : 0.0006578269717283547 , Valid Loss : => 0.002024829667061567\n***->>>-----------------------------------------------<<<-***\nstep : 167 Train loss : 0.0006563189905136824 , Valid Loss : => 0.002030524176855882\n***->>>-----------------------------------------------<<<-***\nstep : 168 Train loss : 0.0006548154447227716 , Valid Loss : => 0.002036222318808238\n***->>>-----------------------------------------------<<<-***\nstep : 169 Train loss : 0.0006533158081583678 , Valid Loss : => 
0.0020419240929186342\n***->>>-----------------------------------------------<<<-***\nstep : 170 Train loss : 0.000651820267084986 , Valid Loss : => 0.0020476297785838446\n***->>>-----------------------------------------------<<<-***\nstep : 171 Train loss : 0.0006503288901876658 , Valid Loss : => 0.002053340710699558\n***->>>-----------------------------------------------<<<-***\nstep : 172 Train loss : 0.0006488413189072162 , Valid Loss : => 0.0020590565477808316\n***->>>-----------------------------------------------<<<-***\nstep : 173 Train loss : 0.0006473579804878682 , Valid Loss : => 0.0020647781590620675\n***->>>-----------------------------------------------<<<-***\nstep : 174 Train loss : 0.0006458783592097462 , Valid Loss : => 0.002070507158835729\n***->>>-----------------------------------------------<<<-***\nstep : 175 Train loss : 0.0006444025679957122 , Valid Loss : => 0.0020762425226469833\n***->>>-----------------------------------------------<<<-***\nstep : 176 Train loss : 0.0006429307092912495 , Valid Loss : => 0.0020819853991270064\n***->>>-----------------------------------------------<<<-***\nstep : 177 Train loss : 0.0006414624827448278 , Valid Loss : => 0.0020877367506424585\n***->>>-----------------------------------------------<<<-***\nstep : 178 Train loss : 0.0006399980012793094 , Valid Loss : => 0.002093495676914851\n***->>>-----------------------------------------------<<<-***\nstep : 179 Train loss : 0.0006385374069213867 , Valid Loss : => 0.002099264164765676\n***->>>-----------------------------------------------<<<-***\nstep : 180 Train loss : 0.0006370803248137236 , Valid Loss : => 0.002105041624357303\n***->>>-----------------------------------------------<<<-***\nstep : 181 Train loss : 0.0006356268795207143 , Valid Loss : => 0.0021108301977316537\n***->>>-----------------------------------------------<<<-***\nstep : 182 Train loss : 0.0006341771048028022 , Valid Loss : => 
0.0021166279911994934\n***->>>-----------------------------------------------<<<-***\nstep : 183 Train loss : 0.0006327310216147452 , Valid Loss : => 0.0021224368363618852\n***->>>-----------------------------------------------<<<-***\nstep : 184 Train loss : 0.0006312882760539651 , Valid Loss : => 0.0021282568263510863\n***->>>-----------------------------------------------<<<-***\nstep : 185 Train loss : 0.0006298491125926375 , Valid Loss : => 0.0021340892339746158\n***->>>-----------------------------------------------<<<-***\nstep : 186 Train loss : 0.0006284135044552386 , Valid Loss : => 0.0021399333452184993\n***->>>-----------------------------------------------<<<-***\nstep : 187 Train loss : 0.0006269815179985017 , Valid Loss : => 0.002145789998273055\n***->>>-----------------------------------------------<<<-***\nstep : 188 Train loss : 0.0006255527550820261 , Valid Loss : => 0.002151659472535054\n***->>>-----------------------------------------------<<<-***\nstep : 189 Train loss : 0.0006241275754291564 , Valid Loss : => 0.0021575421715776125\n***->>>-----------------------------------------------<<<-***\nstep : 190 Train loss : 0.000622705778805539 , Valid Loss : => 0.002163438871502876\n***->>>-----------------------------------------------<<<-***\nstep : 191 Train loss : 0.0006212874234188348 , Valid Loss : => 0.002169349944839875\n***->>>-----------------------------------------------<<<-***\nstep : 192 Train loss : 0.000619872537208721 , Valid Loss : => 0.0021752746465305488\n***->>>-----------------------------------------------<<<-***\nstep : 193 Train loss : 0.0006184611679054797 , Valid Loss : => 0.002181215149660905\n***->>>-----------------------------------------------<<<-***\nstep : 194 Train loss : 0.0006170529150404036 , Valid Loss : => 0.0021871703676879404\n***->>>-----------------------------------------------<<<-***\nstep : 195 Train loss : 0.0006156482477672398 , Valid Loss : => 
0.002193141511331002\n***->>>-----------------------------------------------<<<-***\nstep : 196 Train loss : 0.0006142469984479249 , Valid Loss : => 0.0021991278665761155\n***->>>-----------------------------------------------<<<-***\nstep : 197 Train loss : 0.0006128487369278446 , Valid Loss : => 0.0022051312029361726\n***->>>-----------------------------------------------<<<-***\nstep : 198 Train loss : 0.0006114543834701181 , Valid Loss : => 0.002211150744309028\n***->>>-----------------------------------------------<<<-***\nstep : 199 Train loss : 0.0006100632378365844 , Valid Loss : => 0.0022171868942677973\n***->>>-----------------------------------------------<<<-***\n===========================================================\n*********************************************************\nThe total Training Time is Equal with ==> : 3.6312246322631836 Sec.\n*********************************************************\n===========================================================\n"
],
[
"f, ax = plt.subplots(1, 1, figsize=(10, 7))\nplt.title(\"Train & Valid Loss - RNN\", fontsize=18)\nplt.xlabel(\"Epoch\")\nplt.ylabel(\"Loss\")\nplt.plot(running_loss_history, label='Train')\nplt.plot(val_running_loss_history, label='Valid')\nplt.legend()\nplt.show()",
"_____no_output_____"
],
[
"test_x, test_y = x[50:], y[50:]\nmodel.eval()\ntest_x = torch.tensor(test_x, dtype=torch.float32)\ntest_y = torch.tensor(test_y, dtype=torch.float32)\nres, hid = model(test_x, None)\nloss_test = criterion(res.view(-1), test_y)\n\nfuture = 100\nwindow_size = 11\n# preds = dataset[15000:15100, 0].tolist()\n# print(len(preds))\n# print(preds)\n\n# for i in range (future):\n# # seq = torch.FloatTensor(preds[-window_size:])\n# with torch.no_grad():\n# # seq = torch.tensor(seq, dtype=torch.float32).view(1, 11, 8)\n# # model.hidden = (torch.zeros(1, 1, model.hidden_size),\n# # torch.zeros(1, 1, model.hidden_size))\n \n# preds.append(model(seq))\n\n# print(preds[11:])\n\n",
"_____no_output_____"
],
[
"fig = plt.figure(figsize=(20, 7))\nplt.title(\"Beijing Pollution Prediction - RNN\", fontsize=18)\nplt.ylabel('Pollution')\nplt.xlabel('Num data')\nplt.grid(True)\nplt.autoscale(axis='x', tight=True)\nfig.autofmt_xdate()\nplt.plot(test_y, label=\"Real\")\nplt.plot(res.detach().numpy(), label=\"Prediction\")\nplt.legend()\nplt.show()",
"_____no_output_____"
],
[
"test_x, test_y = x[50:], y[50:]\nmodel.eval()\n\n\ntest_running_loss = 0\nwith torch.no_grad(): # temporarily sets all requires_grad flags to False\n model.eval()\n for b in range(0, len(test_x), batch_size):\n inpt = test_x[b:b+batch_size, :, :]\n target = test_y[b:b+batch_size] \n\n x_batch_test = torch.tensor(inpt, dtype=torch.float32)\n y_batch_test = torch.tensor(target, dtype=torch.float32)\n\n # model.init_hidden(x_batch_test.size(0))\n\n output_test, hidden_test = model(x_batch_test, hidden_test)\n\n hidden_test = hidden_test.data\n loss_test = criterion(output_test.view(-1), y_batch_test)\n\n test_running_loss += loss_test.item()\n\n test_epoch_loss = test_running_loss / len(test_x)\n\nprint(\"##########################################################\")\nprint(\">>>>---------------------------------------------------<<<<\")\nprint(\">>>>----------***************************--------------<<<<\")\nprint(\"**** Test Loss :==>>> \", test_epoch_loss)\nprint(\">>>>----------***************************--------------<<<<\")\nprint(\">>>>---------------------------------------------------<<<<\")\nprint(\"##########################################################\")",
"##########################################################\n>>>>---------------------------------------------------<<<<\n>>>>----------***************************--------------<<<<\n**** Test Loss :==>>> 0.002217333360264699\n>>>>----------***************************--------------<<<<\n>>>>---------------------------------------------------<<<<\n##########################################################\n"
],
[
"",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f2b2df8e913a286e45450bb53ff8f0a64ddb3 | 20,974 | ipynb | Jupyter Notebook | pokehackathon-kedro/notebooks/Teamforming.ipynb | franperic/pokehackathon | c91977d630914d738de995ddddd1791b5b7b23b9 | [
"MIT"
] | null | null | null | pokehackathon-kedro/notebooks/Teamforming.ipynb | franperic/pokehackathon | c91977d630914d738de995ddddd1791b5b7b23b9 | [
"MIT"
] | 3 | 2021-05-21T14:34:49.000Z | 2022-02-10T01:51:21.000Z | pokehackathon-kedro/notebooks/Teamforming.ipynb | franperic/pokehackathon | c91977d630914d738de995ddddd1791b5b7b23b9 | [
"MIT"
] | null | null | null | 156.522388 | 1,747 | 0.694384 | [
[
[
"import keras\n\n# keras.models.load_model(\"/20200530\")\ncheck = keras.models.load_model(\"data/06_models/20200530\")\n# check = keras.models.load_model(\"20200530.zip\")\n# check.predict(norm_X_test)",
"_____no_output_____"
],
[
"!ls data/06_models/",
"\u001b[34m20200530\u001b[m\u001b[m\r\n"
],
[
"!ls",
"MLPregressor.sav \u001b[34mdocs\u001b[m\u001b[m possible_battles.zip\r\nREADME.md \u001b[31mkedro_cli.py\u001b[m\u001b[m setup.cfg\r\n\u001b[34m__pycache__\u001b[m\u001b[m \u001b[34mlogs\u001b[m\u001b[m \u001b[34msrc\u001b[m\u001b[m\r\n\u001b[34mconf\u001b[m\u001b[m \u001b[34mnotebooks\u001b[m\u001b[m\r\n\u001b[34mdata\u001b[m\u001b[m possible_battles.csv\r\n"
],
[
"check = keras.models.load_model(\"data/06_models/20200530\")",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code"
]
] |
d07f2bba4c319cc1ed18a85e6702d87b8fc08afb | 16,566 | ipynb | Jupyter Notebook | Computational/Intro to Numerical Computing/Note_Chap10_QR algorithm for eigenvalue problems.ipynb | XavierOwen/Notes | d262a9103b29ee043aa198b475654aabd7a2818d | [
"MIT"
] | 2 | 2018-11-27T10:31:08.000Z | 2019-01-20T03:11:58.000Z | Computational/Intro to Numerical Computing/Note_Chap10_QR algorithm for eigenvalue problems.ipynb | XavierOwen/Notes | d262a9103b29ee043aa198b475654aabd7a2818d | [
"MIT"
] | null | null | null | Computational/Intro to Numerical Computing/Note_Chap10_QR algorithm for eigenvalue problems.ipynb | XavierOwen/Notes | d262a9103b29ee043aa198b475654aabd7a2818d | [
"MIT"
] | 1 | 2020-07-14T19:57:23.000Z | 2020-07-14T19:57:23.000Z | 51.931034 | 565 | 0.499396 | [
[
[
"# Simultaneous iteration\nLet $A$ be an $m \\times m$ real symmetric matrix with eigenvalue decomposition: $\\newcommand{\\ffrac}{\\displaystyle \\frac} \\newcommand{\\Tran}[1]{{#1}^{\\mathrm{T}}}A = Q \\Lambda \\Tran{Q}$, where the eigenvalues of $A$ are ordered as $\\left| \\lambda_1 \\right| > \\left| \\lambda_2 \\right| > \\left| \\lambda_3 \\right| \\geq \\cdots \\geq \\left| \\lambda_m \\right|$ with corresponding orthonormal eigenvectors $\\vec{q}_1, \\vec{q}_2, \\dots, \\vec{q}_m$, the columns of matrix $Q$.\n\nNow we try to obtain the two largest eigenvalues $\\lambda_1, \\lambda_2$ and their corresponding eigenvectors. We start from $\\vec{e}_1, \\vec{e}_2$ to do the power iteration.\n\n$$\\vec{e}_1 = \\sum_{i=1}^{m} \\alpha_i \\vec{q}_i, \\vec{e}_2 = \\sum_{i=1}^{m} \\beta_i \\vec{q}_i$$\n\nSo after we assume that $\\alpha_1 \\neq 0$, as $k \\to \\infty$, $\\left\\| \\ffrac{ A^k \\vec{e}_1} {\\alpha_1 \\lambda_1^k} \\right\\| \\to \\left\\| \\vec{q}_1 \\right\\|$, $\\left\\| \\ffrac{ A^k \\vec{e}_2} {\\beta_1 \\lambda_1^k} \\right\\| \\to \\left\\| \\vec{q}_1 \\right\\|$ with convergence speed the ratio of $\\ffrac{\\lambda_2} {\\lambda_1}$; also seeing from $\\beta_1 A^k \\vec{e}_1 - \\alpha_1 A^k \\vec{e}_2$, if we also assume that $\\ffrac{\\alpha_2} {\\alpha_1} \\neq \\ffrac{\\beta_2} {\\beta_1}$, $i.e.$, $\\left| \\begin{array}{cc}\n\\alpha_1 & \\alpha_2 \\\\\n\\beta_1 & \\beta_2 \n\\end{array} \\right| \\neq 0$, as $k \\to \\infty$, in terms of spanned space: $\\left \\langle A^k\\vec{e}_1, A^k\\vec{e}_2 \\right \\rangle \\to \\left \\langle \\vec{q}_1, \\vec{q}_2 \\right \\rangle$. 
Now from $QR$ factorization, we have\n\n$$A^k = \\left[\\begin{array}{cccc}\nA^k\\vec{e}_1 & A^k \\vec{e}_2 & \\cdots A^k \\vec{e}_m\n\\end{array}\n\\right] = Q^{(k)} R^{(k)} \\Rightarrow \\left[\\begin{array}{cc}\nA^k\\vec{e}_1 & A^k \\vec{e}_2 \n\\end{array}\n\\right] = \\underline{Q}^{(k)}\\underline{R}^{(k)} = \\left[\\begin{array}{cc}\n\\underline{\\vec{q}}_1^{(k)} & \\underline{\\vec{q}}_2^{(k)} \n\\end{array}\n\\right]\\underline{R}^{(k)}$$\n\nand as $k \\to \\infty$ we also have $\\underline{\\vec{q}}_i^{(k)} \\to \\vec{q}_i$. And we can generalize this method to obtain all the eigenvalues and eigenvectors of $A$.\n\nWe first assume that $\\left| \\lambda_1 \\right| > \\left| \\lambda_2 \\right| > \\cdots > \\left| \\lambda_m \\right|$. And express the unit vector as the linear combination of $\\vec{q}_1, \\vec{q}_2, \\dots , \\vec{q}_m$: $\\vec{e}_i = \\displaystyle\\sum_{j=1}^{m} z_{ij} \\vec{q}_j$, $i.e.$,\n\n$$I = \\left[\\begin{array}{cccc}\n\\vec{q}_1 & \\vec{q}_2 & \\cdots \\vec{q}_m\n\\end{array}\n\\right] \\Tran{\\left[ \\begin{array}{cccc}\nz_{11} & z_{12} & \\cdots & z_{1m} \\\\\nz_{21} & z_{22} & \\cdots & z_{2m} \\\\\n\\vdots & \\vdots & \\ddots & \\vdots \\\\\nz_{m1} & z_{m2} & \\cdots & z_{mm} \\\\\n\\end{array}\\right]} = Q \\Tran{Q}$$\n\nwhich require that $\\left| \\begin{array}{cccc}\nz_{11} & z_{12} & \\cdots & z_{1i} \\\\\nz_{21} & z_{22} & \\cdots & z_{2i} \\\\\n\\vdots & \\vdots & \\ddots & \\vdots \\\\\nz_{i1} & z_{i2} & \\cdots & z_{ii} \\\\\n\\end{array}\\right| \\neq 0$ for $i = 1 , 2 , \\dots , m$, $i.e.$, we assume that all the **leading principal minors of the matrix** $Q$ are nonsingular. And actually we have $\\left[ \\begin{array}{cccc}\nz_{11} & z_{12} & \\cdots & z_{1m} \\\\\nz_{21} & z_{22} & \\cdots & z_{2m} \\\\\n\\vdots & \\vdots & \\ddots & \\vdots \\\\\nz_{m1} & z_{m2} & \\cdots & z_{mm} \\\\\n\\end{array}\\right] = Q$. 
Then as $k \\to \\infty$, we have for $j = 1, 2, \\dots, m$\n\n$$\\left \\langle A^k \\vec{e}_1, A^k \\vec{e}_2, \\cdots, A^k \\vec{e}_j \\right \\rangle \\longrightarrow \\left \\langle \\vec{q}_1, \\vec{q}_2, \\cdots, \\vec{q}_j \\right \\rangle$$ \n\nThus, each $\\vec{q}_j$ is the limit of the component in $A^k \\vec{e}_j$ orthogonal to the space $\\left \\langle A^k \\vec{e}_1, A^k \\vec{e}_2, \\cdots, A^k \\vec{e}_{j-1} \\right \\rangle$. And such orthogonal components can be obtained from the $QR$ factorization of the matrix $A^k$.\n\n$$A^k = \\left[\\begin{array}{cccc}\nA^k\\vec{e}_1 & A^k \\vec{e}_2 & \\cdots A^k \\vec{e}_m\n\\end{array}\n\\right] = \\underline{Q}^{(k)} \\underline{R}^{(k)} = \\left[ \\begin{array}{cccc}\n\\underline{\\vec{q}}_1^{(k)} & \\underline{\\vec{q}}_2^{(k)} & \\cdots & \\underline{\\vec{q}}_m^{(k)}\n\\end{array}\n\\right] \\underline{R}^{(k)}$$\n\nAnd then as $k \\to \\infty$, $\\underline{\\vec{q}}_j^{(k)} \\to \\vec{q}_j$ for all possible $j$, each with a convergence rate of $\\left| \\ffrac{\\lambda_{j+1}} {\\lambda_{j}}\\right|$.\n\nIn conclusion, for any real symmetric matrix $A$, as long as all its eigenvalues are *distinct* and all *leading principal minors* of the matrix $Q$ are nonsingular, the orthogonal matrix $\\underline{Q}^{(k)}$ from the $QR$ decomposition of $A^k$ converges to $Q$, and we can then use the Rayleigh quotient to find the corresponding eigenvalues.\n\n$Algorithm$\n\nGiven a real symmetric matrix $A$, we apply the **Simultaneous iteration**\n\n```\nAK = A\nfor k = 1:n\n AK = A * AK\nend\nQ*R = AK\n```\n\nHowever, this method is not very stable. Since all columns of $A^k$ converge to $\\vec{q}_1$, the condition number will get larger and larger with increasing $k$. Here's a more stable one.\n\n```\nAK = A\nfor k = 1:n\n Q * R = AK \n AK = A * Q\nend\n```\n\nIn addition, it's also hard to determine the convergence of the algorithm. See another one below.",
"_____no_output_____"
],
[
"# QR algorithm without shift\n\n$Algorithm$\n\n$QR$ algorithm without shift\n\n```MATLAB\nA^{(0)} = A\nfor k = 1:until convergence\n Q^{(k)} * R^{(k)} = A^{(k − 1)}\n A^{(k)} = R^{(k)} * Q^{(k)}\nend\n```\n\nFor this algorithm, $A^{(k)}$ will converge to $\\Lambda$. What we are going to prove is that for $k = 1, 2, \\dots $, the following two equations hold.\n\n$$\nA^k = Q^{(1)} Q^{(2)} \\cdots Q^{(k)} R^{(k)} \\cdots R^{(2)} R^{(1)} := \\underline{Q}^{(k)}\\underline{R}^{(k)} \\\\\nA^{(k)}:=\\Tran{\\left( Q^{(1)} Q^{(2)} \\cdots Q^{(k)} \\right)} A \\left( Q^{(1)} Q^{(2)} \\cdots Q^{(k)} \\right)\n$$\n\n$Proof$\n\n$$\nA^1 = A^{(0)} = Q^{(1)} R^{(1)} := \\underline{Q}^{(1)}\\underline{R}^{(1)} \\\\\nA^{(1)}:= R^{(1)} Q^{(1)} = \\Tran{\\left( Q^{(1)} \\right)}Q^{(1)}R^{(1)} \\left( Q^{(1)} \\right) = \\Tran{\\left( Q^{(1)} \\right)} A^{(0)} \\left( Q^{(1)} \\right) = \\Tran{\\left( Q^{(1)} \\right)}A \\left( Q^{(1)} \\right) \\\\\nQ^{(2)}R^{(2)} = A^{(1)} \\\\\nA^{(2)}:= R^{(2)} Q^{(2)} = \\Tran{\\left( Q^{(2)} \\right)}Q^{(2)}R^{(2)} \\left( Q^{(2)} \\right) = \\Tran{\\left( Q^{(2)} \\right)} A^{(1)} \\left( Q^{(2)} \\right) = \\Tran{\\left( Q^{(2)} \\right)} \\Tran{\\left( Q^{(1)} \\right)}A \\left( Q^{(1)} \\right) \\left( Q^{(2)} \\right)\\\\\nA^2 = A\\left( Q^{(1)}R^{(1)} \\right) = Q^{(1)}\\left( \\Tran{Q^{(1)}}AQ^{(1)} \\right) R^{(1)} = Q^{(1)}\\left( A^{(1)} \\right) R^{(1)} = Q^{(1)}Q^{(2)}R^{(2)}R^{(1)} := \\underline{Q}^{(2)}\\underline{R}^{(2)} \n$$\n\nIterating in this way, we have\n\n$$\nA^{(k)} = \\Tran{{\\underline{Q}}^{(k)}} A {\\underline{Q}}^{(k)}\\\\\nA^k = \\underline{Q}^{(k)}\\underline{R}^{(k)}\n$$\n\nso as $k \\to \\infty$ we have $\\Tran{Q} A Q = \\Lambda$. Here's the theorem\n\n$Theorem$\n\nLet $A$ be a real symmetric matrix with eigenvalue decomposition $A = Q \\Lambda \\Tran{Q}$, and $\\left| \\lambda_1 \\right| > \\left| \\lambda_2 \\right| > \\cdots > \\left| \\lambda_m \\right|$. 
Assume that all the leading principal\nminors of $Q$ are nonsingular. Then from the $QR$ algorithm, $A^{(k)}$ converges to $\\Lambda$, as $k \\to \\infty$, with a linear convergence rate determined by $\\max\\limits_{1 \\leq j < m} \\ffrac{\\left| \\lambda_{j+1} \\right|} {\\left| \\lambda_{j} \\right|}$",
"_____no_output_____"
],
[
"# Deflation in the implementation of QR algorithm\nSo actually the sequence of $A^{(k)}$ will converge to $\\Lambda$, with the off-diagonal entries close to $0$, at least smaller than a certain tolerance value $\\varepsilon$.\n\n$Algorithm$\n\n```MATLAB\nfunction [ Anew ] = qralg( A )\n l = length( A );\n while ( | A(l,l-1) | > varepsilon )\n [ Q, R ] = qr( A );\n A = R * Q;\n end\n Anew = A;\nend\n\nfunction [ eigens ] = qreigens( A )\n for k = m:-1:2\n Anew = qralg( A );\n eigens( k ) = Anew( k,k );\n A = Anew(1:k-1 , 1:k-1 );\n end\n eigens(1) = A(1,1);\nend\n```\n\nEach time we obtain an eigenvalue in the bottom-right corner of the matrix, we deflate the matrix by one dimension until all the eigenvalues are found.",
"_____no_output_____"
],
[
"# QR algorithm with shift\nCompared with power iteration, the inverse iteration and the Rayleigh quotient iteration converge faster when the *shift value* is sufficiently close to a true eigenvalue of $A$. And when doing inverse iteration we can introduce a shift into the $QR$ algorithm and make it converge faster. See the algorithm first\n\n$$A^{k} = \\underline{Q}^{(k)} \\underline{R}^{(k)} \\Rightarrow A^{-k} = \\left( \\underline{R}^{(k)} \\right)^{-1} \\left( \\underline{Q}^{(k)} \\right)^{-1} = \\left( \\underline{R}^{(k)} \\right)^{-1} \\Tran{\\left( \\underline{Q}^{(k)} \\right)} \\\\\n$$\n\nThen since $A$ is symmetric, we have $\\left( \\Tran{A} \\right)^{-k} = A^{-k} = \\left( \\underline{Q}^{(k)} \\right) \\left( \\Tran{ \\underline{R}^{(k)}} \\right)^{-1}$. Notice that here $\\left( \\Tran{ \\underline{R}^{(k)}} \\right) ^{-1}$ is a lower triangular matrix.\n\nThen we denote $P$ the $m$ by $m$ permutation matrix. $P = \\left[ \\begin{array}{cccc}\n& & & 1 \\\\\n& & 1 & \\\\\n& \\ddots & & \\\\\n1 & & & \n\\end{array}\\right] = \\left[ \\begin{array}{cccc}\n\\vec{e}_m & \\vec{e}_{m-1} & \\cdots & \\vec{e}_1\n\\end{array}\\right]$. 
Then we have\n\n$$A^{-k}P = \\left( \\underline{Q}^{(k)}P \\right) P\\left( \\Tran{ \\underline{R}^{(k)}} \\right)^{-1}P$$\n\nThen, denoting $P\\left( \\Tran{ \\underline{R}^{(k)}} \\right)^{-1}P$ as $\\tilde{\\underline{R}}^{(k)}$, we have the following formula\n\n$$\\left( A^{-1} \\right)^{k} \\left[ \\begin{array}{cccc}\n\\vec{e}_m & \\vec{e}_{m-1} & \\cdots & \\vec{e}_1\n\\end{array}\\right] = \\left[ \\begin{array}{cccc}\n\\underline{\\vec{q}}_m^{(k)} & \\underline{\\vec{q}}_{m-1}^{(k)} & \\cdots & \\underline{\\vec{q}}_1^{(k)}\n\\end{array}\\right]\\tilde{\\underline{R}}^{(k)}\n$$\n\nAnd the $QR$ algorithm with a constant shift value $\\mu$ is as follows.\n\n$Algorithm$\n\n```\nA^{(0)} = A;\n\nfor k = 1:n \n Q^{(k)} * R^{(k)} = A^{(k-1)} - \\mu * I;\n A^{(k)} = R^{(k)} Q^{(k)} + \\mu * I;\nend\n```\n\nThis is essentially the $QR$ algorithm on the matrix $A - \\mu I$. So similarly we have\n\n$$\n\\left( A - \\mu I \\right)^k = Q^{(1)} Q^{(2)} \\cdots Q^{(k)} R^{(k)} \\cdots R^{(2)} R^{(1)} := \\underline{Q}^{(k)}\\underline{R}^{(k)} \\\\\nA^{(k)}:=\\Tran{\\left( Q^{(1)} Q^{(2)} \\cdots Q^{(k)} \\right)} A \\left( Q^{(1)} Q^{(2)} \\cdots Q^{(k)} \\right)\n$$\n\nThen written in terms of the inverse iteration, we have \n\n$$\\left( \\left( A-\\mu I \\right)^{-1} \\right)^{k} \\left[ \\begin{array}{cccc}\n\\vec{e}_m & \\vec{e}_{m-1} & \\cdots & \\vec{e}_1\n\\end{array}\\right] = \\left[ \\begin{array}{cccc}\n\\underline{\\vec{q}}_m^{(k)} & \\underline{\\vec{q}}_{m-1}^{(k)} & \\cdots & \\underline{\\vec{q}}_1^{(k)}\n\\end{array}\\right]\\tilde{\\underline{R}}^{(k)}\n$$\n\nStill the $\\underline{\\vec{q}}_{m}^{(k)}$ will converge to the eigenvector $\\vec{q}_m$ of $A$, but the speed is now determined by the ratio $\\ffrac{\\left| \\lambda_m - \\mu \\right|} {\\left| \\lambda_{m - 1} - \\mu \\right|}$, and we can even update the shift value $\\mu$ in each iteration as we did before in the Rayleigh quotient iteration.\n\n$Algorithm$\n\n```\nA^{(0)} = A;\nfor k = 1:n\n Q^{(k)} 
R^{(k)} = A^{(k-1)} - \\mu^{(k)} * I\n A^{(k)} = R^{(k)} Q^{(k)} + \\mu^{(k)} * I\nend\n```\n***\n\nBut how to find that shift value, and how to update each time? We want the $\\mu^{(k)}$ are sufficiently close to $\\lambda_m$, so when the convergence to $\\lambda_m$ is achieved, we not only deflate the matrix to a smaller one but also shift the value of $\\mu^{(k)} = A^{(k-1)}(m,m)$.\n\nHere's the explanation: From Rayleigh quotient, $\\Tran{ \\underline{\\vec{q}}_m^{(k-1)} }A\\underline{\\vec{q}}_m^{(k-1)}$ and since $A^{(k-1)} = \\Tran{ \\left( \\underline{Q}^{(k-1)} \\right) } A \\underline{Q}^{(k-1)}$, \n\n$$\\Tran{ \\underline{\\vec{q}}_m^{(k-1)} }A\\underline{\\vec{q}}_m^{(k-1)} = \\left( \\Tran{\\vec{e}_{m}} \\Tran{ \\left( \\underline{Q}^{(k-1)} \\right) } \\right) A \\left( \\underline{Q}^{(k-1)}\\vec{e}_{m} \\right) = \\Tran{\\vec{e}_{m}}A^{(k-1)}\\vec{e}_{m} = A^{(k-1)}(m,m) := \\mu^{(k)}$$\n\nAnd there's a better shift: *Wilkinson shift*, defined as follows: the one eigenvalue of the lower-rightmost $2 \\times 2$ submatrix of $A^{(k-1)}$ that is closer to $A^{(k-1)}_{m,m}$.\n\n$$\\left[ \\begin{array}{cc}\nA^{(k-1)}_{m-1,m-1} & A^{(k-1)}_{m-1,m} \\\\[1em]\nA^{(k-1)}_{m,m-1} & A^{(k-1)}_{m,m}\n\\end{array} \\right]$$\n\nAnd we can write that as \n\n$$\\mu^{(k)} = A^{(k-1)}_{m,m} - \\DeclareMathOperator*{\\sign}{sign} \\frac{ \\sign \\left( \\delta \\right) {A^{(k-1)} _{m,m-1}}^2} { \\left| \\delta \\right| + \\sqrt{ \\delta^2 + {A^{(k-1)} _{m,m-1}}^2} } \\\\[1em]\n\\delta = \\frac{\\left( A^{(k-1)} _{m-1,m-1} - A^{(k-1)} _{m,m} \\right)} {2}$$",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
d07f356dad9a8133c3d68b8a3ce0682c57da435f | 1,033,757 | ipynb | Jupyter Notebook | Linear Regression w sklearn Case Exercise.ipynb | jovemmanuelre/Practical-Case-Example-Regression-with-sklearn | 3011d7efe4a8083cf987632c6840695c5f685d57 | [
"Apache-2.0"
] | null | null | null | Linear Regression w sklearn Case Exercise.ipynb | jovemmanuelre/Practical-Case-Example-Regression-with-sklearn | 3011d7efe4a8083cf987632c6840695c5f685d57 | [
"Apache-2.0"
] | null | null | null | Linear Regression w sklearn Case Exercise.ipynb | jovemmanuelre/Practical-Case-Example-Regression-with-sklearn | 3011d7efe4a8083cf987632c6840695c5f685d57 | [
"Apache-2.0"
] | null | null | null | 51.014459 | 44,428 | 0.561072 | [
[
[
"import numpy as np\nimport pandas as pd\nimport statsmodels.api as sm\nimport matplotlib.pyplot as plt\nfrom sklearn.linear_model import LinearRegression\nimport seaborn as sns\nsns.set()",
"_____no_output_____"
],
[
"raw_data = pd.read_csv('1.04. Real-life example.csv')\nraw_data.head()",
"_____no_output_____"
]
],
[
[
"## Preprocessing",
"_____no_output_____"
]
],
[
[
"raw_data.describe(include='all')",
"_____no_output_____"
],
[
"data = raw_data.drop(['Model'],axis=1)\ndata.describe(include='all')\n# dropped the Model category because it is irrelevant to increasing the accuracy of my model.",
"_____no_output_____"
],
[
"data.isnull().sum()\n#this shows the missing values in my dataset",
"_____no_output_____"
],
[
"data_no_mv = data.dropna(axis=0)\n#I drop the missing values here, which is acceptable because it is less than 5% of the observations.",
"_____no_output_____"
],
[
"data_no_mv.describe(include='all')",
"_____no_output_____"
]
],
[
[
"### PDFs\n#### Here I check the Probability Distribution Functions (PDF) of the Independent Variables Price, Year, Mileage, and Engine Volume to identify and weed out the Outliers. They can adversely affect the accuracy of my Regression model because a Regression attempts to draw a line closest to all the data; including the Outliers might inflate/deflate my model. ",
"_____no_output_____"
]
],
[
[
"sns.distplot(data_no_mv['Price'])",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"q = data_no_mv['Price'].quantile(0.99)\ndata_1 = data_no_mv[data_no_mv['Price']<q]\ndata_1.describe(include='all')\n# I decided to exclude the observations in the 99th percentile and above to get rid of the Outliers.",
"_____no_output_____"
],
[
"sns.distplot(data_1['Price'])\n# Now the Price variable only includes observations below the 99th percentile and has far fewer Outliers.",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"sns.distplot(data_no_mv['Mileage'])",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"q = data_1['Mileage'].quantile(0.99)\ndata_2 = data_1[data_1['Mileage']<q]\n# Similar to the Price variable, I decided to exclude the observations in the 99th percentile and beyond to remove the Outliers.",
"_____no_output_____"
],
[
"sns.distplot(data_2['Mileage'])",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"sns.distplot(data_no_mv['EngineV'])\n# The PDF looks unusual compared to the previous two.",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"data_3 = data_2[data_2['EngineV']<6.6]\n# After research, I found out that the normal interval of the Engine Volume falls between 0.6 and 6.5.\n# The observations beyond 6.5 are mostly 99.99 - a value that was used in the past to label missing values. It is a bad idea to label missing values in this manner now.\n# I decided to remove such observations as they are Outliers.",
"_____no_output_____"
],
[
"sns.distplot(data_3['EngineV'])",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"sns.distplot(data_no_mv['Year'])\n# Most cars are newer but there are a few vintage cars in the variable.",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"q = data_3['Year'].quantile(0.01)\ndata_4 = data_3[data_3['Year']>q]\n# I decided to remove the 1st percentile and keep the rest",
"_____no_output_____"
],
[
"sns.distplot(data_4['Year'])",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"data_cleaned = data_4.reset_index(drop=True)\n#I reset the index to completely forget the old index.",
"_____no_output_____"
],
[
"data_cleaned.describe(include='all')\n# This excludes ~250 problematic observations that could've hindered the accuracy of my model if left unchecked.",
"_____no_output_____"
]
],
[
[
"## Checking the OLS assumptions",
"_____no_output_____"
],
[
"### Distribution",
"_____no_output_____"
]
],
[
[
"f, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True, figsize =(15,3))\nax1.scatter(data_cleaned['Year'],data_cleaned['Price'])\nax1.set_title('Price and Year')\nax2.scatter(data_cleaned['EngineV'],data_cleaned['Price'])\nax2.set_title('Price and EngineV')\nax3.scatter(data_cleaned['Mileage'],data_cleaned['Price'])\nax3.set_title('Price and Mileage')\nplt.show()\n# These patterns are not linear, which shows that I should first transform one or more variables before running the Regression.",
"_____no_output_____"
],
[
"sns.distplot(data_cleaned['Price'])\n#Here I check the distribution of the dependent variable Price.",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"log_price = np.log(data_cleaned['Price'])\n# Here I used the log transformation to fix heteroscedasticity and remove outliers from the variable Price.\ndata_cleaned['log_price'] = log_price\ndata_cleaned",
"_____no_output_____"
],
[
"f, (ax1, ax2, ax3) = plt.subplots(1, 3, sharey=True, figsize =(15,3))\nax1.scatter(data_cleaned['Year'],data_cleaned['log_price'])\nax1.set_title('Log Price and Year')\nax2.scatter(data_cleaned['EngineV'],data_cleaned['log_price'])\nax2.set_title('Log Price and EngineV')\nax3.scatter(data_cleaned['Mileage'],data_cleaned['log_price'])\nax3.set_title('Log Price and Mileage')\nplt.show()\n# After performing the log transformation on Price, the scatter plots now show linear patterns.",
"_____no_output_____"
],
[
"data_cleaned = data_cleaned.drop(['Price'],axis=1)\n# Here I dropped the variable Price and replaced it with log_Price because the former has no statistical significance to my model.",
"_____no_output_____"
]
],
[
[
"### Multicollinearity",
"_____no_output_____"
]
],
[
[
"data_cleaned.columns.values",
"_____no_output_____"
],
[
"from statsmodels.stats.outliers_influence import variance_inflation_factor\nvariables = data_cleaned[['Mileage','Year','EngineV']]\nvif = pd.DataFrame()\nvif[\"VIF\"] = [variance_inflation_factor(variables.values, i) for i in range(variables.shape[1])]\nvif[\"features\"] = variables.columns\n# Through Statsmodels, I used the Variance Inflation Factor here to check for multicollinearity in my variables.\n# While I expect some multicollinearity in my data, I wanted to identify the variables that introduce unacceptable correlation into my model; they have high VIFs.",
"_____no_output_____"
],
[
"vif",
"_____no_output_____"
],
[
"data_no_multicollinearity = data_cleaned.drop(['Year'],axis=1)\n# Dropped 'Year' because it has an unacceptably high VIF and is therefore a feature that introduces correlation in my data",
"_____no_output_____"
],
[
"data_with_dummies = pd.get_dummies(data_no_multicollinearity, drop_first=True)\n# This identifies categorical variables and creates dummies automatically to avoid multicollinearity in my Model",
"_____no_output_____"
],
[
"data_with_dummies.head()",
"_____no_output_____"
],
[
"data_with_dummies.columns.values",
"_____no_output_____"
],
[
"cols = ['log_price', 'Mileage', 'EngineV', 'Brand_BMW',\n 'Brand_Mercedes-Benz', 'Brand_Mitsubishi', 'Brand_Renault',\n 'Brand_Toyota', 'Brand_Volkswagen', 'Body_hatch', 'Body_other',\n 'Body_sedan', 'Body_vagon', 'Body_van', 'Engine Type_Gas',\n 'Engine Type_Other', 'Engine Type_Petrol', 'Registration_yes']",
"_____no_output_____"
],
[
"data_preprocessed = data_with_dummies[cols]\ndata_preprocessed.head()\n# Here I arranged the data into a table.",
"_____no_output_____"
]
],
[
[
"## Training my Regression Model\n",
"_____no_output_____"
]
],
[
[
"targets = data_preprocessed['log_price']\ninputs = data_preprocessed.drop(['log_price'], axis=1)\n# I removed log_price in the inputs to exclude the transformed dependent variable from my inputs.",
"_____no_output_____"
],
[
"from sklearn.preprocessing import StandardScaler\nscaler = StandardScaler()\nscaler.fit(inputs)\ninputs_scaled = scaler.transform(inputs)\n# This standardizes my inputs; in other words, it subtracts the mean from each observation and divides by the standard deviation.",
"_____no_output_____"
],
[
"from sklearn.model_selection import train_test_split\nx_train, x_test, y_train, y_test = train_test_split(inputs_scaled, targets, test_size=0.2, random_state=365)\n# I did this to avoid overfitting my model to my data.\n# The default setting of the train-test split is 75-25, but here I chose 80-20.\n# I used 'random_state' to ensure that I get the same random shuffle every time I split my data.\n",
"_____no_output_____"
],
[
"reg = LinearRegression()\nreg.fit(x_train, y_train)",
"_____no_output_____"
],
[
"y_hat = reg.predict(x_train)",
"_____no_output_____"
],
[
"plt.scatter(y_train, y_hat)\nplt.xlabel('Targets(y_train)', size=20)\nplt.ylabel('Predictions(y_hat)', size=20)\nplt.xlim(6,13)\nplt.ylim(6,13)\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Using Residuals to check the Model",
"_____no_output_____"
]
],
[
[
"sns.distplot(y_train-y_hat)\nplt.title(\"Residuals PDF\", size=20)\n# to check whether the Residuals are normally distributed and to assess the variability of the outcome",
"/Users/jovemmanuelr.ermitano/opt/anaconda3/envs/JERETech/lib/python3.7/site-packages/seaborn/distributions.py:2619: FutureWarning: `distplot` is a deprecated function and will be removed in a future version. Please adapt your code to use either `displot` (a figure-level function with similar flexibility) or `histplot` (an axes-level function for histograms).\n warnings.warn(msg, FutureWarning)\n"
],
[
"reg.score(x_train, y_train)",
"_____no_output_____"
],
[
"reg.intercept_\n# The intercept or bias calibrates the model: without it, each feature will be off the mark.",
"_____no_output_____"
],
[
"reg.coef_",
"_____no_output_____"
],
[
"reg_summary=pd.DataFrame(inputs.columns.values, columns=['Features'])\nreg_summary['Weights']=reg.coef_\nreg_summary\n\n# A feature with a coefficient of 0 means that it has no significance to the model.",
"_____no_output_____"
],
[
"#to know the categorical variables of my features\ndata_cleaned['Brand'].unique()",
"_____no_output_____"
],
[
"data_cleaned['Body'].unique()",
"_____no_output_____"
],
[
"data_cleaned['Engine Type'].unique()",
"_____no_output_____"
],
[
"data_cleaned['Registration'].unique()",
"_____no_output_____"
]
],
[
[
"## Testing my Model",
"_____no_output_____"
]
],
[
[
"y_hat_test = reg.predict(x_test)",
"_____no_output_____"
],
[
"plt.scatter(y_test, y_hat_test, alpha=0.2)\nplt.xlabel('Targets(y_test)', size=20)\nplt.ylabel('Predictions(y_hat_test)', size=20)\nplt.xlim(6, 13)\nplt.ylim(6, 13)\nplt.show()",
"_____no_output_____"
],
[
"df_pf = pd.DataFrame(np.exp(y_hat_test), columns=['Prediction'])\n#this returns the exponential of Y Hat Test and removes the log. \ndf_pf.head()",
"_____no_output_____"
],
[
"y_test = y_test.reset_index(drop=True)\ny_test.head()",
"_____no_output_____"
],
[
"df_pf['Target'] = np.exp(y_test)\ndf_pf",
"_____no_output_____"
],
[
"df_pf['Residual'] = df_pf['Target'] - df_pf['Prediction']\ndf_pf['Difference%'] = np.absolute(df_pf['Residual']/df_pf['Target']*100)\ndf_pf",
"_____no_output_____"
],
[
"pd.options.display.max_rows = 999\npd.set_option('display.float_format', lambda x: '%.2f' % x)\ndf_pf.sort_values(by=['Difference%'])\n# This table shows the difference in percentage between the prediction and the target using the test data.\n# I included the Residuals because examining them is the same as examining the heart of the algorithm.",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f49428d1b59e24accb047ce99e1578f472b63 | 247,748 | ipynb | Jupyter Notebook | notebooks/regular_sentiment_analysis_pipeline.ipynb | AnzorGozalishvili/active_learning_playground | 9f7c71465456122d487b6dd088b425f66b1eaee7 | [
"MIT"
] | null | null | null | notebooks/regular_sentiment_analysis_pipeline.ipynb | AnzorGozalishvili/active_learning_playground | 9f7c71465456122d487b6dd088b425f66b1eaee7 | [
"MIT"
] | null | null | null | notebooks/regular_sentiment_analysis_pipeline.ipynb | AnzorGozalishvili/active_learning_playground | 9f7c71465456122d487b6dd088b425f66b1eaee7 | [
"MIT"
] | null | null | null | 41.202062 | 34,010 | 0.554103 | [
[
[
"<a href=\"https://colab.research.google.com/github/AnzorGozalishvili/active_learning_playground/blob/main/notebooks/regular_sentiment_analysis_pipeline.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# Simple Sentiment Analysis Pipeline\n\nHere we train a simple 2-layer neural network for sentiment analysis.\n\n- Model: 2-layer fully connected NN (PyTorch)\n- Dataset: SMS Spam Collection (binary text classification)\n- Embedding: sentence-BERT vectors via `spacy_sentence_bert` (`en_paraphrase_distilroberta_base_v1`)",
"_____no_output_____"
],
[
"Install Requirements from [repository](https://github.com/AnzorGozalishvili/active_learning_playground)",
"_____no_output_____"
]
],
[
[
"!wget https://raw.githubusercontent.com/AnzorGozalishvili/active_learning_playground/main/requirements.txt\n!pip install -r requirements.txt\n!rm requirements.txt\n!pip install spacy-sentence-bert==0.1.2",
"_____no_output_____"
]
],
[
[
"# Imports",
"_____no_output_____"
]
],
[
[
"# system\nimport os\nimport sys\n\n# data and models\nimport numpy as np\nimport pandas as pd\nimport scipy\n\n# utilities\nimport random\nimport re\nimport datetime\n\n# text embeddings\nimport spacy\nimport spacy_sentence_bert\n\n# scikit-learn stuff\nimport sklearn\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import f1_score, roc_auc_score, precision_score, recall_score\n\n# PyTorch stuff\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nimport torch.optim as optim\n\n# visualization\nimport matplotlib.pyplot as plt\nfrom tqdm import tqdm\n\n# dataset retrieval\nfrom io import BytesIO\nfrom zipfile import ZipFile\nfrom urllib.request import urlopen",
"_____no_output_____"
]
],
[
[
"# Set Random Seeds\nFor reproducibility we set several random seeds, as recommended by PyTorch. ([See here](https://pytorch.org/docs/stable/notes/randomness.html))",
"_____no_output_____"
]
],
[
[
"random.seed(hash(\"setting random seeds\") % 2**32 - 1)\nnp.random.seed(hash(\"improves reproducibility\") % 2**32 - 1)\ntorch.manual_seed(hash(\"PyTorch\") % 2**32 - 1)\n\nRANDOM_SEED = 42",
"_____no_output_____"
]
],
[
[
"# Dataset\nLet's download the dataset from the given [url](https://archive.ics.uci.edu/ml/machine-learning-databases/00228/smsspamcollection.zip), then take a look at some samples.",
"_____no_output_____"
],
[
"## Retrieve dataset",
"_____no_output_____"
]
],
[
[
"def get_dataset():\n resp = urlopen(\"https://archive.ics.uci.edu/ml/machine-learning-databases/00228/smsspamcollection.zip\")\n zipfile = ZipFile(BytesIO(resp.read()))\n \n lines = list()\n for line in zipfile.open('SMSSpamCollection').readlines():\n lines.append(line.decode('utf-8'))\n \n data = pd.DataFrame(data=lines)\n new = data[0].str.split(\"\\t\", n = 1, expand = True) \n data[\"text\"]= new[1] \n data[\"label\"]= new[0] \n data.drop(columns=[0], inplace = True)\n \n return data",
"_____no_output_____"
],
[
"dataset = get_dataset()",
"_____no_output_____"
]
],
[
[
"## Explore Samples",
"_____no_output_____"
]
],
[
[
"dataset.head()",
"_____no_output_____"
],
[
"dataset.shape",
"_____no_output_____"
]
],
[
[
"## Generate Train/Test splits and move forward",
"_____no_output_____"
],
[
"We see the imbalance in the target variable",
"_____no_output_____"
]
],
[
[
"dataset.label.value_counts()",
"_____no_output_____"
]
],
[
[
"We have duplicated records",
"_____no_output_____"
]
],
[
[
"dataset.duplicated().sum()",
"_____no_output_____"
]
],
[
[
"remove these duplicates",
"_____no_output_____"
]
],
[
[
"dataset.drop_duplicates(inplace=True)",
"_____no_output_____"
],
[
"dataset.reset_index(drop=True, inplace=True)",
"_____no_output_____"
]
],
[
[
"Split into train/test sets with an 80/20 ratio (20% held out for testing)",
"_____no_output_____"
]
],
[
[
"train, test = train_test_split(dataset, test_size=0.2, random_state=RANDOM_SEED)",
"_____no_output_____"
]
],
[
[
"Store these sets into dataset directory",
"_____no_output_____"
]
],
[
[
"DATASET_NAME = \"SMSSpamCollection\"\nif not os.path.exists('data'):\n os.mkdir('data')\nif not os.path.exists(f'data/{DATASET_NAME}'):\n os.mkdir(f'data/{DATASET_NAME}')\n \ntrain.to_csv(f'data/{DATASET_NAME}/train.csv')\ntest.to_csv(f'data/{DATASET_NAME}/test.csv')",
"_____no_output_____"
]
],
[
[
"Load again and continue",
"_____no_output_____"
]
],
[
[
"train = pd.read_csv(f'data/{DATASET_NAME}/train.csv', index_col=0)\ntest = pd.read_csv(f'data/{DATASET_NAME}/test.csv', index_col=0)",
"_____no_output_____"
],
[
"train.shape, test.shape",
"_____no_output_____"
],
[
"train.head(2)",
"_____no_output_____"
]
],
[
[
"# Generate Embeddings\n\nWe use sentence-BERT embeddings (loaded via `spacy_sentence_bert`) to vectorize our samples",
"_____no_output_____"
]
],
[
[
"class Vectorizer:\n \"\"\"Generates text embedding using deep learning model\"\"\"\n\n def __init__(self, *args, **kwargs):\n self.model = spacy_sentence_bert.load_model(kwargs.get('model', 'en_paraphrase_distilroberta_base_v1'))\n \n def __call__(self, text):\n if not text:\n text = \"\"\n \n return self.model(text).vector",
"_____no_output_____"
],
[
"vectorizer = Vectorizer()",
"_____no_output_____"
],
[
"EMBEDDING_DIM = vectorizer('sample text for embedding').shape[0]; \nEMBEDDING_DIM",
"_____no_output_____"
],
[
"train['vector'] = train.text.apply(vectorizer).apply(lambda x: x.tolist())\ntest['vector'] = test.text.apply(vectorizer).apply(lambda x: x.tolist())",
"_____no_output_____"
],
[
"DATASET_NAME = \"SMSSpamCollection\"\nif not os.path.exists('data'):\n os.mkdir('data')\nif not os.path.exists(f'data/{DATASET_NAME}'):\n os.mkdir(f'data/{DATASET_NAME}')\n \ntrain.to_csv(f'data/{DATASET_NAME}/train_vectorized.csv')\ntest.to_csv(f'data/{DATASET_NAME}/test_vectorized.csv')",
"_____no_output_____"
],
[
"train = pd.read_csv(f'data/{DATASET_NAME}/train_vectorized.csv', index_col=0)\ntest = pd.read_csv(f'data/{DATASET_NAME}/test_vectorized.csv', index_col=0)\n\ntrain['vector'] = train.vector.apply(eval)\ntest['vector'] = test.vector.apply(eval)",
"_____no_output_____"
]
],
[
[
"# PyTorch ML Pipeline",
"_____no_output_____"
],
[
"## Model\n\nExample of model is taken from [here](https://github.com/rmunro/pytorch_active_learning/blob/master/active_learning_basics.py)",
"_____no_output_____"
]
],
[
[
"class MLP(nn.Module):\n \"\"\"Simple 2 Layer Fully Connected NN (MLP)\"\"\"\n \n def __init__(self, num_labels, emb_dim):\n super(MLP, self).__init__()\n\n # Define model with one hidden layer with 128 neurons\n self.linear1 = nn.Linear(emb_dim, 128)\n self.linear2 = nn.Linear(128, num_labels)\n\n def forward(self, vector):\n hidden1 = self.linear1(vector).clamp(min=0) # ReLU\n output = self.linear2(hidden1)\n return F.log_softmax(output, dim=1)",
"_____no_output_____"
],
[
"MLP(num_labels=2, emb_dim=EMBEDDING_DIM)",
"_____no_output_____"
],
[
"\ntrain.sample()",
"_____no_output_____"
],
[
"torch.Tensor(train.vector.iloc[:10].values.tolist())",
"_____no_output_____"
],
[
"class Trainer:\n \"\"\"Trains PyTorch model on training data and also evaluated\"\"\"\n\n def __init__(self, *args, **kwargs):\n self.model = kwargs.get('model', MLP(num_labels=2, emb_dim=EMBEDDING_DIM))\n self.loss_function = kwargs.get('loss_function', nn.NLLLoss())\n self.optimizer = kwargs.get('optimizer', optim.SGD(self.model.parameters(), lr=0.01))\n self.label_to_idx = kwargs.get('label_to_idx', {'ham': 0, 'spam': 1})\n self.idx_to_label = {v:k for k,v in self.label_to_idx.items()}\n self.batch_size = kwargs.get('batch_size', 64)\n \n self.losses = []\n\n def train(self, training_data, test_data, epochs):\n \n for epoch in range(epochs):\n print(f'Epoch: {str(epoch)}')\n\n shuffled_training_data = training_data.sample(frac=1.0, random_state=RANDOM_SEED + epoch)\n\n for batch_idx, start_idx in enumerate(range(0, len(shuffled_training_data), self.batch_size)):\n vecs = torch.Tensor(\n shuffled_training_data.vector.iloc[start_idx:start_idx+self.batch_size].tolist()\n )\n targets = torch.LongTensor(\n shuffled_training_data.label.iloc[start_idx:start_idx+self.batch_size].apply(lambda x: self.label_to_idx[x]).tolist()\n )\n\n self.model.zero_grad()\n log_probs = self.model(vecs)\n\n loss = self.loss_function(log_probs, targets)\n loss.backward()\n self.optimizer.step()\n self.losses.append(loss.item())\n\n print(f\"\\tBatch: {batch_idx}\\tLoss: {self.losses[-1]}\")\n\n eval_results = self.evaluate(test_data)\n print(f\"Evaluation Results: {repr(eval_results)}\")\n\n # save model to path that is alphanumeric and includes number of items and accuracies in filename\n timestamp = re.sub('\\.[0-9]*','_',str(datetime.datetime.now())).replace(\" \", \"_\").replace(\"-\", \"\").replace(\":\",\"\")\n f1_score = str(eval_results['f1']) \n model_path = \"models/\"+timestamp+f1_score+\".params\"\n\n if not os.path.exists('models'):\n os.mkdir('models')\n\n torch.save(self.model.state_dict(), model_path)\n\n return model_path\n\n\n def evaluate(self, dataset):\n targets = 
[]\n preds = []\n probs = []\n\n with torch.no_grad():\n for idx, row in dataset.iterrows():\n vec = torch.Tensor(row.vector).view(1, -1)\n target = self.label_to_idx[row.label]\n \n logits = self.model(vec)\n prob = np.exp(logits.cpu().data.numpy()[0]).tolist()\n pred = np.argmax(prob)\n\n probs.append(prob[1])\n preds.append(pred)\n targets.append(target)\n \n results = {\n \"f1\": round(f1_score(targets, preds, pos_label=1), 3),\n \"precision\": round(precision_score(targets, preds, pos_label=1), 3),\n \"recall\": round(recall_score(targets, preds, pos_label=1), 3),\n \"roc_auc\": round(roc_auc_score(targets, probs, labels=list(self.label_to_idx.keys())), 3),\n }\n \n return results\n\n def plot_loss(self):\n plt.figure(figsize=(10, 6))\n plt.plot(self.losses)\n plt.show()",
"_____no_output_____"
],
[
"LABEL_TO_IDX = {item:idx for idx, item in enumerate(sorted(train.label.unique().tolist()))};\nLABEL_TO_IDX",
"_____no_output_____"
],
[
"mlp = MLP(num_labels=2, emb_dim=EMBEDDING_DIM)",
"_____no_output_____"
],
[
"trainer = Trainer(\n **{\n \"model\": mlp,\n \"loss_function\": nn.NLLLoss(),\n \"optimizer\": optim.SGD(mlp.parameters(), lr=0.01),\n \"label_to_idx\": LABEL_TO_IDX,\n \"batch_size\": 256,\n }\n)",
"_____no_output_____"
],
[
"trainer.train(training_data=train, test_data=test, epochs=10)",
"Epoch: 0\n\tBatch: 0\tLoss: 0.7006976008415222\n\tBatch: 1\tLoss: 0.6923462152481079\n\tBatch: 2\tLoss: 0.6919472217559814\n\tBatch: 3\tLoss: 0.6792134046554565\n\tBatch: 4\tLoss: 0.6740986704826355\n\tBatch: 5\tLoss: 0.6665879487991333\n\tBatch: 6\tLoss: 0.6615784764289856\n\tBatch: 7\tLoss: 0.6523606777191162\n\tBatch: 8\tLoss: 0.6496344208717346\n\tBatch: 9\tLoss: 0.6394569873809814\n\tBatch: 10\tLoss: 0.6366879940032959\n\tBatch: 11\tLoss: 0.6268129348754883\n\tBatch: 12\tLoss: 0.6198194026947021\n\tBatch: 13\tLoss: 0.6122130751609802\n\tBatch: 14\tLoss: 0.6036674976348877\n\tBatch: 15\tLoss: 0.6109205484390259\n\tBatch: 16\tLoss: 0.6451691389083862\nEpoch: 1\n\tBatch: 0\tLoss: 0.5963299870491028\n\tBatch: 1\tLoss: 0.5895066261291504\n\tBatch: 2\tLoss: 0.5847741365432739\n\tBatch: 3\tLoss: 0.5800781846046448\n\tBatch: 4\tLoss: 0.5626293420791626\n\tBatch: 5\tLoss: 0.5618782639503479\n\tBatch: 6\tLoss: 0.5759237408638\n\tBatch: 7\tLoss: 0.5480124950408936\n\tBatch: 8\tLoss: 0.5573577284812927\n\tBatch: 9\tLoss: 0.5636530518531799\n\tBatch: 10\tLoss: 0.5534355640411377\n\tBatch: 11\tLoss: 0.5363779067993164\n\tBatch: 12\tLoss: 0.531516432762146\n\tBatch: 13\tLoss: 0.5345359444618225\n\tBatch: 14\tLoss: 0.5113618969917297\n\tBatch: 15\tLoss: 0.5223129391670227\n\tBatch: 16\tLoss: 0.5452355146408081\nEpoch: 2\n\tBatch: 0\tLoss: 0.520038366317749\n\tBatch: 1\tLoss: 0.5214989185333252\n\tBatch: 2\tLoss: 0.49336323142051697\n\tBatch: 3\tLoss: 0.5137934684753418\n\tBatch: 4\tLoss: 0.5033373832702637\n\tBatch: 5\tLoss: 0.5038758516311646\n\tBatch: 6\tLoss: 0.4804615080356598\n\tBatch: 7\tLoss: 0.4932331144809723\n\tBatch: 8\tLoss: 0.47400185465812683\n\tBatch: 9\tLoss: 0.4609416723251343\n\tBatch: 10\tLoss: 0.4803539216518402\n\tBatch: 11\tLoss: 0.45216771960258484\n\tBatch: 12\tLoss: 0.4435400664806366\n\tBatch: 13\tLoss: 0.466361939907074\n\tBatch: 14\tLoss: 0.45447641611099243\n\tBatch: 15\tLoss: 0.4524100422859192\n\tBatch: 16\tLoss: 0.44968289136886597\nEpoch: 
3\n\tBatch: 0\tLoss: 0.4390922784805298\n\tBatch: 1\tLoss: 0.4375496804714203\n\tBatch: 2\tLoss: 0.462681382894516\n\tBatch: 3\tLoss: 0.4228961169719696\n\tBatch: 4\tLoss: 0.42517659068107605\n\tBatch: 5\tLoss: 0.4232017993927002\n\tBatch: 6\tLoss: 0.4278464615345001\n\tBatch: 7\tLoss: 0.4267292022705078\n\tBatch: 8\tLoss: 0.45678412914276123\n\tBatch: 9\tLoss: 0.42939215898513794\n\tBatch: 10\tLoss: 0.4095659554004669\n\tBatch: 11\tLoss: 0.40427514910697937\n\tBatch: 12\tLoss: 0.3761459290981293\n\tBatch: 13\tLoss: 0.41930779814720154\n\tBatch: 14\tLoss: 0.413412868976593\n\tBatch: 15\tLoss: 0.41669607162475586\n\tBatch: 16\tLoss: 0.33717551827430725\nEpoch: 4\n\tBatch: 0\tLoss: 0.3938661515712738\n\tBatch: 1\tLoss: 0.4024693965911865\n\tBatch: 2\tLoss: 0.3963456451892853\n\tBatch: 3\tLoss: 0.4177422523498535\n\tBatch: 4\tLoss: 0.4058922231197357\n\tBatch: 5\tLoss: 0.3757079243659973\n\tBatch: 6\tLoss: 0.35021311044692993\n\tBatch: 7\tLoss: 0.3794701099395752\n\tBatch: 8\tLoss: 0.36651068925857544\n\tBatch: 9\tLoss: 0.3909286558628082\n\tBatch: 10\tLoss: 0.3510792851448059\n\tBatch: 11\tLoss: 0.3708222508430481\n\tBatch: 12\tLoss: 0.3647116422653198\n\tBatch: 13\tLoss: 0.39035066962242126\n\tBatch: 14\tLoss: 0.36266952753067017\n\tBatch: 15\tLoss: 0.355114609003067\n\tBatch: 16\tLoss: 0.33397990465164185\nEpoch: 5\n\tBatch: 0\tLoss: 0.35017159581184387\n\tBatch: 1\tLoss: 0.3768596053123474\n\tBatch: 2\tLoss: 0.38127779960632324\n\tBatch: 3\tLoss: 0.35308128595352173\n\tBatch: 4\tLoss: 0.3730151355266571\n\tBatch: 5\tLoss: 0.3818468749523163\n\tBatch: 6\tLoss: 0.3399249017238617\n\tBatch: 7\tLoss: 0.32580360770225525\n\tBatch: 8\tLoss: 0.3494769036769867\n\tBatch: 9\tLoss: 0.3304775357246399\n\tBatch: 10\tLoss: 0.3435797691345215\n\tBatch: 11\tLoss: 0.31057703495025635\n\tBatch: 12\tLoss: 0.30819177627563477\n\tBatch: 13\tLoss: 0.3310580253601074\n\tBatch: 14\tLoss: 0.3599465489387512\n\tBatch: 15\tLoss: 0.32216140627861023\n\tBatch: 16\tLoss: 
0.3447261154651642\nEpoch: 6\n\tBatch: 0\tLoss: 0.301715224981308\n\tBatch: 1\tLoss: 0.303802490234375\n\tBatch: 2\tLoss: 0.3215929865837097\n\tBatch: 3\tLoss: 0.3273893892765045\n\tBatch: 4\tLoss: 0.3375716507434845\n\tBatch: 5\tLoss: 0.3422122597694397\n\tBatch: 6\tLoss: 0.29706713557243347\n\tBatch: 7\tLoss: 0.3457891643047333\n\tBatch: 8\tLoss: 0.31521621346473694\n\tBatch: 9\tLoss: 0.3192196786403656\n\tBatch: 10\tLoss: 0.30378925800323486\n\tBatch: 11\tLoss: 0.3308448791503906\n\tBatch: 12\tLoss: 0.3161982297897339\n\tBatch: 13\tLoss: 0.30179062485694885\n\tBatch: 14\tLoss: 0.302897572517395\n\tBatch: 15\tLoss: 0.3643922805786133\n\tBatch: 16\tLoss: 0.34449824690818787\nEpoch: 7\n\tBatch: 0\tLoss: 0.31158044934272766\n\tBatch: 1\tLoss: 0.29753610491752625\n\tBatch: 2\tLoss: 0.26616254448890686\n\tBatch: 3\tLoss: 0.3361019194126129\n\tBatch: 4\tLoss: 0.33420565724372864\n\tBatch: 5\tLoss: 0.34775805473327637\n\tBatch: 6\tLoss: 0.3069077432155609\n\tBatch: 7\tLoss: 0.3167460560798645\n\tBatch: 8\tLoss: 0.28498807549476624\n\tBatch: 9\tLoss: 0.306414932012558\n\tBatch: 10\tLoss: 0.28183215856552124\n\tBatch: 11\tLoss: 0.2751620411872864\n\tBatch: 12\tLoss: 0.3399288058280945\n\tBatch: 13\tLoss: 0.24875763058662415\n\tBatch: 14\tLoss: 0.2681702971458435\n\tBatch: 15\tLoss: 0.29835623502731323\n\tBatch: 16\tLoss: 0.2631158232688904\nEpoch: 8\n\tBatch: 0\tLoss: 0.29547119140625\n\tBatch: 1\tLoss: 0.2988022267818451\n\tBatch: 2\tLoss: 0.2820678949356079\n\tBatch: 3\tLoss: 0.2921670973300934\n\tBatch: 4\tLoss: 0.24506497383117676\n\tBatch: 5\tLoss: 0.2711237072944641\n\tBatch: 6\tLoss: 0.2877733111381531\n\tBatch: 7\tLoss: 0.3350869119167328\n\tBatch: 8\tLoss: 0.2439935952425003\n\tBatch: 9\tLoss: 0.3010849356651306\n\tBatch: 10\tLoss: 0.2436361312866211\n\tBatch: 11\tLoss: 0.2753848433494568\n\tBatch: 12\tLoss: 0.3150378167629242\n\tBatch: 13\tLoss: 0.26992931962013245\n\tBatch: 14\tLoss: 0.26075324416160583\n\tBatch: 15\tLoss: 0.331144243478775\n\tBatch: 16\tLoss: 
0.28505346179008484\nEpoch: 9\n\tBatch: 0\tLoss: 0.29621079564094543\n\tBatch: 1\tLoss: 0.29683801531791687\n\tBatch: 2\tLoss: 0.2638823390007019\n\tBatch: 3\tLoss: 0.21892012655735016\n\tBatch: 4\tLoss: 0.29494509100914\n\tBatch: 5\tLoss: 0.25886091589927673\n\tBatch: 6\tLoss: 0.288422167301178\n\tBatch: 7\tLoss: 0.3268676698207855\n\tBatch: 8\tLoss: 0.25718218088150024\n\tBatch: 9\tLoss: 0.27496710419654846\n\tBatch: 10\tLoss: 0.2725825011730194\n\tBatch: 11\tLoss: 0.2624233365058899\n\tBatch: 12\tLoss: 0.2734237313270569\n\tBatch: 13\tLoss: 0.2606518864631653\n\tBatch: 14\tLoss: 0.24521711468696594\n\tBatch: 15\tLoss: 0.22512483596801758\n\tBatch: 16\tLoss: 0.29755955934524536\nEvaluation Results: {'f1': 0.0, 'precision': 0.0, 'recall': 0.0, 'roc_auc': 0.966}\n"
],
[
"trainer.plot_loss()",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f524cc8fe2afb424335cc63a491cdb369156f | 236,920 | ipynb | Jupyter Notebook | docs/source/bias_removal.ipynb | andersy005/climpred | 3179be895eca3d9036b86bc4ab2a337d6bf73fc5 | [
"MIT"
] | 1 | 2020-12-09T01:32:44.000Z | 2020-12-09T01:32:44.000Z | docs/source/bias_removal.ipynb | andersy005/climpred | 3179be895eca3d9036b86bc4ab2a337d6bf73fc5 | [
"MIT"
] | null | null | null | docs/source/bias_removal.ipynb | andersy005/climpred | 3179be895eca3d9036b86bc4ab2a337d6bf73fc5 | [
"MIT"
] | null | null | null | 745.031447 | 81,452 | 0.952334 | [
[
[
"# Bias Removal",
"_____no_output_____"
],
[
"Climate models can have biases relative to different verification datasets. Commonly, biases are removed by postprocessing before verification of forecasting skill. `climpred` provides convenience functions to do so.",
"_____no_output_____"
]
],
[
[
"import climpred\nimport xarray as xr\nimport matplotlib.pyplot as plt\nfrom climpred import HindcastEnsemble",
"_____no_output_____"
],
[
"hind = climpred.tutorial.load_dataset('CESM-DP-SST') # CESM-DPLE hindcast ensemble output.\nobs = climpred.tutorial.load_dataset('ERSST') # observations\nhind[\"lead\"].attrs[\"units\"] = \"years\"",
"_____no_output_____"
]
],
[
[
"We begin by removing a mean climatology for the observations, since `CESM-DPLE` generates its anomalies over this same time period.",
"_____no_output_____"
]
],
[
[
"obs = obs - obs.sel(time=slice('1964', '2014')).mean('time')",
"_____no_output_____"
],
[
"hindcast = HindcastEnsemble(hind)\nhindcast = hindcast.add_observations(obs)\nhindcast.plot()",
"/Users/rileybrady/Desktop/dev/climpred/climpred/utils.py:141: UserWarning: Assuming annual resolution due to numeric inits. Change init to a datetime if it is another resolution.\n \"Assuming annual resolution due to numeric inits. \"\n"
]
],
[
[
"The warming of the `observations` is similar to `initialized`.",
"_____no_output_____"
],
[
"## Mean bias removal",
"_____no_output_____"
],
[
"Typically, bias depends on lead-time and should therefore also be removed as a function of lead-time.",
"_____no_output_____"
]
],
[
[
"bias = hindcast.verify(metric='bias', comparison='e2o', dim=[], alignment='same_verifs')",
"_____no_output_____"
],
[
"bias.SST.plot()",
"_____no_output_____"
]
],
[
[
"Against `observations`, there is a small cold bias in the 1980 and 1990 initialization years and a warm bias before and after.",
"_____no_output_____"
]
],
[
[
"# lead-time dependant mean bias over all initializations is quite small but negative\nmean_bias = bias.mean('init')\nmean_bias.SST.plot()",
"_____no_output_____"
]
],
[
[
"### Cross Validation\nTo remove the mean bias quickly, the mean bias over all initializations is subtracted. For formally correct bias removal with cross validation, the given initialization is left out when computing the mean bias to subtract.",
"_____no_output_____"
],
[
"`climpred` wraps these functions in `HindcastEnsemble.remove_bias(how='mean', cross_validate={bool})`.",
"_____no_output_____"
]
],
[
[
"hindcast.remove_bias(how='mean', cross_validate=True, alignment='same_verifs').plot()\nplt.title('hindcast lead timeseries removed for unconditional mean bias')\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Skill",
"_____no_output_____"
],
[
"Distance-based accuracy metrics (`mse`, `rmse`, `nrmse`, ...) are sensitive to mean bias removal. Correlation metrics (`pearson_r`, `spearman_r`) are insensitive to bias correction.",
"_____no_output_____"
]
],
[
[
"metric='rmse'\nhindcast.verify(metric=metric,\n comparison='e2o',\n dim='init',\n alignment='same_verifs')['SST'].plot(label='no bias correction')\nhindcast.remove_bias(cross_validate=False,\n alignment='same_verifs') \\\n .verify(metric=metric,\n comparison='e2o',\n dim='init',\n alignment='same_verifs').SST.plot(label='bias correction without cross validation')\nhindcast.remove_bias(cross_validate=True,\n alignment='same_verifs') \\\n .verify(metric=metric,\n comparison='e2o',\n dim='init',\n alignment='same_verifs').SST.plot(label='formally correct bias correction with cross validation')\nplt.legend()\nplt.title(f\"{metric.upper()} SST evaluated against observations\")\nplt.show()",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
]
] |
d07f5f6623aa016dfde5c9a87dbe507fd13910b7 | 108,810 | ipynb | Jupyter Notebook | notebooks/experiments.ipynb | feeeper/zindi-expresso-churn-prediction-challenge | 39f14a47976984e45963727aad05c79e11dfd0a4 | [
"MIT"
] | null | null | null | notebooks/experiments.ipynb | feeeper/zindi-expresso-churn-prediction-challenge | 39f14a47976984e45963727aad05c79e11dfd0a4 | [
"MIT"
] | null | null | null | notebooks/experiments.ipynb | feeeper/zindi-expresso-churn-prediction-challenge | 39f14a47976984e45963727aad05c79e11dfd0a4 | [
"MIT"
] | null | null | null | 47.97619 | 2,243 | 0.504742 | [
[
[
"import re\nfrom typing import List\n\nimport pandas as pd\nimport numpy as np\nfrom tqdm.notebook import tqdm\n\nimport optuna\nfrom sklearn.metrics import roc_auc_score\nfrom sklearn.preprocessing import OneHotEncoder, StandardScaler, MinMaxScaler, PolynomialFeatures\nfrom sklearn.linear_model import LogisticRegression, SGDClassifier\nfrom sklearn.model_selection import train_test_split\n\nfrom sklearn.ensemble import GradientBoostingClassifier\nfrom sklearn.ensemble import RandomForestClassifier\nfrom xgboost import XGBClassifier\nfrom lightgbm import LGBMClassifier\nfrom catboost import CatBoostClassifier\n\ntqdm.pandas()",
"_____no_output_____"
],
[
"# XGB\n# fill NaNs in numeric columns with the per-column mean,\n# TENURE & REGION OneHotEncoded \n# StScaler on whole dataset \n# target encoding by region and tenure\n\n# import data\ntrain = pd.read_csv('./data/Train_folds.zip')\n# train = train[train['kfold'].isin([0, 1, 2])]\n\ntest = pd.read_csv('./data/Test.zip')\nsubmission = pd.read_csv('./data/SampleSubmission.csv')\n\ncat_cols = [\n    'REGION',\n    'TENURE',\n    'TOP_PACK'\n]\n\nnum_cols = [\n    'MONTANT',\n    'FREQUENCE_RECH',\n    'REVENUE',\n    'ARPU_SEGMENT',\n    'FREQUENCE',\n    'DATA_VOLUME',\n    'ON_NET', \n    'ORANGE',\n    'TIGO',\n    'ZONE1',\n    'ZONE2',\n    'REGULARITY',\n    'FREQ_TOP_PACK',\n]\n\ntarget = 'CHURN'\n\nmapping = {\n    'D 3-6 month': 1,\n    'E 6-9 month': 2,\n    'F 9-12 month': 3,\n    'G 12-15 month': 4,\n    'H 15-18 month': 5,\n    'I 18-21 month': 6,\n    'J 21-24 month': 7,\n    'K > 24 month': 8,\n    'OTHER': 9\n}\n\ntrain['TOP_PACK'] = train['TOP_PACK'].fillna('OTHER')\ntest['TOP_PACK'] = test['TOP_PACK'].fillna('OTHER')\n\ntrain['TENURE'] = train['TENURE'].fillna('OTHER')\ntest['TENURE'] = test['TENURE'].fillna('OTHER')\ntrain['TENURE'] = train['TENURE'].map(mapping)\ntest['TENURE'] = test['TENURE'].map(mapping)\n\ntrain['REGION'] = train['REGION'].fillna('OTHER')\ntest['REGION'] = test['REGION'].fillna('OTHER')\n\nfor nc in tqdm(num_cols):\n    mean = train[nc].mean()\n    train[nc] = train[nc].fillna(mean)\n    test[nc] = test[nc].fillna(mean)\n    \ntrain.shape, test.shape",
"_____no_output_____"
],
[
"churn_by_tenure = pd.read_csv('./data/agg_by_tenure_churn.csv')\nchurn_by_tenure = churn_by_tenure.append(pd.DataFrame({'TENURE': [9], 'CHURN_mean': 0, 'CHURN_median': 0}))\n\ntrain = pd.merge(train, churn_by_tenure[['TENURE', 'CHURN_mean']], left_on='TENURE', right_on='TENURE', how='left')\ntrain = train.rename({'CHURN_mean': 'MEAN_CHURN_BY_TENURE'}, axis='columns')\n\ntest = pd.merge(test, churn_by_tenure[['TENURE', 'CHURN_mean']], left_on='TENURE', right_on='TENURE', how='left')\ntest = test.rename({'CHURN_mean': 'MEAN_CHURN_BY_TENURE'}, axis='columns')\n\ntrain.shape, test.shape",
"_____no_output_____"
],
[
"churn_by_region = pd.read_csv('./data/agg_by_region_churn.csv')\n\nvc = train[train['REGION'] == 'OTHER']['CHURN'].value_counts()\nchurn_by_region_mean = vc[1]/(vc[0]+vc[1])\nchurn_by_region = churn_by_region.append(pd.DataFrame({'REGION': ['OTHER'], 'CHURN_mean': churn_by_region_mean, 'CHURN_median': 0}))\n\ntrain = pd.merge(train, churn_by_region[['REGION', 'CHURN_mean']], left_on='REGION', right_on='REGION', how='left')\ntrain = train.rename({'CHURN_mean': 'MEAN_CHURN_BY_REGION'}, axis='columns')\n\ntest = pd.merge(test, churn_by_region[['REGION', 'CHURN_mean']], left_on='REGION', right_on='REGION', how='left')\ntest = test.rename({'CHURN_mean': 'MEAN_CHURN_BY_REGION'}, axis='columns')\n\ntrain.shape, test.shape",
"_____no_output_____"
],
[
"churn_by_top_pack = train[['TOP_PACK', 'CHURN']].groupby('TOP_PACK').agg({'CHURN': ['mean', 'median']})\nchurn_by_top_pack.columns = ['_'.join(col).strip() for col in churn_by_top_pack.columns.values]\nchurn_by_top_pack_mean = np.mean(train[train['TOP_PACK'] == 'OTHER']['CHURN'])\nchurn_by_top_pack = churn_by_top_pack.reset_index()\n\nd = {\n 'TOP_PACK': [],\n 'CHURN_mean': [],\n 'CHURN_median': []\n}\n\nfor tp in test['TOP_PACK'].unique():\n if tp not in churn_by_top_pack['TOP_PACK'].unique():\n d['TOP_PACK'].append(tp)\n d['CHURN_mean'].append(churn_by_top_pack_mean)\n d['CHURN_median'].append(0)\n \nchurn_by_top_pack = churn_by_top_pack.append(pd.DataFrame(d))\n\ntrain = pd.merge(train, churn_by_top_pack[['TOP_PACK', 'CHURN_mean']], left_on='TOP_PACK', right_on='TOP_PACK', how='left')\ntrain = train.rename({'CHURN_mean': 'MEAN_CHURN_BY_TOP_PACK'}, axis='columns')\n\ntest = pd.merge(test, churn_by_top_pack[['TOP_PACK', 'CHURN_mean']], left_on='TOP_PACK', right_on='TOP_PACK', how='left')\ntest = test.rename({'CHURN_mean': 'MEAN_CHURN_BY_TOP_PACK'}, axis='columns')\n\ntrain.shape, test.shape",
"_____no_output_____"
],
[
"train.head()",
"_____no_output_____"
],
[
"useful_cols = [\n 'REGION',\n 'TENURE',\n # 'MRG', # constant\n 'TOP_PACK', # wtf column\n 'MONTANT',\n 'FREQUENCE_RECH',\n 'REVENUE',\n 'ARPU_SEGMENT',\n 'FREQUENCE',\n 'DATA_VOLUME',\n 'ON_NET', \n 'ORANGE',\n 'TIGO',\n 'ZONE1',\n 'ZONE2',\n 'REGULARITY',\n 'FREQ_TOP_PACK',\n 'MEAN_CHURN_BY_TENURE',\n 'MEAN_CHURN_BY_REGION',\n 'MEAN_CHURN_BY_TOP_PACK'\n]\n\nfor cat_col in cat_cols:\n encoder = OneHotEncoder(handle_unknown='ignore')\n unique_values = train[cat_col].unique()\n\n one_hot_encoded_cols = [f'{cat_col}_{i}' for i in range(len(unique_values))]\n \n ohe_df = pd.DataFrame(encoder.fit_transform(train[[cat_col]]).toarray(), columns=one_hot_encoded_cols)\n ohe_df.index = train.index\n train = train.drop(cat_col, axis=1)\n train = pd.concat([train, ohe_df], axis=1) \n print(f'[{cat_col}] xtrain transformed')\n\n ohe_df = pd.DataFrame(encoder.transform(test[[cat_col]]).toarray(), columns=one_hot_encoded_cols)\n ohe_df.index = test.index\n test = test.drop(cat_col, axis=1)\n test = pd.concat([test, ohe_df], axis=1)\n print(f'[{cat_col}] xtest transformed')\n \n useful_cols += one_hot_encoded_cols\n useful_cols.remove(cat_col)\n \nscaler = StandardScaler()\ntrain[num_cols] = scaler.fit_transform(train[num_cols])\ntest[num_cols] = scaler.transform(test[num_cols])",
"[REGION] xtrain transformed\n[REGION] xtest transformed\n[TENURE] xtrain transformed\n[TENURE] xtest transformed\n[TOP_PACK] xtrain transformed\n[TOP_PACK] xtest transformed\n"
],
[
"poly = PolynomialFeatures(degree=3, interaction_only=True, include_bias=False)\ntrain_poly = poly.fit_transform(train[num_cols])\ntest_poly = poly.fit_transform(test[num_cols])\n\npoly_columns = [f'poly_{x.replace(\" \", \"__\")}' for x in poly.get_feature_names(num_cols)] # [f\"poly_{i}\" for i in range(train_poly.shape[1])]\ndf_poly = pd.DataFrame(train_poly, columns=poly_columns, dtype=np.float32)\ndf_test_poly = pd.DataFrame(test_poly, columns=poly_columns, dtype=np.float32)\n\ntrain = pd.concat([train, df_poly], axis=1)\ntest = pd.concat([test, df_test_poly], axis=1)\n\nuseful_cols += poly_columns\n\ntrain.head()",
"_____no_output_____"
],
[
"original_columns = [x for x in train.columns if not x.startswith(('poly', 'MEAN_', 'REGION_', 'TENURE_', 'TOP_PACK_'))]\ntrain[original_columns]",
"_____no_output_____"
],
[
"for col in [x for x in original_columns if x != 'user_id' and x != 'MRG']:\n print(f'{col}:', train[[col]].corrwith(train['CHURN']).iloc[-1])",
"MONTANT: -0.04725325016612715\nFREQUENCE_RECH: -0.05552728780708329\nREVENUE: -0.05380789684838681\nARPU_SEGMENT: -0.053807976689509644\nFREQUENCE: -0.06573370680557317\nDATA_VOLUME: -0.015397683867390988\nON_NET: -0.026601387005991874\nORANGE: -0.02446127513667626\nTIGO: -0.009995508562641881\nZONE1: 0.00144668078351635\nZONE2: 0.0004906335298515414\nREGULARITY: -0.47999140986851524\nFREQ_TOP_PACK: -0.03262768810648328\nCHURN: 1.0\nkfold: -0.0008295540518998162\n"
],
[
"# from sklearn.preprocessing import MinMaxScaler\n\n# def corr(df: pd.DataFrame) -> None:\n#     for col in [x for x in list(df) if x != 'CHURN']:\n#         print(f'{col}:', df[[col]].corrwith(df['CHURN']).iloc[-1])\n\n\n# Build candidate transforms of REGULARITY and check their correlation with CHURN\ntmp = pd.DataFrame({\n    'CHURN': train['CHURN'],\n    'REG': train['REGULARITY'],\n})\ntmp['REG_LOG'] = tmp['REG'].apply(np.log)\ntmp['REG_SQRT'] = tmp['REG'].apply(np.sqrt)\ntmp['REG_ABS'] = tmp['REG'].apply(np.abs)\ntmp['REG_POW2'] = tmp['REG'].apply(lambda x: x ** 2)\n\ntmp['REG_ABS_LOG'] = tmp['REG_ABS'].apply(np.log)\ntmp['REG_ABS_SQRT'] = tmp['REG_ABS'].apply(np.sqrt)\n\n# corr(tmp)\n\n# scaler = MinMaxScaler() \n# arr_scaled = scaler.fit_transform([tmp['REG_ABS_SQRT']])\n# Min-max scale REG_ABS_SQRT by hand and check its correlation with CHURN\nmaximum = tmp['REG_ABS_SQRT'].max()\nminimum = tmp['REG_ABS_SQRT'].min()\npd.DataFrame({'c': tmp['REG_ABS_SQRT'].apply(lambda x: (x - minimum) / (maximum - minimum))}).corrwith(tmp['CHURN'])",
"_____no_output_____"
],
[
"from typing import List\n\n\ndef optimize_floats(df: pd.DataFrame) -> pd.DataFrame:\n    floats = df.select_dtypes(include=['float64']).columns.tolist()\n    df[floats] = df[floats].apply(pd.to_numeric, downcast='float')\n    return df\n\n\ndef optimize_ints(df: pd.DataFrame) -> pd.DataFrame:\n    ints = df.select_dtypes(include=['int64']).columns.tolist()\n    df[ints] = df[ints].apply(pd.to_numeric, downcast='integer')\n    return df\n\n\ndef optimize_objects(df: pd.DataFrame, datetime_features: List[str]) -> pd.DataFrame:\n    for col in df.select_dtypes(include=['object']):\n        if col not in datetime_features:\n            num_unique_values = len(df[col].unique())\n            num_total_values = len(df[col])\n            if float(num_unique_values) / num_total_values < 0.5:\n                df[col] = df[col].astype('category')\n        else:\n            df[col] = pd.to_datetime(df[col])\n    return df\n\n\ndef optimize(df: pd.DataFrame, datetime_features: List[str] = []):\n    return optimize_floats(optimize_ints(optimize_objects(df, datetime_features)))\n\ntrain = optimize(train, [])",
"_____no_output_____"
],
[
"sum(train.memory_usage())/1024/1024",
"_____no_output_____"
],
[
"def fit_predict(xtrain: pd.DataFrame,\n ytrain: pd.DataFrame,\n xvalid: pd.DataFrame = None,\n yvalid: pd.DataFrame = None,\n valid_ids: list[str] = None):\n xtest = test[useful_cols]\n\n model = LGBMClassifier(\n n_estimators=7000,\n n_jobs=-1,\n random_state=42,\n# xgb\n# **{\n# 'learning_rate': 0.014461849398074727,\n# 'reg_lambda': 0.08185850904776007,\n# 'reg_alpha': 0.0001173486815850512,\n# 'subsample': 0.7675905290878289,\n# 'colsample_bytree': 0.2708299922996371,\n# 'max_depth': 7\n# }\n# lbg\n **{\n 'learning_rate': 0.029253877255476443,\n 'reg_lambda': 16.09426889606859,\n 'reg_alpha': 0.014354120473120952,\n 'subsample': 0.43289663848783977,\n 'colsample_bytree': 0.5268279718406376,\n 'max_depth': 6\n }\n )\n \n if xvalid is not None:\n model.fit(xtrain, ytrain, early_stopping_rounds=300, eval_set=[(xvalid, yvalid)], verbose=1000)\n preds_valid = model.predict_proba(xvalid)[:, 1]\n test_preds = model.predict_proba(xtest)[:, 1]\n score = roc_auc_score(yvalid, preds_valid)\n print(fold, score)\n\n return model, test_preds, dict(zip(valid_ids, preds_valid)), score\n \n model.fit(xtrain, ytrain, verbose=1000)\n return model, None, None, None",
"_____no_output_____"
],
[
"final_test_predictions = []\nfinal_valid_predictions = {}\n\nscores = []\n\nfor fold in tqdm(range(2), 'folds'):\n xtrain = train[train['kfold'] != fold][useful_cols]\n ytrain = train[train['kfold'] != fold][target]\n \n xvalid = train[train['kfold'] == fold][useful_cols]\n yvalid = train[train['kfold'] == fold][target]\n \n valid_ids = train[train['kfold'] == fold]['user_id'].values.tolist()\n \n# xtest = test[useful_cols]\n\n# model = LGBMClassifier(\n# n_estimators=7000,\n# n_jobs=-1,\n# random_state=42,\n# tree_method='gpu_hist',\n# gpu_id=0,\n# predictor=\"gpu_predictor\",\n# **{\n# 'learning_rate': 0.014461849398074727,\n# 'reg_lambda': 0.08185850904776007,\n# 'reg_alpha': 0.0001173486815850512,\n# 'subsample': 0.7675905290878289,\n# 'colsample_bytree': 0.2708299922996371,\n# 'max_depth': 7\n# }\n# )\n# model.fit(xtrain, ytrain, early_stopping_rounds=300, eval_set=[(xvalid, yvalid)], verbose=1000)\n \n# preds_valid = model.predict_proba(xvalid)[:, 1]\n# test_preds = model.predict_proba(xtest)[:, 1]\n# final_test_predictions.append(test_preds)\n# final_valid_predictions.update(dict(zip(valid_ids, preds_valid)))\n# score = roc_auc_score(yvalid, preds_valid)\n \n model, test_preds, val_preds, score = fit_predict(xtrain, ytrain, xvalid, yvalid, valid_ids)\n \n del xtrain\n del ytrain\n del xvalid\n del yvalid\n del valid_ids\n del model\n \n scores.append(score)\n print(fold, score) \n\nprint(np.mean(scores), np.std(scores))",
"_____no_output_____"
],
[
"final_test_predictions = []\nfinal_valid_predictions = {}\n\nscores = []\n\nmodel, test_preds, val_preds, score = fit_predict(train[useful_cols], train[target])\npreds = model.predict_proba(test[useful_cols])\npreds",
"_____no_output_____"
],
[
"sample_submission = pd.read_csv('./data/SampleSubmission.csv')\nsample_submission['CHURN'] = preds[:, 1]\nsample_submission.to_csv(\"./data/2021-10-29-01-03-lgb-full-dataset.csv\", index=False)",
"_____no_output_____"
],
[
"def fit_predict(xtrain: pd.DataFrame, ytrain: pd.DataFrame, xvalid: pd.DataFrame, yvalid: pd.DataFrame, valid_ids: list[str]):\n xtest = test[useful_cols]\n\n model = XGBClassifier(\n n_estimators=7000,\n n_jobs=-1,\n random_state=42,\n tree_method='gpu_hist',\n gpu_id=0,\n predictor=\"gpu_predictor\",\n# xgb\n# **{\n# 'learning_rate': 0.014461849398074727,\n# 'reg_lambda': 0.08185850904776007,\n# 'reg_alpha': 0.0001173486815850512,\n# 'subsample': 0.7675905290878289,\n# 'colsample_bytree': 0.2708299922996371,\n# 'max_depth': 7\n# }\n# lbg\n **{\n 'learning_rate': 0.029253877255476443,\n 'reg_lambda': 16.09426889606859,\n 'reg_alpha': 0.014354120473120952,\n 'subsample': 0.43289663848783977,\n 'colsample_bytree': 0.5268279718406376,\n 'max_depth': 6\n }\n ) \n model.fit(xtrain, ytrain, early_stopping_rounds=300, eval_set=[(xvalid, yvalid)], verbose=1000)\n \n preds_valid = model.predict_proba(xvalid)[:, 1]\n test_preds = model.predict_proba(xtest)[:, 1]\n score = roc_auc_score(yvalid, preds_valid)\n print(fold, score)\n \n return test_preds, dict(zip(valid_ids, preds_valid)), score\n\nfinal_test_predictions = []\nfinal_valid_predictions = {}\n\nscores = []\n\nfolds = train['kfold'].unique()\nfor fold in tqdm([0,1,2], 'folds'):\n xtrain = train[train['kfold'] != fold][useful_cols]\n ytrain = train[train['kfold'] != fold][target]\n \n xvalid = train[train['kfold'] == fold][useful_cols]\n yvalid = train[train['kfold'] == fold][target]\n \n valid_ids = train[train['kfold'] == fold]['user_id'].values.tolist()\n \n test_preds, val_preds, score = fit_predict(xtrain, ytrain, xvalid, yvalid, valid_ids)\n \n final_test_predictions.append(test_preds)\n final_valid_predictions.update(val_preds)\n scores.append(score)\n\nprint(np.mean(scores), np.std(scores))",
"_____no_output_____"
],
[
"sample_submission = pd.read_csv('./data/SampleSubmission.csv')\nsample_submission['CHURN'] = np.mean(np.column_stack(final_test_predictions), axis=1)\n# sample_submission.columns = [\"id\", \"pred_1\"]\nsample_submission.to_csv(\"./data/2021-10-27-xgb.csv\", index=False)",
"_____no_output_____"
],
[
"sample_submission",
"_____no_output_____"
],
[
"val = train[train['kfold'] == 1]\ntrain = train[train['kfold'] == 0]\n\nprint(train.shape, val.shape)",
"_____no_output_____"
],
[
"train_copy = train.copy()\ntest_copy = test.copy()\nval_copy = val.copy()\n\nminmax_scaler_cols = ['DATA_VOLUME', 'ON_NET']\nscaler = MinMaxScaler()\ntrain_copy[minmax_scaler_cols] = scaler.fit_transform(train_copy[minmax_scaler_cols])\ntest_copy[minmax_scaler_cols] = scaler.transform(test_copy[minmax_scaler_cols])\n\nstandard_scaler_cols = [col for col in train_copy.columns if col not in set(['user_id', 'MRG', 'TOP_PACK', 'REGION', 'DATA_VOLUME', 'ON_NET', 'kfold', 'CHURN'])]\nscaler = StandardScaler()\ntrain_copy[standard_scaler_cols] = scaler.fit_transform(train_copy[standard_scaler_cols])\ntest_copy[standard_scaler_cols] = scaler.transform(test_copy[standard_scaler_cols])\n\n# poly features\nnumerical_cols = [\n    'DATA_VOLUME',\n    'ON_NET',\n#     'MONTANT',\n#     'FREQUENCE_RECH',\n#     'REVENUE',\n#     'ARPU_SEGMENT',\n#     'FREQUENCE',\n    'ORANGE',\n    'TIGO',\n#     'ZONE1',\n#     'ZONE2',\n#     'REGULARITY',\n]\npoly = PolynomialFeatures(degree=3, interaction_only=True, include_bias=False)\ntrain_poly = poly.fit_transform(train_copy[numerical_cols])\ntest_poly = poly.transform(test_copy[numerical_cols])  # reuse the transformer fitted on train\n\npoly_columns = [f\"poly_{i}\" for i in range(train_poly.shape[1])]\ndf_poly = pd.DataFrame(train_poly, columns=poly_columns)\ndf_test_poly = pd.DataFrame(test_poly, columns=poly_columns)\n\ntrain_copy = pd.concat([train_copy, df_poly], axis=1)\ntest_copy = pd.concat([test_copy, df_test_poly], axis=1)\n\nuseful_cols += poly_columns\n\nfor cat_col in cat_cols:\n    encoder = OneHotEncoder(handle_unknown='ignore')\n    unique_values = train_copy[cat_col].unique()\n\n    one_hot_encoded_cols = [f'{cat_col}_{i}' for i in range(len(unique_values))]\n\n    ohe_df = pd.DataFrame(encoder.fit_transform(train_copy[[cat_col]]).toarray(), columns=one_hot_encoded_cols)\n    ohe_df.index = train_copy.index\n    train_copy = train_copy.drop(cat_col, axis=1)\n    train_copy = pd.concat([train_copy, ohe_df], axis=1)\n    print(f'[{cat_col}] xtrain transformed')\n\n    ohe_df = pd.DataFrame(encoder.transform(test_copy[[cat_col]]).toarray(), columns=one_hot_encoded_cols)\n    ohe_df.index = test_copy.index\n    test_copy = test_copy.drop(cat_col, axis=1)\n    test_copy = pd.concat([test_copy, ohe_df], axis=1)\n    print(f'[{cat_col}] xtest transformed')\n\n    useful_cols += one_hot_encoded_cols\n    useful_cols.remove(cat_col)",
"_____no_output_____"
],
[
"useful_cols = [col for col in train.columns if col not in set(['user_id', \n 'MRG',\n 'TOP_PACK', \n 'CHURN',\n 'kfold'])]\n\nlgb = LGBMClassifier(random_state=42)\nlgb.fit(train[useful_cols], train[target])",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f650ff4b211c7126bc47f8d8ae24fbf8f4916 | 7,166 | ipynb | Jupyter Notebook | section_7/ml_libraries.ipynb | kiyoshion/minna-ai | 653f275f3c0e07db0c73ccb38892bb7360093e17 | [
"MIT"
] | 1 | 2021-01-10T03:17:50.000Z | 2021-01-10T03:17:50.000Z | section_7/ml_libraries.ipynb | kiyoshion/minna-ai | 653f275f3c0e07db0c73ccb38892bb7360093e17 | [
"MIT"
] | null | null | null | section_7/ml_libraries.ipynb | kiyoshion/minna-ai | 653f275f3c0e07db0c73ccb38892bb7360093e17 | [
"MIT"
] | null | null | null | 33.485981 | 240 | 0.505163 | [
[
[
"<a href=\"https://colab.research.google.com/github/yukinaga/minnano_ai/blob/master/section_7/ml_libraries.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# Machine Learning Libraries\nThis section introduces code for the machine learning libraries Keras and PyTorch. \nWe will not explain the code in detail this time; instead, let's grasp the overall flow of the implementation.\n",
"_____no_output_____"
],
[
"## ● Keras Code\nThe following code is a simple neural-network implementation using Keras. \nIt classifies each Iris flower as either Setosa or Versicolor. \nThe code creates a model with `Sequential` and adds layers and activation functions.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\nfrom sklearn import datasets\nfrom sklearn import preprocessing\nfrom sklearn.model_selection import train_test_split\n\niris = datasets.load_iris()\niris_data = iris.data\nsl_data = iris_data[:100, 0]  # Setosa and Versicolor, sepal length\nsw_data = iris_data[:100, 1]  # Setosa and Versicolor, sepal width\n\n# Center the mean at 0\nsl_ave = np.average(sl_data)  # mean\nsl_data -= sl_ave  # subtract the mean\nsw_ave = np.average(sw_data)\nsw_data -= sw_ave\n\n# Store the inputs in lists\ninput_data = []\ncorrect_data = []\nfor i in range(100):\n    input_data.append([sl_data[i], sw_data[i]])\n    correct_data.append([iris.target[i]])\n\n# Split into training and test data\ninput_data = np.array(input_data)  # convert to a NumPy array\ncorrect_data = np.array(correct_data)\nx_train, x_test, t_train, t_test = train_test_split(input_data, correct_data)\n\n# ------ Keras code starts here ------\nfrom keras.models import Sequential\nfrom keras.layers import Dense, Activation\nfrom keras.optimizers import SGD\n\nmodel = Sequential()\nmodel.add(Dense(2, input_dim=2))  # inputs: 2, hidden-layer neurons: 2\nmodel.add(Activation(\"sigmoid\"))  # sigmoid function\nmodel.add(Dense(1))  # output-layer neurons: 1\nmodel.add(Activation(\"sigmoid\"))  # sigmoid function\nmodel.compile(optimizer=SGD(lr=0.3), loss=\"mean_squared_error\", metrics=[\"accuracy\"])\n\nmodel.fit(x_train, t_train, epochs=32, batch_size=1)  # training\n\nloss, accuracy = model.evaluate(x_test, t_test)\nprint(\"Accuracy: \" + str(accuracy*100) + \"%\")",
"_____no_output_____"
]
],
[
[
"## ● PyTorch Code\nThe following code is a simple neural-network implementation using PyTorch. \nIt classifies each Iris flower as either Setosa or Versicolor. \nAs with Keras, the code creates a model with `Sequential` and arranges layers and activation functions. \nIn PyTorch, the inputs and targets must be converted to Tensor-format data.",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\nfrom sklearn import datasets\nfrom sklearn import preprocessing\nfrom sklearn.model_selection import train_test_split\n\niris = datasets.load_iris()\niris_data = iris.data\nsl_data = iris_data[:100, 0]  # Setosa and Versicolor, sepal length\nsw_data = iris_data[:100, 1]  # Setosa and Versicolor, sepal width\n\n# Center the mean at 0\nsl_ave = np.average(sl_data)  # mean\nsl_data -= sl_ave  # subtract the mean\nsw_ave = np.average(sw_data)\nsw_data -= sw_ave\n\n# Store the inputs in lists\ninput_data = []\ncorrect_data = []\nfor i in range(100):\n    input_data.append([sl_data[i], sw_data[i]])\n    correct_data.append([iris.target[i]])\n\n# Split into training and test data\ninput_data = np.array(input_data)  # convert to a NumPy array\ncorrect_data = np.array(correct_data)\nx_train, x_test, t_train, t_test = train_test_split(input_data, correct_data)\n\n# ------ PyTorch code starts here ------\nimport torch\nfrom torch import nn\nfrom torch import optim\n\n# Convert to Tensors\nx_train = torch.tensor(x_train, dtype=torch.float32)\nt_train = torch.tensor(t_train, dtype=torch.float32)\nx_test = torch.tensor(x_test, dtype=torch.float32)\nt_test = torch.tensor(t_test, dtype=torch.float32)\n\nnet = nn.Sequential(\n    nn.Linear(2, 2),  # inputs: 2, hidden-layer neurons: 2\n    nn.Sigmoid(),  # sigmoid function\n    nn.Linear(2, 1),  # output-layer neurons: 1\n    nn.Sigmoid()  # sigmoid function\n)\n\nloss_fnc = nn.MSELoss()\noptimizer = optim.SGD(net.parameters(), lr=0.3)\n\n# Train for 1000 epochs\nfor i in range(1000):\n\n    # Zero the gradients\n    optimizer.zero_grad()\n\n    # Forward pass\n    y_train = net(x_train)\n    y_test = net(x_test)\n\n    # Compute the losses\n    loss_train = loss_fnc(y_train, t_train)\n    loss_test = loss_fnc(y_test, t_test)\n\n    # Backpropagation (compute the gradients)\n    loss_train.backward()\n\n    # Update the parameters\n    optimizer.step()\n\n    if i%100 == 0:\n        print(\"Epoch:\", i, \"Loss_Train:\", loss_train.item(), \"Loss_Test:\", loss_test.item())\n\ny_test = net(x_test)\ncount = ((y_test.detach().numpy()>0.5) == (t_test.detach().numpy()==1.0)).sum().item()\nprint(\"Accuracy: \" + str(count/len(y_test)*100) + \"%\")",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07f65adf21601114df783b8065a2eeb0a0e8923 | 264,287 | ipynb | Jupyter Notebook | experiments/tl_2/cores_wisig-oracle.run2.framed/trials/1/trial.ipynb | stevester94/csc500-notebooks | 4c1b04c537fe233a75bed82913d9d84985a89177 | [
"MIT"
] | null | null | null | experiments/tl_2/cores_wisig-oracle.run2.framed/trials/1/trial.ipynb | stevester94/csc500-notebooks | 4c1b04c537fe233a75bed82913d9d84985a89177 | [
"MIT"
] | null | null | null | experiments/tl_2/cores_wisig-oracle.run2.framed/trials/1/trial.ipynb | stevester94/csc500-notebooks | 4c1b04c537fe233a75bed82913d9d84985a89177 | [
"MIT"
] | null | null | null | 100.260622 | 73,196 | 0.701177 | [
[
[
"# Transfer Learning Template",
"_____no_output_____"
]
],
[
[
"%load_ext autoreload\n%autoreload 2\n%matplotlib inline\n\n \nimport os, json, sys, time, random\nimport numpy as np\nimport torch\nfrom torch.optim import Adam\nfrom easydict import EasyDict\nimport matplotlib.pyplot as plt\n\nfrom steves_models.steves_ptn import Steves_Prototypical_Network\n\nfrom steves_utils.lazy_iterable_wrapper import Lazy_Iterable_Wrapper\nfrom steves_utils.iterable_aggregator import Iterable_Aggregator\nfrom steves_utils.ptn_train_eval_test_jig import PTN_Train_Eval_Test_Jig\nfrom steves_utils.torch_sequential_builder import build_sequential\nfrom steves_utils.torch_utils import get_dataset_metrics, ptn_confusion_by_domain_over_dataloader\nfrom steves_utils.utils_v2 import (per_domain_accuracy_from_confusion, get_datasets_base_path)\nfrom steves_utils.PTN.utils import independent_accuracy_assesment\n\nfrom torch.utils.data import DataLoader\n\nfrom steves_utils.stratified_dataset.episodic_accessor import Episodic_Accessor_Factory\n\nfrom steves_utils.ptn_do_report import (\n get_loss_curve,\n get_results_table,\n get_parameters_table,\n get_domain_accuracies,\n)\n\nfrom steves_utils.transforms import get_chained_transform",
"_____no_output_____"
]
],
[
[
"# Allowed Parameters\nThese are allowed parameters, not defaults\nEach of these values need to be present in the injected parameters (the notebook will raise an exception if they are not present)\n\nPapermill uses the cell tag \"parameters\" to inject the real parameters below this cell.\nEnable tags to see what I mean",
"_____no_output_____"
]
],
[
[
"required_parameters = {\n \"experiment_name\",\n \"lr\",\n \"device\",\n \"seed\",\n \"dataset_seed\",\n \"n_shot\",\n \"n_query\",\n \"n_way\",\n \"train_k_factor\",\n \"val_k_factor\",\n \"test_k_factor\",\n \"n_epoch\",\n \"patience\",\n \"criteria_for_best\",\n \"x_net\",\n \"datasets\",\n \"torch_default_dtype\",\n \"NUM_LOGS_PER_EPOCH\",\n \"BEST_MODEL_PATH\",\n}",
"_____no_output_____"
],
[
"from steves_utils.CORES.utils import (\n    ALL_NODES,\n    ALL_NODES_MINIMUM_1000_EXAMPLES,\n    ALL_DAYS\n)\n\nfrom steves_utils.ORACLE.utils_v2 import (\n    ALL_DISTANCES_FEET_NARROWED,\n    ALL_RUNS,\n    ALL_SERIAL_NUMBERS,\n)\n\nstandalone_parameters = {}\nstandalone_parameters[\"experiment_name\"] = \"STANDALONE PTN\"\nstandalone_parameters[\"lr\"] = 0.001\nstandalone_parameters[\"device\"] = \"cuda\"\n\nstandalone_parameters[\"seed\"] = 1337\nstandalone_parameters[\"dataset_seed\"] = 1337\n\nstandalone_parameters[\"n_way\"] = 8\nstandalone_parameters[\"n_shot\"] = 3\nstandalone_parameters[\"n_query\"] = 2\nstandalone_parameters[\"train_k_factor\"] = 1\nstandalone_parameters[\"val_k_factor\"] = 2\nstandalone_parameters[\"test_k_factor\"] = 2\n\n\nstandalone_parameters[\"n_epoch\"] = 50\n\nstandalone_parameters[\"patience\"] = 10\nstandalone_parameters[\"criteria_for_best\"] = \"source_loss\"\n\nstandalone_parameters[\"datasets\"] = [\n    {\n        \"labels\": ALL_SERIAL_NUMBERS,\n        \"domains\": ALL_DISTANCES_FEET_NARROWED,\n        \"num_examples_per_domain_per_label\": 100,\n        \"pickle_path\": os.path.join(get_datasets_base_path(), \"oracle.Run1_framed_2000Examples_stratified_ds.2022A.pkl\"),\n        \"source_or_target_dataset\": \"source\",\n        \"x_transforms\": [\"unit_mag\", \"minus_two\"],\n        \"episode_transforms\": [],\n        \"domain_prefix\": \"ORACLE_\"\n    },\n    {\n        \"labels\": ALL_NODES,\n        \"domains\": ALL_DAYS,\n        \"num_examples_per_domain_per_label\": 100,\n        \"pickle_path\": os.path.join(get_datasets_base_path(), \"cores.stratified_ds.2022A.pkl\"),\n        \"source_or_target_dataset\": \"target\",\n        \"x_transforms\": [\"unit_power\", \"times_zero\"],\n        \"episode_transforms\": [],\n        \"domain_prefix\": \"CORES_\"\n    }\n]\n\nstandalone_parameters[\"torch_default_dtype\"] = \"torch.float32\"\n\n\n\nstandalone_parameters[\"x_net\"] = [\n    {\"class\": \"nnReshape\", \"kargs\": {\"shape\":[-1, 1, 2, 256]}},\n    {\"class\": \"Conv2d\", \"kargs\": { \"in_channels\":1, \"out_channels\":256, \"kernel_size\":(1,7), \"bias\":False, \"padding\":(0,3), },},\n    {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n    {\"class\": \"BatchNorm2d\", \"kargs\": {\"num_features\":256}},\n\n    {\"class\": \"Conv2d\", \"kargs\": { \"in_channels\":256, \"out_channels\":80, \"kernel_size\":(2,7), \"bias\":True, \"padding\":(0,3), },},\n    {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n    {\"class\": \"BatchNorm2d\", \"kargs\": {\"num_features\":80}},\n    {\"class\": \"Flatten\", \"kargs\": {}},\n\n    {\"class\": \"Linear\", \"kargs\": {\"in_features\": 80*256, \"out_features\": 256}}, # 80 units per IQ pair\n    {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n    {\"class\": \"BatchNorm1d\", \"kargs\": {\"num_features\":256}},\n\n    {\"class\": \"Linear\", \"kargs\": {\"in_features\": 256, \"out_features\": 256}},\n]\n\n# Parameters relevant to results\n# These parameters will basically never need to change\nstandalone_parameters[\"NUM_LOGS_PER_EPOCH\"] = 10\nstandalone_parameters[\"BEST_MODEL_PATH\"] = \"./best_model.pth\"\n\n\n\n\n",
"_____no_output_____"
],
[
"# Parameters\nparameters = {\n    \"experiment_name\": \"cores+wisig -> oracle.run2.framed\",\n    \"device\": \"cuda\",\n    \"lr\": 0.001,\n    \"seed\": 1337,\n    \"dataset_seed\": 1337,\n    \"n_shot\": 3,\n    \"n_query\": 2,\n    \"train_k_factor\": 3,\n    \"val_k_factor\": 2,\n    \"test_k_factor\": 2,\n    \"torch_default_dtype\": \"torch.float32\",\n    \"n_epoch\": 50,\n    \"patience\": 3,\n    \"criteria_for_best\": \"target_loss\",\n    \"x_net\": [\n        {\"class\": \"nnReshape\", \"kargs\": {\"shape\": [-1, 1, 2, 256]}},\n        {\n            \"class\": \"Conv2d\",\n            \"kargs\": {\n                \"in_channels\": 1,\n                \"out_channels\": 256,\n                \"kernel_size\": [1, 7],\n                \"bias\": False,\n                \"padding\": [0, 3],\n            },\n        },\n        {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n        {\"class\": \"BatchNorm2d\", \"kargs\": {\"num_features\": 256}},\n        {\n            \"class\": \"Conv2d\",\n            \"kargs\": {\n                \"in_channels\": 256,\n                \"out_channels\": 80,\n                \"kernel_size\": [2, 7],\n                \"bias\": True,\n                \"padding\": [0, 3],\n            },\n        },\n        {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n        {\"class\": \"BatchNorm2d\", \"kargs\": {\"num_features\": 80}},\n        {\"class\": \"Flatten\", \"kargs\": {}},\n        {\"class\": \"Linear\", \"kargs\": {\"in_features\": 20480, \"out_features\": 256}},\n        {\"class\": \"ReLU\", \"kargs\": {\"inplace\": True}},\n        {\"class\": \"BatchNorm1d\", \"kargs\": {\"num_features\": 256}},\n        {\"class\": \"Linear\", \"kargs\": {\"in_features\": 256, \"out_features\": 256}},\n    ],\n    \"NUM_LOGS_PER_EPOCH\": 10,\n    \"BEST_MODEL_PATH\": \"./best_model.pth\",\n    \"n_way\": 16,\n    \"datasets\": [\n        {\n            \"labels\": [\n                \"1-10.\",\n                \"1-11.\",\n                \"1-15.\",\n                \"1-16.\",\n                \"1-17.\",\n                \"1-18.\",\n                \"1-19.\",\n                \"10-4.\",\n                \"10-7.\",\n                \"11-1.\",\n                \"11-14.\",\n                \"11-17.\",\n                \"11-20.\",\n                \"11-7.\",\n                \"13-20.\",\n                \"13-8.\",\n                \"14-10.\",\n                \"14-11.\",\n                \"14-14.\",\n                \"14-7.\",\n                \"15-1.\",\n                \"15-20.\",\n                \"16-1.\",\n                \"16-16.\",\n                \"17-10.\",\n                \"17-11.\",\n                \"17-2.\",\n                \"19-1.\",\n                \"19-16.\",\n                \"19-19.\",\n                \"19-20.\",\n                \"19-3.\",\n                \"2-10.\",\n                \"2-11.\",\n                \"2-17.\",\n                \"2-18.\",\n                \"2-20.\",\n                \"2-3.\",\n                \"2-4.\",\n                \"2-5.\",\n                \"2-6.\",\n                \"2-7.\",\n                \"2-8.\",\n                \"3-13.\",\n                \"3-18.\",\n                \"3-3.\",\n                \"4-1.\",\n                \"4-10.\",\n                \"4-11.\",\n                \"4-19.\",\n                \"5-5.\",\n                \"6-15.\",\n                \"7-10.\",\n                \"7-14.\",\n                \"8-18.\",\n                \"8-20.\",\n                \"8-3.\",\n                \"8-8.\",\n            ],\n            \"domains\": [1, 2, 3, 4, 5],\n            \"num_examples_per_domain_per_label\": 100,\n            \"pickle_path\": \"/mnt/wd500GB/CSC500/csc500-main/datasets/cores.stratified_ds.2022A.pkl\",\n            \"source_or_target_dataset\": \"source\",\n            \"x_transforms\": [\"unit_mag\"],\n            \"episode_transforms\": [],\n            \"domain_prefix\": \"C_A_\",\n        },\n        {\n            \"labels\": [\n                \"1-10\",\n                \"1-12\",\n                \"1-14\",\n                \"1-16\",\n                \"1-18\",\n                \"1-19\",\n                \"1-8\",\n                \"10-11\",\n                \"10-17\",\n                \"10-4\",\n                \"10-7\",\n                \"11-1\",\n                \"11-10\",\n                \"11-19\",\n                \"11-20\",\n                \"11-4\",\n                \"11-7\",\n                \"12-19\",\n                \"12-20\",\n                \"12-7\",\n                \"13-14\",\n                \"13-18\",\n                \"13-19\",\n                \"13-20\",\n                \"13-3\",\n                \"13-7\",\n                \"14-10\",\n                \"14-11\",\n                \"14-12\",\n                \"14-13\",\n                \"14-14\",\n                \"14-19\",\n                \"14-20\",\n                \"14-7\",\n                \"14-8\",\n                \"14-9\",\n                \"15-1\",\n                \"15-19\",\n                \"15-6\",\n                \"16-1\",\n                \"16-16\",\n                \"16-19\",\n                \"16-20\",\n                \"17-10\",\n                \"17-11\",\n                \"18-1\",\n                \"18-10\",\n                \"18-11\",\n                \"18-12\",\n                \"18-13\",\n                \"18-14\",\n                \"18-15\",\n                \"18-16\",\n                \"18-17\",\n                \"18-19\",\n                \"18-2\",\n                \"18-20\",\n                \"18-4\",\n                \"18-5\",\n                \"18-7\",\n                \"18-8\",\n                \"18-9\",\n                \"19-1\",\n                \"19-10\",\n                \"19-11\",\n                \"19-12\",\n                \"19-13\",\n                \"19-14\",\n                \"19-15\",\n                \"19-19\",\n                \"19-2\",\n                \"19-20\",\n                \"19-3\",\n                \"19-4\",\n                \"19-6\",\n                \"19-7\",\n                \"19-8\",\n                \"19-9\",\n                \"2-1\",\n                \"2-13\",\n                \"2-15\",\n                \"2-3\",\n                \"2-4\",\n                \"2-5\",\n                \"2-6\",\n                \"2-7\",\n                \"2-8\",\n                \"20-1\",\n                \"20-12\",\n                \"20-14\",\n                \"20-15\",\n                \"20-16\",\n                \"20-18\",\n                \"20-19\",\n                \"20-20\",\n                \"20-3\",\n                \"20-4\",\n                \"20-5\",\n                \"20-7\",\n                \"20-8\",\n                \"3-1\",\n                \"3-13\",\n                \"3-18\",\n                \"3-2\",\n                \"3-8\",\n                \"4-1\",\n                \"4-10\",\n                \"4-11\",\n                \"5-1\",\n                \"5-5\",\n                \"6-1\",\n                \"6-15\",\n                \"6-6\",\n                \"7-10\",\n                \"7-11\",\n                \"7-12\",\n                \"7-13\",\n                \"7-14\",\n                \"7-7\",\n                \"7-8\",\n                \"7-9\",\n                \"8-1\",\n                \"8-13\",\n                \"8-14\",\n                \"8-18\",\n                \"8-20\",\n                \"8-3\",\n                \"8-8\",\n                \"9-1\",\n                \"9-7\",\n            ],\n            \"domains\": [1, 2, 3, 4],\n            \"num_examples_per_domain_per_label\": 100,\n            \"pickle_path\": \"/mnt/wd500GB/CSC500/csc500-main/datasets/wisig.node3-19.stratified_ds.2022A.pkl\",\n            \"source_or_target_dataset\": \"source\",\n            \"x_transforms\": [\"unit_mag\"],\n            \"episode_transforms\": [],\n            \"domain_prefix\": \"W_A_\",\n        },\n        {\n            \"labels\": [\n                \"3123D52\",\n                \"3123D65\",\n                \"3123D79\",\n                \"3123D80\",\n                \"3123D54\",\n                \"3123D70\",\n                \"3123D7B\",\n                \"3123D89\",\n                \"3123D58\",\n                \"3123D76\",\n                \"3123D7D\",\n                \"3123EFE\",\n                \"3123D64\",\n                \"3123D78\",\n                \"3123D7E\",\n                \"3124E4A\",\n            ],\n            \"domains\": [32, 38, 8, 44, 14, 50, 20, 26],\n            \"num_examples_per_domain_per_label\": 2000,\n            \"pickle_path\": \"/mnt/wd500GB/CSC500/csc500-main/datasets/oracle.Run2_framed_2000Examples_stratified_ds.2022A.pkl\",\n            \"source_or_target_dataset\": \"target\",\n            \"x_transforms\": [\"unit_mag\"],\n            \"episode_transforms\": [],\n            \"domain_prefix\": \"ORACLE.run2_\",\n        },\n    ],\n}\n",
"_____no_output_____"
],
[
"# Set this to True if you want to run this template directly\nSTANDALONE = False\nif STANDALONE:\n    print(\"parameters not injected, running with standalone_parameters\")\n    parameters = standalone_parameters\n\nif 'parameters' not in locals() and 'parameters' not in globals():\n    raise Exception(\"Parameter injection failed\")\n\n# Use an EasyDict for all the parameters\np = EasyDict(parameters)\n\nsupplied_keys = set(p.keys())\n\nif supplied_keys != required_parameters:\n    print(\"Parameters are incorrect\")\n    if len(supplied_keys - required_parameters)>0: print(\"Shouldn't have:\", str(supplied_keys - required_parameters))\n    if len(required_parameters - supplied_keys)>0: print(\"Need to have:\", str(required_parameters - supplied_keys))\n    raise RuntimeError(\"Parameters are incorrect\")\n\n",
"_____no_output_____"
],
[
"###################################\n# Set the RNGs and make it all deterministic\n###################################\nnp.random.seed(p.seed)\nrandom.seed(p.seed)\ntorch.manual_seed(p.seed)\n\ntorch.use_deterministic_algorithms(True) ",
"_____no_output_____"
],
[
"###########################################\n# The stratified datasets honor this\n###########################################\ntorch.set_default_dtype(eval(p.torch_default_dtype))",
"_____no_output_____"
],
[
"###################################\n# Build the network(s)\n# Note: It's critical to do this AFTER setting the RNG\n###################################\nx_net = build_sequential(p.x_net)",
"_____no_output_____"
],
[
"start_time_secs = time.time()",
"_____no_output_____"
],
[
"p.domains_source = []\np.domains_target = []\n\n\ntrain_original_source = []\nval_original_source = []\ntest_original_source = []\n\ntrain_original_target = []\nval_original_target = []\ntest_original_target = []",
"_____no_output_____"
],
[
"# global_x_transform_func = lambda x: normalize(x.to(torch.get_default_dtype()), \"unit_power\") # unit_power, unit_mag\n# global_x_transform_func = lambda x: normalize(x, \"unit_power\") # unit_power, unit_mag",
"_____no_output_____"
],
[
"def add_dataset(\n labels,\n domains,\n pickle_path,\n x_transforms,\n episode_transforms,\n domain_prefix,\n num_examples_per_domain_per_label,\n source_or_target_dataset:str,\n iterator_seed=p.seed,\n dataset_seed=p.dataset_seed,\n n_shot=p.n_shot,\n n_way=p.n_way,\n n_query=p.n_query,\n train_val_test_k_factors=(p.train_k_factor,p.val_k_factor,p.test_k_factor),\n):\n \n if x_transforms == []: x_transform = None\n else: x_transform = get_chained_transform(x_transforms)\n \n if episode_transforms == []: episode_transform = None\n else: raise Exception(\"episode_transforms not implemented\")\n \n episode_transform = lambda tup, _prefix=domain_prefix: (_prefix + str(tup[0]), tup[1])\n\n\n eaf = Episodic_Accessor_Factory(\n labels=labels,\n domains=domains,\n num_examples_per_domain_per_label=num_examples_per_domain_per_label,\n iterator_seed=iterator_seed,\n dataset_seed=dataset_seed,\n n_shot=n_shot,\n n_way=n_way,\n n_query=n_query,\n train_val_test_k_factors=train_val_test_k_factors,\n pickle_path=pickle_path,\n x_transform_func=x_transform,\n )\n\n train, val, test = eaf.get_train(), eaf.get_val(), eaf.get_test()\n train = Lazy_Iterable_Wrapper(train, episode_transform)\n val = Lazy_Iterable_Wrapper(val, episode_transform)\n test = Lazy_Iterable_Wrapper(test, episode_transform)\n\n if source_or_target_dataset==\"source\":\n train_original_source.append(train)\n val_original_source.append(val)\n test_original_source.append(test)\n\n p.domains_source.extend(\n [domain_prefix + str(u) for u in domains]\n )\n elif source_or_target_dataset==\"target\":\n train_original_target.append(train)\n val_original_target.append(val)\n test_original_target.append(test)\n p.domains_target.extend(\n [domain_prefix + str(u) for u in domains]\n )\n else:\n raise Exception(f\"invalid source_or_target_dataset: {source_or_target_dataset}\")\n ",
"_____no_output_____"
],
[
"for ds in p.datasets:\n add_dataset(**ds)",
"_____no_output_____"
],
[
"# from steves_utils.CORES.utils import (\n# ALL_NODES,\n# ALL_NODES_MINIMUM_1000_EXAMPLES,\n# ALL_DAYS\n# )\n\n# add_dataset(\n# labels=ALL_NODES,\n# domains = ALL_DAYS,\n# num_examples_per_domain_per_label=100,\n# pickle_path=os.path.join(get_datasets_base_path(), \"cores.stratified_ds.2022A.pkl\"),\n# source_or_target_dataset=\"target\",\n# x_transform_func=global_x_transform_func,\n# domain_modifier=lambda u: f\"cores_{u}\"\n# )",
"_____no_output_____"
],
[
"# from steves_utils.ORACLE.utils_v2 import (\n# ALL_DISTANCES_FEET,\n# ALL_RUNS,\n# ALL_SERIAL_NUMBERS,\n# )\n\n\n# add_dataset(\n# labels=ALL_SERIAL_NUMBERS,\n# domains = list(set(ALL_DISTANCES_FEET) - {2,62}),\n# num_examples_per_domain_per_label=100,\n# pickle_path=os.path.join(get_datasets_base_path(), \"oracle.Run2_framed_2000Examples_stratified_ds.2022A.pkl\"),\n# source_or_target_dataset=\"source\",\n# x_transform_func=global_x_transform_func,\n# domain_modifier=lambda u: f\"oracle1_{u}\"\n# )\n",
"_____no_output_____"
],
[
"# from steves_utils.ORACLE.utils_v2 import (\n# ALL_DISTANCES_FEET,\n# ALL_RUNS,\n# ALL_SERIAL_NUMBERS,\n# )\n\n\n# add_dataset(\n# labels=ALL_SERIAL_NUMBERS,\n# domains = list(set(ALL_DISTANCES_FEET) - {2,62,56}),\n# num_examples_per_domain_per_label=100,\n# pickle_path=os.path.join(get_datasets_base_path(), \"oracle.Run2_framed_2000Examples_stratified_ds.2022A.pkl\"),\n# source_or_target_dataset=\"source\",\n# x_transform_func=global_x_transform_func,\n# domain_modifier=lambda u: f\"oracle2_{u}\"\n# )",
"_____no_output_____"
],
[
"# add_dataset(\n# labels=list(range(19)),\n# domains = [0,1,2],\n# num_examples_per_domain_per_label=100,\n# pickle_path=os.path.join(get_datasets_base_path(), \"metehan.stratified_ds.2022A.pkl\"),\n# source_or_target_dataset=\"target\",\n# x_transform_func=global_x_transform_func,\n# domain_modifier=lambda u: f\"met_{u}\"\n# )",
"_____no_output_____"
],
[
"# # from steves_utils.wisig.utils import (\n# # ALL_NODES_MINIMUM_100_EXAMPLES,\n# # ALL_NODES_MINIMUM_500_EXAMPLES,\n# # ALL_NODES_MINIMUM_1000_EXAMPLES,\n# # ALL_DAYS\n# # )\n\n# import steves_utils.wisig.utils as wisig\n\n\n# add_dataset(\n# labels=wisig.ALL_NODES_MINIMUM_100_EXAMPLES,\n# domains = wisig.ALL_DAYS,\n# num_examples_per_domain_per_label=100,\n# pickle_path=os.path.join(get_datasets_base_path(), \"wisig.node3-19.stratified_ds.2022A.pkl\"),\n# source_or_target_dataset=\"target\",\n# x_transform_func=global_x_transform_func,\n# domain_modifier=lambda u: f\"wisig_{u}\"\n# )",
"_____no_output_____"
],
[
"###################################\n# Build the dataset\n###################################\ntrain_original_source = Iterable_Aggregator(train_original_source, p.seed)\nval_original_source = Iterable_Aggregator(val_original_source, p.seed)\ntest_original_source = Iterable_Aggregator(test_original_source, p.seed)\n\n\ntrain_original_target = Iterable_Aggregator(train_original_target, p.seed)\nval_original_target = Iterable_Aggregator(val_original_target, p.seed)\ntest_original_target = Iterable_Aggregator(test_original_target, p.seed)\n\n# For CNN We only use X and Y. And we only train on the source.\n# Properly form the data using a transform lambda and Lazy_Iterable_Wrapper. Finally wrap them in a dataloader\n\ntransform_lambda = lambda ex: ex[1] # Original is (<domain>, <episode>) so we strip down to episode only\n\ntrain_processed_source = Lazy_Iterable_Wrapper(train_original_source, transform_lambda)\nval_processed_source = Lazy_Iterable_Wrapper(val_original_source, transform_lambda)\ntest_processed_source = Lazy_Iterable_Wrapper(test_original_source, transform_lambda)\n\ntrain_processed_target = Lazy_Iterable_Wrapper(train_original_target, transform_lambda)\nval_processed_target = Lazy_Iterable_Wrapper(val_original_target, transform_lambda)\ntest_processed_target = Lazy_Iterable_Wrapper(test_original_target, transform_lambda)\n\ndatasets = EasyDict({\n \"source\": {\n \"original\": {\"train\":train_original_source, \"val\":val_original_source, \"test\":test_original_source},\n \"processed\": {\"train\":train_processed_source, \"val\":val_processed_source, \"test\":test_processed_source}\n },\n \"target\": {\n \"original\": {\"train\":train_original_target, \"val\":val_original_target, \"test\":test_original_target},\n \"processed\": {\"train\":train_processed_target, \"val\":val_processed_target, \"test\":test_processed_target}\n },\n})",
"_____no_output_____"
],
[
"from steves_utils.transforms import get_average_magnitude, get_average_power\n\nprint(set([u for u,_ in val_original_source]))\nprint(set([u for u,_ in val_original_target]))\n\ns_x, s_y, q_x, q_y, _ = next(iter(train_processed_source))\nprint(s_x)\n\n# for ds in [\n# train_processed_source,\n# val_processed_source,\n# test_processed_source,\n# train_processed_target,\n# val_processed_target,\n# test_processed_target\n# ]:\n# for s_x, s_y, q_x, q_y, _ in ds:\n# for X in (s_x, q_x):\n# for x in X:\n# assert np.isclose(get_average_magnitude(x.numpy()), 1.0)\n# assert np.isclose(get_average_power(x.numpy()), 1.0)\n ",
"{'W_A_4', 'W_A_2', 'W_A_1', 'C_A_3', 'C_A_5', 'C_A_2', 'C_A_4', 'C_A_1', 'W_A_3'}\n"
],
[
"###################################\n# Build the model\n###################################\nmodel = Steves_Prototypical_Network(x_net, device=p.device, x_shape=(2,256))\noptimizer = Adam(params=model.parameters(), lr=p.lr)",
"(2, 256)\n"
],
[
"###################################\n# train\n###################################\njig = PTN_Train_Eval_Test_Jig(model, p.BEST_MODEL_PATH, p.device)\n\njig.train(\n train_iterable=datasets.source.processed.train,\n source_val_iterable=datasets.source.processed.val,\n target_val_iterable=datasets.target.processed.val,\n num_epochs=p.n_epoch,\n num_logs_per_epoch=p.NUM_LOGS_PER_EPOCH,\n patience=p.patience,\n optimizer=optimizer,\n criteria_for_best=p.criteria_for_best,\n)",
"epoch: 1, [batch: 1 / 2081], examples_per_second: 128.2052, train_label_loss: 2.9273, \n"
],
[
"total_experiment_time_secs = time.time() - start_time_secs",
"_____no_output_____"
],
[
"###################################\n# Evaluate the model\n###################################\nsource_test_label_accuracy, source_test_label_loss = jig.test(datasets.source.processed.test)\ntarget_test_label_accuracy, target_test_label_loss = jig.test(datasets.target.processed.test)\n\nsource_val_label_accuracy, source_val_label_loss = jig.test(datasets.source.processed.val)\ntarget_val_label_accuracy, target_val_label_loss = jig.test(datasets.target.processed.val)\n\nhistory = jig.get_history()\n\ntotal_epochs_trained = len(history[\"epoch_indices\"])\n\nval_dl = Iterable_Aggregator((datasets.source.original.val,datasets.target.original.val))\n\nconfusion = ptn_confusion_by_domain_over_dataloader(model, p.device, val_dl)\nper_domain_accuracy = per_domain_accuracy_from_confusion(confusion)\n\n# Add a key to per_domain_accuracy for if it was a source domain\nfor domain, accuracy in per_domain_accuracy.items():\n per_domain_accuracy[domain] = {\n \"accuracy\": accuracy,\n \"source?\": domain in p.domains_source\n }\n\n# Do an independent accuracy assesment JUST TO BE SURE!\n# _source_test_label_accuracy = independent_accuracy_assesment(model, datasets.source.processed.test, p.device)\n# _target_test_label_accuracy = independent_accuracy_assesment(model, datasets.target.processed.test, p.device)\n# _source_val_label_accuracy = independent_accuracy_assesment(model, datasets.source.processed.val, p.device)\n# _target_val_label_accuracy = independent_accuracy_assesment(model, datasets.target.processed.val, p.device)\n\n# assert(_source_test_label_accuracy == source_test_label_accuracy)\n# assert(_target_test_label_accuracy == target_test_label_accuracy)\n# assert(_source_val_label_accuracy == source_val_label_accuracy)\n# assert(_target_val_label_accuracy == target_val_label_accuracy)\n\nexperiment = {\n \"experiment_name\": p.experiment_name,\n \"parameters\": dict(p),\n \"results\": {\n \"source_test_label_accuracy\": source_test_label_accuracy,\n 
\"source_test_label_loss\": source_test_label_loss,\n \"target_test_label_accuracy\": target_test_label_accuracy,\n \"target_test_label_loss\": target_test_label_loss,\n \"source_val_label_accuracy\": source_val_label_accuracy,\n \"source_val_label_loss\": source_val_label_loss,\n \"target_val_label_accuracy\": target_val_label_accuracy,\n \"target_val_label_loss\": target_val_label_loss,\n \"total_epochs_trained\": total_epochs_trained,\n \"total_experiment_time_secs\": total_experiment_time_secs,\n \"confusion\": confusion,\n \"per_domain_accuracy\": per_domain_accuracy,\n },\n \"history\": history,\n \"dataset_metrics\": get_dataset_metrics(datasets, \"ptn\"),\n}",
"_____no_output_____"
],
[
"ax = get_loss_curve(experiment)\nplt.show()",
"_____no_output_____"
],
[
"get_results_table(experiment)",
"_____no_output_____"
],
[
"get_domain_accuracies(experiment)",
"_____no_output_____"
],
[
"print(\"Source Test Label Accuracy:\", experiment[\"results\"][\"source_test_label_accuracy\"], \"Target Test Label Accuracy:\", experiment[\"results\"][\"target_test_label_accuracy\"])\nprint(\"Source Val Label Accuracy:\", experiment[\"results\"][\"source_val_label_accuracy\"], \"Target Val Label Accuracy:\", experiment[\"results\"][\"target_val_label_accuracy\"])",
"Source Test Label Accuracy: 0.9677868150684932 Target Test Label Accuracy: 0.20481770833333332\nSource Val Label Accuracy: 0.9731378424657534 Target Val Label Accuracy: 0.20755208333333333\n"
],
[
"json.dumps(experiment)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f7a838b037059e04a6a7effaa007945460416 | 16,734 | ipynb | Jupyter Notebook | results/draw_results.ipynb | Shekhale/gbnns_theory | d089878c707e5372e0e2d809b650ba330aa55690 | [
"MIT"
] | 11 | 2020-07-02T11:18:09.000Z | 2022-03-29T20:11:22.000Z | results/draw_results.ipynb | Shekhale/gbnns_theory | d089878c707e5372e0e2d809b650ba330aa55690 | [
"MIT"
] | null | null | null | results/draw_results.ipynb | Shekhale/gbnns_theory | d089878c707e5372e0e2d809b650ba330aa55690 | [
"MIT"
] | 1 | 2020-07-02T11:45:37.000Z | 2020-07-02T11:45:37.000Z | 29.935599 | 129 | 0.489662 | [
[
[
"import tqdm\nfrom tqdm import tqdm_notebook\n\nimport time\n\nimport numpy as np\nimport matplotlib\nimport matplotlib.pyplot as plt\nimport mpl_toolkits.mplot3d.axes3d as axes3d\n\nimport matplotlib.ticker as ticker",
"_____no_output_____"
],
[
"# import warnings\n# warnings.filterwarnings('ignore')",
"_____no_output_____"
],
[
"import pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns",
"_____no_output_____"
],
[
"import os",
"_____no_output_____"
],
[
"# fonts for ICML\ndef SetPlotRC():\n #If fonttype = 1 doesn't work with LaTeX, try fonttype 42.\n plt.rc('pdf',fonttype = 42)\n plt.rc('ps',fonttype = 42)",
"_____no_output_____"
],
[
"SetPlotRC()",
"_____no_output_____"
],
[
"def get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title, num_show = []):\n acc_num = 3\n dist_calc_num = 7\n df = []\n for i in range(len(file_names)):\n file_name = os.path.expanduser(file_names[i])\n data = np.genfromtxt(file_name, dtype=('U10','U10','U10',float,'U10',int,'U10',int,'U10',float)).tolist()\n cur_line = -1\n for mod in range(num_models[i]):\n for j in range(num_exper[i]):\n cur_line += 1\n if y_axe == 11:\n df.append([1.00001 - data[cur_line][acc_num],\n 1 / data[cur_line][y_axe], model_names[i][mod], title])\n else:\n df.append([1.00001 - data[cur_line][acc_num],\n data[cur_line][y_axe], model_names[i][mod], title])\n\n df = pd.DataFrame(df, columns=[\"Error = 1 - Recall@1\", y_axe_name, \"algorithm\", \"title\"])\n# print(df.shape)\n if len(num_show) > 0:\n it = 0\n itt = 0\n num_for_iloc = []\n model_names_list = []\n for i in range(len(file_names)):\n for mod in range(len(model_names[i])):\n model_names_list.append(model_names[i][mod])\n allowed_set = set()\n same_dict = dict()\n for i in range(len(file_names)):\n for mod in range(len(model_names[i])):\n if itt in num_show:\n allowed_set.add(model_names_list[i])\n for j in range(num_exper[i]):\n num_for_iloc.append(it)\n it += 1\n else:\n it += num_exper[i]\n itt += 1\n df = df.iloc[num_for_iloc]\n \n return df",
"_____no_output_____"
],
[
"def show_results(frames, title, y_axe_name, x_log=True, y_log=False,\n dims=(18, 12), save=False, file_name='trash'):\n size = len(frames)\n ylim = [[500, 5000], [0, 1000],[0, 1000],[0, 1000]]\n a4_dims = dims\n fig, axs = plt.subplots(2, 2, figsize=a4_dims)\n for i in range(2):\n for j in range(2):\n num = i * 2 + j\n if i + j == 2:\n sns.lineplot(x=\"Error = 1 - Recall@1\", y=y_axe_name,hue=\"algorithm\",\n markers=True, style=\"algorithm\", dashes=False,\n data=frames[num], ax=axs[i, j], linewidth=3, ms=15)\n else:\n sns.lineplot(x=\"Error = 1 - Recall@1\", y=y_axe_name,hue=\"algorithm\",\n markers=True, style=\"algorithm\", dashes=False,\n data=frames[num], ax=axs[i, j], legend=False, linewidth=3, ms=15)\n \n axs[i, j].set_title(title[num], size='30')\n\n lx = axs[i, j].get_xlabel()\n ly = axs[i, j].get_ylabel()\n axs[i, j].set_xlabel(lx, fontsize=25)\n axs[i, j].set_ylabel(ly, fontsize=25)\n axs[i, j].tick_params(axis='both', which='both', labelsize=25)\n axs[i, j].set_ymargin(0.075)\n if i == 0:\n axs[i, j].set_xlabel('')\n if j == 1:\n axs[i, j].set_ylabel('')\n \n \n# ApplyFont(axs[i,j])\n \n plt.legend(loc=2, bbox_to_anchor=(1.05, 1, 0.5, 0.5), fontsize='30', markerscale=3, borderaxespad=0.)\n if y_log:\n for i in range(2):\n for j in range(2):\n axs[i, j].set(yscale=\"log\")\n \n if x_log:\n for i in range(2):\n for j in range(2):\n axs[i, j].set(xscale=\"log\")# num_exper = [6, 6, 3]\n if save:\n fig.savefig(file_name + \".pdf\", bbox_inches='tight')",
"_____no_output_____"
],
[
"path = '~/Desktop/results/synthetic_n_10_6_d_'\n# path = '~/results/synthetic_n_10_6_d_'",
"_____no_output_____"
],
[
"y_axe = 7\ny_axe_name = \"dist calc\"\nmodel_names = [['kNN', 'kNN + Kl', 'kNN + Kl + llf', 'kNN + beam', 'kNN + beam + Kl + llf']]\nnum_show = [0, 1, 2, 3, 4]\n\nnum_exper = [5]\nnum_models = [5]\nfile_names = [path + '3.txt']\ndf_2 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\n# print(df_2)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '5.txt']\ndf_4 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '9.txt']\ndf_8 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n \nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '17.txt']\n\ndf_16 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nframes = [df_2, df_4, df_8, df_16]\nshow_results(frames, ['d = 2', 'd = 4', 'd = 8', 'd = 16'], y_axe_name,\n y_log=False, x_log=True, dims=(24, 14),\n save=False, file_name='synthetic_datasets_2_2_final')\n",
"_____no_output_____"
]
],
[
[
"## Supplementary",
"_____no_output_____"
]
],
[
[
"def show_results_dist_1_3(frames, title, y_axe_name,\n dims=(18, 12), save=False, file_name='trash', legend_size=13):\n size = len(frames)\n a4_dims = dims\n fig, axs = plt.subplots(1, 3, figsize=a4_dims)\n for i in range(3):\n sns.lineplot(x=\"Error = 1 - Recall@1\", y=y_axe_name,hue=\"algorithm\",\n markers=True, style=\"algorithm\", dashes=False,\n data=frames[i], ax=axs[i], linewidth=2, ms=10)\n \n axs[i].set_title(title[i], size='20')\n\n lx = axs[i].get_xlabel()\n ly = axs[i].get_ylabel()\n axs[i].set_xlabel(lx, fontsize=20)\n axs[i].set_ylabel(ly, fontsize=20)\n axs[i].set_xscale('log')\n if i == 0:\n axs[i].set_xticks([0.001, 0.01, .1])\n else:\n axs[i].set_xticks([0.01, 0.1])\n axs[i].get_xaxis().set_major_formatter(matplotlib.ticker.ScalarFormatter())\n axs[i].tick_params(axis='both', which='both', labelsize=15)\n if i > 0:\n axs[i].set_ylabel('')\n \n plt.setp(axs[i].get_legend().get_texts(), fontsize=legend_size)\n \n if save:\n fig.savefig(file_name + \".pdf\", bbox_inches='tight')",
"_____no_output_____"
],
[
"y_axe = 7\ny_axe_name = \"dist calc\"\nmodel_names = [['kNN', 'kNN + Kl + llf 4', 'kNN + Kl + llf 8',\n 'kNN + Kl + llf 16', 'kNN + Kl + llf 32']]\n\nnum_show = [0, 1, 2, 3, 4]\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '5_nlt.txt']\ndf_kl_4 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '9_nlt.txt']\ndf_kl_8 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '17_nlt.txt']\ndf_kl_16 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nframes = [df_kl_4, df_kl_8, df_kl_16]\nshow_results_dist_1_3(frames, ['d = 4', 'd = 8', 'd = 16'], y_axe_name, dims=(20, 6),\n save=False, file_name='suppl_figure_optimal_kl_number')\n\n",
"_____no_output_____"
],
[
"path_end = '_llt.txt'",
"_____no_output_____"
],
[
"y_axe = 7\ny_axe_name = \"dist calc\"\nmodel_names = [['kNN', 'thrNN', 'kNN + Kl-dist + llf', 'kNN + Kl-rank + llf', 'kNN + Kl-rank sample + llf']]\n\nnum_show = [0, 1, 2, 3, 4]\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '5' + path_end]\ndf_kl_4 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '9' + path_end]\ndf_kl_8 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nnum_exper = [6]\nnum_models = [5]\nfile_names = [path + '17' + path_end]\ndf_kl_16 = get_df(num_models, num_exper, file_names, model_names, y_axe, y_axe_name, title=\"trash\", num_show=num_show)\n\nframes = [df_kl_4, df_kl_8, df_kl_16]\nshow_results_dist_1_3(frames, ['d = 4', 'd = 8', 'd = 16'], y_axe_name, dims=(20, 6),\n save=False, file_name='suppl_figure_optimal_kl_type', legend_size=10)\n\n",
"_____no_output_____"
],
[
"path_start = \"~/Desktop/results/distr_to_1_\"\npath_end = \".txt\"",
"_____no_output_____"
],
[
"\na4_dims = (7, 3)\nfig, ax = plt.subplots(figsize=a4_dims)\nax.set_yticks([]) \n\nfile_name = path_start + \"sift\" + path_end\nfile_name = os.path.expanduser(file_name)\ndistr = np.genfromtxt(file_name)\nsns.distplot(distr, label=\"SIFT\")\n\nfile_name = path_start + \"d_9\" + path_end\nfile_name = os.path.expanduser(file_name)\ndistr = np.genfromtxt(file_name)\nsns.distplot(distr, label=\"d=8\")\n\nfile_name = path_start + \"d_17\" + path_end\nfile_name = os.path.expanduser(file_name)\ndistr = np.genfromtxt(file_name)\nsns.distplot(distr, label=\"d=16\")\n\nfile_name = path_start + \"d_33\" + path_end\nfile_name = os.path.expanduser(file_name)\ndistr = np.genfromtxt(file_name)\nsns.distplot(distr, label=\"d=32\")\n\nfile_name = path_start + \"d_65\" + path_end\nfile_name = os.path.expanduser(file_name)\ndistr = np.genfromtxt(file_name)\nsns.distplot(distr, label=\"d=64\")\n\n\nplt.legend()\n\nfig.savefig(\"suppl_dist_disrt.pdf\", bbox_inches='tight')",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07f876d14babe22b5e1e48a5f300ae813406964 | 2,751 | ipynb | Jupyter Notebook | Metody Numeryczne 2019/Lab 9/lab9.ipynb | jakub-sacha/public_lectures | fbd1360e0a0f4655985e49ef53fcecfd5e99367f | [
"CC-BY-4.0"
] | null | null | null | Metody Numeryczne 2019/Lab 9/lab9.ipynb | jakub-sacha/public_lectures | fbd1360e0a0f4655985e49ef53fcecfd5e99367f | [
"CC-BY-4.0"
] | null | null | null | Metody Numeryczne 2019/Lab 9/lab9.ipynb | jakub-sacha/public_lectures | fbd1360e0a0f4655985e49ef53fcecfd5e99367f | [
"CC-BY-4.0"
] | null | null | null | 23.715517 | 216 | 0.535078 | [
[
[
"# <center>Laboratorium 9: Układy równań liniowych - metody iteracyjne</center>",
"_____no_output_____"
],
[
"Instrukcja: \nNa zajęciach należy wykonać poniższe zadania, a następnie sporządzić sprawozdanie zawierające odpowiedzi (w postaci kodu) z komentarzami w środowisku Jupyter Notebook i umieścić je na platformie e-learningowej.",
"_____no_output_____"
],
[
"***Temat główny:*** \n\nRozwiązać układ równań: \n \n### $ Ax = b$\n\ngdzie:\n\n$\\mathbf{A} =\\left[ \\begin{matrix}\n10 & 0 & 2 & 2\\\\\n-1 & -13 & 0 & 1\\\\\n0 & 0.5 & 15 & -1 \\\\\n17 & -3 & 2 & -4\n\\end{matrix}\\right]$\n\n$\\mathbf{b} =\\left[ \\begin{matrix}\n22\\\\\n50\\\\\n280\\\\\n164\n\\end{matrix}\\right]\n$ \n\n\nmetodami:\n* [Jacobiego](https://en.wikipedia.org/wiki/Jacobi_method),\n* [Gaussa-Seidla](https://en.wikipedia.org/wiki/Gauss%E2%80%93Seidel_method)",
"_____no_output_____"
],
[
"***Zadanie 1.*** \nRozwiąż zadanie dowolną metodą dokładną",
"_____no_output_____"
],
[
"***Zadanie 2.*** \nZaimplementuj metodę Jakobiego i rozwiąż zadanie tą metodą z dokładnością do $10^{-6}$\n* Zaimplementuj również sprawdzenie warunku koniecznego dla zbieżności ",
"_____no_output_____"
],
[
"***Zadanie 3.*** \nZaimplementuj metodę Gaussa-Seidla i rozwiąż zadanie tą metodą z dokładnością do $10^{-6}$\n* Zaimplementuj również sprawdzenie warunku koniecznego dla zbieżności ",
"_____no_output_____"
],
[
"***Zadanie 4.*** \nPorównaj zbieżność tych metod (ilość iteracji)",
"_____no_output_____"
],
[
"***Zadanie 5.*** \nPorównaj czas wykonywania tych metod z metodą użytą do rozwiązania dokładnego",
"_____no_output_____"
]
]
] | [
"markdown"
] | [
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
d07f9a80e2e5e3bc8c23a818489d0fa6883f1387 | 17,944 | ipynb | Jupyter Notebook | autompg_regression-Copy1.ipynb | power3247/test_machinlearning | a107360c07ad63e520db2c92968a9b7384dcb7cf | [
"Apache-2.0"
] | null | null | null | autompg_regression-Copy1.ipynb | power3247/test_machinlearning | a107360c07ad63e520db2c92968a9b7384dcb7cf | [
"Apache-2.0"
] | null | null | null | autompg_regression-Copy1.ipynb | power3247/test_machinlearning | a107360c07ad63e520db2c92968a9b7384dcb7cf | [
"Apache-2.0"
] | null | null | null | 23.213454 | 83 | 0.369148 | [
[
[
"import sklearn",
"_____no_output_____"
],
[
"print('a')",
"a\n"
],
[
"from sklearn.preprocessing import StandardScaler\nX = [[0, 15], [1, -10]]\nStandardScaler().fit(X).transform(X)",
"_____no_output_____"
],
[
"import pandas as pd\n",
"_____no_output_____"
],
[
"pd_data = pd.read_csv('./files/auto-mpg.csv', header=None)\npd_data.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 398 entries, 0 to 397\nData columns (total 9 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 0 398 non-null float64\n 1 1 398 non-null int64 \n 2 2 398 non-null float64\n 3 3 398 non-null object \n 4 4 398 non-null float64\n 5 5 398 non-null float64\n 6 6 398 non-null int64 \n 7 7 398 non-null int64 \n 8 8 398 non-null object \ndtypes: float64(4), int64(3), object(2)\nmemory usage: 28.1+ KB\n"
],
[
"pd_data.columns = ['mpg','cylinders','displacement','horsepower','weight',\n 'acceleration','model year','origin','name']",
"_____no_output_____"
],
[
"from sklearn.linear_model import LinearRegression",
"_____no_output_____"
],
[
"x=pd_data[['weight','cylinders']]\n",
"_____no_output_____"
],
[
"y=pd_data[['mpg']]",
"_____no_output_____"
],
[
"pd_data",
"_____no_output_____"
],
[
"lr = LinearRegression()",
"_____no_output_____"
],
[
"lr.fit(x,y)",
"_____no_output_____"
],
[
"lr.coef_, lr.intercept_ #기울기, 절편",
"_____no_output_____"
]
],
[
[
"#### y = 0.14766146x + 12.10283073",
"_____no_output_____"
]
],
[
[
"lr.score(x,y) #정확도",
"_____no_output_____"
]
],
[
[
"## 사이킷 런 모델링을 통해서 두 데이터를 주어주고 선형모델을 만든것 ",
"_____no_output_____"
]
],
[
[
"y_predicted = lr.predict([[99]])",
"_____no_output_____"
],
[
"y_predicted",
"_____no_output_____"
]
],
[
[
"1. 정보단계: 수집 가공\n- 문제 데이터 확인 및 처리\n2. 교육 단계: 머신 대상\n- 컬럼선택\n- 모델 선택\n- 교육\n- 정확도 확인\n3. 서비스 단계:고객응대",
"_____no_output_____"
],
[
"#### X의 값을 늘릴수 있다 ",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import train_test_split",
"_____no_output_____"
],
[
"X_train, X_test, Y_train, Y_test = train_test_split(x, y)",
"_____no_output_____"
],
[
"X_train, X_test, Y_train, Y_test",
"_____no_output_____"
],
[
"lr.fit(X_train, Y_train)",
"_____no_output_____"
],
[
"lr.coef_, lr.intercept_ #기울기, 절편",
"_____no_output_____"
],
[
"lr.score(X_train, Y_train)",
"_____no_output_____"
],
[
"lr.score(X_test,Y_test)",
"_____no_output_____"
],
[
"lr.predict([[3504.0, 8]])",
"_____no_output_____"
],
[
"lr.predict([[2790.0, 4]])",
"_____no_output_____"
],
[
"import pickle",
"_____no_output_____"
],
[
"pickle.dump(lr,open('./saves/autompg_lr.pkl','wb'))",
"_____no_output_____"
],
[
"abc = pickle.load(open('./saves/autompg_lr.pkl','rb'))",
"_____no_output_____"
]
]
] | [
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07fa5e100d3ec3a3cab208ef65fef4908c6ca90 | 273,099 | ipynb | Jupyter Notebook | mdr_analysis/3Fig-version0-AUC_IQR_across_therapy_and_sets-separate.ipynb | bonilab/malariaibm-generation-of-MDR-mutants | 41086cf621274dcdf8b445435cb1825424a2e722 | [
"MIT"
] | null | null | null | mdr_analysis/3Fig-version0-AUC_IQR_across_therapy_and_sets-separate.ipynb | bonilab/malariaibm-generation-of-MDR-mutants | 41086cf621274dcdf8b445435cb1825424a2e722 | [
"MIT"
] | null | null | null | mdr_analysis/3Fig-version0-AUC_IQR_across_therapy_and_sets-separate.ipynb | bonilab/malariaibm-generation-of-MDR-mutants | 41086cf621274dcdf8b445435cb1825424a2e722 | [
"MIT"
] | null | null | null | 647.154028 | 64,740 | 0.936291 | [
[
[
"# Fig.3 - Comparisons of MDR's AUC across 12 different sets",
"_____no_output_____"
],
[
"Change plot's default size and font",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nplt.rcParams['figure.figsize'] = [20, 12]\nrc = {\"font.family\" : \"sans-serif\", \n \"font.style\" : \"normal\",\n \"mathtext.fontset\" : \"dejavusans\"}\nplt.rcParams.update(rc)\nplt.rcParams[\"font.sans-serif\"] = [\"Myriad Pro\"] + plt.rcParams[\"font.sans-serif\"]",
"_____no_output_____"
],
[
"from constant import HEADER_NAME, COLUMNS_TO_DROP\nimport pandas as pd",
"_____no_output_____"
],
[
"TITLE_FONTSIZE = 20\nXLABEL_FONTSIZE = 18",
"_____no_output_____"
]
],
[
[
"## PfPR = 0.1%",
"_____no_output_____"
]
],
[
[
"fig, axes = plt.subplots(5, 3, sharex='col', sharey='row',\n gridspec_kw={'hspace': 0, 'wspace': 0})\nfig.patch.set_facecolor('white')\n\nfig.suptitle('PfPR = 0.1%', y=0.93, fontweight='bold', fontsize=TITLE_FONTSIZE)\n# plot IQR of AUC for most-dangerous-triple and -double\nfrom plotter import fig3_dangerous_triple_or_double_AUC_IQR\n\nfor (a,b,idx) in zip([1,5,9],[20,40,60],[0,1,2]): # set#, coverage, index#\n file_path_cyc = 'raw_data/set%s_c/monthly/set%sc_' % (a,a) +'%smonthly_data_0.txt'\n file_path_mft = 'raw_data/set%s_m/monthly/set%sm_' % (a,a) + '%smonthly_data_0.txt'\n file_path_adpcyc = 'raw_data/set%s_ac/monthly/set%sac_' % (a,a) + '%smonthly_data_0.txt'\n \n dflist_cyc = []\n dflist_mft = []\n dflist_adpcyc = []\n for i in range(1,101):\n dflist_cyc.append(pd.read_csv(file_path_cyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_mft.append(pd.read_csv(file_path_mft % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_adpcyc.append(pd.read_csv(file_path_adpcyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n \n axes[0][idx].set_title('%s%% coverage' % b, fontsize=XLABEL_FONTSIZE)\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[0][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='TYY..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[1][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='KNF..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[2][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='DHA-PPQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[3][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='ASAQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[4][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='AL', \n option='double')\n\nfor c in range(5):\n for 
tick in axes[c][0].yaxis.get_major_ticks():\n tick.label.set_fontsize(XLABEL_FONTSIZE)\n\naxes[0][0].set_xlim(-1, 22) # left col\naxes[0][1].set_xlim(-90/22, 90) # mid col\n\nfig.add_subplot(111, frameon=False)\n# hide tick and tick label of the big axis\nplt.tick_params(labelcolor='none', top=False, bottom=False, left=False, right=False)\n# add common x- and y-labels\nplt.xlabel('AUC', fontsize=XLABEL_FONTSIZE)",
"_____no_output_____"
]
],
[
[
"## PfPR = 1%",
"_____no_output_____"
]
],
[
[
"fig, axes = plt.subplots(5, 3, sharex='col', sharey='row',\n gridspec_kw={'hspace': 0, 'wspace': 0})\nfig.patch.set_facecolor('white')\n\nfig.suptitle('PfPR = 1%', y=0.93, fontweight='bold', fontsize=TITLE_FONTSIZE)\n# plot IQR of AUC for most-dangerous-triple and -double\nfrom plotter import fig3_dangerous_triple_or_double_AUC_IQR\n\nfor (a,b,idx) in zip([2,6,10],[20,40,60],[0,1,2]): # set#, coverage, index#\n file_path_cyc = 'raw_data/set%s_c/monthly/set%sc_' % (a,a) +'%smonthly_data_0.txt'\n file_path_mft = 'raw_data/set%s_m/monthly/set%sm_' % (a,a) + '%smonthly_data_0.txt'\n file_path_adpcyc = 'raw_data/set%s_ac/monthly/set%sac_' % (a,a) + '%smonthly_data_0.txt'\n \n dflist_cyc = []\n dflist_mft = []\n dflist_adpcyc = []\n for i in range(1,101):\n dflist_cyc.append(pd.read_csv(file_path_cyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_mft.append(pd.read_csv(file_path_mft % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_adpcyc.append(pd.read_csv(file_path_adpcyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n \n axes[0][idx].set_title('%s%% coverage' % b, fontsize=XLABEL_FONTSIZE)\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[0][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='TYY..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[1][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='KNF..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[2][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='DHA-PPQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[3][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='ASAQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[4][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='AL', \n option='double')\n \nfor c in range(5):\n for 
tick in axes[c][0].yaxis.get_major_ticks():\n tick.label.set_fontsize(XLABEL_FONTSIZE)\n \nfig.add_subplot(111, frameon=False)\n# hide tick and tick label of the big axis\nplt.tick_params(labelcolor='none', top=False, bottom=False, left=False, right=False)\n# add common x- and y-labels\nplt.xlabel('AUC', fontsize=XLABEL_FONTSIZE)",
"_____no_output_____"
]
],
[
[
"## PfPR = 5%",
"_____no_output_____"
]
],
[
[
"fig, axes = plt.subplots(5, 3, sharex='col', sharey='row',\n gridspec_kw={'hspace': 0, 'wspace': 0})\nfig.patch.set_facecolor('white')\n\nfig.suptitle('PfPR = 5%', y=0.93, fontweight='bold', fontsize=TITLE_FONTSIZE)\n# plot IQR of AUC for most-dangerous-triple and -double\nfrom plotter import fig3_dangerous_triple_or_double_AUC_IQR\n\nfor (a,b,idx) in zip([3,7,11],[20,40,60],[0,1,2]): # set#, coverage, index#\n file_path_cyc = 'raw_data/set%s_c/monthly/set%sc_' % (a,a) +'%smonthly_data_0.txt'\n file_path_mft = 'raw_data/set%s_m/monthly/set%sm_' % (a,a) + '%smonthly_data_0.txt'\n file_path_adpcyc = 'raw_data/set%s_ac/monthly/set%sac_' % (a,a) + '%smonthly_data_0.txt'\n \n dflist_cyc = []\n dflist_mft = []\n dflist_adpcyc = []\n for i in range(1,101):\n dflist_cyc.append(pd.read_csv(file_path_cyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_mft.append(pd.read_csv(file_path_mft % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_adpcyc.append(pd.read_csv(file_path_adpcyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n \n axes[0][idx].set_title('%s%% coverage' % b, fontsize=XLABEL_FONTSIZE)\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[0][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='TYY..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[1][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='KNF..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[2][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='DHA-PPQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[3][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='ASAQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[4][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='AL', \n option='double')\n \nfor c in range(5):\n for 
tick in axes[c][0].yaxis.get_major_ticks():\n tick.label.set_fontsize(XLABEL_FONTSIZE)\n \nfig.add_subplot(111, frameon=False)\n# hide tick and tick label of the big axis\nplt.tick_params(labelcolor='none', top=False, bottom=False, left=False, right=False)\n# add common x- and y-labels\nplt.xlabel('AUC', fontsize=XLABEL_FONTSIZE)",
"_____no_output_____"
]
],
[
[
"## PfPR = 20%",
"_____no_output_____"
]
],
[
[
"fig, axes = plt.subplots(5, 3, sharex='col', sharey='row',\n gridspec_kw={'hspace': 0, 'wspace': 0})\nfig.patch.set_facecolor('white')\n\nfig.suptitle('PfPR = 20%', y=0.93, fontweight='bold', fontsize=TITLE_FONTSIZE)\n# plot IQR of AUC for most-dangerous-triple and -double\nfrom plotter import fig3_dangerous_triple_or_double_AUC_IQR\n\nfor (a,b,idx) in zip([4,8,12],[20,40,60],[0,1,2]): # set#, coverage, index#\n file_path_cyc = 'raw_data/set%s_c/monthly/set%sc_' % (a,a) +'%smonthly_data_0.txt'\n file_path_mft = 'raw_data/set%s_m/monthly/set%sm_' % (a,a) + '%smonthly_data_0.txt'\n file_path_adpcyc = 'raw_data/set%s_ac/monthly/set%sac_' % (a,a) + '%smonthly_data_0.txt'\n \n dflist_cyc = []\n dflist_mft = []\n dflist_adpcyc = []\n for i in range(1,101):\n dflist_cyc.append(pd.read_csv(file_path_cyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_mft.append(pd.read_csv(file_path_mft % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n dflist_adpcyc.append(pd.read_csv(file_path_adpcyc % i, index_col=False, \\\n names=HEADER_NAME, sep='\\t').drop(columns=COLUMNS_TO_DROP))\n \n axes[0][idx].set_title('%s%% coverage' % b, fontsize=XLABEL_FONTSIZE)\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[0][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='TYY..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[1][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='KNF..Y2.', \n option='triple')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[2][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='DHA-PPQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[3][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='ASAQ', \n option='double')\n fig3_dangerous_triple_or_double_AUC_IQR(ax=axes[4][idx], m=dflist_mft, c=dflist_cyc, \n a=dflist_adpcyc, pattern='AL', \n option='double')\n \nfor c in range(5):\n for 
tick in axes[c][0].yaxis.get_major_ticks():\n tick.label.set_fontsize(XLABEL_FONTSIZE)\n \nfig.add_subplot(111, frameon=False)\n# hide tick and tick label of the big axis\nplt.tick_params(labelcolor='none', top=False, bottom=False, left=False, right=False)\n# add common x- and y-labels\nplt.xlabel('AUC', fontsize=XLABEL_FONTSIZE)",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] | [
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
d07fafae0b25cd867711a2d52246665010e427e8 | 11,842 | ipynb | Jupyter Notebook | Amrita/Function_Assisgnment.ipynb | Sureshkrishh/Practice-codes | 3a96584866d0ca4f5975dccc4809eb615a8aa2e0 | [
"MIT"
] | null | null | null | Amrita/Function_Assisgnment.ipynb | Sureshkrishh/Practice-codes | 3a96584866d0ca4f5975dccc4809eb615a8aa2e0 | [
"MIT"
] | null | null | null | Amrita/Function_Assisgnment.ipynb | Sureshkrishh/Practice-codes | 3a96584866d0ca4f5975dccc4809eb615a8aa2e0 | [
"MIT"
] | 5 | 2020-08-04T15:58:22.000Z | 2021-08-04T19:16:18.000Z | 22.343396 | 140 | 0.477791 | [
[
[
"# Define a function calls addNumber(x, y) that takes in two number and returns the sum of the two numbers.\n\ndef addNumber(x, y):\n return(sum((x, y)))\n\naddNumber(10,20)",
"_____no_output_____"
],
[
"# Define a function calls subtractNumber(x, y) that takes in two numbers and returns the difference of the two numbers.\n\ndef subtractNumber(x,y):\n return(x-y)\nsubtractNumber(10,20)",
"_____no_output_____"
],
[
"# Write a function getBiggerNumber(x, y) that takes in two numbers as arguments and returns the bigger number.\n\n\ndef getBiggerNumber(x,y):\n return(max(x,y))\ngetBiggerNumber(56,32)",
"_____no_output_____"
],
[
"# You will need to do a \"import math\" before you are allowed to use the functions within the math module.\n# Calculate the square root of 16 and stores it in the variable a\n# Calculate 3 to the power of 5 and stores it in the variable b\n# Calculate area of circle with radius = 3.0 by making use of the math.pi constant and store it in the variable c",
"_____no_output_____"
],
[
"import math\ndef sq(sq_root):\n a=math.sqrt(sq_root)\n return a\nsq(16)",
"_____no_output_____"
],
[
"def pwr(x,y):\n b=math.pow(x,y)\n return b \n\npwr(3,5)",
"_____no_output_____"
],
[
"def area(r):\n c=round((math.pi)*(r*r),2)\n return c \n\narea(3.0)",
"_____no_output_____"
],
[
"# Write a function to convert temperature from Celsius to Fahrenheit scale.\n# oC to oF Conversion: Multiply by 9, then divide by 5, then add 32.\n\n# Note: Return a string of 2 decimal places.\n# In - Cel2Fah(28.0)\n# Out - '82.40'\n# In - Cel2Fah(0.00)\n# Out - '32.00'\n\ndef Cel2Fah(cel):\n    fahrenheit = ((cel * 9) / 5) + 32\n    result = '{0:.2f}'.format(fahrenheit)\n    return (f\"'{result}'\")\n\n\nprint(Cel2Fah(28.0))\nprint(Cel2Fah(0.00))",
"'82.40'\n'32.00'\n"
],
[
"# Write a function to compute the BMI of a person.\n# BMI = weight(kg) / ( height(m)*height(m) )\n\n# Note: Return a string of 1 decimal place.\n# In - BMI(63, 1.7)\n# Out - '21.8'\n# In - BMI(110, 2)\n# Out - '27.5'\n\n\ndef BMI(weight,Height):\n cal=weight/(Height*Height)\n format='{0:.1f}'.format(cal)\n return (f\"'{format}'\")\n\nprint(BMI(63,1.7))\nprint(BMI(110,2))\n ",
"'21.8'\n'27.5'\n"
],
[
"# Write a function to compute the hypotenuse given sides a and b of the triangle.\n# Hint: You can use math.sqrt(x) to compute the square root of x.\n# In - hypotenuse(3, 4)\n# Out - 5\n# In - hypotenuse(5, 12)\n# Out - 13\n\nfrom math import sqrt\ndef hypo(a,b):\n return(int(sqrt(a**2 + b**2)))\n\nprint(hypo(3,4))\nprint(hypo(5,12))",
"5\n13\n"
],
[
"# Write a function getSumOfLastDigits() that takes in a list of positive numbers and returns the sum \n#of all the last digits in the list.\n# getSumOfLastDigits([2, 3, 4])\n# 9\n# getSumOfLastDigits([1, 23, 456])\n# 10\n\ndef getSumOfLastDigits(lst):\n lst_digit=[]\n for i in lst:\n lst_digit.append(i % 10)\n s=sum(lst_digit)\n return s\n \n \nprint(getSumOfLastDigits([2, 3, 4]))\nprint(getSumOfLastDigits([1, 23, 456]))\n ",
"9\n10\n"
],
[
"# Write a function that uses a default value.\n# In - introduce('Lim', 20)\n# Out - 'My name is Lim. I am 20 years old.'\n# In - introduce('Ahmad')\n# Out - 'My name is Ahmad. My age is secret.'\n\n\ndef introduce(name, age=None):\n    # None is a sentinel default: omitting age keeps it secret.\n    if age is None:\n        print(f\"My name is {name}. My age is secret.\")\n    else:\n        print(f\"My name is {name}. I am {age} years old.\")\n\nintroduce('Lim', 20)\nintroduce('Ahmad')",
"My name is Lim. I am 20 years old.\nMy name is Ahmad. My age is secret.\n"
],
[
"# Write a function isEquilateral(x, y, z) that accepts the 3 sides of a triangle as arguments. \n# The program should return True if it is an equilateral triangle.\n\n# In - isEquilateral(2, 4, 3)\n# Out - False\n# In - isEquilateral(3, 3, 3)\n# Out - True\n# In - isEquilateral(-3, -3, -3)\n# Out - False\n\ndef isEquilateral(x, y, z):\n    # Side lengths must be positive as well as equal.\n    return x == y == z and x > 0\n\nprint(isEquilateral(2, 4, 3))\nprint(isEquilateral(3, 3, 3))\nprint(isEquilateral(-3, -3, -3))",
"False\nTrue\nFalse\n"
],
[
"# For a quadratic equation in the form of ax2+bx+c, the discriminant, D is b2-4ac. Write a function to compute the discriminant, D.\n# In - quadratic(1, 2, 3)\n# Out - 'The discriminant is -8.'\n# In - quadratic(1, 3, 2)\n# Out - 'The discriminant is 1.'\n# In - quadratic(1, 4, 4)\n# Out - 'The discriminant is 0.'\n\ndef qa(a,b,c):\n d= (b*b)-(4*a*c)\n print(f\"The discriminant is {d}\")\n #return(d)\n\nqa(1, 2, 3)\nqa(1,3,2)\nqa(1,4,4)",
"The discriminant is -8\nThe discriminant is 1\nThe discriminant is 0\n"
],
[
"# Define a function called addFirstAndLast(x) that takes in a list of numbers and returns the sum of the first and last numbers.\n# In - addFirstAndLast([])\n# Out - 0\n# In - addFirstAndLast([2, 7, 3])\n# Out - 5\n# In - addFirstAndLast([10])\n# Out - 10\n\n\ndef addFirstAndLast(x):\n    if not x:            # empty list -> 0\n        return 0\n    if len(x) == 1:      # a single element counts only once\n        return x[0]\n    return x[0] + x[-1]\n\nprint(addFirstAndLast([]))\nprint(addFirstAndLast([2, 7, 3]))\nprint(addFirstAndLast([10]))\n",
"0\n5\n10\n"
],
[
"# Complete the 'lambda' expression so that it returns True if the argument is an even number, and False otherwise.\n\ndef even(num):\n    return (lambda n: n % 2 == 0)(num)\neven(6)",
"_____no_output_____"
],
[
"# getScore.__doc__\n# 'A function that computes and returns the final score.'\n\n",
"_____no_output_____"
],
[
"# In Python, it is possible to pass a function as a argument to another function. --yes\n# Write a function useFunction(func, num) that takes in a function and a number as arguments. \n# The useFunction should produce the output shown in the examples given below.\n\n# def addOne(x):\n# return x + 1\n# useFunction(addOne, 4)\n# 25\n# useFunction(addOne, 9)\n# 100\n# useFunction(addOne, 0)\n# 1\n\n\ndef useFunction(addOne,num):\n new=addOne(num)\n return new**2\n \ndef addOne(x):\n return x + 1\n\nuseFunction(addOne,4)\nuseFunction(addOne,9)\n",
"_____no_output_____"
]
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07fc1939dee8afed5bf8d3e846738a0ff5747f1 | 89,161 | ipynb | Jupyter Notebook | Wine test .ipynb | SachinHotkar/Wine-test-for-KNN | d457773551242e6457d9eaf1f3fe6d4394b47795 | [
"MIT"
] | null | null | null | Wine test .ipynb | SachinHotkar/Wine-test-for-KNN | d457773551242e6457d9eaf1f3fe6d4394b47795 | [
"MIT"
] | null | null | null | Wine test .ipynb | SachinHotkar/Wine-test-for-KNN | d457773551242e6457d9eaf1f3fe6d4394b47795 | [
"MIT"
] | null | null | null | 124.87535 | 17,564 | 0.860915 | [
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport seaborn as sns\nimport math\n%matplotlib inline",
"_____no_output_____"
],
[
"wine=pd.read_csv(\"C:/Users/sachi/Downloads/winequality-red.csv\")\nwine.shape",
"_____no_output_____"
],
[
"wine.head()",
"_____no_output_____"
],
[
"print(wine.groupby('quality').size())",
"quality\n3 10\n4 53\n5 681\n6 638\n7 199\n8 18\ndtype: int64\n"
],
[
"sns.countplot(x='quality', data=wine)",
"_____no_output_____"
],
[
"wine.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 1599 entries, 0 to 1598\nData columns (total 12 columns):\nfixed acidity 1599 non-null float64\nvolatile acidity 1599 non-null float64\ncitric acid 1599 non-null float64\nresidual sugar 1599 non-null float64\nchlorides 1599 non-null float64\nfree sulfur dioxide 1599 non-null float64\ntotal sulfur dioxide 1599 non-null float64\ndensity 1599 non-null float64\npH 1599 non-null float64\nsulphates 1599 non-null float64\nalcohol 1599 non-null float64\nquality 1599 non-null int64\ndtypes: float64(11), int64(1)\nmemory usage: 150.0 KB\n"
],
[
"sns.barplot(x='quality', y='fixed acidity' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"sns.barplot(x='quality', y='volatile acidity' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"sns.barplot(x='quality',y='citric acid',data=wine)",
"_____no_output_____"
],
[
"sns.barplot(x='quality', y='residual sugar' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"sns.barplot(x='quality', y='chlorides' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"sns.barplot(x='quality', y='free sulfur dioxide' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"sns.barplot(x='quality', y='alcohol' ,data=wine)",
"C:\\Users\\sachi\\Anaconda3\\lib\\site-packages\\scipy\\stats\\stats.py:1713: FutureWarning: Using a non-tuple sequence for multidimensional indexing is deprecated; use `arr[tuple(seq)]` instead of `arr[seq]`. In the future this will be interpreted as an array index, `arr[np.array(seq)]`, which will result either in an error or a different result.\n return np.add.reduce(sorted[indexer] * weights, axis=axis) / sumval\n"
],
[
"print(wine.groupby('quality').size())",
"quality\n3 10\n4 53\n5 681\n6 638\n7 199\n8 18\ndtype: int64\n"
],
[
"bins = (2, 6.5, 8)\ngroup_names = ['bad', 'good']\nwine['quality'] = pd.cut(wine['quality'], bins = bins, labels = group_names)",
"_____no_output_____"
],
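The `pd.cut` call above buckets the 1–10 quality scores into two labeled intervals before label-encoding them; a minimal standalone sketch of the same idea (toy scores here, not the wine data):

```python
import pandas as pd

# pd.cut assigns each value to the half-open interval it falls in:
# (2, 6.5] -> 'bad', (6.5, 8] -> 'good', matching the notebook's bins.
scores = pd.Series([3, 5, 6, 7, 8])
labels = pd.cut(scores, bins=(2, 6.5, 8), labels=['bad', 'good'])
print(list(labels))  # ['bad', 'bad', 'bad', 'good', 'good']
```

Values at or below 6.5 land in the first bin, which is why qualities 3–6 all become 'bad'.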
[
"from sklearn.preprocessing import StandardScaler, LabelEncoder",
"_____no_output_____"
],
[
"label_quality=LabelEncoder()\nwine['quality']=label_quality.fit_transform(wine['quality'])",
"_____no_output_____"
],
[
"print(wine.groupby('quality').size())",
"quality\n0 1382\n1 217\ndtype: int64\n"
],
[
"X=wine.drop('quality',axis=1)\ny=wine['quality']\nfrom sklearn.model_selection import train_test_split\nx_train,x_test,y_train,y_test=train_test_split(X,y,test_size=30,random_state=66)",
"_____no_output_____"
],
[
"from sklearn.neighbors import KNeighborsClassifier\ntraining_accuracy=[]\ntesting_accuracy=[]\nneighbors_setting= range(1,11)\nfor n_neighbors in neighbors_setting:\n knn=KNeighborsClassifier(n_neighbors=n_neighbors)\n knn.fit(x_train,y_train)\n training_accuracy.append(knn.score(x_train,y_train))\n testing_accuracy.append(knn.score(x_test,y_test))\n\nplt.plot(neighbors_setting,training_accuracy,label='training')\nplt.plot(neighbors_setting,testing_accuracy,label='testing')\nplt.legend()\nplt.xlabel('neighbors_setting')\nplt.ylabel('accuracy')\nplt.show()\n",
"_____no_output_____"
],
[
"knn=KNeighborsClassifier(n_neighbors=3)\nknn.fit(x_train,y_train)\nprint('Accuracy of K-nn classifier on training set:{:}'.format(knn.score(x_train,y_train)))\nprint('Accuracy of K-nn classifier on test set:{:.2f}'.format(knn.score(x_test,y_test)))",
"Accuracy of K-nn classifier on training set:0.9279796048438496\nAccuracy of K-nn classifier on test set:0.93\n"
]
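The manual loop over `n_neighbors` above can also be done with scikit-learn's cross-validated grid search; a sketch under stated assumptions (the estimator and parameter range mirror the notebook, but the dataset here is synthetic, not the wine data):

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in with the same number of features as the wine data.
X, y = make_classification(n_samples=300, n_features=11, random_state=0)

# Evaluate k = 1..10 with 5-fold cross-validation instead of a single split.
search = GridSearchCV(KNeighborsClassifier(), {'n_neighbors': range(1, 11)}, cv=5)
search.fit(X, y)
print(search.best_params_)
```

Cross-validation gives a less noisy estimate of each k than the single train/test split used in the notebook's accuracy plot.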
]
] | [
"code"
] | [
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
d07fc3bcf5318a52a5cc0e03b6fb61be2c38ec27 | 123,183 | ipynb | Jupyter Notebook | image-classification/dlnd_image_classification.ipynb | mdiaz236/DeepLearningFoundations | 82cd92417c292589ac68c796b221a64a2cbe32e4 | [
"MIT"
] | null | null | null | image-classification/dlnd_image_classification.ipynb | mdiaz236/DeepLearningFoundations | 82cd92417c292589ac68c796b221a64a2cbe32e4 | [
"MIT"
] | null | null | null | image-classification/dlnd_image_classification.ipynb | mdiaz236/DeepLearningFoundations | 82cd92417c292589ac68c796b221a64a2cbe32e4 | [
"MIT"
] | null | null | null | 93.604103 | 61,056 | 0.802018 | [
[
[
"# Image Classification\nIn this project, you'll classify images from the [CIFAR-10 dataset](https://www.cs.toronto.edu/~kriz/cifar.html). The dataset consists of airplanes, dogs, cats, and other objects. You'll preprocess the images, then train a convolutional neural network on all the samples. The images need to be normalized and the labels need to be one-hot encoded. You'll get to apply what you learned and build a convolutional, max pooling, dropout, and fully connected layers. At the end, you'll get to see your neural network's predictions on the sample images.\n## Get the Data\nRun the following cell to download the [CIFAR-10 dataset for python](https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz).",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\nfrom urllib.request import urlretrieve\nfrom os.path import isfile, isdir\nfrom tqdm import tqdm\nimport problem_unittests as tests\nimport tarfile\n\ncifar10_dataset_folder_path = 'cifar-10-batches-py'\n\n# Use Floyd's cifar-10 dataset if present\nfloyd_cifar10_location = '/input/cifar-10/python.tar.gz'\nif isfile(floyd_cifar10_location):\n tar_gz_path = floyd_cifar10_location\nelse:\n tar_gz_path = 'cifar-10-python.tar.gz'\n\nclass DLProgress(tqdm):\n last_block = 0\n\n def hook(self, block_num=1, block_size=1, total_size=None):\n self.total = total_size\n self.update((block_num - self.last_block) * block_size)\n self.last_block = block_num\n\nif not isfile(tar_gz_path):\n with DLProgress(unit='B', unit_scale=True, miniters=1, desc='CIFAR-10 Dataset') as pbar:\n urlretrieve(\n 'https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz',\n tar_gz_path,\n pbar.hook)\n\nif not isdir(cifar10_dataset_folder_path):\n with tarfile.open(tar_gz_path) as tar:\n tar.extractall()\n tar.close()\n\n\ntests.test_folder_path(cifar10_dataset_folder_path)",
"CIFAR-10 Dataset: 171MB [01:58, 1.44MB/s] \n"
]
],
[
[
"## Explore the Data\nThe dataset is broken into batches to prevent your machine from running out of memory. The CIFAR-10 dataset consists of 5 batches, named `data_batch_1`, `data_batch_2`, etc.. Each batch contains the labels and images that are one of the following:\n* airplane\n* automobile\n* bird\n* cat\n* deer\n* dog\n* frog\n* horse\n* ship\n* truck\n\nUnderstanding a dataset is part of making predictions on the data. Play around with the code cell below by changing the `batch_id` and `sample_id`. The `batch_id` is the id for a batch (1-5). The `sample_id` is the id for a image and label pair in the batch.\n\nAsk yourself \"What are all possible labels?\", \"What is the range of values for the image data?\", \"Are the labels in order or random?\". Answers to questions like these will help you preprocess the data and end up with better predictions.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n%config InlineBackend.figure_format = 'retina'\n\nimport helper\nimport numpy as np\n\n# Explore the dataset\nbatch_id = 1\nsample_id = 5\nhelper.display_stats(cifar10_dataset_folder_path, batch_id, sample_id)",
"\nStats of batch 1:\nSamples: 10000\nLabel Counts: {0: 1005, 1: 974, 2: 1032, 3: 1016, 4: 999, 5: 937, 6: 1030, 7: 1001, 8: 1025, 9: 981}\nFirst 20 Labels: [6, 9, 9, 4, 1, 1, 2, 7, 8, 3, 4, 7, 7, 2, 9, 9, 9, 3, 2, 6]\n\nExample of Image 5:\nImage - Min Value: 0 Max Value: 252\nImage - Shape: (32, 32, 3)\nLabel - Label Id: 1 Name: automobile\n"
]
],
[
[
"## Implement Preprocess Functions\n### Normalize\nIn the cell below, implement the `normalize` function to take in image data, `x`, and return it as a normalized Numpy array. The values should be in the range of 0 to 1, inclusive. The return object should be the same shape as `x`.",
"_____no_output_____"
]
],
[
[
"def normalize(x):\n \"\"\"\n Normalize a list of sample image data in the range of 0 to 1\n : x: List of image data. The image shape is (32, 32, 3)\n : return: Numpy array of normalize data\n \"\"\"\n min = np.amin(x)\n max = np.amax(x)\n return (x - min) / (max - min)\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_normalize(normalize)",
"Tests Passed\n"
]
],
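Min–max scaling as implemented above maps the batch minimum to 0 and the batch maximum to 1; a quick numeric check of that behavior (toy array, not CIFAR data):

```python
import numpy as np

def normalize(x):
    # Same min-max scaling as the notebook's normalize().
    mn, mx = np.amin(x), np.amax(x)
    return (x - mn) / (mx - mn)

a = np.array([0, 51, 102, 255], dtype=float)
print(normalize(a))  # [0.  0.2 0.4 1. ]
```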
[
[
"### One-hot encode\nJust like the previous code cell, you'll be implementing a function for preprocessing. This time, you'll implement the `one_hot_encode` function. The input, `x`, is a list of labels. Implement the function to return the list of labels as a one-hot encoded NumPy array. The possible values for labels are 0 to 9. The one-hot encoding function should return the same encoding for each value between each call to `one_hot_encode`. Make sure to save the map of encodings outside the function.\n\nHint: Don't reinvent the wheel.",
"_____no_output_____"
]
],
[
[
"x = np.array([6, 1, 5])\nnp.array([[1 if i == y else 0 for i in range(10)] for y in x])",
"_____no_output_____"
],
[
"def one_hot_encode(x):\n \"\"\"\n One hot encode a list of sample labels. Return a one-hot encoded vector for each label.\n : x: List of sample Labels\n : return: Numpy array of one-hot encoded labels\n \"\"\"\n return np.array([[1 if i == y else 0 for i in range(10)] for y in x])\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_one_hot_encode(one_hot_encode)",
"Tests Passed\n"
]
],
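The list-comprehension encoder above can also be written by indexing rows of an identity matrix with `np.eye`, which is a common shortcut; a small sketch (same 10-class assumption as the project):

```python
import numpy as np

def one_hot_encode(labels, n_classes=10):
    # Row i of the identity matrix is exactly the one-hot vector for label i.
    return np.eye(n_classes)[labels]

print(one_hot_encode([6, 1, 5]))
```

Because the encoding is a pure function of the label value, it is automatically identical across calls, which is what the unit test checks.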
[
[
"### Randomize Data\nAs you saw from exploring the data above, the order of the samples are randomized. It doesn't hurt to randomize it again, but you don't need to for this dataset.",
"_____no_output_____"
],
[
"## Preprocess all the data and save it\nRunning the code cell below will preprocess all the CIFAR-10 data and save it to file. The code below also uses 10% of the training data for validation.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL\n\"\"\"\n# Preprocess Training, Validation, and Testing Data\nhelper.preprocess_and_save_data(cifar10_dataset_folder_path, normalize, one_hot_encode)",
"_____no_output_____"
]
],
[
[
"# Check Point\nThis is your first checkpoint. If you ever decide to come back to this notebook or have to restart the notebook, you can start from here. The preprocessed data has been saved to disk.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL\n\"\"\"\nimport pickle\nimport problem_unittests as tests\nimport helper\n\n# Load the Preprocessed Validation data\nvalid_features, valid_labels = pickle.load(open('preprocess_validation.p', mode='rb'))",
"_____no_output_____"
]
],
[
[
"## Build the network\nFor the neural network, you'll build each layer into a function. Most of the code you've seen has been outside of functions. To test your code more thoroughly, we require that you put each layer in a function. This allows us to give you better feedback and test for simple mistakes using our unittests before you submit your project.\n\n>**Note:** If you're finding it hard to dedicate enough time for this course each week, we've provided a small shortcut to this part of the project. In the next couple of problems, you'll have the option to use classes from the [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) packages to build each layer, except the layers you build in the \"Convolutional and Max Pooling Layer\" section. TF Layers is similar to Keras's and TFLearn's abstraction to layers, so it's easy to pickup.\n\n>However, if you would like to get the most out of this course, try to solve all the problems _without_ using anything from the TF Layers packages. You **can** still use classes from other packages that happen to have the same name as ones you find in TF Layers! For example, instead of using the TF Layers version of the `conv2d` class, [tf.layers.conv2d](https://www.tensorflow.org/api_docs/python/tf/layers/conv2d), you would want to use the TF Neural Network version of `conv2d`, [tf.nn.conv2d](https://www.tensorflow.org/api_docs/python/tf/nn/conv2d). \n\nLet's begin!\n\n### Input\nThe neural network needs to read the image data, one-hot encoded labels, and dropout keep probability. 
Implement the following functions\n* Implement `neural_net_image_input`\n * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n * Set the shape using `image_shape` with batch size set to `None`.\n * Name the TensorFlow placeholder \"x\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n* Implement `neural_net_label_input`\n * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder)\n * Set the shape using `n_classes` with batch size set to `None`.\n * Name the TensorFlow placeholder \"y\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n* Implement `neural_net_keep_prob_input`\n * Return a [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder) for dropout keep probability.\n * Name the TensorFlow placeholder \"keep_prob\" using the TensorFlow `name` parameter in the [TF Placeholder](https://www.tensorflow.org/api_docs/python/tf/placeholder).\n\nThese names will be used at the end of the project to load your saved model.\n\nNote: `None` for shapes in TensorFlow allow for a dynamic size.",
"_____no_output_____"
]
],
[
[
"import tensorflow as tf\n\ndef neural_net_image_input(image_shape):\n \"\"\"\n Return a Tensor for a batch of image input\n : image_shape: Shape of the images\n : return: Tensor for image input.\n \"\"\"\n return tf.placeholder(tf.float32, shape = [None] + list(image_shape), name = \"x\")\n\n\ndef neural_net_label_input(n_classes):\n \"\"\"\n Return a Tensor for a batch of label input\n : n_classes: Number of classes\n : return: Tensor for label input.\n \"\"\"\n return tf.placeholder(tf.float32, shape = (None, n_classes), name = \"y\")\n\ndef neural_net_keep_prob_input():\n \"\"\"\n Return a Tensor for keep probability\n : return: Tensor for keep probability.\n \"\"\"\n # TODO: Implement Function\n return tf.placeholder(tf.float32, shape = None, name = \"keep_prob\")\n\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntf.reset_default_graph()\ntests.test_nn_image_inputs(neural_net_image_input)\ntests.test_nn_label_inputs(neural_net_label_input)\ntests.test_nn_keep_prob_inputs(neural_net_keep_prob_input)",
"Image Input Tests Passed.\nLabel Input Tests Passed.\nKeep Prob Tests Passed.\n"
]
],
[
[
"### Convolution and Max Pooling Layer\nConvolution layers have a lot of success with images. For this code cell, you should implement the function `conv2d_maxpool` to apply convolution then max pooling:\n* Create the weight and bias using `conv_ksize`, `conv_num_outputs` and the shape of `x_tensor`.\n* Apply a convolution to `x_tensor` using weight and `conv_strides`.\n * We recommend you use same padding, but you're welcome to use any padding.\n* Add bias\n* Add a nonlinear activation to the convolution.\n* Apply Max Pooling using `pool_ksize` and `pool_strides`.\n * We recommend you use same padding, but you're welcome to use any padding.\n\n**Note:** You **can't** use [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) for **this** layer, but you can still use TensorFlow's [Neural Network](https://www.tensorflow.org/api_docs/python/tf/nn) package. You may still use the shortcut option for all the **other** layers.",
"_____no_output_____"
]
],
[
[
"def conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides):\n    \"\"\"\n    Apply convolution then max pooling to x_tensor\n    :param x_tensor: TensorFlow Tensor\n    :param conv_num_outputs: Number of outputs for the convolutional layer\n    :param conv_ksize: kernel size 2-D Tuple for the convolutional layer\n    :param conv_strides: Stride 2-D Tuple for convolution\n    :param pool_ksize: kernel size 2-D Tuple for pool\n    :param pool_strides: Stride 2-D Tuple for pool\n    : return: A tensor that represents convolution and max pooling of x_tensor\n    \"\"\"\n    if isinstance(conv_ksize, int):\n        conv_ksize = (conv_ksize, conv_ksize)\n    depth = x_tensor.get_shape().as_list()[-1]\n    # The conv kernel shape is (k_h, k_w, in_depth, out_depth) built from\n    # conv_ksize -- not the full image shape.\n    W = tf.Variable(tf.truncated_normal(\n        [conv_ksize[0], conv_ksize[1], depth, conv_num_outputs], stddev=0.05))\n    b = tf.Variable(tf.zeros(conv_num_outputs))\n    x = tf.nn.conv2d(x_tensor, W, strides=(1, conv_strides[0], conv_strides[1], 1),\n                     padding=\"SAME\")\n    x = tf.nn.bias_add(x, b)\n    x = tf.nn.relu(x)\n    x = tf.nn.max_pool(x, ksize=(1, pool_ksize[0], pool_ksize[1], 1),\n                       strides=(1, pool_strides[0], pool_strides[1], 1),\n                       padding=\"SAME\")\n    return x\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_con_pool(conv2d_maxpool)",
"Tests Passed\n"
]
],
[
[
"### Flatten Layer\nImplement the `flatten` function to change the dimension of `x_tensor` from a 4-D tensor to a 2-D tensor. The output should be the shape (*Batch Size*, *Flattened Image Size*). Shortcut option: you can use classes from the [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) packages for this layer. For more of a challenge, only use other TensorFlow packages.",
"_____no_output_____"
]
],
[
[
"import numpy as np\ndef flatten(x_tensor):\n \"\"\"\n Flatten x_tensor to (Batch Size, Flattened Image Size)\n : x_tensor: A tensor of size (Batch Size, ...), where ... are the image dimensions.\n : return: A tensor of size (Batch Size, Flattened Image Size).\n \"\"\"\n x_size = x_tensor.get_shape().as_list()\n return tf.reshape(x_tensor, shape=(-1, np.prod(x_size[1:])))\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_flatten(flatten)",
"Tests Passed\n"
]
],
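The `-1` in `tf.reshape` lets the batch dimension stay dynamic while the remaining dimensions are collapsed to their product; the same trick in plain NumPy (toy shape, not the real conv output):

```python
import numpy as np

batch = np.zeros((5, 4, 4, 3))  # (batch, height, width, channels)
# -1 infers the batch size; np.prod collapses the image dims to one axis.
flat = batch.reshape(-1, np.prod(batch.shape[1:]))
print(flat.shape)  # (5, 48)
```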
[
[
"### Fully-Connected Layer\nImplement the `fully_conn` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). Shortcut option: you can use classes from the [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) packages for this layer. For more of a challenge, only use other TensorFlow packages.",
"_____no_output_____"
]
],
[
[
"def fully_conn(x_tensor, num_outputs):\n \"\"\"\n Apply a fully connected layer to x_tensor using weight and bias\n : x_tensor: A 2-D tensor where the first dimension is batch size.\n : num_outputs: The number of output that the new tensor should be.\n : return: A 2-D tensor where the second dimension is num_outputs.\n \"\"\"\n x_size = x_tensor.get_shape().as_list()[1:]\n W = tf.Variable(tf.truncated_normal(x_size + [num_outputs], stddev=.05))\n b = tf.Variable(tf.zeros(num_outputs))\n x = tf.add(tf.matmul(x_tensor, W), b)\n x = tf.nn.relu(x)\n return x\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_fully_conn(fully_conn)",
"Tests Passed\n"
]
],
[
[
"### Output Layer\nImplement the `output` function to apply a fully connected layer to `x_tensor` with the shape (*Batch Size*, *num_outputs*). Shortcut option: you can use classes from the [TensorFlow Layers](https://www.tensorflow.org/api_docs/python/tf/layers) or [TensorFlow Layers (contrib)](https://www.tensorflow.org/api_guides/python/contrib.layers) packages for this layer. For more of a challenge, only use other TensorFlow packages.\n\n**Note:** Activation, softmax, or cross entropy should **not** be applied to this.",
"_____no_output_____"
]
],
[
[
"def output(x_tensor, num_outputs):\n \"\"\"\n Apply a output layer to x_tensor using weight and bias\n : x_tensor: A 2-D tensor where the first dimension is batch size.\n : num_outputs: The number of output that the new tensor should be.\n : return: A 2-D tensor where the second dimension is num_outputs.\n \"\"\"\n x_size = x_tensor.get_shape().as_list()[1:]\n W = tf.Variable(tf.truncated_normal(x_size + [num_outputs], stddev=.05))\n b = tf.Variable(tf.zeros(num_outputs))\n x = tf.add(tf.matmul(x_tensor, W), b)\n\n return x\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_output(output)",
"Tests Passed\n"
]
],
[
[
"### Create Convolutional Model\nImplement the function `conv_net` to create a convolutional neural network model. The function takes in a batch of images, `x`, and outputs logits. Use the layers you created above to create this model:\n\n* Apply 1, 2, or 3 Convolution and Max Pool layers\n* Apply a Flatten Layer\n* Apply 1, 2, or 3 Fully Connected Layers\n* Apply an Output Layer\n* Return the output\n* Apply [TensorFlow's Dropout](https://www.tensorflow.org/api_docs/python/tf/nn/dropout) to one or more layers in the model using `keep_prob`. ",
"_____no_output_____"
]
],
[
[
"def conv_net(x, keep_prob):\n \"\"\"\n Create a convolutional neural network model\n : x: Placeholder tensor that holds image data.\n : keep_prob: Placeholder tensor that hold dropout keep probability.\n : return: Tensor that represents logits\n \"\"\"\n # TODO: Apply 1, 2, or 3 Convolution and Max Pool layers\n # Play around with different number of outputs, kernel size and stride\n # Function Definition from Above:\n # conv2d_maxpool(x_tensor, conv_num_outputs, conv_ksize, conv_strides, pool_ksize, pool_strides)\n x = conv2d_maxpool(x, conv_num_outputs=64, conv_ksize=2,\n conv_strides=(1, 1), pool_ksize=(2, 2), pool_strides=(2, 2))\n x = tf.layers.dropout(x, keep_prob)\n\n# x = conv2d_maxpool(x, conv_num_outputs=128, conv_ksize=3,\n# conv_strides=(1, 1), pool_ksize=(2, 2), pool_strides=(2, 2))\n \n# x = conv2d_maxpool(x, conv_num_outputs=256, conv_ksize=3,\n# conv_strides=(1, 1), pool_ksize=(2, 2), pool_strides=(2, 2))\n \n # TODO: Apply a Flatten Layer\n # Function Definition from Above:\n # flatten(x_tensor)\n x = flatten(x)\n\n # TODO: Apply 1, 2, or 3 Fully Connected Layers\n # Play around with different number of outputs\n # Function Definition from Above:\n # fully_conn(x_tensor, num_outputs)\n \n x = fully_conn(x, 512)\n x = tf.layers.dropout(x, keep_prob)\n\n# x = fully_conn(x, 128)\n# x = tf.layers.dropout(x, keep_prob)\n \n# x = fully_conn(x, 64)\n# x = tf.layers.dropout(x, keep_prob)\n \n # TODO: Apply an Output Layer\n # Set this to the number of classes\n # Function Definition from Above:\n # output(x_tensor, num_outputs)\n x = output(x, 10)\n \n # TODO: return output\n return x\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\n\n##############################\n## Build the Neural Network ##\n##############################\n\n# Remove previous weights, bias, inputs, etc..\ntf.reset_default_graph()\n\n# Inputs\nx = neural_net_image_input((32, 32, 3))\ny = neural_net_label_input(10)\nkeep_prob = 
neural_net_keep_prob_input()\n\n# Model\nlogits = conv_net(x, keep_prob)\n\n# Name logits Tensor, so that it can be loaded from disk after training\nlogits = tf.identity(logits, name='logits')\n\n# Loss and Optimizer\ncost = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y))\noptimizer = tf.train.AdamOptimizer().minimize(cost)\n\n# Accuracy\ncorrect_pred = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))\naccuracy = tf.reduce_mean(tf.cast(correct_pred, tf.float32), name='accuracy')\n\ntests.test_conv_net(conv_net)",
"Neural Network Built!\n"
]
],
[
[
"## Train the Neural Network\n### Single Optimization\nImplement the function `train_neural_network` to do a single optimization. The optimization should use `optimizer` to optimize in `session` with a `feed_dict` of the following:\n* `x` for image input\n* `y` for labels\n* `keep_prob` for keep probability for dropout\n\nThis function will be called for each batch, so `tf.global_variables_initializer()` has already been called.\n\nNote: Nothing needs to be returned. This function is only optimizing the neural network.",
"_____no_output_____"
]
],
[
[
"def train_neural_network(session, optimizer, keep_probability, feature_batch, label_batch):\n \"\"\"\n Optimize the session on a batch of images and labels\n : session: Current TensorFlow session\n : optimizer: TensorFlow optimizer function\n : keep_probability: keep probability\n : feature_batch: Batch of Numpy image data\n : label_batch: Batch of Numpy label data\n \"\"\"\n session.run(optimizer, feed_dict={keep_prob: keep_probability, x: feature_batch, \n y: label_batch})\n pass\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\ntests.test_train_nn(train_neural_network)",
"Tests Passed\n"
]
],
[
[
"### Show Stats\nImplement the function `print_stats` to print loss and validation accuracy. Use the global variables `valid_features` and `valid_labels` to calculate validation accuracy. Use a keep probability of `1.0` to calculate the loss and validation accuracy.",
"_____no_output_____"
]
],
[
[
"def print_stats(session, feature_batch, label_batch, cost, accuracy):\n \"\"\"\n Print information about loss and validation accuracy\n : session: Current TensorFlow session\n : feature_batch: Batch of Numpy image data\n : label_batch: Batch of Numpy label data\n : cost: TensorFlow cost function\n : accuracy: TensorFlow accuracy function\n \"\"\"\n# print(feature_batch)\n loss, acc = session.run([cost, accuracy], \n feed_dict={x: feature_batch, y: label_batch, \n keep_prob: 1.})\n print(\"Training Loss= \" + \\\n \"{:.6f}\".format(loss) + \", Training Accuracy= \" + \\\n \"{:.5f}\".format(acc))\n \n valid_loss, valid_acc = session.run([cost, accuracy], \n feed_dict={x: valid_features, y: valid_labels, \n keep_prob: 1.})\n print(\"Validation Loss= \" + \\\n \"{:.6f}\".format(valid_loss) + \", Validation Accuracy= \" + \\\n \"{:.5f}\".format(valid_acc)) \n \n \n# batch_cost = session.run(cost, feed_dict={keep_probability: 1,\n# x: feature_batch, y: label_batch})\n# batch_accuracy = session.run(accuracy, feed_dict={keep_probability: 1,\n# x: feature_batch, y: label_batch})\n \n# valid_cost = session.run(cost, feed_dict={keep_probability: 1,\n# x: valid_features, y: valid_labels})\n# valid_accuracy = session.run(accuracy, feed_dict={keep_probability: 1,\n# x: valid_features, y: valid_labels})\n \n# print('Training Cost: {}'.format(batch_cost))\n# print('Training Accuracy: {}'.format(batch_accuracy))\n# print('Validation Cost: {}'.format(valid_cost))\n# print('Validation Accuracy: {}'.format(valid_accuracy))\n# print('Accuracy: {}'.format(accuracy))\n pass",
"_____no_output_____"
]
],
[
[
"### Hyperparameters\nTune the following parameters:\n* Set `epochs` to the number of iterations until the network stops learning or starts overfitting\n* Set `batch_size` to the highest number that your machine has memory for. Most people set them to common sizes of memory:\n * 64\n * 128\n * 256\n * ...\n* Set `keep_probability` to the probability of keeping a node using dropout",
"_____no_output_____"
]
],
[
[
"# TODO: Tune Parameters\nepochs = 10\nbatch_size = 256\nkeep_probability = .5",
"_____no_output_____"
]
],
[
[
"### Train on a Single CIFAR-10 Batch\nInstead of training the neural network on all the CIFAR-10 batches of data, let's use a single batch. This should save time while you iterate on the model to get a better accuracy. Once the final validation accuracy is 50% or greater, run the model on all the data in the next section.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL\n\"\"\"\nprint('Checking the Training on a Single Batch...')\nwith tf.Session() as sess:\n # Initializing the variables\n sess.run(tf.global_variables_initializer())\n \n # Training cycle\n for epoch in range(epochs):\n batch_i = 1\n for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n print_stats(sess, batch_features, batch_labels, cost, accuracy)",
"Checking the Training on a Single Batch...\nEpoch 1, CIFAR-10 Batch 1: Training Loss= 2.303360, Training Accuracy= 0.07500\nValidation Loss= 2.302445, Validation Accuracy= 0.10020\nEpoch 2, CIFAR-10 Batch 1: Training Loss= 2.300095, Training Accuracy= 0.07500\nValidation Loss= 2.300672, Validation Accuracy= 0.10680\nEpoch 3, CIFAR-10 Batch 1: Training Loss= 2.133212, Training Accuracy= 0.20000\nValidation Loss= 2.013359, Validation Accuracy= 0.25460\nEpoch 4, CIFAR-10 Batch 1: Training Loss= 1.994526, Training Accuracy= 0.42500\nValidation Loss= 1.930314, Validation Accuracy= 0.30360\nEpoch 5, CIFAR-10 Batch 1: Training Loss= 1.805106, Training Accuracy= 0.47500\nValidation Loss= 1.899843, Validation Accuracy= 0.31120\nEpoch 6, CIFAR-10 Batch 1: Training Loss= 1.622841, Training Accuracy= 0.50000\nValidation Loss= 1.794162, Validation Accuracy= 0.35400\nEpoch 7, CIFAR-10 Batch 1: Training Loss= 1.484576, Training Accuracy= 0.55000\nValidation Loss= 1.768973, Validation Accuracy= 0.36840\nEpoch 8, CIFAR-10 Batch 1: Training Loss= 1.343377, Training Accuracy= 0.55000\nValidation Loss= 1.755146, Validation Accuracy= 0.37300\nEpoch 9, CIFAR-10 Batch 1: Training Loss= 1.148018, Training Accuracy= 0.62500\nValidation Loss= 1.719376, Validation Accuracy= 0.38340\nEpoch 10, CIFAR-10 Batch 1: Training Loss= 0.982874, Training Accuracy= 0.67500\nValidation Loss= 1.727221, Validation Accuracy= 0.38440\n"
]
],
[
[
"### Fully Train the Model\nNow that you got a good accuracy with a single CIFAR-10 batch, try it with all five batches.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL\n\"\"\"\nsave_model_path = './image_classification'\n\nprint('Training...')\nwith tf.Session() as sess:\n # Initializing the variables\n sess.run(tf.global_variables_initializer())\n \n # Training cycle\n for epoch in range(epochs):\n # Loop over all batches\n n_batches = 5\n for batch_i in range(1, n_batches + 1):\n for batch_features, batch_labels in helper.load_preprocess_training_batch(batch_i, batch_size):\n train_neural_network(sess, optimizer, keep_probability, batch_features, batch_labels)\n print('Epoch {:>2}, CIFAR-10 Batch {}: '.format(epoch + 1, batch_i), end='')\n print_stats(sess, batch_features, batch_labels, cost, accuracy)\n \n # Save Model\n saver = tf.train.Saver()\n save_path = saver.save(sess, save_model_path)",
"Training...\nEpoch 1, CIFAR-10 Batch 1: Training Loss= 2.267260, Training Accuracy= 0.10000\nValidation Loss= 2.247928, Validation Accuracy= 0.15120\nEpoch 1, CIFAR-10 Batch 2: Training Loss= 1.995992, Training Accuracy= 0.17500\nValidation Loss= 2.067119, Validation Accuracy= 0.22640\nEpoch 1, CIFAR-10 Batch 3: Training Loss= 1.601563, Training Accuracy= 0.42500\nValidation Loss= 1.950456, Validation Accuracy= 0.29940\nEpoch 1, CIFAR-10 Batch 4: Training Loss= 1.685398, Training Accuracy= 0.37500\nValidation Loss= 1.832309, Validation Accuracy= 0.33260\nEpoch 1, CIFAR-10 Batch 5: Training Loss= 1.742794, Training Accuracy= 0.42500\nValidation Loss= 1.772525, Validation Accuracy= 0.36480\nEpoch 2, CIFAR-10 Batch 1: Training Loss= 1.930166, Training Accuracy= 0.42500\nValidation Loss= 1.730577, Validation Accuracy= 0.38260\nEpoch 2, CIFAR-10 Batch 2: Training Loss= 1.677821, Training Accuracy= 0.52500\nValidation Loss= 1.696393, Validation Accuracy= 0.38680\nEpoch 2, CIFAR-10 Batch 3: Training Loss= 1.347411, Training Accuracy= 0.52500\nValidation Loss= 1.672744, Validation Accuracy= 0.39920\nEpoch 2, CIFAR-10 Batch 4: Training Loss= 1.295316, Training Accuracy= 0.62500\nValidation Loss= 1.630986, Validation Accuracy= 0.41060\nEpoch 2, CIFAR-10 Batch 5: Training Loss= 1.535737, Training Accuracy= 0.47500\nValidation Loss= 1.588744, Validation Accuracy= 0.42880\nEpoch 3, CIFAR-10 Batch 1: Training Loss= 1.647690, Training Accuracy= 0.47500\nValidation Loss= 1.609808, Validation Accuracy= 0.42340\nEpoch 3, CIFAR-10 Batch 2: Training Loss= 1.431329, Training Accuracy= 0.52500\nValidation Loss= 1.584422, Validation Accuracy= 0.43340\nEpoch 3, CIFAR-10 Batch 3: Training Loss= 1.112353, Training Accuracy= 0.62500\nValidation Loss= 1.557316, Validation Accuracy= 0.43240\nEpoch 3, CIFAR-10 Batch 4: Training Loss= 1.119438, Training Accuracy= 0.65000\nValidation Loss= 1.552788, Validation Accuracy= 0.44160\nEpoch 3, CIFAR-10 Batch 5: Training Loss= 1.367179, Training 
Accuracy= 0.50000\nValidation Loss= 1.503110, Validation Accuracy= 0.46300\nEpoch 4, CIFAR-10 Batch 1: Training Loss= 1.455486, Training Accuracy= 0.57500\nValidation Loss= 1.554065, Validation Accuracy= 0.44300\nEpoch 4, CIFAR-10 Batch 2: Training Loss= 1.224739, Training Accuracy= 0.52500\nValidation Loss= 1.517005, Validation Accuracy= 0.46140\nEpoch 4, CIFAR-10 Batch 3: Training Loss= 0.871683, Training Accuracy= 0.82500\nValidation Loss= 1.502632, Validation Accuracy= 0.45320\nEpoch 4, CIFAR-10 Batch 4: Training Loss= 0.915391, Training Accuracy= 0.72500\nValidation Loss= 1.488573, Validation Accuracy= 0.46680\nEpoch 4, CIFAR-10 Batch 5: Training Loss= 1.179544, Training Accuracy= 0.57500\nValidation Loss= 1.464614, Validation Accuracy= 0.47320\nEpoch 5, CIFAR-10 Batch 1: Training Loss= 1.337982, Training Accuracy= 0.60000\nValidation Loss= 1.505418, Validation Accuracy= 0.46500\nEpoch 5, CIFAR-10 Batch 2: Training Loss= 1.041483, Training Accuracy= 0.60000\nValidation Loss= 1.465458, Validation Accuracy= 0.47860\nEpoch 5, CIFAR-10 Batch 3: Training Loss= 0.739452, Training Accuracy= 0.80000\nValidation Loss= 1.513790, Validation Accuracy= 0.44840\nEpoch 5, CIFAR-10 Batch 4: Training Loss= 0.770327, Training Accuracy= 0.72500\nValidation Loss= 1.493302, Validation Accuracy= 0.46580\nEpoch 5, CIFAR-10 Batch 5: Training Loss= 1.037692, Training Accuracy= 0.67500\nValidation Loss= 1.450228, Validation Accuracy= 0.48040\nEpoch 6, CIFAR-10 Batch 1: Training Loss= 1.211084, Training Accuracy= 0.70000\nValidation Loss= 1.447853, Validation Accuracy= 0.47460\nEpoch 6, CIFAR-10 Batch 2: Training Loss= 0.880263, Training Accuracy= 0.72500\nValidation Loss= 1.458044, Validation Accuracy= 0.47740\nEpoch 6, CIFAR-10 Batch 3: Training Loss= 0.578820, Training Accuracy= 0.87500\nValidation Loss= 1.495665, Validation Accuracy= 0.45800\nEpoch 6, CIFAR-10 Batch 4: Training Loss= 0.660608, Training Accuracy= 0.80000\nValidation Loss= 1.465039, Validation Accuracy= 0.46980\nEpoch 
6, CIFAR-10 Batch 5: Training Loss= 0.921411, Training Accuracy= 0.72500\nValidation Loss= 1.453837, Validation Accuracy= 0.47860\nEpoch 7, CIFAR-10 Batch 1: Training Loss= 1.094060, Training Accuracy= 0.70000\nValidation Loss= 1.461889, Validation Accuracy= 0.48060\nEpoch 7, CIFAR-10 Batch 2: Training Loss= 0.741231, Training Accuracy= 0.82500\nValidation Loss= 1.472825, Validation Accuracy= 0.47680\nEpoch 7, CIFAR-10 Batch 3: Training Loss= 0.513557, Training Accuracy= 0.95000\nValidation Loss= 1.468261, Validation Accuracy= 0.47640\nEpoch 7, CIFAR-10 Batch 4: Training Loss= 0.579966, Training Accuracy= 0.82500\nValidation Loss= 1.440210, Validation Accuracy= 0.48640\nEpoch 7, CIFAR-10 Batch 5: Training Loss= 0.747553, Training Accuracy= 0.82500\nValidation Loss= 1.499854, Validation Accuracy= 0.46560\nEpoch 8, CIFAR-10 Batch 1: Training Loss= 0.951779, Training Accuracy= 0.70000\nValidation Loss= 1.496742, Validation Accuracy= 0.48260\nEpoch 8, CIFAR-10 Batch 2: Training Loss= 0.597912, Training Accuracy= 0.87500\nValidation Loss= 1.450022, Validation Accuracy= 0.49020\nEpoch 8, CIFAR-10 Batch 3: Training Loss= 0.429645, Training Accuracy= 1.00000\nValidation Loss= 1.456745, Validation Accuracy= 0.49120\nEpoch 8, CIFAR-10 Batch 4: Training Loss= 0.523796, Training Accuracy= 0.85000\nValidation Loss= 1.464786, Validation Accuracy= 0.48740\nEpoch 8, CIFAR-10 Batch 5: Training Loss= 0.572838, Training Accuracy= 0.85000\nValidation Loss= 1.513322, Validation Accuracy= 0.47360\nEpoch 9, CIFAR-10 Batch 1: Training Loss= 0.810197, Training Accuracy= 0.80000\nValidation Loss= 1.480253, Validation Accuracy= 0.50260\nEpoch 9, CIFAR-10 Batch 2: Training Loss= 0.554052, Training Accuracy= 0.87500\nValidation Loss= 1.442528, Validation Accuracy= 0.49260\nEpoch 9, CIFAR-10 Batch 3: Training Loss= 0.419015, Training Accuracy= 0.97500\nValidation Loss= 1.507362, Validation Accuracy= 0.48060\nEpoch 9, CIFAR-10 Batch 4: Training Loss= 0.425726, Training Accuracy= 
0.95000\nValidation Loss= 1.479631, Validation Accuracy= 0.49280\nEpoch 9, CIFAR-10 Batch 5: Training Loss= 0.464258, Training Accuracy= 0.90000\nValidation Loss= 1.496848, Validation Accuracy= 0.48580\nEpoch 10, CIFAR-10 Batch 1: Training Loss= 0.678040, Training Accuracy= 0.77500\nValidation Loss= 1.521347, Validation Accuracy= 0.48420\nEpoch 10, CIFAR-10 Batch 2: Training Loss= 0.463171, Training Accuracy= 0.95000\nValidation Loss= 1.451705, Validation Accuracy= 0.49820\nEpoch 10, CIFAR-10 Batch 3: Training Loss= 0.335313, Training Accuracy= 0.97500\nValidation Loss= 1.495026, Validation Accuracy= 0.49000\nEpoch 10, CIFAR-10 Batch 4: Training Loss= 0.352970, Training Accuracy= 0.97500\nValidation Loss= 1.478027, Validation Accuracy= 0.49940\nEpoch 10, CIFAR-10 Batch 5: Training Loss= 0.415313, Training Accuracy= 0.97500\nValidation Loss= 1.469447, Validation Accuracy= 0.50540\n"
]
],
[
[
"# Checkpoint\nThe model has been saved to disk.\n## Test Model\nTest your model against the test dataset. This will be your final accuracy. You should have an accuracy greater than 50%. If you don't, keep tweaking the model architecture and parameters.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL\n\"\"\"\n%matplotlib inline\n%config InlineBackend.figure_format = 'retina'\n\nimport tensorflow as tf\nimport pickle\nimport helper\nimport random\n\n# Set batch size if not already set\ntry:\n if batch_size:\n pass\nexcept NameError:\n batch_size = 64\n\nsave_model_path = './image_classification'\nn_samples = 4\ntop_n_predictions = 3\n\ndef test_model():\n \"\"\"\n Test the saved model against the test dataset\n \"\"\"\n\n test_features, test_labels = pickle.load(open('preprocess_test.p', mode='rb'))\n loaded_graph = tf.Graph()\n\n with tf.Session(graph=loaded_graph) as sess:\n # Load model\n loader = tf.train.import_meta_graph(save_model_path + '.meta')\n loader.restore(sess, save_model_path)\n\n # Get Tensors from loaded model\n loaded_x = loaded_graph.get_tensor_by_name('x:0')\n loaded_y = loaded_graph.get_tensor_by_name('y:0')\n loaded_keep_prob = loaded_graph.get_tensor_by_name('keep_prob:0')\n loaded_logits = loaded_graph.get_tensor_by_name('logits:0')\n loaded_acc = loaded_graph.get_tensor_by_name('accuracy:0')\n \n # Get accuracy in batches for memory limitations\n test_batch_acc_total = 0\n test_batch_count = 0\n \n for test_feature_batch, test_label_batch in helper.batch_features_labels(test_features, test_labels, batch_size):\n test_batch_acc_total += sess.run(\n loaded_acc,\n feed_dict={loaded_x: test_feature_batch, loaded_y: test_label_batch, loaded_keep_prob: 1.0})\n test_batch_count += 1\n\n print('Testing Accuracy: {}\\n'.format(test_batch_acc_total/test_batch_count))\n\n # Print Random Samples\n random_test_features, random_test_labels = tuple(zip(*random.sample(list(zip(test_features, test_labels)), n_samples)))\n random_test_predictions = sess.run(\n tf.nn.top_k(tf.nn.softmax(loaded_logits), top_n_predictions),\n feed_dict={loaded_x: random_test_features, loaded_y: random_test_labels, loaded_keep_prob: 1.0})\n helper.display_image_predictions(random_test_features, random_test_labels, 
random_test_predictions)\n\n\ntest_model()",
"Testing Accuracy: 0.5068359375\n\n"
]
],
[
[
"## Why 50-80% Accuracy?\nYou might be wondering why you can't get an accuracy any higher. First things first, 50% isn't bad for a simple CNN. Pure guessing would get you 10% accuracy. However, you might notice people are getting scores [well above 80%](http://rodrigob.github.io/are_we_there_yet/build/classification_datasets_results.html#43494641522d3130). That's because we haven't taught you all there is to know about neural networks. We still need to cover a few more techniques.\n## Submitting This Project\nWhen submitting this project, make sure to run all the cells before saving the notebook. Save the notebook file as \"dlnd_image_classification.ipynb\" and save it as a HTML file under \"File\" -> \"Download as\". Include the \"helper.py\" and \"problem_unittests.py\" files in your submission.",
"_____no_output_____"
]
]
] | [
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] | [
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |