message | diff
---|---|
Set high watermark explicitly
Problem: 'tezos-client setup ledger to baker' doesn't update the high
watermark by default, so bakes and endorsements can be missed when
switching from a network with a higher number of blocks to a network
with a lower number of blocks.
Solution: Provide the current level as the high watermark explicitly. | @@ -13,6 +13,7 @@ import os, sys, subprocess, shlex
import readline
import re, textwrap
import urllib.request
+import json
from typing import List
@@ -497,6 +498,12 @@ class Setup:
+ self.config["node_rpc_addr"]
)
+ def get_current_head_level(self):
+ response = urllib.request.urlopen(
+ self.config["node_rpc_addr"] + "/chains/main/blocks/head/header"
+ )
+ return str(json.load(response)["level"])
+
# Check if there is already some blockchain data in the tezos-node data directory,
# and ask the user if it can be overwritten.
def check_blockchain_data(self):
@@ -736,6 +743,8 @@ class Setup:
+ tezos_client_options
+ " setup ledger to bake for "
+ baker_alias
+ + " --main-hwm "
+ + self.get_current_head_level()
)
except EOFError:
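For reference, a self-contained sketch of the RPC query added above (the node address is a placeholder assumption; the endpoint and the "level" field come straight from the diff):

```python
import json
import urllib.request

def get_current_head_level(node_rpc_addr="http://127.0.0.1:8732"):
    # /chains/main/blocks/head/header returns the head block header as JSON;
    # its "level" field is the value passed to --main-hwm.
    with urllib.request.urlopen(node_rpc_addr + "/chains/main/blocks/head/header") as response:
        return str(json.load(response)["level"])
```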
|
Update version 0.9.0 -> 0.9.1
Changes
* Removed deprecated kwarg in `SpinReversalTransformComposite`
Fixes
* `RoofDualityComposite`, `ConnectedComponentComposite` and `FixedVariableComposite` now all work with the new BQM types
New Features
* A sample method testing framework for Samplers
* Significant documentation update
* A unified location for header files | #
# ================================================================================================
-__version__ = '0.9.0'
+__version__ = '0.9.1'
__author__ = 'D-Wave Systems Inc.'
__authoremail__ = '[email protected]'
__description__ = 'A shared API for binary quadratic model samplers.'
|
SetAlgo: fix bug in error reporting logic
std::string( len, character ) doesn't work for negative `len` | @@ -426,7 +426,15 @@ void expressionToAST( const std::string &setExpression, ExpressionAst &ast)
{
int offset = iter - setExpression.begin();
std::string errorIndication( offset, ' ' );
- errorIndication += '|' + std::string(setExpression.end() - iter - 2, '-') + '|';
+ int indicationSize = setExpression.end() - iter;
+ if( indicationSize <= 2 )
+ {
+ errorIndication += std::string( indicationSize, '|');
+ }
+ else
+ {
+ errorIndication += '|' + std::string( indicationSize - 2, '-') + '|';
+ }
throw IECore::Exception( boost::str( boost::format( "Syntax error in indicated part of SetExpression.\n%s\n%i\n." ) % setExpression % errorIndication ) ) ;
}
|
Update nodejs install version for Ubuntu.
The 5.x releases give a warning about being deprecated and no longer
getting security fixes. Current master works fine with 6.x, which is
still maintained. | @@ -61,7 +61,7 @@ Any issues? Please let us know on our forums at: https://forum.mattermost.org/
4. Set GOROOT (optional) in your `~/.bash_profile`
- `export GOROOT=/usr/local/go/`
6. Install Node.js
- - `curl -sL https://deb.nodesource.com/setup_5.x | sudo -E bash -`
+ - `curl -sL https://deb.nodesource.com/setup_6.x | sudo -E bash -`
- `sudo apt-get install -y nodejs`
7. Fork Mattermost on GitHub.com from [https://github.com/mattermost/platform](https://github.com/mattermost/platform), then:
1. `cd ~/go`
|
css: Replace unnecessary figure element with div.
The figure element here was used for a text bubble rather than a
graphic (i.e. a "figure"), hence a div element is more appropriate.
This change doesn't affect the visual styling, as verified by comparing
the rendered result visually and comparing the applied styles in the
devtools. | </li>
<li><div class="list-content">Share papers, presentations or images with <a href="/help/share-and-upload-files">drag-and-drop file uploads</a>.</div></li>
</ul>
- <figure class="quote">
+ <div class="quote">
<blockquote>
For more than a year, Zulip has been the cornerstone of our online
Category Theory community. We greatly appreciate the seamless
<div class="author">
— Stelios Tsampas, PhD in Theoretical Computer Science
</div>
- </figure>
+ </div>
</div>
</div>
|
Update ursnif.txt
Minus dups. | @@ -5473,6 +5473,49 @@ feel500.at
kwjqbk2fw9p8q5y.com
xumti39cg1kuf9t2y.com
+# Reference: https://raw.githubusercontent.com/pan-unit42/iocs/master/Valak/2020-03-23-to-2020-07-07-TA551-traffic-pattern-history-since-Valak.txt
+
+00otg18ixk6o8kows.com
+2zvdoq8grm7vwed20-zz.com
+adersr4utx.com
+amc4we.com
+c1vfsbk.com
+d6rc53.com
+d9q944ord8l-tydx.com
+ebh3zy1l0l66zt144-ph.com
+ebwz497.com
+eed9jqjd4b600bu2b-md.com
+f0hc7osjnl2vi61g.com
+fw6rzlxc.com
+fz782ze.com
+gandael6.com
+gma7im.com
+grumnoud.com
+gwn2649pm.com
+his3t35rif0krjkn.com
+hlyctn2zx8zyjox1.com
+j4abq17dqadmb4hz.com
+je85oemozig2x4yq.com
+jzi0hc.com
+k0llld9j.com
+kwjqbk2fw9p8q5y.com
+kzex9vp0jfw6a8up1.com
+l95dtz8.com
+landcareus.com
+le7dv4wry1qy0dozb-df.com
+m4tz0of0xi8o3brr.com
+pk3ehqmow0a.com
+siicg8lgad.com
+turjaxqqzwyfzy6a.com
+v4x99v.com
+ws3adlfkm1.com
+xcjhb30ton.com
+xekolw77fzn-pwzb.com
+xljksdu.com
+xumti39cg1kuf9t2y.com
+yfpyutf.com
+zp9x80h.com
+
# Generic trails
/a.aspx?redir=1&clientUuid=
|
Update flow_setup.rst
* Update flow_setup.rst
Fix for the download of rllab-multiagent; it's off the master branch, not cistar_release.
* Update flow_setup.rst
* Update flow_setup.rst
* Update flow_setup.rst
* Update flow_setup.rst
Removed comments to team | Setup Instructions
*****************************
-To get flow\_dev running, you need three things: flow\_dev (or
+To get flow running, you need three things: flow (or
flow), SUMO, and rllab. Once each component is installed successfully,
you might get some missing module bugs from python. Just install the
missing module using your OS-specific package manager / installation
@@ -10,12 +10,11 @@ tool. Follow the shell commands below to get started.
Installing Flow
=================
-Install rllab-multiagent (NOTE TO TEAM: For now, it's still rllab-distributed)
+Install rllab-multiagent
::
- git clone [email protected]:cathywu/rllab-multiagent.git
+ git clone https://github.com/cathywu/rllab-multiagent.git
cd rllab-multiagent
- git checkout flow_release # TODO eliminate this step
Create a conda environment (add warning, that EVERYTHING is a specific version):
::
@@ -36,22 +35,28 @@ For Linux
::
+Now for both Linux and OSX, run
+::
python setup.py develop
Install flow within the rllab-multiagent repo
::
- git clone https://github.com/cathywu/flow.git # Needs to be here for AWS experiments using rllab (NOTE TO TEAM: This eliminates the make prepare step.)
+ git clone https://github.com/cathywu/flow.git # Needs to be here for AWS experiments using rllab
cd flow
./scripts/setup_sumo_osx.sh <DESIRED_PATH_TO_SUMO> # installs sumo
python setup.py develop # (install flow, rllab, and dependencies)
- cp flow/core/config.template.py flow/core/config.py # TODO eliminate or move to setup_osx.sh or add to commonly asked questions
+ cp flow/core/config.template.py flow/core/config.py # Create template for users using pycharm
Finally, add <SUMO_DIR>/tools to your PYTHON_PATH to give Python access to TraCI and sumolib.
Test the installation
=====================
+To run any of the examples, make sure to run
+::
+ source activate flow
+
Running the following should result in the loading of the SUMO GUI.
Click the run button and you should see unstable traffic form after a
few seconds, a la (Sugiyama et al, 2008).
|
Fix bugs with --notify omission
Two minor bugs with `--notify`:
When not passed, it was producing "None" rather than "{}".
`--notify ""` should be the same as `--notify "off"`, so the None
checking logic has to be explicit, not a truthiness check. | @@ -332,8 +332,10 @@ def task_submission_options(f):
In code, produces True, False, or a set
"""
- if not value:
- return None
+ # if no value was set, don't set any explicit options
+ # the API default is "everything on"
+ if value is None:
+ return {}
value = value.lower()
value = [x.strip() for x in value.split(',')]
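A simplified, hypothetical sketch of the corrected option handling (return values follow the docstring above: True, False, or a set; the "on" branch is an assumption):

```python
def parse_notify(value):
    # if no value was set, don't send any explicit options (API default)
    if value is None:
        return {}
    parts = [x.strip() for x in value.lower().split(",")]
    if parts in (["off"], [""]):   # --notify "" behaves like --notify "off"
        return False
    if parts == ["on"]:            # assumed counterpart of "off"
        return True
    return set(parts)              # an explicit selection of events

print(parse_notify(None))   # {}
print(parse_notify(""))     # False
print(parse_notify("off"))  # False
```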
|
[Chore] Don't sign commits in bottles sync script
Problem: Public github actions runners don't have access to our signing
key and thus it's impossible to sign commits in it.
Solution: Don't sign commits in this script, signing will be performed
separately. | -#! /usr/bin/env nix-shell
-#! nix-shell shell.nix -i bash
+#! /usr/bin/env bash
# SPDX-FileCopyrightText: 2021 TQ Tezos <https://tqtezos.com/>
#
# SPDX-License-Identifier: LicenseRef-MIT-TQ
@@ -32,7 +31,7 @@ while : ; do
git fetch --all
git reset --hard origin/"$branch_name"
./scripts/bottle-hashes.sh .
- git commit -a -m "[Chore] Add $1 hashes to brew formulae" --gpg-sign="[email protected]"
+ git commit -a -m "[Chore] Add $1 hashes to brew formulae"
! git push || break
done
|
dispatch app : Adjust to new design of DispatchDialogue
Note that currently the app only supports a single dispatcher, even in gui mode. | @@ -161,24 +161,11 @@ class dispatch( Gaffer.Application ) :
nodes.append( node )
dispatcherType = args["dispatcher"].value or GafferDispatch.Dispatcher.getDefaultDispatcherType()
-
- if args["gui"].value :
-
- import GafferUI
- import GafferDispatchUI
-
- self.__dialogue = GafferDispatchUI.DispatchDialogue( script, nodes, dispatcherType, applyUserDefaults=args["applyUserDefaults"].value )
- self.__dialogueClosedConnection = self.__dialogue.closedSignal().connect( Gaffer.WeakMethod( self.__dialogueClosed ) )
- dispatcher = self.__dialogue.getDispatcher()
-
- else :
-
dispatcher = GafferDispatch.Dispatcher.create( dispatcherType )
if not dispatcher :
IECore.msg( IECore.Msg.Level.Error, "gaffer dispatch", "{} is not a registered dispatcher.".format( dispatcherType ) )
return 1
- if args["applyUserDefaults"].value :
Gaffer.NodeAlgo.applyUserDefaults( dispatcher )
if len(args["settings"]) % 2 :
@@ -201,6 +188,11 @@ class dispatch( Gaffer.Application ) :
if args["gui"].value :
+ import GafferUI
+ import GafferDispatchUI
+
+ self.__dialogue = GafferDispatchUI.DispatchDialogue( nodes, [ dispatcher ] )
+ self.__dialogueClosedConnection = self.__dialogue.closedSignal().connect( Gaffer.WeakMethod( self.__dialogueClosed ) )
self.__dialogue.setVisible( True )
GafferUI.EventLoop.mainEventLoop().start()
|
Fixed rect select on delete.
The selection will be gone on a delete. It will correct the bounds. | @@ -1078,6 +1078,7 @@ class Elemental(Module):
self.device.signal('element_removed', elem)
self._elements[i] = None
self.remove_elements_from_operations(elements_list)
+ self.validate_bounds()
def remove_operations(self, operations_list):
for op in operations_list:
|
update happy_num
Added another way to do this; the code is also shorter. | +#Way2 1:
+
#isHappyNumber() will determine whether a number is happy or not
def isHappyNumber(num):
rem = sum = 0;
@@ -17,7 +19,27 @@ while(result != 1 and result != 4):
#Happy number always ends with 1
if(result == 1):
- print(str(num) + " is a happy number");
+ print(str(num) + " is a happy number after apply way 1");
#Unhappy number ends in a cycle of repeating numbers which contain 4
elif(result == 4):
- print(str(num) + " is not a happy number");
+ print(str(num) + " is not a happy number after apply way 1");
+
+
+
+
+
+#way 2:
+
+#Another way to do this and code is also less
+n=num
+setData=set() #set datastructure for checking a number is repeated or not.
+while 1:
+ if n==1:
+ print("{} is a happy number after apply way 2".format(num))
+ break
+ if n in setData:
+ print("{} is Not a happy number after apply way 2".format(num))
+ break
+ else:
+ setData.add(n) #adding into set if not inside set
+ n=int(''.join(str(sum([int(i)**2 for i in str(n)])))) #Pythonic way
|
Added initial live pts check.
Added a "while True" loop to make sure there is at least one not -inf initial live pt. | @@ -403,6 +403,7 @@ def NestedSampler(loglikelihood, prior_transform, ndim, nlive=500,
kwargs['compute_jac'] = compute_jac
# Initialize live points and calculate log-likelihoods.
+ while True:
if live_points is None:
live_u = rstate.rand(nlive, npdim) # positions in unit cube
if use_pool.get('prior_transform', True):
@@ -430,6 +431,9 @@ def NestedSampler(loglikelihood, prior_transform, ndim, nlive=500,
"located at u={2} v={3} is invalid."
.format(logl, i, live_points[0][i],
live_points[1][i]))
+ # check to make sure there is at least one not -inf initial live pt.
+ if any(live_points[2] != -1e300):
+ break
# Initialize our nested sampler.
sampler = _SAMPLERS[bound](loglike, ptform, npdim,
|
Handle GzipPacked lost requests & possibly fix reading normal
Reading normal "lost" requests didn't .seek(-4) to read the TLObject
again. Now it has been slightly refactored to seek back always and
only seek forward when needed (e.g. rpc error). | @@ -476,11 +476,13 @@ class MtProtoSender:
reader.read_int(signed=False) # code
request_id = reader.read_long()
inner_code = reader.read_int(signed=False)
+ reader.seek(-4)
__log__.debug('Received response for request with ID %d', request_id)
request = self._pop_request(request_id)
if inner_code == 0x2144ca19: # RPC Error
+ reader.seek(4)
if self.session.report_errors and request:
error = rpc_message_to_error(
reader.read_int(), reader.tgread_string(),
@@ -505,12 +507,10 @@ class MtProtoSender:
return True # All contents were read okay
elif request:
- if inner_code == 0x3072cfa1: # GZip packed
- unpacked_data = gzip.decompress(reader.tgread_bytes())
- with BinaryReader(unpacked_data) as compressed_reader:
+ if inner_code == GzipPacked.CONSTRUCTOR_ID:
+ with BinaryReader(GzipPacked.read(reader)) as compressed_reader:
request.on_response(compressed_reader)
else:
- reader.seek(-4)
request.on_response(reader)
self.session.process_entities(request.result)
@@ -525,10 +525,17 @@ class MtProtoSender:
# session, it will be skipped by the handle_container().
# For some reason this also seems to happen when downloading
# photos, where the server responds with FileJpeg().
+ def _try_read(r):
try:
- obj = reader.tgread_object()
+ return r.tgread_object()
except Exception as e:
- obj = '(failed to read: %s)' % e
+ return '(failed to read: {})'.format(e)
+
+ if inner_code == GzipPacked.CONSTRUCTOR_ID:
+ with BinaryReader(GzipPacked.read(reader)) as compressed_reader:
+ obj = _try_read(compressed_reader)
+ else:
+ obj = _try_read(reader)
__log__.warning(
'Lost request (ID %d) with code %s will be skipped, contents: %s',
|
Add networking-baremetal repo overrides
This patch adds networking-baremetal to the openstack services
to be tracked for installation. | @@ -197,6 +197,10 @@ networking_nsxlib_git_repo: https://opendev.org/x/vmware-nsxlib
networking_nsxlib_git_install_branch: 3548bcfd87fbf6efba4c930daa410cccc5b8203a
networking_nsxlib_git_track_branch: master
+networking_baremetal_git_repo: https://opendev.org/openstack/networking-baremetal
+networking_baremetal_git_install_branch: 9468d6ec601f3cc0c375fbdb94a30c60c449e39f
+networking_baremetal_git_track_branch: master
+
## Nova service
nova_git_repo: https://opendev.org/openstack/nova
nova_git_install_branch: c4cd6ee4615a35a57dd6a2f3cb5a9cbc8653f7ee
|
Use a more accurate type for predicates in itertools
The only constraint on the return value of a predicate is to be "boolable".
Because `bool` receives an object in its constructor, this is a more accurate description of a predicate. | @@ -8,6 +8,7 @@ from typing import (Iterator, TypeVar, Iterable, overload, Any, Callable, Tuple,
_T = TypeVar('_T')
_S = TypeVar('_S')
_N = TypeVar('_N', int, float)
+Predicate = Callable[[_T], object]
def count(start: _N = ...,
step: _N = ...) -> Iterator[_N]: ... # more general types?
@@ -28,9 +29,9 @@ class chain(Iterator[_T], Generic[_T]):
def from_iterable(iterable: Iterable[Iterable[_S]]) -> Iterator[_S]: ...
def compress(data: Iterable[_T], selectors: Iterable[Any]) -> Iterator[_T]: ...
-def dropwhile(predicate: Callable[[_T], Any],
+def dropwhile(predicate: Predicate[_T],
iterable: Iterable[_T]) -> Iterator[_T]: ...
-def filterfalse(predicate: Optional[Callable[[_T], Any]],
+def filterfalse(predicate: Optional[Predicate[_T]],
iterable: Iterable[_T]) -> Iterator[_T]: ...
@overload
@@ -46,7 +47,7 @@ def islice(iterable: Iterable[_T], start: Optional[int], stop: Optional[int],
step: Optional[int] = ...) -> Iterator[_T]: ...
def starmap(func: Callable[..., _S], iterable: Iterable[Iterable[Any]]) -> Iterator[_S]: ...
-def takewhile(predicate: Callable[[_T], Any],
+def takewhile(predicate: Predicate[_T],
iterable: Iterable[_T]) -> Iterator[_T]: ...
def tee(iterable: Iterable[_T], n: int = ...) -> Tuple[Iterator[_T], ...]: ...
def zip_longest(*p: Iterable[Any],
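A small illustration (not part of the stub itself) of why `Callable[[_T], object]` fits the predicates: itertools only needs the return value to be truth-testable, not an actual `bool`:

```python
from itertools import dropwhile, takewhile

# str.strip returns a str, not a bool, yet it is a valid predicate:
# items are dropped while the stripped string is truthy (non-empty).
print(list(dropwhile(str.strip, ["x ", " y", "", "z"])))  # ['', 'z']

# len returns an int; takewhile stops at the first zero-length (falsy) value.
print(list(takewhile(len, ["spam", "eggs", "", "ham"])))  # ['spam', 'eggs']
```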
|
In installation doc, pip install --upgrade tensorflow-hub.
Users looking here don't want to stay stuck on random old stuff. | @@ -9,12 +9,13 @@ right away, and current users upgrade to it.
Use [pip](https://pip.pypa.io/) to
[install TensorFlow 2](https://www.tensorflow.org/install) as usual.
(See there for extra instructions about GPU support.)
-Then install [`tensorflow-hub`](https://pypi.org/project/tensorflow-hub/)
-next to it.
+Then install a current version of
+[`tensorflow-hub`](https://pypi.org/project/tensorflow-hub/)
+next to it (must be 0.5.0 or newer).
```bash
$ pip install "tensorflow>=2.0.0"
-$ pip install tensorflow-hub
+$ pip install --upgrade tensorflow-hub
```
The TF1-style API of TensorFlow Hub works with the v1 compatibility mode
@@ -29,8 +30,8 @@ to TF1-compatible behavior but contains many TF2 features under the hood
to allow some use of TensorFlow Hub's TF2-style APIs.
```bash
-$ pip install "tensorflow~=1.15"
-$ pip install tensorflow-hub
+$ pip install "tensorflow>=1.15,<2.0"
+$ pip install --upgrade tensorflow-hub
```
## Use of pre-release versions
|
Update ooni data bucket information
For more recent data we are publishing it to a new bucket in a European region. I have updated also other metadata with the most up to date links.
Thanks! | -Name: Open Observatory of Network Interference
+Name: Open Observatory of Network Interference (OONI)
Description: A free software, global observation network for detecting censorship, surveillance and traffic manipulation on the internet.
-Documentation: https://ooni.torproject.org/about/
-Contact: https://ooni.torproject.org/get-involved/
+Documentation: https://ooni.org/data/
+Contact: https://ooni.org/get-involved/
UpdateFrequency: Daily
Tags:
- aws-pds
- internet
-License: Creative Commons Attribution-ShareAlike 4.0 International https://github.com/OpenObservatory/legal/blob/master/LEGALCODE-CC4.0-BY-SA.txt
+License: Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International https://github.com/ooni/license/blob/master/data/LICENSE.md
Resources:
- - Description: S3 bucket with data
+ - Description: New S3 bucket with JSONL files
+ ARN: arn:aws:s3:::ooni-data-eu-fra
+ Region: eu-central-1
+ Type: S3 Bucket
+ - Description: Old S3 bucket with cans for older measurements
ARN: arn:aws:s3:::ooni-data
Region: us-east-1
Type: S3 Bucket
|
issue with __search_attribute_update_variables
please take a look at | @@ -436,6 +436,7 @@ class OpcUaConnector(Thread, Connector):
self.__search_node(node, attribute_path, result=attribute_nodes)
for attribute_node in attribute_nodes:
if attribute_node is not None:
+ if self.get_node_path(attribute_node) == attribute_path:
self.__available_object_resources[device_name]["variables"].append({attribute_update["attributeOnThingsBoard"]: attribute_node})
else:
log.error("Attribute update node with path \"%s\" - NOT FOUND!", attribute_path)
|
Change pexpect hackery to use a custom class
So the original pexpect is unperturbed | # Python 2/3 compatibility
Python3 = sys.version_info[0] == 3
-BaseString = str if Python3 else basestring
+BaseString = str if Python3 else str.__base__
Encoding = 'utf-8' if Python3 else None
def decode( s ):
"Decode a byte string if needed for Python 3"
@@ -24,19 +24,22 @@ def decode( s ):
def encode( s ):
"Encode a byte string if needed for Python 3"
return s.encode( Encoding ) if Python3 else s
-# Make pexpect compatible with Python 3 strings
try:
import pexpect as oldpexpect
- pexpect, oldspawn = oldpexpect, oldpexpect.spawn
+ class Pexpect( object ):
+ "Custom pexpect that is compatible with str"
def spawn( self, *args, **kwargs):
- "Let pexpect work with Python3 utf-8 strings"
- if Python3:
+ "pexpect.spawn that is compatible with str"
+ if Python3 and 'encoding' not in kwargs:
kwargs.update( encoding='utf-8' )
- return oldspawn( self, *args, **kwargs )
- oldpexpect.spawn = spawn
+ return oldpexpect.spawn( *args, **kwargs )
+ def __getattr__( self, name ):
+ return getattr( oldpexpect, name )
+ pexpect = Pexpect()
except:
pass
+
# Command execution support
def run( cmd ):
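A standalone sketch of the same wrapper pattern, using the json module as a stand-in (the Pexpect class above does the analogous thing for spawn): override one attribute, delegate the rest via __getattr__, and leave the original module untouched:

```python
import json as _original

class JsonProxy(object):
    "Custom json wrapper that pretty-prints by default"
    def dumps(self, obj, **kwargs):
        kwargs.setdefault("indent", 2)      # the overridden behaviour
        return _original.dumps(obj, **kwargs)
    def __getattr__(self, name):
        return getattr(_original, name)     # everything else is delegated

json = JsonProxy()
print(json.dumps({"a": 1}))       # overridden: indented output
print(json.loads('{"a": 1}'))     # delegated to the real json.loads
```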
|
Changed WikipediaCog -> Wikipedia
Changed this because it was causing an issue in the help command. | @@ -15,7 +15,7 @@ SEARCH_API = "https://en.wikipedia.org/w/api.php?action=query&list=search&srsear
WIKIPEDIA_URL = "https://en.wikipedia.org/wiki/{title}"
-class WikipediaCog(commands.Cog):
+class Wikipedia(commands.Cog):
"""Get info from wikipedia."""
def __init__(self, bot: commands.Bot):
@@ -111,4 +111,4 @@ class WikipediaCog(commands.Cog):
def setup(bot: commands.Bot) -> None:
"""Wikipedia Cog load."""
- bot.add_cog(WikipediaCog(bot))
+ bot.add_cog(Wikipedia(bot))
|
Changing the description of parameter nnz
This is in accordance with the pull request. | @@ -54,7 +54,7 @@ class bsr_matrix(_cs_matrix, _minmax_mixin):
ndim : int
Number of dimensions (this is always 2)
nnz
- Number of nonzero elements
+ Number of stored values, including explicit zeros
data
Data array of the matrix
indices
|
Added information on configuring the location of the cord
directory as required by the installer creator. | @@ -123,6 +123,8 @@ Also please copy the ansible configuration to `~/.ansible.cfg`:
cp ~/cord/incubator/voltha/install/ansible/ansible.cfg ~/.ansible.cfg
```
+Also please change the value of the `cord_home` variable in the `install/ansible/group_vars/all` to refer to the location of your cord directory. This is usually in your home directory but it can be anywhere so the installer can't guess at it.
+
Also destroy any running voltha VM by first ensuring your config file `settings.vagrant.yaml` is set as specified above then peforming the following:
```
|
Add Morocco power origin ratios
Needed for Spanish imports
By the way, I used the percentages from the Our World in Data percentage view and the numbers add up to 1.0002 (rounding error I guess), is this a problem? | "wind": 0.051255527209169635
}
},
+ "MA": {
+ "_source": "https://ourworldindata.org/grapher/electricity-prod-source-stacked?time=earliest..latest&country=~MAR",
+ "powerOriginRatios": {
+ "coal": 0.4157,
+ "gas": 0.2443,
+ "hydro": 0.0379,
+ "oil": 0.114,
+ "solar": 0.0474,
+ "wind": 0.1409
+ }
+ },
"MD": {
"_source": "Tomorrow",
"powerOriginRatios": {
|
Fix typo on tpu.rst
* Fix typo on tpu.rst
There're 3 ways :)
* Update docs/source/tpu.rst | @@ -27,9 +27,9 @@ some subset of those 2048 cores.
How to access TPUs
------------------
-To access TPUs there are two main ways.
+To access TPUs, there are three main ways.
-1. Using google colab.
+1. Using Google Colab.
2. Using Google Cloud (GCP).
3. Using Kaggle.
|
BUG: Add HOME to the git environment.
git config files can contain ~ expansions that require $HOME to be defined.
Some installations of git have these in the global defaults now. | @@ -73,7 +73,7 @@ def git_version():
def _minimal_ext_cmd(cmd):
# construct minimal environment
env = {}
- for k in ['SYSTEMROOT', 'PATH']:
+ for k in ['SYSTEMROOT', 'PATH', 'HOME']:
v = os.environ.get(k)
if v is not None:
env[k] = v
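A minimal standalone illustration (not the project's build code itself) of why HOME must survive into the reduced environment: git expands "~" in config include paths using $HOME, so a subprocess call with a stripped environment can fail without it:

```python
import os
import subprocess

env = {}
for k in ("SYSTEMROOT", "PATH", "HOME"):
    v = os.environ.get(k)
    if v is not None:
        env[k] = v

# With HOME present, git can resolve "~" in global config settings.
out = subprocess.run(["git", "rev-parse", "--short", "HEAD"],
                     env=env, capture_output=True, text=True)
print(out.stdout.strip() or out.stderr.strip())
```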
|
Add .signer_fingerprint property to PGPSignature.
- This returns the issuer fingerprint if the IssuerFingerprint
subpacket is present, otherwise empty string. | @@ -261,6 +261,15 @@ class PGPSignature(Armorable, ParentRef, PGPObject):
"""
return self._signature.signer
+ @property
+ def signer_fingerprint(self):
+ """
+ The fingerprint of the key that generated this signature, if it contained. Otherwise, an empty ``str``.
+ """
+ if 'IssuerFingerprint' in self._signature.subpackets:
+ return next(iter(self._signature.subpackets['IssuerFingerprint'])).issuer_fingerprint
+ return ''
+
@property
def target_signature(self):
return NotImplemented
|
BUG: Handle comparison against None for ``is_missing``.
Older versions of Numpy return False when comparing arrays with None. We want a
vectorized compare. | @@ -349,6 +349,11 @@ def is_missing(data, missing_value):
return isnan(data)
elif is_datetime(data) and isnat(missing_value):
return isnat(data)
+ elif is_object(data) and missing_value is None:
+ # XXX: Older versions of numpy returns True/False for array ==
+ # None. Work around this by boxing None in a 1x1 array, which causes
+ # numpy to do the broadcasted comparison we want.
+ return data == np.array([missing_value])
return (data == missing_value)
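A small standalone illustration of the NumPy behaviour being worked around: per the commit message, older NumPy returns a plain False for `array == None`, while comparing against a 1-element object array always broadcasts element-wise:

```python
import numpy as np

data = np.array(["a", None, "b"], dtype=object)

# Boxing None in a 1x1 array guarantees a vectorized comparison on any version.
mask = data == np.array([None])
print(mask)  # [False  True False]
```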
|
Remove useless setting of toggle link text
This was immediately overwritten by the call to `setOutlierLinkText`
below. | @@ -366,14 +366,10 @@ var analyseChart = {
if (_this.globalOptions.hasOutliers) {
if (_this.globalOptions.hideOutliers) {
_this.globalOptions.hideOutliers = false;
- $(_this.el.outliersToggle).find('a').text(
- 'Remove them from the chart');
Cookies.set('hide_small_lists', '0');
} else {
// set a cookie
_this.globalOptions.hideOutliers = true;
- $(_this.el.outliersToggle).find('a').text(
- 'Show them in the chart');
Cookies.set('hide_small_lists', '1');
}
_this.setOutlierLinkText();
|
GHCI: Don't use `--always-make` to regenerate test data
`make test-data` always regenerates test data, without the need to pass
the `--always-make` option to make. | @@ -94,7 +94,7 @@ jobs:
uses: osbuild/containers/ghci/actions/ghci-osbuild@ghci/v1
with:
run: |
- make --always-make test-data
+ make test-data
git diff --exit-code -- ./test/data
codespell:
|
Update az keyvault secret set command description
Highlight the CREATE functionality of the SET command when a secret does not exist in the vault. | @@ -159,6 +159,11 @@ type: group
short-summary: Manage secrets.
"""
+helps['keyvault secret set'] = """
+type: command
+short-summary: Create a secret (if one doesn't exist) or update a secret in a KeyVault.
+"""
+
helps['keyvault show'] = """
type: command
short-summary: Show details of a key vault.
|
Update with Niantic Warning
Warn the user if a warning is received from Niantic. | @@ -213,6 +213,8 @@ class PokemonGoBot(object):
self.event_manager.register_event('login_failed')
self.event_manager.register_event('login_successful')
+ self.event_manager.register_event('niantic_warning')
+
self.event_manager.register_event('set_start_location')
self.event_manager.register_event('load_cached_location')
self.event_manager.register_event('location_cache_ignored')
@@ -1181,9 +1183,11 @@ class PokemonGoBot(object):
# print('Response dictionary: \n\r{}'.format(json.dumps(response_dict, indent=2)))
currency_1 = "0"
currency_2 = "0"
+ warn = False
if response_dict:
self._player = response_dict['responses']['GET_PLAYER']['player_data']
+ warn = response_dict['responses']['GET_PLAYER']['warn']
player = self._player
else:
self.logger.info(
@@ -1264,6 +1268,16 @@ class PokemonGoBot(object):
' | Dragon Scale: ' + str(items_inventory.get(1104).count) +
' | Upgrade: ' + str(items_inventory.get(1105).count))
+ if warn:
+ self.logger.info('')
+ self.event_manager.emit(
+ 'niantic_warning',
+ sender=self,
+ level='warning',
+ formatted="This account has recieved a warning from Niantic. Bot at own risk."
+ )
+ sleep(5) # Pause to allow user to see warning
+
self.logger.info('')
def _print_list_pokemon(self):
|
Check qartod_variable references and validate the URL
QARTOD variables should have a valid URL as the reference. | @@ -8,7 +8,7 @@ from lxml.etree import XPath
from compliance_checker.acdd import ACDD1_3Check
from compliance_checker.cfutil import get_geophysical_variables, get_instrument_variables
from compliance_checker.cf.cf import CF1_6Check, CF1_7Check
-from rfc3986 import is_valid_uri
+from rfc3986 import api, exceptions, validators
class IOOSBaseCheck(BaseCheck):
@@ -637,8 +637,9 @@ class IOOS1_2Check(IOOSNCCheck):
def check_qartod_variables_references(self, ds):
"""
- For any variables that are deemed QARTOD variables,
- check that they contain the "references" attribute.
+ For any variables that are deemed QARTOD variables, check that they
+ contain the "references" attribute and that the value of the attribute
+ is a valid URL.
Args:
ds (netCDF4.Dataset): open Dataset
@@ -647,17 +648,21 @@ class IOOS1_2Check(IOOSNCCheck):
list of Results
"""
- # TODO
- # is_valid_uri will be deprecated soon, use validators.Validator
- # what schemes should be allowed?
+ vldr = validators.Validator()
+ vldr.require_presence_of("scheme", "host")
+ vldr.allow_schemes("http", "https")
results = []
ctxt = "qartod_variable:references"
for v in ds.get_variables_by_attributes(standard_name=lambda x: x in self._qartod_std_names):
- msg = f"\"references\" attribute for variable \"{v.name}\" must be a valid URI"
+ msg = f"\"references\" attribute for variable \"{v.name}\" must be a valid URL"
val = True
- ref = getattr(v, "references", None)
- if not (isinstance(ref, str) and is_valid_uri(ref)):
+ ref = getattr(v, "references", b'')
+ url = api.uri_reference(ref)
+
+ try:
+ vldr.validate(url)
+ except exceptions.MissingComponentError:
val = False
results.append(Result(BaseCheck.MEDIUM, val, ctxt, [msg]))
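For reference, a self-contained sketch of the rfc3986 validator pattern used in the diff (require a scheme and host, allow only http/https); the sample URLs are illustrative:

```python
from rfc3986 import api, exceptions, validators

vldr = validators.Validator()
vldr.require_presence_of("scheme", "host")
vldr.allow_schemes("http", "https")

for ref in ("https://example.org/qartod_manual.pdf", "not-a-url"):
    try:
        vldr.validate(api.uri_reference(ref))
        print(ref, "-> valid URL")
    except exceptions.ValidationError:
        print(ref, "-> invalid URL")
```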
|
Adds the fixtures file to the list of things that require a full test
Problem:
Incremental testing needs the fixtures file added to it
Analysis:
This adds that file
Tests: | @@ -58,7 +58,7 @@ def examine_non_python_rules(line):
def determine_files_to_test(product, commit):
results = []
build_all = [
- 'setup.py', 'contexts.py', 'mixins.py', 'resource.py'
+ 'setup.py', 'contexts.py', 'mixins.py', 'resource.py', 'f5sdk_plugins/fixtures.py'
]
output_file = "pytest.{0}.jenkins.txt".format(product)
|
Update README.md
change [DuinoCoibyLabVIEW] to [DuinoCoinbyLabVIEW] | @@ -95,7 +95,7 @@ After doing this, you are good to go with launching the software (just double cl
</summary>
### Other miners known to work with Duino-Coin:
- * [DuinoCoibyLabVIEW](https://github.com/ericddm/DuinoCoinbyLabVIEW) - miner for LabVIEW family by ericddm
+ * [DuinoCoinbyLabVIEW](https://github.com/ericddm/DuinoCoinbyLabVIEW) - miner for LabVIEW family by ericddm
* [Duino-JS](https://github.com/Hoiboy19/Duino-JS) - a JavaScript miner which you can easily implement in your site by Hoiboy19
* [Mineuino](https://github.com/VatsaDev/Mineuino) - website monetizer by VatsaDev
* [hauchel's duco-related stuff repository](https://github.com/hauchel/duco/) - Collection of various codes for mining DUCO on other microcontrollers
|
Docs: no-relative-path rule doc edits
* no-relative-path rule doc edits
* chore: auto fixes from pre-commit.com hooks
for more information, see | @@ -5,21 +5,20 @@ This rule checks for relative paths in the `ansible.builtin.copy` and `ansible.b
Relative paths in a task most often direct Ansible to remote files and directories on managed nodes.
In the `ansible.builtin.copy` and `ansible.builtin.template` modules, the `src` argument refers to local files and directories on the control node.
-```{note}
-For `copy` best location to store files is inside `files/` folder within the
-playbook/role directory. For `template` the recommended location is `templates/`
-folder, also within the playbook/role directory.
+The recommended locations to store files are as follows:
+
+- Use the `files/` folder in the playbook or role directory for the `copy` module.
+- Use the `templates/` folder in the playbook or role directory for the `template` module.
-For this reason, for `src`, you should either:
-- Do not specify a path, or use a sub-folder of either `files/` or `templates/`.
-- Use absolute path if the resources are above your Ansible playbook/role
+These folders allow you to omit the path or use a sub-folder when specifying files with the `src` argument.
+
+```{note}
+If resources are outside your Ansible playbook or role directory you should use an absolute path with the `src` argument.
```
```{warning}
-Avoid storing files or templates inside the same directory as your playbook or
-tasks files. Doing this is a bad practice and also will generate linting
-warning in the future. Imagine the user confusion if these files also happen
-to be YAML.
+Do not store resources at the same directory level as your Ansible playbook or tasks files.
+Doing this can result in disorganized projects and cause user confusion when distinguishing between resources of the same type, such as YAML.
```
See [task paths](https://docs.ansible.com/ansible/latest/user_guide/playbook_pathing.html#task-paths) in the Ansible documentation for more information.
|
Fix Eltex.MES __init__
HG--
branch : feature/dcs | ## Vendor: Eltex
## OS: MES
##----------------------------------------------------------------------
-## Copyright (C) 2007-2011 The NOC Project
+## Copyright (C) 2007-2017 The NOC Project
## See LICENSE for details
##----------------------------------------------------------------------
@@ -16,16 +16,17 @@ class Profile(BaseProfile):
pattern_more = [
(r"^More: <space>, Quit: q, One line: <return>$", " "),
(r"\[Yes/press any key for no\]", "Y"),
- (r"<return>, Quit: q or <ctrl>", " ")
+ (r"<return>, Quit: q or <ctrl>", " "),
+ (r"q or <ctrl>+z", " ")
]
- pattern_unpriveleged_prompt = r"^\S+> "
+ pattern_unpriveleged_prompt = r"^(?P<hostname>\S+)> "
pattern_syntax_error = r"^% (Unrecognized command|Incomplete command|Wrong number of parameters or invalid range, size or characters entered)$"
command_disable_pager = "terminal datadump"
command_super = "enable"
command_enter_config = "configure"
command_leave_config = "end"
command_save_config = "copy running-config startup-config"
- pattern_prompt = r"^\S+#"
+ pattern_prompt = r"^(?P<hostname>\S+)#"
convert_interface_name = BaseProfile.convert_interface_name_cisco
INTERFACE_TYPES = {
|
CompileCtx.check_env_metadata: remove redundant check
TN: | @@ -712,7 +712,7 @@ class CompileCtx(object):
:param StructType cls: Environment metadata struct type.
"""
- from langkit.compiled_types import BoolType, UserField, resolve_type
+ from langkit.compiled_types import BoolType, resolve_type
with cls.diagnostic_context():
check_source_language(
@@ -723,11 +723,6 @@ class CompileCtx(object):
for field in cls.get_fields():
with field.diagnostic_context():
- check_source_language(
- isinstance(field, UserField),
- 'Fields of the Struct type chosen to be environment'
- ' metadata must be instances of UserField.'
- )
check_source_language(
resolve_type(field.type).matches(BoolType),
'Environment metadata fields must all be booleans'
|
tests: update tests for mds to cover multimds case
In the multimds case we must check the number of MDS daemons that are up
instead of just checking whether the hostname of the node is in the fsmap. | @@ -34,8 +34,7 @@ class TestMDSs(object):
hostname=node["vars"]["inventory_hostname"],
cluster=node["cluster_name"]
)
+ num_mdss = len(host.ansible.get_variables()["groups"]["mdss"])
output_raw = host.check_output(cmd)
output_json = json.loads(output_raw)
- active_daemon = output_json["fsmap"]["by_rank"][0]["name"]
- if active_daemon != hostname:
- assert output_json['fsmap']['up:standby'] == 1
+ assert output_json['fsmap']['up'] and output_json['fsmap']['in'] == num_mdss
\ No newline at end of file
|
[BitBucket] Fix get_project_tags.
If tag_name is given a single tag is returned, else a generator for the tags (paged API) | @@ -1200,7 +1200,7 @@ class Bitbucket(BitbucketBase):
params["orderBy"] = order_by
return self._get_paged(url, params=params)
- def get_project_tags(self, project_key, repository_slug, tag_name):
+ def get_project_tags(self, project_key, repository_slug, tag_name=None):
"""
Retrieve a tag in the specified repository.
The authenticated user must have REPO_READ permission for the context repository to call this resource.
@@ -1211,7 +1211,10 @@ class Bitbucket(BitbucketBase):
:return:
"""
url = self._url_repo_tags(project_key, repository_slug)
- return self.get(url)
+ if tag_name is not None:
+ return self.get("{}/{}".format(url, tag_name))
+
+ return self._get_paged(url)
def set_tag(self, project_key, repository_slug, tag_name, commit_revision, description=None):
"""
@@ -1225,13 +1228,13 @@ class Bitbucket(BitbucketBase):
:return:
"""
url = self._url_repo_tags(project_key, repository_slug)
- body = {}
- if tag_name is not None:
- body["name"] = tag_name
- if tag_name is not None:
- body["startPoint"] = commit_revision
- if tag_name is not None:
+ body = {
+ "name": tag_name,
+ "startPoint": commit_revision,
+ }
+ if description is not None:
body["message"] = description
+
return self.post(url, data=body)
def delete_tag(self, project_key, repository_slug, tag_name):
@@ -1247,7 +1250,6 @@ class Bitbucket(BitbucketBase):
self._url_repo_tags(project_key, repository_slug, api_root="rest/git"),
tag_name,
)
- (project_key, repository_slug, tag_name)
return self.delete(url)
def _url_repo_hook_settings(self, project_key, repository_slug):
|
Fix uncommon column case issue for fbprophet
'date' was sometimes 'Date' | @@ -95,6 +95,7 @@ def fbprophet(l_args, s_ticker, df_stock):
df_stock = df_stock.sort_index(ascending=True)
df_stock.reset_index(level=0, inplace=True)
+ df_stock.columns = map(str.lower, df_stock.columns) # column names are sometimes upper cased
df_stock = df_stock[["date", "5. adjusted close"]]
df_stock = df_stock.rename(columns={"date": "ds", "5. adjusted close": "y"})
df_stock["ds"] = pd.to_datetime(df_stock["ds"])
|
Update list_settings.js
Don't limit list view to 4 | @@ -78,7 +78,7 @@ export default class ListSettings {
if (field_count < 4) {
field_count = 4;
} else if (field_count > 10) {
- field_count = 4;
+ field_count = 10;
}
me.dialog.set_value("total_fields", field_count);
|
Update staging_settings.py
rm CELERY_WORKER_CONCURRENCY and CELERY_WORKER_MAX_TASKS_PER_CHILD for testing Rancher | @@ -136,9 +136,9 @@ USE_TZ = True
# Results backend
CELERY_RESULT_BACKEND = 'django-db'
-if os.environ.get('K8S_DEPLOY') is not None:
- CELERY_WORKER_MAX_TASKS_PER_CHILD = 50
-CELERY_WORKER_MAX_MEMORY_PER_CHILD = 6000000 # 6 GB
+# if os.environ.get('K8S_DEPLOY') is not None:
+# CELERY_WORKER_MAX_TASKS_PER_CHILD = 50
+CELERY_WORKER_MAX_MEMORY_PER_CHILD = 4000000 # 4 GB
# celery task registration
CELERY_IMPORTS = (
@@ -150,7 +150,7 @@ CELERY_IMPORTS = (
)
# limit number of concurrent workers
-CELERY_WORKER_CONCURRENCY = 2
+# CELERY_WORKER_CONCURRENCY = 2
# Static files (CSS, JavaScript, Images)
# https://docs.djangoproject.com/en/2.2/howto/static-files/
|
Added Stories Notifications(Enable/Disable)
Changed Parameter Name : disable -> revert | @@ -911,7 +911,7 @@ class UserMixin:
"""
return self.enable_posts_notifications(user_id, True)
- def enable_videos_notifications(self, user_id: str, disable: bool = False) -> bool:
+ def enable_videos_notifications(self, user_id: str, revert: bool = False) -> bool:
"""
Enable videos notifications of a user
@@ -919,7 +919,7 @@ class UserMixin:
----------
user_id: str
Unique identifier of a User
- disable: bool, optional
+ revert: bool, optional
Unfavorite when True
Returns
@@ -930,7 +930,7 @@ class UserMixin:
assert self.user_id, "Login required"
user_id = str(user_id)
data = self.with_action_data({"user_id": user_id, "_uid": self.user_id})
- name = "unfavorite" if disable else "favorite"
+ name = "unfavorite" if revert else "favorite"
result = self.private_request(f"friendships/{name}_for_igtv/{user_id}/", data)
return result["status"] == "ok"
@@ -949,7 +949,7 @@ class UserMixin:
"""
return self.enable_videos_notifications(user_id, True)
- def enable_reels_notifications(self, user_id: str, disable: bool = False) -> bool:
+ def enable_reels_notifications(self, user_id: str, revert: bool = False) -> bool:
"""
Enable reels notifications of a user
@@ -957,7 +957,7 @@ class UserMixin:
----------
user_id: str
Unique identifier of a User
- disable: bool, optional
+ revert: bool, optional
Unfavorite when True
Returns
@@ -968,7 +968,7 @@ class UserMixin:
assert self.user_id, "Login required"
user_id = str(user_id)
data = self.with_action_data({"user_id": user_id, "_uid": self.user_id})
- name = "unfavorite" if disable else "favorite"
+ name = "unfavorite" if revert else "favorite"
result = self.private_request(f"friendships/{name}_for_clips/{user_id}/", data)
return result["status"] == "ok"
@@ -986,3 +986,41 @@ class UserMixin:
A boolean value
"""
return self.enable_reels_notifications(user_id, True)
+
+ def enable_stories_notifications(self, user_id: str, revert: bool = False) -> bool:
+ """
+ Enable stories notifications of a user
+
+ Parameters
+ ----------
+ user_id: str
+ Unique identifier of a User
+ revert: bool, optional
+ Unfavorite when True
+
+ Returns
+ -------
+ bool
+ A boolean value
+ """
+ assert self.user_id, "Login required"
+ user_id = str(user_id)
+ data = self.with_action_data({"user_id": user_id, "_uid": self.user_id})
+ name = "unfavorite" if revert else "favorite"
+ result = self.private_request(f"friendships/{name}_for_stories/{user_id}/", data)
+ return result["status"] == "ok"
+
+ def disable_stories_notifications(self, user_id: str) -> bool:
+ """
+ Disable stories notifications of a user
+
+ Parameters
+ ----------
+ user_id: str
+ Unique identifier of a User
+ Returns
+ -------
+ bool
+ A boolean value
+ """
+ return self.enable_stories_notifications(user_id, True)
|
Fix datastore abnormal display with trove list
According to the bug description, the datastore displays abnormally
when using trove list.
Closes-Bug: | @@ -309,13 +309,13 @@ def _print_instances(instances, is_admin=False):
setattr(instance, 'size', instance.volume['size'])
else:
setattr(instance, 'size', '-')
+ if not hasattr(instance, 'region'):
+ setattr(instance, 'region', '')
if hasattr(instance, 'datastore'):
if instance.datastore.get('version'):
setattr(instance, 'datastore_version',
instance.datastore['version'])
setattr(instance, 'datastore', instance.datastore['type'])
- if not hasattr(instance, 'region'):
- setattr(instance, 'region', '')
fields = ['id', 'name', 'datastore',
'datastore_version', 'status',
'flavor_id', 'size', 'region']
|
Update bytes_modbus_uplink_converter.py
modified bits decoding to have the correct bit order | @@ -117,9 +117,10 @@ class BytesModbusUplinkConverter(ModbusConverter):
decoded = None
- if lower_type == 'bits':
+ if lower_type in ['bit','bits']:
+ decoded_lastbyte= decoder_functions[type_]()
decoded= decoder_functions[type_]()
- decoded += decoder_functions[type_]()
+ decoded+=decoded_lastbyte
elif lower_type == "string":
decoded = decoder_functions[type_](objects_count * 2)
|
Update CONTRIBUTING.md
Revert to existing slackin invite link. | @@ -5,7 +5,7 @@ We always welcome third-party contributions. And we would love you to become an
### Reporting issues
There are several options:
-* Talk to us. You can join our Slack team via this [link](https://devito-slackin.now.sh/). Should you have installation issues, or should you bump into something that appears to be a Devito-related bug, do not hesitate to get in touch. We are always keen to help out.
+* Talk to us. You can join our Slack team via this [link](https://opesci-slackin.now.sh/). Should you have installation issues, or should you bump into something that appears to be a Devito-related bug, do not hesitate to get in touch. We are always keen to help out.
* File an issue on [our GitHub page](https://github.com/devitocodes/devito/issues).
### Making changes
|
Commenting out bug-fix (part 2)
These two lines match the version that was programmed before introducing the bug-fix. | @@ -341,6 +341,9 @@ def activate_CCandACH_trigen(Q_cooling_unmet_W,
E_ACH_req_W = 0.0
Qc_CT_ACH_W = 0.0
+ Qc_from_storage_W = 0.0 # TODO: Remove this section after merging the pull request for this branch
+ Qc_to_storage_W = 0.0
+
# if Qc_from_storage_W > 0.0:
# Qc_storage_correction, Qc_DailyStorage_content_W = \
# daily_storage_class.discharge_storage(Qc_from_storage_W)
|
[commands] Fix typing.Union converters for 3.7
Guido please don't break this | @@ -257,7 +257,12 @@ class Command:
if converter is bool:
return _convert_to_bool(argument)
- if type(converter) is typing._Union:
+ try:
+ origin = converter.__origin__
+ except AttributeError:
+ pass
+ else:
+ if origin is typing.Union:
errors = []
for conv in converter.__args__:
try:
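A small standalone sketch (not discord.py itself) of the version-portable check used above: `typing._Union` no longer exists in Python 3.7, but `Union[...].__origin__ is typing.Union` works on 3.7 and later:

```python
import typing

def is_union(annotation) -> bool:
    try:
        origin = annotation.__origin__
    except AttributeError:
        return False
    return origin is typing.Union

print(is_union(typing.Union[int, str]))  # True
print(is_union(typing.Optional[int]))    # True (Optional[X] is Union[X, None])
print(is_union(int))                     # False
```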
|
Update command-line-tools.rst
Added documentation for channel rename command per PR | @@ -136,6 +136,7 @@ mattermost channel
- `mattermost channel move`_ - Move a channel to another team
- `mattermost channel remove`_ - Remove users from a channel
- `mattermost channel restore`_ - Restore a channel from the archive
+ - `mattermost channel rename`_ - Rename a channel
.. _channel-value-note:
@@ -321,6 +322,23 @@ mattermost channel restore
sudo ./mattermost channel restore 8soyabwthjnf9qibfztje5a36h
sudo ./mattermost channel restore myteam:mychannel
+mattermost channel rename
+~~~~~~~~~~~~~~~~~~~~~~~~
+
+ Description
+ Rename a channel. Channels can be specified by {team}:{channel} using the team and channel names, or by using channel IDs.
+
+ Format
+ .. code-block:: none
+
+ mattermost channel rename {channel} newchannelname --display_name "New Display Name"
+
+ Examples
+ .. code-block:: none
+
+ sudo ./mattermost channel rename 8soyabwthjnf9qibfztje5a36h newchannelname --display_name "New Display Name"
+ sudo ./mattermost channel rename myteam:mychannel newchannelname --display_name "New Display Name"
+
mattermost command
-----------------
|
Removing doi: from anchors for DOIs found in text
This looked odd in URLs that contained DOIs, and in DOIs that
already had a 'doi:' prefix. | @@ -227,7 +227,7 @@ def _doi_sub(match: Match, doi_to_url: Callable[[str], str])->Tuple[Markup, str]
doi_url = f'https://dx.doi.org/{quoted_doi}'
doi_url = doi_to_url(doi_url)
- anchor = escape('doi:'+doi)
+ anchor = escape(doi)
front = match.string[0:match.start()]
return (Markup(f'{front}<a href="{doi_url}">{anchor}</a>'), back)
|
fix: add ignore_errors when waiting for nfs resources to start
This commit adds an ignore_errors when waiting for the NFS resources
to start. | args:
executable: /bin/bash
-- name: Wait for the image registry operator to start its componentes
+- name: Wait for the image registry operator to start its components
ansible.builtin.shell: |
export KUBECONFIG=~/.kube/config
oc get configs.imageregistry.operator.openshift.io cluster
until: iregistry_result.rc == 0
changed_when: "iregistry_result.rc == 0"
when: kubeinit_inventory_cluster_distro == 'okd'
+ ignore_errors: True
args:
executable: /bin/bash
|
Add atol parameter to PauliSum expectation methods
Makes the signatures identical to the corresponding methods in PauliString. | @@ -347,6 +347,7 @@ class PauliSum:
state: np.ndarray,
qubit_map: Mapping[raw_types.Qid, int],
*,
+ atol: float = 1e-7,
check_preconditions: bool = True
) -> float:
"""Evaluate the expectation of this PauliSum given a wavefunction.
@@ -357,6 +358,7 @@ class PauliSum:
state: An array representing a valid wavefunction.
qubit_map: A map from all qubits used in this PauliSum to the
indices of the qubits that `state` is defined over.
+ atol: Absolute numerical tolerance.
check_preconditions: Whether to check that `state` represents a
valid wavefunction.
@@ -387,7 +389,8 @@ class PauliSum:
from cirq.sim.wave_function import validate_normalized_state
validate_normalized_state(state=state,
qid_shape=(2,) * num_qubits,
- dtype=state.dtype)
+ dtype=state.dtype,
+ atol=atol)
return sum(
p._expectation_from_wavefunction_no_validation(state, qubit_map)
for p in self)
@@ -396,6 +399,7 @@ class PauliSum:
state: np.ndarray,
qubit_map: Mapping[raw_types.Qid, int],
*,
+ atol: float = 1e-7,
check_preconditions: bool = True
) -> float:
"""Evaluate the expectation of this PauliSum given a density matrix.
@@ -406,6 +410,7 @@ class PauliSum:
state: An array representing a valid density matrix.
qubit_map: A map from all qubits used in this PauliSum to the
indices of the qubits that `state` is defined over.
+ atol: Absolute numerical tolerance.
check_preconditions: Whether to check that `state` represents a
valid density matrix.
@@ -439,7 +444,8 @@ class PauliSum:
_ = to_valid_density_matrix(density_matrix_rep=state.reshape(
dim, dim),
num_qubits=num_qubits,
- dtype=state.dtype)
+ dtype=state.dtype,
+ atol=atol)
return sum(
p._expectation_from_density_matrix_no_validation(state, qubit_map)
for p in self)
|
Update links in notebooks/README.md
Fixes | @@ -54,9 +54,10 @@ demonstrates ART with TensorFlow v2 using tensorflow.keras without eager executi
or [attack_feature_adversaries_tensorflow_v2.ipynb](attack_feature_adversaries_tensorflow_v2.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/attack_feature_adversaries_tensorflow_v2.ipynb)]
show how to use ART to create feature adversaries ([Sabour et al., 2016](https://arxiv.org/abs/1511.05122)).
-[attack_adversarial_patch.ipynb](adversarial_patch/attack_adversarial_patch.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/attack_adversarial_patch.ipynb)]
+[attack_adversarial_patch.ipynb](adversarial_patch/attack_adversarial_patch.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/adversarial_patch/attack_adversarial_patch.ipynb)]
shows how to use ART to create real-world adversarial patches that fool real-world object detection and classification
models.
+[attack_adversarial_patch_TensorFlowV2.ipynb](adversarial_patch/attack_adversarial_patch.ipynb) [[on nbviewer](https://nbviewer.jupyter.org/github/Trusted-AI/adversarial-robustness-toolbox/blob/main/notebooks/adversarial_patch/attack_adversarial_patch_TensorFlowV2.ipynb)] TensorFlow v2 specific attack implementation.
<p align="center">
<img src="../utils/data/images/adversarial_patch.png?raw=true" width="200" title="adversarial_patch">
|
In MenuPageMixin.get_repeated_menu_item(), always set 'has_children_in_menu' to True (Fixes
Add some comments to the same method to help explain what is going on
Abstract out the logic to identify text to use for a repeated menu item to a new 'get_text_for_repeated_item' method, making it easier to override | @@ -111,6 +111,18 @@ class MenuPageMixin(models.Model):
"""
return menu_instance.page_has_children(self)
+ def get_text_for_repeated_item(
+ self, request=None, current_site=None, original_menu_tag='', **kwargs
+ ):
+ """Return the a string to use as 'text' for this page when it is being
+ included as a 'repeated' menu item in a menu. You might want to
+ override this method if you're creating a multilingual site and you
+ have different translations of 'repeated_item_text' that you wish to
+ surface."""
+ return self.repeated_item_text or getattr(
+ self, app_settings.PAGE_FIELD_FOR_MENU_ITEM_TEXT, self.title
+ )
+
def get_repeated_menu_item(
self, current_page, current_site, apply_active_classes,
original_menu_tag, request=None, use_absolute_page_urls=False,
@@ -119,7 +131,13 @@ class MenuPageMixin(models.Model):
for this specific page."""
menuitem = copy(self)
- setattr(menuitem, 'text', self.repeated_item_text or self.title)
+
+ # Set/reset 'text'
+ menuitem.text = self.get_text_for_repeated_item(
+ request, current_site, original_menu_tag
+ )
+
+ # Set/reset 'href'
if use_absolute_page_urls:
# Try for 'get_full_url' method (added in Wagtail 1.11) or fall
# back to 'full_url' property
@@ -129,11 +147,17 @@ class MenuPageMixin(models.Model):
url = self.full_url
else:
url = self.relative_url(current_site)
- setattr(menuitem, 'href', url)
- active_class = ''
+ menuitem.href = url
+
+ # Set/reset 'active_class'
if apply_active_classes and self == current_page:
- active_class = app_settings.ACTIVE_CLASS
- setattr(menuitem, 'active_class', active_class)
+ menuitem.active_class = app_settings.ACTIVE_CLASS
+ else:
+ menuitem.active_class = ''
+
+ # Set/reset 'has_children_in_menu'
+ menuitem.has_children_in_menu = False
+
return menuitem
|
fix: Switch the position of LIMIT and OFFSET
MariaDB needs LIMIT before OFFSET,
whereas Postgres accepts LIMIT and OFFSET in any order. | @@ -38,7 +38,8 @@ def get_feed(start, page_length):
{match_conditions_comment}
) X
order by X.creation DESC
- OFFSET %(start)s LIMIT %(page_length)s"""
+ LIMIT %(page_length)s
+ OFFSET %(start)s"""
.format(match_conditions_comment = match_conditions_comment,
match_conditions_communication = match_conditions_communication), {
"user": frappe.session.user,
|
DOC: Correct usage example for np.char.decode docstring
The docstring was previously a copy-paste error from `encode` rather than `decode`. | @@ -545,8 +545,8 @@ def _code_dispatcher(a, encoding=None, errors=None):
@array_function_dispatch(_code_dispatcher)
def decode(a, encoding=None, errors=None):
- """
- Calls `str.decode` element-wise.
+ r"""
+ Calls ``bytes.decode`` element-wise.
The set of available codecs comes from the Python standard library,
and may be extended at runtime. For more information, see the
@@ -568,7 +568,7 @@ def decode(a, encoding=None, errors=None):
See Also
--------
- str.decode
+ :py:meth:`bytes.decode`
Notes
-----
@@ -576,13 +576,13 @@ def decode(a, encoding=None, errors=None):
Examples
--------
- >>> c = np.array(['aAaAaA', ' aA ', 'abBABba'])
+ >>> c = np.array([b'\x81\xc1\x81\xc1\x81\xc1', b'@@\x81\xc1@@',
+ ... b'\x81\x82\xc2\xc1\xc2\x82\x81'])
>>> c
+ array([b'\x81\xc1\x81\xc1\x81\xc1', b'@@\x81\xc1@@',
+ ... b'\x81\x82\xc2\xc1\xc2\x82\x81'], dtype='|S7')
+ >>> np.char.decode(c, encoding='cp037')
array(['aAaAaA', ' aA ', 'abBABba'], dtype='<U7')
- >>> np.char.encode(c, encoding='cp037')
- array(['\\x81\\xc1\\x81\\xc1\\x81\\xc1', '@@\\x81\\xc1@@',
- '\\x81\\x82\\xc2\\xc1\\xc2\\x82\\x81'],
- dtype='|S7')
"""
return _to_string_or_unicode_array(
@@ -2237,7 +2237,7 @@ def count(self, sub, start=0, end=None):
def decode(self, encoding=None, errors=None):
"""
- Calls `str.decode` element-wise.
+ Calls ``bytes.decode`` element-wise.
See Also
--------
|
Add path detection to the topological sort
Passing cyclic graphs is a bug, so we just raise and AssertionError | @@ -140,17 +140,23 @@ def toposort(graph):
test_deps = _reduce_deps(graph)
visited = util.OrderedSet()
- def visit(node):
+ def visit(node, path):
+ # We assume an acyclic graph
+ assert node not in path
+
+ path.add(node)
+
# Do a DFS visit of all the adjacent nodes
for adj in test_deps[node]:
if adj not in visited:
- visit(adj)
+ visit(adj, path)
+ path.pop()
visited.add(node)
for r in test_deps.keys():
if r not in visited:
- visit(r)
+ visit(r, util.OrderedSet())
# Index test cases by test name
cases_by_name = {}
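A self-contained sketch of the same DFS idea, simplified to plain sets rather than the project's OrderedSet: a node that reappears on the current DFS path means the input graph is cyclic, which is treated as a caller bug:

```python
def toposort(graph):
    visited, order = set(), []

    def visit(node, path):
        # A node already on the current path means a cycle: fail loudly.
        assert node not in path
        path.add(node)
        for adj in graph[node]:
            if adj not in visited:
                visit(adj, path)
        path.discard(node)
        visited.add(node)
        order.append(node)

    for node in graph:
        if node not in visited:
            visit(node, set())
    return order

print(toposort({"a": ["b"], "b": ["c"], "c": []}))  # ['c', 'b', 'a']
```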
|
Skipped
Palo Alto Networks - Malware Remediation Test,
Fidelis Elevate Network
rsa_packets_and_logs_test
Cloaken-Test
Preempt | }
],
"skipped_tests": {
+ "Palo Alto Networks - Malware Remediation Test": "Issue 20265",
+ "Fidelis Elevate Network": "Issue 20263",
+ "rsa_packets_and_logs_test": "Issue 20262",
+ "Cloaken-Test": "Issue 20036",
"Create Phishing Classifier V2 ML Test": "Issue 20174",
"Extract Indicators From File - Generic v2": "Issue 20143",
"NetWitness Endpoint Test": "Issue 19878",
"_comment": "~~~ INSTANCE ISSUES ~~~",
+ "Preempt": "Issue 20268",
"iDefense": "Issue 20095",
"carbonblack-v2": "Issue 19929",
"Joe Security": "Issue 17996",
|
Add ConstructedCanvas drawing type
see ginga.canvas.types.layer | @@ -77,8 +77,61 @@ class DrawingCanvas(Mixins.UIMixin, DrawingMixin, Canvas):
self.editable = False
+class ConstructedCanvas(DrawingMixin, Canvas):
+ """Constructed canvas from a list of specifications.
+
+ Parameters are specifications of child objects, where each specification
+ is a map specifying 'type' and (optionally) 'args' and 'kwargs'.
+ If present, 'args' is a sequence of arguments and 'kwargs' is a map
+ of keyword arguments to provide to the constructor of the child.
+
+ Example:
+
+ .. code-block: Python
+
+ ConstructedCanvas([dict(type='point', args=(x, y, radius),
+ kwargs=dict(color='red')),
+ dict(type='circle', args=(x, y, radius),
+ kwargs=dict(color='yellow'))])
+
+ This makes a point inside a circle.
+ """
+ def __init__(self, spec_list, **kwdargs):
+ Canvas.__init__(self, **kwdargs)
+ DrawingMixin.__init__(self)
+ self.objects = self.build_objects(spec_list)
+
+ self.kind = 'constructed'
+ self.editable = False
+
+ def build_objects(self, spec_list):
+ return [self.build_object(spec) for spec in spec_list]
+
+ def build_object(self, spec):
+ ctype = spec.get('type', None)
+ if ctype is None:
+ raise ValueError("Item specification needs a 'type' designator: %s" % (
+ str(spec)))
+
+ # TODO: we need to be a subclass of DrawingMixin in order to get
+ # access to the get_draw_class() method. Otherwise we could just
+ # be a subclass of CompoundObject. See if this can be fixed.
+ draw_class = self.get_draw_class(ctype)
+
+ args = spec.get('args', [])
+ kwargs = spec.get('kwargs', {})
+
+ if isinstance(draw_class, CompoundObject):
+ # special case for compound objects: need to have actual objects
+ # in constructor args, not specifications
+ args = self.build_objects(args)
+
+ return draw_class(*args, **kwargs)
+
+
catalog = dict(compoundobject=CompoundObject, canvas=Canvas,
- drawingcanvas=DrawingCanvas)
+ drawingcanvas=DrawingCanvas,
+ constructedcanvas=ConstructedCanvas)
register_canvas_types(catalog)
# END
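The `build_object` step is a registry-driven factory; a minimal standalone sketch of that pattern, using placeholder classes instead of ginga's real draw classes:

```python
# Sketch of spec-driven construction: each spec is a dict with 'type'
# plus optional 'args'/'kwargs', resolved against a class registry.
class Point:
    def __init__(self, x, y, radius, color='black'):
        self.x, self.y, self.radius, self.color = x, y, radius, color

class Circle:
    def __init__(self, x, y, radius, color='black'):
        self.x, self.y, self.radius, self.color = x, y, radius, color

REGISTRY = {'point': Point, 'circle': Circle}

def build_object(spec):
    ctype = spec.get('type')
    if ctype is None:
        raise ValueError("spec needs a 'type' key: %r" % (spec,))
    cls = REGISTRY[ctype]
    return cls(*spec.get('args', ()), **spec.get('kwargs', {}))

objs = [build_object(s) for s in [
    dict(type='point', args=(10, 20, 2), kwargs=dict(color='red')),
    dict(type='circle', args=(10, 20, 5), kwargs=dict(color='yellow')),
]]
```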
|
Update requirements-docs.in
pinning sqlalchemy | @@ -29,6 +29,7 @@ inflection
josepy
logmatic-python
marshmallow-sqlalchemy == 0.23.1 #related to the marshmallow issue (to avoid conflicts, as newer versions require marshmallow>=3.0.0)
+sqlalchemy < 1.4.0 # ImportError: cannot import name '_ColumnEntity' https://github.com/sqlalchemy/sqlalchemy/issues/6226
marshmallow<2.20.5 #schema duplicate issues https://github.com/marshmallow-code/marshmallow-sqlalchemy/issues/121
paramiko # required for the SFTP destination plugin
pem
|
Update exercism-swift hashes
The project configuration for 4.2, 5.0 have been updated to match the same hash as 5.1. Associated xfails have also been removed. | "compatibility": [
{
"version": "4.2",
- "commit": "38a17de8717a2282fd4ff62cd1ce732b926bf4ab"
+ "commit": "3df5e4ab83a9ab47228a46da7263e09a2a2b0b90"
},
{
"version": "5.0",
- "commit": "38a17de8717a2282fd4ff62cd1ce732b926bf4ab"
+ "commit": "3df5e4ab83a9ab47228a46da7263e09a2a2b0b90"
},
{
"version": "5.1",
{
"action": "BuildSwiftPackage",
"configuration": "release",
- "tags": "sourcekit",
- "xfail": {
- "compatibility": {
- "4.2": {
- "branch": {
- "master": "https://bugs.swift.org/browse/SR-8307",
- "swift-5.0-branch": "https://bugs.swift.org/browse/SR-8307",
- "swift-5.1-branch": "https://bugs.swift.org/browse/SR-8307"
- }
- },
- "5.0": {
- "branch": {
- "master": "https://bugs.swift.org/browse/SR-8307",
- "swift-5.0-branch": "https://bugs.swift.org/browse/SR-8307",
- "swift-5.1-branch": "https://bugs.swift.org/browse/SR-8307"
- }
- }
- }
- }
+ "tags": "sourcekit"
},
{
"action": "TestSwiftPackage"
|
Update make-osd-partitions.yml
Fix misspelled variable name: deviecs -> devices
journal_typecode: 45b0969e-9b03-4f30-b4c6-b4b80ceff106
data_typecode: 4fbd7e29-9d25-41b8-afd0-062c0ceff05d
- deviecs: []
+ devices: []
hosts:
- "{{ osd_group_name }}"
|
Strip leading and trailing spaces from user's inputs
The user input must be sanitized because it is used in subsequent
queries.
if var == '':
return default_value
else:
- return var
+ return var.strip()
def get_regions():
regions = boto.ec2.regions()
|
libmanage.py: isolate script formatting logic from do_setenv
TN: | @@ -783,36 +783,25 @@ class ManageScript(object):
shutil.copyfile(build_path, install_path)
- def do_setenv(self, args, output_file=sys.stdout):
+ def do_setenv(self, args):
"""
Unless --json is passed, display Bourne shell commands that setup
- environment in order to make libadalang available. Otherwise, return a
- JSON document that describe this environment.
+ environment in order to make the generated library available.
+ Otherwise, return a JSON document that describe this environment.
:param argparse.Namespace args: The arguments parsed from the command
line invocation of manage.py.
- :param file output_file: File to which this should write the shell
- commands.
"""
- env_dict = {}
-
- def add_path(name, path):
- output_file.write(
- '{name}={path}"{sep}${name}"; export {name}\n'.format(
- name=name, path=pipes.quote(path),
- # On Cygwin, PATH keeps the Unix syntax instead of using
- # the Window path separator.
- sep=':' if name == 'PATH' else os.path.pathsep,
- )
- )
+ if args.json:
+ result = {}
def add_json(name, path):
- env_dict[name] = path
+ result[name] = path
- self.setup_environment(add_json if args.json else add_path)
-
- if json:
- output_file.write(json.dumps(env_dict))
+ self.setup_environment(add_json)
+ print json.dumps(result)
+ else:
+ self.write_setenv()
def do_help(self, args):
"""
@@ -853,6 +842,25 @@ class ManageScript(object):
self.setup_environment(add_path)
return env
+ def write_setenv(self, output_file):
+ """
+ Display Bourne shell commands that setup environment in order to make
+ the generated library available.
+
+ :param file output_file: File to which this should write the shell
+ commands.
+ """
+ def add_path(name, path):
+ output_file.write(
+ '{name}={path}"{sep}${name}"; export {name}\n'.format(
+ name=name, path=pipes.quote(path),
+ # On Cygwin, PATH keeps the Unix syntax instead of using
+ # the Window path separator.
+ sep=':' if name == 'PATH' else os.path.pathsep,
+ )
+ )
+ self.setup_environment(add_path)
+
def check_call(self, args, name, argv, env=None):
"""
Log and run a command with a derived environment.
|
Increase chunk size for fetch_maven_artifacts downloads
By default the requests module uses 1-byte chunks when iterating
over streamed content. This is optimal for responsive use
cases, not so much for downloading large files. The chunk size
is now set to 10 MB to improve throughput.
import requests
from atomic_reactor import util
+from atomic_reactor.constants import DEFAULT_DOWNLOAD_BLOCK_SIZE
from atomic_reactor.koji_util import create_koji_session
from atomic_reactor.plugin import PreBuildPlugin
from collections import namedtuple
@@ -206,7 +207,7 @@ class FetchMavenArtifactsPlugin(PreBuildPlugin):
request = requests.get(download.url, stream=True)
request.raise_for_status()
with open(dest_path, 'wb') as f:
- for chunk in request.iter_content():
+ for chunk in request.iter_content(chunk_size=DEFAULT_DOWNLOAD_BLOCK_SIZE):
f.write(chunk)
for checksum in checksums.values():
checksum.update(chunk)
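As a hedged illustration of the change: `iter_content()` with no argument yields 1-byte chunks, while passing a block size reduces per-chunk overhead. The 10 MB value for the imported constant is assumed from the commit message, and the URL and destination path are placeholders:

```python
import requests

BLOCK_SIZE = 10 * 1024 * 1024  # assumed value of DEFAULT_DOWNLOAD_BLOCK_SIZE

def download(url, dest_path):
    # Stream the response and write it in large chunks instead of
    # the 1-byte default used when chunk_size is omitted.
    response = requests.get(url, stream=True)
    response.raise_for_status()
    with open(dest_path, 'wb') as f:
        for chunk in response.iter_content(chunk_size=BLOCK_SIZE):
            f.write(chunk)
```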
|
Make sure pvc and pod names are unique.
Avoids AlreadyExists exceptions. | @@ -73,7 +73,10 @@ class MillionFilesOnCephfs(object):
with open(constants.CSI_CEPHFS_POD_YAML, "r") as pod_fd:
pod_info = yaml.safe_load(pod_fd)
pvc_name = pod_info["spec"]["volumes"][0]["persistentVolumeClaim"]["claimName"]
- self.pod_name = pod_info["metadata"]["name"]
+ # Make sure the pvc and pod names are unique, so AlreadyExists
+ # exceptions are not thrown.
+ pvc_name += str(uuid.uuid4())
+ self.pod_name = pod_info["metadata"]["name"] + str(uuid.uuid4())
config.RUN["cli_params"]["teardown"] = True
self.cephfs_pvc = helpers.create_pvc(
sc_name=constants.DEFAULT_STORAGECLASS_CEPHFS,
|
US-NW-AVRN added to exceptions
Although AVRN has a gas plant, it is not always running, so I think it should be added to the fossil fuel exceptions.
Wind production is sometimes reported at around -12.
'US-CAR-YAD','US-NW-SCL','US-NW-CHPD',
'US-NW-WWA','US-NW-GCPD','US-NW-TPWR',
'US-NW-WAUW','US-SE-SEPA','US-NW-GWA',
- 'US-NW-DOPD'])):
+ 'US-NW-DOPD', 'US-NW-AVRN'])):
raise ValidationError(
"Coal, gas or oil or unknown production value is required for"
" %s" % zone_key)
|
Fix systray icon_size bug
If an icon has min_width/min_height hints, the icon size is not increased
even if the user specifies an icon size greater than these hints.
Fixes | @@ -55,19 +55,15 @@ class Icon(window._Window):
icon_size = self.systray.icon_size
self.update_hints()
- try:
- width = self.hints["min_width"]
- height = self.hints["min_height"]
- except KeyError:
- width = icon_size
- height = icon_size
+ width = self.hints.get("min_width", icon_size)
+ height = self.hints.get("min_height", icon_size)
+
+ width = max(width, icon_size)
+ height = max(height, icon_size)
if height > icon_size:
width = width * icon_size // height
height = icon_size
- if height <= 0:
- width = icon_size
- height = icon_size
self.width = width
self.height = height
|
subs: Use e.key instead of deprecated e.which.
Tested by making sure the "Filter stream" search box filters streams
on user input (when Enter is not pressed) in the Streams modal. | @@ -639,7 +639,7 @@ export function setup_page(callback) {
// streams, either explicitly via user_can_create_streams, or
// implicitly because page_params.realm_is_zephyr_mirror_realm.
$("#stream_filter input[type='text']").on("keypress", (e) => {
- if (e.which !== 13) {
+ if (e.key !== "Enter") {
return;
}
|
Docs: Update dataframe-indexing.rst
This PR updates `dataframe-indexing.rst` with a few naming convention corrections (only minor stuff)
=============================
-Dask DataFrame supports some of pandas' indexing behavior.
+Dask DataFrame supports some of Pandas' indexing behavior.
.. currentmodule:: dask.dataframe
@@ -15,14 +15,14 @@ Dask DataFrame supports some of pandas' indexing behavior.
Label-based Indexing
--------------------
-Just like pandas, Dask DataFrame supports label-based indexing with the ``.loc``
+Just like Pandas, Dask DataFrame supports label-based indexing with the ``.loc``
accessor for selecting rows or columns, and ``__getitem__`` (square brackets)
for selecting just columns.
.. note::
To select rows, the DataFrame's divisions must be known (see
- :ref:`dataframe.design` and :ref:`dataframe.performance`) for more.
+ :ref:`dataframe.design` and :ref:`dataframe.performance` for more information.)
.. code-block:: python
@@ -76,7 +76,7 @@ Slicing rows and (optionally) columns with ``.loc``:
c ...
Dask Name: loc, 2 tasks
-Dask DataFrame supports pandas' `partial-string indexing <https://pandas.pydata.org/pandas-docs/stable/timeseries.html#partial-string-indexing>`_:
+Dask DataFrame supports Pandas' `partial-string indexing <https://pandas.pydata.org/pandas-docs/stable/timeseries.html#partial-string-indexing>`_:
.. code-block:: python
|
Speed up the code by using random() instead of uniform(), as it's
almost ten times faster | @@ -476,7 +476,8 @@ class MultiEllipsoid:
else:
# If `q` is not being returned, assume the user wants this
# done internally so we repeat the loop if needed
- if q == 1 or rstate.uniform() < (1. / q):
+ # random is faster than uniform
+ if q == 1 or rstate.random() < (1. / q):
return x, idx
def samples(self, nsamples, rstate=None):
@@ -701,7 +702,8 @@ class RadFriends:
idx = rstate.integers(nctrs)
x = ctrs[idx] + dx
q = self.overlap(x, ctrs)
- if q == 1 or return_q or rstate.uniform() < (1. / q):
+ # random is faster than uniform
+ if q == 1 or return_q or rstate.random() < (1. / q):
if return_q:
return x, q
else:
@@ -967,7 +969,8 @@ class SupFriends:
# Check how many cubes the point lies within, passing over
# the `idx`-th cube `x` was sampled from.
q = self.overlap(x, ctrs)
- if q == 1 or return_q or rstate.uniform() < (1. / q):
+ # random() is faster than uniform()
+ if q == 1 or return_q or rstate.random() < (1. / q):
if return_q:
return x, q
else:
@@ -1165,7 +1168,10 @@ def randsphere(n, rstate=None):
"""Draw a point uniformly within an `n`-dimensional unit sphere."""
z = rstate.standard_normal(size=n) # initial n-dim vector
- xhat = z * (rstate.uniform()**(1. / n) / lalg.norm(z, check_finite=False)
+ # notice I use random () instead of uniform
+ # and standard_norm instead of normal as those are faster
+ # as this is a time-critical function
+ xhat = z * (rstate.random()**(1. / n) / lalg.norm(z, check_finite=False)
) # scale
return xhat
@@ -1176,7 +1182,8 @@ def rand_choice(pb, rstate):
The pb must sum to 1
"""
p1 = np.cumsum(pb)
- xr = rstate.uniform()
+ # random is faster than uniform
+ xr = rstate.random()
return min(np.searchsorted(p1, xr), len(pb) - 1)
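A quick way to sanity-check the speed claim with NumPy's `Generator` (the measured ratio depends on the NumPy version and machine, so treat the "almost ten times" figure as indicative only):

```python
import timeit
import numpy as np

rng = np.random.default_rng(0)

t_uniform = timeit.timeit(rng.uniform, number=1_000_000)
t_random = timeit.timeit(rng.random, number=1_000_000)
print(f"uniform(): {t_uniform:.3f}s, random(): {t_random:.3f}s")
```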
|
Completed test coverage of utils.py
TRAC#7301 | @@ -31,6 +31,7 @@ def test_interval_map_indexing_and_length():
with pytest.raises(IndexError):
im[1 * 10**32 + 20]
+
def test_interval_map_get_offset():
im = IntervalMap()
@@ -54,3 +55,26 @@ def test_interval_map_get_offset():
with pytest.raises(TypeError):
im.get_offset(-1.0)
+
+
+def test_invalid_types():
+ im = IntervalMap()
+
+ im.append_interval(10, "a")
+ im.append_interval(5, "b")
+ im.append_interval(3, "c")
+
+ with pytest.raises(TypeError):
+ im["wrong index"]
+ with pytest.raises(IndexError):
+ im[-1]
+ with pytest.raises(IndexError):
+ im[1000000]
+
+ with pytest.raises(TypeError):
+ im.get_offset("wrong index")
+ with pytest.raises(IndexError):
+ im.get_offset(-1)
+ with pytest.raises(IndexError):
+ im.get_offset(1000000)
+
|
[refactor] Switch to f-strings
Update astropy/io/votable/connect.py | @@ -101,11 +101,10 @@ def read_table_votable(
if len(tables) > 1:
if table_id is None:
raise ValueError(
- "Multiple tables found: table id should be set via "
- "the table_id= argument. The available tables are {}, "
- "or integers less than {}.".format(
- ", ".join(table_id_mapping.keys()), len(tables)
- )
+ "Multiple tables found: table id should be set via the table_id="
+ " argument. The available tables are"
+ f" {', '.join(table_id_mapping)}, or integers less than"
+ f" {len(tables)}."
)
elif isinstance(table_id, str):
if table_id in table_id_mapping:
@@ -117,9 +116,8 @@ def read_table_votable(
table = tables[table_id]
else:
raise IndexError(
- "Table index {} is out of range. {} tables found".format(
- table_id, len(tables)
- )
+ f"Table index {table_id} is out of range. {len(tables)} tables"
+ " found"
)
elif len(tables) == 1:
table = tables[0]
@@ -164,9 +162,7 @@ def write_table_votable(
if unsupported_cols:
unsupported_names = [col.info.name for col in unsupported_cols]
raise ValueError(
- "cannot write table with mixin column(s) {} to VOTable".format(
- unsupported_names
- )
+ f"cannot write table with mixin column(s) {unsupported_names} to VOTable"
)
# Check if output file already exists
|
Update changelog/5040.bugfix.rst
Better end-user text | Change dependency Networkx from featurizers ``setup.py`` and ``requirements.txt``.
There is an imcompatibility between Rasa dependecy requests 2.22.0 and the own depedency from Rasa for networkx raising errors upon pip install. There is also a bug corrected in ``requirements.txt`` which used ``~=`` instead of ``==``. All of these are fixed using networkx 2.4.0.
-
-To make the ``visualization.py`` continue working, graph now uses ``nodes`` instead of ``node``.
|
parent: move subprocess creation to mux thread too
Now connect() really is a pure blocking wrapper. | @@ -1567,6 +1567,19 @@ class Connection(object):
mitogen.core.listen(self._router.broker, 'shutdown',
self._on_broker_shutdown)
self._start_timer()
+
+ try:
+ self.proc = self.start_child()
+ except Exception:
+ self._fail_connection(sys.exc_info()[1])
+ return
+
+ LOG.debug('child for %r started: pid:%r stdin:%r stdout:%r stderr:%r',
+ self, self.proc.pid,
+ self.proc.stdin.fileno(),
+ self.proc.stdout.fileno(),
+ self.proc.stderr and self.proc.stderr.fileno())
+
self.stdio_stream = self._setup_stdio_stream()
if self.context.name is None:
self.context.name = self.stdio_stream.name
@@ -1576,13 +1589,6 @@ class Connection(object):
def connect(self, context):
self.context = context
- self.proc = self.start_child()
- LOG.debug('%r.connect(): pid:%r stdin:%r stdout:%r stderr:%r',
- self, self.proc.pid,
- self.proc.stdin.fileno(),
- self.proc.stdout.fileno(),
- self.proc.stderr and self.proc.stderr.fileno())
-
self.latch = mitogen.core.Latch()
self._router.broker.defer(self._async_connect)
self.latch.get()
|
Update core/dbt/deprecations.py
Include the actual docs link! | @@ -71,7 +71,7 @@ class MaterializationReturnDeprecation(DBTDeprecation):
added, but this behavior will be removed in a future version of dbt.
For more information, see:
- --- TODO: docs link here ---
+ https://docs.getdbt.com/v0.15/docs/creating-new-materializations#section-6-returning-relations
'''.lstrip()
|
dashboard: add missing parameter in `ceph_cmd`
The `ceph_cmd` fact is missing the `--net=host` parameter.
Some tasks consuming this fact can fail like following:
```
Error: error configuring network namespace for container Missing CNI default network
```
Closes: | - name: set_fact container_run_cmd
set_fact:
- ceph_cmd: "{{ hostvars[groups[mon_group_name][0]]['container_binary'] + ' run --interactive --rm -v /etc/ceph:/etc/ceph:z --entrypoint=ceph ' + ceph_docker_registry + '/' + ceph_docker_image + ':' + ceph_docker_image_tag if containerized_deployment | bool else 'ceph' }}"
+ ceph_cmd: "{{ hostvars[groups[mon_group_name][0]]['container_binary'] + ' run --interactive --net=host --rm -v /etc/ceph:/etc/ceph:z --entrypoint=ceph ' + ceph_docker_registry + '/' + ceph_docker_image + ':' + ceph_docker_image_tag if containerized_deployment | bool else 'ceph' }}"
- name: disable SSL for dashboard
when: dashboard_protocol == "http"
|
[ci/release] Disable infra retries for now
Infra errors are tackled with concurrency groups. Thus we can disable old mitigation methods like automatic infra retry for now.
We keep the script as it does other logic (e.g. checkout local test branch) and infra retry can be enabled via env variable if needed. | @@ -50,7 +50,7 @@ if [ -z "${NO_INSTALL}" ]; then
fi
RETRY_NUM=0
-MAX_RETRIES=${MAX_RETRIES-3}
+MAX_RETRIES=${MAX_RETRIES-1}
if [ "${BUILDKITE_RETRY_COUNT-0}" -ge 1 ]; then
echo "This is a manually triggered retry from the Buildkite web UI, so we set the number of infra retries to 1."
@@ -108,7 +108,7 @@ if [ -z "${NO_ARTIFACTS}" ]; then
fi
echo "----------------------------------------"
-echo "release test finished with final exit code ${EXIT_CODE} after ${RETRY_NUM}/${MAX_RETRIES} tries"
+echo "Release test finished with final exit code ${EXIT_CODE} after ${RETRY_NUM}/${MAX_RETRIES} tries"
echo "Run results:"
COUNTER=1
|
fix(global): not using correct preset tags
fixing pixelAspect to be applied to letter box too
inst_data = instance.data
fps = inst_data.get("fps")
start_frame = inst_data.get("frameStart")
-
+ pixel_aspect = instance.data["pixelAspect"]
self.log.debug("Families In: `{}`".format(instance.data["families"]))
# get representation and loop them
@@ -147,13 +147,16 @@ class ExtractReview(pyblish.api.InstancePlugin):
)
output_args = []
- output_args.extend(profile.get('codec', []))
+ codec_args = profile.get('codec', [])
+ output_args.extend(codec_args)
# preset's output data
output_args.extend(profile.get('output', []))
# letter_box
lb = profile.get('letter_box', None)
if lb:
+ if "reformat" not in p_tags:
+ lb /= pixel_aspect
output_args.append(
"-filter:v drawbox=0:0:iw:round((ih-(iw*(1/{0})))/2):t=fill:c=black,drawbox=0:ih-round((ih-(iw*(1/{0})))/2):iw:round((ih-(iw*(1/{0})))/2):t=fill:c=black".format(lb))
@@ -165,8 +168,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
# scaling none square pixels and 1920 width
# scale=320:-2 # to auto count height with output to be multiple of 2
- if "reformat" in tags:
- pixel_aspect = instance.data["pixelAspect"]
+ if "reformat" in p_tags:
scaling_arg = "scale=1920:'ceil((1920/{})/2)*2':flags=lanczos,setsar=1".format(
pixel_aspect)
vf_back = self.add_video_filter_args(
@@ -176,7 +178,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
# baking lut file application
lut_path = instance.data.get("lutPath")
- if lut_path and ("bake-lut" in tags):
+ if lut_path and ("bake-lut" in p_tags):
# removing Gama info as it is all baked in lut
gamma = next((g for g in input_args
if "-gamma" in g), None)
@@ -220,7 +222,7 @@ class ExtractReview(pyblish.api.InstancePlugin):
'files': repr_file,
"tags": new_tags,
"outputName": name,
- "codec": profile.get('codec', [])
+ "codec": codec_args
})
if repre_new.get('preview'):
repre_new.pop("preview")
|
Update mapsSync.py
Updated disclaimer. | @@ -133,7 +133,7 @@ def get_mapsSync(files_found, report_folder, seeker):
if usageentries > 0:
- description = 'MapSync values. Medium confidence. Locations and searches from other linked devices might show up here. Travel may or may not have happened.'
+ description = 'Disclaimer: Entries should be corroborated. Locations and searches from other linked devices might show up here. Travel should be confirmed. Medium confidence.'
report = ArtifactHtmlReport('MapsSync')
report.start_artifact_report(report_folder, 'MapsSync', description)
report.add_script()
|
llvm/codegen: Opencode 'run' for loop.
This will be extended with processing of termination conditions. | @@ -1080,12 +1080,37 @@ def gen_composition_run(ctx, composition, *, tags:frozenset):
builder.store(cond_init, cond)
runs = builder.load(runs_ptr, "runs")
- with helpers.for_loop_zero_inc(builder, runs, "run_loop") as (b, iters):
+ iters_ptr = builder.alloca(runs.type)
+ builder.store(iters_ptr.type.pointee(0), iters_ptr)
+
+ # Start the main loop structure
+ loop_condition = builder.append_basic_block(name="run_loop_condition")
+ builder.branch(loop_condition)
+
+ # Generate a while not 'end condition' loop
+ builder.position_at_end(loop_condition)
+
+ # Iter cond
+ iters = builder.load(iters_ptr)
+ iter_cond = builder.icmp_unsigned("<", iters, runs)
+
+ # Increment. Use new name to not taint 'iters'
+ new_iters = builder.add(iters, iters.type(1))
+ builder.store(new_iters, iters_ptr)
+
+ loop_body = builder.append_basic_block(name="run_loop_body")
+ exit_block = builder.append_basic_block(name="run_exit")
+
+ builder.cbranch(iter_cond, loop_body, exit_block)
+
+ # Generate loop body
+ builder.position_at_end(loop_body)
+
# Get the right input stimulus
- input_idx = b.urem(iters, b.load(inputs_ptr))
- data_in_ptr = b.gep(data_in, [input_idx])
+ input_idx = builder.urem(iters, builder.load(inputs_ptr))
+ data_in_ptr = builder.gep(data_in, [input_idx])
- # Reset internal clocks of each node
+ # Reset internal 'RUN' clocks of each node
for idx, node in enumerate(composition._all_nodes):
node_state = builder.gep(state, [ctx.int32_ty(0), ctx.int32_ty(0), ctx.int32_ty(idx)])
num_executions_ptr = helpers.get_state_ptr(builder, node, node_state, "num_executions")
@@ -1095,17 +1120,21 @@ def gen_composition_run(ctx, composition, *, tags:frozenset):
# Call execution
exec_tags = tags.difference({"run"})
exec_f = ctx.import_llvm_function(composition, tags=exec_tags)
- b.call(exec_f, [state, params, data_in_ptr, data, cond])
+ builder.call(exec_f, [state, params, data_in_ptr, data, cond])
if not simulation:
# Extract output_CIM result
idx = composition._get_node_index(composition.output_CIM)
- result_ptr = b.gep(data, [ctx.int32_ty(0), ctx.int32_ty(0),
+ result_ptr = builder.gep(data, [ctx.int32_ty(0), ctx.int32_ty(0),
ctx.int32_ty(idx)])
- output_ptr = b.gep(data_out, [iters])
- result = b.load(result_ptr)
- b.store(result, output_ptr)
+ output_ptr = builder.gep(data_out, [iters])
+ result = builder.load(result_ptr)
+ builder.store(result, output_ptr)
+
+ builder.branch(loop_condition)
+ # Exit
+ builder.position_at_end(exit_block)
builder.ret_void()
return llvm_func
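The same open-coded while-loop shape, shown as a standalone llvmlite sketch without the project's context/helper objects; the function and block names are illustrative only:

```python
import llvmlite.ir as ir

i32 = ir.IntType(32)
module = ir.Module(name="loop_demo")
func = ir.Function(module, ir.FunctionType(ir.VoidType(), [i32]), name="run_loop")
runs, = func.args

entry = func.append_basic_block("entry")
loop_condition = func.append_basic_block("run_loop_condition")
loop_body = func.append_basic_block("run_loop_body")
exit_block = func.append_basic_block("run_exit")

builder = ir.IRBuilder(entry)
iters_ptr = builder.alloca(i32, name="iters")
builder.store(i32(0), iters_ptr)
builder.branch(loop_condition)

# Condition block: compare, increment, and branch to body or exit.
builder.position_at_end(loop_condition)
iters = builder.load(iters_ptr)
iter_cond = builder.icmp_unsigned("<", iters, runs)
builder.store(builder.add(iters, i32(1)), iters_ptr)
builder.cbranch(iter_cond, loop_body, exit_block)

# Body block: per-iteration work would go here, then jump back.
builder.position_at_end(loop_body)
builder.branch(loop_condition)

builder.position_at_end(exit_block)
builder.ret_void()

print(module)  # emits the textual LLVM IR
```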
|
[cleanup] Remove deprecated twntranslate method
This method was deprecated about six years ago. Also cleaning up a couple bits
of code that were commented as being for it. | @@ -28,7 +28,6 @@ from collections.abc import Mapping
from contextlib import suppress
from textwrap import fill
from typing import Optional, Union
-from warnings import warn
import pywikibot
from pywikibot import __url__, config
@@ -36,7 +35,6 @@ from pywikibot.backports import List, cache
from pywikibot.plural import plural_rule
from pywikibot.tools import (
ModuleDeprecationWrapper,
- deprecated,
deprecated_args,
issue_deprecation_warning,
)
@@ -710,21 +708,9 @@ def twtranslate(source,
'See {}/i18n'
.format(_messages_package_name, twtitle, __url__))
- source_needed = False
- # If a site is given instead of a lang, use its language
- if hasattr(source, 'lang'):
- lang = source.lang
- # check whether we need the language code back
- elif isinstance(source, list):
- # For backwards compatibility still support lists, when twntranslate
- # was not deprecated and needed a way to get the used language code
- # back.
- warn('The source argument should not be a list but either a BaseSite '
- 'or a str/unicode.', DeprecationWarning, 2)
- lang = source.pop()
- source_needed = True
- else:
- lang = source
+ # if source is a site then use its lang attribute, otherwise it's a str
+
+ lang = getattr(source, 'lang', source)
# There are two possible failure modes: the translation dict might not have
# the language altogether, or a specific key could be untranslated. Both
@@ -743,9 +729,6 @@ def twtranslate(source,
'outdated submodule. See {}/i18n'
.format('English' if 'en' in langs else "'{}'".format(lang),
twtitle, __url__)))
- # send the language code back via the given mutable list parameter
- if source_needed:
- source.append(alt)
if '{{PLURAL:' in trans:
# _extract_plural supports in theory non-mappings, but they are
@@ -754,16 +737,6 @@ def twtranslate(source,
raise TypeError('parameters must be a mapping.')
trans = _extract_plural(alt, trans, parameters)
- # this is only the case when called in twntranslate, and that didn't apply
- # parameters when it wasn't a dict
- if isinstance(parameters, _PluralMappingAlias):
- # This is called due to the old twntranslate function which ignored
- # KeyError. Instead only_plural should be used.
- if isinstance(parameters.source, dict):
- with suppress(KeyError):
- trans %= parameters.source
- parameters = None
-
if parameters is not None and not isinstance(parameters, Mapping):
raise ValueError('parameters should be a mapping, not {}'
.format(type(parameters).__name__))
@@ -773,16 +746,6 @@ def twtranslate(source,
return trans
-@deprecated('twtranslate', since='20151009', future_warning=True)
-@deprecated_args(code='source')
-def twntranslate(source, twtitle: str,
- parameters: Optional[Mapping] = None) -> Optional[str]:
- """DEPRECATED: Get translated string for the key."""
- if parameters is not None:
- parameters = _PluralMappingAlias(parameters)
- return twtranslate(source, twtitle, parameters)
-
-
@deprecated_args(code='source')
def twhas_key(source, twtitle: str) -> bool:
"""
|
More flexible opref comparison
If any of pkg type, name, or version is not specified, ignore it in
the comparison. | @@ -70,8 +70,12 @@ def _opref_is_op_run(opref, run):
return False
else:
return (
- run_opref.pkg_type == opref.pkg_type and
- run_opref.pkg_name == opref.pkg_name and
+ (opref.pkg_type is None or
+ run_opref.pkg_type == opref.pkg_type) and
+ (opref.pkg_name is None or
+ run_opref.pkg_name == opref.pkg_name) and
+ (opref.pkg_version is None or
+ run_opref.pkg_version == opref.pkg_version) and
run_opref.model_name == opref.model_name and
run_opref.op_name == opref.op_name)
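A compact illustration of the "None means match anything" pattern used above; this is a standalone sketch, not Guild's actual API:

```python
def field_matches(pattern, value):
    # A None pattern field matches any value.
    return pattern is None or pattern == value

def opref_matches(opref, run_opref):
    optional = ("pkg_type", "pkg_name", "pkg_version")
    required = ("model_name", "op_name")
    return (
        all(field_matches(getattr(opref, f), getattr(run_opref, f)) for f in optional)
        and all(getattr(opref, f) == getattr(run_opref, f) for f in required)
    )
```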
|
mlanunch: Make 8th+ replica set members non-voting
Replica sets can have up to 7 voting members[1].
This change allows spinning up replica sets with more than 7 members by forcing
zero votes starting from the 8th node.
[1] | @@ -1512,6 +1512,10 @@ class MLaunchTool(BaseCmdLineTool):
if i == 0 and self.args['priority']:
member_config['priority'] = 10
+ if i >= 7:
+ member_config['votes'] = 0
+ member_config['priority'] = 0
+
self.config_docs[name]['members'].append(member_config)
# launch arbiter if True
|
[docs] fix partition_set page
Summary: I either missed this or it got mixed up in the docs migration
Test Plan: eyes
Reviewers: sashank, nate, prha | @@ -36,16 +36,6 @@ file:/dagster_examples/stocks/simple_partitions.py
lines:13-16
```
-Finally, using these two functions, we create a <PyObject module="dagster" object="PartitionSetDefinition" /> named
-`stock_data_partitions_set`. We can register these
-partitions decorating a function that returns a list of partition set
-definitions with the <PyObject module="dagster" object="repository_partitions" displayText="@repository_partitions" />
-
-```python literalinclude caption=repository.py
-file:/dagster_examples/stocks/simple_partitions.py
-lines:19-29
-```
-
Finally, we add the partitions to our repository definition:
```python literalinclude emphasize-lines=6
|
GDB helpers: avoid overflow in token text decoding
In order to avoid overflow, switch source buffer bounds computations
from inferior's integers to native Python integers. This fixes the
handling of empty ranges (i.e. superflat arrays, e.g. (1 .. 0 => <>)).
TN: | @@ -74,8 +74,8 @@ class Token(object):
# Fetch the fat pointer, the bounds and then go subscript the
# underlying array ourselves.
src_buffer = self.tdh.value['source_buffer']
- first = self.value['source_first']
- last = self.value['source_last']
+ first = int(self.value['source_first'])
+ last = int(self.value['source_last'])
length = last - first + 1
if length <= 0:
|
ebuild/profiles: add 'use' attribute for profile objects
Combines the valid USE and USE_EXPAND settings for the profile. | @@ -520,10 +520,23 @@ class ProfileStack(object):
@property
def use_expand(self):
+ """USE_EXPAND variables defined by the profile."""
if "USE_EXPAND" in const.incrementals:
return frozenset(self.default_env.get("USE_EXPAND", ()))
return frozenset(self.default_env.get("USE_EXPAND", "").split())
+ @klass.jit_attr
+ def use(self):
+ """USE flag settings for the profile."""
+ use = list(self.default_env.get('USE', ()))
+ for u in self.use_expand:
+ value = self.default_env.get(u)
+ if value is None:
+ continue
+ u2 = u.lower() + '_'
+ use.extend(u2 + x for x in value.split())
+ return tuple(use)
+
@property
def use_expand_hidden(self):
if "USE_EXPAND_HIDDEN" in const.incrementals:
|
[isolate] Fix inversion in the fallback query
The unchangeable law of maths:
X - i > X - (i+1) | @@ -280,9 +280,9 @@ class CronCleanupExpiredHandler(webapp2.RequestHandler):
# It was observed that limiting the range on both sides helps with the
# chances of the query succeeding, instead of raising a Timeout.
q = model.ContentEntry.query(
- model.ContentEntry.expiration_ts >=
- now - datetime.timedelta(days=i),
model.ContentEntry.expiration_ts <
+ now - datetime.timedelta(days=i),
+ model.ContentEntry.expiration_ts >=
now - datetime.timedelta(days=i+1))
try:
# Don't order() here otherwise the query will likely time out. Don't
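The corrected bounds select entries that expired between `i+1` and `i` days ago; a tiny standalone check of the window, independent of ndb:

```python
import datetime

def expiration_window(now, i):
    # Older bound first: entries with lower <= expiration_ts < upper.
    lower = now - datetime.timedelta(days=i + 1)
    upper = now - datetime.timedelta(days=i)
    return lower, upper

now = datetime.datetime(2024, 1, 10)
lower, upper = expiration_window(now, 2)
assert lower < upper  # now - 3 days < now - 2 days
```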
|
AbstractExpression: arbitrary objects can override the prepare protocol
TN: | @@ -496,6 +496,9 @@ class AbstractExpression(Frozable):
elif isinstance(obj, (dict)):
for v in obj.items():
explore(v, fn)
+ elif (not isinstance(obj, (PropertyDef, TypeRepo.Defer)) and
+ hasattr(obj, 'prepare')):
+ explore(obj.prepare(), fn)
ret = self
for p, order in passes:
|
fw/output: update internal state on write_config()
Update the internal _combined_config object with the one that
has been written to ensure that the serialized and run time states are
the same. | @@ -269,6 +269,7 @@ class RunOutput(Output):
write_pod(self.state.to_pod(), self.statefile)
def write_config(self, config):
+ self._combined_config = config
write_pod(config.to_pod(), self.configfile)
def read_config(self):
|
(python-config-type-instance-9) Convert SolidContainerConfigDict and SolidConfigDict to inst()
Summary:
These are special config types in the environment configs system.
Depends on D1587
Test Plan: BK
Reviewers: max, alangenfeld | @@ -32,14 +32,14 @@ def SystemDict(fields, description=None):
return build_config_dict(fields, description, is_system_config=True)
-class _SolidContainerConfigDict(_ConfigHasFields):
+class SolidContainerConfigDict(_ConfigHasFields):
def __init__(self, name, fields, description=None, handle=None, child_solids_config_field=None):
self._handle = check.opt_inst_param(handle, 'handle', SolidHandle)
self._child_solids_config_field = check.opt_inst_param(
child_solids_config_field, 'child_solids_config_field', Field
)
- super(_SolidContainerConfigDict, self).__init__(
+ super(SolidContainerConfigDict, self).__init__(
key=name,
name=name,
kind=ConfigTypeKind.DICT,
@@ -48,6 +48,9 @@ def __init__(self, name, fields, description=None, handle=None, child_solids_con
type_attributes=ConfigTypeAttributes(is_system_config=True),
)
+ def inst(self):
+ return self
+
@property
def handle(self):
'''A solid handle ref to the composite solid that is associated with this config schema
@@ -63,30 +66,17 @@ def child_solids_config_field(self):
return self._child_solids_config_field
-def SolidContainerConfigDict(
- name, fields, description=None, handle=None, child_solids_config_field=None
-):
- class _SolidContainerConfigDictInternal(_SolidContainerConfigDict):
- def __init__(self):
- super(_SolidContainerConfigDictInternal, self).__init__(
- name=name,
- fields=fields,
- description=description,
- handle=handle,
- child_solids_config_field=child_solids_config_field,
- )
-
- return _SolidContainerConfigDictInternal
-
-
def SystemSelector(fields, description=None):
return Selector(fields, description, is_system_config=True)
-class _SolidConfigDict(_ConfigHasFields):
- def __init__(self, name, fields, description):
+class SolidConfigDict(_ConfigHasFields):
+ def __init__(self, name, fields, description=None):
+ from dagster.core.types.field_utils import check_user_facing_fields_dict
+
+ check_user_facing_fields_dict(fields, 'NamedDict named "{}"'.format(name))
- super(_SolidConfigDict, self).__init__(
+ super(SolidConfigDict, self).__init__(
key=name,
name=name,
kind=ConfigTypeKind.DICT,
@@ -95,27 +85,16 @@ def __init__(self, name, fields, description):
type_attributes=ConfigTypeAttributes(is_system_config=True),
)
-
-def SolidConfigDict(name, fields, description=None):
- from dagster.core.types.field_utils import check_user_facing_fields_dict
-
- check_user_facing_fields_dict(fields, 'NamedDict named "{}"'.format(name))
-
- class _SolidConfigDictInternal(_SolidConfigDict):
- def __init__(self):
- super(_SolidConfigDictInternal, self).__init__(
- name=name, fields=fields, description=description
- )
-
- return _SolidConfigDictInternal
+ def inst(self):
+ return self
def is_solid_dict(obj):
- return isinstance(obj, _SolidConfigDict)
+ return isinstance(obj, SolidConfigDict)
def is_solid_container_config(obj):
- return isinstance(obj, _SolidContainerConfigDict)
+ return isinstance(obj, SolidContainerConfigDict)
def _is_selector_field_optional(config_type):
|
Duckpond: Add a list of already ducked messages
Previously, race conditions caused messages to be processed again before
it was known that the white check mark reaction had been added; this seems to solve it.
self.bot = bot
self.webhook_id = constants.Webhooks.duck_pond
self.webhook = None
+ self.ducked_messages = []
self.bot.loop.create_task(self.fetch_webhook())
self.relay_lock = None
@@ -176,7 +177,8 @@ class DuckPond(Cog):
duck_count = await self.count_ducks(message)
# If we've got more than the required amount of ducks, send the message to the duck_pond.
- if duck_count >= constants.DuckPond.threshold:
+ if duck_count >= constants.DuckPond.threshold and message.id not in self.ducked_messages:
+ self.ducked_messages.append(message.id)
await self.locked_relay(message)
@Cog.listener()
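As a hedged design aside, a membership check like this is usually backed by a set (constant-time lookup) and bounded so it does not grow for the lifetime of the process; one possible standalone variant:

```python
from collections import deque

class SeenMessages:
    """Remember the most recently relayed message ids, bounded in size."""

    def __init__(self, maxlen=1000):
        self._order = deque(maxlen=maxlen)
        self._ids = set()

    def add(self, message_id):
        if message_id in self._ids:
            return
        if len(self._order) == self._order.maxlen:
            # Evict the oldest id before the deque drops it automatically.
            self._ids.discard(self._order[0])
        self._order.append(message_id)
        self._ids.add(message_id)

    def __contains__(self, message_id):
        return message_id in self._ids
```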
|
Fixes to ensure that the verifier can be restarted cleanly when
mTLS for agents is disabled
The agent attribute "mtls_cert" can be "None" (API 1.0) or can be
assigned the value "disabled" when mTLS is purposefully disabled in
the post-1.0 API. These changes cover both cases.
if "reactivate" in rest_params:
if not isinstance(agent, dict):
agent = _from_db_obj(agent)
- if agent["mtls_cert"]:
+ if agent["mtls_cert"] and agent["mtls_cert"] != "disabled":
agent["ssl_context"] = web_util.generate_agent_mtls_context(agent["mtls_cert"], mtls_options)
agent["operational_state"] = states.START
asyncio.ensure_future(process_agent(agent, states.GET_QUOTE))
@@ -929,7 +929,7 @@ async def notify_error(agent, msgtype="revocation", event=None):
for agent_db_obj in agents:
if agent_db_obj.agent_id != agent["agent_id"]:
agent = _from_db_obj(agent_db_obj)
- if agent["mtls_cert"]:
+ if agent["mtls_cert"] and agent["mtls_cert"] != "disabled":
agent["ssl_context"] = web_util.generate_agent_mtls_context(agent["mtls_cert"], mtls_options)
func = functools.partial(invoke_notify_error, agent, tosend)
futures.append(await loop.run_in_executor(pool, func))
@@ -1119,7 +1119,7 @@ async def activate_agents(verifier_id, verifier_ip, verifier_port):
agent.verifier_ip = verifier_ip
agent.verifier_host = verifier_port
agent_run = _from_db_obj(agent)
- if agent_run["mtls_cert"]:
+ if agent_run["mtls_cert"] and agent_run["mtls_cert"] != "disabled":
agent_run["ssl_context"] = web_util.generate_agent_mtls_context(agent_run["mtls_cert"], mtls_options)
if agent.operational_state == states.START:
asyncio.ensure_future(process_agent(agent_run, states.GET_QUOTE))
|
new CentralDifferenceTS
new ._create_nlst_a(), .create_nlst(), .step() | @@ -571,6 +571,84 @@ class VelocityVerletTS(ElastodynamicsBaseTS):
return vect
+class CentralDifferenceTS(ElastodynamicsBaseTS):
+ r"""
+ Solve elastodynamics problems by the central difference method.
+
+ Should be more efficient than the corresponding NewmarkTS
+ with :math:`\beta = 0`, :math:`\gamma = 1/2`.
+ """
+ name = 'ts.central_difference'
+
+ _parameters = [
+ ('t0', 'float', 0.0, False,
+ 'The initial time.'),
+ ('t1', 'float', 1.0, False,
+ 'The final time.'),
+ ('dt', 'float', None, False,
+ 'The time step. Used if `n_step` is not given.'),
+ ('n_step', 'int', 10, False,
+ 'The number of time steps. Has precedence over `dt`.'),
+ ('is_linear', 'bool', False, False,
+ 'If True, the problem is considered to be linear.'),
+ ]
+
+ def _create_nlst_a(self, nls, dt, u1, vfun, cc, cache_name):
+ nlst = nls.copy()
+
+ def fun(at):
+ vec = nm.r_[u1, vfun(at), at]
+
+ aux = nls.fun(vec)
+
+ i3 = len(at)
+ rt = aux[:i3] + aux[i3:2*i3] + aux[2*i3:]
+ return rt
+
+ @_cache(self, cache_name, self.conf.is_linear)
+ def fun_grad(at):
+ vec = None if self.conf.is_linear else nm.r_[u1, vfun(at), at]
+ M, C, K = self.get_matrices(nls, vec)
+
+ Kt = M + cc * C
+ return Kt
+
+ nlst.fun = fun
+ nlst.fun_grad = fun_grad
+ nlst.u = u1
+ nlst.v = vfun
+
+ return nlst
+
+ def create_nlst(self, nls, dt, u0, v0, a0):
+ dt2 = dt**2
+
+ def v(a):
+ return v0 + dt * 0.5 * (a0 + a)
+
+ def u():
+ return u0 + dt * v0 + dt2 * 0.5 * a0
+
+ nlst = self._create_nlst_a(nls, dt, u(), v, 0.5 * dt,
+ 'matrix')
+ return nlst
+
+ def step(self, ts, vec, nls, pack, unpack, **kwargs):
+ """
+ Solve a single time step.
+ """
+ dt = ts.dt
+ ut, vt, at = unpack(vec)
+
+ nlst = self.create_nlst(nls, dt, ut, vt, at)
+ atp = nlst(at)
+ vtp = nlst.v(atp)
+ utp = nlst.u
+
+ vect = pack(utp, vtp, atp)
+
+ return vect
+
class NewmarkTS(ElastodynamicsBaseTS):
r"""
Solve elastodynamics problems by the Newmark method.
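For intuition, a one-degree-of-freedom sketch of the update sequence this class implements (undamped, so the linear solve for the acceleration collapses to a division); it is illustrative only and does not use sfepy's API:

```python
import numpy as np

# Sketch: central-difference update for a single undamped oscillator
# m*a + k*u = 0, following the predictor/corrector split above.
m, k = 1.0, 4.0
dt, n_step = 0.01, 1000

u, v = 1.0, 0.0
a = -(k / m) * u
for _ in range(n_step):
    u = u + dt * v + 0.5 * dt**2 * a   # displacement predictor
    a_new = -(k / m) * u               # "solve" M a = -K u (scalar case)
    v = v + 0.5 * dt * (a + a_new)     # velocity update
    a = a_new

# After n_step steps the solution should stay close to cos(omega * t).
t = n_step * dt
assert abs(u - np.cos(np.sqrt(k / m) * t)) < 1e-3
```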
|
Added "proxy support" via environment variables
Useful if you use a proxy in your environment.
"ddsourcecategory": "aws",
}
-
+try:
+ host = os.environ['DD_URL']
+except Exception:
host = "lambda-intake.logs.datadoghq.com"
+
+try:
+ ssl_port = os.environ['DD_PORT']
+except Exception:
ssl_port = 10516
+
cloudtrail_regex = re.compile('\d+_CloudTrail_\w{2}-\w{4,9}-\d_\d{8}T\d{4}Z.+.json.gz$', re.I)
|
Update clock.py - Corrected typo
There was a typo in one of the descriptions. Changed "continuo" to "continue".
MIN_SLEEP = 0.005
'''The minimum time to sleep. If the remaining time is less than this,
- the event loop will continuo
+ the event loop will continue.
'''
SLEEP_UNDERSHOOT = MIN_SLEEP - 0.001
|
squad analysis: highlight yellow title
make "yellow squad" title font black on yellow background for better
readability | @@ -1020,6 +1020,11 @@ def add_squad_analysis_to_email(session, soup):
.squad-unassigned {
background-color: #FFBA88;
}
+ h4.squad-yellow {
+ color: black;
+ background-color: yellow;
+ display: inline;
+ }
"""
# prepare place for the Squad Analysis in the email
squad_analysis_div = soup.new_tag("div")
|
Stabilize TreeViewAdditionalTestCases.testCheckBoxes()
HG--
branch : dev00 | @@ -51,6 +51,7 @@ from pywinauto.sysinfo import is_x64_Python # noqa: E402
from pywinauto.remote_memory_block import RemoteMemoryBlock # noqa: E402
from pywinauto.actionlogger import ActionLogger # noqa: E402
from pywinauto.timings import Timings # noqa: E402
+from pywinauto.timings import wait_until # noqa: E402
from pywinauto import mouse # noqa: E402
@@ -601,9 +602,9 @@ class TreeViewAdditionalTestCases(unittest.TestCase):
self.dlg.TVS_CHECKBOXES.check_by_click()
birds = self.ctrl.GetItem(r'\Birds')
birds.Click(where='check')
- self.assertEquals(birds.IsChecked(), True)
+ self.assertEqual(birds.IsChecked(), True)
birds.click_input(where='check')
- self.assertEquals(birds.IsChecked(), False)
+ wait_until(3, 0.4, lambda: birds.IsChecked(), value=False)
def testPrintItems(self):
"""Test TreeView method PrintItems()"""
|
Get right tag for reprocessed sentinel2 l2a
L2A products with baseline 00.01 are products reprocessed by sentinelhub. Because of that, the right tag for the tile id is TILE_ID.
tree = client.get_xml(self.get_url(AwsConstants.METADATA))
tile_id_tag = 'TILE_ID_2A' if (self.data_collection is DataCollection.SENTINEL2_L2A
- and self.baseline <= '02.06') else 'TILE_ID'
+ and '00.01' < self.baseline <= '02.06') else 'TILE_ID'
tile_id = tree[0].find(tile_id_tag).text
if self.safe_type is EsaSafeType.OLD_TYPE:
return tile_id
|