Columns: code (string, lengths 20 to 4.93k); docstring (string, lengths 33 to 1.27k); source (string, 3 classes)
def creationlog(base, package, stackdepth=_def_stackdepth): @staticmethod def wrapnew(cls, *argl, **argd): global _atdepth_new, _cstack_new, streamlining origstream = None if (not (decorating or streamlining)): (entry, _atdepth_new) = _pre_create(cls, _atdepth_new, stackdept...
Decorator for wrapping the creation of class instances that are being logged by acorn. Args: base: base class used to call __new__ for the construction. package (str): name of (global) package the class belongs to. stackdepth (int): if the calling stack is less than this depth, then include the entry in the log; other...
codesearchnet
def points_random_3d(count, range_x=((- 10.0), 10.0), range_y=((- 10.0), 10.0), range_z=((- 10.0), 10.0), seed=None) -> VAO: random.seed(seed) def gen(): for _ in range(count): (yield random.uniform(*range_x)) (yield random.uniform(*range_y)) (yield random.uniform(*r...
Generates random positions inside a confined box. Args: count (int): Number of points to generate Keyword Args: range_x (tuple): min-max range for x axis: Example (-10.0, 10.0) range_y (tuple): min-max range for y axis: Example (-10.0, 10.0) range_z (tuple): min-max range for z axis: Example (-10.0, 10.0) seed (int): ...
codesearchnet
def load_tf_sharded_weights(model, shard_files, ignore_mismatched_sizes=False, strict=False, _prefix=None): unexpected_keys = set() saved_keys = set() mismatched_keys = set() model_keys = set() model_layer_map = {} for i, k in enumerate(model.weights): layer_name = k.name if _pre...
This is the same as `load_tf_weights` but for a sharded checkpoint. Detect missing and unexpected layers and load the TF weights from the shard file accordingly to their names and shapes. This load is performed efficiently: each checkpoint shard is loaded one by one in RAM and deleted after being loaded in the model. ...
github-repos
def conversation(self, name=None, **kwargs): convo = Conversation(self, **kwargs) super().conversation(name, convo) return convo
Make a new conversation. Arguments: name: The key for the dictionary the conversation will be stored as in conversations. If None the conversation will be stored as a list instead. Mixing both types results in an error. **kwargs: Keyword arguments to pass into the new conversation. These accept the same arguments as C...
codesearchnet
def method(*args, **kwargs): assert (len(args) == 0) assert (len(kwargs) == 1) assert ('num_return_vals' in kwargs) num_return_vals = kwargs['num_return_vals'] def annotate_method(method): method.__ray_num_return_vals__ = num_return_vals return method return annotate_method
Annotate an actor method. .. code-block:: python @ray.remote class Foo(object): @ray.method(num_return_vals=2) def bar(self): return 1, 2 f = Foo.remote() _, _ = f.bar.remote() Args: num_return_vals: The number of object IDs that should be returned by invocations of this actor method.
codesearchnet
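The `method` row above follows a common pattern: a decorator factory that stashes its options as an attribute on the function for a framework to read later. A minimal stand-in sketch (the attribute name here is illustrative, not ray's actual internals):

```python
def method(**kwargs):
    # Mirror the original's contract: exactly one keyword, num_return_vals.
    assert len(kwargs) == 1 and "num_return_vals" in kwargs
    num_return_vals = kwargs["num_return_vals"]

    def annotate_method(func):
        # Stash the option on the function object; a framework reads it later.
        func.__num_return_vals__ = num_return_vals
        return func

    return annotate_method

@method(num_return_vals=2)
def bar():
    return 1, 2
```

The decorated function is returned unchanged apart from the extra attribute, so it remains directly callable.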
def GetContract(self, script_hash): if script_hash.ToBytes() in self._contracts.keys(): return self._contracts[script_hash.ToBytes()] return None
Get contract for specified script_hash. Args: script_hash (UInt160): a bytearray (len 20). Returns: Contract: if a contract was found matching the provided script hash, otherwise None
juraj-google-style
def __init__(self, resolution, **kwargs): super(CelebaHQConfig, self).__init__( name="%d" % resolution, description=("CelebaHQ images in %d x %d resolution" % (resolution, resolution)), **kwargs) self.resolution = resolution self.file_name = "data%dx%d.tar" ...
BuilderConfig for CelebaHQ. Args: resolution: Resolution of the image. Values supported: powers of 2 up to 1024. **kwargs: keyword arguments forwarded to super.
juraj-google-style
def range(*args, prefix: str): return [NamedQubit((prefix + str(i))) for i in range(*args)]
Returns a range of NamedQubits. The range returned starts with the prefix, and is followed by a qubit for each number in the range, e.g.: NamedQubit.range(3, prefix="a") -> ["a0", "a1", "a2"] NamedQubit.range(2, 4, prefix="a") -> ["a2", "a3"] Args: *args: Args to be passed to Python's standard range function. prefix: A p...
codesearchnet
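The name-generation logic of the `range` row above can be sketched in pure Python; note that the suffixes come straight from Python's `range(*args)`, so `range(3)` yields indices 0, 1, 2:

```python
def named_range(*args, prefix):
    # Build one name per index produced by Python's standard range().
    return [prefix + str(i) for i in range(*args)]

# named_range(3, prefix="a")    -> ["a0", "a1", "a2"]
# named_range(2, 4, prefix="a") -> ["a2", "a3"]
```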
def args_to_kwargs(base_type, removed_method=False, removed_args=None): def wrap(func): if removed_method: return func removed_arg_names = removed_args if removed_args is not None else [] base_arg_spec = getfullargspec(unwrap(getattr(base_type, func.__name__))) base_arg_...
Convert all args to kwargs before calling the decorated function. When applied to a function, this decorator creates a new function that always calls the wrapped function with *only* keyword arguments. It inspects the argspec for the identically-named method on `base_type` to determine the name to use for arguments th...
github-repos
def tap_hold(self, x, y, duration=1.0): data = {'x': x, 'y': y, 'duration': duration} return self.http.post('/wda/touchAndHold', data=data)
Tap and hold for a moment Args: - x, y(int): position - duration(float): seconds of hold time [[FBRoute POST:@"/wda/touchAndHold"] respondWithTarget:self action:@selector(handleTouchAndHoldCoordinate:)],
juraj-google-style
def _get_object_by_name(self, object_endpoint, object_name, timeout=None): timeout = timeout or self._timeout resp = self._get(self._u(object_endpoint, object_name), session=self._session, timeout=timeout) resp.raise_for_status() return resp.json()
generic function to get object (metadata, tag, ) by name from SignalFx. Args: object_endpoint (string): API endpoint suffix (e.g. 'v2/tag') object_name (string): name of the object (e.g. 'jvm.cpu.load') Returns: dictionary of response
juraj-google-style
def count_divisors(n): if not isinstance(n, int): raise TypeError("Expecting a strictly positive integer") if n <= 0: raise ValueError("Expecting a strictly positive integer") number_of_divisors = 1 remain = n for p in prime_generator(): if p > n: return ...
Count the number of divisors of an integer n Args: n (int): strictly positive integer Returns: The number of distinct divisors of n Raises: TypeError: if n is not an integer ValueError: if n is not strictly positive
juraj-google-style
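The `count_divisors` row relies on the divisor-count formula d(n) = prod(e_i + 1) over the prime factorization n = prod(p_i ** e_i). A self-contained sketch using plain trial division in place of the original's `prime_generator`:

```python
def count_divisors(n):
    # d(n) = product of (exponent + 1) over the prime factorization of n.
    if not isinstance(n, int):
        raise TypeError("Expecting a strictly positive integer")
    if n <= 0:
        raise ValueError("Expecting a strictly positive integer")
    count, remain, p = 1, n, 2
    while p * p <= remain:
        exponent = 0
        while remain % p == 0:
            remain //= p
            exponent += 1
        count *= exponent + 1
        p += 1
    if remain > 1:
        # One leftover prime factor with exponent 1.
        count *= 2
    return count
```

For example 12 = 2^2 * 3 has (2+1) * (1+1) = 6 divisors.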
def _block_orth(self, p1, p2): if p1.shape.as_list() != p2.shape.as_list(): raise ValueError(f'The dimension of the matrices must be the same. Received p1.shape={p1.shape} and p2.shape={p2.shape}.') n = p1.shape.as_list()[0] kernel2x2 = {} eye = linalg_ops_impl.eye(n, dtype=self.dtype) kerne...
Construct a 2 x 2 kernel. Used to construct an orthogonal kernel. Args: p1: A symmetric projection matrix. p2: A symmetric projection matrix. Returns: A 2 x 2 kernel [[p1p2, p1(1-p2)], [(1-p1)p2, (1-p1)(1-p2)]]. Raises: ValueError: If the dimensions of p1 and p2 are different.
github-repos
def __str__(self): name = self.__class__.__name__ return '%s(Type %d, Address %d)' % (name, self.Type, self.Addr)
Returns a string representation of the data event. Args: self (JLinkDataEvent): the ``JLinkDataEvent`` instance Returns: A string representation of the data event.
juraj-google-style
def add_workflow_definitions(sbi_config: dict): registered_workflows = [] for i in range(len(sbi_config['processing_blocks'])): workflow_config = sbi_config['processing_blocks'][i]['workflow'] workflow_name = '{}:{}'.format(workflow_config['id'], workflow_config['version']) if (workflow_...
Add any missing SBI workflow definitions as placeholders. This is a utility function used in testing and adds mock / test workflow definitions to the database for workflows defined in the specified SBI config. Args: sbi_config (dict): SBI configuration dictionary.
codesearchnet
def _read_ipv4_options(self, size=None): counter = 0 optkind = list() options = dict() while (counter < size): kind = self._read_unpack(1) opts = IPv4_OPT.get(kind) if (opts is None): len_ = (size - counter) counter = size options['Unknown'] = ...
Read IPv4 option list. Positional arguments: * size -- int, buffer size Returns: * tuple -- IPv4 option list * dict -- extracted IPv4 option
codesearchnet
def find_duplicates_in_array(array): duplicates = [] non_duplicates = [] if (len(array) != len(set(array))): for item in array: if (item not in non_duplicates): non_duplicates.append(item) elif ((item in non_duplicates) and (item not in duplicates)): ...
Runs through the array and returns the elements that appear more than once Args: array: The array to check for duplicates. Returns: Array of the elements that are duplicates. Returns empty list if there are no duplicates.
codesearchnet
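The duplicate scan in the row above is order-preserving: the first sighting of an item goes into a "seen" list, and any later sighting promotes it (once) into the duplicates list. A compact sketch of that logic:

```python
def find_duplicates(array):
    # First sighting -> seen; any repeat sighting -> duplicates (once each).
    seen, duplicates = [], []
    for item in array:
        if item not in seen:
            seen.append(item)
        elif item not in duplicates:
            duplicates.append(item)
    return duplicates
```

Duplicates come back in the order their second occurrence appears, e.g. `find_duplicates([1, 2, 2, 3, 1, 1])` gives `[2, 1]`.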
def get_figure(new_fig=True, subplot='111', params=None): _get_plt() if new_fig: fig = plt.figure() else: fig = plt.gcf() params = dict_if_none(params) if isinstance(subplot, (tuple, list)): ax = fig.add_subplot(*subplot, **params) else: ax = fig.add_subpl...
Function to be used for viewing - plotting, to initialize the matplotlib figure - axes. Args: new_fig(bool): Defines if a new figure will be created, if false current figure is used subplot (tuple or matplolib subplot specifier string): Create axes with these parameters params (dict): extra options passed to add_subpl...
juraj-google-style
def summary(self, line_length=160, detailed=True, print_fn=None): if not self._converted: raise RuntimeError(f'Impossible to call `{self.__class__.__name__}.summary()` before calling {self.__class__.__name__}.convert()`.') if line_length < 160: raise ValueError(f'Invalid `line_length` value has ...
This method describes the results of the conversion by TF-TRT. It includes information such as the name of the engine, the number of nodes per engine, the input and output dtype, along with the input shape of each TRTEngineOp. Args: line_length: Default line length when printing on the console. Minimum 160 characters...
github-repos
def MsgUser(msg): msg_tested_versions = ['xp', 'vista', '2008', '2003'] msg_args = ['/c', '%SystemRoot%\\System32\\msg.exe', '*', '/TIME:0'] host_version = platform.platform().lower() if (not msg): return ('Command not ran.', 'Empty message.', (- 1)) else: msg_args.extend([msg]) ...
Sends a message to a user. Args: msg: Message to be displayed to the user. Returns: res which is a tuple of (stdout, stderr, exit_status, time_taken).
codesearchnet
def top_kth_iterative(x, k): def next_x(cur_x, _): top_x = tf.reduce_max(cur_x, axis=(- 1), keep_dims=True) return (cur_x * to_float((cur_x < top_x))) fin_x = tf.foldl(next_x, tf.range((k - 1)), initializer=tf.stop_gradient(x), parallel_iterations=2, back_prop=False) return tf.stop_gradient...
Compute the k-th top element of x on the last axis iteratively. This assumes values in x are non-negative, rescale if needed. It is often faster than tf.nn.top_k for small k, especially if k < 30. Note: this does not support back-propagation, it stops gradients! Args: x: a Tensor of non-negative numbers of type float...
codesearchnet
def vectorial_decomp(self, symbols): try: symbols = [s.vec for s in symbols] N = sum(map((lambda s: len(s)), symbols)) symbols_ = Vector(N) i = 0 for v in symbols: for s in v: symbols_[i] = s i += 1 symbols = symbols_ ex...
Compute the vectorial decomposition of the expression according to the given symbols. symbols is a list that represents the input of the resulting application. They are considered as a flattened vector of bits. Args: symbols: TODO Returns: An :class:`pytanque.App` object Example: >>> mba = MBA(4) >>> x = mba.var('x...
codesearchnet
def request(self, request_method, api_method, *args, **kwargs): url = self._build_url(api_method) resp = requests.request(request_method, url, *args, **kwargs) try: rv = resp.json() except ValueError: raise RequestFailedError(resp, 'not a json body') if (not resp.ok): raise R...
Perform a request. Args: request_method: HTTP method for this request. api_method: API method name for this request. *args: Extra arguments to pass to the request. **kwargs: Extra keyword arguments to pass to the request. Returns: A dict contains the request response data. Raises: RequestFailedError: Raises when Bea...
codesearchnet
def must_exist(*components): _path = path(*components) if (not exists(_path)): raise File404(_path) return _path
Ensure path exists. Arguments: *components (str[]): Path components. Returns: str: File path. Raises: File404: If path does not exist.
codesearchnet
def unwrap(tensor): while isinstance(tensor, (PrettyTensor, Loss)): tensor = tensor.tensor return tensor
Returns the underlying tensor if tensor is wrapped or tensor. Args: tensor: The tensor to unwrap. Returns: Tensor or if it is a pretty tensor, the unwrapped version. Raises: ValueError: if tensor holds a sequence.
juraj-google-style
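The `unwrap` row is an instance of the peel-the-wrapper pattern: loop while the value is a known wrapper type and take its inner field. A self-contained sketch with a stand-in class for PrettyTensor/Loss:

```python
class Wrapped:
    # Stand-in for the PrettyTensor/Loss wrapper types in the row above.
    def __init__(self, tensor):
        self.tensor = tensor

def unwrap(tensor):
    # Peel nested wrappers until a bare value remains; non-wrapped values
    # pass through untouched.
    while isinstance(tensor, Wrapped):
        tensor = tensor.tensor
    return tensor
```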
def submit_job(self, job_config=None): job_id = self._delegator._submit_bundle(self, job_config) return self._instance.get_job(job_id)
Submit this Streams Application Bundle (sab file) to its associated instance. Args: job_config(JobConfig): a job configuration overlay Returns: Job: Resulting job instance.
juraj-google-style
def log_cdf_laplace(x, name='log_cdf_laplace'): with tf.name_scope(name): x = tf.convert_to_tensor(value=x, name='x') lower_solution = ((- np.log(2.0)) + x) safe_exp_neg_x = tf.exp((- tf.abs(x))) upper_solution = tf.math.log1p(((- 0.5) * safe_exp_neg_x)) return tf.where((x < ...
Log Laplace distribution function. This function calculates `Log[L(x)]`, where `L(x)` is the cumulative distribution function of the Laplace distribution, i.e. ```L(x) := 0.5 * int_{-infty}^x e^{-|t|} dt``` For numerical accuracy, `L(x)` is computed in different ways depending on `x`, ``` x <= 0: Log[L(x)] = Log[0....
codesearchnet
def extract_annotation(self, node, var, name, stack, allowed_type_params: set[str] | None=None): try: typ = abstract_utils.get_atomic_value(var) except abstract_utils.ConversionError: self.ctx.errorlog.ambiguous_annotation(self.ctx.vm.frames, None, name) return self.ctx.convert.unsolvabl...
Returns an annotation extracted from 'var'. Args: node: The current node. var: The variable to extract from. name: The annotated name. stack: The frame stack. allowed_type_params: Type parameters that are allowed to appear in the annotation. 'None' means all are allowed. If non-None, the result of calling get_callable...
github-repos
def merge(left, right, how='inner', on=None, left_on=None, right_on=None, left_index=False, right_index=False, sort=False, suffixes=('_x', '_y'), copy=True, indicator=False, validate=None): if (not isinstance(left, DataFrame)): raise ValueError('can not merge DataFrame with instance of type {}'.format(type(...
Database style join, where common columns in "on" are merged. Args: left: DataFrame. right: DataFrame. how: What type of join to use. on: The common column name(s) to join on. If None, and left_on and right_on are also None, will default to all commonly named columns. left_on: The column(s) on the left to use for the...
codesearchnet
def slice_hidden(x, hidden_size, num_blocks): (batch_size, latent_dim, _) = common_layers.shape_list(x) block_dim = (hidden_size // num_blocks) x_sliced = tf.reshape(x, shape=[batch_size, latent_dim, num_blocks, block_dim]) return x_sliced
Slice encoder hidden state under num_blocks. Args: x: Encoder hidden state of shape [batch_size, latent_dim, hidden_size]. hidden_size: Dimension of the latent space. num_blocks: Number of blocks in DVQ. Returns: Sliced states of shape [batch_size, latent_dim, num_blocks, block_dim].
codesearchnet
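The reshape in `slice_hidden` splits the hidden dimension into `num_blocks` contiguous blocks of `block_dim = hidden_size // num_blocks`. A list-based sketch of that split for a single vector:

```python
def slice_hidden_vector(row, num_blocks):
    # Split one hidden vector into num_blocks contiguous chunks, mirroring
    # the tf.reshape in the row above.
    hidden_size = len(row)
    assert hidden_size % num_blocks == 0, "hidden_size must divide evenly"
    block_dim = hidden_size // num_blocks
    return [row[i * block_dim:(i + 1) * block_dim] for i in range(num_blocks)]
```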
def readuntil(self, token, size=0): self.__append() i = self.buf.find(token, self.pos) if (i < 0): index = max((len(token) - 1), size) newpos = max((len(self.buf) - index), self.pos) data = self.buf[self.pos:newpos] self.pos = newpos self.__discard() return (F...
Reads data from the FIFO until a token is encountered. If no token is encountered as much data is read from the FIFO as possible keeping in mind that the FIFO must retain enough data to perform matches for the token across writes. Args: token: The token to read until. size: The minimum amount of data that should be l...
codesearchnet
def decorator(wrapped_decorator): def helper(_func=None, **options): def outer_wrapper(func): @wrapping(func) def inner_wrapper(*args, **kwds): return wrapped_decorator(func, args, kwds, **options) return inner_wrapper if (_func is None): ...
Converts a function into a decorator that optionally accepts keyword arguments in its declaration. Example usage: @utils.decorator def decorator(func, args, kwds, op1=None): ... apply op1 ... return func(*args, **kwds) # Form (1), vanilla @decorator foo(...) ... # Form (2), with options @decorator(op1=5) foo(...) .....
codesearchnet
def run_inference(examples, serving_bundle): batch_size = 64 if (serving_bundle.estimator and serving_bundle.feature_spec): preds = serving_bundle.estimator.predict((lambda : tf.data.Dataset.from_tensor_slices(tf.parse_example([ex.SerializeToString() for ex in examples], serving_bundle.feature_spec)).ba...
Run inference on examples given model information Args: examples: A list of examples that matches the model spec. serving_bundle: A `ServingBundle` object that contains the information to make the inference request. Returns: A ClassificationResponse or RegressionResponse proto.
codesearchnet
def matmul(x1, x2): if any_symbolic_tensors((x1, x2)): return Matmul().symbolic_call(x1, x2) return backend.numpy.matmul(x1, x2)
Matrix product of two tensors. - If both tensors are 1-dimensional, the dot product (scalar) is returned. - If either tensor is N-D, N > 2, it is treated as a stack of matrices residing in the last two indexes and broadcast accordingly. - If the first tensor is 1-D, it is promoted to a matrix by prepending a 1 to its ...
github-repos
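The `matmul` docstring describes standard matrix-product semantics. The core 2-D case can be sketched with nested lists: entry (i, j) of the result is the dot product of row i of the first operand with column j of the second:

```python
def matmul_2d(a, b):
    # result[i][j] = dot(row i of a, column j of b); shapes must agree on
    # the inner dimension.
    assert len(a[0]) == len(b), "inner dimensions must agree"
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))]
            for i in range(len(a))]
```

The N-D and 1-D promotion rules described in the docstring reduce to repeated applications of this 2-D core.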
def add_time_step(self, **create_time_step_kwargs): ts = time_step.TimeStep.create_time_step(**create_time_step_kwargs) assert isinstance(ts, time_step.TimeStep) self._time_steps.append(ts)
Creates a time-step and appends it to the list. Args: **create_time_step_kwargs: Forwarded to time_step.TimeStep.create_time_step.
juraj-google-style
def local_set_state(self, device, state, id_override=None, type_override=None): if ALLOW_LOCAL_CONTROL: if (device.local_id() is not None): hub = HUBS.get(device.hub_id()) if ((hub is None) or (hub['token'] is None)): return self.set_device_state(device, state, id_ove...
Set device state via local API, and fall back to online API. Args: device (WinkDevice): The device the change is being requested for. state (Dict): The state being requested. id_override (String, optional): A device ID used to override the passed in device's ID. Used to make changes on sub-devices. i.e. Outlet in a Po...
codesearchnet
def get_elements_between_bands(self, band_i, band_j): if ((band_i < 1) or (band_i > self.nb_bands) or (band_j < 1) or (band_j > self.nb_bands)): raise ValueError('Band index out of bounds') return self.data[:, (band_i - 1), (band_j - 1), :]
Method returning a numpy array with elements [cdum_x_real, cdum_x_imag, cdum_y_real, cdum_y_imag, cdum_z_real, cdum_z_imag] between bands band_i and band_j (vasp 1-based indexing) for all kpoints. Args: band_i (Integer): Index of band i band_j (Integer): Index of band j Returns: a numpy list of elements for each kp...
codesearchnet
def _get_anchor(module_to_name, fullname): if not _anchor_re.match(fullname): raise ValueError("'%s' is not a valid anchor" % fullname) anchor = fullname for module_name in module_to_name.values(): if fullname.startswith(module_name + "."): rest = fullname[len(module_name)+1:] if len...
Turn a full member name into an anchor. Args: module_to_name: Dictionary mapping modules to short names. fullname: Fully qualified name of symbol. Returns: HTML anchor string. The longest module name prefix of fullname is removed to make the anchor. Raises: ValueError: If fullname uses characters invalid in an anch...
juraj-google-style
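The anchor logic in `_get_anchor` strips the longest matching module-name prefix (plus its trailing dot) from the fully qualified name. A sketch of that longest-prefix scan:

```python
def strip_longest_module_prefix(module_names, fullname):
    # Strip the longest module-name prefix of fullname (plus the dot);
    # if no module name matches, the name is returned unchanged.
    anchor = fullname
    best = 0
    for module_name in module_names:
        if fullname.startswith(module_name + ".") and len(module_name) > best:
            best = len(module_name)
            anchor = fullname[len(module_name) + 1:]
    return anchor
```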
def decode_image_tokens(self, image_tokens: torch.Tensor): decoded_image = self.model.vqmodel.decode(image_tokens) decoded_image = decoded_image.permute(0, 2, 3, 1) return decoded_image
Decodes generated image tokens from language model to continuous pixel values with VQGAN module via upsampling. Args: image_tokens (`torch.LongTensor` of shape `(batch_size, num_of_tokens)`): The tensors corresponding to the input images.
github-repos
def _bits_in_condition(self, cond): all_bits = [] if cond is not None: all_bits.extend([(cond[0], j) for j in range(self.cregs[cond[0].name].size)]) return all_bits
Return a list of bits in the given condition. Args: cond (tuple or None): optional condition (ClassicalRegister, int) Returns: list[(ClassicalRegister, idx)]: list of bits
juraj-google-style
def gaussian_pdf(std=10.0, mean=0.0): norm_const = 1.0 def pdf(x): return ((norm_const * np.exp(((- 0.5) * (((x - mean) / std) ** 2)))) * np.sin(((np.pi / 180.0) * x))) norm_dev = quad(pdf, 0.0, 180.0)[0] norm_const /= norm_dev return pdf
Gaussian PDF for orientation averaging. Args: std: The standard deviation in degrees of the Gaussian PDF mean: The mean in degrees of the Gaussian PDF. This should be a number in the interval [0, 180) Returns: pdf(x), a function that returns the value of the spherical Jacobian- normalized Gaussian PDF with the given...
codesearchnet
def _add_tag(self, tag): tags = self.data.get('tags', None) if tags: if (tag in [x['name'] for x in tags]): return False else: tags = list() tags.append({'name': tag}) self.data['tags'] = tags return True
Add a tag Args: tag (str): Tag to add Returns: bool: True if tag added or False if tag already present
codesearchnet
def fuzzUsufy(fDomains=None, fFuzzStruct=None): if (fFuzzStruct == None): fuzzingStructures = ['http: else: try: fuzzingStructures = fFuzzStruct.read().splitlines() except: print(('Usufy could NOT open the following file: ' + fFuzzStruct)) res = {} lines =...
Method to guess the usufy path against a list of domains or subdomains. Args: ----- fDomains: A list to strings containing the domains and (optionally) a nick. fFuzzStruct: A list to strings containing the transforms to be performed. Returns: -------- dict: A dictionary of the form of `{"domain": "url"}`.
codesearchnet
def update(self, force=False): if self.is_404 and not force: return 0 if self._last_modified: headers = {'If-Modified-Since': self._last_modified} else: headers = None try: res = self._board._requests_session.g...
Fetch new posts from the server. Arguments: force (bool): Force a thread update, even if thread has 404'd. Returns: int: How many new posts have been fetched.
juraj-google-style
def construct_graph(sakefile, settings): verbose = settings['verbose'] sprint = settings['sprint'] G = nx.DiGraph() sprint('Going to construct Graph', level='verbose') for target in sakefile: if (target == 'all'): continue if ('formula' not in sakefile[target]): ...
Takes the sakefile dictionary and builds a NetworkX graph Args: sakefile: A dictionary that is the parsed Sakefile (from sake.py) settings: The settings dictionary Returns: A NetworkX graph
codesearchnet
def should_drop(self): if self._drop_if_none and self.value is None: return True if self._drop_if_default and self.value == self._default: return True return False
Return True if the item should be dropped, or False if it should not be dropped. This depends on the drop_if_none, and drop_if_default calls. Returns: True or False; depending on whether the item should be dropped or kept.
github-repos
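The `should_drop` row checks two independent flags against the item's value. A self-contained sketch with a minimal stand-in class (attribute names mirror the flags described in the docstring):

```python
class Item:
    # Minimal stand-in for the item type in the row above.
    def __init__(self, value, default=None,
                 drop_if_none=False, drop_if_default=False):
        self.value = value
        self._default = default
        self._drop_if_none = drop_if_none
        self._drop_if_default = drop_if_default

    def should_drop(self):
        # Drop on None if requested, or on matching the default if requested.
        if self._drop_if_none and self.value is None:
            return True
        if self._drop_if_default and self.value == self._default:
            return True
        return False
```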
def auto_convert_cell_no_flags(cell, units=None, parens_as_neg=True): units = (units if (units != None) else {}) return auto_convert_cell(flagable=Flagable(), cell=cell, position=None, worksheet=0, flags={}, units=units, parens_as_neg=parens_as_neg)
Performs a first step conversion of the cell to check its type or try to convert if a valid conversion exists. This version of conversion doesn't flag changes nor store cell units. Args: units: The dictionary holder for cell units. parens_as_neg: Converts numerics surrounded by parens to negative values
codesearchnet
def enrich_json_objects_by_object_type(request, value): time_start_globally = time() if isinstance(value, list): json = [x.to_json() if hasattr(x, "to_json") else x for x in value] else: if isinstance(value, dict): json = value else: json = value.to_json(...
Take the given value and start enrichment by object_type. Args: request (django.http.request.HttpRequest): request which is currently processed value (dict|list|django.db.models.Model): in case of django.db.models.Model object (or list of these objects), to_json method is invoked Returns: dict|list
juraj-google-style
def handle_malformed_config(error: MalformedConfigError) -> ResponseReturnValue: return (DQMResponse(name='MalformedConfigError', description=str(error), code=400), 400)
DQM Malformed Config Response. Args: * error: Config error Returns: * DQMResponse for the error with a 400 status code
github-repos
def start_range(self, line, membership): last = self._transitions[-1] if self._transitions else -1 if line < last: raise ValueError('Line number less than previous start_range() call.') previous = len(self._transitions) % 2 == 1 if membership == previous: return elif line == last: ...
Start a range of lines that are either included/excluded from the set. Args: line: A line number. membership: If True, lines >= line are included in the set (starting a range), otherwise they are excluded (ending a range). Raises: ValueError: if line is less than that of a previous call to start_range().
github-repos
def history(self, condition: Optional[Callable[['Origin'], bool]]=None) -> List['Origin']: condition = condition or (lambda o: True) current = self history = [] while current is not None: if condition(current): history.append(current) current = getattr(current.source, 'sym_or...
Returns a history of origins with an optional filter. Args: condition: An optional callable object with signature (origin) -> should_list. If None, all origins will be listed. Returns: A list of filtered origin from the earliest (root) to the most recent.
github-repos
def connect(self, db_uri, debug=False): kwargs = {'echo': debug, 'convert_unicode': True} if ('mysql' in db_uri): kwargs['pool_recycle'] = 3600 elif (': logger.debug('detected sqlite path URI: {}'.format(db_uri)) db_path = os.path.abspath(os.path.expanduser(db_uri)) db_uri = ...
Configure connection to a SQL database. Args: db_uri (str): path/URI to the database to connect to debug (Optional[bool]): whether to output logging information
codesearchnet
def write_config(config, config_path=CONFIG_PATH): config_dir = os.path.dirname(config_path) if not os.path.exists(config_dir): os.makedirs(config_dir) with open(config_path, 'w', encoding='utf-8') as f: config.write(f)
Write the config to the output path. Creates the necessary directories if they aren't there. Args: config (configparser.ConfigParser): A ConfigParser. config_path (str): Output path for the config file.
juraj-google-style
def cylindrical_vert(script, radius=1.0, inside=True): if inside: function = 'sqrt(x^2+y^2)<={}'.format(radius) else: function = 'sqrt(x^2+y^2)>={}'.format(radius) vert_function(script, function=function) return None
Select all vertices within a cylindrical radius Args: radius (float): radius of the cylinder inside (bool): if True, select vertices inside the radius; otherwise select vertices outside it Layer stack: No impacts MeshLab versions: 2016.12 1.3.4BETA
codesearchnet
def GetFileEntryByPathSpec(self, path_spec): return encoded_stream_file_entry.EncodedStreamFileEntry( self._resolver_context, self, path_spec, is_root=True, is_virtual=True)
Retrieves a file entry for a path specification. Args: path_spec (PathSpec): a path specification. Returns: EncodedStreamFileEntry: a file entry or None if not available.
juraj-google-style
def __call__(self, *x_batch, **kwargs) -> Union[List, np.ndarray]: with self.graph.as_default(): K.set_session(self.sess) return self._net.predict_on_batch(x_batch, **kwargs)
Predicts answers on batch elements. Args: *x_batch: a batch to predict answers on
juraj-google-style
def json_to_entity(tc_data, value_fields, resource_type, resource_type_parent): if (not isinstance(tc_data, list)): tc_data = [tc_data] entity_array = [] for d in tc_data: entity = {'id': d.get('id'), 'webLink': d.get('webLink')} values = [] if ('summary' in d): v...
Convert ThreatConnect JSON response to a TCEntityArray. .. Attention:: This method is subject to frequent changes. Args: tc_data (dictionary): Array of data returned from TC API call. value_fields (list): Field names that contain the "value" data. resource_type (string): The resource type of the tc_data provided. res...
codesearchnet
def model_call_event(self) -> asyncio.Event: return self._model_call_event
Returns an event that is set when the wrapped processor has all parts. The event is set when the wrapped processor has all the input parts and is about to start generating the output. The event starts in a cleared state when the first part of the input stream is yielded. It is also cleared at the end of the wrappedpr...
github-repos
def determine_git_ref(self, config): ref_config_keys = 0 for i in ['commit', 'tag', 'branch']: if config.get(i): ref_config_keys += 1 if (ref_config_keys > 1): raise ImportError("Fetching remote git sources failed: conflicting revisions (e.g. 'commit', 'tag', 'branch') specified ...
Determine the ref to be used for 'git checkout'. Args: config (dict): git config dictionary Returns: str: A commit id or tag name
codesearchnet
def get_ip_reports(self, ips): api_name = 'virustotal-ip-address-reports' (all_responses, ips) = self._bulk_cache_lookup(api_name, ips) responses = self._request_reports('ip', ips, 'ip-address/report') for (ip, response) in zip(ips, responses): if self._cache: self._cache.cache_value...
Retrieves the most recent VT info for a set of ips. Args: ips: list of IPs. Returns: A dict with the IP as key and the VT report as value.
codesearchnet
def get_alignment_df(a_aln_seq, b_aln_seq, a_seq_id=None, b_seq_id=None): if (len(a_aln_seq) != len(b_aln_seq)): raise ValueError('Sequence lengths not equal - was an alignment run?') if (not a_seq_id): a_seq_id = 'a_seq' if (not b_seq_id): b_seq_id = 'b_seq' a_aln_seq = ssbio.pr...
Summarize two alignment strings in a dataframe. Args: a_aln_seq (str): Aligned sequence string b_aln_seq (str): Aligned sequence string a_seq_id (str): Optional ID of a_seq b_seq_id (str): Optional ID of b_aln_seq Returns: DataFrame: a per-residue level annotation of the alignment
codesearchnet
def find_gaps(self, index=False): return self.__find_incongruities(op=operator.lt, index=index)
Finds gaps in a striplog. Args: index (bool): If True, returns indices of intervals with gaps after them. Returns: Striplog: A striplog of all the gaps. A sort of anti-striplog.
juraj-google-style
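The `find_gaps` row delegates to a shared scan parameterized by a comparison from the `operator` module: comparing one interval's base against the next interval's top with `operator.lt` flags gaps, while `operator.gt` would flag overlaps. A sketch over (top, base) tuples (the tuple representation here is an assumption, not striplog's actual interval type):

```python
import operator

def find_incongruities(intervals, op):
    # Flag index i when op(base of interval i, top of interval i+1) holds.
    flagged = []
    for i, (top, base) in enumerate(intervals[:-1]):
        next_top = intervals[i + 1][0]
        if op(base, next_top):
            flagged.append(i)
    return flagged

def find_gaps(intervals):
    # base < next top means the intervals do not touch: a gap.
    return find_incongruities(intervals, operator.lt)
```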
def write(data, file_name, worksheet_names=None): if re.search(XML_EXT_REGEX, file_name): return write_xml(data, file_name, worksheet_names=worksheet_names) elif re.search(XLSX_EXT_REGEX, file_name): return write_xlsx(data, file_name, worksheet_names=worksheet_names) elif re.search(XLS_EXT_R...
Writes 2D tables to file. Args: data: 2D list of tables/worksheets. file_name: Name of the output file (determines type). worksheet_names: A list of worksheet names (optional).
codesearchnet
def _terminate_all(self, sig=None): sig = sig or getattr(signal, 'SIGKILL', signal.SIGTERM) for (task_type, task_id), p in self._processes.items(): if p.exitcode is not None: logging.info('%s-%d has already exited. Not terminating.', task_type, task_id) continue try: ...
Terminates all subprocesses. The caller is required to hold self._process_lock. Args: sig: the signal used to terminate the process. The default is SIGKILL.
github-repos
def poll_stack(self): logging.info('polling stack status, POLL_INTERVAL={}'.format(POLL_INTERVAL)) time.sleep(POLL_INTERVAL) completed_states = [ 'CREATE_COMPLETE', 'UPDATE_COMPLETE', 'DELETE_COMPLETE' ] stack_name = self._config.get('...
Spin in a loop until the CloudFormation process either fails or succeeds Args: None Returns: Good or bad; True or False
juraj-google-style
def _GetClientIdFromQueue(q): split = q.Split() if not split or len(split) < 2: return None split = [s.lower() for s in split] str_client_id, tasks_marker = split if not str_client_id.startswith("c.") or tasks_marker != "tasks": return None str_client_id = "C" + str_client_id[1:] ret...
Returns q's client id, if q is a client task queue, otherwise None. Args: q: rdfvalue.RDFURN Returns: string or None
juraj-google-style
def _ReadAttributeValueString(self, attribute_values_data, record_offset, attribute_values_data_offset, attribute_value_offset): if (attribute_value_offset == 0): return None data_type_map = self._GetDataTypeMap('keychain_string') file_offset = ((record_offset + attribute_values_data_offset) + attri...
Reads a string attribute value. Args: attribute_values_data (bytes): attribute values data. record_offset (int): offset of the record relative to the start of the file. attribute_values_data_offset (int): offset of the attribute values data relative to the start of the record. attribute_value_offset (int): offset of t...
codesearchnet
def get_occupation(self, atom_index, orbital): orbital_index = self.orbitals.index(orbital) return {spin: np.sum(d[:, :, atom_index, orbital_index] * self.weights[:, None]) for spin, d in self.data.items()}
Returns the occupation for a particular orbital of a particular atom. Args: atom_index (int): Index of atom in the PROCAR. It should be noted that VASP uses 1-based indexing for atoms, but this is converted to 0-based indexing in this parser to be consistent with representation of structures in pymatgen. orbital (str): ...
codesearchnet
def getUserCaPath(self, name): cert = self.getUserCert(name) if cert is None: return None return self._getCaPath(cert)
Gets the path to the CA certificate that issued a given user keypair. Args: name (str): The name of the user keypair. Examples: Get the path to the CA cert which issued the cert for "myuser": mypath = cdir.getUserCaPath('myuser') Returns: str: The path if exists.
codesearchnet
def _configure_tls_parameters(parameters): cert = config.conf["tls"]["certfile"] key = config.conf["tls"]["keyfile"] if cert and key: _log.info( "Authenticating with server using x509 (certfile: %s, keyfile: %s)", cert, key, ) parameters.crede...
Configure the pika connection parameters for TLS based on the configuration. This modifies the object provided to it. This accounts for whether or not the new API based on the standard library's SSLContext is available for pika. Args: parameters (pika.ConnectionParameters): The connection parameters to apply TLS conn...
juraj-google-style
def new(self, injection_site_fn): return _InjectionContext(injection_site_fn, binding_stack=[], scope_id=scoping.UNSCOPED, is_scope_usable_from_scope_fn=self._is_scope_usable_from_scope_fn)
Creates a _InjectionContext. Args: injection_site_fn: the initial function being injected into Returns: a new empty _InjectionContext in the default scope
codesearchnet
def ExtractEvents(self, parser_mediator, registry_key, **kwargs): values_dict = {} if registry_key.number_of_values > 0: for registry_value in registry_key.GetValues(): value_name = registry_value.name or '(default)' if registry_value.DataIsString(): value_string = '[{0:s}...
Extracts events from a Windows Registry key. Args: parser_mediator (ParserMediator): mediates interactions between parsers and other components, such as storage and dfvfs. registry_key (dfwinreg.WinRegistryKey): Windows Registry key.
juraj-google-style
def generate_support_dump(self, information, timeout=-1): uri = '{}/support-dumps'.format(self.data['uri']) return self._helper.create(information, uri=uri, timeout=timeout)
Generates a support dump for the logical enclosure with the specified ID. A logical enclosure support dump includes content for logical interconnects associated with that logical enclosure. By default, it also contains appliance support dump content. Args: information (dict): Information to generate support dump. time...
codesearchnet
def bind(self, extension: Extension) -> 'DictMentor': if not Extension.is_valid_extension(extension): raise ValueError("Cannot bind extension due to missing interface requirements") self._extensions.append(extension) return self
Add any predefined or custom extension. Args: extension: Extension to add to the processor. Returns: The DictMentor itself for chaining.
juraj-google-style
def describe(self, version_name): version_yaml = yaml.safe_dump(self.get_version_details(version_name), default_flow_style=False) print(version_yaml)
Print information about a specified model version. Args: version_name: the name of the version in short form, such as "v1".
juraj-google-style
def split(s, posix=True): if isinstance(s, six.binary_type): s = s.decode('utf-8') return shlex.split(s, posix=posix)
Split the string s using shell-like syntax. Args: s (str): String to split posix (bool): Use posix split Returns: list of str: List of string parts
codesearchnet
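The `split` wrapper above just normalizes bytes to text before handing off to `shlex.split`. A self-contained sketch with its usage:

```python
import shlex

def split(s, posix=True):
    # Decode bytes first so shlex always sees text.
    if isinstance(s, bytes):
        s = s.decode('utf-8')
    return shlex.split(s, posix=posix)

print(split('grep -r "hello world" .'))  # ['grep', '-r', 'hello world', '.']
print(split(b'ls -la'))                  # ['ls', '-la']
```

In POSIX mode the surrounding quotes are consumed and the quoted phrase becomes a single token; with `posix=False` the quotes would be kept in the token.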
def oxide_type(structure, relative_cutoff=1.1, return_nbonds=False): ox_obj = OxideType(structure, relative_cutoff) if return_nbonds: return ox_obj.oxide_type, ox_obj.nbonds else: return ox_obj.oxide_type
Determines if an oxide is a peroxide/superoxide/ozonide/normal oxide Args: structure (Structure): Input structure. relative_cutoff (float): relative_cutoff * actual cutoff stipulates the maximum distance two O atoms may be from each other. return_nbonds (bool): Should the number of bonds be returned?
juraj-google-style
def while_loop(self, context, step_method): logger.debug('starting') context['whileCounter'] = 0 if self.stop is None and self.max is None: logger.error('while decorator missing both max and stop.') raise PipelineDefinitionError('the while decorator must have either max or stop, or bo...
Run step inside a while loop. Args: context: (pypyr.context.Context) The pypyr context. This arg will mutate - after method execution will contain the new updated context. step_method: (method/function) This is the method/function that will execute on every loop iteration. Signature is: function(context)
codesearchnet
def generate_output(line='0', short=None, name=None, value=None, is_parent=False, colorize=True): output = '{0}{1}{2}{3}{4}{5}{6}{7}\n'.format((LINES['{0}{1}'.format(line, ('C' if colorize else ''))] if (line in LINES.keys()) else ''), (COLOR_DEPTH[line] if (colorize and (line in COLOR_DEPTH)) else ''), ANSI['b'], ...
The function for formatting CLI output results. Args: line (:obj:`str`): The line number (0-4). Determines indentation. Defaults to '0'. short (:obj:`str`): The optional abbreviated name for a field. See hr.py for values. name (:obj:`str`): The optional name for a field. See hr.py for values. value (:obj:`str`): The f...
codesearchnet
def run_using_threadpool(fn_to_execute, inputs, pool_size): if not hasattr(threading.current_thread(), '_children'): threading.current_thread()._children = weakref.WeakKeyDictionary() pool = ThreadPool(min(pool_size, len(inputs))) try: old_level = logging.getLogger().level return poo...
For internal use only; no backwards-compatibility guarantees. Runs the given function on given inputs using a thread pool. Args: fn_to_execute: Function to execute inputs: Inputs on which given function will be executed in parallel. pool_size: Size of thread pool. Returns: Results retrieved after executing the given ...
github-repos
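Stripped of the logging and daemon-thread bookkeeping, the core of `run_using_threadpool` is a bounded `ThreadPool.map`. A minimal sketch:

```python
from multiprocessing.pool import ThreadPool

def run_using_threadpool(fn, inputs, pool_size):
    """Sketch of the pattern above: map fn over inputs with a thread
    pool sized no larger than the number of inputs, and tear the pool
    down when done."""
    pool = ThreadPool(min(pool_size, len(inputs)))
    try:
        # map blocks until all inputs are processed; results keep
        # input order regardless of completion order.
        return pool.map(fn, inputs)
    finally:
        pool.terminate()

print(run_using_threadpool(lambda x: x * x, [1, 2, 3, 4], pool_size=8))
# [1, 4, 9, 16]
```

Capping the pool at `len(inputs)` avoids spawning idle threads when the input list is smaller than the requested pool size.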
def _GetPlistRootKey(self, file_entry): file_object = file_entry.GetFileObject() try: plist_file = plist.PlistFile() plist_file.Read(file_object) except IOError as exception: location = getattr(file_entry.path_spec, 'location', '') raise errors.PreProcessFail('Unable to read ...
Retrieves the root key of a plist file. Args: file_entry (dfvfs.FileEntry): file entry of the plist. Returns: dict[str, object]: plist root key. Raises: errors.PreProcessFail: if the preprocessing fails.
codesearchnet
def cli_print(msg, color='', end=None, file=sys.stdout, logger=_LOG): if logger: logger.debug('-> {}'.format(msg)) if CLI_QUIET: return if (end is None): end = _linesep_for_file(file) file.write('{color}{msg}{reset}{end}'.format(color=color, msg=msg, reset=colorama.Style.RESET_AL...
Print the message to file and also log it. This function is intended as a 'tee' mechanism to enable the CLI interface as a first-class citizen, while ensuring that everything the operator sees also has an analogous logging entry in the test record for later inspection. Args: msg: The message to print/log. color: Opti...
codesearchnet
def get_uid(prefix=''): object_name_uids = global_state.get_global_attribute('object_name_uids', default=collections.defaultdict(int), set_to_default=True) object_name_uids[prefix] += 1 return object_name_uids[prefix]
Associates a string prefix with an integer counter. Args: prefix: String prefix to index. Returns: Unique integer ID. Example: >>> get_uid('dense') 1 >>> get_uid('dense') 2
github-repos
def generate_hdate(date: str, subtract_year: str) -> str: try: input_date = datetime.datetime.strptime(date, '%Y-%m-%d') if input_date.month == 2 and input_date.day == 29: input_date = input_date - datetime.timedelta(days=1) subtract_year = int(subtract_year) except (ValueErr...
Generate a historical date by subtracting a specified number of years from the given date. If input date is leap day (Feb 29), return Feb 28 even if target hdate is also a leap year. This is expected in ECMWF API. Args: date (str): The input date in the format 'YYYY-MM-DD'. subtract_year (str): The number of years to ...
github-repos
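The leap-day handling in `generate_hdate` is the interesting part: a Feb 29 input is first shifted to Feb 28 so the year subtraction can never produce a nonexistent date. A hedged sketch of the happy path (the original also catches and handles parse errors, which is elided here):

```python
import datetime

def generate_hdate(date, subtract_year):
    """Sketch of the historical-date helper: subtract N years,
    mapping a Feb 29 input to Feb 28 so the result always exists."""
    d = datetime.datetime.strptime(date, '%Y-%m-%d')
    if d.month == 2 and d.day == 29:
        d -= datetime.timedelta(days=1)
    return d.replace(year=d.year - int(subtract_year)).strftime('%Y-%m-%d')

print(generate_hdate('2024-02-29', '1'))  # '2023-02-28'
print(generate_hdate('2023-06-15', '3'))  # '2020-06-15'
```

Without the Feb 29 guard, `d.replace(year=2023)` on a 2024-02-29 input would raise `ValueError`, since 2023 has no Feb 29.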
def add(self, value, date=None, return_value=False, key=None): data = {} if self._metric_id is None: self.tcex.handle_error(715, [self._metric_name]) body = {'value': value} if date is not None: body['date'] = self.tcex.utils.format_datetime(date, date_f...
Add metrics data to collection. Args: value (str): The value of the metric. date (str, optional): The optional date of the metric. return_value (bool, default:False): Tell the API to return the updates metric value. key (str, optional): The key value for keyed metrics. Return: dict: If return_value is True a dict wit...
juraj-google-style
def html_to_xhtml(html_unicode_string): try: assert isinstance(html_unicode_string, str) except AssertionError: raise TypeError root = BeautifulSoup(html_unicode_string, 'html.parser') try: assert root.html is not None except AssertionError: raise Val...
Converts html to xhtml Args: html_unicode_string: A (possible unicode) string representing HTML. Returns: A (possibly unicode) string representing XHTML. Raises: TypeError: Raised if input_string isn't a unicode string or string.
juraj-google-style
def cumprod(x, axis=0): return math_ops.cumprod(x, axis=axis)
Cumulative product of the values in a tensor, alongside the specified axis. Args: x: A tensor or variable. axis: An integer, the axis to compute the product. Returns: A tensor of the cumulative product of values of `x` along `axis`.
github-repos
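The `cumprod` entry is a thin wrapper over TensorFlow's `math_ops.cumprod`. For readers without TensorFlow, the semantics along a single axis can be sketched in pure Python with `itertools.accumulate`:

```python
import itertools

def cumprod(xs):
    """Pure-Python analogue of a 1-D cumulative product: element i of
    the result is the product of xs[0..i]."""
    return list(itertools.accumulate(xs, lambda a, b: a * b))

print(cumprod([1, 2, 3, 4]))  # [1, 2, 6, 24]
```

The tensor version generalizes this by applying the same scan independently along the chosen `axis`.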
def revert_to(self): response = self.resource.repo.api.http_request('PATCH', self.uri) if response.status_code == 204: logger.debug('reverting to previous version of resource, %s' % self.uri) self._current_resource.refresh() else: raise Exception('HTTP %s, could not revert to resource v...
Revert the resource to this version by issuing a PATCH request. Args: None Returns: None: sends PATCH request, and refreshes parent resource
juraj-google-style
def _zeros_slot(self, var, slot_name, op_name): named_slots = self._slot_dict(slot_name) if _var_key(var) not in named_slots: new_slot_variable = slot_creator.create_zeros_slot(var, op_name, copy_xla_sharding=True) self._restore_slot_variable(slot_name=slot_name, variable=var, slot_variable=new_...
Find or create a slot initialized with 0.0. Args: var: A `Variable` object. slot_name: Name for the slot. op_name: Name to use when scoping the Variable that needs to be created for the slot. Returns: A `Variable` object.
github-repos
def set(self, *args, **kwargs): if args: for arg in args: if arg is not None: for name in self.__slots__: self._set(name, getattr(arg, name, UNSET)) for name in kwargs: self._set(name, kwargs.get(name, UNSET))
Conveniently set one or more fields at a time. Args: *args: Optionally set from other objects, available fields from the passed object are used in order **kwargs: Set from given key/value pairs (only names defined in __slots__ are used)
juraj-google-style
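The bulk `set` entry above copies fields from positional source objects first, then applies keyword overrides. A self-contained sketch with a hypothetical `Record` class and `UNSET` sentinel (names not from the original):

```python
UNSET = object()  # sentinel distinguishing "not provided" from None

class Record:
    __slots__ = ('name', 'size')

    def __init__(self):
        for field in self.__slots__:
            setattr(self, field, UNSET)

    def set(self, *args, **kwargs):
        """Sketch of the bulk setter: copy fields present on each
        positional source object, then apply keyword overrides."""
        for arg in args:
            if arg is not None:
                for field in self.__slots__:
                    value = getattr(arg, field, UNSET)
                    if value is not UNSET:
                        setattr(self, field, value)
        for field, value in kwargs.items():
            if field in self.__slots__:
                setattr(self, field, value)

a = Record()
a.set(name='chunk', size=10)
b = Record()
b.set(a, size=20)      # copy from a, then override size
print(b.name, b.size)  # chunk 20
```

Processing positional sources before keywords gives explicit `**kwargs` values the last word, which matches the ordering in the entry above.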
def asdim(dimension): if isinstance(dimension, Dimension): return dimension elif isinstance(dimension, (tuple, dict, basestring)): return Dimension(dimension) else: raise ValueError('%s type could not be interpreted as Dimension. Dimensions must be declared as a string, tuple, dictio...
Convert the input to a Dimension. Args: dimension: tuple, dict or string type to convert to Dimension Returns: A Dimension object constructed from the dimension spec. No copy is performed if the input is already a Dimension.
codesearchnet
def ajax(cls, url, param={}, method='get'): param = urllib.parse.urlencode(param) if method.lower() == 'get': req = urllib.request.Request(url + '?' + param) elif method.lower() == 'post': param = param.encode('utf-8') req = urllib.request.Request(url, data=param) else: ...
Get info by ajax Args: url (str): Request URL param (dict): Query or form parameters method (str): 'get' or 'post' Returns: dict: json decoded into a dict
codesearchnet
def WMITimeStrToRDFDatetime(self, timestr): offset_minutes = timestr[21:] year = timestr[:4] month = timestr[4:6] day = timestr[6:8] hours = timestr[8:10] minutes = timestr[10:12] seconds = timestr[12:14] microseconds = timestr[15:21] unix_seconds = calendar.timegm(tuple(map(int, [ye...
Return RDFDatetime from string like 20140825162259.000000-420. Args: timestr: WMI time string Returns: rdfvalue.RDFDatetime We have some timezone manipulation work to do here because the UTC offset is in minutes rather than +-HHMM
codesearchnet
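The WMI timestamp parser above deals with an unusual wrinkle: the trailing UTC offset is expressed in minutes (e.g. `-420`), not the familiar `±HHMM`. A sketch of the slicing and normalization, returning a Unix epoch instead of the original `RDFDatetime`:

```python
import calendar
import datetime

def wmi_time_to_epoch(timestr):
    """Sketch of the parse above: 'YYYYMMDDHHMMSS.ffffff±UUU' where
    the trailing offset is in minutes, not the usual ±HHMM."""
    offset_minutes = int(timestr[21:])
    parts = [timestr[:4], timestr[4:6], timestr[6:8],
             timestr[8:10], timestr[10:12], timestr[12:14]]
    local = tuple(int(p) for p in parts)
    # timegm interprets the tuple as UTC; subtracting the offset
    # converts the local wall-clock reading to true UTC seconds.
    return calendar.timegm(local + (0, 0, 0)) - offset_minutes * 60

epoch = wmi_time_to_epoch('20140825162259.000000-420')
print(datetime.datetime.utcfromtimestamp(epoch))  # 2014-08-25 23:22:59
```

Here `-420` means UTC-7, so local 16:22:59 normalizes to 23:22:59 UTC. The microseconds field (positions 15-21) is sliced past but unused in this sketch.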
def copy_course_videos(source_course_id, destination_course_id): if (source_course_id == destination_course_id): return course_videos = CourseVideo.objects.select_related('video', 'video_image').filter(course_id=six.text_type(source_course_id)) for course_video in course_videos: (destination...
Adds the destination_course_id to the videos taken from the source_course_id Args: source_course_id: The original course_id destination_course_id: The new course_id where the videos will be copied
codesearchnet
def set_property(property_map, name, value, exclude_from_indexes=None): set_value(property_map[name], value, exclude_from_indexes)
Set property value in the given datastore.Property proto message. Args: property_map: a string->datastore.Value protobuf map. name: name of the property. value: python object or datastore.Value. exclude_from_indexes: if the value should be exclude from indexes. None leaves indexing as is (defaults to False if value is...
codesearchnet
def interpolate(features, hparams, decode_hp): inputs, targets = features["inputs"], features["targets"] inputs = tf.unstack(inputs, axis=1) targets = tf.unstack(targets, axis=1) coeffs = np.linspace(0.0, 1.0, decode_hp.num_interp) first_frame, last_frame = inputs[0], targets[-1] first_top_z, first_l...
Interpolate between the first input frame and last target frame. Args: features: dict of tensors hparams: HParams, training hparams. decode_hp: HParams, decode hparams. Returns: images: interpolated images, 4-D Tensor, shape=(num_interp, H, W, C) first_frame: image, 3-D Tensor, shape=(1, H, W, C) last_frame: image, 3-...
juraj-google-style
def dbmax50years(self, value=None): if value is not None: try: value = float(value) except ValueError: raise ValueError('value {} need to be of type float ' 'for field `dbmax50years`'.format(value)) self._...
Corresponds to IDD Field `dbmax50years` 50-year return period values for maximum extreme dry-bulb temperature Args: value (float): value for IDD Field `dbmax50years` Unit: C if `value` is None it will not be checked against the specification and is assumed to be a missing value Raises: ValueError: if `value` is not a...
juraj-google-style
def __init__(self, rr, table='services'): self.rr = rr self.table = table self._ensure_table()
Initialize the service registry. Creates the database table if it does not exist. Args: rr (doublethink.Rethinker): a doublethink.Rethinker, which must have `dbname` set
juraj-google-style
def PrepareMergeTaskStorage(self, task): if task.identifier not in self._task_storage_writers: raise IOError('Storage writer for task: {0:s} does not exist.'.format( task.identifier))
Prepares a task storage for merging. Args: task (Task): task. Raises: IOError: if the task storage does not exist. OSError: if the task storage does not exist.
juraj-google-style
def dump_artifact(obj, path, filename=None): p_sha1 = None if (not os.path.exists(path)): os.makedirs(path, mode=448) else: p_sha1 = hashlib.sha1() p_sha1.update(obj.encode(encoding='UTF-8')) if (filename is None): (fd, fn) = tempfile.mkstemp(dir=path) else: f...
Write the artifact to disk at the specified path Args: obj (string): The string object to be dumped to disk in the specified path. The artifact filename will be automatically created path (string): The full path to the artifacts data directory. filename (string, optional): The name of file to write the artifact to....
codesearchnet