Dataset columns: signature (string, 8 to 3.44k chars), body (string, 0 to 1.41M chars), docstring (string, 1 to 122k chars), id (string, 5 to 17 chars).
def get_num_obs_choosing_each_alternative(obs_per_alt_dict):
<EOL>num_obs_per_group = OrderedDict()<EOL>for alt_id in obs_per_alt_dict:<EOL><INDENT>num_obs_per_group[alt_id] = len(obs_per_alt_dict[alt_id])<EOL><DEDENT>tot_num_obs = sum([num_obs_per_group[g] for g in num_obs_per_group])<EOL>return num_obs_per_group, tot_num_obs<EOL>
Will create an ordered dictionary that records the number of units of observation that have chosen the given alternative (i.e. the associated dictionary key). Will also determine the total number of unique observations in the dataset. Parameters ---------- obs_per_alt_dict : dict. Each key should be a unique alternative id. Each key's value will be a 1D ndarray that contains the sorted, unique observation ids of those observational units that chose the given alternative. Returns ------- num_obs_per_group : OrderedDict. Keys will be the alternative ids present in `obs_per_alt_dict`. Values will be `len(obs_per_alt_dict[alt_id])`. tot_num_obs : int. Denotes the total number of unique observation ids in one's dataset.
f7705:m1
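The body above is stored in tokenized form (<EOL>/<INDENT> markers, elided literals); a condensed restatement of the same counting logic, with an illustrative function name, might look like this:

from collections import OrderedDict

import numpy as np

def count_obs_per_alternative(obs_per_alt_dict):
    # Number of observation ids recorded for each alternative, preserving
    # the dictionary's iteration order.
    num_obs_per_group = OrderedDict(
        (alt_id, len(obs_ids)) for alt_id, obs_ids in obs_per_alt_dict.items())
    # Total number of observations across all alternatives.
    tot_num_obs = sum(num_obs_per_group.values())
    return num_obs_per_group, tot_num_obs

counts, total = count_obs_per_alternative(
    {1: np.array([10, 11, 12]), 2: np.array([13, 14])})
assert counts == OrderedDict([(1, 3), (2, 2)]) and total == 5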
def create_cross_sectional_bootstrap_samples(obs_id_array,<EOL>alt_id_array,<EOL>choice_array,<EOL>num_samples,<EOL>seed=None):
<EOL>chosen_alts_to_obs_ids =relate_obs_ids_to_chosen_alts(obs_id_array, alt_id_array, choice_array)<EOL>num_obs_per_group, tot_num_obs =get_num_obs_choosing_each_alternative(chosen_alts_to_obs_ids)<EOL>ids_per_sample = np.empty((num_samples, tot_num_obs), dtype=float)<EOL>if seed is not None:<EOL><INDENT>if not isinstance(seed, int):<EOL><INDENT>msg = "<STR_LIT>"<EOL>raise ValueError(msg)<EOL><DEDENT>np.random.seed(seed)<EOL><DEDENT>col_idx = <NUM_LIT:0><EOL>for alt_id in num_obs_per_group:<EOL><INDENT>relevant_ids = chosen_alts_to_obs_ids[alt_id]<EOL>resample_size = num_obs_per_group[alt_id]<EOL>current_ids = (np.random.choice(relevant_ids,<EOL>size=resample_size * num_samples,<EOL>replace=True)<EOL>.reshape((num_samples, resample_size)))<EOL>end_col = col_idx + resample_size<EOL>ids_per_sample[:, col_idx:end_col] = current_ids<EOL>col_idx += resample_size<EOL><DEDENT>return ids_per_sample<EOL>
Determines the unique observations that will be present in each bootstrap sample. This function DOES NOT create the new design matrices or a new long-format dataframe for each bootstrap sample. Note that these will be correct bootstrap samples for cross-sectional datasets. This function will not work correctly for panel datasets. Parameters ---------- obs_id_array : 1D ndarray of ints. Each element should denote a unique observation id for the corresponding row of the long format array. alt_id_array : 1D ndarray of ints. Each element should denote a unique alternative id for the corresponding row of the long format array. choice_array : 1D ndarray of ints. Each element should be a one or a zero. The values should denote whether or not the corresponding alternative in `alt_id_array` was chosen by the observational unit in the corresponding row of `obs_id_array`. num_samples : int. Denotes the number of bootstrap samples that need to be drawn. seed : non-negative int or None, optional. Denotes the random seed to be used in order to ensure reproducibility of the bootstrap sample generation. Default is None. If None, no seed will be used and the generation of the bootstrap samples will (in general) not be reproducible. Returns ------- ids_per_sample : 2D ndarray. Each row represents a complete bootstrap sample. Each column denotes a selected bootstrap observation that comprises the bootstrap sample. The elements of the array denote the observation ids of the chosen observational units.
f7705:m2
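The sampling step above draws, for each alternative, a with-replacement resample of the observation ids that chose it, and concatenates the draws column-wise. A minimal sketch of that idea (illustrative name; the original additionally validates the seed and pre-allocates the output array):

import numpy as np

def cross_sectional_bootstrap_ids(chosen_alts_to_obs_ids, num_samples, seed=None):
    # chosen_alts_to_obs_ids maps alternative id -> 1D array of observation
    # ids that chose that alternative.
    rng = np.random.RandomState(seed)
    columns = []
    for alt_id, obs_ids in chosen_alts_to_obs_ids.items():
        # Resample, with replacement, the observations that chose this
        # alternative; one row per bootstrap sample.
        columns.append(
            rng.choice(obs_ids, size=(num_samples, obs_ids.size), replace=True))
    # Each row of the result is one complete bootstrap sample.
    return np.hstack(columns)

samples = cross_sectional_bootstrap_ids(
    {1: np.array([10, 11, 12]), 2: np.array([13, 14])}, num_samples=4, seed=0)
assert samples.shape == (4, 5)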
def create_bootstrap_id_array(obs_id_per_sample):
<EOL>n_rows, n_cols = obs_id_per_sample.shape<EOL>bootstrap_id_array =np.tile(np.arange(n_cols) + <NUM_LIT:1>, n_rows).reshape((n_rows, n_cols))<EOL>return bootstrap_id_array<EOL>
Creates a 2D ndarray that contains the 'bootstrap ids' for each replication of each unit of observation that is in the set of bootstrap samples. Parameters ---------- obs_id_per_sample : 2D ndarray of ints. Should have one row for each bootstrap sample. Should have one column for each observational unit that is serving as a new bootstrap observational unit. Returns ------- bootstrap_id_array : 2D ndarray of ints. Will have the same shape as `obs_id_per_sample`. Each element will denote the fake observational id in the new bootstrap dataset.
f7705:m3
def create_deepcopied_groupby_dict(orig_df, obs_id_col):
<EOL>obs_id_vals = orig_df[obs_id_col].values<EOL>unique_obs_ids = np.unique(obs_id_vals)<EOL>groupby_dict = {}<EOL>for obs_id in unique_obs_ids:<EOL><INDENT>desired_rows = obs_id_vals == obs_id<EOL>groupby_dict[obs_id] = orig_df.loc[desired_rows].copy(deep=True)<EOL><DEDENT>return groupby_dict<EOL>
Will create a dictionary where each key corresponds to a unique value in `orig_df[obs_id_col]` and each value corresponds to all of the rows of `orig_df` where `orig_df[obs_id_col] == key`. Parameters ---------- orig_df : pandas DataFrame. Should be long-format dataframe containing the data used to estimate the desired choice model. obs_id_col : str. Should be a column name within `orig_df`. Should denote the original observation id column. Returns ------- groupby_dict : dict. Each key will be a unique value in `orig_df[obs_id_col]` and each value will be the rows of `orig_df` where `orig_df[obs_id_col] == key`.
f7705:m4
def check_column_existence(col_name, df, presence=True):
if presence:<EOL><INDENT>if col_name not in df.columns:<EOL><INDENT>msg = "<STR_LIT>"<EOL>raise ValueError(msg.format(col_name))<EOL><DEDENT><DEDENT>else:<EOL><INDENT>if col_name in df.columns:<EOL><INDENT>msg = "<STR_LIT>"<EOL>raise ValueError(msg.format(col_name))<EOL><DEDENT><DEDENT>return None<EOL>
Checks whether or not `col_name` is in `df` and raises a helpful error msg if the desired condition is not met. Parameters ---------- col_name : str. Should represent a column whose presence in `df` is to be checked. df : pandas DataFrame. The dataframe that will be checked for the presence of `col_name`. presence : bool, optional. If True, then this function checks for the PRESENCE of `col_name` in `df`. If False, then this function checks for the ABSENCE of `col_name` in `df`. Default == True. Returns ------- None.
f7705:m5
def ensure_resampled_obs_ids_in_df(resampled_obs_ids, orig_obs_id_array):
if not np.in1d(resampled_obs_ids, orig_obs_id_array).all():<EOL><INDENT>msg ="<STR_LIT>"<EOL>raise ValueError(msg)<EOL><DEDENT>return None<EOL>
Checks whether all ids in `resampled_obs_ids` are in `orig_obs_id_array`. Raises a helpful ValueError if not. Parameters ---------- resampled_obs_ids : 1D ndarray of ints. Should contain the observation ids of the observational units that will be used in the current bootstrap sample. orig_obs_id_array : 1D ndarray of ints. Should contain the observation ids of the observational units in the original dataframe containing the data for this model. Returns ------- None.
f7705:m6
def create_bootstrap_dataframe(orig_df,<EOL>obs_id_col,<EOL>resampled_obs_ids_1d,<EOL>groupby_dict,<EOL>boot_id_col="<STR_LIT>"):
<EOL>check_column_existence(obs_id_col, orig_df, presence=True)<EOL>check_column_existence(boot_id_col, orig_df, presence=False)<EOL>obs_id_values = orig_df[obs_id_col].values<EOL>ensure_resampled_obs_ids_in_df(resampled_obs_ids_1d, obs_id_values)<EOL>component_dfs = []<EOL>for boot_id, obs_id in enumerate(resampled_obs_ids_1d):<EOL><INDENT>extracted_df = groupby_dict[obs_id].copy()<EOL>extracted_df[boot_id_col] = boot_id + <NUM_LIT:1><EOL>component_dfs.append(extracted_df)<EOL><DEDENT>bootstrap_df = pd.concat(component_dfs, axis=<NUM_LIT:0>, ignore_index=True)<EOL>return bootstrap_df<EOL>
Will create the altered dataframe of data needed to estimate a choice model with the particular observations that belong to the current bootstrap sample. Parameters ---------- orig_df : pandas DataFrame. Should be long-format dataframe containing the data used to estimate the desired choice model. obs_id_col : str. Should be a column name within `orig_df`. Should denote the original observation id column. resampled_obs_ids_1d : 1D ndarray of ints. Each value should represent the observation id of a given bootstrap replicate. groupby_dict : dict. Each key will be a unique value in `orig_df[obs_id_col]` and each value will be the rows of `orig_df` where `orig_df[obs_id_col] == key`. boot_id_col : str, optional. Denotes the new column that will be created to specify the bootstrap observation ids for choice model estimation. Returns ------- bootstrap_df : pandas DataFrame. Will contain all the same columns as `orig_df` as well as the additional `boot_id_col`. For each value in `resampled_obs_ids_1d`, `bootstrap_df` will contain the long format rows from `orig_df` that have the given observation id.
f7705:m7
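A small end-to-end sketch of how the pieces in this file fit together: split the long-format data by observation id, then stack one copy of each resampled observation with a fresh bootstrap id. The names (build_bootstrap_df, "boot_id") are illustrative, not the library's defaults:

import numpy as np
import pandas as pd

def build_bootstrap_df(orig_df, obs_id_col, resampled_obs_ids, boot_id_col="boot_id"):
    # Pre-split the long-format data by original observation id.
    groupby_dict = {obs_id: sub_df.copy(deep=True)
                    for obs_id, sub_df in orig_df.groupby(obs_id_col)}
    pieces = []
    for boot_id, obs_id in enumerate(resampled_obs_ids, start=1):
        piece = groupby_dict[obs_id].copy()
        # Each replicate gets its own bootstrap id, even when the same
        # original observation is drawn more than once.
        piece[boot_id_col] = boot_id
        pieces.append(piece)
    return pd.concat(pieces, axis=0, ignore_index=True)

orig = pd.DataFrame({"obs_id": [1, 1, 2, 2], "alt_id": [1, 2, 1, 2],
                     "choice": [1, 0, 0, 1]})
boot_df = build_bootstrap_df(orig, "obs_id", np.array([2, 2, 1]))
assert list(boot_df["boot_id"].unique()) == [1, 2, 3]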
def paths_in_directory(input_directory):
paths = []<EOL>for base_path, directories, filenames in os.walk(input_directory):<EOL><INDENT>relative_path = os.path.relpath(base_path, input_directory)<EOL>path_components = relative_path.split(os.sep)<EOL>if path_components[<NUM_LIT:0>] == "<STR_LIT:.>":<EOL><INDENT>path_components = path_components[<NUM_LIT:1>:]<EOL><DEDENT>if path_components and path_components[<NUM_LIT:0>].startswith("<STR_LIT:.>"):<EOL><INDENT>continue<EOL><DEDENT>path_components = list(filter(bool, path_components)) <EOL>for filename in filenames:<EOL><INDENT>if filename.startswith("<STR_LIT:.>"):<EOL><INDENT>continue<EOL><DEDENT>paths.append(path_components + [filename])<EOL><DEDENT><DEDENT>return paths<EOL>
Generate a list of all files in input_directory, each as a list containing path components.
f7707:m0
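Because the string literals are elided in the body above, here is a de-tokenized sketch of the same walk; the leading "." checks come from the <STR_LIT:.> placeholders, and the empty-component filtering is folded into the relative-path handling:

import os

def paths_in_directory(input_directory):
    # Every non-hidden file, as a list of path components relative to
    # input_directory; trees rooted at hidden directories are skipped.
    paths = []
    for base_path, _dirs, filenames in os.walk(input_directory):
        rel = os.path.relpath(base_path, input_directory)
        components = [] if rel == "." else rel.split(os.sep)
        if components and components[0].startswith("."):
            continue
        for filename in filenames:
            if filename.startswith("."):
                continue
            paths.append(components + [filename])
    return paths

# e.g. paths_in_directory("src") -> [["README.md"], ["pkg", "module.py"], ...]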
def static_uint8_variable_for_data(variable_name, data, max_line_length=<NUM_LIT>, comment="<STR_LIT>", indent=<NUM_LIT:2>):
hex_components = []<EOL>for byte in data:<EOL><INDENT>byte_as_hex = "<STR_LIT>".format(u=ord(byte))<EOL>hex_components.append(byte_as_hex)<EOL><DEDENT>chunk_size = (max_line_length - indent + <NUM_LIT:2> - <NUM_LIT:1>) // <NUM_LIT:6> <EOL>array_lines = []<EOL>for chunk_offset in range(<NUM_LIT:0>, len(hex_components), chunk_size):<EOL><INDENT>chunk = hex_components[chunk_offset:chunk_offset + chunk_size]<EOL>array_lines.append("<STR_LIT:U+0020>" * indent + "<STR_LIT:U+002CU+0020>".join(chunk) + "<STR_LIT:U+002C>")<EOL><DEDENT>array_data = "<STR_LIT:\n>".join(array_lines)<EOL>if comment != "<STR_LIT>":<EOL><INDENT>comment = "<STR_LIT>" + comment<EOL><DEDENT>substitutions = {"<STR_LIT:v>": variable_name,<EOL>"<STR_LIT:l>": len(hex_components),<EOL>"<STR_LIT:d>": array_data,<EOL>"<STR_LIT:c>": comment}<EOL>declaration = "<STR_LIT>".format(**substitutions)<EOL>return declaration<EOL>
r""" >>> static_uint8_variable_for_data("v", "abc") 'static uint8_t v[3] = {\n 0x61, 0x62, 0x63,\n}; // v' >>> static_uint8_variable_for_data("v", "abc", comment="hi") 'static uint8_t v[3] = { // hi\n 0x61, 0x62, 0x63,\n}; // v' >>> static_uint8_variable_for_data("v", "abc", indent=4) 'static uint8_t v[3] = {\n 0x61, 0x62, 0x63,\n}; // v' >>> static_uint8_variable_for_data("v", "abcabcabcabc", max_line_length=20) 'static uint8_t v[12] = {\n 0x61, 0x62, 0x63,\n 0x61, 0x62, 0x63,\n 0x61, 0x62, 0x63,\n 0x61, 0x62, 0x63,\n}; // v'
f7707:m1
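The defaults and string templates are elided above; the reconstruction below reproduces the doctest output, assuming a default max_line_length of 80 and byte strings whose elements are single-character strings (the Python 2 behaviour implied by ord(byte)):

def static_uint8_variable_for_data(variable_name, data, max_line_length=80,
                                   comment="", indent=2):
    # Hex-format each byte; each "0xNN, " entry is 6 characters wide.
    hex_components = ["0x{:02x}".format(ord(byte)) for byte in data]
    chunk_size = (max_line_length - indent + 2 - 1) // 6
    array_lines = []
    for offset in range(0, len(hex_components), chunk_size):
        chunk = hex_components[offset:offset + chunk_size]
        array_lines.append(" " * indent + ", ".join(chunk) + ",")
    array_data = "\n".join(array_lines)
    if comment:
        comment = " // " + comment
    # C declaration with the variable name echoed in a trailing comment.
    return "static uint8_t {v}[{l}] = {{{c}\n{d}\n}}; // {v}".format(
        v=variable_name, l=len(hex_components), d=array_data, c=comment)

assert (static_uint8_variable_for_data("v", "abc")
        == 'static uint8_t v[3] = {\n  0x61, 0x62, 0x63,\n}; // v')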
def tap(f):
@wraps(f)<EOL>def _cb(res, *a, **kw):<EOL><INDENT>d = maybeDeferred(f, res, *a, **kw)<EOL>d.addCallback(lambda ignored: res)<EOL>return d<EOL><DEDENT>return _cb<EOL>
"Tap" a Deferred callback chain with a function whose return value is ignored.
f7710:m0
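A usage sketch for the decorator above (assuming tap is in scope and Twisted is installed): the tapped callback runs for its side effect only, and the original result keeps flowing down the chain.

from twisted.internet.defer import succeed

d = succeed(42)
d.addCallback(tap(lambda res: print("side effect, saw", res)))
d.addCallback(lambda res: res + 1)             # still receives 42, yields 43
d.addCallback(lambda res: print("final:", res))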
@deprecated(Version('<STR_LIT>', <NUM_LIT:1>, <NUM_LIT:2>, <NUM_LIT:0>), '<STR_LIT>')<EOL>def chainCerts(data):
matches = re.findall(<EOL>r'<STR_LIT>',<EOL>data,<EOL>flags=re.DOTALL)<EOL>chainCertificates = [<EOL>Certificate.loadPEM(chainCertPEM).original<EOL>for chainCertPEM in matches]<EOL>return chainCertificates[<NUM_LIT:1>:]<EOL>
Matches and returns any certificates found except the first match. Regex code copied from L{twisted.internet.endpoints._parseSSL}. Related ticket: https://twistedmatrix.com/trac/ticket/7732 @type data: L{bytes} @param data: PEM-encoded data containing the certificates. @rtype: L{list} containing L{Certificate}s.
f7712:m0
def timeout_deferred(deferred, timeout, clock, deferred_description=None):
timed_out = [False]<EOL>def time_it_out():<EOL><INDENT>timed_out[<NUM_LIT:0>] = True<EOL>deferred.cancel()<EOL><DEDENT>delayed_call = clock.callLater(timeout, time_it_out)<EOL>def convert_cancelled(f):<EOL><INDENT>if timed_out[<NUM_LIT:0>]:<EOL><INDENT>f.trap(defer.CancelledError)<EOL>raise TimedOutError(timeout, deferred_description)<EOL><DEDENT>return f<EOL><DEDENT>deferred.addErrback(convert_cancelled)<EOL>def cancel_timeout(result):<EOL><INDENT>if delayed_call.active():<EOL><INDENT>delayed_call.cancel()<EOL><DEDENT>return result<EOL><DEDENT>deferred.addBoth(cancel_timeout)<EOL>
Time out a deferred - schedule it to be cancelled after ``timeout`` seconds from now, as per the clock. If it gets timed out, it errbacks with a :class:`TimedOutError`, unless a cancelable function is passed to the ``Deferred``'s initialization and it callbacks or errbacks with something else when cancelled (see the documentation for :class:`twisted.internet.defer.Deferred` for more details). :param Deferred deferred: Which deferred to time out (cancel) :param int timeout: How long before timing out the deferred (in seconds) :param str deferred_description: A description of the Deferred or the Deferred's purpose - if not provided, defaults to the ``repr`` of the Deferred. To be passed to :class:`TimedOutError` for a pretty Exception string. :param IReactorTime clock: Clock to be used to schedule the timeout - used for testing. :return: ``None`` based on: https://twistedmatrix.com/trac/ticket/990
f7713:m0
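A test-style usage sketch, assuming timeout_deferred (above) and its TimedOutError are importable; twisted.internet.task.Clock lets the timeout be driven deterministically:

from twisted.internet.defer import Deferred
from twisted.internet.task import Clock

clock = Clock()
d = Deferred()
timeout_deferred(d, 5, clock, deferred_description="fetch server list")

failures = []
d.addErrback(failures.append)   # collect the failure instead of letting it raise

clock.advance(5)                # fires the delayed call, which cancels the Deferred
assert len(failures) == 1       # the CancelledError was converted to TimedOutError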
def get_keywords():
<EOL>git_refnames = "<STR_LIT>"<EOL>git_full = "<STR_LIT>"<EOL>git_date = "<STR_LIT>"<EOL>keywords = {"<STR_LIT>": git_refnames, "<STR_LIT>": git_full, "<STR_LIT:date>": git_date}<EOL>return keywords<EOL>
Get the keywords needed to look up the version information.
f7714:m0
def get_config():
<EOL>cfg = VersioneerConfig()<EOL>cfg.VCS = "<STR_LIT>"<EOL>cfg.style = "<STR_LIT>"<EOL>cfg.tag_prefix = "<STR_LIT>"<EOL>cfg.parentdir_prefix = "<STR_LIT>"<EOL>cfg.versionfile_source = "<STR_LIT>"<EOL>cfg.verbose = False<EOL>return cfg<EOL>
Create, populate and return the VersioneerConfig() object.
f7714:m1
def register_vcs_handler(vcs, method):
def decorate(f):<EOL><INDENT>"""<STR_LIT>"""<EOL>if vcs not in HANDLERS:<EOL><INDENT>HANDLERS[vcs] = {}<EOL><DEDENT>HANDLERS[vcs][method] = f<EOL>return f<EOL><DEDENT>return decorate<EOL>
Decorator to mark a method as the handler for a particular VCS.
f7714:m2
def run_command(commands, args, cwd=None, verbose=False, hide_stderr=False,<EOL>env=None):
assert isinstance(commands, list)<EOL>p = None<EOL>for c in commands:<EOL><INDENT>try:<EOL><INDENT>dispcmd = str([c] + args)<EOL>p = subprocess.Popen([c] + args, cwd=cwd, env=env,<EOL>stdout=subprocess.PIPE,<EOL>stderr=(subprocess.PIPE if hide_stderr<EOL>else None))<EOL>break<EOL><DEDENT>except EnvironmentError:<EOL><INDENT>e = sys.exc_info()[<NUM_LIT:1>]<EOL>if e.errno == errno.ENOENT:<EOL><INDENT>continue<EOL><DEDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" % dispcmd)<EOL>print(e)<EOL><DEDENT>return None, None<EOL><DEDENT><DEDENT>else:<EOL><INDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" % (commands,))<EOL><DEDENT>return None, None<EOL><DEDENT>stdout = p.communicate()[<NUM_LIT:0>].strip()<EOL>if sys.version_info[<NUM_LIT:0>] >= <NUM_LIT:3>:<EOL><INDENT>stdout = stdout.decode()<EOL><DEDENT>if p.returncode != <NUM_LIT:0>:<EOL><INDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" % dispcmd)<EOL>print("<STR_LIT>" % stdout)<EOL><DEDENT>return None, p.returncode<EOL><DEDENT>return stdout, p.returncode<EOL>
Call the given command(s).
f7714:m3
def versions_from_parentdir(parentdir_prefix, root, verbose):
rootdirs = []<EOL>for i in range(<NUM_LIT:3>):<EOL><INDENT>dirname = os.path.basename(root)<EOL>if dirname.startswith(parentdir_prefix):<EOL><INDENT>return {"<STR_LIT:version>": dirname[len(parentdir_prefix):],<EOL>"<STR_LIT>": None,<EOL>"<STR_LIT>": False, "<STR_LIT:error>": None, "<STR_LIT:date>": None}<EOL><DEDENT>else:<EOL><INDENT>rootdirs.append(root)<EOL>root = os.path.dirname(root) <EOL><DEDENT><DEDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" %<EOL>(str(rootdirs), parentdir_prefix))<EOL><DEDENT>raise NotThisMethod("<STR_LIT>")<EOL>
Try to determine the version from the parent directory name. Source tarballs conventionally unpack into a directory that includes both the project name and a version string. We will also support searching up two directory levels for an appropriately named parent directory
f7714:m4
@register_vcs_handler("<STR_LIT>", "<STR_LIT>")<EOL>def git_get_keywords(versionfile_abs):
<EOL>keywords = {}<EOL>try:<EOL><INDENT>f = open(versionfile_abs, "<STR_LIT:r>")<EOL>for line in f.readlines():<EOL><INDENT>if line.strip().startswith("<STR_LIT>"):<EOL><INDENT>mo = re.search(r'<STR_LIT>', line)<EOL>if mo:<EOL><INDENT>keywords["<STR_LIT>"] = mo.group(<NUM_LIT:1>)<EOL><DEDENT><DEDENT>if line.strip().startswith("<STR_LIT>"):<EOL><INDENT>mo = re.search(r'<STR_LIT>', line)<EOL>if mo:<EOL><INDENT>keywords["<STR_LIT>"] = mo.group(<NUM_LIT:1>)<EOL><DEDENT><DEDENT>if line.strip().startswith("<STR_LIT>"):<EOL><INDENT>mo = re.search(r'<STR_LIT>', line)<EOL>if mo:<EOL><INDENT>keywords["<STR_LIT:date>"] = mo.group(<NUM_LIT:1>)<EOL><DEDENT><DEDENT><DEDENT>f.close()<EOL><DEDENT>except EnvironmentError:<EOL><INDENT>pass<EOL><DEDENT>return keywords<EOL>
Extract version information from the given file.
f7714:m5
@register_vcs_handler("<STR_LIT>", "<STR_LIT>")<EOL>def git_versions_from_keywords(keywords, tag_prefix, verbose):
if not keywords:<EOL><INDENT>raise NotThisMethod("<STR_LIT>")<EOL><DEDENT>date = keywords.get("<STR_LIT:date>")<EOL>if date is not None:<EOL><INDENT>date = date.strip().replace("<STR_LIT:U+0020>", "<STR_LIT:T>", <NUM_LIT:1>).replace("<STR_LIT:U+0020>", "<STR_LIT>", <NUM_LIT:1>)<EOL><DEDENT>refnames = keywords["<STR_LIT>"].strip()<EOL>if refnames.startswith("<STR_LIT>"):<EOL><INDENT>if verbose:<EOL><INDENT>print("<STR_LIT>")<EOL><DEDENT>raise NotThisMethod("<STR_LIT>")<EOL><DEDENT>refs = set([r.strip() for r in refnames.strip("<STR_LIT>").split("<STR_LIT:U+002C>")])<EOL>TAG = "<STR_LIT>"<EOL>tags = set([r[len(TAG):] for r in refs if r.startswith(TAG)])<EOL>if not tags:<EOL><INDENT>tags = set([r for r in refs if re.search(r'<STR_LIT>', r)])<EOL>if verbose:<EOL><INDENT>print("<STR_LIT>" % "<STR_LIT:U+002C>".join(refs - tags))<EOL><DEDENT><DEDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" % "<STR_LIT:U+002C>".join(sorted(tags)))<EOL><DEDENT>for ref in sorted(tags):<EOL><INDENT>if ref.startswith(tag_prefix):<EOL><INDENT>r = ref[len(tag_prefix):]<EOL>if verbose:<EOL><INDENT>print("<STR_LIT>" % r)<EOL><DEDENT>return {"<STR_LIT:version>": r,<EOL>"<STR_LIT>": keywords["<STR_LIT>"].strip(),<EOL>"<STR_LIT>": False, "<STR_LIT:error>": None,<EOL>"<STR_LIT:date>": date}<EOL><DEDENT><DEDENT>if verbose:<EOL><INDENT>print("<STR_LIT>")<EOL><DEDENT>return {"<STR_LIT:version>": "<STR_LIT>",<EOL>"<STR_LIT>": keywords["<STR_LIT>"].strip(),<EOL>"<STR_LIT>": False, "<STR_LIT:error>": "<STR_LIT>", "<STR_LIT:date>": None}<EOL>
Get version information from git keywords.
f7714:m6
@register_vcs_handler("<STR_LIT>", "<STR_LIT>")<EOL>def git_pieces_from_vcs(tag_prefix, root, verbose, run_command=run_command):
GITS = ["<STR_LIT>"]<EOL>if sys.platform == "<STR_LIT:win32>":<EOL><INDENT>GITS = ["<STR_LIT>", "<STR_LIT>"]<EOL><DEDENT>out, rc = run_command(GITS, ["<STR_LIT>", "<STR_LIT>"], cwd=root,<EOL>hide_stderr=True)<EOL>if rc != <NUM_LIT:0>:<EOL><INDENT>if verbose:<EOL><INDENT>print("<STR_LIT>" % root)<EOL><DEDENT>raise NotThisMethod("<STR_LIT>")<EOL><DEDENT>describe_out, rc = run_command(GITS, ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>",<EOL>"<STR_LIT>", "<STR_LIT>",<EOL>"<STR_LIT>", "<STR_LIT>" % tag_prefix],<EOL>cwd=root)<EOL>if describe_out is None:<EOL><INDENT>raise NotThisMethod("<STR_LIT>")<EOL><DEDENT>describe_out = describe_out.strip()<EOL>full_out, rc = run_command(GITS, ["<STR_LIT>", "<STR_LIT>"], cwd=root)<EOL>if full_out is None:<EOL><INDENT>raise NotThisMethod("<STR_LIT>")<EOL><DEDENT>full_out = full_out.strip()<EOL>pieces = {}<EOL>pieces["<STR_LIT>"] = full_out<EOL>pieces["<STR_LIT>"] = full_out[:<NUM_LIT:7>] <EOL>pieces["<STR_LIT:error>"] = None<EOL>git_describe = describe_out<EOL>dirty = git_describe.endswith("<STR_LIT>")<EOL>pieces["<STR_LIT>"] = dirty<EOL>if dirty:<EOL><INDENT>git_describe = git_describe[:git_describe.rindex("<STR_LIT>")]<EOL><DEDENT>if "<STR_LIT:->" in git_describe:<EOL><INDENT>mo = re.search(r'<STR_LIT>', git_describe)<EOL>if not mo:<EOL><INDENT>pieces["<STR_LIT:error>"] = ("<STR_LIT>"<EOL>% describe_out)<EOL>return pieces<EOL><DEDENT>full_tag = mo.group(<NUM_LIT:1>)<EOL>if not full_tag.startswith(tag_prefix):<EOL><INDENT>if verbose:<EOL><INDENT>fmt = "<STR_LIT>"<EOL>print(fmt % (full_tag, tag_prefix))<EOL><DEDENT>pieces["<STR_LIT:error>"] = ("<STR_LIT>"<EOL>% (full_tag, tag_prefix))<EOL>return pieces<EOL><DEDENT>pieces["<STR_LIT>"] = full_tag[len(tag_prefix):]<EOL>pieces["<STR_LIT>"] = int(mo.group(<NUM_LIT:2>))<EOL>pieces["<STR_LIT>"] = mo.group(<NUM_LIT:3>)<EOL><DEDENT>else:<EOL><INDENT>pieces["<STR_LIT>"] = None<EOL>count_out, rc = run_command(GITS, ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>"],<EOL>cwd=root)<EOL>pieces["<STR_LIT>"] = int(count_out) <EOL><DEDENT>date = run_command(GITS, ["<STR_LIT>", "<STR_LIT>", "<STR_LIT>", "<STR_LIT>"],<EOL>cwd=root)[<NUM_LIT:0>].strip()<EOL>pieces["<STR_LIT:date>"] = date.strip().replace("<STR_LIT:U+0020>", "<STR_LIT:T>", <NUM_LIT:1>).replace("<STR_LIT:U+0020>", "<STR_LIT>", <NUM_LIT:1>)<EOL>return pieces<EOL>
Get version from 'git describe' in the root of the source tree. This only gets called if the git-archive 'subst' keywords were *not* expanded, and _version.py hasn't already been rewritten with a short version string, meaning we're inside a checked out source tree.
f7714:m7
def plus_or_dot(pieces):
if "<STR_LIT:+>" in pieces.get("<STR_LIT>", "<STR_LIT>"):<EOL><INDENT>return "<STR_LIT:.>"<EOL><DEDENT>return "<STR_LIT:+>"<EOL>
Return a + if we don't already have one, else return a .
f7714:m8
def render_pep440(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"] or pieces["<STR_LIT>"]:<EOL><INDENT>rendered += plus_or_dot(pieces)<EOL>rendered += "<STR_LIT>" % (pieces["<STR_LIT>"], pieces["<STR_LIT>"])<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT><DEDENT><DEDENT>else:<EOL><INDENT>rendered = "<STR_LIT>" % (pieces["<STR_LIT>"],<EOL>pieces["<STR_LIT>"])<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT><DEDENT>return rendered<EOL>
Build up version string, with post-release "local version identifier". Our goal: TAG[+DISTANCE.gHEX[.dirty]] . Note that if you get a tagged build and then dirty it, you'll get TAG+0.gHEX.dirty Exceptions: 1: no tags. git_describe was just HEX. 0+untagged.DISTANCE.gHEX[.dirty]
f7714:m9
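With the string literals elided, the rendering logic above is hard to follow; this is a sketch of the same rule, assuming the standard Versioneer piece keys ("closest-tag", "distance", "short", "dirty"):

def render_pep440_sketch(pieces):
    if pieces["closest-tag"]:
        rendered = pieces["closest-tag"]
        if pieces["distance"] or pieces["dirty"]:
            # plus_or_dot: use "." if the tag already carries a local version.
            rendered += "." if "+" in pieces["closest-tag"] else "+"
            rendered += "%d.g%s" % (pieces["distance"], pieces["short"])
            if pieces["dirty"]:
                rendered += ".dirty"
    else:
        # No tag at all: 0+untagged.DISTANCE.gHEX[.dirty]
        rendered = "0+untagged.%d.g%s" % (pieces["distance"], pieces["short"])
        if pieces["dirty"]:
            rendered += ".dirty"
    return rendered

assert render_pep440_sketch({"closest-tag": "1.2.0", "distance": 3,
                             "short": "abc1234", "dirty": True}) == "1.2.0+3.gabc1234.dirty"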
def render_pep440_pre(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>" % pieces["<STR_LIT>"]<EOL><DEDENT><DEDENT>else:<EOL><INDENT>rendered = "<STR_LIT>" % pieces["<STR_LIT>"]<EOL><DEDENT>return rendered<EOL>
TAG[.post.devDISTANCE] -- No -dirty. Exceptions: 1: no tags. 0.post.devDISTANCE
f7714:m10
def render_pep440_post(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"] or pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>" % pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT>rendered += plus_or_dot(pieces)<EOL>rendered += "<STR_LIT>" % pieces["<STR_LIT>"]<EOL><DEDENT><DEDENT>else:<EOL><INDENT>rendered = "<STR_LIT>" % pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT>rendered += "<STR_LIT>" % pieces["<STR_LIT>"]<EOL><DEDENT>return rendered<EOL>
TAG[.postDISTANCE[.dev0]+gHEX] . The ".dev0" means dirty. Note that .dev0 sorts backwards (a dirty tree will appear "older" than the corresponding clean one), but you shouldn't be releasing software with -dirty anyways. Exceptions: 1: no tags. 0.postDISTANCE[.dev0]
f7714:m11
def render_pep440_old(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"] or pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>" % pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT><DEDENT><DEDENT>else:<EOL><INDENT>rendered = "<STR_LIT>" % pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT><DEDENT>return rendered<EOL>
TAG[.postDISTANCE[.dev0]] . The ".dev0" means dirty. Exceptions: 1: no tags. 0.postDISTANCE[.dev0]
f7714:m12
def render_git_describe(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>" % (pieces["<STR_LIT>"], pieces["<STR_LIT>"])<EOL><DEDENT><DEDENT>else:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL><DEDENT>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT>return rendered<EOL>
TAG[-DISTANCE-gHEX][-dirty]. Like 'git describe --tags --dirty --always'. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix)
f7714:m13
def render_git_describe_long(pieces):
if pieces["<STR_LIT>"]:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL>rendered += "<STR_LIT>" % (pieces["<STR_LIT>"], pieces["<STR_LIT>"])<EOL><DEDENT>else:<EOL><INDENT>rendered = pieces["<STR_LIT>"]<EOL><DEDENT>if pieces["<STR_LIT>"]:<EOL><INDENT>rendered += "<STR_LIT>"<EOL><DEDENT>return rendered<EOL>
TAG-DISTANCE-gHEX[-dirty]. Like 'git describe --tags --dirty --always --long'. The distance/hash is unconditional. Exceptions: 1: no tags. HEX[-dirty] (note: no 'g' prefix)
f7714:m14
def render(pieces, style):
if pieces["<STR_LIT:error>"]:<EOL><INDENT>return {"<STR_LIT:version>": "<STR_LIT>",<EOL>"<STR_LIT>": pieces.get("<STR_LIT>"),<EOL>"<STR_LIT>": None,<EOL>"<STR_LIT:error>": pieces["<STR_LIT:error>"],<EOL>"<STR_LIT:date>": None}<EOL><DEDENT>if not style or style == "<STR_LIT:default>":<EOL><INDENT>style = "<STR_LIT>" <EOL><DEDENT>if style == "<STR_LIT>":<EOL><INDENT>rendered = render_pep440(pieces)<EOL><DEDENT>elif style == "<STR_LIT>":<EOL><INDENT>rendered = render_pep440_pre(pieces)<EOL><DEDENT>elif style == "<STR_LIT>":<EOL><INDENT>rendered = render_pep440_post(pieces)<EOL><DEDENT>elif style == "<STR_LIT>":<EOL><INDENT>rendered = render_pep440_old(pieces)<EOL><DEDENT>elif style == "<STR_LIT>":<EOL><INDENT>rendered = render_git_describe(pieces)<EOL><DEDENT>elif style == "<STR_LIT>":<EOL><INDENT>rendered = render_git_describe_long(pieces)<EOL><DEDENT>else:<EOL><INDENT>raise ValueError("<STR_LIT>" % style)<EOL><DEDENT>return {"<STR_LIT:version>": rendered, "<STR_LIT>": pieces["<STR_LIT>"],<EOL>"<STR_LIT>": pieces["<STR_LIT>"], "<STR_LIT:error>": None,<EOL>"<STR_LIT:date>": pieces.get("<STR_LIT:date>")}<EOL>
Render the given version pieces into the requested style.
f7714:m15
def get_versions():
<EOL>cfg = get_config()<EOL>verbose = cfg.verbose<EOL>try:<EOL><INDENT>return git_versions_from_keywords(get_keywords(), cfg.tag_prefix,<EOL>verbose)<EOL><DEDENT>except NotThisMethod:<EOL><INDENT>pass<EOL><DEDENT>try:<EOL><INDENT>root = os.path.realpath(__file__)<EOL>for i in cfg.versionfile_source.split('<STR_LIT:/>'):<EOL><INDENT>root = os.path.dirname(root)<EOL><DEDENT><DEDENT>except NameError:<EOL><INDENT>return {"<STR_LIT:version>": "<STR_LIT>", "<STR_LIT>": None,<EOL>"<STR_LIT>": None,<EOL>"<STR_LIT:error>": "<STR_LIT>",<EOL>"<STR_LIT:date>": None}<EOL><DEDENT>try:<EOL><INDENT>pieces = git_pieces_from_vcs(cfg.tag_prefix, root, verbose)<EOL>return render(pieces, cfg.style)<EOL><DEDENT>except NotThisMethod:<EOL><INDENT>pass<EOL><DEDENT>try:<EOL><INDENT>if cfg.parentdir_prefix:<EOL><INDENT>return versions_from_parentdir(cfg.parentdir_prefix, root, verbose)<EOL><DEDENT><DEDENT>except NotThisMethod:<EOL><INDENT>pass<EOL><DEDENT>return {"<STR_LIT:version>": "<STR_LIT>", "<STR_LIT>": None,<EOL>"<STR_LIT>": None,<EOL>"<STR_LIT:error>": "<STR_LIT>", "<STR_LIT:date>": None}<EOL>
Get version information or return default if unable to do so.
f7714:m16
def setUp(self):
self.clock = Clock()<EOL>self.deferred = Deferred()<EOL>
Create a clock and a deferred to be cancelled
f7716:c1:m0
def enum_values_fixture():
return [<EOL>EnumItem(u'<STR_LIT:foo>', u'<STR_LIT>', quux=u'<STR_LIT:hello>', frob=u'<STR_LIT>'),<EOL>EnumItem(u'<STR_LIT:bar>', u'<STR_LIT>', quux=u'<STR_LIT>'),<EOL>EnumItem(u'<STR_LIT>', u'<STR_LIT>', frob=u'<STR_LIT>')]<EOL>
Fixture suitable for use with `Enum`.
f7718:m0
def object_enum_values_fixture(object1, object2, object3):
return [<EOL>EnumItem(object1, u'<STR_LIT>', quux=u'<STR_LIT:hello>', frob=u'<STR_LIT>'),<EOL>EnumItem(object2, u'<STR_LIT>', quux=u'<STR_LIT>'),<EOL>EnumItem(object3, u'<STR_LIT>', frob=u'<STR_LIT>', id=u'<STR_LIT>')]<EOL>
Fixture suitable for use with `ObjectEnum`.
f7718:m1
def filter_enum(pred, enum):
def _items():<EOL><INDENT>for item in enum:<EOL><INDENT>yield EnumItem(<EOL>item.value,<EOL>item.desc,<EOL>not pred(item),<EOL>**item._extra)<EOL><DEDENT><DEDENT>return Enum('<STR_LIT>'.format(enum), list(_items()))<EOL>
Create a new enumeration containing only items filtered from another enumeration. Hidden enum items in the original enumeration are excluded. :type pred: ``Callable[[`EnumItem`], bool]`` :param pred: Predicate that will keep items for which the result is true. :type enum: Enum :param enum: Enumeration to filter. :rtype: Enum :return: New filtered enumeration.
f7719:m0
def __init__(self, doc, values, value_key=attrgetter('<STR_LIT:value>')):
self.doc = doc<EOL>_order = self._order = []<EOL>_values = self._values = {}<EOL>for value in values:<EOL><INDENT>key = value_key(value)<EOL>if key in _values:<EOL><INDENT>raise ValueError(<EOL>'<STR_LIT>'.format(key))<EOL><DEDENT>_order.append(value)<EOL>_values[key] = value<EOL><DEDENT>
:param str doc: Brief documentation of the enumeration. :type values: ``List[`EnumItem`]`` :param values: List of enumeration items. :type value_key: ``Callable[[`EnumItem`], unicode]`` :param value_key: Function to produce the key to use when constructing a mapping for each item in ``values``.
f7719:c0:m0
@classmethod<EOL><INDENT>def from_pairs(cls, doc, pairs):<DEDENT>
values = (EnumItem(value, desc) for value, desc in pairs)<EOL>return cls(doc=doc, values=values)<EOL>
Construct an enumeration from an iterable of pairs. :param doc: See `Enum.__init__`. :type pairs: ``Iterable[Tuple[unicode, unicode]]`` :param pairs: Iterable to construct the enumeration from. :rtype: Enum
f7719:c0:m4
def get(self, value):
_nothing = object()<EOL>item = self._values.get(value, _nothing)<EOL>if item is _nothing:<EOL><INDENT>raise InvalidEnumItem(value)<EOL><DEDENT>return item<EOL>
Get an enumeration item for an enumeration value. :param unicode value: Enumeration value. :raise InvalidEnumItem: If ``value`` does not match any known enumeration value. :rtype: EnumItem
f7719:c0:m6
def desc(self, value):
try:<EOL><INDENT>return self.get(value).desc<EOL><DEDENT>except InvalidEnumItem:<EOL><INDENT>return u'<STR_LIT>'<EOL><DEDENT>
Get the enumeration item description for an enumeration value. :param unicode value: Enumeration value.
f7719:c0:m7
def extra(self, value, extra_name, default=None):
try:<EOL><INDENT>return self.get(value).get(extra_name, default)<EOL><DEDENT>except InvalidEnumItem:<EOL><INDENT>return default<EOL><DEDENT>
Get the additional enumeration value for ``extra_name``. :param unicode value: Enumeration value. :param str extra_name: Extra name. :param default: Default value in the case ``extra_name`` doesn't exist.
f7719:c0:m9
def find(self, **names):
for res in self.find_all(**names):<EOL><INDENT>return res<EOL><DEDENT>return None<EOL>
Find the first item with matching extra values. :param \*\*names: Extra values to match. :rtype: `EnumItem` :return: First matching item or ``None``.
f7719:c0:m11
def find_all(self, **names):
values = names.items()<EOL>if len(values) != <NUM_LIT:1>:<EOL><INDENT>raise ValueError('<STR_LIT>')<EOL><DEDENT>name, value = values[<NUM_LIT:0>]<EOL>for item in self:<EOL><INDENT>if item.get(name) == value:<EOL><INDENT>yield item<EOL><DEDENT><DEDENT>
Find all items with matching extra values. :param \*\*names: Extra values to match. :rtype: ``Iterable[`EnumItem`]``
f7719:c0:m12
def as_pairs(self):
return [(i.value, i.desc) for i in self]<EOL>
Transform the enumeration into a sequence of pairs. :rtype: ``List[Tuple[unicode, unicode]]`` :return: List of enumeration value and description pairs.
f7719:c0:m14
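Putting the Enum/EnumItem entries above together, a minimal self-contained sketch of the from_pairs/as_pairs round-trip (simplified: no duplicate-key check, no InvalidEnumItem, no hidden-item handling):

from operator import attrgetter

class EnumItem(object):
    def __init__(self, value, desc, hidden=False, **extra):
        self.value, self.desc, self.hidden, self._extra = value, desc, hidden, extra
    def get(self, name, default=None):
        return self._extra.get(name, default)

class Enum(object):
    def __init__(self, doc, values, value_key=attrgetter('value')):
        self.doc = doc
        self._order = list(values)
        self._values = {value_key(item): item for item in self._order}
    @classmethod
    def from_pairs(cls, doc, pairs):
        return cls(doc, [EnumItem(value, desc) for value, desc in pairs])
    def get(self, value):
        return self._values[value]
    def as_pairs(self):
        return [(item.value, item.desc) for item in self._order]

colours = Enum.from_pairs(u'Colours', [(u'red', u'Red'), (u'blue', u'Blue')])
assert colours.get(u'red').desc == u'Red'
assert colours.as_pairs() == [(u'red', u'Red'), (u'blue', u'Blue')]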
def __init__(self, value, desc, hidden=False, **extra):
self.value = value<EOL>self.desc = desc<EOL>self.hidden = hidden<EOL>self._extra = extra<EOL>
Initialise an enumeration item. :param value: See `EnumItem.value`. :param desc: See `EnumItem.desc`. :param hidden: See `EnumItem.hidden`. :param \*\*extra: Additional extra values, accessed via `EnumItem.get`.
f7719:c2:m0
def __getattr__(self, name):
warn('<STR_LIT>',<EOL>DeprecationWarning, <NUM_LIT:2>)<EOL>if name in self._extra:<EOL><INDENT>return self.get(name)<EOL><DEDENT>raise AttributeError(<EOL>'<STR_LIT>'.format(<EOL>type(self).__name__, name))<EOL>
Get an extra value by name.
f7719:c2:m3
def get(self, name, default=None):
return self._extra.get(name, default)<EOL>
Get the value of an extra parameter. :param str name: Extra parameter name. :param default: Default value in the case ``name`` doesn't exist.
f7719:c2:m4
def items(self):
return self._extra.items()<EOL>
Additional enumeration values. :rtype: ``Iterable[Tuple[str, object]]``
f7719:c2:m5
def patched_str(self):
def red(words):<EOL><INDENT>return u("<STR_LIT>") % words<EOL><DEDENT>def white(words):<EOL><INDENT>return u("<STR_LIT>") % words<EOL><DEDENT>def blue(words):<EOL><INDENT>return u("<STR_LIT>") % words<EOL><DEDENT>def teal(words):<EOL><INDENT>return u("<STR_LIT>") % words<EOL><DEDENT>def get_uri(code):<EOL><INDENT>return "<STR_LIT>".format(code)<EOL><DEDENT>if hasattr(sys.stderr, '<STR_LIT>') and sys.stderr.isatty():<EOL><INDENT>msg = (<EOL>"<STR_LIT>"<EOL>"<STR_LIT>".format(<EOL>red_error=red("<STR_LIT>"),<EOL>request_was=white("<STR_LIT>"),<EOL>http_line=teal("<STR_LIT>" % (self.method, self.uri)),<EOL>sw_returned=white(<EOL>"<STR_LIT>"),<EOL>message=blue(str(self.msg))<EOL>))<EOL>if self.code:<EOL><INDENT>msg = "<STR_LIT>".join([msg, "<STR_LIT>".format(<EOL>more_info=white("<STR_LIT>"),<EOL>uri=blue(get_uri(self.code))),<EOL>])<EOL><DEDENT>return msg<EOL><DEDENT>else:<EOL><INDENT>return "<STR_LIT>".format(self.status, self.msg)<EOL><DEDENT>
Try to pretty-print the exception if it is going to be displayed on screen (i.e. stderr is a tty).
f7730:m0
def patched_applicationinstance_init(self, version, payload, account_sid, sid=None):
super(ApplicationInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': sid or self._properties['<STR_LIT>'], }<EOL>
Initialize the ApplicationInstance :returns: twilio.rest.api.v2010.account.application.ApplicationInstance :rtype: twilio.rest.api.v2010.account.application.ApplicationInstance
f7730:m1
def patched_accountinstance_init(self, version, payload, sid=None):
super(AccountInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT:status>': payload['<STR_LIT:status>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT:type>': payload['<STR_LIT:type>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': sid or self._properties['<STR_LIT>'], }<EOL>
Initialize the AccountInstance :returns: twilio.rest.api.v2010.account.AccountInstance :rtype: twilio.rest.api.v2010.account.AccountInstance
f7730:m2
def patched_localinstance_init(self, version, payload, account_sid, country_code):
super(LocalInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': country_code, }<EOL>
Initialize the LocalInstance :returns: twilio.rest.api.v2010.account.available_phone_number.local.LocalInstance :rtype: twilio.rest.api.v2010.account.available_phone_number.local.LocalInstance
f7730:m3
def patched_incomingphonenumberinstance_init(self, version, payload, account_sid, sid=None):
super(IncomingPhoneNumberInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': sid or self._properties['<STR_LIT>'], }<EOL>
Initialize the IncomingPhoneNumberInstance :returns: twilio.rest.api.v2010.account.incoming_phone_number.IncomingPhoneNumberInstance :rtype: twilio.rest.api.v2010.account.incoming_phone_number.IncomingPhoneNumberInstance
f7730:m4
def patched_tollfreeinstance_init(self, version, payload, account_sid, country_code):
super(TollFreeInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': country_code, }<EOL>
Initialize the TollFreeInstance :returns: twilio.rest.api.v2010.account.available_phone_number.toll_free.TollFreeInstance :rtype: twilio.rest.api.v2010.account.available_phone_number.toll_free.TollFreeInstance
f7730:m5
def patched_recordinginstance_init(self, version, payload, account_sid, sid=None):
super(RecordingInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload.get('<STR_LIT>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT:status>': payload['<STR_LIT:status>'],<EOL>'<STR_LIT>': deserialize.integer(payload.get('<STR_LIT>', <NUM_LIT:1>)), <EOL>'<STR_LIT:source>': payload['<STR_LIT:source>'],<EOL>'<STR_LIT>': deserialize.integer(payload['<STR_LIT>']),<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': sid or self._properties['<STR_LIT>'], }<EOL>
Initialize the RecordingInstance :returns: twilio.rest.api.v2010.account.call.recording.RecordingInstance :rtype: twilio.rest.api.v2010.account.call.recording.RecordingInstance
f7730:m6
def patched_transcriptioninstance_init(self, version, payload, account_sid, sid=None):
super(TranscriptionInstance, self).__init__(version)<EOL>self._properties = {<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': deserialize.rfc2822_datetime(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': deserialize.decimal(payload['<STR_LIT>']),<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT:status>': payload['<STR_LIT:status>'],<EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>'<STR_LIT:type>': payload.get('<STR_LIT:type>', '<STR_LIT>'), <EOL>'<STR_LIT>': payload['<STR_LIT>'],<EOL>}<EOL>self._context = None<EOL>self._solution = {'<STR_LIT>': account_sid, '<STR_LIT>': sid or self._properties['<STR_LIT>'], }<EOL>
Initialize the TranscriptionInstance :returns: twilio.rest.api.v2010.account.transcription.TranscriptionInstance :rtype: twilio.rest.api.v2010.account.transcription.TranscriptionInstance
f7730:m7
def patched_fax_init(self, twilio):
super(TwilioFax, self).__init__(twilio)<EOL>self.base_url = '<STR_LIT>'<EOL>self.account_sid = twilio.account_sid<EOL>self._v1 = None<EOL>
Initialize the Fax Domain :returns: Domain for Fax :rtype: twilio.rest.fax.Fax
f7730:m8
def patched_fax_v1_init(self, domain):
print(domain.__class__.__name__)<EOL>super(TwilioV1, self).__init__(domain)<EOL>self.version = "<STR_LIT>" + domain.account_sid<EOL>self._faxes = None<EOL>
Initialize the V1 version of Fax :returns: V1 version of Fax :rtype: twilio.rest.fax.v1.V1.V1
f7730:m9
def reject(self, **kwargs):
return self.nest(Reject(**kwargs))<EOL>
Create a <Reject> element :param kwargs: additional attributes :returns: <Reject> element
f7733:c0:m0
def _generate_sympify_namespace(<EOL>independent_variables, dependent_variables, helper_functions<EOL>):
<EOL>independent_variable = independent_variables[<NUM_LIT:0>]<EOL>symbolic_independent_variable = Symbol(independent_variable)<EOL>def partial_derivative(symbolic_independent_variable, i, expr):<EOL><INDENT>return Derivative(expr, symbolic_independent_variable, i)<EOL><DEDENT>namespace = {independent_variable: symbolic_independent_variable}<EOL>namespace.update(<EOL>{"<STR_LIT>"<EOL>% (independent_variable * i): partial(<EOL>partial_derivative, symbolic_independent_variable, i<EOL>)<EOL>for i in range(<NUM_LIT:1>, <NUM_LIT:10>)}<EOL>)<EOL>namespace.update(<EOL>{"<STR_LIT>"<EOL>% (independent_variable * order, var): Derivative(<EOL>Function(var)(independent_variable), independent_variable, order<EOL>)<EOL>for order, var in product(<EOL>range(<NUM_LIT:1>, <NUM_LIT:10>), dependent_variables + helper_functions<EOL>)}<EOL>)<EOL>logging.debug("<STR_LIT>" % namespace)<EOL>return namespace<EOL>
Generate the link between the symbols of the derivatives and the sympy Derivative operation. Parameters ---------- independent_variables : iterable of str names of the independent variables (e.g. "x"); only the first one is used dependent_variables : iterable of str names of the dependent variables helper_functions : iterable of str names of the helper functions Returns ------- dict dictionary containing the symbols to parse as keys and the sympy expressions to evaluate instead as values.
f7741:m0
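A clean, runnable sketch of the namespace construction, with sympy spelled out; the shorthand templates ("dx", "dxx", "dxU", ...) are an assumption, since the actual string literals are elided in the body above:

from functools import partial
from itertools import product

from sympy import Derivative, Function, Symbol

def sympify_namespace(independent_variable, dependent_variables, helper_functions):
    x = Symbol(independent_variable)
    namespace = {independent_variable: x}
    # "dx", "dxx", ...: operators taking an expression to its i-th derivative.
    namespace.update({
        "d%s" % (independent_variable * order):
            partial(lambda i, expr: Derivative(expr, x, i), order)
        for order in range(1, 10)})
    # "dxU", "dxxU", ...: derivatives of the dependent / helper functions.
    namespace.update({
        "d%s%s" % (independent_variable * order, var):
            Derivative(Function(var)(x), x, order)
        for order, var in product(range(1, 10),
                                  list(dependent_variables) + list(helper_functions))})
    return namespace

ns = sympify_namespace("x", ["U"], [])
assert ns["dxxU"] == Derivative(Function("U")(Symbol("x")), Symbol("x"), 2)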
def save(self, filename):
with open(filename, "<STR_LIT:wb>") as f:<EOL><INDENT>dump(self, f)<EOL><DEDENT>
Save the model as a binary pickle file. Parameters ---------- filename : str name of the file where the model is saved. Returns ------- None
f7741:c0:m5
@staticmethod<EOL><INDENT>def load(filename):<DEDENT>
with open(filename, "<STR_LIT:rb>") as f:<EOL><INDENT>return load(f)<EOL><DEDENT>
Load a pre-compiled triflow model. Theano's internals allow caching of the compiled model, so loading will only be slow the first time the model is loaded on the system. Parameters ---------- filename : str path of the pre-compiled model Returns ------- triflow.core.Model triflow pre-compiled model
f7741:c0:m7
def __call__(self, t, fields, dt, pars,<EOL>hook=null_hook):
<EOL>if self._time_control:<EOL><INDENT>return self._variable_step(t, fields, dt, pars,<EOL>hook=hook)<EOL><DEDENT>t, fields, _ = self._fixed_step(t, fields, dt, pars,<EOL>hook=hook)<EOL>fields, pars = hook(t, fields, pars)<EOL>return t, fields<EOL>
Perform a step of the solver: take a time and a system state as a triflow Fields container and return the next time step with an updated container. Parameters ---------- t : float actual time step fields : triflow.Fields actual system state in a triflow Fields dt : float temporal step-size pars : dict physical parameters of the model hook : callable, optional any callable taking the actual time, fields and parameters and returning modified fields and parameters. Will be called every internal time step and can be used to include time dependent or conditional parameters, boundary conditions... Returns ------- tuple : t, fields updated time and fields container Raises ------ NotImplementedError raised if a time stepping is requested but the scheme does not provide the b predictor coefficients. ValueError raised if time_stepping is True and tol is not provided.
f7742:c0:m2
def __call__(self, t, fields, dt, pars,<EOL>hook=null_hook):
<EOL>solv = self._solv<EOL>fields, pars = hook(t, fields, pars)<EOL>solv.set_initial_value(fields.uflat, t)<EOL>solv.set_f_params(fields, pars, hook)<EOL>solv.set_jac_params(fields, pars, hook)<EOL>U = solv.integrate(t + dt)<EOL>fields.fill(U)<EOL>fields, _ = hook(t + dt, fields, pars)<EOL>return t + dt, fields<EOL>
Perform a step of the solver: take a time and a system state as a triflow Fields container and return the next time step with an updated container. Parameters ---------- t : float actual time step fields : triflow.Fields actual system state in a triflow Fields dt : float temporal step-size pars : dict physical parameters of the model hook : callable, optional any callable taking the actual time, fields and parameters and returning modified fields and parameters. Will be called every internal time step and can be used to include time dependent or conditional parameters, boundary conditions... Returns ------- tuple : t, fields updated time and fields container Raises ------ RuntimeError
f7742:c5:m1
def __call__(self, t, fields, dt, pars,<EOL>hook=null_hook):
<EOL>fields = fields.copy()<EOL>fields, pars = hook(t, fields, pars)<EOL>F = self._model.F(fields, pars)<EOL>J = self._model.J(fields, pars)<EOL>U = fields.uflat<EOL>B = dt * (F - self._theta * J @ U) + U<EOL>J = (sps.identity(U.size,<EOL>format='<STR_LIT>') -<EOL>self._theta * dt * J)<EOL>fields.fill(self._solver(J, B))<EOL>fields, _ = hook(t + dt, fields, pars)<EOL>return t + dt, fields<EOL>
Perform a step of the solver: take a time and a system state as a triflow Fields container and return the next time step with an updated container. Parameters ---------- t : float actual time step fields : triflow.Fields actual system state in a triflow Fields container dt : float temporal step-size pars : dict physical parameters of the model hook : callable, optional any callable taking the actual time, fields and parameters and returning modified fields and parameters. Will be called every internal time step and can be used to include time dependent or conditional parameters, boundary conditions... Returns ------- tuple : t, fields updated time and fields container
f7742:c6:m1
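The update above is the classic theta method: solve (I - theta*dt*J) U_next = U + dt*(F - theta*J U). A self-contained numerical sketch (the sparse format and the linear solver are assumptions; the original delegates to self._solver):

import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import splu

def theta_step(U, F, J, dt, theta=0.5):
    # One linearised theta-method step for dU/dt = F(U), with J = dF/dU.
    B = dt * (F - theta * J @ U) + U
    A = sps.identity(U.size, format="csc") - theta * dt * J
    return splu(A).solve(B)

# Tiny linear test problem: dU/dt = -U, so F = -U and J = -I.
U = np.array([1.0, 2.0])
J = -sps.identity(2, format="csc")
U_next = theta_step(U, F=-U, J=J, dt=0.1, theta=0.5)
# theta = 0.5 is Crank-Nicolson: U_next = U * (1 - dt/2) / (1 + dt/2).
assert np.allclose(U_next, U * (1 - 0.05) / (1 + 0.05))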
def _compute_one_step(self, t, fields, pars):
fields, pars = self._hook(t, fields, pars)<EOL>self.dt = (self.tmax - t<EOL>if self.tmax and (t + self.dt >= self.tmax)<EOL>else self.dt)<EOL>before_compute = time.process_time()<EOL>t, fields = self._scheme(t, fields, self.dt,<EOL>pars, hook=self._hook)<EOL>after_compute = time.process_time()<EOL>self._last_running = after_compute - before_compute<EOL>self._total_running += self._last_running<EOL>self._last_timestamp = self._actual_timestamp<EOL>self._actual_timestamp = pendulum.now()<EOL>return t, fields, pars<EOL>
Compute one step of the simulation, then update the timers.
f7743:c1:m1
def compute(self):
fields = self.fields<EOL>t = self.t<EOL>pars = self.parameters<EOL>self._started_timestamp = pendulum.now()<EOL>self.stream.emit(self)<EOL>try:<EOL><INDENT>while True:<EOL><INDENT>t, fields, pars = self._compute_one_step(t, fields, pars)<EOL>self.i += <NUM_LIT:1><EOL>self.t = t<EOL>self.fields = fields<EOL>self.parameters = pars<EOL>for pprocess in self.post_processes:<EOL><INDENT>pprocess.function(self)<EOL><DEDENT>self.stream.emit(self)<EOL>yield self.t, self.fields<EOL>if self.tmax and (isclose(self.t, self.tmax)):<EOL><INDENT>self._end_simulation()<EOL>return<EOL><DEDENT><DEDENT><DEDENT>except RuntimeError:<EOL><INDENT>self.status = '<STR_LIT>'<EOL>raise<EOL><DEDENT>
Generator which yields the current state of the system every dt. Yields ------ tuple : t, fields Current time and updated fields container.
f7743:c1:m2
def run(self, progress=True, verbose=False):
total_iter = int((self.tmax // self.user_dt) if self.tmax else None)<EOL>log = logging.info if verbose else logging.debug<EOL>if progress:<EOL><INDENT>with tqdm(initial=(self.i if self.i < total_iter else total_iter),<EOL>total=total_iter) as pbar:<EOL><INDENT>for t, fields in self:<EOL><INDENT>pbar.update(<NUM_LIT:1>)<EOL>log("<STR_LIT>" % (self.id, t))<EOL><DEDENT>try:<EOL><INDENT>return t, fields<EOL><DEDENT>except UnboundLocalError:<EOL><INDENT>warnings.warn("<STR_LIT>")<EOL><DEDENT><DEDENT><DEDENT>for t, fields in self:<EOL><INDENT>log("<STR_LIT>" % (self.id, t))<EOL><DEDENT>try:<EOL><INDENT>return t, fields<EOL><DEDENT>except UnboundLocalError:<EOL><INDENT>warnings.warn("<STR_LIT>")<EOL><DEDENT>
Compute all steps of the simulation. Be careful: if tmax is not set, this function will result in an infinite loop. Returns ------- (t, fields): last time and resulting fields.
f7743:c1:m4
def attach_container(self, path=None, save="<STR_LIT:all>",<EOL>mode="<STR_LIT:w>", nbuffer=<NUM_LIT:50>, force=False):
self._container = TriflowContainer("<STR_LIT>" % (path, self.id)<EOL>if path else None,<EOL>save=save,<EOL>mode=mode, metadata=self.parameters,<EOL>force=force, nbuffer=nbuffer)<EOL>self._container.connect(self.stream)<EOL>return self._container<EOL>
Add a Container to the simulation, which gives the simulation some persistence. Parameters ---------- path : str or None (default: None) path for the container. If None (the default), the data live only in memory (and are available with `simulation.container`) mode : str, optional "a" or "w" (default "w") save : str, optional "all" will save every time-step, "last" will only keep the last time step nbuffer : int, optional wait until nbuffer data are in the Queue before saving to disk. timeout : int, optional wait until timeout since the last flush before saving to disk. force : bool, optional (default False) if True, remove the target folder if not empty. If False, raise an error.
f7743:c1:m6
def add_post_process(self, name, post_process, description="<STR_LIT>"):
self._pprocesses.append(PostProcess(name=name,<EOL>function=post_process,<EOL>description=description))<EOL>self._pprocesses[-<NUM_LIT:1>].function(self)<EOL>
Add a post-process. Parameters ---------- name : str name of the post-processing step post_process : callback (function, a class with a __call__ method, or a streamz.Stream). This callback has to accept the simulation state as parameter and return the modified simulation state. If a streamz.Stream is provided, it will be plugged in to the previous stream (and ultimately to the initial stream). All these streams accept and return the simulation state. description : str, optional, Default is "". gives extra information about the post-processing
f7743:c1:m11
def remove_post_process(self, name):
self._pprocesses = [post_process<EOL>for post_process in self._pprocesses<EOL>if post_process.name != name]<EOL>
remove a post-process Parameters ---------- name : str name of the post-process to remove.
f7743:c1:m12
@staticmethod<EOL><INDENT>def factory(coords,<EOL>dependent_variables,<EOL>helper_functions):<DEDENT>
Field = type('<STR_LIT>', BaseFields.__bases__,<EOL>dict(BaseFields.__dict__))<EOL>Field._coords = coords<EOL>Field.dependent_variables_info = dependent_variables<EOL>Field.helper_functions_info = helper_functions<EOL>Field._var_info = [*list(Field.dependent_variables_info),<EOL>*list(Field.helper_functions_info)]<EOL>Field.dependent_variables = [dep[<NUM_LIT:0>]<EOL>for dep<EOL>in Field.dependent_variables_info]<EOL>Field.helper_functions = [dep[<NUM_LIT:0>]<EOL>for dep<EOL>in Field.helper_functions_info]<EOL>Field._keys, Field._coords_info = zip(*Field._var_info)<EOL>return Field<EOL>
Fields factory generating a specialized container built around a triflow Model and xarray. Parameters ---------- coords: iterable of str: coordinate names. The first coordinate has to be shared by all variables dependent_variables : iterable of tuple (name, coords) coordinates and name of the dependent variables helper_functions : iterable of tuple (name, coords) coordinates and name of the helper functions Returns ------- triflow.BaseFields Specialized container which exposes the data as a structured numpy array
f7744:c0:m0
@staticmethod<EOL><INDENT>def factory1D(dependent_variables,<EOL>helper_functions):<DEDENT>
return BaseFields.factory(("<STR_LIT:x>", ),<EOL>[(name, ("<STR_LIT:x>", ))<EOL>for name<EOL>in dependent_variables],<EOL>[(name, ("<STR_LIT:x>", ))<EOL>for name<EOL>in helper_functions],)<EOL>
Fields factory generating a specialized container built around a triflow Model and xarray. Wrapper for 1D data. Parameters ---------- dependent_variables : iterable of str names of the dependent variables helper_functions : iterable of str names of the helper functions Returns ------- triflow.BaseFields Specialized container which exposes the data as a structured numpy array
f7744:c0:m1
@property<EOL><INDENT>def size(self):<DEDENT>
<EOL>return self["<STR_LIT:x>"].size<EOL>
int: number of points along the first coordinate "x" of the container.
f7744:c0:m6
@property<EOL><INDENT>def uarray(self):<DEDENT>
<EOL>return self[self.dependent_variables]<EOL>
numpy.ndarray.view: view of the dependent variables of the main numpy array
f7744:c0:m7
@property<EOL><INDENT>def uflat(self):<DEDENT>
<EOL>aligned_arrays = [self[key].values[[(slice(None)<EOL>if c in coords<EOL>else None)<EOL>for c in self._coords]].T<EOL>for key, coords in self.dependent_variables_info]<EOL>return np.vstack(aligned_arrays).flatten("<STR_LIT:F>")<EOL>
return a flattened **copy** of the main numpy array with only the dependent variables. Be careful, modifications of these data will not be reflected on the main array!
f7744:c0:m8
def theano_compiler(model):
from theano import tensor as T<EOL>from theano.ifelse import ifelse<EOL>import theano.sparse as ths<EOL>from theano import function<EOL>def th_Min(a, b):<EOL><INDENT>if isinstance(a, T.TensorVariable) or isinstance(b, T.TensorVariable):<EOL><INDENT>return T.where(a < b, a, b)<EOL><DEDENT>return min(a, b)<EOL><DEDENT>def th_Max(a, b):<EOL><INDENT>if isinstance(a, T.TensorVariable) or isinstance(b, T.TensorVariable):<EOL><INDENT>return T.where(a < b, b, a)<EOL><DEDENT>return max(a, b)<EOL><DEDENT>def th_Heaviside(a):<EOL><INDENT>if isinstance(a, T.TensorVariable):<EOL><INDENT>return T.where(a < <NUM_LIT:0>, <NUM_LIT:1>, <NUM_LIT:1>)<EOL><DEDENT>return <NUM_LIT:0> if a < <NUM_LIT:0> else <NUM_LIT:1><EOL><DEDENT>mapargs = {arg: T.vector(arg)<EOL>for arg, sarg<EOL>in zip(model._args, model._symbolic_args)}<EOL>to_feed = mapargs.copy()<EOL>x_th = mapargs['<STR_LIT:x>']<EOL>N = x_th.size<EOL>L = x_th[-<NUM_LIT:1>] - x_th[<NUM_LIT:0>]<EOL>dx = L / (N - <NUM_LIT:1>)<EOL>to_feed['<STR_LIT>'] = dx<EOL>periodic = T.scalar("<STR_LIT>", dtype="<STR_LIT>")<EOL>middle_point = int((model._window_range - <NUM_LIT:1>) / <NUM_LIT:2>)<EOL>th_args = [mapargs[key]<EOL>for key<EOL>in [*model._indep_vars,<EOL>*model._dep_vars,<EOL>*model._help_funcs,<EOL>*model._pars]] + [periodic]<EOL>map_extended = {}<EOL>for (varname, discretisation_tree) inmodel._symb_vars_with_spatial_diff_order.items():<EOL><INDENT>pad_left, pad_right = model._bounds<EOL>th_arg = mapargs[varname]<EOL>per_extended_var = T.concatenate([th_arg[pad_left:],<EOL>th_arg,<EOL>th_arg[:pad_right]])<EOL>edge_extended_var = T.concatenate([[th_arg[<NUM_LIT:0>]] * middle_point,<EOL>th_arg,<EOL>[th_arg[-<NUM_LIT:1>]] * middle_point])<EOL>extended_var = ifelse(periodic,<EOL>per_extended_var,<EOL>edge_extended_var)<EOL>map_extended[varname] = extended_var<EOL>for order in range(pad_left, pad_right + <NUM_LIT:1>):<EOL><INDENT>if order != <NUM_LIT:0>:<EOL><INDENT>var = ("<STR_LIT>").format(varname,<EOL>'<STR_LIT:m>' if order < <NUM_LIT:0> else '<STR_LIT:p>',<EOL>np.abs(order))<EOL><DEDENT>else:<EOL><INDENT>var = varname<EOL><DEDENT>new_var = extended_var[order - pad_left:<EOL>extended_var.size +<EOL>order - pad_right]<EOL>to_feed[var] = new_var<EOL><DEDENT><DEDENT>F = lambdify((model._symbolic_args),<EOL>expr=model.F_array.tolist(),<EOL>modules=[T, {"<STR_LIT>": th_Max,<EOL>"<STR_LIT>": th_Min,<EOL>"<STR_LIT>": th_Heaviside}])(<EOL>*[to_feed[key]<EOL>for key<EOL>in model._args]<EOL>)<EOL>F = T.concatenate(F, axis=<NUM_LIT:0>).reshape((model._nvar, N)).T<EOL>F = T.stack(F).flatten()<EOL>J = lambdify((model._symbolic_args),<EOL>expr=model.J_array.tolist(),<EOL>modules=[T, {"<STR_LIT>": th_Max,<EOL>"<STR_LIT>": th_Min,<EOL>"<STR_LIT>": th_Heaviside}])(<EOL>*[to_feed[key]<EOL>for key<EOL>in model._args]<EOL>)<EOL>J = [j if j != <NUM_LIT:0> else T.constant(<NUM_LIT:0.>)<EOL>for j in J]<EOL>J = [j if not isinstance(j, (int, float)) else T.constant(j)<EOL>for j in J]<EOL>J = T.stack([T.repeat(j, N) if j.ndim == <NUM_LIT:0> else j<EOL>for j in J])<EOL>J = J[model._sparse_indices[<NUM_LIT:0>]].T.squeeze()<EOL>i = T.arange(N).dimshuffle([<NUM_LIT:0>, '<STR_LIT:x>'])<EOL>idx = T.arange(N * model._nvar).reshape((N, model._nvar)).T<EOL>edge_extended_idx = T.concatenate([T.repeat(idx[:, :<NUM_LIT:1>],<EOL>middle_point,<EOL>axis=<NUM_LIT:1>),<EOL>idx,<EOL>T.repeat(idx[:, -<NUM_LIT:1>:],<EOL>middle_point,<EOL>axis=<NUM_LIT:1>)],<EOL>axis=<NUM_LIT:1>).T.flatten()<EOL>per_extended_idx = T.concatenate([idx[:, -middle_point:],<EOL>idx,<EOL>idx[:, 
:middle_point]],<EOL>axis=<NUM_LIT:1>).T.flatten()<EOL>extended_idx = ifelse(periodic,<EOL>per_extended_idx,<EOL>edge_extended_idx)<EOL>rows = T.tile(T.arange(model._nvar),<EOL>model._window_range * model._nvar) + i * model._nvar<EOL>cols = T.repeat(T.arange(model._window_range * model._nvar),<EOL>model._nvar) + i * model._nvar<EOL>rows = rows[:, model._sparse_indices].reshape(J.shape).flatten()<EOL>cols = extended_idx[cols][:, model._sparse_indices].reshape(J.shape).flatten()<EOL>permutation = T.argsort(cols)<EOL>J = J.flatten()[permutation]<EOL>rows = rows[permutation]<EOL>cols = cols[permutation]<EOL>count = T.zeros((N * model._nvar + <NUM_LIT:1>,), dtype=int)<EOL>uq, cnt = T.extra_ops.Unique(False, False, True)(cols)<EOL>count = T.set_subtensor(count[uq + <NUM_LIT:1>], cnt)<EOL>indptr = T.cumsum(count)<EOL>shape = T.stack([N * model._nvar, N * model._nvar])<EOL>sparse_J = ths.CSC(J, rows, indptr, shape)<EOL>F_theano_function = function(inputs=th_args,<EOL>outputs=F,<EOL>on_unused_input='<STR_LIT:ignore>',<EOL>allow_input_downcast=True)<EOL>J_theano_function = function(inputs=th_args,<EOL>outputs=sparse_J,<EOL>on_unused_input='<STR_LIT:ignore>',<EOL>allow_input_downcast=True)<EOL>return F_theano_function, J_theano_function<EOL>
Take a triflow model and return optimized theano routines. Parameters ---------- model: triflow.Model Model to compile Returns ------- (theano function, theano function): Optimized routines that compute the evolution equations and their jacobian matrix.
f7745:m0
def numpy_compiler(model):
def np_Min(args):<EOL><INDENT>a, b = args<EOL>return np.where(a < b, a, b)<EOL><DEDENT>def np_Max(args):<EOL><INDENT>a, b = args<EOL>return np.where(a < b, b, a)<EOL><DEDENT>def np_Heaviside(a):<EOL><INDENT>return np.where(a < <NUM_LIT:0>, <NUM_LIT:0>, <NUM_LIT:1>)<EOL><DEDENT>f_func = lambdify((model._symbolic_args),<EOL>expr=model.F_array.tolist(),<EOL>modules=[{"<STR_LIT>": np_Max,<EOL>"<STR_LIT>": np_Min,<EOL>"<STR_LIT>": np_Heaviside},<EOL>"<STR_LIT>"])<EOL>j_func = lambdify((model._symbolic_args),<EOL>expr=model._J_sparse_array.tolist(),<EOL>modules=[{"<STR_LIT>": np_Max,<EOL>"<STR_LIT>": np_Min,<EOL>"<STR_LIT>": np_Heaviside},<EOL>"<STR_LIT>"])<EOL>compute_F = partial(compute_F_numpy, model, f_func)<EOL>compute_J = partial(compute_J_numpy, model, j_func)<EOL>return compute_F, compute_J<EOL>
Take a triflow model and return optimized numpy routines. Parameters ---------- model: triflow.Model Model to compile Returns ------- (numpy function, numpy function): Optimized routines that compute the evolution equations and their jacobian matrix.
f7745:m1
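Both compilers rely on sympy.lambdify to turn the model's symbolic arrays into vectorized callables (injecting custom Max/Min/Heaviside implementations through the modules argument); a minimal self-contained sketch of that mechanism, with an arbitrary expression in place of the model's equations:

import numpy as np
import sympy as sp

x, k = sp.symbols("x k")
expr = sp.sin(k * x) + x**2

# lambdify prints the expression to numpy code and returns a plain function
f = sp.lambdify((x, k), expr, modules="numpy")
print(f(np.linspace(0.0, 1.0, 5), 2.0))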
def _split_docker_link(alias_name):
sanitized_name = alias_name.strip().upper()<EOL>split_list = re.split(r'<STR_LIT>', core.str('<STR_LIT>'.format(sanitized_name)))<EOL>return list(filter(None, split_list))<EOL>
Split the docker link value behind the given alias into a list of 3 items (protocol, host, port). Assumes IPv4 Docker links. Example: _split_docker_link('DB') -> ['tcp', '172.17.0.82', '8080']
f7764:m0
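The regex literal is elided above; a self-contained sketch of the same kind of split, with an assumed pattern and a hard-coded value instead of reading the alias's environment variable:

import re

def split_docker_link_value(raw_value):
    # assumed pattern: split on runs of ':' and '/' so 'tcp://172.17.0.82:5432'
    # yields ['tcp', '172.17.0.82', '5432']
    parts = re.split(r"[:/]+", raw_value.strip())
    return list(filter(None, parts))

print(split_docker_link_value("tcp://172.17.0.82:5432"))  # ['tcp', '172.17.0.82', '5432']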
def read(alias_name, allow_none=False):
warnings.warn('<STR_LIT>', DeprecationWarning, stacklevel=<NUM_LIT:2>)<EOL>return core.read('<STR_LIT>'.format(alias_name), default=None, allow_none=allow_none)<EOL>
Get the raw docker link value, i.e. the raw environment variable behind the docker link alias. Args: alias_name: The environment variable name allow_none: If the return value can be `None` (i.e. optional)
f7764:m1
def isset(alias_name):
warnings.warn('<STR_LIT>', DeprecationWarning, stacklevel=<NUM_LIT:2>)<EOL>raw_value = read(alias_name, allow_none=True)<EOL>if raw_value:<EOL><INDENT>if re.compile(r'<STR_LIT>').match(raw_value):<EOL><INDENT>return True<EOL><DEDENT>else:<EOL><INDENT>warnings.warn('<STR_LIT>'.format(alias_name, raw_value), stacklevel=<NUM_LIT:2>)<EOL>return False<EOL><DEDENT><DEDENT>return False<EOL>
Return a boolean indicating whether the docker link is set and looks like a valid docker link value. Args: alias_name: The link alias name
f7764:m3
def protocol(alias_name, default=None, allow_none=False):
warnings.warn('<STR_LIT>', DeprecationWarning, stacklevel=<NUM_LIT:2>)<EOL>try:<EOL><INDENT>return _split_docker_link(alias_name)[<NUM_LIT:0>]<EOL><DEDENT>except KeyError as err:<EOL><INDENT>if default or allow_none:<EOL><INDENT>return default<EOL><DEDENT>else:<EOL><INDENT>raise err<EOL><DEDENT><DEDENT>
Get the protocol from the docker link alias or return the default. Args: alias_name: The docker link alias default: The default value if the link isn't available allow_none: If the return value can be `None` (i.e. optional) Examples: Assuming a Docker link was created with ``docker --link postgres:db`` and the resulting environment variable is ``DB_PORT=tcp://172.17.0.82:5432``. >>> envitro.docker.protocol('DB') tcp
f7764:m4
def host(alias_name, default=None, allow_none=False):
warnings.warn('<STR_LIT>', DeprecationWarning, stacklevel=<NUM_LIT:2>)<EOL>try:<EOL><INDENT>return _split_docker_link(alias_name)[<NUM_LIT:1>]<EOL><DEDENT>except KeyError as err:<EOL><INDENT>if default or allow_none:<EOL><INDENT>return default<EOL><DEDENT>else:<EOL><INDENT>raise err<EOL><DEDENT><DEDENT>
Get the host from the docker link alias or return the default. Args: alias_name: The docker link alias default: The default value if the link isn't available allow_none: If the return value can be `None` (i.e. optional) Examples: Assuming a Docker link was created with ``docker --link postgres:db`` and the resulting environment variable is ``DB_PORT=tcp://172.17.0.82:5432``. >>> envitro.docker.host('DB') 172.17.0.82
f7764:m5
def port(alias_name, default=None, allow_none=False):
warnings.warn('<STR_LIT>', DeprecationWarning, stacklevel=<NUM_LIT:2>)<EOL>try:<EOL><INDENT>return int(_split_docker_link(alias_name)[<NUM_LIT:2>])<EOL><DEDENT>except KeyError as err:<EOL><INDENT>if default or allow_none:<EOL><INDENT>return default<EOL><DEDENT>else:<EOL><INDENT>raise err<EOL><DEDENT><DEDENT>
Get the port from the docker link alias or return the default. Args: alias_name: The docker link alias default: The default value if the link isn't available allow_none: If the return value can be `None` (i.e. optional) Examples: Assuming a Docker link was created with ``docker --link postgres:db`` and the resulting environment variable is ``DB_PORT=tcp://172.17.0.82:5432``. >>> envitro.docker.port('DB') 5432
f7764:m6
def _strtobool(val):
val = val.lower()<EOL>if val in ('<STR_LIT:y>', '<STR_LIT:yes>', '<STR_LIT:t>', '<STR_LIT:true>', '<STR_LIT>', '<STR_LIT:1>'):<EOL><INDENT>return <NUM_LIT:1><EOL><DEDENT>elif val in ('<STR_LIT:n>', '<STR_LIT>', '<STR_LIT:f>', '<STR_LIT:false>', '<STR_LIT>', '<STR_LIT:0>', '<STR_LIT>'):<EOL><INDENT>return <NUM_LIT:0><EOL><DEDENT>else:<EOL><INDENT>raise ValueError('<STR_LIT>'.format(val))<EOL><DEDENT>
Convert a string representation of truth to true (1) or false (0). True values are 'y', 'yes', 't', 'true', 'on', and '1'; false values are 'n', 'no', 'f', 'false', 'off', '0' and ''. Raises ValueError if 'val' is anything else.
f7765:m0
def _str_to_list(value, separator):
value_list = [item.strip() for item in value.split(separator)]<EOL>value_list_sanitized = builtins.list(filter(None, value_list))<EOL>if len(value_list_sanitized) > <NUM_LIT:0>:<EOL><INDENT>return value_list_sanitized<EOL><DEDENT>else:<EOL><INDENT>raise ValueError('<STR_LIT>')<EOL><DEDENT>
Convert a string to a list with sanitization.
f7765:m1
def isset(name):
return True if environ.get(name) else False<EOL>
Return a boolean indicating whether the environment variable is set. Args: name: The environment variable name
f7765:m2
def write(name, value):
if value is not None:<EOL><INDENT>environ[name] = builtins.str(value)<EOL><DEDENT>elif environ.get(name):<EOL><INDENT>del environ[name]<EOL><DEDENT>
Write a raw env value. A ``None`` value clears the environment variable. Args: name: The environment variable name value: The value to write
f7765:m3
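A usage sketch of the write/clear behaviour, assuming the helpers above are exposed at the package level as envitro (as the docstrings in this module suggest):

import os
import envitro  # assumed import path

envitro.write("API_TOKEN", "abc123")
print(os.environ["API_TOKEN"])      # 'abc123'
envitro.write("API_TOKEN", None)    # a None value clears the variable
print("API_TOKEN" in os.environ)    # False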
def read(name, default=None, allow_none=False, fallback=None):
raw_value = environ.get(name)<EOL>if raw_value is None and fallback is not None:<EOL><INDENT>if not isinstance(fallback, builtins.list) and not isinstance(fallback, builtins.tuple):<EOL><INDENT>fallback = [fallback]<EOL><DEDENT>for fall in fallback:<EOL><INDENT>raw_value = environ.get(fall)<EOL>if raw_value is not None:<EOL><INDENT>break<EOL><DEDENT><DEDENT><DEDENT>if raw_value or raw_value == '<STR_LIT>':<EOL><INDENT>return raw_value<EOL><DEDENT>elif default is not None or allow_none:<EOL><INDENT>return default<EOL><DEDENT>else:<EOL><INDENT>raise KeyError('<STR_LIT>'.format(name))<EOL><DEDENT>
Read the raw env value. Read the raw environment variable or use the default. If the value is not found and no default is set throw an exception. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional) fallback: A list of fallback env variables to try and read if the primary environment variable is unavailable.
f7765:m5
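A usage sketch of the fallback behaviour, again assuming the package-level envitro import:

import os
import envitro  # assumed import path

os.environ.pop("PRIMARY_URL", None)
os.environ["LEGACY_URL"] = "https://example.com"

# PRIMARY_URL is unset, so read() falls through to LEGACY_URL
print(envitro.read("PRIMARY_URL", fallback=["LEGACY_URL"]))  # https://example.com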
def str(name, default=None, allow_none=False, fallback=None):
value = read(name, default, allow_none, fallback=fallback)<EOL>if value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>return builtins.str(value).strip()<EOL><DEDENT>
Get a string based environment value or the default. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional)
f7765:m7
def bool(name, default=None, allow_none=False, fallback=None):
value = read(name, default, allow_none, fallback=fallback)<EOL>if isinstance(value, builtins.bool):<EOL><INDENT>return value<EOL><DEDENT>elif isinstance(value, builtins.int):<EOL><INDENT>return True if value > <NUM_LIT:0> else False<EOL><DEDENT>elif value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>value_str = builtins.str(value).lower().strip()<EOL>return _strtobool(value_str)<EOL><DEDENT>
Get a boolean environment value or the default. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional)
f7765:m8
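A usage sketch of the boolean parsing, assuming the package-level envitro import; note that string values go through _strtobool and therefore come back as 1/0:

import os
import envitro  # assumed import path

os.environ["DEBUG"] = "true"
print(envitro.bool("DEBUG"))                   # 1 (truthy, via _strtobool)
print(envitro.bool("MISSING", default=False))  # False (the default is returned as-is)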
def int(name, default=None, allow_none=False, fallback=None):
value = read(name, default, allow_none, fallback=fallback)<EOL>if isinstance(value, builtins.str):<EOL><INDENT>value = value.strip()<EOL><DEDENT>if value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>return builtins.int(value)<EOL><DEDENT>
Get an integer environment value or the default. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional)
f7765:m9
def float(name, default=None, allow_none=False, fallback=None):
value = read(name, default, allow_none, fallback=fallback)<EOL>if isinstance(value, builtins.str):<EOL><INDENT>value = value.strip()<EOL><DEDENT>if value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>return builtins.float(value)<EOL><DEDENT>
Get a float environment value or the default. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional)
f7765:m10
def list(name, default=None, allow_none=False, fallback=None, separator='<STR_LIT:U+002C>'):
value = read(name, default, allow_none, fallback=fallback)<EOL>if isinstance(value, builtins.list):<EOL><INDENT>return value<EOL><DEDENT>elif isinstance(value, builtins.str):<EOL><INDENT>return _str_to_list(value, separator)<EOL><DEDENT>elif value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>return [builtins.str(value)]<EOL><DEDENT>
Get a list of strings or the default. The individual list elements are whitespace-stripped. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional) separator: The list item separator character or pattern
f7765:m11
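A usage sketch of the list parsing (comma is the default separator and items are whitespace-stripped), assuming the package-level envitro import:

import os
import envitro  # assumed import path

os.environ["ALLOWED_HOSTS"] = "localhost, example.com , api.example.com"
print(envitro.list("ALLOWED_HOSTS"))
# ['localhost', 'example.com', 'api.example.com']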
def tuple(name, default=None, allow_none=False, fallback=None, separator='<STR_LIT:U+002C>'):
try:<EOL><INDENT>value = read(name, default, allow_none, fallback=fallback)<EOL>if isinstance(value, builtins.tuple):<EOL><INDENT>return value<EOL><DEDENT>elif isinstance(value, builtins.str):<EOL><INDENT>return builtins.tuple(_str_to_list(value, separator))<EOL><DEDENT>elif value is None and allow_none:<EOL><INDENT>return None<EOL><DEDENT>else:<EOL><INDENT>return (builtins.str(value), )<EOL><DEDENT><DEDENT>except ValueError:<EOL><INDENT>raise ValueError('<STR_LIT>')<EOL><DEDENT>
Get a tuple of strings or the default. The individual list elements are whitespace-stripped. Args: name: The environment variable name default: The default value to use if no environment variable is found allow_none: If the return value can be `None` (i.e. optional) separator: The list item separator character or pattern
f7765:m12
def write(name, value):
def wrapped(func):<EOL><INDENT>@functools.wraps(func)<EOL>def _decorator(*args, **kwargs):<EOL><INDENT>existing_env = core.read(name, allow_none=True)<EOL>core.write(name, value)<EOL>func_val = func(*args, **kwargs)<EOL>core.write(name, existing_env)<EOL>return func_val<EOL><DEDENT>return _decorator<EOL><DEDENT>return wrapped<EOL>
Temporarily change or set the environment variable during the execution of a function. Args: name: The name of the environment variable value: A value to set for the environment variable Returns: The function return value.
f7766:m0
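A usage sketch of the decorator above, assuming it is importable as envitro.decorators; the variable is set only for the duration of the call and then restored:

import os
from envitro import decorators  # assumed import path

@decorators.write("APP_MODE", "test")
def current_mode():
    return os.environ.get("APP_MODE")

os.environ["APP_MODE"] = "production"
print(current_mode())              # 'test' inside the decorated call
print(os.environ.get("APP_MODE"))  # 'production' again afterwards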
def isset(name):
def wrapped(func):<EOL><INDENT>@functools.wraps(func)<EOL>def _decorator(*args, **kwargs):<EOL><INDENT>if core.isset(name):<EOL><INDENT>return func(*args, **kwargs)<EOL><DEDENT><DEDENT>return _decorator<EOL><DEDENT>return wrapped<EOL>
Only execute the function if the variable is set. Args: name: The name of the environment variable Returns: The function return value or `None` if the function was skipped.
f7766:m2
def bool(name, execute_bool=True, default=None):
def wrapped(func):<EOL><INDENT>@functools.wraps(func)<EOL>def _decorator(*args, **kwargs):<EOL><INDENT>if core.isset(name) and core.bool(name) == execute_bool:<EOL><INDENT>return func(*args, **kwargs)<EOL><DEDENT>elif default is not None and default == execute_bool:<EOL><INDENT>return func(*args, **kwargs)<EOL><DEDENT><DEDENT>return _decorator<EOL><DEDENT>return wrapped<EOL>
Only execute the function if the boolean variable is set. Args: name: The name of the environment variable execute_bool: The boolean value to execute the function on default: The default value if the environment variable is not set (respects `execute_bool`) Returns: The function return value or `None` if the function was skipped.
f7766:m3
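A usage sketch of the conditional-execution decorator, again assuming the envitro.decorators import path:

import os
from envitro import decorators  # assumed import path

@decorators.bool("ENABLE_CACHE", execute_bool=True)
def warm_cache():
    return "cache warmed"

os.environ["ENABLE_CACHE"] = "false"
print(warm_cache())   # None, the call is skipped
os.environ["ENABLE_CACHE"] = "yes"
print(warm_cache())   # 'cache warmed'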
def reload(*command, ignore_patterns=[]):
path = "<STR_LIT:.>"<EOL>sig = signal.SIGTERM<EOL>delay = <NUM_LIT><EOL>ignorefile = "<STR_LIT>"<EOL>ignore_patterns = ignore_patterns or load_ignore_patterns(ignorefile)<EOL>event_handler = ReloadEventHandler(ignore_patterns)<EOL>reloader = Reloader(command, sig)<EOL>observer = Observer()<EOL>observer.schedule(event_handler, path, recursive=True)<EOL>observer.start()<EOL>reloader.start_command()<EOL>try:<EOL><INDENT>while True:<EOL><INDENT>time.sleep(delay)<EOL>sys.stdout.write(reloader.read())<EOL>sys.stdout.flush()<EOL>if event_handler.modified:<EOL><INDENT>reloader.restart_command()<EOL><DEDENT><DEDENT><DEDENT>except KeyboardInterrupt:<EOL><INDENT>observer.stop()<EOL><DEDENT>observer.join()<EOL>reloader.stop_command()<EOL>sys.stdout.write(reloader.read())<EOL>sys.stdout.flush()<EOL>
Run the given command and restart it whenever a watched file changes, until interrupted
f7769:m1
def reload_me(*args, ignore_patterns=[]):
command = [sys.executable, sys.argv[<NUM_LIT:0>]]<EOL>command.extend(args)<EOL>reload(*command, ignore_patterns=ignore_patterns)<EOL>
Reload currently running command with given args
f7769:m2
def addTextOut(self, text):
self._currentColor = self._black<EOL>self.addText(text)<EOL>
add black text
f7772:c0:m3
def addTextErr(self, text):
self._currentColor = self._red<EOL>self.addText(text)<EOL>
add red text
f7772:c0:m4
def addText(self, text):
<EOL>self.moveCursor(QtGui.QTextCursor.End)<EOL>self.setTextColor(self._currentColor)<EOL>self.textCursor().insertText(text)<EOL>
append text in the chosen color
f7772:c0:m5