Columns: signature (string, 8–3.44k chars) · body (string, 0–1.41M chars) · docstring (string, 1–122k chars) · id (string, 5–17 chars)
def _get_header(self):
    return self._header
Controls printing of table header with field names Arguments: header - print a header showing field names (True or False)
f14504:c0:m42
def _get_header_style(self):
    return self._header_style
Controls stylisation applied to field names in header Arguments: header_style - stylisation to apply to field names in header ("cap", "title", "upper", "lower" or None)
f14504:c0:m44
def _get_border(self):
    return self._border
Controls printing of border around table Arguments: border - print a border around the table (True or False)
f14504:c0:m46
def _get_hrules(self):
    return self._hrules
Controls printing of horizontal rules after rows Arguments: hrules - horizontal rules style. Allowed values: FRAME, ALL, HEADER, NONE
f14504:c0:m48
def _get_vrules(self):
    return self._vrules
Controls printing of vertical rules between columns Arguments: vrules - vertical rules style. Allowed values: FRAME, ALL, NONE
f14504:c0:m50
def _get_int_format(self):
    return self._int_format
Controls formatting of integer data Arguments: int_format - integer format string
f14504:c0:m52
def _get_float_format(self):
    return self._float_format
Controls formatting of floating point data Arguments: float_format - floating point format string
f14504:c0:m54
def _get_padding_width(self):
    return self._padding_width
The number of empty spaces between a column's edge and its content Arguments: padding_width - number of spaces, must be a positive integer
f14504:c0:m56
def _get_left_padding_width(self):
    return self._left_padding_width
The number of empty spaces between a column's left edge and its content Arguments: left_padding - number of spaces, must be a positive integer
f14504:c0:m58
def _get_right_padding_width(self):
    return self._right_padding_width
The number of empty spaces between a column's right edge and its content Arguments: right_padding - number of spaces, must be a positive integer
f14504:c0:m60
def _get_vertical_char(self):
    return self._vertical_char
The character used when printing table borders to draw vertical lines Arguments: vertical_char - single character string used to draw vertical lines
f14504:c0:m62
def _get_horizontal_char(self):
    return self._horizontal_char
The character used when printing table borders to draw horizontal lines Arguments: horizontal_char - single character string used to draw horizontal lines
f14504:c0:m64
def _get_junction_char(self):
    return self._junction_char
The character used when printing table borders to draw line junctions Arguments: junction_char - single character string used to draw line junctions
f14504:c0:m66
def _get_format(self):
    return self._format
Controls whether or not HTML tables are formatted to match styling options Arguments: format - True or False
f14504:c0:m68
def _get_print_empty(self):
    return self._print_empty
Controls whether or not empty tables produce a header and frame or just an empty string Arguments: print_empty - True or False
f14504:c0:m70
def _get_attributes(self):
    return self._attributes
A dictionary of HTML attribute name/value pairs to be included in the <table> tag when printing HTML Arguments: attributes - dictionary of attributes
f14504:c0:m72
def add_row(self, row):
    if self._field_names and len(row) != len(self._field_names):
        raise Exception("<STR_LIT>" % (len(row), len(self._field_names)))
    if not self._field_names:
        self.field_names = [("<STR_LIT>" % (n+<NUM_LIT:1>)) for n in range(<NUM_LIT:0>, len(row))]
    self._rows.append(list(row))
Add a row to the table Arguments: row - row of data, should be a list with as many elements as the table has fields
f14504:c0:m80
def del_row(self, row_index):
    if row_index > len(self._rows)-<NUM_LIT:1>:
        raise Exception("<STR_LIT>" % (row_index, len(self._rows)))
    del self._rows[row_index]
Delete a row from the table Arguments: row_index - The index of the row you want to delete. Indexing starts at 0.
f14504:c0:m81
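The two row-mutation methods above reduce to plain list operations plus bounds checks. A minimal standalone sketch (the `MiniTable` class and its messages are hypothetical, not part of PrettyTable):

```python
class MiniTable:
    """Minimal sketch of PrettyTable-style row storage (hypothetical class)."""

    def __init__(self, field_names):
        self.field_names = list(field_names)
        self._rows = []

    def add_row(self, row):
        # Reject rows whose length does not match the field count.
        if self.field_names and len(row) != len(self.field_names):
            raise ValueError(
                "Row has %d values, table has %d fields"
                % (len(row), len(self.field_names)))
        self._rows.append(list(row))

    def del_row(self, row_index):
        # Mirror the bounds check: an index past the last row is an error.
        if row_index > len(self._rows) - 1:
            raise IndexError(
                "Can't delete row %d, table only has %d rows"
                % (row_index, len(self._rows)))
        del self._rows[row_index]

t = MiniTable(["City", "Population"])
t.add_row(["Adelaide", 1158259])
t.add_row(["Brisbane", 1857594])
t.del_row(0)
```

Storing `list(row)` rather than `row` itself means later mutation of the caller's list cannot corrupt the table, matching the copy in `add_row` above.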
def add_column(self, fieldname, column, align="<STR_LIT:c>", valign="<STR_LIT:t>"):
    if len(self._rows) in (<NUM_LIT:0>, len(column)):
        self._validate_align(align)
        self._validate_valign(valign)
        self._field_names.append(fieldname)
        self._align[fieldname] = align
        self._valign[fieldname] = valign
        for i in range(<NUM_LIT:0>, len(column)):
            if len(self._rows) < i+<NUM_LIT:1>:
                self._rows.append([])
            self._rows[i].append(column[i])
    else:
        raise Exception("<STR_LIT>" % (len(column), len(self._rows)))
Add a column to the table. Arguments: fieldname - name of the field to contain the new column of data column - column of data, should be a list with as many elements as the table has rows align - desired alignment for this column - "l" for left, "c" for centre and "r" for right valign - desired vertical alignment for new columns - "t" for top, "m" for middle and "b" for bottom
f14504:c0:m82
def clear_rows(self):
    self._rows = []
Delete all rows from the table but keep the current field names
f14504:c0:m83
def clear(self):
    self._rows = []
    self._field_names = []
    self._widths = []
Delete all rows and field names from the table, maintaining nothing but styling options
f14504:c0:m84
def _get_rows(self, options):
    rows = copy.deepcopy(self._rows[options["<STR_LIT:start>"]:options["<STR_LIT:end>"]])
    if options["<STR_LIT>"]:
        sortindex = self._field_names.index(options["<STR_LIT>"])
        rows = [[row[sortindex]] + row for row in rows]
        rows.sort(reverse=options["<STR_LIT>"], key=options["<STR_LIT>"])
        rows = [row[<NUM_LIT:1>:] for row in rows]
    return rows
Return only those data rows that should be printed, based on slicing and sorting. Arguments: options - dictionary of option settings.
f14504:c0:m89
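The sorting in `_get_rows` is the classic decorate-sort-undecorate pattern: prepend the sort column's value to each row, sort, then strip it off. A self-contained sketch (function and field names are illustrative):

```python
import copy

def sorted_rows(rows, field_names, sortby, key=None, reverse=False):
    # Decorate-sort-undecorate: put the sort column first so the default
    # tuple/list comparison orders rows by it, then drop the decoration.
    rows = copy.deepcopy(rows)
    sortindex = field_names.index(sortby)
    decorated = [[row[sortindex]] + row for row in rows]
    decorated.sort(key=key, reverse=reverse)
    return [row[1:] for row in decorated]

rows = [["b", 2], ["a", 3], ["c", 1]]
result = sorted_rows(rows, ["name", "rank"], "rank")  # → [["c", 1], ["b", 2], ["a", 3]]
```

The deep copy keeps the caller's row data untouched, as in the original.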
def get_string(self, **kwargs):
    options = self._get_options(kwargs)
    lines = []
    if self.rowcount == <NUM_LIT:0> and (not options["<STR_LIT>"] or not options["<STR_LIT>"]):
        return "<STR_LIT>"
    rows = self._get_rows(options)
    formatted_rows = self._format_rows(rows, options)
    self._compute_widths(formatted_rows, options)
    self._hrule = self._stringify_hrule(options)
    if options["<STR_LIT>"]:
        lines.append(self._stringify_header(options))
    elif options["<STR_LIT>"] and options["<STR_LIT>"] in (ALL, FRAME):
        lines.append(self._hrule)
    for row in formatted_rows:
        lines.append(self._stringify_row(row, options))
    if options["<STR_LIT>"] and options["<STR_LIT>"] == FRAME:
        lines.append(self._hrule)
    return self._unicode("<STR_LIT:\n>").join(lines)
Return string representation of table in current state. Arguments: start - index of first data row to include in output end - index of last data row to include in output PLUS ONE (list slice style) fields - names of fields (columns) to include header - print a header showing field names (True or False) border - print a border around the table (True or False) hrules - controls printing of horizontal rules after rows. Allowed values: ALL, FRAME, HEADER, NONE vrules - controls printing of vertical rules between columns. Allowed values: FRAME, ALL, NONE int_format - controls formatting of integer data float_format - controls formatting of floating point data padding_width - number of spaces on either side of column data (only used if left and right paddings are None) left_padding_width - number of spaces on left hand side of column data right_padding_width - number of spaces on right hand side of column data vertical_char - single character string used to draw vertical lines horizontal_char - single character string used to draw horizontal lines junction_char - single character string used to draw line junctions sortby - name of field to sort rows by sort_key - sorting key function, applied to data points before sorting reversesort - True or False to sort in descending or ascending order print_empty - if True, stringify just the header for an empty table, if False return an empty string
f14504:c0:m92
def get_html_string(self, **kwargs):
    options = self._get_options(kwargs)
    if options["<STR_LIT>"]:
        string = self._get_formatted_html_string(options)
    else:
        string = self._get_simple_html_string(options)
    return string
Return string representation of HTML formatted version of table in current state. Arguments: start - index of first data row to include in output end - index of last data row to include in output PLUS ONE (list slice style) fields - names of fields (columns) to include header - print a header showing field names (True or False) border - print a border around the table (True or False) hrules - controls printing of horizontal rules after rows. Allowed values: ALL, FRAME, HEADER, NONE vrules - controls printing of vertical rules between columns. Allowed values: FRAME, ALL, NONE int_format - controls formatting of integer data float_format - controls formatting of floating point data padding_width - number of spaces on either side of column data (only used if left and right paddings are None) left_padding_width - number of spaces on left hand side of column data right_padding_width - number of spaces on right hand side of column data sortby - name of field to sort rows by sort_key - sorting key function, applied to data points before sorting attributes - dictionary of name/value pairs to include as HTML attributes in the <table> tag xhtml - print <br/> tags if True, <br> tags if False
f14504:c0:m96
def generate_table(self, rows):
    table = PrettyTable(**self.kwargs)
    for row in self.rows:
        if len(row[<NUM_LIT:0>]) < self.max_row_width:
            appends = self.max_row_width - len(row[<NUM_LIT:0>])
            for i in range(<NUM_LIT:1>, appends):
                row[<NUM_LIT:0>].append("<STR_LIT:->")
        if row[<NUM_LIT:1>] == True:
            self.make_fields_unique(row[<NUM_LIT:0>])
            table.field_names = row[<NUM_LIT:0>]
        else:
            table.add_row(row[<NUM_LIT:0>])
    return table
Generates a PrettyTable object from a list of rows.
f14504:c1:m4
def make_fields_unique(self, fields):
    for i in range(<NUM_LIT:0>, len(fields)):
        for j in range(i+<NUM_LIT:1>, len(fields)):
            if fields[i] == fields[j]:
                fields[j] += "<STR_LIT:'>"
Iterates over the fields and makes each field name unique.
f14504:c1:m5
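The uniquifying loop above appends a suffix to later duplicates; with the elided literal assumed to be an apostrophe (as the `<STR_LIT:'>` placeholder suggests), a runnable sketch looks like:

```python
def make_fields_unique(fields):
    # Quadratic scan: for each field, rename any later duplicate by
    # appending an apostrophe. Mutates the list in place.
    for i in range(len(fields)):
        for j in range(i + 1, len(fields)):
            if fields[i] == fields[j]:
                fields[j] += "'"
    return fields

names = make_fields_unique(["id", "name", "id", "id"])  # → ["id", "name", "id'", "id''"]
```

Note that a renamed duplicate (`id'`) can itself collide with a later renamed one, which the outer loop resolves on a subsequent pass over index `i`.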
def _pull_all(command):
    page = requests.get(BASE_URL, verify=False)
    soup = BeautifulSoup(page.text, "<STR_LIT>")
    table = soup.find('<STR_LIT>', {'<STR_LIT:class>': '<STR_LIT:list>'})
    rows = table.findAll("<STR_LIT>")
    rows = rows[<NUM_LIT:1>:-<NUM_LIT:1>]
    l = []
    name_max = <NUM_LIT:0>
    for row in rows:
        elements = row.findAll('<STR_LIT>')
        date = elements[<NUM_LIT:0>].string
        name = elements[<NUM_LIT:1>].string
        n = _ascii_checker(name)
        version = n.split('<STR_LIT:U+0020>')[<NUM_LIT:1>]
        name = n.split('<STR_LIT:U+0020>')[<NUM_LIT:0>]
        if name_max < len(name):
            name_max = len(name)
        link = elements[<NUM_LIT:1>].find('<STR_LIT:a>')['<STR_LIT>']
        link = urllib.parse.urljoin(BASE_URL, link)
        desc = elements[<NUM_LIT:2>].string
        li = (name, desc, link, date, version)
        l.append(li)
    print("<STR_LIT>")
    if command == '<STR_LIT>':
        for li in l:
            print("<STR_LIT>" % (li[<NUM_LIT:0>], li[<NUM_LIT:4>]))
    if command == '<STR_LIT:list>':
        for li in l:
            name = li[<NUM_LIT:0>] + "<STR_LIT>".join("<STR_LIT:U+0020>" for i in range(name_max-len(li[<NUM_LIT:0>])))
            desc = li[<NUM_LIT:1>]
            if len(li[<NUM_LIT:1>]) > <NUM_LIT>:
                desc = desc[:<NUM_LIT>] + "<STR_LIT>"
            print("<STR_LIT>" % (name, desc))
Scrapes package info from the website's table content.
f14528:m0
def _release_info(jsn, VERSION):
    try:
        release_point = jsn['<STR_LIT>'][VERSION][<NUM_LIT:0>]
    except KeyError:
        print("<STR_LIT>")
        exit(<NUM_LIT:1>)
    python_version = release_point['<STR_LIT>']
    filename = release_point['<STR_LIT:filename>']
    md5 = release_point['<STR_LIT>']
    download_url_for_release = release_point['<STR_LIT:url>']
    download_num_for_release = release_point['<STR_LIT>']
    download_size_for_release = _sizeof_fmt(int(release_point['<STR_LIT:size>']))
    print(
Gives information about a particular package version.
f14528:m4
def _construct(PACKAGE, VERSION):
    jsn = _get_info(PACKAGE)
    package_url = jsn['<STR_LIT:info>']['<STR_LIT>']
    author = jsn['<STR_LIT:info>']['<STR_LIT>']
    author_email = jsn['<STR_LIT:info>']['<STR_LIT>']
    description = jsn['<STR_LIT:info>']['<STR_LIT:description>']
    last_month = jsn['<STR_LIT:info>']['<STR_LIT>']['<STR_LIT>']
    last_week = jsn['<STR_LIT:info>']['<STR_LIT>']['<STR_LIT>']
    last_day = jsn['<STR_LIT:info>']['<STR_LIT>']['<STR_LIT>']
    classifiers = jsn['<STR_LIT:info>']['<STR_LIT>']
    license = jsn['<STR_LIT:info>']['<STR_LIT>']
    summary = jsn['<STR_LIT:info>']['<STR_LIT>']
    home_page = jsn['<STR_LIT:info>']['<STR_LIT>']
    releases = reversed(list(jsn['<STR_LIT>'].keys()))
    releases = '<STR_LIT>'.join(releases)[:<NUM_LIT>]
    download_url = jsn['<STR_LIT>'][<NUM_LIT:0>]['<STR_LIT:url>']
    filename = jsn['<STR_LIT>'][<NUM_LIT:0>]['<STR_LIT:filename>']
    size = _sizeof_fmt(int(jsn['<STR_LIT>'][<NUM_LIT:0>]['<STR_LIT:size>']))
    if VERSION:
        try:
            _release_info(jsn, VERSION)
        except IndexError:
            print("<STR_LIT>")
        return None
    print(
Construct the information part from the API.
f14528:m5
def main():
    arguments = docopt(__doc__, version=__version__)
    if arguments['<STR_LIT>']:
        _pull_all('<STR_LIT>')
    elif arguments['<STR_LIT:list>']:
        _pull_all('<STR_LIT:list>')
    elif arguments['<STR_LIT>']:
        try:
            if arguments['<STR_LIT>']:
                _construct(arguments['<STR_LIT>'], arguments['<STR_LIT>'])
            else:
                _construct(arguments['<STR_LIT>'], None)
        except ValueError:
            print("<STR_LIT>")
    else:
        print(__doc__)
cheesy gives you the news from today's PyPI cheese factory from the command line
f14528:m6
def __init__(self, freq=<NUM_LIT:8>, width=<NUM_LIT:32>):
    self.freq = freq
    self.width = width
    scale = float(freq) / width
    width2 = width**<NUM_LIT:2>
    texel = (ctypes.c_ushort * (<NUM_LIT:2> * width**<NUM_LIT:3>))()
    for z in range(width):
        for y in range(width):
            for x in range(width):
                texel[(x + (y * width) + (z * width2)) * <NUM_LIT:2>] = int((pnoise3(
                    x * scale, y * scale, z * scale,
                    repeatx=freq, repeaty=freq, repeatz=freq) + <NUM_LIT:1.0>) * <NUM_LIT>)
                texel[(x + (y * width) + (z * width2)) * <NUM_LIT:2> + <NUM_LIT:1>] = int((pnoise3(
                    x * scale, y * scale, z * scale,
                    repeatx=freq, repeaty=freq, repeatz=freq, base=freq + <NUM_LIT:1>) + <NUM_LIT:1.0>) * <NUM_LIT>)
    self.data = texel
Generate the 3D noise texture. freq -- frequency of generated noise over the width of the texture. width -- Width of the texture in texels. The texture is cubic, thus all sides are the same width. Must be a power of two. Using a larger width can reduce artifacts caused by linear interpolation of the noise texture, at the cost of video memory, and possibly slower texture access.
f14531:c0:m0
def load(self):
    glTexImage3D(GL_TEXTURE_3D, <NUM_LIT:0>, GL_LUMINANCE16_ALPHA16,
        self.width, self.width, self.width, <NUM_LIT:0>, GL_LUMINANCE_ALPHA,
        GL_UNSIGNED_SHORT, ctypes.byref(self.data))
Load the noise texture data into the current texture unit
f14531:c0:m1
def enable(self):
    glEnable(GL_TEXTURE_3D)
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_REPEAT)
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR)
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR)
Convenience method to enable 3D texturing state so the texture may be used by the ffpnoise shader function
f14531:c0:m2
def create_3d_texture(width, scale):
    coords = range(width)
    texel = (ctypes.c_byte * width**<NUM_LIT:3>)()
    half = <NUM_LIT:0>
    for z in coords:
        for y in coords:
            for x in coords:
                v = snoise3(x * scale - half, y * scale - half, z * scale - half,
                    octaves=<NUM_LIT:4>, persistence=<NUM_LIT>)
                texel[x + (y * width) + (z * width**<NUM_LIT:2>)] = int(v * <NUM_LIT>)
    glPixelStorei(GL_UNPACK_ALIGNMENT, <NUM_LIT:1>)
    glTexImage3D(GL_TEXTURE_3D, <NUM_LIT:0>, GL_LUMINANCE, width, width, width, <NUM_LIT:0>,
        GL_LUMINANCE, GL_BYTE, ctypes.byref(texel))
    return texel
Create a grayscale 3d texture map with the specified pixel width on each side and load it into the current texture unit. The luminance of each texel is derived using the input function as: v = func(x * scale, y * scale, z * scale) where x, y, z = 0 in the center texel of the texture. func(x, y, z) is assumed to always return a value in the range [-1, 1].
f14533:m0
def on_resize(width, height):
    glViewport(<NUM_LIT:0>, <NUM_LIT:0>, width, height)
    glMatrixMode(GL_PROJECTION)
    glLoadIdentity()
    gluPerspective(<NUM_LIT>, <NUM_LIT:1.0>*width/height, <NUM_LIT:0.1>, <NUM_LIT>)
    glMatrixMode(GL_MODELVIEW)
    glLoadIdentity()
Setup 3D viewport
f14538:m0
def __init__(self, period=None, permutation_table=None, randint_function=None):
    if randint_function is not None:
        if not hasattr(randint_function, '<STR_LIT>'):
            raise TypeError('<STR_LIT>')
        self.randint_function = randint_function
        if period is None:
            period = self.period
    if period is not None and permutation_table is not None:
        raise ValueError('<STR_LIT>')
    if period is not None:
        self.randomize(period)
    elif permutation_table is not None:
        self.permutation = tuple(permutation_table) * <NUM_LIT:2>
        self.period = len(permutation_table)
Initialize the noise generator. With no arguments, the default period and permutation table are used (256). The default permutation table generates the exact same noise pattern each time. An integer period can be specified, to generate a random permutation table with period elements. The period determines the (integer) interval that the noise repeats, which is useful for creating tiled textures. period should be a power-of-two, though this is not enforced. Note that the speed of the noise algorithm is independent of the period size, though larger periods mean a larger table, which consumes more memory. A permutation table consisting of an iterable sequence of whole numbers can be specified directly. This should have a power-of-two length. Typical permutation tables are a sequence of unique integers in the range [0, period) in random order, though other arrangements could prove useful; they will not be "pure" simplex noise. The largest element in the sequence must be no larger than period-1. period and permutation_table may not be specified together. A substitute for the method random.randint(a, b) can be chosen. The method must take two integer parameters a and b and return an integer N such that a <= N <= b.
f14541:c0:m0
def randomize(self, period=None):
    if period is not None:
        self.period = period
    perm = list(range(self.period))
    perm_right = self.period - <NUM_LIT:1>
    for i in list(perm):
        j = self.randint_function(<NUM_LIT:0>, perm_right)
        perm[i], perm[j] = perm[j], perm[i]
    self.permutation = tuple(perm) * <NUM_LIT:2>
Randomize the permutation table used by the noise functions. This makes them generate a different noise pattern for the same inputs.
f14541:c0:m1
def noise2(self, x, y):
    s = (x + y) * _F2
    i = floor(x + s)
    j = floor(y + s)
    t = (i + j) * _G2
    x0 = x - (i - t)
    y0 = y - (j - t)
    if x0 > y0:
        i1 = <NUM_LIT:1>; j1 = <NUM_LIT:0>
    else:
        i1 = <NUM_LIT:0>; j1 = <NUM_LIT:1>
    x1 = x0 - i1 + _G2
    y1 = y0 - j1 + _G2
    x2 = x0 + _G2 * <NUM_LIT> - <NUM_LIT:1.0>
    y2 = y0 + _G2 * <NUM_LIT> - <NUM_LIT:1.0>
    perm = self.permutation
    ii = int(i) % self.period
    jj = int(j) % self.period
    gi0 = perm[ii + perm[jj]] % <NUM_LIT:12>
    gi1 = perm[ii + i1 + perm[jj + j1]] % <NUM_LIT:12>
    gi2 = perm[ii + <NUM_LIT:1> + perm[jj + <NUM_LIT:1>]] % <NUM_LIT:12>
    tt = <NUM_LIT:0.5> - x0**<NUM_LIT:2> - y0**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi0]
        noise = tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x0 + g[<NUM_LIT:1>] * y0)
    else:
        noise = <NUM_LIT:0.0>
    tt = <NUM_LIT:0.5> - x1**<NUM_LIT:2> - y1**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi1]
        noise += tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x1 + g[<NUM_LIT:1>] * y1)
    tt = <NUM_LIT:0.5> - x2**<NUM_LIT:2> - y2**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi2]
        noise += tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x2 + g[<NUM_LIT:1>] * y2)
    return noise * <NUM_LIT>
2D Perlin simplex noise. Return a floating point value from -1 to 1 for the given x, y coordinate. The same value is always returned for a given x, y pair unless the permutation table changes (see randomize above).
f14541:c1:m0
def noise3(self, x, y, z):
    s = (x + y + z) * _F3
    i = floor(x + s)
    j = floor(y + s)
    k = floor(z + s)
    t = (i + j + k) * _G3
    x0 = x - (i - t)
    y0 = y - (j - t)
    z0 = z - (k - t)
    if x0 >= y0:
        if y0 >= z0:
            i1 = <NUM_LIT:1>; j1 = <NUM_LIT:0>; k1 = <NUM_LIT:0>
            i2 = <NUM_LIT:1>; j2 = <NUM_LIT:1>; k2 = <NUM_LIT:0>
        elif x0 >= z0:
            i1 = <NUM_LIT:1>; j1 = <NUM_LIT:0>; k1 = <NUM_LIT:0>
            i2 = <NUM_LIT:1>; j2 = <NUM_LIT:0>; k2 = <NUM_LIT:1>
        else:
            i1 = <NUM_LIT:0>; j1 = <NUM_LIT:0>; k1 = <NUM_LIT:1>
            i2 = <NUM_LIT:1>; j2 = <NUM_LIT:0>; k2 = <NUM_LIT:1>
    else:
        if y0 < z0:
            i1 = <NUM_LIT:0>; j1 = <NUM_LIT:0>; k1 = <NUM_LIT:1>
            i2 = <NUM_LIT:0>; j2 = <NUM_LIT:1>; k2 = <NUM_LIT:1>
        elif x0 < z0:
            i1 = <NUM_LIT:0>; j1 = <NUM_LIT:1>; k1 = <NUM_LIT:0>
            i2 = <NUM_LIT:0>; j2 = <NUM_LIT:1>; k2 = <NUM_LIT:1>
        else:
            i1 = <NUM_LIT:0>; j1 = <NUM_LIT:1>; k1 = <NUM_LIT:0>
            i2 = <NUM_LIT:1>; j2 = <NUM_LIT:1>; k2 = <NUM_LIT:0>
    x1 = x0 - i1 + _G3
    y1 = y0 - j1 + _G3
    z1 = z0 - k1 + _G3
    x2 = x0 - i2 + <NUM_LIT> * _G3
    y2 = y0 - j2 + <NUM_LIT> * _G3
    z2 = z0 - k2 + <NUM_LIT> * _G3
    x3 = x0 - <NUM_LIT:1.0> + <NUM_LIT> * _G3
    y3 = y0 - <NUM_LIT:1.0> + <NUM_LIT> * _G3
    z3 = z0 - <NUM_LIT:1.0> + <NUM_LIT> * _G3
    perm = self.permutation
    ii = int(i) % self.period
    jj = int(j) % self.period
    kk = int(k) % self.period
    gi0 = perm[ii + perm[jj + perm[kk]]] % <NUM_LIT:12>
    gi1 = perm[ii + i1 + perm[jj + j1 + perm[kk + k1]]] % <NUM_LIT:12>
    gi2 = perm[ii + i2 + perm[jj + j2 + perm[kk + k2]]] % <NUM_LIT:12>
    gi3 = perm[ii + <NUM_LIT:1> + perm[jj + <NUM_LIT:1> + perm[kk + <NUM_LIT:1>]]] % <NUM_LIT:12>
    noise = <NUM_LIT:0.0>
    tt = <NUM_LIT> - x0**<NUM_LIT:2> - y0**<NUM_LIT:2> - z0**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi0]
        noise = tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x0 + g[<NUM_LIT:1>] * y0 + g[<NUM_LIT:2>] * z0)
    else:
        noise = <NUM_LIT:0.0>
    tt = <NUM_LIT> - x1**<NUM_LIT:2> - y1**<NUM_LIT:2> - z1**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi1]
        noise += tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x1 + g[<NUM_LIT:1>] * y1 + g[<NUM_LIT:2>] * z1)
    tt = <NUM_LIT> - x2**<NUM_LIT:2> - y2**<NUM_LIT:2> - z2**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi2]
        noise += tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x2 + g[<NUM_LIT:1>] * y2 + g[<NUM_LIT:2>] * z2)
    tt = <NUM_LIT> - x3**<NUM_LIT:2> - y3**<NUM_LIT:2> - z3**<NUM_LIT:2>
    if tt > <NUM_LIT:0>:
        g = _GRAD3[gi3]
        noise += tt**<NUM_LIT:4> * (g[<NUM_LIT:0>] * x3 + g[<NUM_LIT:1>] * y3 + g[<NUM_LIT:2>] * z3)
    return noise * <NUM_LIT>
3D Perlin simplex noise. Return a floating point value from -1 to 1 for the given x, y, z coordinate. The same value is always returned for a given x, y, z pair unless the permutation table changes (see randomize above).
f14541:c1:m1
def noise3(self, x, y, z, repeat, base=<NUM_LIT:0.0>):
    i = int(fmod(floor(x), repeat))
    j = int(fmod(floor(y), repeat))
    k = int(fmod(floor(z), repeat))
    ii = (i + <NUM_LIT:1>) % repeat
    jj = (j + <NUM_LIT:1>) % repeat
    kk = (k + <NUM_LIT:1>) % repeat
    if base:
        i += base; j += base; k += base
        ii += base; jj += base; kk += base
    x -= floor(x); y -= floor(y); z -= floor(z)
    fx = x**<NUM_LIT:3> * (x * (x * <NUM_LIT:6> - <NUM_LIT:15>) + <NUM_LIT:10>)
    fy = y**<NUM_LIT:3> * (y * (y * <NUM_LIT:6> - <NUM_LIT:15>) + <NUM_LIT:10>)
    fz = z**<NUM_LIT:3> * (z * (z * <NUM_LIT:6> - <NUM_LIT:15>) + <NUM_LIT:10>)
    perm = self.permutation
    A = perm[i]
    AA = perm[A + j]
    AB = perm[A + jj]
    B = perm[ii]
    BA = perm[B + j]
    BB = perm[B + jj]
    return lerp(fz, lerp(fy, lerp(fx, grad3(perm[AA + k], x, y, z),
                                      grad3(perm[BA + k], x - <NUM_LIT:1>, y, z)),
                             lerp(fx, grad3(perm[AB + k], x, y - <NUM_LIT:1>, z),
                                      grad3(perm[BB + k], x - <NUM_LIT:1>, y - <NUM_LIT:1>, z))),
                lerp(fy, lerp(fx, grad3(perm[AA + kk], x, y, z - <NUM_LIT:1>),
                                  grad3(perm[BA + kk], x - <NUM_LIT:1>, y, z - <NUM_LIT:1>)),
                         lerp(fx, grad3(perm[AB + kk], x, y - <NUM_LIT:1>, z - <NUM_LIT:1>),
                                  grad3(perm[BB + kk], x - <NUM_LIT:1>, y - <NUM_LIT:1>, z - <NUM_LIT:1>))))
Tileable 3D noise. repeat specifies the integer interval in each dimension when the noise pattern repeats. base allows a different texture to be generated for the same repeat interval.
f14541:c2:m0
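The `fx`/`fy`/`fz` expressions in the tileable `noise3` above are Perlin's quintic fade curve 6t⁵ - 15t⁴ + 10t³ written in Horner form, and `lerp` is plain linear interpolation. A self-contained sketch of these two helpers:

```python
def fade(t):
    # Perlin's quintic smoothstep 6t^5 - 15t^4 + 10t^3, factored as in the
    # source: t^3 * (t * (t * 6 - 15) + 10). Its first and second derivatives
    # vanish at t = 0 and t = 1, which removes grid-aligned artifacts.
    return t**3 * (t * (t * 6 - 15) + 10)

def lerp(t, a, b):
    # Linear interpolation between a and b, weight t in [0, 1].
    return a + t * (b - a)
```

Because `fade(0) == 0`, `fade(1) == 1`, and `fade(0.5) == 0.5`, the curve only reshapes the interpolation weight inside each lattice cell without shifting cell boundaries.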
def save(self, force_insert=False, force_update=False):
    if force_insert and force_update:
        raise ValueError("<STR_LIT>")
    data = {}
    for name, field in self._meta.fields.items():
        if field.serialize:
            data[name] = field.dehydrate(getattr(self, name, None))
    insert = True if force_insert or self.resource_uri is None else False
    if insert:
        resp = self._meta.api.http_resource("<STR_LIT:POST>", self._meta.resource_name, data=self._meta.api.resource_serialize(data))
    else:
        resp = self._meta.api.http_resource("<STR_LIT>", self.resource_uri, data=self._meta.api.resource_serialize(data))
    if "<STR_LIT>" in resp.headers:
        resp = self._meta.api.http_resource("<STR_LIT:GET>", resp.headers["<STR_LIT>"])
    elif resp.status_code == <NUM_LIT>:
        resp = self._meta.api.http_resource("<STR_LIT:GET>", self.resource_uri)
    else:
        return
    data = self._meta.api.resource_deserialize(resp.text)
    self.__init__(**data)
Saves the current instance. Override this in a subclass if you want to control the saving process. The 'force_insert' and 'force_update' parameters can be used to insist that the "save" must be a POST or PUT respectively. Normally, they should not be set.
f14546:c2:m3
def delete(self):
    if self.resource_uri is None:
        raise ValueError("<STR_LIT>".format(self._meta.resource_name))
    self._meta.api.http_resource("<STR_LIT>", self.resource_uri)
Deletes the current instance. Override this in a subclass if you want to control the deleting process.
f14546:c2:m4
def _add_doc(func, doc):
    func.__doc__ = doc
Add documentation to a function.
f14547:m0
def _import_module(name):
    __import__(name)
    return sys.modules[name]
Import module, returning the module after the last dot.
f14547:m1
def add_move(move):
    setattr(_MovedItems, move.name, move)
Add an item to six.moves.
f14547:m2
def remove_move(name):
    try:
        delattr(_MovedItems, name)
    except AttributeError:
        try:
            del moves.__dict__[name]
        except KeyError:
            raise AttributeError("<STR_LIT>" % (name,))
Remove item from six.moves.
f14547:m3
def iterkeys(d):
    return iter(getattr(d, _iterkeys)())
Return an iterator over the keys of a dictionary.
f14547:m4
def itervalues(d):
    return iter(getattr(d, _itervalues)())
Return an iterator over the values of a dictionary.
f14547:m5
def iteritems(d):
    return iter(getattr(d, _iteritems)())
Return an iterator over the (key, value) pairs of a dictionary.
f14547:m6
def with_metaclass(meta, base=object):
    return meta("<STR_LIT>", (base,), {})
Create a base class with a metaclass.
f14547:m7
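The one-liner above works because calling a metaclass directly produces a throwaway base class; any class derived from it is then created by that metaclass on both Python 2 and 3. A sketch of the idea (the `"NewBase"` name and the `Registry` metaclass are illustrative, standing in for the elided literal):

```python
def with_metaclass(meta, base=object):
    # Calling the metaclass directly yields a temporary base class; classes
    # that inherit from it inherit its metaclass too.
    return meta("NewBase", (base,), {})

class Registry(type):
    # Toy metaclass that records the name of every class it creates.
    classes = []
    def __init__(cls, name, bases, ns):
        super(Registry, cls).__init__(name, bases, ns)
        Registry.classes.append(name)

class Plugin(with_metaclass(Registry)):
    pass
```

After this runs, `type(Plugin) is Registry` and `"Plugin"` appears in `Registry.classes`, without ever writing version-specific `metaclass=` / `__metaclass__` syntax.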
def clone(self, klass=None, memo=None, **kwargs):
    obj = Empty()
    obj.__class__ = klass or self.__class__
    obj.resource = self.resource
    obj.filters = self.filters.copy()
    obj.order_by = self.order_by
    obj.low_mark = self.low_mark
    obj.high_mark = self.high_mark
    obj.__dict__.update(kwargs)
    return obj
Creates a copy of the current instance. The 'kwargs' parameter can be used by clients to update attributes after copying has taken place.
f14549:c1:m1
def add_filters(self, **filters):
    self.filters.update(filters)
Adjusts the filters that should be applied to the request to the API.
f14549:c1:m2
def add_ordering(self, ordering=None):
    if ordering is not None:
        self.order_by = ordering
    else:
        self.clear_ordering()
Adds items from the 'ordering' sequence to the query's "order by" clause. These items are field names (not column names), possibly with a direction prefix ('-'). If 'ordering' is empty, all ordering is cleared from the query.
f14549:c1:m3
def clear_ordering(self):
    self.order_by = None
Removes any ordering settings.
f14549:c1:m4
def set_limits(self, low=None, high=None):
    if high is not None:
        if self.high_mark is not None:
            self.high_mark = min(self.high_mark, self.low_mark + high)
        else:
            self.high_mark = self.low_mark + high
    if low is not None:
        if self.high_mark is not None:
            self.low_mark = min(self.high_mark, self.low_mark + low)
        else:
            self.low_mark = self.low_mark + low
Adjusts the limits on the rows retrieved. We use low/high to set these, as it makes it more Pythonic to read and write. When the API query is created, they are converted to the appropriate offset and limit values. Any limits passed in here are applied relative to the existing constraints. So low is added to the current low value and both will be clamped to any existing high value.
f14549:c1:m5
def results(self, limit=<NUM_LIT:100>):
    limited = True if self.high_mark is not None else False
    rmax = self.high_mark - self.low_mark if limited else None
    rnum = <NUM_LIT:0>
    params = self.get_params()
    params["<STR_LIT>"] = self.low_mark
    params["<STR_LIT>"] = limit
    while not limited and rmax is None or rnum < rmax:
        if limited or rmax is not None:
            rleft = rmax - rnum
            params["<STR_LIT>"] = rleft if rleft < limit else limit
        r = self.resource._meta.api.http_resource("<STR_LIT:GET>", self.resource._meta.resource_name, params=params)
        data = self.resource._meta.api.resource_deserialize(r.text)
        if not limited:
            rmax = data["<STR_LIT>"]["<STR_LIT>"]
        if data["<STR_LIT>"]["<STR_LIT>"] < rmax:
            rmax = data["<STR_LIT>"]["<STR_LIT>"]
        params["<STR_LIT>"] = data["<STR_LIT>"]["<STR_LIT>"] + data["<STR_LIT>"]["<STR_LIT>"]
        for item in data["<STR_LIT>"]:
            rnum += <NUM_LIT:1>
            yield item
Yields the results from the API, efficiently handling pagination and properly passing all parameters.
f14549:c1:m6
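The pagination loop in `results()` can be sketched with a fake page-fetching function standing in for the HTTP call; the meta field names (`total_count`, `offset`, `limit`) are assumptions based on typical Tastypie-style responses, since the literals are elided above:

```python
def paginate(fetch_page, offset=0, limit=100, max_rows=None):
    # Yield items page by page; stop at max_rows or the API-reported total.
    rnum = 0
    rmax = max_rows
    while rmax is None or rnum < rmax:
        page_limit = limit if rmax is None else min(limit, rmax - rnum)
        meta, items = fetch_page(offset, page_limit)
        if rmax is None or meta["total_count"] < rmax:
            rmax = meta["total_count"]
        # Advance the offset from the server's reported position, as the
        # original does, rather than from the local count.
        offset = meta["offset"] + meta["limit"]
        for item in items:
            rnum += 1
            yield item
        if not items:  # defensive: avoid spinning on an empty page
            break

# Fake API backed by a list, for illustration only.
data = list(range(10))

def fetch_page(offset, limit):
    return ({"total_count": len(data), "offset": offset, "limit": limit},
            data[offset:offset + limit])
```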
def delete(self):
uris = [obj["<STR_LIT>"] for obj in self.results()]<EOL>data = self.resource._meta.api.resource_serialize({"<STR_LIT>": [], "<STR_LIT>": uris})<EOL>self.resource._meta.api.http_resource("<STR_LIT>", self.resource._meta.resource_name, data=data)<EOL>return len(uris)<EOL>
Deletes the results of this query: it first fetches all the items to be deleted, then issues a PATCH against the list URI of the resource.
f14549:c1:m7
def get_count(self):
params = self.get_params()<EOL>params["<STR_LIT>"] = self.low_mark<EOL>params["<STR_LIT>"] = <NUM_LIT:1><EOL>r = self.resource._meta.api.http_resource("<STR_LIT:GET>", self.resource._meta.resource_name, params=params)<EOL>data = self.resource._meta.api.resource_deserialize(r.text)<EOL>number = data["<STR_LIT>"]["<STR_LIT>"]<EOL>number = max(<NUM_LIT:0>, number - self.low_mark)<EOL>if self.high_mark is not None:<EOL><INDENT>number = min(number, self.high_mark - self.low_mark)<EOL><DEDENT>return number<EOL>
Gets the total_count using the current filter constraints.
f14549:c1:m9
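The count arithmetic in `get_count()` can be isolated as a small pure function (a sketch mirroring the offset/limit adjustment above):

```python
def clamp_count(total_count, low_mark, high_mark):
    # The API's total is first adjusted for the offset (low_mark), then
    # capped by any high limit, exactly as get_count() does above.
    number = max(0, total_count - low_mark)
    if high_mark is not None:
        number = min(number, high_mark - low_mark)
    return number
```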
def can_filter(self):
return not self.low_mark and self.high_mark is None<EOL>
Returns True if adding filters to this instance is still possible. Typically, this means no limits or offsets have been put on the results.
f14549:c1:m10
def __deepcopy__(self, memo):
obj = self.__class__()<EOL>for k, v in six.iteritems(self.__dict__):<EOL><INDENT>if k in ("<STR_LIT>", "<STR_LIT>"):<EOL><INDENT>obj.__dict__[k] = None<EOL><DEDENT>else:<EOL><INDENT>obj.__dict__[k] = copy.deepcopy(v, memo)<EOL><DEDENT><DEDENT>return obj<EOL>
Deep copy of a QuerySet doesn't populate the cache
f14549:c2:m1
def __getstate__(self):
<EOL>len(self)<EOL>obj_dict = self.__dict__.copy()<EOL>obj_dict["<STR_LIT>"] = None<EOL>return obj_dict<EOL>
Allows the QuerySet to be pickled.
f14549:c2:m2
def __getitem__(self, k):
if not isinstance(k, (slice,) + six.integer_types):<EOL><INDENT>raise TypeError<EOL><DEDENT>assert ((not isinstance(k, slice) and (k >= <NUM_LIT:0>))<EOL>or (isinstance(k, slice) and (k.start is None or k.start >= <NUM_LIT:0>)<EOL>and (k.stop is None or k.stop >= <NUM_LIT:0>))),"<STR_LIT>"<EOL>if self._result_cache is not None:<EOL><INDENT>if self._iter is not None:<EOL><INDENT>if isinstance(k, slice):<EOL><INDENT>if k.stop is not None:<EOL><INDENT>bound = int(k.stop)<EOL><DEDENT>else:<EOL><INDENT>bound = None<EOL><DEDENT><DEDENT>else:<EOL><INDENT>bound = k + <NUM_LIT:1><EOL><DEDENT>if len(self._result_cache) < bound:<EOL><INDENT>self._fill_cache(bound - len(self._result_cache))<EOL><DEDENT><DEDENT>return self._result_cache[k]<EOL><DEDENT>if isinstance(k, slice):<EOL><INDENT>qs = self._clone()<EOL>if k.start is not None:<EOL><INDENT>start = int(k.start)<EOL><DEDENT>else:<EOL><INDENT>start = None<EOL><DEDENT>if k.stop is not None:<EOL><INDENT>stop = int(k.stop)<EOL><DEDENT>else:<EOL><INDENT>stop = None<EOL><DEDENT>qs.query.set_limits(start, stop)<EOL>return k.step and list(qs)[::k.step] or qs<EOL><DEDENT>qs = self._clone()<EOL>qs.query.set_limits(k, k + <NUM_LIT:1>)<EOL>return list(qs)[<NUM_LIT:0>]<EOL>
Retrieves an item or slice from the set of results.
f14549:c2:m7
def iterator(self):
for item in self.query.results():<EOL><INDENT>obj = self.resource(**item)<EOL>yield obj<EOL><DEDENT>
An iterator over the results from applying this QuerySet to the api.
f14549:c2:m8
def count(self):
if self._result_cache is not None and not self._iter:<EOL><INDENT>return len(self._result_cache)<EOL><DEDENT>return self.query.get_count()<EOL>
Returns the number of records as an integer. If the QuerySet is already fully cached this simply returns the length of the cached results set to avoid an api call.
f14549:c2:m9
def get(self, *args, **kwargs):
clone = self.filter(*args, **kwargs)<EOL>if self.query.can_filter():<EOL><INDENT>clone = clone.order_by()<EOL><DEDENT>num = len(clone)<EOL>if num == <NUM_LIT:1>:<EOL><INDENT>return clone._result_cache[<NUM_LIT:0>]<EOL><DEDENT>if not num:<EOL><INDENT>raise self.resource.DoesNotExist(<EOL>"<STR_LIT>"<EOL>"<STR_LIT>" %<EOL>(self.resource._meta.resource_name, kwargs))<EOL><DEDENT>raise self.resource.MultipleObjectsReturned(<EOL>"<STR_LIT>"<EOL>"<STR_LIT>" %<EOL>(self.resource._meta.resource_name, num, kwargs))<EOL>
Performs the query and returns a single object matching the given keyword arguments.
f14549:c2:m10
def create(self, **kwargs):
obj = self.resource(**kwargs)<EOL>obj.save(force_insert=True)<EOL>return obj<EOL>
Creates a new object with the given kwargs, saving it to the api and returning the created object.
f14549:c2:m11
def get_or_create(self, **kwargs):
assert kwargs, "<STR_LIT>"<EOL>defaults = kwargs.pop("<STR_LIT>", {})<EOL>lookup = kwargs.copy()<EOL>try:<EOL><INDENT>return self.get(**lookup), False<EOL><DEDENT>except self.resource.DoesNotExist:<EOL><INDENT>params = dict([(k, v) for k, v in kwargs.items()])<EOL>params.update(defaults)<EOL>obj = self.create(**params)<EOL>return obj, True<EOL><DEDENT>
Looks up an object with the given kwargs, creating one if necessary. Returns a tuple of (object, created), where created is a boolean specifying whether an object was created.
f14549:c2:m12
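The `defaults`-popping pattern used by `get_or_create()` can be shown in isolation (a hypothetical helper, not part of the library): the `defaults` dict is excluded from the lookup but merged into the creation parameters.

```python
def split_lookup_and_defaults(**kwargs):
    # `defaults` are only used when creating, never for the lookup.
    defaults = kwargs.pop("defaults", {})
    lookup = kwargs.copy()
    params = dict(kwargs)
    params.update(defaults)
    return lookup, params
```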
def delete(self):
assert self.query.can_filter(), "<STR_LIT>"<EOL>del_query = self._clone()<EOL>del_query.query.clear_ordering()<EOL>return del_query.query.delete()<EOL>
Deletes the records in the current QuerySet.
f14549:c2:m13
def all(self):
return self._clone()<EOL>
Returns a new QuerySet that is a copy of the current one.
f14549:c2:m15
def filter(self, **kwargs):
if kwargs:<EOL><INDENT>assert self.query.can_filter(), "<STR_LIT>"<EOL><DEDENT>clone = self._clone()<EOL>clone.query.add_filters(**kwargs)<EOL>return clone<EOL>
Returns a new QuerySet instance with the args ANDed to the existing set.
f14549:c2:m16
def order_by(self, field_name=None):
assert self.query.can_filter(), "<STR_LIT>"<EOL>clone = self._clone()<EOL>clone.query.clear_ordering()<EOL>if field_name is not None:<EOL><INDENT>clone.query.add_ordering(field_name)<EOL><DEDENT>return clone<EOL>
Returns a new QuerySet instance with the ordering changed.
f14549:c2:m17
@property<EOL><INDENT>def ordered(self):<DEDENT>
if self.query.order_by:<EOL><INDENT>return True<EOL><DEDENT>else:<EOL><INDENT>return False<EOL><DEDENT>
Returns True if the QuerySet is ordered -- i.e. has an order_by() clause.
f14549:c2:m18
def _fill_cache(self, num=None):
if self._iter:<EOL><INDENT>try:<EOL><INDENT>for i in range(num or ITER_CHUNK_SIZE):<EOL><INDENT>self._result_cache.append(next(self._iter))<EOL><DEDENT><DEDENT>except StopIteration:<EOL><INDENT>self._iter = None<EOL><DEDENT><DEDENT>
Fills the result cache with 'num' more entries (or until the results iterator is exhausted).
f14549:c2:m20
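The cache-filling behaviour of `_fill_cache` can be sketched as a standalone function operating on a list and an iterator (the chunk-size constant is an assumption; the original's value is not shown):

```python
ITER_CHUNK_SIZE = 100  # assumed default chunk size

def fill_cache(cache, it, num=None):
    # Append up to `num` more items from the results iterator into the
    # cache; return None once the iterator is exhausted (mirroring how
    # the original sets self._iter = None on StopIteration).
    if it is None:
        return None
    try:
        for _ in range(num or ITER_CHUNK_SIZE):
            cache.append(next(it))
    except StopIteration:
        return None
    return it
```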
def subclass_exception(name, parents, module, attached_to=None):
class_dict = {'<STR_LIT>': module}<EOL>if attached_to is not None:<EOL><INDENT>def __reduce__(self):<EOL><INDENT>return (unpickle_inner_exception, (attached_to, name), self.args)<EOL><DEDENT>def __setstate__(self, args):<EOL><INDENT>self.args = args<EOL><DEDENT>class_dict['<STR_LIT>'] = __reduce__<EOL>class_dict['<STR_LIT>'] = __setstate__<EOL><DEDENT>return type(name, parents, class_dict)<EOL>
Create exception subclass. If 'attached_to' is supplied, the exception will be created in a way that allows it to be pickled, assuming the returned exception class will be added as an attribute to the 'attached_to' class.
f14551:m1
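A minimal usage sketch of the factory, without the pickle support (the `Book` class is hypothetical, purely for illustration):

```python
def subclass_exception(name, parents, module):
    # Simplified version of the factory above: build an exception class
    # dynamically with type(), setting __module__ so it reprs sensibly.
    return type(name, parents, {'__module__': module})

class Book:
    # Hypothetical resource class to attach the exception to.
    pass

Book.DoesNotExist = subclass_exception('DoesNotExist', (Exception,), __name__)
```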
@staticmethod<EOL><INDENT>def resource_serialize(o):<DEDENT>
return json.dumps(o)<EOL>
Returns JSON serialization of given object.
f14552:c0:m4
@staticmethod<EOL><INDENT>def resource_deserialize(s):<DEDENT>
try:<EOL><INDENT>return json.loads(s)<EOL><DEDENT>except ValueError:<EOL><INDENT>raise ResponseError("<STR_LIT>")<EOL><DEDENT>
Returns dict deserialization of a given JSON string.
f14552:c0:m5
def http_resource(self, method, url, params=None, data=None):
url = urllib_parse.urljoin(self.url, url)<EOL>url = url if url.endswith("<STR_LIT:/>") else url + "<STR_LIT:/>"<EOL>headers = None<EOL>if method.lower() in self.unsupported_methods:<EOL><INDENT>headers = {"<STR_LIT>": method.upper()}<EOL>method = "<STR_LIT:POST>"<EOL><DEDENT>r = self.session.request(method, url, params=params, data=data, headers=headers)<EOL>r.raise_for_status()<EOL>return r<EOL>
Makes an HTTP request.
f14552:c0:m6
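The method-override trick in `http_resource()` (tunnelling unsupported verbs through POST) can be sketched as a pure function; the header name `X-HTTP-Method-Override` is an assumption, since the literal is elided above:

```python
def prepare_request(method, url, unsupported_methods=("patch",)):
    # Ensure a trailing slash, then rewrite verbs the server rejects
    # (e.g. PATCH) as POST plus an override header.
    url = url if url.endswith("/") else url + "/"
    headers = None
    if method.lower() in unsupported_methods:
        headers = {"X-HTTP-Method-Override": method.upper()}
        method = "POST"
    return method, url, headers
```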
def find_nth(s: str, x: str, n: int = <NUM_LIT:0>, overlap: bool = False) -> int:
<EOL>length_of_fragment = <NUM_LIT:1> if overlap else len(x)<EOL>i = -length_of_fragment<EOL>for _ in range(n + <NUM_LIT:1>):<EOL><INDENT>i = s.find(x, i + length_of_fragment)<EOL>if i < <NUM_LIT:0>:<EOL><INDENT>break<EOL><DEDENT><DEDENT>return i<EOL>
Finds the position of the *n*\ th occurrence of ``x`` in ``s``, or ``-1`` if there isn't one. - The ``n`` parameter is zero-based (i.e. 0 for the first, 1 for the second...). - If ``overlap`` is true, allows fragments to overlap. If not, they must be distinct. As per https://stackoverflow.com/questions/1883980/find-the-nth-occurrence-of-substring-in-a-string
f14554:m0
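A clean, runnable rendering of `find_nth` for illustration (the body above is reconstructed from the Stack Overflow answer the docstring cites, so treat this as a sketch of the same algorithm):

```python
def find_nth(s, x, n=0, overlap=False):
    # Step by 1 when overlapping matches are allowed, else by len(x)
    # so successive matches are distinct.
    step = 1 if overlap else len(x)
    i = -step
    for _ in range(n + 1):
        i = s.find(x, i + step)
        if i < 0:
            break
    return i
```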
def split_string(x: str, n: int) -> List[str]:
<EOL>return [x[i:i+n] for i in range(<NUM_LIT:0>, len(x), n)]<EOL>
Split string into chunks of length n
f14554:m1
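The chunking one-liner in `split_string`, shown runnable (same logic as the tokenized body above):

```python
def split_string(x, n):
    # Chunk a string into pieces of length n; the last piece may be shorter.
    return [x[i:i + n] for i in range(0, len(x), n)]
```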
def multiple_replace(text: str, rep: Dict[str, str]) -> str:
rep = dict((re.escape(k), v) for k, v in rep.items())<EOL>pattern = re.compile("<STR_LIT:|>".join(rep.keys()))<EOL>return pattern.sub(lambda m: rep[re.escape(m.group(<NUM_LIT:0>))], text)<EOL>
Returns a version of ``text`` in which the keys of ``rep`` (a dict) have been replaced by their values. As per http://stackoverflow.com/questions/6116978/python-replace-multiple-strings.
f14554:m2
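A runnable rendering of `multiple_replace` (the elided literal above is presumably the `"|"` alternation separator, which is assumed here). Because all replacements happen in a single regex pass, swaps work correctly; sequential `str.replace` calls would not:

```python
import re

def multiple_replace(text, rep):
    # Escape keys so they match literally, then substitute in one pass.
    rep = {re.escape(k): v for k, v in rep.items()}
    pattern = re.compile("|".join(rep.keys()))
    return pattern.sub(lambda m: rep[re.escape(m.group(0))], text)
```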
def replace_in_list(stringlist: Iterable[str],<EOL>replacedict: Dict[str, str]) -> List[str]:
newlist = []<EOL>for fromstring in stringlist:<EOL><INDENT>newlist.append(multiple_replace(fromstring, replacedict))<EOL><DEDENT>return newlist<EOL>
Returns a list produced by applying :func:`multiple_replace` to every string in ``stringlist``. Args: stringlist: list of source strings replacedict: dictionary mapping "original" to "replacement" strings Returns: list of final strings
f14554:m3
def mangle_unicode_to_ascii(s: Any) -> str:
<EOL>if s is None:<EOL><INDENT>return "<STR_LIT>"<EOL><DEDENT>if not isinstance(s, str):<EOL><INDENT>s = str(s)<EOL><DEDENT>return (<EOL>unicodedata.normalize('<STR_LIT>', s)<EOL>.encode('<STR_LIT:ascii>', '<STR_LIT:ignore>') <EOL>.decode('<STR_LIT:ascii>') <EOL>)<EOL>
Mangle unicode to ASCII, losing accents etc. in the process.
f14554:m4
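A runnable sketch of `mangle_unicode_to_ascii`; the normalization form literal is elided above, so `'NFKD'` is an assumption (it is the usual choice for this decompose-then-strip-accents technique):

```python
import unicodedata

def mangle_unicode_to_ascii(s):
    # Decompose accented characters (NFKD assumed), then drop anything
    # that cannot be encoded as ASCII.
    if s is None:
        return ""
    if not isinstance(s, str):
        s = str(s)
    return (unicodedata.normalize('NFKD', s)
            .encode('ascii', 'ignore')
            .decode('ascii'))
```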
def strnum(prefix: str, num: int, suffix: str = "<STR_LIT>") -> str:
return "<STR_LIT>".format(prefix, num, suffix)<EOL>
Makes a string of the format ``<prefix><number><suffix>``.
f14554:m5
def strnumlist(prefix: str, numbers: List[int], suffix: str = "<STR_LIT>") -> List[str]:
return ["<STR_LIT>".format(prefix, num, suffix) for num in numbers]<EOL>
Makes a string of the format ``<prefix><number><suffix>`` for every number in ``numbers``, and returns them as a list.
f14554:m6
def strseq(prefix: str, first: int, last: int, suffix: str = "<STR_LIT>") -> List[str]:
return [strnum(prefix, n, suffix) for n in range(first, last + <NUM_LIT:1>)]<EOL>
Makes a string of the format ``<prefix><number><suffix>`` for every number from ``first`` to ``last`` inclusive, and returns them as a list.
f14554:m7
def ip_addresses_from_xff(value: str) -> List[str]:
if not value:<EOL><INDENT>return []<EOL><DEDENT>return [x.strip() for x in value.split("<STR_LIT:U+002C>")]<EOL>
Returns a list of IP addresses (as strings), given the value of an HTTP ``X-Forwarded-For`` (or ``WSGI HTTP_X_FORWARDED_FOR``) header. Args: value: the value of an HTTP ``X-Forwarded-For`` (or ``WSGI HTTP_X_FORWARDED_FOR``) header Returns: a list of IP addresses as strings See: - https://en.wikipedia.org/wiki/X-Forwarded-For - https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/X-Forwarded-For # noqa - NOT THIS: http://tools.ietf.org/html/rfc7239
f14555:m0
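A runnable rendering of `ip_addresses_from_xff` (the elided literal above is the comma separator, per the `U+002C` token):

```python
def ip_addresses_from_xff(value):
    # X-Forwarded-For is a comma-separated list, leftmost client first;
    # strip whitespace around each entry.
    if not value:
        return []
    return [x.strip() for x in value.split(",")]
```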
def first_from_xff(value: str) -> str:
ip_addresses = ip_addresses_from_xff(value)<EOL>if not ip_addresses:<EOL><INDENT>return '<STR_LIT>'<EOL><DEDENT>return ip_addresses[<NUM_LIT:0>]<EOL>
Returns the first IP address from an ``X-Forwarded-For`` header; see :func:`ip_addresses_from_xff`. Args: value: the value of an HTTP ``X-Forwarded-For`` (or ``WSGI HTTP_X_FORWARDED_FOR``) header Returns: an IP address as a string, or ``''`` if none is found
f14555:m1
def __init__(self,<EOL>trusted_proxy_headers: List[str] = None,<EOL>http_host: str = None,<EOL>remote_addr: str = None,<EOL>script_name: str = None,<EOL>server_name: str = None,<EOL>server_port: int = None,<EOL>url_scheme: str = None,<EOL>rewrite_path_info: bool = False) -> None:
self.trusted_proxy_headers = [] <EOL>if trusted_proxy_headers:<EOL><INDENT>for x in trusted_proxy_headers:<EOL><INDENT>h = x.upper()<EOL>if h in ReverseProxiedMiddleware.ALL_CANDIDATES:<EOL><INDENT>self.trusted_proxy_headers.append(h)<EOL><DEDENT><DEDENT><DEDENT>self.http_host = http_host<EOL>self.remote_addr = remote_addr<EOL>self.script_name = script_name.rstrip("<STR_LIT:/>") if script_name else "<STR_LIT>"<EOL>self.server_name = server_name<EOL>self.server_port = str(server_port) if server_port is not None else "<STR_LIT>"<EOL>self.url_scheme = url_scheme.lower() if url_scheme else "<STR_LIT>"<EOL>self.rewrite_path_info = rewrite_path_info<EOL>
Args: trusted_proxy_headers: list of headers, from :const:`ReverseProxiedMiddleware.ALL_CANDIDATES`, that the middleware will treat as trusted and obey. All others from this list will be stripped. http_host: Value to write to the ``HTTP_HOST`` WSGI variable. If not specified, an appropriate trusted header will be used (if there is one). remote_addr: ... similarly for ``REMOTE_ADDR`` script_name: ... similarly for ``SCRIPT_NAME`` server_name: ... similarly for ``SERVER_NAME`` server_port: ... similarly for ``SERVER_PORT`` url_scheme: ... similarly for ``URL_SCHEME`` (e.g. ``"https"``) rewrite_path_info: If ``True``, then if the middleware sets ``SCRIPT_NAME`` and ``PATH_INFO`` starts with ``SCRIPT_NAME``, the ``SCRIPT_NAME`` will be stripped off the front of ``PATH_INFO``. This is appropriate for front-end web servers that fail to rewrite the incoming URL properly. (Do not use for Apache with ``ProxyPass``; ``ProxyPass`` rewrites the URLs properly for you.) ... as per e.g. http://flask.pocoo.org/snippets/35/
f14555:c0:m0
def necessary(self) -> bool:
return any([<EOL>self.trusted_proxy_headers,<EOL>self.http_host,<EOL>self.remote_addr,<EOL>self.script_name,<EOL>self.server_name,<EOL>self.server_port,<EOL>self.url_scheme,<EOL>self.rewrite_path_info,<EOL>])<EOL>
Is any special handling (e.g. the addition of :class:`ReverseProxiedMiddleware`) necessary for this config?
f14555:c0:m1
def __call__(self,<EOL>environ: TYPE_WSGI_ENVIRON,<EOL>start_response: TYPE_WSGI_START_RESPONSE) -> TYPE_WSGI_APP_RESULT:
<EOL>if self.debug:<EOL><INDENT>log.debug("<STR_LIT>", pformat(environ))<EOL>oldenv = environ.copy()<EOL><DEDENT>keys_to_keep = [] <EOL>config = self.config<EOL>http_host = (<EOL>config.http_host or <EOL>self._get_first(environ, self.vars_host, keys_to_keep)<EOL>)<EOL>if http_host:<EOL><INDENT>environ[WsgiEnvVar.HTTP_HOST] = http_host<EOL><DEDENT>remote_addr = (<EOL>config.remote_addr or<EOL>self._get_first(environ, self.vars_addr, keys_to_keep,<EOL>as_remote_addr=True)<EOL>)<EOL>if remote_addr:<EOL><INDENT>environ[WsgiEnvVar.REMOTE_ADDR] = remote_addr<EOL><DEDENT>script_name = (<EOL>config.script_name or<EOL>self._get_first(environ, self.vars_script, keys_to_keep)<EOL>)<EOL>if script_name:<EOL><INDENT>environ[WsgiEnvVar.SCRIPT_NAME] = script_name<EOL>path_info = environ[WsgiEnvVar.PATH_INFO]<EOL>if config.rewrite_path_info and path_info.startswith(script_name):<EOL><INDENT>newpath = path_info[len(script_name):]<EOL>if not newpath: <EOL><INDENT>newpath = "<STR_LIT:/>"<EOL><DEDENT>environ[WsgiEnvVar.PATH_INFO] = newpath<EOL><DEDENT><DEDENT>server_name = (<EOL>config.server_name or<EOL>self._get_first(environ, self.vars_server, keys_to_keep)<EOL>)<EOL>if server_name:<EOL><INDENT>environ[WsgiEnvVar.SERVER_NAME] = server_name<EOL><DEDENT>server_port = (<EOL>config.server_port or<EOL>self._get_first(environ, self.vars_port, keys_to_keep)<EOL>)<EOL>if server_port:<EOL><INDENT>environ[WsgiEnvVar.SERVER_PORT] = server_port<EOL><DEDENT>url_scheme = (<EOL>config.url_scheme or <EOL>self._get_first(environ, self.vars_scheme_a, keys_to_keep) or<EOL>self._proto_if_one_true(environ, self.vars_scheme_b, keys_to_keep)<EOL>)<EOL>if url_scheme:<EOL><INDENT>url_scheme = url_scheme.lower()<EOL>environ[WsgiEnvVar.WSGI_URL_SCHEME] = url_scheme<EOL><DEDENT>delete_keys(environ,<EOL>keys_to_delete=self.ALL_CANDIDATES,<EOL>keys_to_keep=keys_to_keep)<EOL>if self.debug:<EOL><INDENT>changes = dict_diff(oldenv, environ)<EOL>log.debug("<STR_LIT>", pformat(changes))<EOL><DEDENT>return self.app(environ, start_response)<EOL>
----------------------------------------------------------------------- REWRITING THE HOST (setting HTTP_HOST): ----------------------------------------------------------------------- If you don't rewrite the host, the Pyramid debug toolbar will get things a bit wrong. An example: http://127.0.0.1:80/camcops is proxied by Apache to http://127.0.0.1:8000/camcops. In that situation, HTTP_HOST will be '127.0.0.1:8000', and so the Pyramid debug toolbar will start asking the web browser to go to http://127.0.0.1:8000/camcops/_debug_toolbar/... ... which is wrong (it's a reference to the "internal" site). If you allow the host to be rewritten, then you get a sensible reference e.g. to http://wombat/camcops/_debug_toolbar/... Should we be looking at HTTP_X_FORWARDED_HOST or HTTP_X_FORWARDED_SERVER? See https://github.com/omnigroup/Apache/blob/master/httpd/modules/proxy/mod_proxy_http.c # noqa ... and let's follow mod_wsgi. ----------------------------------------------------------------------- HTTP_HOST versus SERVER_NAME ----------------------------------------------------------------------- https://stackoverflow.com/questions/2297403/what-is-the-difference-between-http-host-and-server-name-in-php # noqa ----------------------------------------------------------------------- REWRITING THE PROTOCOL ----------------------------------------------------------------------- Consider how we get here. For example, we may have this sequence: .. code-block:: none user's web browser -> Apache front-end web server via HTTPS on port 443 -> ProxyPass/ProxyPassReverse -> CherryPy server via HTTP on port 8000 or via a Unix socket -> ... -> cherrypy/wsgiserver/__init__.py, WSGIGateway_10.get_environ() ... which creates a WSGI environment from an HTTP request. So if you want to see what's coming by way of raw headers, put this in at the end of that get_environ() function: .. code-block:: python from pprint import pformat; import logging; log = logging.getLogger(__name__); log.critical("Request headers:\n" + pformat(req.inheaders))
f14555:c1:m4
def __init__(self, app: TYPE_WSGI_APP,<EOL>logger: logging.Logger = log,<EOL>loglevel: int = logging.INFO,<EOL>show_request_immediately: bool = True,<EOL>show_response: bool = True,<EOL>show_timing: bool = True) -> None:
self.app = app<EOL>self.logger = logger<EOL>self.loglevel = loglevel<EOL>self.show_response = show_response<EOL>self.show_request_immediately = show_request_immediately<EOL>self.show_timing = show_timing<EOL>self.two_parts = show_request_immediately and (<EOL>show_response or show_timing)<EOL>
Args: app: The WSGI application to wrap logger: The Python logger to write to loglevel: The log level to use (e.g. ``logging.DEBUG``, ``logging.INFO``) show_request_immediately: Show the request immediately, so it's written to the log before the WSGI app does its processing, and is guaranteed to be visible even if the WSGI app hangs? The only reason to use ``False`` is probably if you intend to show response and/or timing information and you want to minimize the number of lines written to the log; in this case, only a single line is written to the log (after the wrapped WSGI app has finished processing). show_response: Show the HTTP response code? show_timing: Show the time that the wrapped WSGI app took?
f14557:c0:m0
def log(self, msg) -> None:
self.logger.log(self.loglevel, msg)<EOL>
Writes a message to the chosen log.
f14557:c0:m1
def add_never_cache_headers(headers: TYPE_WSGI_RESPONSE_HEADERS) -> None:
headers.append(("<STR_LIT>", "<STR_LIT>")) <EOL>headers.append(("<STR_LIT>", "<STR_LIT>")) <EOL>headers.append(("<STR_LIT>", "<STR_LIT:0>"))<EOL>
Adds WSGI headers to say "never cache this response".
f14558:m0
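The header names and values are elided above; the conventional anti-caching trio is sketched below as an assumption (the visible `"0"` literal is consistent with `Expires: 0`):

```python
def add_never_cache_headers(headers):
    # Standard trio telling browsers and proxies never to cache the
    # response. Header names/values are assumed, not taken from the source.
    headers.append(("Cache-Control", "no-cache, no-store, must-revalidate"))
    headers.append(("Pragma", "no-cache"))
    headers.append(("Expires", "0"))
```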
@abstractmethod<EOL><INDENT>def close(self) -> None:<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#connection-objects
f14561:c1:m0
@abstractmethod<EOL><INDENT>def commit(self) -> None:<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#connection-objects
f14561:c1:m1
@abstractmethod<EOL><INDENT>def rollback(self) -> None:<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#connection-objects
f14561:c1:m2
@abstractmethod<EOL><INDENT>def cursor(self) -> "<STR_LIT>":<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#connection-objects
f14561:c1:m3
@property<EOL><INDENT>@abstractmethod<EOL>def messages(self) -> List[Tuple[Type, Any]]:<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#optional-db-api-extensions
f14561:c1:m4
@abstractmethod<EOL><INDENT>def __iter__(self) -> Iterator[_DATABASE_ROW_TYPE]:<DEDENT>
pass<EOL>
See https://www.python.org/dev/peps/pep-0249/#optional-db-api-extensions
f14561:c2:m1
@property<EOL><INDENT>@abstractmethod<EOL>def description(self) -> Optional[Sequence[Sequence[Any]]]:<DEDENT>
pass<EOL>
A sequence of column_description objects, where each column_description describes one result column and has the following items: - name: ``str`` - type_code: ``Optional[Type]``? Not sure. - display_size: ``Optional[int]`` - internal_size: ``Optional[int]`` - precision: ``Optional[int]`` - scale: ``Optional[int]`` - null_ok: ``Optional[bool]`` The attribute is ``None`` for operations that don't return rows, and for un-executed cursors.
f14561:c2:m4