| body | body_hash | docstring | path | name | repository_name | lang | body_without_docstring |
|---|---|---|---|---|---|---|---|
def boundary_integral(self, child, discretised_child, region):
'\n Implements the boundary integral for a spatial method.\n\n Parameters\n ----------\n child: :class:`pybamm.Symbol`\n The symbol being integrated\n discretised_child: :class:`pybamm.Symbol`\n ... | -2,703,065,040,527,775,000 | Implements the boundary integral for a spatial method.
Parameters
----------
child: :class:`pybamm.Symbol`
The symbol being integrated
discretised_child: :class:`pybamm.Symbol`
The discretised symbol of the correct size
region: str
The region of the boundary over which to integrate. If region i... | pybamm/spatial_methods/spatial_method.py | boundary_integral | jedgedrudd/PyBaMM | python | def boundary_integral(self, child, discretised_child, region):
'\n Implements the boundary integral for a spatial method.\n\n Parameters\n ----------\n child: :class:`pybamm.Symbol`\n The symbol being integrated\n discretised_child: :class:`pybamm.Symbol`\n ... |
def delta_function(self, symbol, discretised_symbol):
'\n Implements the delta function on the appropriate side for a spatial method.\n\n Parameters\n ----------\n symbol: :class:`pybamm.Symbol`\n The symbol being integrated\n discretised_symbol: :class:`pyba... | 3,166,707,958,765,238,300 | Implements the delta function on the appropriate side for a spatial method.
Parameters
----------
symbol: :class:`pybamm.Symbol`
The symbol being integrated
discretised_symbol: :class:`pybamm.Symbol`
The discretised symbol of the correct size | pybamm/spatial_methods/spatial_method.py | delta_function | jedgedrudd/PyBaMM | python | def delta_function(self, symbol, discretised_symbol):
'\n Implements the delta function on the appropriate side for a spatial method.\n\n Parameters\n ----------\n symbol: :class:`pybamm.Symbol`\n The symbol being integrated\n discretised_symbol: :class:`pyba... |
def internal_neumann_condition(self, left_symbol_disc, right_symbol_disc, left_mesh, right_mesh):
'\n A method to find the internal Neumann conditions between two symbols\n on adjacent subdomains.\n\n Parameters\n ----------\n left_symbol_disc : :class:`pybamm.Symbol`\n ... | -1,846,082,221,403,767,600 | A method to find the internal Neumann conditions between two symbols
on adjacent subdomains.
Parameters
----------
left_symbol_disc : :class:`pybamm.Symbol`
The discretised symbol on the left subdomain
right_symbol_disc : :class:`pybamm.Symbol`
The discretised symbol on the right subdomain
left_mesh : list
... | pybamm/spatial_methods/spatial_method.py | internal_neumann_condition | jedgedrudd/PyBaMM | python | def internal_neumann_condition(self, left_symbol_disc, right_symbol_disc, left_mesh, right_mesh):
'\n A method to find the internal Neumann conditions between two symbols\n on adjacent subdomains.\n\n Parameters\n ----------\n left_symbol_disc : :class:`pybamm.Symbol`\n ... |
def boundary_value_or_flux(self, symbol, discretised_child):
'\n Returns the boundary value or flux using the appropriate expression for the\n spatial method. To do this, we create a sparse vector \'bv_vector\' that extracts\n either the first (for side="left") or last (for side="right") point f... | -5,708,088,011,332,040,000 | Returns the boundary value or flux using the appropriate expression for the
spatial method. To do this, we create a sparse vector 'bv_vector' that extracts
either the first (for side="left") or last (for side="right") point from
'discretised_child'.
Parameters
----------
symbol: :class:`pybamm.Symbol`
The boundary... | pybamm/spatial_methods/spatial_method.py | boundary_value_or_flux | jedgedrudd/PyBaMM | python | def boundary_value_or_flux(self, symbol, discretised_child):
'\n Returns the boundary value or flux using the appropriate expression for the\n spatial method. To do this, we create a sparse vector \'bv_vector\' that extracts\n either the first (for side="left") or last (for side="right") point f... |
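The 'bv_vector' extraction the docstring describes can be sketched with plain NumPy (the real implementation builds a sparse vector; the function name here is illustrative):

```python
import numpy as np

def boundary_extraction_vector(n, side):
    """Build a row vector that picks the first (side="left") or last
    (side="right") entry of a discretised vector of length n."""
    bv_vector = np.zeros((1, n))
    bv_vector[0, 0 if side == "left" else -1] = 1.0
    return bv_vector

# Applying the vector to a discretised child extracts the boundary value.
disc_child = np.array([1.0, 2.0, 3.0])
left_value = boundary_extraction_vector(3, "left") @ disc_child
right_value = boundary_extraction_vector(3, "right") @ disc_child
```

In the full method this matrix-vector product is wrapped in a symbolic node so the extraction happens at solve time rather than at discretisation time.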
def mass_matrix(self, symbol, boundary_conditions):
'\n Calculates the mass matrix for a spatial method.\n\n Parameters\n ----------\n symbol: :class:`pybamm.Variable`\n The variable corresponding to the equation for which we are\n calculating the mass matrix.\n ... | -2,986,177,874,146,341,400 | Calculates the mass matrix for a spatial method.
Parameters
----------
symbol: :class:`pybamm.Variable`
The variable corresponding to the equation for which we are
calculating the mass matrix.
boundary_conditions : dict
The boundary conditions of the model
({symbol.id: {"left": left bc, "right": right ... | pybamm/spatial_methods/spatial_method.py | mass_matrix | jedgedrudd/PyBaMM | python | def mass_matrix(self, symbol, boundary_conditions):
'\n Calculates the mass matrix for a spatial method.\n\n Parameters\n ----------\n symbol: :class:`pybamm.Variable`\n The variable corresponding to the equation for which we are\n calculating the mass matrix.\n ... |
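As a hedged illustration of what this can produce: for a simple finite-volume discretisation the mass matrix is just the identity, one row per cell (a generic sketch, not PyBaMM's actual code):

```python
import numpy as np

def mass_matrix(n_cells):
    # In a finite-volume method each cell average maps to itself,
    # so the mass matrix is the n_cells x n_cells identity.
    return np.eye(n_cells)

M = mass_matrix(4)
```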
def process_binary_operators(self, bin_op, left, right, disc_left, disc_right):
'Discretise binary operators in model equations. Default behaviour is to\n return a new binary operator with the discretised children.\n\n Parameters\n ----------\n bin_op : :class:`pybamm.BinaryOperator`\n ... | 3,478,685,334,077,549,600 | Discretise binary operators in model equations. Default behaviour is to
return a new binary operator with the discretised children.
Parameters
----------
bin_op : :class:`pybamm.BinaryOperator`
Binary operator to discretise
left : :class:`pybamm.Symbol`
The left child of `bin_op`
right : :class:`pybamm.Symbol`... | pybamm/spatial_methods/spatial_method.py | process_binary_operators | jedgedrudd/PyBaMM | python | def process_binary_operators(self, bin_op, left, right, disc_left, disc_right):
'Discretise binary operators in model equations. Default behaviour is to\n return a new binary operator with the discretised children.\n\n Parameters\n ----------\n bin_op : :class:`pybamm.BinaryOperator`\n ... |
def concatenation(self, disc_children):
'Discrete concatenation object.\n\n Parameters\n ----------\n disc_children : list\n List of discretised children\n\n Returns\n -------\n :class:`pybamm.DomainConcatenation`\n Concatenation of the discretised chi... | -2,007,725,983,006,562,800 | Discrete concatenation object.
Parameters
----------
disc_children : list
List of discretised children
Returns
-------
:class:`pybamm.DomainConcatenation`
Concatenation of the discretised children | pybamm/spatial_methods/spatial_method.py | concatenation | jedgedrudd/PyBaMM | python | def concatenation(self, disc_children):
'Discrete concatenation object.\n\n Parameters\n ----------\n disc_children : list\n List of discretised children\n\n Returns\n -------\n :class:`pybamm.DomainConcatenation`\n Concatenation of the discretised chi... |
def save_gif(gif_fname, images, fps):
'\n To generate a gif from image files, first generate palette from images\n and then generate the gif from the images and the palette.\n ffmpeg -i input_%02d.jpg -vf palettegen -y palette.png\n ffmpeg -i input_%02d.jpg -i palette.png -lavfi paletteuse -y output.gif... | -8,273,607,447,406,788,000 | To generate a gif from image files, first generate palette from images
and then generate the gif from the images and the palette.
ffmpeg -i input_%02d.jpg -vf palettegen -y palette.png
ffmpeg -i input_%02d.jpg -i palette.png -lavfi paletteuse -y output.gif
Alternatively, use a filter to map the input images to both th... | video_prediction/utils/ffmpeg_gif.py | save_gif | Bonennult/video_prediction | python | def save_gif(gif_fname, images, fps):
'\n To generate a gif from image files, first generate palette from images\n and then generate the gif from the images and the palette.\n ffmpeg -i input_%02d.jpg -vf palettegen -y palette.png\n ffmpeg -i input_%02d.jpg -i palette.png -lavfi paletteuse -y output.gif... |
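The two-pass ffmpeg palette flow quoted in the docstring can be sketched by building the two argument lists (actually running them requires ffmpeg on PATH; the helper name is hypothetical):

```python
def palette_gif_commands(input_pattern, gif_fname, fps, palette="palette.png"):
    """Build the two ffmpeg commands from the docstring: pass 1 generates
    a palette from the frames, pass 2 maps the frames through it."""
    gen_palette = ["ffmpeg", "-r", str(fps), "-i", input_pattern,
                   "-vf", "palettegen", "-y", palette]
    make_gif = ["ffmpeg", "-r", str(fps), "-i", input_pattern, "-i", palette,
                "-lavfi", "paletteuse", "-y", gif_fname]
    return gen_palette, make_gif

p1, p2 = palette_gif_commands("input_%02d.jpg", "output.gif", 10)
# subprocess.run(p1, check=True); subprocess.run(p2, check=True)  # needs ffmpeg
```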
def __init__(self, db_uri):
"\n db_uri = f'mysql+pymysql://{username}:{password}@{host}:{port}/{database}?charset=utf8mb4'\n\n "
engine = create_engine(db_uri)
self.session = sessionmaker(bind=engine)() | 7,448,832,326,011,661,000 | db_uri = f'mysql+pymysql://{username}:{password}@{host}:{port}/{database}?charset=utf8mb4' | httprunner/database/engine.py | __init__ | AlanFightting/httprunner | python | def __init__(self, db_uri):
"\n \n\n "
engine = create_engine(db_uri)
self.session = sessionmaker(bind=engine)() |
@staticmethod
def value_decode(row: dict):
'\n Try to decode values of a table row\n datetime.datetime-->string\n datetime.date-->string\n json str-->dict\n :param row:\n :return:\n '
for (k, v) in row.items():
if isinstance(v, datetime.datetime):
ro... | -8,875,464,380,168,999,000 | Try to decode values of a table row
datetime.datetime-->string
datetime.date-->string
json str-->dict
:param row:
:return: | httprunner/database/engine.py | value_decode | AlanFightting/httprunner | python | @staticmethod
def value_decode(row: dict):
'\n Try to decode values of a table row\n datetime.datetime-->string\n datetime.date-->string\n json str-->dict\n :param row:\n :return:\n '
for (k, v) in row.items():
if isinstance(v, datetime.datetime):
ro... |
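A self-contained sketch of the decoding rules listed in the docstring (datetime/date to string, JSON string to dict), assuming the row is mutated in place:

```python
import datetime
import json

def value_decode(row):
    """Decode a result row in place: datetimes and dates become ISO
    strings, JSON-looking strings become dicts, other values are kept."""
    for k, v in row.items():
        if isinstance(v, (datetime.datetime, datetime.date)):
            row[k] = v.isoformat()
        elif isinstance(v, str):
            try:
                decoded = json.loads(v)
            except (ValueError, TypeError):
                continue
            if isinstance(decoded, dict):
                row[k] = decoded
    return row

row = value_decode({"created": datetime.date(2021, 1, 2), "meta": '{"a": 1}'})
```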
@property
def name(self):
'Name of the virtual server.<br/>Minimum length = 1.\n\t\t'
try:
return self._name
except Exception as e:
raise e | -3,012,199,902,447,286,300 | Name of the virtual server.<br/>Minimum length = 1. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | name | guardicore/nitro-python | python | @property
def name(self):
'\n\t\t'
try:
return self._name
except Exception as e:
raise e |
@name.setter
def name(self, name):
'Name of the virtual server.<br/>Minimum length = 1\n\t\t'
try:
self._name = name
except Exception as e:
raise e | 868,641,619,903,283,700 | Name of the virtual server.<br/>Minimum length = 1 | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | name | guardicore/nitro-python | python | @name.setter
def name(self, name):
'\n\t\t'
try:
self._name = name
except Exception as e:
raise e |
@property
def nexthopserver(self):
'The name of the next hop server bound to the VPN virtual server.\n\t\t'
try:
return self._nexthopserver
except Exception as e:
raise e | -351,339,766,894,237,400 | The name of the next hop server bound to the VPN virtual server. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | nexthopserver | guardicore/nitro-python | python | @property
def nexthopserver(self):
'\n\t\t'
try:
return self._nexthopserver
except Exception as e:
raise e |
@nexthopserver.setter
def nexthopserver(self, nexthopserver):
'The name of the next hop server bound to the VPN virtual server.\n\t\t'
try:
self._nexthopserver = nexthopserver
except Exception as e:
raise e | 5,440,545,738,043,150,000 | The name of the next hop server bound to the VPN virtual server. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | nexthopserver | guardicore/nitro-python | python | @nexthopserver.setter
def nexthopserver(self, nexthopserver):
'\n\t\t'
try:
self._nexthopserver = nexthopserver
except Exception as e:
raise e |
def _get_nitro_response(self, service, response):
' converts nitro response into object and returns the object array in case of get request.\n\t\t'
try:
result = service.payload_formatter.string_to_resource(vpnvserver_vpnnexthopserver_binding_response, response, self.__class__.__name__)
if (resu... | 844,322,438,921,633,900 | converts nitro response into object and returns the object array in case of get request. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | _get_nitro_response | guardicore/nitro-python | python | def _get_nitro_response(self, service, response):
' \n\t\t'
try:
result = service.payload_formatter.string_to_resource(vpnvserver_vpnnexthopserver_binding_response, response, self.__class__.__name__)
if (result.errorcode != 0):
if (result.errorcode == 444):
service.cl... |
def _get_object_name(self):
' Returns the value of object identifier argument\n\t\t'
try:
if (self.name is not None):
return str(self.name)
return None
except Exception as e:
raise e | 7,474,779,240,333,799,000 | Returns the value of object identifier argument | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | _get_object_name | guardicore/nitro-python | python | def _get_object_name(self):
' \n\t\t'
try:
if (self.name is not None):
return str(self.name)
return None
except Exception as e:
raise e |
@classmethod
def filter_add_parameters(cls, resource):
' Use this function to create a resource with only add operation specific parameters.\n\t\t'
addresource = vpnvserver_vpnnexthopserver_binding()
addresource.name = resource.name
addresource.nexthopserver = resource.nexthopserver
return addresour... | -2,142,231,376,633,605,400 | Use this function to create a resource with only add operation specific parameters. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | filter_add_parameters | guardicore/nitro-python | python | @classmethod
def filter_add_parameters(cls, resource):
' \n\t\t'
addresource = vpnvserver_vpnnexthopserver_binding()
addresource.name = resource.name
addresource.nexthopserver = resource.nexthopserver
return addresource |
@classmethod
def filter_delete_parameters(cls, resource):
' Use this function to create a resource with only delete operation specific parameters.\n\t\t'
deleteresource = vpnvserver_vpnnexthopserver_binding()
deleteresource.name = resource.name
deleteresource.nexthopserver = resource.nexthopserver
r... | 3,789,293,309,144,867,000 | Use this function to create a resource with only delete operation specific parameters. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | filter_delete_parameters | guardicore/nitro-python | python | @classmethod
def filter_delete_parameters(cls, resource):
' \n\t\t'
deleteresource = vpnvserver_vpnnexthopserver_binding()
deleteresource.name = resource.name
deleteresource.nexthopserver = resource.nexthopserver
return deleteresource |
@classmethod
def get(cls, service, name='', option_=''):
' Use this API to fetch vpnvserver_vpnnexthopserver_binding resources.\n\t\t'
try:
if (not name):
obj = vpnvserver_vpnnexthopserver_binding()
response = obj.get_resources(service, option_)
else:
obj = vp... | -2,785,300,714,051,055,000 | Use this API to fetch vpnvserver_vpnnexthopserver_binding resources. | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | get | guardicore/nitro-python | python | @classmethod
def get(cls, service, name='', option_=''):
' \n\t\t'
try:
if (not name):
obj = vpnvserver_vpnnexthopserver_binding()
response = obj.get_resources(service, option_)
else:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = name
... |
@classmethod
def get_filtered(cls, service, name, filter_):
' Use this API to fetch filtered set of vpnvserver_vpnnexthopserver_binding resources.\n\t\tFilter string should be in JSON format, e.g. "port:80,servicetype:HTTP".\n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = name
... | 8,369,779,430,298,223,000 | Use this API to fetch filtered set of vpnvserver_vpnnexthopserver_binding resources.
Filter string should be in JSON format, e.g. "port:80,servicetype:HTTP". | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | get_filtered | guardicore/nitro-python | python | @classmethod
def get_filtered(cls, service, name, filter_):
' Use this API to fetch filtered set of vpnvserver_vpnnexthopserver_binding resources.\n\t\tFilter string should be in JSON format, e.g. "port:80,servicetype:HTTP".\n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = name
... |
@classmethod
def count(cls, service, name):
' Use this API to count vpnvserver_vpnnexthopserver_binding resources configured on NetScaler.\n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = name
option_ = options()
option_.count = True
response = obj.get_re... | 2,512,976,632,532,896,300 | Use this API to count vpnvserver_vpnnexthopserver_binding resources configured on NetScaler.
def count(cls, service, name):
' \n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = name
option_ = options()
option_.count = True
response = obj.get_resources(service, option_)
if response:
return response[0].__dict__['___... |
@classmethod
def count_filtered(cls, service, name, filter_):
' Use this API to count the filtered set of vpnvserver_vpnnexthopserver_binding resources.\n\t\tFilter string should be in JSON format, e.g. "port:80,servicetype:HTTP".\n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = ... | -2,350,699,591,094,263,300 | Use this API to count the filtered set of vpnvserver_vpnnexthopserver_binding resources.
Filter string should be in JSON format, e.g. "port:80,servicetype:HTTP". | build/lib/nssrc/com/citrix/netscaler/nitro/resource/config/vpn/vpnvserver_vpnnexthopserver_binding.py | count_filtered | guardicore/nitro-python | python | @classmethod
def count_filtered(cls, service, name, filter_):
' Use this API to count the filtered set of vpnvserver_vpnnexthopserver_binding resources.\n\t\tFilter string should be in JSON format, e.g. "port:80,servicetype:HTTP".\n\t\t'
try:
obj = vpnvserver_vpnnexthopserver_binding()
obj.name = ... |
def get_external_lb_endpoints():
'\n Return a list of any external API load-balancer endpoints that have\n been manually configured.\n '
ha_connected = is_flag_set('ha.connected')
forced_lb_ips = hookenv.config('loadbalancer-ips').split()
vips = hookenv.config('ha-cluster-vip').split()
dns_... | -8,604,976,135,310,179,000 | Return a list of any external API load-balancer endpoints that have
been manually configured. | lib/charms/layer/kubernetes_master.py | get_external_lb_endpoints | hemanthnakkina/charm-kubernetes-master | python | def get_external_lb_endpoints():
'\n Return a list of any external API load-balancer endpoints that have\n been manually configured.\n '
ha_connected = is_flag_set('ha.connected')
forced_lb_ips = hookenv.config('loadbalancer-ips').split()
vips = hookenv.config('ha-cluster-vip').split()
dns_... |
def get_lb_endpoints():
'\n Return all load-balancer endpoints, whether from manual config or via\n relation.\n '
external_lb_endpoints = get_external_lb_endpoints()
loadbalancer = endpoint_from_flag('loadbalancer.available')
if external_lb_endpoints:
return external_lb_endpoints
el... | -1,953,696,430,980,908,300 | Return all load-balancer endpoints, whether from manual config or via
relation. | lib/charms/layer/kubernetes_master.py | get_lb_endpoints | hemanthnakkina/charm-kubernetes-master | python | def get_lb_endpoints():
'\n Return all load-balancer endpoints, whether from manual config or via\n relation.\n '
external_lb_endpoints = get_external_lb_endpoints()
loadbalancer = endpoint_from_flag('loadbalancer.available')
if external_lb_endpoints:
return external_lb_endpoints
el... |
def get_api_endpoint(relation=None):
'\n Determine the best endpoint for a client to connect to.\n\n If a relation is given, it will take that into account when choosing an\n endpoint.\n '
endpoints = get_lb_endpoints()
if endpoints:
return endpoints[(kubernetes_common.get_unit_number() ... | -4,188,973,924,212,686,000 | Determine the best endpoint for a client to connect to.
If a relation is given, it will take that into account when choosing an
endpoint. | lib/charms/layer/kubernetes_master.py | get_api_endpoint | hemanthnakkina/charm-kubernetes-master | python | def get_api_endpoint(relation=None):
'\n Determine the best endpoint for a client to connect to.\n\n If a relation is given, it will take that into account when choosing an\n endpoint.\n '
endpoints = get_lb_endpoints()
if endpoints:
return endpoints[(kubernetes_common.get_unit_number() ... |
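The truncated return statement indexes the endpoint list by unit number; a plausible reading is a modulo round-robin (the modulo is an assumption, not confirmed by the truncated body):

```python
def pick_endpoint(endpoints, unit_number):
    # Spread clients deterministically across the available
    # load-balancer endpoints by unit number.
    return endpoints[unit_number % len(endpoints)]

eps = [("10.0.0.1", 443), ("10.0.0.2", 443), ("10.0.0.3", 443)]
```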
def install_ceph_common():
'Install ceph-common tools.\n\n :return: None\n '
ceph_admin = endpoint_from_flag('ceph-storage.available')
ceph_context = {'mon_hosts': ceph_admin.mon_hosts(), 'fsid': ceph_admin.fsid(), 'auth_supported': ceph_admin.auth(), 'use_syslog': 'true', 'ceph_public_network': '', '... | -1,121,089,419,783,745,400 | Install ceph-common tools.
:return: None | lib/charms/layer/kubernetes_master.py | install_ceph_common | hemanthnakkina/charm-kubernetes-master | python | def install_ceph_common():
'Install ceph-common tools.\n\n :return: None\n '
ceph_admin = endpoint_from_flag('ceph-storage.available')
ceph_context = {'mon_hosts': ceph_admin.mon_hosts(), 'fsid': ceph_admin.fsid(), 'auth_supported': ceph_admin.auth(), 'use_syslog': 'true', 'ceph_public_network': '', 'ce... |
def deprecate_auth_file(auth_file):
'\n In 1.19+, file-based authentication was deprecated in favor of webhook\n auth. Write out generic files that inform the user of this.\n '
csv_file = Path(auth_file)
csv_file.parent.mkdir(exist_ok=True)
csv_backup = Path('{}.{}'.format(csv_file, AUTH_BACKUP... | 5,085,122,092,301,805,000 | In 1.19+, file-based authentication was deprecated in favor of webhook
auth. Write out generic files that inform the user of this. | lib/charms/layer/kubernetes_master.py | deprecate_auth_file | hemanthnakkina/charm-kubernetes-master | python | def deprecate_auth_file(auth_file):
'\n In 1.19+, file-based authentication was deprecated in favor of webhook\n auth. Write out generic files that inform the user of this.\n '
csv_file = Path(auth_file)
csv_file.parent.mkdir(exist_ok=True)
csv_backup = Path('{}.{}'.format(csv_file, AUTH_BACKUP... |
def migrate_auth_file(filename):
'Create secrets or known tokens depending on what file is being migrated.'
with open(str(filename), 'r') as f:
rows = list(csv.reader(f))
for row in rows:
try:
if row[0].startswith('#'):
continue
elif (filename == AUTH_... | -7,641,784,702,578,400,000 | Create secrets or known tokens depending on what file is being migrated. | lib/charms/layer/kubernetes_master.py | migrate_auth_file | hemanthnakkina/charm-kubernetes-master | python | def migrate_auth_file(filename):
with open(str(filename), 'r') as f:
rows = list(csv.reader(f))
for row in rows:
try:
if row[0].startswith('#'):
continue
elif (filename == AUTH_BASIC_FILE):
create_known_token(*row)
elif (fi... |
def generate_rfc1123(length=10):
'Generate a random string compliant with RFC 1123.\n\n https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#dns-subdomain-names\n\n param: length - the length of the string to generate\n '
length = (253 if (length > 253) else length)
first_last_o... | 1,612,866,948,318,523,600 | Generate a random string compliant with RFC 1123.
https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#dns-subdomain-names
param: length - the length of the string to generate | lib/charms/layer/kubernetes_master.py | generate_rfc1123 | hemanthnakkina/charm-kubernetes-master | python | def generate_rfc1123(length=10):
'Generate a random string compliant with RFC 1123.\n\n https://kubernetes.io/docs/concepts/overview/working-with-objects/names/#dns-subdomain-names\n\n param: length - the length of the string to generate\n '
length = (253 if (length > 253) else length)
first_last_o... |
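Since the body is truncated after the length clamp, here is a hedged sketch of an RFC 1123-compliant generator (the constraints come from the linked Kubernetes page; everything beyond the 253 clamp is an assumption):

```python
import random
import string

def generate_rfc1123(length=10):
    # DNS subdomain names: lowercase alphanumerics and '-', must start
    # and end with an alphanumeric, at most 253 characters.
    length = min(length, 253)
    alnum = string.ascii_lowercase + string.digits
    rng = random.SystemRandom()
    if length <= 2:
        return "".join(rng.choice(alnum) for _ in range(length))
    return (rng.choice(alnum)
            + "".join(rng.choice(alnum + "-") for _ in range(length - 2))
            + rng.choice(alnum))

name = generate_rfc1123(12)
```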
def token_generator(length=32):
'Generate a random token for use in account tokens.\n\n param: length - the length of the token to generate\n '
alpha = (string.ascii_letters + string.digits)
token = ''.join((random.SystemRandom().choice(alpha) for _ in range(length)))
return token | 5,597,463,047,298,483,000 | Generate a random token for use in account tokens.
param: length - the length of the token to generate | lib/charms/layer/kubernetes_master.py | token_generator | hemanthnakkina/charm-kubernetes-master | python | def token_generator(length=32):
'Generate a random token for use in account tokens.\n\n param: length - the length of the token to generate\n '
alpha = (string.ascii_letters + string.digits)
token = ''.join((random.SystemRandom().choice(alpha) for _ in range(length)))
return token |
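The token generator is short enough to restate runnably, with a usage example:

```python
import random
import string

def token_generator(length=32):
    # Random alphanumeric token drawn from the system CSPRNG.
    alpha = string.ascii_letters + string.digits
    return "".join(random.SystemRandom().choice(alpha) for _ in range(length))

token = token_generator(16)
```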
def delete_secret(secret_id):
'Delete a given secret id.'
return kubernetes_common.kubectl_success('delete', 'secret', '-n', AUTH_SECRET_NS, secret_id) | 8,358,800,692,941,659,000 | Delete a given secret id. | lib/charms/layer/kubernetes_master.py | delete_secret | hemanthnakkina/charm-kubernetes-master | python | def delete_secret(secret_id):
return kubernetes_common.kubectl_success('delete', 'secret', '-n', AUTH_SECRET_NS, secret_id) |
def get_csv_password(csv_fname, user):
'Get the password for the given user within the csv file provided.'
root_cdk = '/root/cdk'
tokens_fname = (Path(root_cdk) / csv_fname)
if (not tokens_fname.is_file()):
return None
with tokens_fname.open('r') as stream:
for line in stream:
... | 3,880,112,355,788,806,700 | Get the password for the given user within the csv file provided. | lib/charms/layer/kubernetes_master.py | get_csv_password | hemanthnakkina/charm-kubernetes-master | python | def get_csv_password(csv_fname, user):
root_cdk = '/root/cdk'
tokens_fname = (Path(root_cdk) / csv_fname)
if (not tokens_fname.is_file()):
return None
with tokens_fname.open('r') as stream:
for line in stream:
record = line.split(',')
try:
if ... |
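A minimal sketch of the csv lookup, assuming a 'password,username,...' column order (the real known_tokens.csv layout may differ; the sample data is fabricated):

```python
def csv_password(csv_text, user):
    """Return the first field of the row whose second field matches user;
    the column order is an assumption, not taken from the charm."""
    for line in csv_text.splitlines():
        record = line.split(",")
        if len(record) >= 2 and record[1] == user:
            return record[0]
    return None

sample = "s3cret,admin,admin-id\nabc123,kubelet,kubelet-0"
```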
def get_secret_password(username):
'Get the password for the given user from the secret that CK created.'
try:
output = kubernetes_common.kubectl('get', 'secrets', '-n', AUTH_SECRET_NS, '--field-selector', 'type={}'.format(AUTH_SECRET_TYPE), '-o', 'json').decode('UTF-8')
except CalledProcessError:
... | -7,479,111,451,008,058,000 | Get the password for the given user from the secret that CK created. | lib/charms/layer/kubernetes_master.py | get_secret_password | hemanthnakkina/charm-kubernetes-master | python | def get_secret_password(username):
try:
output = kubernetes_common.kubectl('get', 'secrets', '-n', AUTH_SECRET_NS, '--field-selector', 'type={}'.format(AUTH_SECRET_TYPE), '-o', 'json').decode('UTF-8')
except CalledProcessError:
token = None
if (username == 'admin'):
admi... |
def service_cidr():
" Return the charm's service-cidr config"
frozen_cidr = db.get('kubernetes-master.service-cidr')
return (frozen_cidr or hookenv.config('service-cidr')) | -7,382,090,746,641,430,000 | Return the charm's service-cidr config | lib/charms/layer/kubernetes_master.py | service_cidr | hemanthnakkina/charm-kubernetes-master | python | def service_cidr():
" "
frozen_cidr = db.get('kubernetes-master.service-cidr')
return (frozen_cidr or hookenv.config('service-cidr')) |
def freeze_service_cidr():
' Freeze the service CIDR. Once the apiserver has started, we can no\n longer safely change this value. '
frozen_service_cidr = db.get('kubernetes-master.service-cidr')
if ((not frozen_service_cidr) or is_service_cidr_expansion()):
db.set('kubernetes-master.service-cidr... | 7,960,924,385,793,680,000 | Freeze the service CIDR. Once the apiserver has started, we can no
longer safely change this value. | lib/charms/layer/kubernetes_master.py | freeze_service_cidr | hemanthnakkina/charm-kubernetes-master | python | def freeze_service_cidr():
' Freeze the service CIDR. Once the apiserver has started, we can no\n longer safely change this value. '
frozen_service_cidr = db.get('kubernetes-master.service-cidr')
if ((not frozen_service_cidr) or is_service_cidr_expansion()):
db.set('kubernetes-master.service-cidr... |
def get_preferred_service_network(service_cidrs):
'Get the network preferred for cluster service, preferring IPv4'
net_ipv4 = kubernetes_common.get_ipv4_network(service_cidrs)
net_ipv6 = kubernetes_common.get_ipv6_network(service_cidrs)
return (net_ipv4 or net_ipv6) | -629,480,460,032,981,200 | Get the network preferred for cluster service, preferring IPv4 | lib/charms/layer/kubernetes_master.py | get_preferred_service_network | hemanthnakkina/charm-kubernetes-master | python | def get_preferred_service_network(service_cidrs):
net_ipv4 = kubernetes_common.get_ipv4_network(service_cidrs)
net_ipv6 = kubernetes_common.get_ipv6_network(service_cidrs)
return (net_ipv4 or net_ipv6) |
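The IPv4-preference logic can be sketched with the standard-library ipaddress module (kubernetes_common.get_ipv4_network/get_ipv6_network are assumed to behave like the next(...) calls below):

```python
import ipaddress

def preferred_service_network(cidrs):
    # Prefer the IPv4 network when the cluster is dual-stack,
    # falling back to IPv6 otherwise.
    networks = [ipaddress.ip_network(c) for c in cidrs]
    ipv4 = next((n for n in networks if n.version == 4), None)
    ipv6 = next((n for n in networks if n.version == 6), None)
    return ipv4 or ipv6

net = preferred_service_network(["fd00:abcd::/112", "10.152.183.0/24"])
```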
def get_kubernetes_service_ips():
'Get the IP address(es) for the kubernetes service based on the cidr.'
return [next(network.hosts()).exploded for network in kubernetes_common.get_networks(service_cidr())] | 381,198,669,786,912,200 | Get the IP address(es) for the kubernetes service based on the cidr. | lib/charms/layer/kubernetes_master.py | get_kubernetes_service_ips | hemanthnakkina/charm-kubernetes-master | python | def get_kubernetes_service_ips():
return [next(network.hosts()).exploded for network in kubernetes_common.get_networks(service_cidr())] |
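Illustrating the body with the standard library: the kubernetes service IP is the first usable host of each service CIDR (kubernetes_common.get_networks is replaced here by ipaddress.ip_network):

```python
import ipaddress

def kubernetes_service_ips(cidrs):
    # The kubernetes service conventionally takes the first usable
    # host address in each service CIDR.
    return [next(ipaddress.ip_network(c).hosts()).exploded for c in cidrs]

ips = kubernetes_service_ips(["10.152.183.0/24"])
```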
def get_snap_revs(snaps):
'Get a dict of snap revisions for a given list of snaps.'
channel = hookenv.config('channel')
rev_info = {}
for s in sorted(snaps):
try:
info = check_output(['snap', 'info', s]).decode('utf8', errors='ignore')
except CalledProcessError:
i... | 3,718,453,624,753,426,000 | Get a dict of snap revisions for a given list of snaps. | lib/charms/layer/kubernetes_master.py | get_snap_revs | hemanthnakkina/charm-kubernetes-master | python | def get_snap_revs(snaps):
channel = hookenv.config('channel')
rev_info = {}
for s in sorted(snaps):
try:
info = check_output(['snap', 'info', s]).decode('utf8', errors='ignore')
except CalledProcessError:
info = ''
snap_rev = None
yaml_data = safe_l... |
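The revision extraction can be sketched as text parsing of `snap info` output (the real code loads it as YAML with safe_load; the sample output below is fabricated):

```python
import re

def channel_revision(snap_info_text, channel):
    """Extract the revision (the number in parentheses) for a channel
    from `snap info` output; the line format is an assumption."""
    pattern = re.compile(
        r"^\s*{}:\s+\S+\s+\S+\s+\((\d+)\)".format(re.escape(channel)),
        re.MULTILINE)
    match = pattern.search(snap_info_text)
    return match.group(1) if match else None

sample = """channels:
  1.28/stable:    1.28.3  2023-11-02 (3572) 12MB classic
  1.28/candidate: 1.28.4  2023-11-20 (3601) 12MB classic
"""
```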
@parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... | 5,215,181,121,348,939,000 | Tests windowing by chaining it with flat map.
Args:
structure: the input structure
shape: the input shape
dtype: the input data type | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetFlatMap | Esail/tensorflow | python | @parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... |
@parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... | 7,058,848,213,807,847,000 | Tests batching of dense tensor windows.
Args:
structure: the input structure
shape: the input shape
dtype: the input data type | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetBatchDense | Esail/tensorflow | python | @parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... |
@parameterized.named_parameters(('1', np.int32([])), ('2', np.int32([1])), ('3', np.int32([1, 2, 3])))
def testWindowDatasetBatchDenseDynamicShape(self, shape):
'Tests batching of dynamically shaped dense tensor windows.\n\n Args:\n shape: the input shape\n '
shape_t = array_ops.placeholder(dtypes.in... | -5,008,434,139,535,988,000 | Tests batching of dynamically shaped dense tensor windows.
Args:
shape: the input shape | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetBatchDenseDynamicShape | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int32([])), ('2', np.int32([1])), ('3', np.int32([1, 2, 3])))
def testWindowDatasetBatchDenseDynamicShape(self, shape):
'Tests batching of dynamically shaped dense tensor windows.\n\n Args:\n shape: the input shape\n '
shape_t = array_ops.placeholder(dtypes.in... |
@parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... | -2,907,993,216,005,055,500 | Tests batching of sparse tensor windows.
Args:
structure: the input structure
shape: the input shape
dtype: the input data type | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetBatchSparse | Esail/tensorflow | python | @parameterized.named_parameters(('1', None, np.int32([]), dtypes.bool), ('2', None, np.int32([]), dtypes.int32), ('3', None, np.int32([]), dtypes.float32), ('4', None, np.int32([]), dtypes.string), ('5', None, np.int32([2]), dtypes.int32), ('6', None, np.int32([2, 2]), dtypes.int32), ('7', (None, None, None), np.int32(... |
@parameterized.named_parameters(('1', np.int32([])), ('2', np.int32([1])), ('3', np.int32([1, 2, 3])))
def testWindowDatasetBatchSparseDynamicShape(self, shape):
'Tests batching of dynamically shaped sparse tensor windows.\n\n Args:\n shape: the input shape\n '
shape_t = array_ops.placeholder(dtypes.... | -4,503,179,503,890,113,000 | Tests batching of dynamically shaped sparse tensor windows.
Args:
shape: the input shape | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetBatchSparseDynamicShape | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int32([])), ('2', np.int32([1])), ('3', np.int32([1, 2, 3])))
def testWindowDatasetBatchSparseDynamicShape(self, shape):
'Tests batching of dynamically shaped sparse tensor windows.\n\n Args:\n shape: the input shape\n '
shape_t = array_ops.placeholder(dtypes.... |
@parameterized.named_parameters(('1', None, np.int32([[1], [2], [3]]), dtypes.bool, [(- 1)]), ('2', None, np.int32([[1], [2], [3]]), dtypes.int32, [(- 1)]), ('3', None, np.int32([[1], [2], [3]]), dtypes.float32, [(- 1)]), ('4', None, np.int32([[1], [2], [3]]), dtypes.string, [(- 1)]), ('5', None, np.int32([[1, 3], [2, ... | -4,706,676,081,892,534,000 | Tests padded batching of dense tensor windows.
Args:
structure: the input structure
shapes: the input shapes
dtype: the input data type
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchDense | Esail/tensorflow | python | @parameterized.named_parameters(('1', None, np.int32([[1], [2], [3]]), dtypes.bool, [(- 1)]), ('2', None, np.int32([[1], [2], [3]]), dtypes.int32, [(- 1)]), ('3', None, np.int32([[1], [2], [3]]), dtypes.float32, [(- 1)]), ('4', None, np.int32([[1], [2], [3]]), dtypes.string, [(- 1)]), ('5', None, np.int32([[1, 3], [2, ... |
@parameterized.named_parameters(('1', np.int32([[1], [2], [3]]), [(- 1)]), ('2', np.int32([[1, 3], [2, 2], [3, 1]]), [(- 1), (- 1)]), ('3', np.int32([[3, 1, 3], [1, 3, 1]]), [(- 1), (- 1), (- 1)]))
def testWindowDatasetPaddedBatchDenseDynamicShape(self, shapes, padded_shape):
'Tests padded batching of dynamically s... | -7,236,535,758,031,820,000 | Tests padded batching of dynamically shaped dense tensor windows.
Args:
shapes: the input shapes
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchDenseDynamicShape | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int32([[1], [2], [3]]), [(- 1)]), ('2', np.int32([[1, 3], [2, 2], [3, 1]]), [(- 1), (- 1)]), ('3', np.int32([[3, 1, 3], [1, 3, 1]]), [(- 1), (- 1), (- 1)]))
def testWindowDatasetPaddedBatchDenseDynamicShape(self, shapes, padded_shape):
'Tests padded batching of dynamically s... |
@parameterized.named_parameters(('1', np.int32([[1]]), np.int32([0])), ('2', np.int32([[10], [20]]), np.int32([15])))
def testWindowDatasetPaddedBatchDenseInvalid(self, shapes, padded_shape):
'Tests invalid padded batching of dense tensor windows.\n\n Args:\n shapes: the input shapes\n padded_shape: th... | 8,705,696,122,364,075,000 | Tests invalid padded batching of dense tensor windows.
Args:
shapes: the input shapes
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchDenseInvalid | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int32([[1]]), np.int32([0])), ('2', np.int32([[10], [20]]), np.int32([15])))
def testWindowDatasetPaddedBatchDenseInvalid(self, shapes, padded_shape):
'Tests invalid padded batching of dense tensor windows.\n\n Args:\n shapes: the input shapes\n padded_shape: th... |
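The invalid cases above pad to a shape smaller than the data (e.g. windows of shapes [10] and [20] padded to [15]). A minimal pure-Python sketch of the rule these tests exercise (not tf.data's implementation):

```python
def padded_batch(rows, padded_len, pad_value=0):
    # Pad each row out to padded_len; a row longer than padded_len is an
    # error, mirroring how padded batching rejects too-small padded shapes.
    for row in rows:
        if len(row) > padded_len:
            raise ValueError(
                "row of length %d does not fit padded shape [%d]"
                % (len(row), padded_len))
    return [row + [pad_value] * (padded_len - len(row)) for row in rows]
```

So `padded_batch([[1], [2, 2], [3, 3, 3]], 3)` succeeds, while padding rows of length 10 and 20 to length 15 raises.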
@parameterized.named_parameters(('1', None, np.int64([[1], [2], [3]]), dtypes.bool, [(- 1)]), ('2', None, np.int64([[1], [2], [3]]), dtypes.int32, [(- 1)]), ('3', None, np.int64([[1], [2], [3]]), dtypes.float32, [(- 1)]), ('4', None, np.int64([[1], [2], [3]]), dtypes.string, [(- 1)]), ('5', None, np.int64([[1, 3], [2, ... | 283,449,577,564,227,520 | Tests padded batching of sparse tensor windows.
Args:
structure: the input structure
shapes: the input shapes
dtype: the input data type
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchSparse | Esail/tensorflow | python | @parameterized.named_parameters(('1', None, np.int64([[1], [2], [3]]), dtypes.bool, [(- 1)]), ('2', None, np.int64([[1], [2], [3]]), dtypes.int32, [(- 1)]), ('3', None, np.int64([[1], [2], [3]]), dtypes.float32, [(- 1)]), ('4', None, np.int64([[1], [2], [3]]), dtypes.string, [(- 1)]), ('5', None, np.int64([[1, 3], [2, ... |
@parameterized.named_parameters(('1', np.int64([[1], [2], [3]]), [(- 1)]), ('2', np.int64([[1, 3], [2, 2], [3, 1]]), [(- 1), (- 1)]), ('3', np.int64([[3, 1, 3], [1, 3, 1]]), [(- 1), (- 1), (- 1)]))
def testWindowDatasetPaddedBatchSparseDynamicShape(self, shapes, padded_shape):
'Tests padded batching of dynamically ... | -5,328,868,751,272,801,000 | Tests padded batching of dynamically shaped sparse tensor windows.
Args:
shapes: the input shapes
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchSparseDynamicShape | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int64([[1], [2], [3]]), [(- 1)]), ('2', np.int64([[1, 3], [2, 2], [3, 1]]), [(- 1), (- 1)]), ('3', np.int64([[3, 1, 3], [1, 3, 1]]), [(- 1), (- 1), (- 1)]))
def testWindowDatasetPaddedBatchSparseDynamicShape(self, shapes, padded_shape):
'Tests padded batching of dynamically ... |
@parameterized.named_parameters(('1', np.int64([[1]]), [0]), ('2', np.int64([[10], [20]]), [15]))
def testWindowDatasetPaddedBatchSparseInvalid(self, shapes, padded_shape):
'Tests invalid padded batching of sparse tensor windows.\n\n Args:\n shapes: the input shapes\n padded_shape: the shape to pad the... | 5,317,069,918,889,011,000 | Tests invalid padded batching of sparse tensor windows.
Args:
shapes: the input shapes
padded_shape: the shape to pad the output to | tensorflow/contrib/data/python/kernel_tests/window_dataset_op_test.py | testWindowDatasetPaddedBatchSparseInvalid | Esail/tensorflow | python | @parameterized.named_parameters(('1', np.int64([[1]]), [0]), ('2', np.int64([[10], [20]]), [15]))
def testWindowDatasetPaddedBatchSparseInvalid(self, shapes, padded_shape):
'Tests invalid padded batching of sparse tensor windows.\n\n Args:\n shapes: the input shapes\n padded_shape: the shape to pad the... |
def calc_sample_norms(named_params: Iterator[Tuple[(str, torch.Tensor)]], flat: bool=True) -> List[torch.Tensor]:
'\n Calculates the norm of the given tensors for each sample.\n\n This function calculates the overall norm of the given tensors for each sample,\n assuming each batch\'s dim is zero.\n\n ... | 1,104,742,471,149,859,100 | Calculates the norm of the given tensors for each sample.
This function calculates the overall norm of the given tensors for each sample,
assuming each batch's dim is zero.
Args:
named_params: An iterator of tuples <name, param> with name being a
string and param being a tensor of shape ``[B, ...]`` w... | opacus/utils/tensor_utils.py | calc_sample_norms | DaveBrind/SynthVAE | python | def calc_sample_norms(named_params: Iterator[Tuple[(str, torch.Tensor)]], flat: bool=True) -> List[torch.Tensor]:
'\n Calculates the norm of the given tensors for each sample.\n\n This function calculates the overall norm of the given tensors for each sample,\n assuming each batch\'s dim is zero.\n\n ...
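A hedged pure-Python sketch of the per-sample norm computation described above, for the `flat=True` case where all parameters combine into one L2 norm per sample (illustrative math only, not Opacus's tensor code):

```python
import math

def per_sample_norms(params):
    # params: one entry per parameter, each a list of B flattened
    # per-sample rows. Returns a single overall L2 norm per sample,
    # combining the contributions of every parameter.
    num_samples = len(params[0])
    norms = []
    for i in range(num_samples):
        sq = 0.0
        for param in params:
            sq += sum(v * v for v in param[i])
        norms.append(math.sqrt(sq))
    return norms
```

With two parameters and two samples, `per_sample_norms([[[3.0], [0.0]], [[4.0], [2.0]]])` combines 3-and-4 for sample 0 and 0-and-2 for sample 1.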
def sum_over_all_but_batch_and_last_n(tensor: torch.Tensor, n_dims: int) -> torch.Tensor:
'\n Calculates the sum over all dimensions, except the first\n (batch dimension), and excluding the last n_dims.\n\n This function will ignore the first dimension and it will\n not aggregate over the last n_dims di... | -8,455,549,789,229,907,000 | Calculates the sum over all dimensions, except the first
(batch dimension), and excluding the last n_dims.
This function will ignore the first dimension and it will
not aggregate over the last n_dims dimensions.
Args:
tensor: An input tensor of shape ``(B, ..., X[n_dims-1])``.
n_dims: Number of dimensions to ... | opacus/utils/tensor_utils.py | sum_over_all_but_batch_and_last_n | DaveBrind/SynthVAE | python | def sum_over_all_but_batch_and_last_n(tensor: torch.Tensor, n_dims: int) -> torch.Tensor:
'\n Calculates the sum over all dimensions, except the first\n (batch dimension), and excluding the last n_dims.\n\n This function will ignore the first dimension and it will\n not aggregate over the last n_dims di... |
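The reduction keeps dim 0 (the batch) and the trailing `n_dims` dims and sums out everything in between. A hedged numpy stand-in for the torch version:

```python
import numpy as np

def sum_over_all_but_batch_and_last_n(x, n_dims):
    # Keep dim 0 and the last n_dims dims; sum over the middle dims.
    if x.ndim == n_dims + 1:
        return x  # nothing to reduce
    axes = tuple(range(1, x.ndim - n_dims))
    return x.sum(axis=axes)
```

For a `(B, T, C, H)` array with `n_dims=1`, the result has shape `(B, H)`.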
def unfold3d(tensor: torch.Tensor, kernel_size: Union[(int, Tuple[(int, int, int)])], padding: Union[(int, Tuple[(int, int, int)])]=0, stride: Union[(int, Tuple[(int, int, int)])]=1, dilation: Union[(int, Tuple[(int, int, int)])]=1):
'\n Extracts sliding local blocks from a batched input tensor.\n\n :class:`... | 2,870,308,372,176,920,000 | Extracts sliding local blocks from a batched input tensor.
:class:`torch.nn.Unfold` only supports 4D inputs (batched image-like tensors).
This method implements the same action for 5D inputs
Args:
tensor: An input tensor of shape ``(B, C, D, H, W)``.
kernel_size: the size of the sliding blocks
padding: i... | opacus/utils/tensor_utils.py | unfold3d | DaveBrind/SynthVAE | python | def unfold3d(tensor: torch.Tensor, kernel_size: Union[(int, Tuple[(int, int, int)])], padding: Union[(int, Tuple[(int, int, int)])]=0, stride: Union[(int, Tuple[(int, int, int)])]=1, dilation: Union[(int, Tuple[(int, int, int)])]=1):
'\n Extracts sliding local blocks from a batched input tensor.\n\n :class:`...
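A hedged one-dimensional sketch of the sliding-blocks idea (the real `unfold3d` works over the D, H, W dims and also handles padding and dilation, which this omits):

```python
def sliding_blocks_1d(seq, kernel_size, stride=1):
    # 1-D analogue of unfold: every contiguous block of length kernel_size,
    # advancing by `stride`.
    return [seq[i:i + kernel_size]
            for i in range(0, len(seq) - kernel_size + 1, stride)]
```

For example, `sliding_blocks_1d([1, 2, 3, 4, 5], 3)` gives the three overlapping windows, and `stride=2` skips every other start position.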
def set_logger(log_path):
'Set the logger to log info in terminal and file `log_path`.\n\n In general, it is useful to have a logger so that every output to the terminal is saved\n in a permanent file. Here we save it to `model_dir/train.log`.\n\n Example:\n ```\n logging.info("Starting training...")\n ```\n\... | 9,111,767,959,850,705,000 | Set the logger to log info in terminal and file `log_path`.
In general, it is useful to have a logger so that every output to the terminal is saved
in a permanent file. Here we save it to `model_dir/train.log`.
Example:
```
logging.info("Starting training...")
```
Args:
log_path: (string) where to log | utils.py | set_logger | haamoon/finding_common_object | python | def set_logger(log_path):
'Set the logger to log info in terminal and file `log_path`.\n\n In general, it is useful to have a logger so that every output to the terminal is saved\n in a permanent file. Here we save it to `model_dir/train.log`.\n\n Example:\n ```\n logging.info("Starting training...")\n ```\n\... |
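A minimal stdlib sketch of such a helper, assuming the conventional stream-plus-file handler setup (not necessarily this repository's exact formatter):

```python
import logging

def set_logger(log_path):
    # Send INFO-level messages both to the terminal and to `log_path`.
    # Intended to be called once at startup; calling it again would
    # attach duplicate handlers.
    logger = logging.getLogger()
    logger.setLevel(logging.INFO)
    file_handler = logging.FileHandler(log_path)
    file_handler.setFormatter(
        logging.Formatter("%(asctime)s:%(levelname)s: %(message)s"))
    logger.addHandler(file_handler)
    stream_handler = logging.StreamHandler()
    stream_handler.setFormatter(logging.Formatter("%(message)s"))
    logger.addHandler(stream_handler)
```

After calling it, `logging.info("Starting training...")` is echoed to the console and appended to the log file.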
def copy_most_recent_model():
" Copy the most recent model to the 'models/' directory "
best_model = get_most_recent_model()
if best_model:
print('Warm-starting from {}'.format(best_model), end='', flush=True)
for blob in bucket.list_blobs(prefix=best_model):
dest_file = 'models/... | -1,350,659,773,957,631,200 | Copy the most recent model to the 'models/' directory | contrib/distr-env/dg_storage.py | copy_most_recent_model | Chicoryn/dream-go | python | def copy_most_recent_model():
" "
best_model = get_most_recent_model()
if best_model:
print('Warm-starting from {}'.format(best_model), end='', flush=True)
for blob in bucket.list_blobs(prefix=best_model):
dest_file = 'models/{}/{}'.format(basename(best_model), basename(blob.name)... |
def wait_until_all_models_rated():
' Wait until all models has been assigned an ELO score. '
while True:
models = {}
for blob in bucket.list_blobs(prefix='models/'):
if (blob.size > 0):
models[dirname(blob.name)] = True
if (blob.metadata and ('elo' in ... | 5,103,271,914,473,873,000 | Wait until all models has been assigned an ELO score. | contrib/distr-env/dg_storage.py | wait_until_all_models_rated | Chicoryn/dream-go | python | def wait_until_all_models_rated():
' '
while True:
models = {}
for blob in bucket.list_blobs(prefix='models/'):
if (blob.size > 0):
models[dirname(blob.name)] = True
if (blob.metadata and ('elo' in blob.metadata)):
return True
... |
def copy_most_recent_games():
' Download the 200,000 most recent games, each file should\n contain 1,000 game records. So we need to download the 200\n most recent files. '
files = []
blobs = sorted([blob for blob in bucket.list_blobs(prefix='games/') if (blob.size > 0)], key=(lambda blob: blob.time_c... | 7,008,108,335,458,877,000 | Download the 200,000 most recent games, each file should
contain 1,000 game records. So we need to download the 200
most recent files. | contrib/distr-env/dg_storage.py | copy_most_recent_games | Chicoryn/dream-go | python | def copy_most_recent_games():
' Download the 200,000 most recent games, each file should\n contain 1,000 game records. So we need to download the 200\n most recent files. '
files = []
blobs = sorted([blob for blob in bucket.list_blobs(prefix='games/') if (blob.size > 0)], key=(lambda blob: blob.time_c... |
def upload_next_model(next_model):
' Upload the specified model to google storage. '
for src_file in glob('models/*{}/*'.format(next_model)):
if isfile(src_file):
print('Uploading', src_file)
blob = bucket.blob(src_file)
blob.upload_from_filename(filename=src_file) | 2,930,675,446,021,968,000 | Upload the specified model to google storage. | contrib/distr-env/dg_storage.py | upload_next_model | Chicoryn/dream-go | python | def upload_next_model(next_model):
' '
for src_file in glob('models/*{}/*'.format(next_model)):
if isfile(src_file):
print('Uploading', src_file)
blob = bucket.blob(src_file)
blob.upload_from_filename(filename=src_file) |
def upload_next_network(next_model, data, args=None):
' Upload the specified network to google storage. '
blob = bucket.blob('networks/{}.json'.format(next_model))
blob.metadata = {'args': json.dumps(args, sort_keys=True), 'rev': getenv('GIT_REV')}
blob.upload_from_string(data, 'application/json') | 20,544,352,858,846,290 | Upload the specified network to google storage. | contrib/distr-env/dg_storage.py | upload_next_network | Chicoryn/dream-go | python | def upload_next_network(next_model, data, args=None):
' '
blob = bucket.blob('networks/{}.json'.format(next_model))
blob.metadata = {'args': json.dumps(args, sort_keys=True), 'rev': getenv('GIT_REV')}
blob.upload_from_string(data, 'application/json') |
def upload_game_records(data, from_network=None, env=None, args=None):
' Upload the specified game records to google storage. '
dest_file = 'games/{}.sgf'.format(datetime.now().strftime('%Y%m%d.%H%M'))
print('Uploading', dest_file)
blob = bucket.blob(dest_file)
blob.metadata = {'args': json.dumps(ar... | -2,438,408,460,449,279,500 | Upload the specified game records to google storage. | contrib/distr-env/dg_storage.py | upload_game_records | Chicoryn/dream-go | python | def upload_game_records(data, from_network=None, env=None, args=None):
' '
dest_file = 'games/{}.sgf'.format(datetime.now().strftime('%Y%m%d.%H%M'))
print('Uploading', dest_file)
blob = bucket.blob(dest_file)
blob.metadata = {'args': json.dumps(args, sort_keys=True), 'env': json.dumps(env, sort_key... |
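The destination name embeds a minute-resolution timestamp (`'%Y%m%d.%H%M'`). A small sketch of that naming scheme, factored into a testable helper (the helper name is hypothetical):

```python
from datetime import datetime

def game_records_name(now=None):
    # games/<YYYYMMDD.HHMM>.sgf, matching the upload path used above.
    now = now or datetime.now()
    return "games/{}.sgf".format(now.strftime("%Y%m%d.%H%M"))
```

Sorting these names lexicographically also sorts them chronologically, which is what lets the download side pick the most recent files by name.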
def events(self, event_id: int=None, _from: str=None, published: bool=True, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Esta API fornece acesso às informações de eventos criados na plataforma Sympla, exclusivamente aqueles vinculados ao usuário propr... | -4,554,569,738,001,877,000 | Esta API fornece acesso às informações de eventos criados na plataforma Sympla, exclusivamente aqueles vinculados ao usuário proprietário do token.
The API also allows customizing the results, making it possible to filter events within a date window or to restrict which fields are relevant and should be ... | pysympla/sympla.py | events | hudsonbrendon/pysympla | python | def events(self, event_id: int=None, _from: str=None, published: bool=True, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Esta API fornece acesso às informações de eventos criados na plataforma Sympla, exclusivamente aqueles vinculados ao usuário propr... |
def orders_by_event(self, event_id: int, status: bool=False, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna os pedidos de um determinado evento.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/ge... | 4,699,294,017,509,968,000 | Retorna os pedidos de um determinado evento.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getListOrders
:param event_id: Unique identifier of the event
:param status: Returns all orders with any status.
True: returns orders of every status;
... | pysympla/sympla.py | orders_by_event | hudsonbrendon/pysympla | python | def orders_by_event(self, event_id: int, status: bool=False, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna os pedidos de um determinado evento.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/ge... |
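Wrappers like the ones above typically forward only the options the caller actually set as query parameters, letting the API's defaults apply otherwise. A hedged sketch of such a params builder (a hypothetical helper, not pysympla's actual code):

```python
def build_params(**kwargs):
    # Keep only the parameters the caller actually set; None means
    # "not provided", so it is dropped from the query string.
    return {k: v for k, v in kwargs.items() if v is not None}
```

For example, `build_params(page_size=100, page=1, field_sort=None, sort="ASC")` drops the unset `field_sort` key.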
def order_by_identifier(self, event_id: int, order_id: str, fields: str=None) -> dict:
'\n Retorna o pedido correspondente ao identificador informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneOrder\n\n :param event_id: Identificador único ... | -428,633,821,227,176,300 | Retorna o pedido correspondente ao identificador informado.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getOneOrder
:param event_id: Unique identifier of the event
:param order_id: id of the order
:param fields: Deve ser utilizado para retornar apenas os atributos indicados do ... | pysympla/sympla.py | order_by_identifier | hudsonbrendon/pysympla | python | def order_by_identifier(self, event_id: int, order_id: str, fields: str=None) -> dict:
'\n Retorna o pedido correspondente ao identificador informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneOrder\n\n :param event_id: Identificador único ... |
def participants_by_order(self, event_id: int, order_id: str, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna o(s) participante(s) contido(s) em um determinado pedido.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/i... | -4,891,483,069,128,448,000 | Retorna o(s) participante(s) contido(s) em um determinado pedido.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getAllParticipantsForOrder
:param event_id: Unique identifier of the event
:param order_id: Unique identifier of the order
:param page_size: Especifica quantos regist... | pysympla/sympla.py | participants_by_order | hudsonbrendon/pysympla | python | def participants_by_order(self, event_id: int, order_id: str, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna o(s) participante(s) contido(s) em um determinado pedido.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/i... |
def participants_by_event(self, event_id: int, ticket_number: str=None, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna os participantes de um determinado evento.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.... | 6,278,634,500,556,122,000 | Retorna os participantes de um determinado evento.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getAllParticipants
:param event_id: Unique identifier of the event
:param ticket_number: Code written on the ticket.
:param page_size: Especifica quantos registros por página o usu... | pysympla/sympla.py | participants_by_event | hudsonbrendon/pysympla | python | def participants_by_event(self, event_id: int, ticket_number: str=None, page_size: int=100, page: int=1, field_sort: str=None, sort: str='ASC', fields: str=None) -> dict:
'\n Retorna os participantes de um determinado evento.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.... |
def participant_by_ticket_id(self, event_id: int, participant_id: int, fields: str=None) -> dict:
'\n Retorna o participante correspondente ao ingresso informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipant\n\n :param event_id: Id... | -7,480,940,268,470,330,000 | Retorna o participante correspondente ao ingresso informado.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipant
:param event_id: Unique identifier of the event
:param participant_id: Unique identifier of the ticket
:param fields: Deve ser utilizado para retornar a... | pysympla/sympla.py | participant_by_ticket_id | hudsonbrendon/pysympla | python | def participant_by_ticket_id(self, event_id: int, participant_id: int, fields: str=None) -> dict:
'\n Retorna o participante correspondente ao ingresso informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipant\n\n :param event_id: Id... |
def participant_by_ticket_number(self, event_id: int, ticket_number: str, fields: str=None) -> dict:
'\n Retorna o participante correspondente ao ingresso informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipantByTicketNumber\n\n :p... | -8,268,009,442,225,868,000 | Retorna o participante correspondente ao ingresso informado.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipantByTicketNumber
:param event_id: Unique identifier of the event
:param ticket_number: Ticket number
:param fields: Deve ser utilizado para retornar a... | pysympla/sympla.py | participant_by_ticket_number | hudsonbrendon/pysympla | python | def participant_by_ticket_number(self, event_id: int, ticket_number: str, fields: str=None) -> dict:
'\n Retorna o participante correspondente ao ingresso informado.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/getOneParticipantByTicketNumber\n\n :p... |
def checkin_by_ticket_id(self, event_id: int, participant_id: int) -> dict:
'\n Realiza o check-in de um participante por id do ingresso.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByParticipantId\n\n :param event_id: Identificador único do... | 5,844,175,330,238,192,000 | Realiza o check-in de um participante por id do ingresso.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByParticipantId
:param event_id: Unique identifier of the event
:param participant_id: Unique identifier of the ticket | pysympla/sympla.py | checkin_by_ticket_id | hudsonbrendon/pysympla | python | def checkin_by_ticket_id(self, event_id: int, participant_id: int) -> dict:
'\n Realiza o check-in de um participante por id do ingresso.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByParticipantId\n\n :param event_id: Identificador único do... |
def checkin_by_ticket_number(self, event_id: int, ticket_number: str) -> dict:
'\n Realiza o check-in de um participante por número do ingresso.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByTicketNumber\n\n :param event_id: Identificador ún... | -1,569,102,426,574,836,500 | Realiza o check-in de um participante por número do ingresso.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByTicketNumber
:param event_id: Unique identifier of the event
:param ticket_number: Ticket number | pysympla/sympla.py | checkin_by_ticket_number | hudsonbrendon/pysympla | python | def checkin_by_ticket_number(self, event_id: int, ticket_number: str) -> dict:
'\n Realiza o check-in de um participante por número do ingresso.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#operation/checkInByTicketNumber\n\n :param event_id: Identificador ún... |
def affiliates(self, event_id: int) -> dict:
'\n Esta API fornece acesso às informações relativas ao programa de afiliados e seus respectivos afiliados.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#tag/Afiliados\n\n :param event_id: Identificador único do eve... | -4,809,683,616,987,814,000 | Esta API fornece acesso às informações relativas ao programa de afiliados e seus respectivos afiliados.
To learn more, see: https://developers.sympla.com.br/api-doc/index.html#tag/Afiliados
:param event_id: Unique identifier of the event | pysympla/sympla.py | affiliates | hudsonbrendon/pysympla | python | def affiliates(self, event_id: int) -> dict:
'\n Esta API fornece acesso às informações relativas ao programa de afiliados e seus respectivos afiliados.\n\n Para saber mais, acesse: https://developers.sympla.com.br/api-doc/index.html#tag/Afiliados\n\n :param event_id: Identificador único do eve... |
def _normalize(tensor, norm_layer):
'\n Broadcast layer norm\n '
size = tensor.size()
return norm_layer(tensor.view((- 1), size[(- 1)])).view(size) | -6,191,567,655,660,572,000 | Broadcast layer norm | parlai/agents/transformer/modules.py | _normalize | jinjiren/ParlAI | python | def _normalize(tensor, norm_layer):
'\n \n '
size = tensor.size()
return norm_layer(tensor.view((- 1), size[(- 1)])).view(size) |
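The broadcast trick above collapses every leading dim into one, applies the norm over the last dim, then restores the original shape. A hedged numpy sketch, with a hand-rolled layer norm standing in for `norm_layer`:

```python
import numpy as np

def layer_norm(rows, eps=1e-5):
    # Normalize each row to zero mean / unit variance (no learned affine).
    mean = rows.mean(axis=-1, keepdims=True)
    var = rows.var(axis=-1, keepdims=True)
    return (rows - mean) / np.sqrt(var + eps)

def normalize(tensor, norm_fn=layer_norm):
    # Collapse all leading dims, normalize over the last dim, restore shape.
    size = tensor.shape
    return norm_fn(tensor.reshape(-1, size[-1])).reshape(size)
```

This mirrors `norm_layer(tensor.view(-1, size[-1])).view(size)`: the norm only ever sees 2-D input, whatever the original rank.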
def _create_embeddings(dictionary, embedding_size, padding_idx):
'Create and initialize word embeddings.'
e = nn.Embedding(len(dictionary), embedding_size, padding_idx)
nn.init.normal_(e.weight, mean=0, std=(embedding_size ** (- 0.5)))
nn.init.constant_(e.weight[padding_idx], 0)
return e | 2,160,154,638,503,389,400 | Create and initialize word embeddings. | parlai/agents/transformer/modules.py | _create_embeddings | jinjiren/ParlAI | python | def _create_embeddings(dictionary, embedding_size, padding_idx):
e = nn.Embedding(len(dictionary), embedding_size, padding_idx)
nn.init.normal_(e.weight, mean=0, std=(embedding_size ** (- 0.5)))
nn.init.constant_(e.weight[padding_idx], 0)
return e |
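A hedged numpy sketch of the initialization described above: weights drawn from N(0, `embedding_size ** -0.5`) with the padding row zeroed (numpy stands in for `nn.Embedding`):

```python
import numpy as np

def create_embeddings(vocab_size, embedding_size, padding_idx, seed=0):
    rng = np.random.default_rng(seed)
    # std = embedding_size ** -0.5, as in the transformer init above.
    weight = rng.normal(0.0, embedding_size ** -0.5,
                        size=(vocab_size, embedding_size))
    weight[padding_idx] = 0.0  # padding token embeds to the zero vector
    return weight
```

Zeroing the padding row means padded positions contribute nothing to downstream sums before masking.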
def forward(self, input):
'\n input data is a FloatTensor of shape [batch, seq_len, dim]\n mask is a ByteTensor of shape [batch, seq_len], filled with 1 when\n inside the sequence and 0 outside.\n '
mask = (input != self.padding_idx)
seq_len = input.size(1)
positi... | 225,423,487,833,513,200 | input data is a FloatTensor of shape [batch, seq_len, dim]
mask is a ByteTensor of shape [batch, seq_len], filled with 1 when
inside the sequence and 0 outside. | parlai/agents/transformer/modules.py | forward | jinjiren/ParlAI | python | def forward(self, input):
'\n input data is a FloatTensor of shape [batch, seq_len, dim]\n mask is a ByteTensor of shape [batch, seq_len], filled with 1 when\n inside the sequence and 0 outside.\n '
mask = (input != self.padding_idx)
seq_len = input.size(1)
positi... |
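A hedged plain-Python sketch of the mask computation above, plus one common scheme (an assumption, not necessarily ParlAI's) for position indices that only advance over real tokens:

```python
def padding_mask(batch, padding_idx=0):
    # 1 inside the sequence, 0 at padding positions,
    # mirroring mask = (input != self.padding_idx).
    return [[int(tok != padding_idx) for tok in seq] for seq in batch]

def positions(mask):
    # Cumulative count of real tokens, zeroed at pads (cumsum(mask) * mask).
    out = []
    for row in mask:
        acc, pos = 0, []
        for m in row:
            acc += m
            pos.append(acc * m)
        out.append(pos)
    return out
```

With one sequence `[5, 7, 0, 0]` and `padding_idx=0`, the mask marks the first two positions and the position indices stop advancing at the pads.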
def list_by_subscription(self, **kwargs):
'Lists ExpressRoute gateways under a given subscription.\n\n :keyword callable cls: A custom type or function that will be passed the direct response\n :return: ExpressRouteGatewayList, or the result of cls(response)\n :rtype: ~azure.mgmt.network.v2019_... | -6,477,420,243,702,260,000 | Lists ExpressRoute gateways under a given subscription.
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ExpressRouteGatewayList, or the result of cls(response)
:rtype: ~azure.mgmt.network.v2019_07_01.models.ExpressRouteGatewayList
:raises: ~azure.core.exceptions.HttpRe... | sdk/network/azure-mgmt-network/azure/mgmt/network/v2019_07_01/operations/_express_route_gateways_operations.py | list_by_subscription | Co0olboi/azure-sdk-for-python | python | def list_by_subscription(self, **kwargs):
'Lists ExpressRoute gateways under a given subscription.\n\n :keyword callable cls: A custom type or function that will be passed the direct response\n :return: ExpressRouteGatewayList, or the result of cls(response)\n :rtype: ~azure.mgmt.network.v2019_... |
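Methods like `list_by_subscription` return a pager that keeps following the service's next link until it is exhausted. A hedged generic sketch of that pattern (not azure-core's `ItemPaged` implementation):

```python
def iterate_pages(get_page, first_url):
    # get_page(url) -> (items, next_url or None); yield every item
    # across pages until there is no next link.
    url = first_url
    while url:
        items, url = get_page(url)
        for item in items:
            yield item
```

A dict of fake pages is enough to exercise it: each value holds the page's items and the key of the next page (or None).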
def list_by_resource_group(self, resource_group_name, **kwargs):
'Lists ExpressRoute gateways in a given resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\n :keyword callable cls: A custom type or function that will be passed the di... | 3,271,096,405,484,876,000 | Lists ExpressRoute gateways in a given resource group.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:keyword callable cls: A custom type or function that will be passed the direct response
:return: ExpressRouteGatewayList, or the result of cls(response)
:rtype: ~azure.mgmt... | sdk/network/azure-mgmt-network/azure/mgmt/network/v2019_07_01/operations/_express_route_gateways_operations.py | list_by_resource_group | Co0olboi/azure-sdk-for-python | python | def list_by_resource_group(self, resource_group_name, **kwargs):
'Lists ExpressRoute gateways in a given resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\n :keyword callable cls: A custom type or function that will be passed the di... |
def begin_create_or_update(self, resource_group_name, express_route_gateway_name, put_express_route_gateway_parameters, **kwargs):
'Creates or updates an ExpressRoute gateway in a specified resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\... | 630,927,761,922,093,800 | Creates or updates an ExpressRoute gateway in a specified resource group.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param express_route_gateway_name: The name of the ExpressRoute gateway.
:type express_route_gateway_name: str
:param put_express_route_gateway_parameters:... | sdk/network/azure-mgmt-network/azure/mgmt/network/v2019_07_01/operations/_express_route_gateways_operations.py | begin_create_or_update | Co0olboi/azure-sdk-for-python | python | def begin_create_or_update(self, resource_group_name, express_route_gateway_name, put_express_route_gateway_parameters, **kwargs):
    'Creates or updates an ExpressRoute gateway in a specified resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\...
def get(self, resource_group_name, express_route_gateway_name, **kwargs):
    'Fetches the details of an ExpressRoute gateway in a resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\n :param express_route_gateway_name: The name of the Exp... | 7,877,672,485,163,125,000 | Fetches the details of an ExpressRoute gateway in a resource group.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param express_route_gateway_name: The name of the ExpressRoute gateway.
:type express_route_gateway_name: str
:keyword callable cls: A custom type or function t... | sdk/network/azure-mgmt-network/azure/mgmt/network/v2019_07_01/operations/_express_route_gateways_operations.py | get | Co0olboi/azure-sdk-for-python | python | def get(self, resource_group_name, express_route_gateway_name, **kwargs):
    'Fetches the details of an ExpressRoute gateway in a resource group.\n\n :param resource_group_name: The name of the resource group.\n :type resource_group_name: str\n :param express_route_gateway_name: The name of the Exp...
def begin_delete(self, resource_group_name, express_route_gateway_name, **kwargs):
'Deletes the specified ExpressRoute gateway in a resource group. An ExpressRoute gateway\n resource can only be deleted when there are no connection subresources.\n\n :param resource_group_name: The name of the resource... | -4,264,051,642,790,468,000 | Deletes the specified ExpressRoute gateway in a resource group. An ExpressRoute gateway
resource can only be deleted when there are no connection subresources.
:param resource_group_name: The name of the resource group.
:type resource_group_name: str
:param express_route_gateway_name: The name of the ExpressRoute gate... | sdk/network/azure-mgmt-network/azure/mgmt/network/v2019_07_01/operations/_express_route_gateways_operations.py | begin_delete | Co0olboi/azure-sdk-for-python | python | def begin_delete(self, resource_group_name, express_route_gateway_name, **kwargs):
'Deletes the specified ExpressRoute gateway in a resource group. An ExpressRoute gateway\n resource can only be deleted when there are no connection subresources.\n\n :param resource_group_name: The name of the resource... |
def validate_configuration(self, configuration: Optional[ExpectationConfiguration]) -> bool:
'\n Validates that a configuration has been set, and sets a configuration if it has yet to be set. Ensures that\n necessary configuration arguments have been provided for the validation of the expectation.\n\n... | 2,431,498,238,814,919,000 | Validates that a configuration has been set, and sets a configuration if it has yet to be set. Ensures that
necessary configuration arguments have been provided for the validation of the expectation.
Args:
configuration (OPTIONAL[ExpectationConfiguration]): An optional Expectation Configuration ent... | great_expectations/expectations/core/expect_select_column_values_to_be_unique_within_record.py | validate_configuration | MeganBeckett/great_expectations | python | def validate_configuration(self, configuration: Optional[ExpectationConfiguration]) -> bool:
'\n Validates that a configuration has been set, and sets a configuration if it has yet to be set. Ensures that\n necessary configuration arguments have been provided for the validation of the expectation.\n\n... |
def __init__(self, string):
' Initialize the exception\n :param string: The message to append to the error\n '
self.string = string | -3,299,297,612,292,880,400 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string):
' Initialize the exception\n :param string: The message to append to the error\n '
self.string = string |
def isError(self):
'Error'
        return True | -4,660,284,373,349,241,000 | Error | pymodbus/exceptions.py | isError | Biondoap/pymodbus | python | def isError(self):
return True |
def __init__(self, string='', function_code=None):
' Initialize the exception\n :param string: The message to append to the error\n '
self.fcode = function_code
self.message = ('[Input/Output] %s' % string)
ModbusException.__init__(self, self.message) | 5,281,375,963,429,550,000 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string='', function_code=None):
' Initialize the exception\n :param string: The message to append to the error\n '
self.fcode = function_code
self.message = ('[Input/Output] %s' % string)
ModbusException.__init__(self, self.message) |
def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Invalid Parameter] %s' % string)
ModbusException.__init__(self, message) | -2,489,978,286,978,406,400 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Invalid Parameter] %s' % string)
ModbusException.__init__(self, message) |
def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[No Such Slave] %s' % string)
ModbusException.__init__(self, message) | 2,132,472,531,180,455,200 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[No Such Slave] %s' % string)
ModbusException.__init__(self, message) |
def __init__(self, string=''):
' Initialize the exception\n :param string: The message to append to the error\n '
message = ('[Not Implemented] %s' % string)
ModbusException.__init__(self, message) | 5,851,030,548,588,889,000 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string=''):
' Initialize the exception\n :param string: The message to append to the error\n '
message = ('[Not Implemented] %s' % string)
ModbusException.__init__(self, message) |
def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Connection] %s' % string)
ModbusException.__init__(self, message) | 1,008,842,865,108,339,600 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Connection] %s' % string)
ModbusException.__init__(self, message) |
def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Invalid Message] %s' % string)
ModbusException.__init__(self, message) | 6,749,463,681,928,901,000 | Initialize the exception
:param string: The message to append to the error | pymodbus/exceptions.py | __init__ | Biondoap/pymodbus | python | def __init__(self, string=''):
' Initialize the exception\n\n :param string: The message to append to the error\n '
message = ('[Invalid Message] %s' % string)
ModbusException.__init__(self, message) |
def _send_request(self, request, **kwargs):
'Runs the network request through the client\'s chained policies.\n\n >>> from azure.core.rest import HttpRequest\n >>> request = HttpRequest("GET", "https://www.example.org/")\n <HttpRequest [GET], url: \'https://www.example.org/\'>\n >>> resp... | -703,176,707,303,729,200 | Runs the network request through the client's chained policies.
>>> from azure.core.rest import HttpRequest
>>> request = HttpRequest("GET", "https://www.example.org/")
<HttpRequest [GET], url: 'https://www.example.org/'>
>>> response = client._send_request(request)
<HttpResponse: 200 OK>
For more information on this... | docs/samples/specification/multiapi/generated/azure/multiapi/sample/v3/_multiapi_service_client.py | _send_request | changlong-liu/autorest.python | python | def _send_request(self, request, **kwargs):
'Runs the network request through the client\'s chained policies.\n\n >>> from azure.core.rest import HttpRequest\n >>> request = HttpRequest("GET", "https://www.example.org/")\n <HttpRequest [GET], url: \'https://www.example.org/\'>\n >>> resp... |
def configure_trigger(cam):
'\n This function configures the camera to use a trigger. First, trigger mode is\n set to off in order to select the trigger source. Once the trigger source\n has been selected, trigger mode is then enabled, which has the camera\n capture only a single image upon the executio... | 7,378,235,373,236,104,000 | This function configures the camera to use a trigger. First, trigger mode is
set to off in order to select the trigger source. Once the trigger source
has been selected, trigger mode is then enabled, which has the camera
capture only a single image upon the execution of the chosen trigger.
:param cam: Camera to confi... | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | configure_trigger | BevanLab/Recording_Script | python | def configure_trigger(cam):
'\n This function configures the camera to use a trigger. First, trigger mode is\n set to off in order to select the trigger source. Once the trigger source\n has been selected, trigger mode is then enabled, which has the camera\n capture only a single image upon the executio... |
def grab_next_image_by_trigger(nodemap, cam):
'\n This function acquires an image by executing the trigger node.\n\n :param cam: Camera to acquire images from.\n :param nodemap: Device nodemap.\n :type cam: CameraPtr\n :type nodemap: INodeMap\n :return: True if successful, False otherwise.\n :r... | -2,424,736,275,929,776,000 | This function acquires an image by executing the trigger node.
:param cam: Camera to acquire images from.
:param nodemap: Device nodemap.
:type cam: CameraPtr
:type nodemap: INodeMap
:return: True if successful, False otherwise.
:rtype: bool | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | grab_next_image_by_trigger | BevanLab/Recording_Script | python | def grab_next_image_by_trigger(nodemap, cam):
'\n This function acquires an image by executing the trigger node.\n\n :param cam: Camera to acquire images from.\n :param nodemap: Device nodemap.\n :type cam: CameraPtr\n :type nodemap: INodeMap\n :return: True if successful, False otherwise.\n :r... |
def acquire_images(cam, nodemap, nodemap_tldevice):
'\n This function acquires and saves 10 images from a device.\n Please see Acquisition example for more in-depth comments on acquiring images.\n\n :param cam: Camera to acquire images from.\n :param nodemap: Device nodemap.\n :param nodemap_tldevice... | -6,248,954,651,740,775,000 | This function acquires and saves 10 images from a device.
Please see Acquisition example for more in-depth comments on acquiring images.
:param cam: Camera to acquire images from.
:param nodemap: Device nodemap.
:param nodemap_tldevice: Transport layer device nodemap.
:type cam: CameraPtr
:type nodemap: INodeMap
:type... | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | acquire_images | BevanLab/Recording_Script | python | def acquire_images(cam, nodemap, nodemap_tldevice):
'\n This function acquires and saves 10 images from a device.\n Please see Acquisition example for more in-depth comments on acquiring images.\n\n :param cam: Camera to acquire images from.\n :param nodemap: Device nodemap.\n :param nodemap_tldevice... |
def reset_trigger(nodemap):
'\n This function returns the camera to a normal state by turning off trigger mode.\n \n :param nodemap: Transport layer device nodemap.\n :type nodemap: INodeMap\n :returns: True if successful, False otherwise.\n :rtype: bool\n '
try:
result = True
... | -3,314,745,031,887,221,000 | This function returns the camera to a normal state by turning off trigger mode.
:param nodemap: Transport layer device nodemap.
:type nodemap: INodeMap
:returns: True if successful, False otherwise.
:rtype: bool | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | reset_trigger | BevanLab/Recording_Script | python | def reset_trigger(nodemap):
'\n This function returns the camera to a normal state by turning off trigger mode.\n \n :param nodemap: Transport layer device nodemap.\n :type nodemap: INodeMap\n :returns: True if successful, False otherwise.\n :rtype: bool\n '
try:
result = True
... |
def print_device_info(nodemap):
'\n This function prints the device information of the camera from the transport\n layer; please see NodeMapInfo example for more in-depth comments on printing\n device information from the nodemap.\n\n :param nodemap: Transport layer device nodemap.\n :type nodemap: I... | 4,143,192,014,156,832,300 | This function prints the device information of the camera from the transport
layer; please see NodeMapInfo example for more in-depth comments on printing
device information from the nodemap.
:param nodemap: Transport layer device nodemap.
:type nodemap: INodeMap
:returns: True if successful, False otherwise.
:rtype: b... | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | print_device_info | BevanLab/Recording_Script | python | def print_device_info(nodemap):
'\n This function prints the device information of the camera from the transport\n layer; please see NodeMapInfo example for more in-depth comments on printing\n device information from the nodemap.\n\n :param nodemap: Transport layer device nodemap.\n :type nodemap: I... |
def run_single_camera(cam):
'\n This function acts as the body of the example; please see NodeMapInfo example\n for more in-depth comments on setting up cameras.\n\n :param cam: Camera to run on.\n :type cam: CameraPtr\n :return: True if successful, False otherwise.\n :rtype: bool\n '
try:
... | 3,507,507,488,458,038,000 | This function acts as the body of the example; please see NodeMapInfo example
for more in-depth comments on setting up cameras.
:param cam: Camera to run on.
:type cam: CameraPtr
:return: True if successful, False otherwise.
:rtype: bool | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | run_single_camera | BevanLab/Recording_Script | python | def run_single_camera(cam):
'\n This function acts as the body of the example; please see NodeMapInfo example\n for more in-depth comments on setting up cameras.\n\n :param cam: Camera to run on.\n :type cam: CameraPtr\n :return: True if successful, False otherwise.\n :rtype: bool\n '
try:
... |
def main():
'\n Example entry point; please see Enumeration example for more in-depth\n comments on preparing and cleaning up the system.\n\n :return: True if successful, False otherwise.\n :rtype: bool\n '
try:
test_file = open('test.txt', 'w+')
except IOError:
print('Unable ... | -5,703,057,015,471,915,000 | Example entry point; please see Enumeration example for more in-depth
comments on preparing and cleaning up the system.
:return: True if successful, False otherwise.
:rtype: bool | Sample/spinnaker_python-2.2.0.48-cp37-cp37m-win_amd64/Examples/Python3/Trigger.py | main | BevanLab/Recording_Script | python | def main():
'\n Example entry point; please see Enumeration example for more in-depth\n comments on preparing and cleaning up the system.\n\n :return: True if successful, False otherwise.\n :rtype: bool\n '
try:
test_file = open('test.txt', 'w+')
except IOError:
print('Unable ... |
def make_prediction_net(num_out_channels, kernel_size=3, num_filters=256, bias_fill=None):
'Creates a network to predict the given number of output channels.\n\n This function is intended to make the prediction heads for the CenterNet\n meta architecture.\n\n Args:\n num_out_channels: Number of output channel... | 6,019,871,673,096,493,000 | Creates a network to predict the given number of output channels.
This function is intended to make the prediction heads for the CenterNet
meta architecture.
Args:
num_out_channels: Number of output channels.
kernel_size: The size of the conv kernel in the intermediate layer
num_filters: The number of filters i... | research/object_detection/meta_architectures/center_net_meta_arch.py | make_prediction_net | AvikantSrivastava/models | python | def make_prediction_net(num_out_channels, kernel_size=3, num_filters=256, bias_fill=None):
'Creates a network to predict the given number of output channels.\n\n This function is intended to make the prediction heads for the CenterNet\n meta architecture.\n\n Args:\n num_out_channels: Number of output channel... |
def top_k_feature_map_locations(feature_map, max_pool_kernel_size=3, k=100, per_channel=False):
'Returns the top k scores and their locations in a feature map.\n\n Given a feature map, the top k values (based on activation) are returned. If\n `per_channel` is True, the top k values **per channel** are returned.\n... | 8,923,855,060,416,032,000 | Returns the top k scores and their locations in a feature map.
Given a feature map, the top k values (based on activation) are returned. If
`per_channel` is True, the top k values **per channel** are returned.
The `max_pool_kernel_size` argument allows for selecting local peaks in a
region. This filtering is done per... | research/object_detection/meta_architectures/center_net_meta_arch.py | top_k_feature_map_locations | AvikantSrivastava/models | python | def top_k_feature_map_locations(feature_map, max_pool_kernel_size=3, k=100, per_channel=False):
'Returns the top k scores and their locations in a feature map.\n\n Given a feature map, the top k values (based on activation) are returned. If\n `per_channel` is True, the top k values **per channel** are returned.\n... |
def prediction_tensors_to_boxes(detection_scores, y_indices, x_indices, channel_indices, height_width_predictions, offset_predictions):
'Converts CenterNet class-center, offset and size predictions to boxes.\n\n Args:\n detection_scores: A [batch, num_boxes] float32 tensor with detection\n scores in range ... | 4,516,369,718,014,506,000 | Converts CenterNet class-center, offset and size predictions to boxes.
Args:
detection_scores: A [batch, num_boxes] float32 tensor with detection
scores in range [0, 1].
y_indices: A [batch, num_boxes] int32 tensor with y indices corresponding to
object center locations (expressed in output coordinate fram... | research/object_detection/meta_architectures/center_net_meta_arch.py | prediction_tensors_to_boxes | AvikantSrivastava/models | python | def prediction_tensors_to_boxes(detection_scores, y_indices, x_indices, channel_indices, height_width_predictions, offset_predictions):
'Converts CenterNet class-center, offset and size predictions to boxes.\n\n Args:\n detection_scores: A [batch, num_boxes] float32 tensor with detection\n scores in range ... |
def prediction_tensors_to_temporal_offsets(y_indices, x_indices, offset_predictions):
"Converts CenterNet temporal offset map predictions to batched format.\n\n This function is similar to the box offset conversion function, as both\n temporal offsets and box offsets are size-2 vectors.\n\n Args:\n y_indices... | 8,263,310,012,307,342,000 | Converts CenterNet temporal offset map predictions to batched format.
This function is similar to the box offset conversion function, as both
temporal offsets and box offsets are size-2 vectors.
Args:
y_indices: A [batch, num_boxes] int32 tensor with y indices corresponding to
object center locations (expresse... | research/object_detection/meta_architectures/center_net_meta_arch.py | prediction_tensors_to_temporal_offsets | AvikantSrivastava/models | python | def prediction_tensors_to_temporal_offsets(y_indices, x_indices, offset_predictions):
"Converts CenterNet temporal offset map predictions to batched format.\n\n This function is similiar to the box offset conversion function, as both\n temporal offsets and box offsets are size-2 vectors.\n\n Args:\n y_indices... |
def prediction_tensors_to_keypoint_candidates(keypoint_heatmap_predictions, keypoint_heatmap_offsets, keypoint_score_threshold=0.1, max_pool_kernel_size=1, max_candidates=20):
"Convert keypoint heatmap predictions and offsets to keypoint candidates.\n\n Args:\n keypoint_heatmap_predictions: A float tensor of sh... | 7,684,851,093,205,623,000 | Convert keypoint heatmap predictions and offsets to keypoint candidates.
Args:
keypoint_heatmap_predictions: A float tensor of shape [batch_size, height,
width, num_keypoints] representing the per-keypoint heatmaps.
keypoint_heatmap_offsets: A float tensor of shape [batch_size, height,
width, 2] (or [batch... | research/object_detection/meta_architectures/center_net_meta_arch.py | prediction_tensors_to_keypoint_candidates | AvikantSrivastava/models | python | def prediction_tensors_to_keypoint_candidates(keypoint_heatmap_predictions, keypoint_heatmap_offsets, keypoint_score_threshold=0.1, max_pool_kernel_size=1, max_candidates=20):
"Convert keypoint heatmap predictions and offsets to keypoint candidates.\n\n Args:\n keypoint_heatmap_predictions: A float tensor of sh... |
def regressed_keypoints_at_object_centers(regressed_keypoint_predictions, y_indices, x_indices):
'Returns the regressed keypoints at specified object centers.\n\n The original keypoint predictions are regressed relative to each feature map\n location. The returned keypoints are expressed in absolute coordinates i... | -4,714,983,378,805,419,000 | Returns the regressed keypoints at specified object centers.
The original keypoint predictions are regressed relative to each feature map
location. The returned keypoints are expressed in absolute coordinates in the
output frame (i.e. the center offsets are added to each individual regressed
set of keypoints).
Args:
... | research/object_detection/meta_architectures/center_net_meta_arch.py | regressed_keypoints_at_object_centers | AvikantSrivastava/models | python | def regressed_keypoints_at_object_centers(regressed_keypoint_predictions, y_indices, x_indices):
'Returns the regressed keypoints at specified object centers.\n\n The original keypoint predictions are regressed relative to each feature map\n location. The returned keypoints are expressed in absolute coordinates i... |