| body (string, 26-98.2k chars) | body_hash (int64) | docstring (string, 1-16.8k chars) | path (string, 5-230 chars) | name (string, 1-96 chars) | repository_name (string, 7-89 chars) | lang (1 class: python) | body_without_docstring (string, 20-98.2k chars) |
|---|---|---|---|---|---|---|---|
def _ValidateTestPathPartName(name):
'Checks whether a Master, Bot or TestMetadata name is OK.'
if (name.startswith('__') and name.endswith('__')):
raise BadRequestError(('Invalid name: "%s". Names cannot start and end with "__".' % name)) | 3,896,287,173,283,997,700 | Checks whether a Master, Bot or TestMetadata name is OK. | dashboard/dashboard/add_point.py | _ValidateTestPathPartName | bopopescu/catapult-2 | python | def _ValidateTestPathPartName(name):
if (name.startswith('__') and name.endswith('__')):
raise BadRequestError(('Invalid name: "%s". Names cannot start and end with "__".' % name)) |
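The row above shows `_ValidateTestPathPartName` in full, so the check can be sketched standalone; `BadRequestError` here is a local stand-in for the dashboard's own exception class, which is not shown in this table:

```python
class BadRequestError(Exception):
    """Local stand-in for the dashboard's BadRequestError."""


def validate_test_path_part_name(name):
    # Double-underscore-wrapped names are reserved, so reject anything
    # that both starts and ends with '__'.
    if name.startswith('__') and name.endswith('__'):
        raise BadRequestError(
            'Invalid name: "%s". Names cannot start and end with "__".' % name)
```

A valid name like `'linux-release'` passes silently, while `'__reserved__'` raises.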
def _ValidateRowId(row_dict, test_map):
'Checks whether the ID for a Row is OK.\n\n Args:\n row_dict: A dictionary with new point properties, including "revision".\n test_map: A dictionary mapping test paths to the last previously added\n revision for each test.\n\n Raises:\n BadRequestError: The ... | 4,812,546,671,652,088,000 | Checks whether the ID for a Row is OK.
Args:
row_dict: A dictionary with new point properties, including "revision".
test_map: A dictionary mapping test paths to the last previously added
revision for each test.
Raises:
BadRequestError: The revision is not acceptable for some reason. | dashboard/dashboard/add_point.py | _ValidateRowId | bopopescu/catapult-2 | python | def _ValidateRowId(row_dict, test_map):
'Checks whether the ID for a Row is OK.\n\n Args:\n row_dict: A dictionary with new point properties, including "revision".\n test_map: A dictionary mapping test paths to the last previously added\n revision for each test.\n\n Raises:\n BadRequestError: The ... |
def _IsAcceptableRowId(row_id, last_row_id, allow_jump=False):
'Checks whether the given row id (aka revision) is not too large or small.\n\n For each data series (i.e. TestMetadata entity), we assume that row IDs are\n monotonically increasing. On a given chart, points are sorted by these\n row IDs. This way, p... | -1,879,351,429,549,624,300 | Checks whether the given row id (aka revision) is not too large or small.
For each data series (i.e. TestMetadata entity), we assume that row IDs are
monotonically increasing. On a given chart, points are sorted by these
row IDs. This way, points can arrive out of order but still be shown
correctly in the chart.
Howe... | dashboard/dashboard/add_point.py | _IsAcceptableRowId | bopopescu/catapult-2 | python | def _IsAcceptableRowId(row_id, last_row_id, allow_jump=False):
'Checks whether the given row id (aka revision) is not too large or small.\n\n For each data series (i.e. TestMetadata entity), we assume that row IDs are\n monotonically increasing. On a given chart, points are sorted by these\n row IDs. This way, p... |
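The truncated body of `_IsAcceptableRowId` hides the actual thresholds; a minimal sketch under an assumed rule (new row IDs must stay within a factor of two of the last one, and the first point is always accepted) illustrates the monotonic-ID idea the docstring describes:

```python
def is_acceptable_row_id(row_id, last_row_id):
    # Hypothetical acceptance rule: the real thresholds are not visible
    # in the truncated source, so a factor-of-two window is assumed here.
    if last_row_id is None:
        # First point for this test series: accept unconditionally.
        return True
    return (last_row_id // 2) < row_id < (last_row_id * 2)
```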
def GetAndValidateRowId(row_dict):
"Returns the integer ID for a new Row.\n\n This method is also responsible for validating the input fields related\n to making the new row ID.\n\n Args:\n row_dict: A dictionary obtained from the input JSON.\n\n Returns:\n An integer row ID.\n\n Raises:\n BadRequestE... | 4,024,942,979,099,729,000 | Returns the integer ID for a new Row.
This method is also responsible for validating the input fields related
to making the new row ID.
Args:
row_dict: A dictionary obtained from the input JSON.
Returns:
An integer row ID.
Raises:
BadRequestError: The input wasn't formatted properly. | dashboard/dashboard/add_point.py | GetAndValidateRowId | bopopescu/catapult-2 | python | def GetAndValidateRowId(row_dict):
"Returns the integer ID for a new Row.\n\n This method is also responsible for validating the input fields related\n to making the new row ID.\n\n Args:\n row_dict: A dictionary obtained from the input JSON.\n\n Returns:\n An integer row ID.\n\n Raises:\n BadRequestE... |
def GetAndValidateRowProperties(row):
'From the object received, make a dictionary of properties for a Row.\n\n This includes the default "value" and "error" columns as well as all\n supplemental columns, but it doesn\'t include "revision", and it doesn\'t\n include input fields that are properties of the parent... | -2,355,803,680,181,531,000 | From the object received, make a dictionary of properties for a Row.
This includes the default "value" and "error" columns as well as all
supplemental columns, but it doesn't include "revision", and it doesn't
include input fields that are properties of the parent TestMetadata, such as
"units".
This method is respons... | dashboard/dashboard/add_point.py | GetAndValidateRowProperties | bopopescu/catapult-2 | python | def GetAndValidateRowProperties(row):
'From the object received, make a dictionary of properties for a Row.\n\n This includes the default "value" and "error" columns as well as all\n supplemental columns, but it doesn\'t include "revision", and it doesn\'t\n include input fields that are properties of the parent... |
def _GetSupplementalColumns(row):
'Gets a dict of supplemental columns.\n\n If any columns are invalid, a warning is logged and they just aren\'t included,\n but no exception is raised.\n\n Individual rows may specify up to _MAX_NUM_COLUMNS extra data, revision,\n and annotation columns. These columns must foll... | -916,637,843,531,906,400 | Gets a dict of supplemental columns.
If any columns are invalid, a warning is logged and they just aren't included,
but no exception is raised.
Individual rows may specify up to _MAX_NUM_COLUMNS extra data, revision,
and annotation columns. These columns must follow formatting rules for
their type. Invalid columns ar... | dashboard/dashboard/add_point.py | _GetSupplementalColumns | bopopescu/catapult-2 | python | def _GetSupplementalColumns(row):
'Gets a dict of supplemental columns.\n\n If any columns are invalid, a warning is logged and they just aren\'t included,\n but no exception is raised.\n\n Individual rows may specify up to _MAX_NUM_COLUMNS extra data, revision,\n and annotation columns. These columns must foll... |
def _CheckSupplementalColumn(name, value):
'Returns a possibly modified value for a supplemental column, or None.'
name = str(name)
if (len(name) > _MAX_COLUMN_NAME_LENGTH):
logging.warn('Supplemental column name too long.')
return None
if (name[:2] not in ('d_', 'r_', 'a_')):
lo... | 6,983,524,817,797,946,000 | Returns a possibly modified value for a supplemental column, or None. | dashboard/dashboard/add_point.py | _CheckSupplementalColumn | bopopescu/catapult-2 | python | def _CheckSupplementalColumn(name, value):
name = str(name)
if (len(name) > _MAX_COLUMN_NAME_LENGTH):
logging.warn('Supplemental column name too long.')
return None
if (name[:2] not in ('d_', 'r_', 'a_')):
logging.warn('Bad column name "%s", invalid prefix.', name)
retur... |
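The visible part of `_CheckSupplementalColumn` validates only the name's length and its `d_`/`r_`/`a_` prefix (the per-type value checks are cut off); a sketch of just that visible portion, with `_MAX_COLUMN_NAME_LENGTH` assumed since the constant's value is not shown:

```python
import logging

_MAX_COLUMN_NAME_LENGTH = 25  # assumed limit; the real constant is not shown


def check_supplemental_column(name, value):
    """Return (name, value) if the column name looks valid, else None."""
    name = str(name)
    if len(name) > _MAX_COLUMN_NAME_LENGTH:
        logging.warning('Supplemental column name too long.')
        return None
    # Columns must be data ('d_'), revision ('r_'), or annotation ('a_').
    if name[:2] not in ('d_', 'r_', 'a_'):
        logging.warning('Bad column name "%s", invalid prefix.', name)
        return None
    return (name, value)
```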
def post(self):
'Validates data parameter and add task to queue to process points.\n\n The row data comes from a "data" parameter, which is a JSON encoding of a\n list of dictionaries, each of which represents one performance result\n (one point in a graph) and associated data.\n\n [\n {\n ... | 2,781,788,386,795,497,000 | Validates data parameter and add task to queue to process points.
The row data comes from a "data" parameter, which is a JSON encoding of a
list of dictionaries, each of which represents one performance result
(one point in a graph) and associated data.
[
{
"master": "ChromiumPerf",
"bot": "xp-relea... | dashboard/dashboard/add_point.py | post | bopopescu/catapult-2 | python | def post(self):
'Validates data parameter and add task to queue to process points.\n\n The row data comes from a "data" parameter, which is a JSON encoding of a\n list of dictionaries, each of which represents one performance result\n (one point in a graph) and associated data.\n\n [\n {\n ... |
def S_IFMT(mode):
"Return the portion of the file's mode that describes the\n file type.\n "
return (mode & 61440) | -3,216,242,946,293,737,000 | Return the portion of the file's mode that describes the
file type. | others/explorer_standalone.py | S_IFMT | eggfly/M5StickVComputer | python | def S_IFMT(mode):
"Return the portion of the file's mode that describes the\n file type.\n "
return (mode & 61440) |
def S_ISDIR(mode):
'Return True if mode is from a directory.'
return (S_IFMT(mode) == S_IFDIR) | 4,509,911,602,829,706,000 | Return True if mode is from a directory. | others/explorer_standalone.py | S_ISDIR | eggfly/M5StickVComputer | python | def S_ISDIR(mode):
return (S_IFMT(mode) == S_IFDIR) |
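The magic number 61440 in `S_IFMT` above is `0o170000`, the file-type mask from CPython's `stat` module; a sketch with the constants spelled out in octal:

```python
# Bit patterns as defined in CPython's stat module; 61440 == 0o170000.
S_IFMT_MASK = 0o170000  # file-type portion of st_mode
S_IFDIR = 0o040000      # directory file type


def s_ifmt(mode):
    # Mask off permission bits, keeping only the file-type bits.
    return mode & S_IFMT_MASK


def s_isdir(mode):
    return s_ifmt(mode) == S_IFDIR
```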
def belong(in_list1: list, in_list2: list) -> bool:
    '\n    Check whether or not all the elements in list in_list1 belong to in_list2\n    :param in_list1: the source list\n    :param in_list2: the target list where to find the elements of in_list1\n    :return: return True if the statement is verified otherwise retu... | -9,019,206,028,006,766,000 | Check whether or not all the elements in list in_list1 belong to in_list2
:param in_list1: the source list
:param in_list2: the target list where to find the elements of in_list1
:return: return True if the statement is verified otherwise return False | Python/List/14.belong.py | belong | angelmpalomares/ModelAndLanguagesForBioInformatics | python | def belong(in_list1: list, in_list2: list) -> bool:
    '\n    Check whether or not all the elements in list in_list1 belong to in_list2\n    :param in_list1: the source list\n    :param in_list2: the target list where to find the elements of in_list1\n    :return: return True if the statement is verified otherwise retu... |
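The `belong` body is truncated above; a one-line implementation matching its docstring (every element of the source list must appear in the target list) can be sketched with `all()`:

```python
def belong(in_list1, in_list2):
    # True when every element of in_list1 also appears in in_list2.
    # The empty list vacuously belongs to any list.
    return all(item in in_list2 for item in in_list1)
```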
def add_scaling(spot_fleet, template, cluster_name):
' Add scaling resources to a cluster '
ssm_param = Parameter('Scale{}'.format(sanitize_cfn_resource_name(spot_fleet.get('name'))), Type='String', Value='0', Name=Sub('/ecs-maestro/${ClusterName}/${Version}/scaletime'))
template.add_resource(ssm_param)
... | 3,883,582,814,284,162,600 | Add scaling resources to a cluster | ecs_cluster_deployer/compute/lambda_scaler.py | add_scaling | apollusehs-devops/ecs-cluster-deployer | python | def add_scaling(spot_fleet, template, cluster_name):
' '
ssm_param = Parameter('Scale{}'.format(sanitize_cfn_resource_name(spot_fleet.get('name'))), Type='String', Value='0', Name=Sub('/ecs-maestro/${ClusterName}/${Version}/scaletime'))
template.add_resource(ssm_param)
function_name = sanitize_cfn_reso... |
@op_info_register(stack_init_op_info)
def _stack_init_aicpu():
'StackInit aicpu register'
return | 2,930,796,386,539,487,000 | StackInit aicpu register | mindspore/ops/_op_impl/aicpu/stack_push_pop.py | _stack_init_aicpu | 233-puchi/mindspore | python | @op_info_register(stack_init_op_info)
def _stack_init_aicpu():
return |
@op_info_register(stack_push_op_info)
def _stack_push_aicpu():
'StackPush aicpu register'
return | -1,631,848,826,431,700,500 | StackPush aicpu register | mindspore/ops/_op_impl/aicpu/stack_push_pop.py | _stack_push_aicpu | 233-puchi/mindspore | python | @op_info_register(stack_push_op_info)
def _stack_push_aicpu():
return |
@op_info_register(stack_pop_op_info)
def _stack_pop_aicpu():
'StackPop aicpu register'
return | 4,465,277,019,540,829,700 | StackPop aicpu register | mindspore/ops/_op_impl/aicpu/stack_push_pop.py | _stack_pop_aicpu | 233-puchi/mindspore | python | @op_info_register(stack_pop_op_info)
def _stack_pop_aicpu():
return |
@op_info_register(stack_destroy_op_info)
def _stack_destroy_aicpu():
'StackDestroy aicpu register'
return | 8,348,166,599,229,350,000 | StackDestroy aicpu register | mindspore/ops/_op_impl/aicpu/stack_push_pop.py | _stack_destroy_aicpu | 233-puchi/mindspore | python | @op_info_register(stack_destroy_op_info)
def _stack_destroy_aicpu():
return |
def min_depth(self, root):
    '\n    :type root: TreeNode\n    :rtype: int\n    '
    if (root is None):
        return 0
    if ((root.left is None) or (root.right is None)):
        return (max(self.min_depth(root.left), self.min_depth(root.right)) + 1)
    return (min(self.min_depth(root.left), self.min_depth(r... | -8,175,042,898,806,348,000 | :type root: TreeNode
:rtype: int | algorithms/tree/min_height.py | min_depth | AdrialYeoh/algorithms | python | def min_depth(self, root):
    '\n    :type root: TreeNode\n    :rtype: int\n    '
    if (root is None):
        return 0
    if ((root.left is None) or (root.right is None)):
        return (max(self.min_depth(root.left), self.min_depth(root.right)) + 1)
    return (min(self.min_depth(root.left), self.min_depth(r... |
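The minimum-depth recursion above can be written as a self-contained module-level sketch; `TreeNode` is a minimal stand-in, since the repository's node class isn't shown:

```python
class TreeNode:
    """Minimal binary-tree node for the sketch."""
    def __init__(self, val=0, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right


def min_depth(root):
    # Depth of the shallowest *leaf*. When exactly one child is missing,
    # the missing side has no leaf, so recurse into the existing subtree
    # (max covers this, since the absent side contributes 0).
    if root is None:
        return 0
    if root.left is None or root.right is None:
        return max(min_depth(root.left), min_depth(root.right)) + 1
    return min(min_depth(root.left), min_depth(root.right)) + 1
```

For a root with only a left chain of length 2, the answer is 3, not 1: the `max` branch prevents the empty right side from being mistaken for a leaf.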
def test_plugins(self):
'Test that plugins without dependencies work'
localrc = {'test_localrc': '1'}
local_conf = {'install': {'nova.conf': {'main': {'test_conf': '2'}}}}
services = {'cinder': True}
plugins = OrderedDict([('bar', 'git://git.openstack.org/openstack/bar-plugin'), ('foo', 'git://git.o... | 8,778,339,353,578,309,000 | Test that plugins without dependencies work | roles/write-devstack-local-conf/library/test.py | test_plugins | HoonMinJeongUm/HoonMin-devstack | python | def test_plugins(self):
localrc = {'test_localrc': '1'}
local_conf = {'install': {'nova.conf': {'main': {'test_conf': '2'}}}}
services = {'cinder': True}
plugins = OrderedDict([('bar', 'git://git.openstack.org/openstack/bar-plugin'), ('foo', 'git://git.openstack.org/openstack/foo-plugin'), ('baz', ... |
def test_plugin_deps(self):
'Test that plugins with dependencies work'
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', '.git'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, ... | 412,481,352,534,518,700 | Test that plugins with dependencies work | roles/write-devstack-local-conf/library/test.py | test_plugin_deps | HoonMinJeongUm/HoonMin-devstack | python | def test_plugin_deps(self):
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', '.git'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', '.git'))
with open(os.pa... |
def test_libs_from_git(self):
'Test that LIBS_FROM_GIT is auto-generated'
projects = {'git.openstack.org/openstack/nova': {'required': True, 'short_name': 'nova'}, 'git.openstack.org/openstack/oslo.messaging': {'required': True, 'short_name': 'oslo.messaging'}, 'git.openstack.org/openstack/devstack-plugin': {'r... | 5,259,233,446,241,077,000 | Test that LIBS_FROM_GIT is auto-generated | roles/write-devstack-local-conf/library/test.py | test_libs_from_git | HoonMinJeongUm/HoonMin-devstack | python | def test_libs_from_git(self):
projects = {'git.openstack.org/openstack/nova': {'required': True, 'short_name': 'nova'}, 'git.openstack.org/openstack/oslo.messaging': {'required': True, 'short_name': 'oslo.messaging'}, 'git.openstack.org/openstack/devstack-plugin': {'required': False, 'short_name': 'devstack-pl... |
def test_overridelibs_from_git(self):
'Test that LIBS_FROM_GIT can be overridden'
localrc = {'LIBS_FROM_GIT': 'oslo.db'}
projects = {'git.openstack.org/openstack/nova': {'required': True, 'short_name': 'nova'}, 'git.openstack.org/openstack/oslo.messaging': {'required': True, 'short_name': 'oslo.messaging'},... | -5,863,802,321,256,962,000 | Test that LIBS_FROM_GIT can be overridden | roles/write-devstack-local-conf/library/test.py | test_overridelibs_from_git | HoonMinJeongUm/HoonMin-devstack | python | def test_overridelibs_from_git(self):
localrc = {'LIBS_FROM_GIT': 'oslo.db'}
projects = {'git.openstack.org/openstack/nova': {'required': True, 'short_name': 'nova'}, 'git.openstack.org/openstack/oslo.messaging': {'required': True, 'short_name': 'oslo.messaging'}, 'git.openstack.org/openstack/devstack-plug... |
def test_plugin_circular_deps(self):
'Test that plugins with circular dependencies fail'
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', '.git'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', 'devstack'))
os.makedirs(os.path.... | -1,101,312,770,292,055,400 | Test that plugins with circular dependencies fail | roles/write-devstack-local-conf/library/test.py | test_plugin_circular_deps | HoonMinJeongUm/HoonMin-devstack | python | def test_plugin_circular_deps(self):
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'foo-plugin', '.git'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', 'devstack'))
os.makedirs(os.path.join(self.tmpdir, 'bar-plugin', '.git'))
with o... |
def _create_k8s_job(self, yaml_spec):
' _create_k8s_job creates a kubernetes job based on the yaml spec '
pod = k8s_client.V1Pod(metadata=k8s_client.V1ObjectMeta(generate_name=yaml_spec['metadata']['generateName']))
container = k8s_client.V1Container(name=yaml_spec['spec']['containers'][0]['name'], image=ya... | 942,747,812,642,086,400 | _create_k8s_job creates a kubernetes job based on the yaml spec | sdk/python/kfp/compiler/_k8s_helper.py | _create_k8s_job | JohnPaton/pipelines | python | def _create_k8s_job(self, yaml_spec):
' '
pod = k8s_client.V1Pod(metadata=k8s_client.V1ObjectMeta(generate_name=yaml_spec['metadata']['generateName']))
container = k8s_client.V1Container(name=yaml_spec['spec']['containers'][0]['name'], image=yaml_spec['spec']['containers'][0]['image'], args=yaml_spec['spec... |
def _wait_for_k8s_job(self, pod_name, yaml_spec, timeout):
' _wait_for_k8s_job waits for the job to complete '
status = 'running'
start_time = datetime.now()
while (status in ['pending', 'running']):
try:
api_response = self._corev1.read_namespaced_pod(pod_name, yaml_spec['metadata']... | 390,679,719,422,102,600 | _wait_for_k8s_job waits for the job to complete | sdk/python/kfp/compiler/_k8s_helper.py | _wait_for_k8s_job | JohnPaton/pipelines | python | def _wait_for_k8s_job(self, pod_name, yaml_spec, timeout):
' '
status = 'running'
start_time = datetime.now()
while (status in ['pending', 'running']):
try:
api_response = self._corev1.read_namespaced_pod(pod_name, yaml_spec['metadata']['namespace'])
status = api_respons... |
def _delete_k8s_job(self, pod_name, yaml_spec):
' _delete_k8s_job deletes a pod '
try:
api_response = self._corev1.delete_namespaced_pod(pod_name, yaml_spec['metadata']['namespace'], body=k8s_client.V1DeleteOptions())
except k8s_client.rest.ApiException as e:
logging.exception('Exception whe... | -4,173,525,661,513,618,000 | _delete_k8s_job deletes a pod | sdk/python/kfp/compiler/_k8s_helper.py | _delete_k8s_job | JohnPaton/pipelines | python | def _delete_k8s_job(self, pod_name, yaml_spec):
' '
try:
api_response = self._corev1.delete_namespaced_pod(pod_name, yaml_spec['metadata']['namespace'], body=k8s_client.V1DeleteOptions())
except k8s_client.rest.ApiException as e:
logging.exception('Exception when calling CoreV1Api->delete_n... |
def run_job(self, yaml_spec, timeout=600):
' run_job runs a kubernetes job and clean up afterwards '
(pod_name, succ) = self._create_k8s_job(yaml_spec)
if (not succ):
return False
succ = self._wait_for_k8s_job(pod_name, yaml_spec, timeout)
if (not succ):
logging.info('Kubernetes job ... | 8,632,286,401,087,466,000 | run_job runs a kubernetes job and clean up afterwards | sdk/python/kfp/compiler/_k8s_helper.py | run_job | JohnPaton/pipelines | python | def run_job(self, yaml_spec, timeout=600):
' '
(pod_name, succ) = self._create_k8s_job(yaml_spec)
if (not succ):
return False
succ = self._wait_for_k8s_job(pod_name, yaml_spec, timeout)
if (not succ):
logging.info('Kubernetes job failed.')
return False
self._delete_k8s_j... |
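The `run_job` row shows a create -> wait -> delete orchestration; a dependency-free sketch of that flow, with `helper` standing in for the K8sHelper instance (an assumption, since only fragments of the class appear here) and a test double so it can run without a cluster:

```python
import logging


def run_job(helper, yaml_spec, timeout=600):
    # Mirrors the visible flow: create the pod, wait for completion,
    # and clean up on success; a failed wait is logged and reported.
    pod_name, succ = helper._create_k8s_job(yaml_spec)
    if not succ:
        return False
    succ = helper._wait_for_k8s_job(pod_name, yaml_spec, timeout)
    if not succ:
        logging.info('Kubernetes job failed.')
        return False
    helper._delete_k8s_job(pod_name, yaml_spec)
    return True


class FakeHelper:
    """Test double: records calls instead of talking to Kubernetes."""
    def __init__(self, wait_ok=True):
        self.wait_ok = wait_ok
        self.deleted = False

    def _create_k8s_job(self, yaml_spec):
        return 'pod-123', True

    def _wait_for_k8s_job(self, pod_name, yaml_spec, timeout):
        return self.wait_ok

    def _delete_k8s_job(self, pod_name, yaml_spec):
        self.deleted = True
```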
@staticmethod
def sanitize_k8s_name(name):
'From _make_kubernetes_name\n sanitize_k8s_name cleans and converts the names in the workflow.\n '
return re.sub('-+', '-', re.sub('[^-0-9a-z]+', '-', name.lower())).lstrip('-').rstrip('-') | -6,757,738,004,173,168,000 | From _make_kubernetes_name
sanitize_k8s_name cleans and converts the names in the workflow. | sdk/python/kfp/compiler/_k8s_helper.py | sanitize_k8s_name | JohnPaton/pipelines | python | @staticmethod
def sanitize_k8s_name(name):
'From _make_kubernetes_name\n sanitize_k8s_name cleans and converts the names in the workflow.\n '
return re.sub('-+', '-', re.sub('[^-0-9a-z]+', '-', name.lower())).lstrip('-').rstrip('-') |
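The `sanitize_k8s_name` row above contains the full regex pipeline, so it can be reproduced verbatim and exercised: lowercase, replace runs of invalid characters with `-`, collapse repeated dashes, then trim leading/trailing dashes.

```python
import re


def sanitize_k8s_name(name):
    # Lowercase, replace runs of characters outside [-0-9a-z] with '-',
    # collapse repeated dashes, and strip dashes from both ends.
    return re.sub('-+', '-',
                  re.sub('[^-0-9a-z]+', '-', name.lower())).lstrip('-').rstrip('-')
```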
@staticmethod
def convert_k8s_obj_to_json(k8s_obj):
'\n Builds a JSON K8s object.\n\n If obj is None, return None.\n If obj is str, int, long, float, bool, return directly.\n If obj is datetime.datetime, datetime.date\n convert to string in iso8601 format.\n If obj is list, sanitize each eleme... | -8,150,503,951,591,003,000 | Builds a JSON K8s object.
If obj is None, return None.
If obj is str, int, long, float, bool, return directly.
If obj is datetime.datetime, datetime.date
convert to string in iso8601 format.
If obj is list, sanitize each element in the list.
If obj is dict, return the dict.
If obj is swagger model, return the prop... | sdk/python/kfp/compiler/_k8s_helper.py | convert_k8s_obj_to_json | JohnPaton/pipelines | python | @staticmethod
def convert_k8s_obj_to_json(k8s_obj):
'\n Builds a JSON K8s object.\n\n If obj is None, return None.\n If obj is str, int, long, float, bool, return directly.\n If obj is datetime.datetime, datetime.date\n convert to string in iso8601 format.\n If obj is list, sanitize each eleme... |
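The docstring of `convert_k8s_obj_to_json` spells out the recursion rules; a simplified sketch covering the primitive, date, list, and dict cases (the Swagger-model branch, which walks the model's attribute map, is omitted here):

```python
import datetime

_PRIMITIVES = (str, int, float, bool)


def to_json_obj(obj):
    """Simplified recursive sanitisation following the rules above."""
    if obj is None:
        return None
    if isinstance(obj, _PRIMITIVES):
        return obj
    if isinstance(obj, (datetime.datetime, datetime.date)):
        # Dates become ISO 8601 strings.
        return obj.isoformat()
    if isinstance(obj, list):
        return [to_json_obj(item) for item in obj]
    if isinstance(obj, dict):
        return {key: to_json_obj(value) for key, value in obj.items()}
    # Swagger/K8s models would be converted via their attribute_map here.
    raise TypeError('unsupported type: %r' % type(obj))
```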
def encrypt_payload(secret_key, payload):
'Return a encrypted payload given a key and dictionary of data.'
try:
from nacl.secret import SecretBox
from nacl.encoding import Base64Encoder
except (ImportError, OSError):
pytest.skip('libnacl/libsodium is not installed')
return
... | 1,771,727,756,332,680,400 | Return a encrypted payload given a key and dictionary of data. | tests/components/mobile_app/test_webhook.py | encrypt_payload | Bonnee/core | python | def encrypt_payload(secret_key, payload):
try:
from nacl.secret import SecretBox
from nacl.encoding import Base64Encoder
except (ImportError, OSError):
pytest.skip('libnacl/libsodium is not installed')
return
import json
keylen = SecretBox.KEY_SIZE
prepped_key = ... |
def decrypt_payload(secret_key, encrypted_data):
'Return a decrypted payload given a key and a string of encrypted data.'
try:
from nacl.secret import SecretBox
from nacl.encoding import Base64Encoder
except (ImportError, OSError):
pytest.skip('libnacl/libsodium is not installed')
... | 1,057,821,506,854,998,300 | Return a decrypted payload given a key and a string of encrypted data. | tests/components/mobile_app/test_webhook.py | decrypt_payload | Bonnee/core | python | def decrypt_payload(secret_key, encrypted_data):
try:
from nacl.secret import SecretBox
from nacl.encoding import Base64Encoder
except (ImportError, OSError):
pytest.skip('libnacl/libsodium is not installed')
return
import json
keylen = SecretBox.KEY_SIZE
prepped... |
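Both helpers above normalise the secret to `SecretBox.KEY_SIZE` bytes before building the box; a dependency-free sketch of that key-prepping step, assuming the libsodium key size of 32 bytes and a truncate-then-zero-pad scheme:

```python
KEY_SIZE = 32  # nacl.secret.SecretBox.KEY_SIZE (XSalsa20-Poly1305)


def prep_key(secret_key):
    # Truncate to KEY_SIZE bytes, then zero-pad short keys on the right,
    # so the result is always exactly KEY_SIZE bytes long.
    key = secret_key.encode('utf-8')[:KEY_SIZE]
    return key.ljust(KEY_SIZE, b'\0')
```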
async def test_webhook_handle_render_template(create_registrations, webhook_client):
'Test that we render templates properly.'
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json=RENDER_TEMPLATE))
assert (resp.status == 200)
json = (await resp.json())
... | -2,946,827,424,843,341,000 | Test that we render templates properly. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_render_template | Bonnee/core | python | async def test_webhook_handle_render_template(create_registrations, webhook_client):
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json=RENDER_TEMPLATE))
assert (resp.status == 200)
json = (await resp.json())
assert (json == {'one': 'Hello world'... |
async def test_webhook_handle_call_services(hass, create_registrations, webhook_client):
'Test that we call services properly.'
calls = async_mock_service(hass, 'test', 'mobile_app')
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json=CALL_SERVICE))
as... | -3,661,124,779,861,039,600 | Test that we call services properly. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_call_services | Bonnee/core | python | async def test_webhook_handle_call_services(hass, create_registrations, webhook_client):
calls = async_mock_service(hass, 'test', 'mobile_app')
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json=CALL_SERVICE))
assert (resp.status == 200)
assert (... |
async def test_webhook_handle_fire_event(hass, create_registrations, webhook_client):
'Test that we can fire events.'
events = []
@callback
def store_event(event):
'Helper to store events.'
events.append(event)
hass.bus.async_listen('test_event', store_event)
resp = (await webho... | -6,889,423,410,323,974,000 | Test that we can fire events. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_fire_event | Bonnee/core | python | async def test_webhook_handle_fire_event(hass, create_registrations, webhook_client):
events = []
@callback
def store_event(event):
'Helper to store events.'
events.append(event)
hass.bus.async_listen('test_event', store_event)
resp = (await webhook_client.post('/api/webhook/{}... |
async def test_webhook_update_registration(webhook_client, authed_api_client):
'Test that we can update an existing registration via webhook.'
register_resp = (await authed_api_client.post('/api/mobile_app/registrations', json=REGISTER_CLEARTEXT))
assert (register_resp.status == 201)
register_json = (... | -5,394,532,641,275,007,000 | Test that we can update an existing registration via webhook. | tests/components/mobile_app/test_webhook.py | test_webhook_update_registration | Bonnee/core | python | async def test_webhook_update_registration(webhook_client, authed_api_client):
register_resp = (await authed_api_client.post('/api/mobile_app/registrations', json=REGISTER_CLEARTEXT))
assert (register_resp.status == 201)
register_json = (await register_resp.json())
webhook_id = register_json[CONF_W... |
async def test_webhook_handle_get_zones(hass, create_registrations, webhook_client):
'Test that we can get zones properly.'
(await async_setup_component(hass, ZONE_DOMAIN, {ZONE_DOMAIN: {}}))
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'ge... | -8,149,553,562,526,938,000 | Test that we can get zones properly. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_get_zones | Bonnee/core | python | async def test_webhook_handle_get_zones(hass, create_registrations, webhook_client):
(await async_setup_component(hass, ZONE_DOMAIN, {ZONE_DOMAIN: {}}))
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'get_zones'}))
assert (resp.status ==... |
async def test_webhook_handle_get_config(hass, create_registrations, webhook_client):
'Test that we can get config properly.'
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'get_config'}))
assert (resp.status == 200)
json = (await resp.js... | -5,016,961,611,160,766,000 | Test that we can get config properly. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_get_config | Bonnee/core | python | async def test_webhook_handle_get_config(hass, create_registrations, webhook_client):
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'get_config'}))
assert (resp.status == 200)
json = (await resp.json())
if ('components' in json):
... |
async def test_webhook_returns_error_incorrect_json(webhook_client, create_registrations, caplog):
'Test that an error is returned when JSON is invalid.'
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), data='not json'))
assert (resp.status == 400)
json ... | 4,461,940,669,026,047,000 | Test that an error is returned when JSON is invalid. | tests/components/mobile_app/test_webhook.py | test_webhook_returns_error_incorrect_json | Bonnee/core | python | async def test_webhook_returns_error_incorrect_json(webhook_client, create_registrations, caplog):
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), data='not json'))
assert (resp.status == 400)
json = (await resp.json())
assert (json == {})
asse... |
async def test_webhook_handle_decryption(webhook_client, create_registrations):
'Test that we can encrypt/decrypt properly.'
key = create_registrations[0]['secret']
data = encrypt_payload(key, RENDER_TEMPLATE['data'])
container = {'type': 'render_template', 'encrypted': True, 'encrypted_data': data}
... | -8,089,620,602,666,996,000 | Test that we can encrypt/decrypt properly. | tests/components/mobile_app/test_webhook.py | test_webhook_handle_decryption | Bonnee/core | python | async def test_webhook_handle_decryption(webhook_client, create_registrations):
key = create_registrations[0]['secret']
data = encrypt_payload(key, RENDER_TEMPLATE['data'])
container = {'type': 'render_template', 'encrypted': True, 'encrypted_data': data}
resp = (await webhook_client.post('/api/web... |
async def test_webhook_requires_encryption(webhook_client, create_registrations):
'Test that encrypted registrations only accept encrypted data.'
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[0]['webhook_id']), json=RENDER_TEMPLATE))
assert (resp.status == 400)
webhook_... | 804,871,898,392,491,100 | Test that encrypted registrations only accept encrypted data. | tests/components/mobile_app/test_webhook.py | test_webhook_requires_encryption | Bonnee/core | python | async def test_webhook_requires_encryption(webhook_client, create_registrations):
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[0]['webhook_id']), json=RENDER_TEMPLATE))
assert (resp.status == 400)
webhook_json = (await resp.json())
assert ('error' in webhook_json)... |
async def test_webhook_update_location(hass, webhook_client, create_registrations):
'Test that location can be updated.'
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'update_location', 'data': {'gps': [1, 2], 'gps_accuracy': 10, 'altitude': (- ... | 4,189,325,734,411,630,600 | Test that location can be updated. | tests/components/mobile_app/test_webhook.py | test_webhook_update_location | Bonnee/core | python | async def test_webhook_update_location(hass, webhook_client, create_registrations):
resp = (await webhook_client.post('/api/webhook/{}'.format(create_registrations[1]['webhook_id']), json={'type': 'update_location', 'data': {'gps': [1, 2], 'gps_accuracy': 10, 'altitude': (- 10)}}))
assert (resp.status == 2... |
async def test_webhook_enable_encryption(hass, webhook_client, create_registrations):
'Test that encryption can be added to a reg initially created without.'
webhook_id = create_registrations[1]['webhook_id']
enable_enc_resp = (await webhook_client.post(f'/api/webhook/{webhook_id}', json={'type': 'enable_en... | 999,848,620,369,200,100 | Test that encryption can be added to a reg initially created without. | tests/components/mobile_app/test_webhook.py | test_webhook_enable_encryption | Bonnee/core | python | async def test_webhook_enable_encryption(hass, webhook_client, create_registrations):
webhook_id = create_registrations[1]['webhook_id']
enable_enc_resp = (await webhook_client.post(f'/api/webhook/{webhook_id}', json={'type': 'enable_encryption'}))
assert (enable_enc_resp.status == 200)
enable_enc_... |
async def test_webhook_camera_stream_non_existent(hass, create_registrations, webhook_client):
'Test fetching camera stream URLs for a non-existent camera.'
webhook_id = create_registrations[1]['webhook_id']
resp = (await webhook_client.post(f'/api/webhook/{webhook_id}', json={'type': 'stream_camera', 'data... | -8,410,440,844,275,927,000 | Test fetching camera stream URLs for a non-existent camera. | tests/components/mobile_app/test_webhook.py | test_webhook_camera_stream_non_existent | Bonnee/core | python | async def test_webhook_camera_stream_non_existent(hass, create_registrations, webhook_client):
webhook_id = create_registrations[1]['webhook_id']
resp = (await webhook_client.post(f'/api/webhook/{webhook_id}', json={'type': 'stream_camera', 'data': {'camera_entity_id': 'camera.doesnt_exist'}}))
assert ... |
async def test_webhook_camera_stream_non_hls(hass, create_registrations, webhook_client):
'Test fetching camera stream URLs for a non-HLS/stream-supporting camera.'
hass.states.async_set('camera.non_stream_camera', 'idle', {'supported_features': 0})
webhook_id = create_registrations[1]['webhook_id']
res... | -4,140,335,046,916,990,000 | Test fetching camera stream URLs for a non-HLS/stream-supporting camera. | tests/components/mobile_app/test_webhook.py | test_webhook_camera_stream_non_hls | Bonnee/core | python | async def test_webhook_camera_stream_non_hls(hass, create_registrations, webhook_client):
hass.states.async_set('camera.non_stream_camera', 'idle', {'supported_features': 0})
webhook_id = create_registrations[1]['webhook_id']
resp = (await webhook_client.post(f'/api/webhook/{webhook_id}', json={'type':... |
async def test_webhook_camera_stream_stream_available(hass, create_registrations, webhook_client):
'Test fetching camera stream URLs for an HLS/stream-supporting camera.'
hass.states.async_set('camera.stream_camera', 'idle', {'supported_features': CAMERA_SUPPORT_STREAM})
webhook_id = create_registrations[1]... | 462,953,747,465,132,740 | Test fetching camera stream URLs for an HLS/stream-supporting camera. | tests/components/mobile_app/test_webhook.py | test_webhook_camera_stream_stream_available | Bonnee/core | python | async def test_webhook_camera_stream_stream_available(hass, create_registrations, webhook_client):
hass.states.async_set('camera.stream_camera', 'idle', {'supported_features': CAMERA_SUPPORT_STREAM})
webhook_id = create_registrations[1]['webhook_id']
with patch('homeassistant.components.camera.async_re... |
async def test_webhook_camera_stream_stream_available_but_errors(hass, create_registrations, webhook_client):
'Test fetching camera stream URLs for an HLS/stream-supporting camera but that streaming errors.'
hass.states.async_set('camera.stream_camera', 'idle', {'supported_features': CAMERA_SUPPORT_STREAM})
... | -3,383,425,318,128,153,600 | Test fetching camera stream URLs for an HLS/stream-supporting camera but that streaming errors. | tests/components/mobile_app/test_webhook.py | test_webhook_camera_stream_stream_available_but_errors | Bonnee/core | python | async def test_webhook_camera_stream_stream_available_but_errors(hass, create_registrations, webhook_client):
hass.states.async_set('camera.stream_camera', 'idle', {'supported_features': CAMERA_SUPPORT_STREAM})
webhook_id = create_registrations[1]['webhook_id']
with patch('homeassistant.components.came... |
@callback
def store_event(event):
'Helper to store events.'
events.append(event) | -6,398,689,500,183,424,000 | Helper to store events. | tests/components/mobile_app/test_webhook.py | store_event | Bonnee/core | python | @callback
def store_event(event):
events.append(event) |
def __init__(self, host='localhost', port=8125, max_buffer_size=50):
'Initialize an Offline Connection object.\n\n >>> monascastatsd = MonascaStatsd()\n\n :name: the name for this client. Everything sent by this client\n will be prefixed by name\n :param host: the host of the Monasc... | 6,217,063,500,374,581,000 | Initialize an Offline Connection object.
>>> monascastatsd = MonascaStatsd()
:name: the name for this client. Everything sent by this client
will be prefixed by name
:param host: the host of the MonascaStatsd server.
:param port: the port of the MonascaStatsd server.
:param max_buffer_size: Maximum number of met... | monasca_notification/common/utils.py | __init__ | martinchacon/monasca-notification | python | def __init__(self, host='localhost', port=8125, max_buffer_size=50):
'Initialize an Offline Connection object.\n\n >>> monascastatsd = MonascaStatsd()\n\n :name: the name for this client. Everything sent by this client\n will be prefixed by name\n :param host: the host of the Monasc... |
def connect(self, host, port):
'Avoid connecting to the monascastatsd server.\n\n        '
pass | -6,134,275,743,187,880,000 | Avoid connecting to the monascastatsd server. | monasca_notification/common/utils.py | connect | martinchacon/monasca-notification | python | def connect(self, host, port):
'\n\n '
pass |
def value_iteration(env, gamma, epsilon):
' Solves the shortest path problem using value iteration\n :input town_map env : The town_map environment in which we seek to\n find the shortest path.\n :input float gamma : The discount factor.\n :in... | 1,402,534,937,885,546,500 | Solves the shortest path problem using value iteration
:input town_map env : The town_map environment in which we seek to
find the shortest path.
:input float gamma : The discount factor.
:input float epsilon : accuracy of the value iteration procedure.
:return numpy.ar... | Assignment 2/robbing_banks.py | value_iteration | takeitbillykyle/EL2805-Reinforcement-Learning- | python | def value_iteration(env, gamma, epsilon):
' Solves the shortest path problem using value iteration\n :input town_map env : The town_map environment in which we seek to\n find the shortest path.\n :input float gamma : The discount factor.\n :in... |
def __init__(self, town_map):
' Constructor of the environment town_map.\n '
self.STEP_REWARD = 0
self.BANK_REWARD = 10
self.CAUGHT_REWARD = (- 50)
self.town_map = town_map
self.initial_state = np.array([0, 0, 1, 2])
self.actions = self.__actions()
(self.states, self.map) = self._... | 6,462,737,220,212,363,000 | Constructor of the environment town_map. | Assignment 2/robbing_banks.py | __init__ | takeitbillykyle/EL2805-Reinforcement-Learning- | python | def __init__(self, town_map):
' \n '
self.STEP_REWARD = 0
self.BANK_REWARD = 10
self.CAUGHT_REWARD = (- 50)
self.town_map = town_map
self.initial_state = np.array([0, 0, 1, 2])
self.actions = self.__actions()
(self.states, self.map) = self.__states()
self.n_actions = len(self.... |
def __move(self, state, action):
' Makes a step in the town_map, given a current position and an action.\n If the action STAY or an inadmissible action is used, the robber stays in place.\n\n :return integer next_cell corresponding to position (x,y) x (x,y) on the town_map that agent transitio... | 4,031,673,675,314,286,600 | Makes a step in the town_map, given a current position and an action.
If the action STAY or an inadmissible action is used, the robber stays in place.
:return integer next_cell corresponding to position (x,y) x (x,y) on the town_map that agent transitions to. | Assignment 2/robbing_banks.py | __move | takeitbillykyle/EL2805-Reinforcement-Learning- | python | def __move(self, state, action):
' Makes a step in the town_map, given a current position and an action.\n If the action STAY or an inadmissible action is used, the robber stays in place.\n\n :return integer next_cell corresponding to position (x,y) x (x,y) on the town_map that agent transitio... |
def __police_positions(self, state):
'\n        Input: The state as an int\n        Returns: A list of possible new police positions from current state \n        '
agent_pos = self.states[state][0:2]
police_pos = self.states[state][2:]
diff_pos = np.sign((agent_pos - police_pos))
list_pos ... | 3,609,533,900,443,011,000 | Input: The state as an int
Returns: A list of possible new police positions from current state | Assignment 2/robbing_banks.py | __police_positions | takeitbillykyle/EL2805-Reinforcement-Learning- | python | def __police_positions(self, state):
'\n        Input: The state as an int\n        Returns: A list of possible new police positions from current state \n        '
agent_pos = self.states[state][0:2]
police_pos = self.states[state][2:]
diff_pos = np.sign((agent_pos - police_pos))
list_pos ... |
def __transitions(self):
' Computes the transition probabilities for every state action pair.\n :return numpy.tensor transition probabilities: tensor of transition\n probabilities of dimension S*S*A\n '
dimensions = (self.n_states, self.n_states, self.n_actions)
transition_proba... | 499,550,621,826,146,400 | Computes the transition probabilities for every state action pair.
:return numpy.tensor transition probabilities: tensor of transition
probabilities of dimension S*S*A | Assignment 2/robbing_banks.py | __transitions | takeitbillykyle/EL2805-Reinforcement-Learning- | python | def __transitions(self):
' Computes the transition probabilities for every state action pair.\n :return numpy.tensor transition probabilities: tensor of transition\n probabilities of dimension S*S*A\n '
dimensions = (self.n_states, self.n_states, self.n_actions)
transition_proba... |
@classmethod
def _verify_local_backends(cls):
'\n Return the local backends in `SDK_STANDARD_BACKENDS` that are\n effectively available (as some of them might depend on the presence\n of an optional dependency or on the existence of a binary).\n\n Returns:\n dict[str:BaseBacke... | 8,764,399,197,171,842,000 | Return the local backends in `SDK_STANDARD_BACKENDS` that are
effectively available (as some of them might depend on the presence
of an optional dependency or on the existence of a binary).
Returns:
dict[str:BaseBackend]: a dict of the local backends instances for
the backends that could be instantiated, k... | qiskit/backends/local/localprovider.py | _verify_local_backends | Hosseinyeganeh/qiskit-core | python | @classmethod
def _verify_local_backends(cls):
'\n Return the local backends in `SDK_STANDARD_BACKENDS` that are\n effectively available (as some of them might depend on the presence\n of an optional dependency or on the existence of a binary).\n\n Returns:\n dict[str:BaseBacke... |
@classmethod
def _get_backend_instance(cls, backend_cls):
'\n Return an instance of a backend from its class.\n\n Args:\n backend_cls (class): Backend class.\n Returns:\n BaseBackend: a backend instance.\n Raises:\n QISKitError: if the backend could not b... | 2,972,808,443,166,617,000 | Return an instance of a backend from its class.
Args:
backend_cls (class): Backend class.
Returns:
BaseBackend: a backend instance.
Raises:
QISKitError: if the backend could not be instantiated or does not
provide a valid configuration containing a name. | qiskit/backends/local/localprovider.py | _get_backend_instance | Hosseinyeganeh/qiskit-core | python | @classmethod
def _get_backend_instance(cls, backend_cls):
'\n Return an instance of a backend from its class.\n\n Args:\n backend_cls (class): Backend class.\n Returns:\n BaseBackend: a backend instance.\n Raises:\n QISKitError: if the backend could not b... |
@classmethod
def add_source(cls, source):
'\n A convenience method for downstream modules to add channel\n source types once they have implemented the step in the wizard\n below.\n\n This method must be called from `__setup__` method of downstream\n module.\n '
source_l... | 5,484,621,005,522,392,000 | A convenience method for downstream modules to add channel
source types once they have implemented the step in the wizard
below.
This method must be called from `__setup__` method of downstream
module. | product.py | add_source | aniforprez/trytond-sale-channel | python | @classmethod
def add_source(cls, source):
'\n A convenience method for downstream modules to add channel\n source types once they have implemented the step in the wizard\n below.\n\n This method must be called from `__setup__` method of downstream\n module.\n '
source_l... |
@classmethod
def __setup__(cls):
'\n Setup the class and define constraints\n '
super(TemplateSaleChannelListing, cls).__setup__()
table = cls.__table__()
cls._sql_constraints += [('channel_template_unique', Unique(table, table.channel, table.template_identifier, table.template), 'Product ... | 6,125,404,840,998,321,000 | Setup the class and define constraints | product.py | __setup__ | aniforprez/trytond-sale-channel | python | @classmethod
def __setup__(cls):
'\n \n '
super(TemplateSaleChannelListing, cls).__setup__()
table = cls.__table__()
cls._sql_constraints += [('channel_template_unique', Unique(table, table.channel, table.template_identifier, table.template), 'Product Template is already mapped to this cha... |
@classmethod
def create_from(cls, channel, product_data):
'\n Create the product for the channel\n '
raise NotImplementedError(('create_from is not implemented in product for %s channels' % channel.source)) | -3,979,193,719,550,255,600 | Create the product for the channel | product.py | create_from | aniforprez/trytond-sale-channel | python | @classmethod
def create_from(cls, channel, product_data):
'\n \n '
raise NotImplementedError(('create_from is not implemented in product for %s channels' % channel.source)) |
@classmethod
def get_listing_url(cls, records, name):
'\n Downstream modules should implement this function\n and return a valid url\n '
return dict.fromkeys([r.id for r in records]) | -7,361,309,929,933,427,000 | Downstream modules should implement this function
and return a valid url | product.py | get_listing_url | aniforprez/trytond-sale-channel | python | @classmethod
def get_listing_url(cls, records, name):
'\n Downstream modules should implement this function\n and return a valid url\n '
return dict.fromkeys([r.id for r in records]) |
@classmethod
def __setup__(cls):
'\n Setup the class and define constraints\n '
super(ProductSaleChannelListing, cls).__setup__()
table = cls.__table__()
cls._sql_constraints += [('channel_product_identifier_uniq', Unique(table, table.channel, table.product_identifier), 'This external prod... | -3,085,905,198,177,834,500 | Setup the class and define constraints | product.py | __setup__ | aniforprez/trytond-sale-channel | python | @classmethod
def __setup__(cls):
'\n \n '
super(ProductSaleChannelListing, cls).__setup__()
table = cls.__table__()
cls._sql_constraints += [('channel_product_identifier_uniq', Unique(table, table.channel, table.product_identifier), 'This external product is already mapped with same channe... |
@classmethod
def create_from(cls, channel, product_data):
'\n Create a listing for the product from channel and data\n '
raise NotImplementedError(('create_from is not implemented in channel listing for %s channels' % channel.source)) | 4,653,899,921,623,539,000 | Create a listing for the product from channel and data | product.py | create_from | aniforprez/trytond-sale-channel | python | @classmethod
def create_from(cls, channel, product_data):
'\n \n '
raise NotImplementedError(('create_from is not implemented in channel listing for %s channels' % channel.source)) |
def export_inventory(self):
'\n Export listing.product inventory to listing.channel\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement exporting or call\n super to delegate.\n '
raise NotImplementedError... | 5,280,339,303,968,193,000 | Export listing.product inventory to listing.channel
Since external channels are implemented by downstream modules, it is
the responsibility of those channels to implement exporting or call
super to delegate. | product.py | export_inventory | aniforprez/trytond-sale-channel | python | def export_inventory(self):
'\n Export listing.product inventory to listing.channel\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement exporting or call\n super to delegate.\n '
raise NotImplementedError... |
@classmethod
def export_bulk_inventory(cls, listings):
'\n Export listing.product inventory to listing.channel in bulk\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement bulk exporting for\n respective channels.\n ... | -1,496,527,611,567,674,400 | Export listing.product inventory to listing.channel in bulk
Since external channels are implemented by downstream modules, it is
the responsibility of those channels to implement bulk exporting for
respective channels.
Default behaviour is to export inventory individually. | product.py | export_bulk_inventory | aniforprez/trytond-sale-channel | python | @classmethod
def export_bulk_inventory(cls, listings):
'\n Export listing.product inventory to listing.channel in bulk\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement bulk exporting for\n respective channels.\n ... |
def import_product_image(self):
'\n Import specific product image from external channel based on product\n identifier.\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement importing or call\n super to delegate... | 6,288,277,365,324,206,000 | Import specific product image from external channel based on product
identifier.
Since external channels are implemented by downstream modules, it is
the responsibility of those channels to implement importing or call
super to delegate. | product.py | import_product_image | aniforprez/trytond-sale-channel | python | def import_product_image(self):
'\n Import specific product image from external channel based on product\n identifier.\n\n Since external channels are implemented by downstream modules, it is\n the responsibility of those channels to implement importing or call\n super to delegate... |
def get_availability_context(self):
'\n Allow overriding the context used to compute availability of\n products.\n '
return {'locations': [self.channel.warehouse.id]} | 6,406,106,973,323,258,000 | Allow overriding the context used to compute availability of
products. | product.py | get_availability_context | aniforprez/trytond-sale-channel | python | def get_availability_context(self):
'\n Allow overriding the context used to compute availability of\n products.\n '
return {'locations': [self.channel.warehouse.id]} |
def get_availability(self):
'\n Return the availability of the product for this listing\n '
Product = Pool().get('product.product')
with Transaction().set_context(**self.get_availability_context()):
rv = {'type': 'bucket', 'value': None, 'quantity': None}
if self.product:
... | 5,950,218,621,468,469,000 | Return the availability of the product for this listing | product.py | get_availability | aniforprez/trytond-sale-channel | python | def get_availability(self):
'\n \n '
Product = Pool().get('product.product')
with Transaction().set_context(**self.get_availability_context()):
rv = {'type': 'bucket', 'value': None, 'quantity': None}
if self.product:
product = Product(self.product.id)
r... |
def maxChunksToSorted(self, arr):
'\n :type arr: List[int]\n :rtype: int\n '
stacks = []
for num in arr:
if (not stacks):
stacks.append([num])
elif (num >= stacks[(- 1)][0]):
stacks.append([num])
else:
stacks[(- 1)].append(num)... | 7,987,127,453,734,050,000 | :type arr: List[int]
:rtype: int | p768_max_chunks_to_make_sorted_ii.py | maxChunksToSorted | feigaochn/leetcode | python | def maxChunksToSorted(self, arr):
'\n :type arr: List[int]\n :rtype: int\n '
stacks = []
for num in arr:
if (not stacks):
stacks.append([num])
elif (num >= stacks[(- 1)][0]):
stacks.append([num])
else:
stacks[(- 1)].append(num)... |
def _get_lsp_admin_group_include_any_group_id(self):
'\n Getter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_ad... | -1,830,173,705,894,686,500 | Getter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_admin_group_include_any/lsp_admin_group_include_any_group_id (uint... | pybind/slxos/v17r_1_01a/brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_admin_group_include_any/__init__.py | _get_lsp_admin_group_include_any_group_id | extremenetworks/pybind | python | def _get_lsp_admin_group_include_any_group_id(self):
'\n Getter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_ad... |
def _set_lsp_admin_group_include_any_group_id(self, v, load=False):
'\n Setter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_adm... | -4,791,081,038,477,707,000 | Setter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_admin_group_include_any/lsp_admin_group_include_any_group_id (uint... | pybind/slxos/v17r_1_01a/brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_admin_group/lsp_admin_group_include_any/__init__.py | _set_lsp_admin_group_include_any_group_id | extremenetworks/pybind | python | def _set_lsp_admin_group_include_any_group_id(self, v, load=False):
'\n Setter method for lsp_admin_group_include_any_group_id, mapped from YANG variable /brocade_mpls_rpc/show_mpls_lsp_extensive/output/lsp/show_mpls_lsp_extensive_info/show_mpls_lsp_sec_path_info/sec_path/lsp_sec_path_config_admin_groups/lsp_adm... |
def draw(self):
'Draw the line.\n\n Returns\n -------\n list\n The GUIDs of the created Rhino objects.\n\n '
start = list(self.primitive.start)
end = list(self.primitive.end)
lines = [{'start': start, 'end': end, 'color': self.color, 'name': self.name}]
guids =... | 8,097,840,954,285,600,000 | Draw the line.
Returns
-------
list
The GUIDs of the created Rhino objects. | src/compas_rhino/artists/lineartist.py | draw | KEERTHANAUDAY/compas | python | def draw(self):
'Draw the line.\n\n Returns\n -------\n list\n The GUIDs of the created Rhino objects.\n\n '
start = list(self.primitive.start)
end = list(self.primitive.end)
lines = [{'start': start, 'end': end, 'color': self.color, 'name': self.name}]
guids =... |
@staticmethod
def draw_collection(collection, names=None, colors=None, layer=None, clear=False, add_to_group=False, group_name=None):
'Draw a collection of lines.\n\n Parameters\n ----------\n collection: list of compas.geometry.Line\n A collection of ``Line`` objects.\n names... | 5,524,344,590,940,111,000 | Draw a collection of lines.
Parameters
----------
collection: list of compas.geometry.Line
A collection of ``Line`` objects.
names : list of str, optional
Individual names for the lines.
colors : color or list of color, optional
A color specification for the lines as a single color or a list of individual ... | src/compas_rhino/artists/lineartist.py | draw_collection | KEERTHANAUDAY/compas | python | @staticmethod
def draw_collection(collection, names=None, colors=None, layer=None, clear=False, add_to_group=False, group_name=None):
'Draw a collection of lines.\n\n Parameters\n ----------\n collection: list of compas.geometry.Line\n A collection of ``Line`` objects.\n names... |
def main():
'\n connects the pieces to grab posts from reddit and throw them on twitter\n '
reddit = Reddit()
twitter = Twitter()
tweets = reddit.get_tweets()
print('sending {} tweets'.format(len(tweets)))
for tweet in reddit.get_tweets():
status = twitter.send_tweet(tweet.Primary)... | 3,298,083,492,189,179,000 | connects the pieces to grab posts from reddit and throw them on twitter | tweet_bot.py | main | seanneal/tweetbot | python | def main():
'\n \n '
reddit = Reddit()
twitter = Twitter()
tweets = reddit.get_tweets()
print('sending {} tweets'.format(len(tweets)))
for tweet in reddit.get_tweets():
status = twitter.send_tweet(tweet.Primary)
if tweet.Second:
twitter.send_tweet(tweet.Second, ... |
def get_cython_func_and_vals(self, values: np.ndarray, is_numeric: bool):
'\n Find the appropriate cython function, casting if necessary.\n\n Parameters\n ----------\n values : np.ndarray\n is_numeric : bool\n\n Returns\n -------\n func : callable\n val... | 5,217,492,559,264,406,000 | Find the appropriate cython function, casting if necessary.
Parameters
----------
values : np.ndarray
is_numeric : bool
Returns
-------
func : callable
values : np.ndarray | pandas/core/groupby/ops.py | get_cython_func_and_vals | CuteLemon/pandas | python | def get_cython_func_and_vals(self, values: np.ndarray, is_numeric: bool):
'\n Find the appropriate cython function, casting if necessary.\n\n Parameters\n ----------\n values : np.ndarray\n is_numeric : bool\n\n Returns\n -------\n func : callable\n val... |
def _disallow_invalid_ops(self, dtype: DtypeObj, is_numeric: bool=False):
'\n Check if we can do this operation with our cython functions.\n\n Raises\n ------\n NotImplementedError\n This is either not a valid function for this dtype, or\n valid but not implemented ... | -5,195,850,799,562,637,000 | Check if we can do this operation with our cython functions.
Raises
------
NotImplementedError
This is either not a valid function for this dtype, or
valid but not implemented in cython. | pandas/core/groupby/ops.py | _disallow_invalid_ops | CuteLemon/pandas | python | def _disallow_invalid_ops(self, dtype: DtypeObj, is_numeric: bool=False):
'\n Check if we can do this operation with our cython functions.\n\n Raises\n ------\n NotImplementedError\n This is either not a valid function for this dtype, or\n valid but not implemented ... |
def _get_result_dtype(self, dtype: DtypeObj) -> DtypeObj:
'\n Get the desired dtype of a result based on the\n input dtype and how it was computed.\n\n Parameters\n ----------\n dtype : np.dtype or ExtensionDtype\n Input dtype.\n\n Returns\n -------\n ... | 1,181,172,871,760,055,300 | Get the desired dtype of a result based on the
input dtype and how it was computed.
Parameters
----------
dtype : np.dtype or ExtensionDtype
Input dtype.
Returns
-------
np.dtype or ExtensionDtype
The desired dtype of the result. | pandas/core/groupby/ops.py | _get_result_dtype | CuteLemon/pandas | python | def _get_result_dtype(self, dtype: DtypeObj) -> DtypeObj:
'\n Get the desired dtype of a result based on the\n input dtype and how it was computed.\n\n Parameters\n ----------\n dtype : np.dtype or ExtensionDtype\n Input dtype.\n\n Returns\n -------\n ... |
@final
def _ea_wrap_cython_operation(self, values: ExtensionArray, min_count: int, ngroups: int, comp_ids: np.ndarray, **kwargs) -> ArrayLike:
'\n If we have an ExtensionArray, unwrap, call _cython_operation, and\n re-wrap if appropriate.\n '
if (isinstance(values, BaseMaskedArray) and self... | -3,675,165,534,000,637,000 | If we have an ExtensionArray, unwrap, call _cython_operation, and
re-wrap if appropriate. | pandas/core/groupby/ops.py | _ea_wrap_cython_operation | CuteLemon/pandas | python | @final
def _ea_wrap_cython_operation(self, values: ExtensionArray, min_count: int, ngroups: int, comp_ids: np.ndarray, **kwargs) -> ArrayLike:
'\n If we have an ExtensionArray, unwrap, call _cython_operation, and\n re-wrap if appropriate.\n '
if (isinstance(values, BaseMaskedArray) and self... |
def _reconstruct_ea_result(self, values, res_values):
'\n Construct an ExtensionArray result from an ndarray result.\n '
if isinstance(values.dtype, (BooleanDtype, _IntegerDtype, FloatingDtype, StringDtype)):
dtype = self._get_result_dtype(values.dtype)
cls = dtype.construct_array_... | 3,063,274,066,866,768,000 | Construct an ExtensionArray result from an ndarray result. | pandas/core/groupby/ops.py | _reconstruct_ea_result | CuteLemon/pandas | python | def _reconstruct_ea_result(self, values, res_values):
'\n \n '
if isinstance(values.dtype, (BooleanDtype, _IntegerDtype, FloatingDtype, StringDtype)):
dtype = self._get_result_dtype(values.dtype)
cls = dtype.construct_array_type()
return cls._from_sequence(res_values, dtype... |
@final
def _masked_ea_wrap_cython_operation(self, values: BaseMaskedArray, min_count: int, ngroups: int, comp_ids: np.ndarray, **kwargs) -> BaseMaskedArray:
"\n Equivalent of `_ea_wrap_cython_operation`, but optimized for masked EA's\n and cython algorithms which accept a mask.\n "
orig_val... | -3,460,015,546,438,205,400 | Equivalent of `_ea_wrap_cython_operation`, but optimized for masked EA's
and cython algorithms which accept a mask. | pandas/core/groupby/ops.py | _masked_ea_wrap_cython_operation | CuteLemon/pandas | python | @final
def _masked_ea_wrap_cython_operation(self, values: BaseMaskedArray, min_count: int, ngroups: int, comp_ids: np.ndarray, **kwargs) -> BaseMaskedArray:
"\n Equivalent of `_ea_wrap_cython_operation`, but optimized for masked EA's\n and cython algorithms which accept a mask.\n "
orig_val... |
@final
def cython_operation(self, *, values: ArrayLike, axis: int, min_count: int=(- 1), comp_ids: np.ndarray, ngroups: int, **kwargs) -> ArrayLike:
'\n Call our cython function, with appropriate pre- and post- processing.\n '
if (values.ndim > 2):
raise NotImplementedError('number of dime... | -1,751,327,355,450,369,500 | Call our cython function, with appropriate pre- and post- processing. | pandas/core/groupby/ops.py | cython_operation | CuteLemon/pandas | python | @final
def cython_operation(self, *, values: ArrayLike, axis: int, min_count: int=(- 1), comp_ids: np.ndarray, ngroups: int, **kwargs) -> ArrayLike:
'\n \n '
if (values.ndim > 2):
raise NotImplementedError('number of dimensions is currently limited to 2')
elif (values.ndim == 2):
... |
def get_iterator(self, data: NDFrameT, axis: int=0) -> Iterator[tuple[(Hashable, NDFrameT)]]:
'\n Groupby iterator\n\n Returns\n -------\n Generator yielding sequence of (name, subsetted object)\n for each group\n '
splitter = self._get_splitter(data, axis=axis)
key... | -4,732,774,640,510,437,000 | Groupby iterator
Returns
-------
Generator yielding sequence of (name, subsetted object)
for each group | pandas/core/groupby/ops.py | get_iterator | CuteLemon/pandas | python | def get_iterator(self, data: NDFrameT, axis: int=0) -> Iterator[tuple[(Hashable, NDFrameT)]]:
'\n Groupby iterator\n\n Returns\n -------\n Generator yielding sequence of (name, subsetted object)\n for each group\n '
splitter = self._get_splitter(data, axis=axis)
key... |
@final
def _get_splitter(self, data: NDFrame, axis: int=0) -> DataSplitter:
'\n Returns\n -------\n Generator yielding subsetted objects\n\n __finalize__ has not been called for the subsetted objects returned.\n '
(ids, _, ngroups) = self.group_info
return get_splitter(dat... | -6,953,872,263,223,836,000 | Returns
-------
Generator yielding subsetted objects
__finalize__ has not been called for the subsetted objects returned. | pandas/core/groupby/ops.py | _get_splitter | CuteLemon/pandas | python | @final
def _get_splitter(self, data: NDFrame, axis: int=0) -> DataSplitter:
'\n Returns\n -------\n Generator yielding subsetted objects\n\n __finalize__ has not been called for the subsetted objects returned.\n '
(ids, _, ngroups) = self.group_info
return get_splitter(dat... |
def _get_grouper(self):
"\n We are a grouper as part of another's groupings.\n\n We have a specific method of grouping, so cannot\n convert to a Index for our grouper.\n "
return self.groupings[0].grouping_vector | 3,283,307,727,770,842,000 | We are a grouper as part of another's groupings.
We have a specific method of grouping, so cannot
convert to a Index for our grouper. | pandas/core/groupby/ops.py | _get_grouper | CuteLemon/pandas | python | def _get_grouper(self):
"\n We are a grouper as part of another's groupings.\n\n We have a specific method of grouping, so cannot\n convert to a Index for our grouper.\n "
return self.groupings[0].grouping_vector |
@cache_readonly
def indices(self) -> dict[(Hashable, npt.NDArray[np.intp])]:
'dict {group name -> group indices}'
if ((len(self.groupings) == 1) and isinstance(self.result_index, CategoricalIndex)):
return self.groupings[0].indices
codes_list = [ping.codes for ping in self.groupings]
keys = [pin... | 3,842,011,190,711,266,000 | dict {group name -> group indices} | pandas/core/groupby/ops.py | indices | CuteLemon/pandas | python | @cache_readonly
def indices(self) -> dict[(Hashable, npt.NDArray[np.intp])]:
if ((len(self.groupings) == 1) and isinstance(self.result_index, CategoricalIndex)):
return self.groupings[0].indices
codes_list = [ping.codes for ping in self.groupings]
keys = [ping.group_index for ping in self.group... |
@final
def size(self) -> Series:
'\n Compute group sizes.\n '
(ids, _, ngroups) = self.group_info
if ngroups:
out = np.bincount(ids[(ids != (- 1))], minlength=ngroups)
else:
out = []
return Series(out, index=self.result_index, dtype='int64') | 8,537,188,025,531,969,000 | Compute group sizes. | pandas/core/groupby/ops.py | size | CuteLemon/pandas | python | @final
def size(self) -> Series:
'\n \n '
(ids, _, ngroups) = self.group_info
if ngroups:
out = np.bincount(ids[(ids != (- 1))], minlength=ngroups)
else:
out = []
return Series(out, index=self.result_index, dtype='int64') |
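The `size` body above counts group members with `np.bincount` over integer group codes, skipping the `-1` sentinel. A hedged, stdlib-only mirror of that computation (the `ids`/`ngroups` names follow the snippet; the helper itself is illustrative, not pandas API):

```python
from collections import Counter

# Hedged sketch: count how many rows fall in each of `ngroups` groups,
# where ids[i] is the group code of row i and -1 marks excluded rows,
# mirroring the bincount-based size computation shown above.
def group_sizes(ids, ngroups):
    counts = Counter(i for i in ids if i != -1)
    return [counts.get(g, 0) for g in range(ngroups)]

print(group_sizes([0, 1, 0, -1, 2, 0], 3))  # → [3, 1, 1]
```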
@cache_readonly
def groups(self) -> dict[(Hashable, np.ndarray)]:
'dict {group name -> group labels}'
if (len(self.groupings) == 1):
return self.groupings[0].groups
else:
to_groupby = zip(*(ping.grouping_vector for ping in self.groupings))
index = Index(to_groupby)
return sel... | -2,508,148,584,364,830,000 | dict {group name -> group labels} | pandas/core/groupby/ops.py | groups | CuteLemon/pandas | python | @cache_readonly
def groups(self) -> dict[(Hashable, np.ndarray)]:
if (len(self.groupings) == 1):
return self.groupings[0].groups
else:
to_groupby = zip(*(ping.grouping_vector for ping in self.groupings))
index = Index(to_groupby)
return self.axis.groupby(index) |
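The `groups` property's docstring describes a `{group name -> group labels}` mapping. A hedged plain-Python sketch of building such a mapping (the helper is illustrative, not the pandas implementation):

```python
# Hedged sketch: pair each row label with its group key and collect
# labels per key, producing the dict shape the docstring describes.
def build_groups(keys, labels):
    out = {}
    for key, label in zip(keys, labels):
        out.setdefault(key, []).append(label)
    return out

print(build_groups(["x", "y", "x"], [0, 1, 2]))  # → {'x': [0, 2], 'y': [1]}
```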
@final
@cache_readonly
def result_arraylike(self) -> ArrayLike:
'\n Analogous to result_index, but returning an ndarray/ExtensionArray\n allowing us to retain ExtensionDtypes not supported by Index.\n '
if (len(self.groupings) == 1):
return self.groupings[0].group_arraylike
retu... | -8,581,127,941,794,736,000 | Analogous to result_index, but returning an ndarray/ExtensionArray
allowing us to retain ExtensionDtypes not supported by Index. | pandas/core/groupby/ops.py | result_arraylike | CuteLemon/pandas | python | @final
@cache_readonly
def result_arraylike(self) -> ArrayLike:
'\n Analogous to result_index, but returning an ndarray/ExtensionArray\n allowing us to retain ExtensionDtypes not supported by Index.\n '
if (len(self.groupings) == 1):
return self.groupings[0].group_arraylike
retu... |
@final
def _cython_operation(self, kind: str, values, how: str, axis: int, min_count: int=(- 1), **kwargs) -> ArrayLike:
'\n Returns the values of a cython operation.\n '
assert (kind in ['transform', 'aggregate'])
cy_op = WrappedCythonOp(kind=kind, how=how)
(ids, _, _) = self.group_info
... | -6,726,947,257,547,295,000 | Returns the values of a cython operation. | pandas/core/groupby/ops.py | _cython_operation | CuteLemon/pandas | python | @final
def _cython_operation(self, kind: str, values, how: str, axis: int, min_count: int=(- 1), **kwargs) -> ArrayLike:
'\n \n '
assert (kind in ['transform', 'aggregate'])
cy_op = WrappedCythonOp(kind=kind, how=how)
(ids, _, _) = self.group_info
ngroups = self.ngroups
return cy_o... |
@final
def agg_series(self, obj: Series, func: Callable, preserve_dtype: bool=False) -> ArrayLike:
'\n Parameters\n ----------\n obj : Series\n func : function taking a Series and returning a scalar-like\n preserve_dtype : bool\n Whether the aggregation is known to be d... | 421,957,958,651,067,140 | Parameters
----------
obj : Series
func : function taking a Series and returning a scalar-like
preserve_dtype : bool
Whether the aggregation is known to be dtype-preserving.
Returns
-------
np.ndarray or ExtensionArray | pandas/core/groupby/ops.py | agg_series | CuteLemon/pandas | python | @final
def agg_series(self, obj: Series, func: Callable, preserve_dtype: bool=False) -> ArrayLike:
'\n Parameters\n ----------\n obj : Series\n func : function taking a Series and returning a scalar-like\n preserve_dtype : bool\n Whether the aggregation is known to be d... |
@cache_readonly
def groups(self):
'dict {group name -> group labels}'
result = {key: value for (key, value) in zip(self.binlabels, self.bins) if (key is not NaT)}
return result | -5,176,918,563,581,249,000 | dict {group name -> group labels} | pandas/core/groupby/ops.py | groups | CuteLemon/pandas | python | @cache_readonly
def groups(self):
result = {key: value for (key, value) in zip(self.binlabels, self.bins) if (key is not NaT)}
return result |
def _get_grouper(self):
"\n We are a grouper as part of another's groupings.\n\n We have a specific method of grouping, so cannot\n convert to a Index for our grouper.\n "
return self | 7,450,340,969,822,096,000 | We are a grouper as part of another's groupings.
We have a specific method of grouping, so cannot
convert to a Index for our grouper. | pandas/core/groupby/ops.py | _get_grouper | CuteLemon/pandas | python | def _get_grouper(self):
"\n We are a grouper as part of another's groupings.\n\n We have a specific method of grouping, so cannot\n convert to a Index for our grouper.\n "
return self |
def get_iterator(self, data: NDFrame, axis: int=0):
'\n Groupby iterator\n\n Returns\n -------\n Generator yielding sequence of (name, subsetted object)\n for each group\n '
if (axis == 0):
slicer = (lambda start, edge: data.iloc[start:edge])
else:
s... | 4,232,520,393,238,952,000 | Groupby iterator
Returns
-------
Generator yielding sequence of (name, subsetted object)
for each group | pandas/core/groupby/ops.py | get_iterator | CuteLemon/pandas | python | def get_iterator(self, data: NDFrame, axis: int=0):
'\n Groupby iterator\n\n Returns\n -------\n Generator yielding sequence of (name, subsetted object)\n for each group\n '
if (axis == 0):
slicer = (lambda start, edge: data.iloc[start:edge])
else:
s... |
@tf_export('math.argmax', 'argmax', v1=[])
def argmax_v2(input, axis=None, output_type=dtypes.int64, name=None):
'Returns the index with the largest value across axes of a tensor.\n\n Note that in case of ties the identity of the return value is not guaranteed.\n\n Args:\n input: A `Tensor`. Must be one of the... | -3,456,371,538,898,410,000 | Returns the index with the largest value across axes of a tensor.
Note that in case of ties the identity of the return value is not guaranteed.
Args:
input: A `Tensor`. Must be one of the following types: `float32`, `float64`,
`int32`, `uint8`, `int16`, `int8`, `complex64`, `int64`, `qint8`, `quint8`,
`qint32`,... | tensorflow/python/ops/math_ops.py | argmax_v2 | minminsun/tensorflow | python | @tf_export('math.argmax', 'argmax', v1=[])
def argmax_v2(input, axis=None, output_type=dtypes.int64, name=None):
'Returns the index with the largest value across axes of a tensor.\n\n Note that in case of ties the identity of the return value is not guaranteed.\n\n Args:\n input: A `Tensor`. Must be one of the... |
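The `argmax_v2` docstring notes that ties leave the returned index unspecified in TensorFlow. A pure-Python sketch of the same idea (here, unlike the TF caveat, `max` deterministically returns the first maximal index):

```python
# Hedged sketch: index of the largest value in a 1-D sequence.
# Python's max() breaks ties by returning the first maximal index.
def argmax(values):
    return max(range(len(values)), key=values.__getitem__)

print(argmax([1, 10, 26.9, 2.8, 166.32, 62.3]))  # → 4
```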
@tf_export('math.argmin', 'argmin', v1=[])
def argmin_v2(input, axis=None, output_type=dtypes.int64, name=None):
'Returns the index with the smallest value across axes of a tensor.\n\n Note that in case of ties the identity of the return value is not guaranteed.\n\n Args:\n input: A `Tensor`. Must be one of th... | 327,193,619,100,737,500 | Returns the index with the smallest value across axes of a tensor.
Note that in case of ties the identity of the return value is not guaranteed.
Args:
input: A `Tensor`. Must be one of the following types: `float32`, `float64`,
`int32`, `uint8`, `int16`, `int8`, `complex64`, `int64`, `qint8`, `quint8`,
`qint32`... | tensorflow/python/ops/math_ops.py | argmin_v2 | minminsun/tensorflow | python | @tf_export('math.argmin', 'argmin', v1=[])
def argmin_v2(input, axis=None, output_type=dtypes.int64, name=None):
'Returns the index with the smallest value across axes of a tensor.\n\n Note that in case of ties the identity of the return value is not guaranteed.\n\n Args:\n input: A `Tensor`. Must be one of th... |
@tf_export('math.abs', 'abs')
@dispatch.add_dispatch_support
def abs(x, name=None):
'Computes the absolute value of a tensor.\n\n Given a tensor `x` of complex numbers, this operation returns a tensor of type\n `float32` or `float64` that is the absolute value of each element in `x`. All\n elements in `x` must b... | 5,716,243,541,336,007,000 | Computes the absolute value of a tensor.
Given a tensor `x` of complex numbers, this operation returns a tensor of type
`float32` or `float64` that is the absolute value of each element in `x`. All
elements in `x` must be complex numbers of the form \\(a + bj\\). The
absolute value is computed as \\( \sqrt{a^2 + b^2}\... | tensorflow/python/ops/math_ops.py | abs | minminsun/tensorflow | python | @tf_export('math.abs', 'abs')
@dispatch.add_dispatch_support
def abs(x, name=None):
'Computes the absolute value of a tensor.\n\n Given a tensor `x` of complex numbers, this operation returns a tensor of type\n `float32` or `float64` that is the absolute value of each element in `x`. All\n elements in `x` must b... |
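The `abs` docstring states that for a complex input \(a + bj\) the result is \(\sqrt{a^2 + b^2}\). Python's builtin `abs` on a `complex` follows the same rule:

```python
import math

# For a complex number a + bj, the absolute value (magnitude) is
# sqrt(a**2 + b**2) — the rule the tf.math.abs docstring describes.
z = 3 + 4j
assert abs(z) == math.sqrt(3**2 + 4**2)
print(abs(z))  # → 5.0
```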
@tf_export('math.divide', 'divide')
@dispatch.add_dispatch_support
def divide(x, y, name=None):
'Computes Python style division of `x` by `y`.'
if (name is not None):
return (DivideDelegateWithName(x, name) / y)
else:
return (x / y) | 1,802,829,701,935,261,400 | Computes Python style division of `x` by `y`. | tensorflow/python/ops/math_ops.py | divide | minminsun/tensorflow | python | @tf_export('math.divide', 'divide')
@dispatch.add_dispatch_support
def divide(x, y, name=None):
if (name is not None):
return (DivideDelegateWithName(x, name) / y)
else:
return (x / y) |
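"Python style division" in the `divide` docstring means true division, matching the `x / y` fallback in the body above — the result is a float even for two integers:

```python
# True division (what the body above falls back to) vs. floor division.
q = 7 / 2
print(q)       # → 3.5
print(7 // 2)  # → 3 (floor division, a different operation)
```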
@deprecation.deprecated('2016-12-30', '`tf.neg(x)` is deprecated, please use `tf.negative(x)` or `-x`')
def _neg(x, name=None):
'Computes numerical negative value element-wise.\n\n I.e., \\(y = -x\\).\n\n Args:\n x: A `Tensor` or `SparseTensor`. Must be one of the following types: `half`,\n `float32`, `fl... | 3,601,977,973,089,695,000 | Computes numerical negative value element-wise.
I.e., \(y = -x\).
Args:
x: A `Tensor` or `SparseTensor`. Must be one of the following types: `half`,
`float32`, `float64`, `int32`, `int64`, `complex64`, `complex128`.
name: A name for the operation (optional).
Returns:
A `Tensor` or `SparseTensor`, respectiv... | tensorflow/python/ops/math_ops.py | _neg | minminsun/tensorflow | python | @deprecation.deprecated('2016-12-30', '`tf.neg(x)` is deprecated, please use `tf.negative(x)` or `-x`')
def _neg(x, name=None):
'Computes numerical negative value element-wise.\n\n I.e., \\(y = -x\\).\n\n Args:\n x: A `Tensor` or `SparseTensor`. Must be one of the following types: `half`,\n `float32`, `fl... |
@tf_export(v1=['math.scalar_mul', 'scalar_mul'])
def scalar_mul(scalar, x, name=None):
'Multiplies a scalar times a `Tensor` or `IndexedSlices` object.\n\n Intended for use in gradient code which might deal with `IndexedSlices`\n objects, which are easy to multiply by a scalar but more expensive to\n multiply wi... | -5,209,761,818,786,379,000 | Multiplies a scalar times a `Tensor` or `IndexedSlices` object.
Intended for use in gradient code which might deal with `IndexedSlices`
objects, which are easy to multiply by a scalar but more expensive to
multiply with arbitrary tensors.
Args:
scalar: A 0-D scalar `Tensor`. Must have known shape.
x: A `Tensor` o... | tensorflow/python/ops/math_ops.py | scalar_mul | minminsun/tensorflow | python | @tf_export(v1=['math.scalar_mul', 'scalar_mul'])
def scalar_mul(scalar, x, name=None):
'Multiplies a scalar times a `Tensor` or `IndexedSlices` object.\n\n Intended for use in gradient code which might deal with `IndexedSlices`\n objects, which are easy to multiply by a scalar but more expensive to\n multiply wi... |
@tf_export('math.pow', 'pow')
@dispatch.add_dispatch_support
def pow(x, y, name=None):
'Computes the power of one value to another.\n\n Given a tensor `x` and a tensor `y`, this operation computes \\\\(x^y\\\\) for\n corresponding elements in `x` and `y`. For example:\n\n ```python\n x = tf.constant([[2, 2], [3... | 2,437,528,458,899,659,300 | Computes the power of one value to another.
Given a tensor `x` and a tensor `y`, this operation computes \\(x^y\\) for
corresponding elements in `x` and `y`. For example:
```python
x = tf.constant([[2, 2], [3, 3]])
y = tf.constant([[8, 16], [2, 3]])
tf.pow(x, y) # [[256, 65536], [9, 27]]
```
Args:
x: A `Tensor` o... | tensorflow/python/ops/math_ops.py | pow | minminsun/tensorflow | python | @tf_export('math.pow', 'pow')
@dispatch.add_dispatch_support
def pow(x, y, name=None):
'Computes the power of one value to another.\n\n Given a tensor `x` and a tensor `y`, this operation computes \\\\(x^y\\\\) for\n corresponding elements in `x` and `y`. For example:\n\n ```python\n x = tf.constant([[2, 2], [3... |
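The `pow` docstring's example values can be checked with plain Python, since the operation is elementwise exponentiation of corresponding entries:

```python
# Elementwise x**y over the docstring's example operands:
# 2**8=256, 2**16=65536, 3**2=9, 3**3=27.
x = [[2, 2], [3, 3]]
y = [[8, 16], [2, 3]]
result = [[a ** b for a, b in zip(rx, ry)] for rx, ry in zip(x, y)]
print(result)  # → [[256, 65536], [9, 27]]
```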
@tf_export('dtypes.complex', 'complex')
@dispatch.add_dispatch_support
def complex(real, imag, name=None):
'Converts two real numbers to a complex number.\n\n Given a tensor `real` representing the real part of a complex number, and a\n tensor `imag` representing the imaginary part of a complex number, this\n op... | 6,540,480,198,212,402,000 | Converts two real numbers to a complex number.
Given a tensor `real` representing the real part of a complex number, and a
tensor `imag` representing the imaginary part of a complex number, this
operation returns complex numbers elementwise of the form \\(a + bj\\), where
*a* represents the `real` part and *b* represe... | tensorflow/python/ops/math_ops.py | complex | minminsun/tensorflow | python | @tf_export('dtypes.complex', 'complex')
@dispatch.add_dispatch_support
def complex(real, imag, name=None):
'Converts two real numbers to a complex number.\n\n Given a tensor `real` representing the real part of a complex number, and a\n tensor `imag` representing the imaginary part of a complex number, this\n op... |
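The same construction the `complex` docstring describes — combining a real part *a* and an imaginary part *b* into \(a + bj\) — exists as Python's builtin `complex` type:

```python
# Build a + bj from separate real and imaginary parts.
real_part = 2.25
imag_part = 4.75
z = complex(real_part, imag_part)
print(z)  # → (2.25+4.75j)
assert z.real == 2.25 and z.imag == 4.75
```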
@tf_export('math.real', v1=['math.real', 'real'])
@deprecation.deprecated_endpoints('real')
@dispatch.add_dispatch_support
def real(input, name=None):
'Returns the real part of a complex (or real) tensor.\n\n Given a tensor `input`, this operation returns a tensor of type `float` that\n is the real part of each e... | 5,528,627,588,415,537,000 | Returns the real part of a complex (or real) tensor.
Given a tensor `input`, this operation returns a tensor of type `float` that
is the real part of each element in `input` considered as a complex number.
For example:
```python
x = tf.constant([-2.25 + 4.75j, 3.25 + 5.75j])
tf.real(x) # [-2.25, 3.25]
```
If `inpu... | tensorflow/python/ops/math_ops.py | real | minminsun/tensorflow | python | @tf_export('math.real', v1=['math.real', 'real'])
@deprecation.deprecated_endpoints('real')
@dispatch.add_dispatch_support
def real(input, name=None):
'Returns the real part of a complex (or real) tensor.\n\n Given a tensor `input`, this operation returns a tensor of type `float` that\n is the real part of each e... |
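The `real` docstring's example can be reproduced with the builtin `.real` attribute of Python complex numbers in place of `tf.real`:

```python
# Extract the real part of each complex element, as in the docstring's
# example: [-2.25 + 4.75j, 3.25 + 5.75j] → [-2.25, 3.25].
xs = [-2.25 + 4.75j, 3.25 + 5.75j]
reals = [z.real for z in xs]
print(reals)  # → [-2.25, 3.25]
```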