# File: clients/compute/lib/google_api/compute/v1/api/packet_mirrorings.ex

# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Compute.V1.Api.PacketMirrorings do
@moduledoc """
API calls for all endpoints tagged `PacketMirrorings`.
"""
alias GoogleApi.Compute.V1.Connection
alias GoogleApi.Gax.{Request, Response}
@library_version Mix.Project.config() |> Keyword.get(:version, "")
@doc """
Retrieves an aggregated list of packetMirrorings.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:filter` (*type:* `String.t`) - A filter expression that filters resources listed in the response. The expression must specify the field name, an operator, and the value that you want to use for filtering. The value must be a string, a number, or a boolean. The operator must be either `=`, `!=`, `>`, `<`, `<=`, `>=` or `:`. For example, if you are filtering Compute Engine instances, you can exclude instances named `example-instance` by specifying `name != example-instance`. The `:` operator can be used with string fields to match substrings. For non-string fields it is equivalent to the `=` operator. The `:*` comparison can be used to test whether a key has been defined. For example, to find all objects with `owner` label use: ``` labels.owner:* ``` You can also filter nested fields. For example, you could specify `scheduling.automaticRestart = false` to include instances only if they are not scheduled for automatic restarts. You can use filtering on nested fields to filter based on resource labels. To filter on multiple expressions, provide each separate expression within parentheses. For example: ``` (scheduling.automaticRestart = true) (cpuPlatform = "Intel Skylake") ``` By default, each expression is an `AND` expression. However, you can include `AND` and `OR` expressions explicitly. For example: ``` (cpuPlatform = "Intel Skylake") OR (cpuPlatform = "Intel Broadwell") AND (scheduling.automaticRestart = true) ```
* `:includeAllScopes` (*type:* `boolean()`) - Indicates whether every visible scope for each scope type (zone, region, global) should be included in the response. For new resource types added after this field, the flag has no effect as new resource types will always include every visible scope for each scope type in response. For resource types which predate this field, if this flag is omitted or false, only scopes of the scope types where the resource type is expected to be found will be included.
* `:maxResults` (*type:* `integer()`) - The maximum number of results per page that should be returned. If the number of available results is larger than `maxResults`, Compute Engine returns a `nextPageToken` that can be used to get the next page of results in subsequent list requests. Acceptable values are `0` to `500`, inclusive. (Default: `500`)
* `:orderBy` (*type:* `String.t`) - Sorts list results by a certain order. By default, results are returned in alphanumerical order based on the resource name. You can also sort results in descending order based on the creation timestamp using `orderBy="creationTimestamp desc"`. This sorts results based on the `creationTimestamp` field in reverse chronological order (newest result first). Use this to sort resources like operations so that the newest operation is returned first. Currently, only sorting by `name` or `creationTimestamp desc` is supported.
* `:pageToken` (*type:* `String.t`) - Specifies a page token to use. Set `pageToken` to the `nextPageToken` returned by a previous list request to get the next page of results.
* `:returnPartialSuccess` (*type:* `boolean()`) - Opt-in for partial success behavior which provides partial results in case of failure. The default value is false.
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.PacketMirroringAggregatedList{}}` on success
* `{:error, info}` on failure
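  ## Example

  A hypothetical call is sketched below; the access token and project ID are
  placeholders, and a real invocation requires valid OAuth 2.0 credentials:

  ```elixir
  # Placeholder token and project ID — substitute real credentials.
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  # Fetch up to 50 mirrorings, aggregated across all scopes of the project.
  {:ok, %GoogleApi.Compute.V1.Model.PacketMirroringAggregatedList{} = list} =
    compute_packet_mirrorings_aggregated_list(conn, "my-project", maxResults: 50)
  ```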
"""
@spec compute_packet_mirrorings_aggregated_list(
Tesla.Env.client(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.PacketMirroringAggregatedList.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_aggregated_list(
connection,
project,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:filter => :query,
:includeAllScopes => :query,
:maxResults => :query,
:orderBy => :query,
:pageToken => :query,
:returnPartialSuccess => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/projects/{project}/aggregated/packetMirrorings", %{
"project" => URI.encode(project, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.Compute.V1.Model.PacketMirroringAggregatedList{}]
)
end
@doc """
Deletes the specified PacketMirroring resource.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - Name of the region for this request.
* `packet_mirroring` (*type:* `String.t`) - Name of the PacketMirroring resource to delete.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:requestId` (*type:* `String.t`) - An optional request ID to identify requests. Specify a unique request ID so that if you must retry your request, the server will know to ignore the request if it has already been completed. For example, consider a situation where you make an initial request and the request times out. If you make the request again with the same request ID, the server can check if original operation with the same request ID was received, and if so, will ignore the second request. This prevents clients from accidentally creating duplicate commitments. The request ID must be a valid UUID with the exception that zero UUID is not supported ( 00000000-0000-0000-0000-000000000000).
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.Operation{}}` on success
* `{:error, info}` on failure
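  ## Example

  A sketch only — the project, region, and resource names are placeholders.
  Supplying a `requestId` makes the delete safe to retry:

  ```elixir
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  # The UUID is an example value; generate a fresh one per logical request.
  {:ok, %GoogleApi.Compute.V1.Model.Operation{} = op} =
    compute_packet_mirrorings_delete(conn, "my-project", "us-central1", "my-mirroring",
      requestId: "f47ac10b-58cc-4372-a567-0e02b2c3d479"
    )
  ```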
"""
@spec compute_packet_mirrorings_delete(
Tesla.Env.client(),
String.t(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.Operation.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_delete(
connection,
project,
region,
packet_mirroring,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:requestId => :query
}
request =
Request.new()
|> Request.method(:delete)
|> Request.url("/projects/{project}/regions/{region}/packetMirrorings/{packetMirroring}", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1),
"packetMirroring" => URI.encode(packet_mirroring, &(URI.char_unreserved?(&1) || &1 == ?/))
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.Operation{}])
end
@doc """
Returns the specified PacketMirroring resource.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - Name of the region for this request.
* `packet_mirroring` (*type:* `String.t`) - Name of the PacketMirroring resource to return.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.PacketMirroring{}}` on success
* `{:error, info}` on failure
"""
@spec compute_packet_mirrorings_get(
Tesla.Env.client(),
String.t(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.PacketMirroring.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_get(
connection,
project,
region,
packet_mirroring,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/projects/{project}/regions/{region}/packetMirrorings/{packetMirroring}", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1),
"packetMirroring" => URI.encode(packet_mirroring, &(URI.char_unreserved?(&1) || &1 == ?/))
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.PacketMirroring{}])
end
@doc """
Creates a PacketMirroring resource in the specified project and region using the data included in the request.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - Name of the region for this request.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:requestId` (*type:* `String.t`) - An optional request ID to identify requests. Specify a unique request ID so that if you must retry your request, the server will know to ignore the request if it has already been completed. For example, consider a situation where you make an initial request and the request times out. If you make the request again with the same request ID, the server can check if original operation with the same request ID was received, and if so, will ignore the second request. This prevents clients from accidentally creating duplicate commitments. The request ID must be a valid UUID with the exception that zero UUID is not supported ( 00000000-0000-0000-0000-000000000000).
* `:body` (*type:* `GoogleApi.Compute.V1.Model.PacketMirroring.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.Operation{}}` on success
* `{:error, info}` on failure
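  ## Example

  A hypothetical resource definition; all field values are placeholders, and the
  nested struct is assumed from the generated model layer:

  ```elixir
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  # Minimal body: only name and network are filled in here.
  mirroring = %GoogleApi.Compute.V1.Model.PacketMirroring{
    name: "my-mirroring",
    network: %GoogleApi.Compute.V1.Model.PacketMirroringNetworkInfo{
      url: "projects/my-project/global/networks/default"
    }
  }

  {:ok, %GoogleApi.Compute.V1.Model.Operation{}} =
    compute_packet_mirrorings_insert(conn, "my-project", "us-central1", body: mirroring)
  ```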
"""
@spec compute_packet_mirrorings_insert(
Tesla.Env.client(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.Operation.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_insert(
connection,
project,
region,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:requestId => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url("/projects/{project}/regions/{region}/packetMirrorings", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.Operation{}])
end
@doc """
Retrieves a list of PacketMirroring resources available to the specified project and region.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - Name of the region for this request.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:filter` (*type:* `String.t`) - A filter expression that filters resources listed in the response. The expression must specify the field name, an operator, and the value that you want to use for filtering. The value must be a string, a number, or a boolean. The operator must be either `=`, `!=`, `>`, `<`, `<=`, `>=` or `:`. For example, if you are filtering Compute Engine instances, you can exclude instances named `example-instance` by specifying `name != example-instance`. The `:` operator can be used with string fields to match substrings. For non-string fields it is equivalent to the `=` operator. The `:*` comparison can be used to test whether a key has been defined. For example, to find all objects with `owner` label use: ``` labels.owner:* ``` You can also filter nested fields. For example, you could specify `scheduling.automaticRestart = false` to include instances only if they are not scheduled for automatic restarts. You can use filtering on nested fields to filter based on resource labels. To filter on multiple expressions, provide each separate expression within parentheses. For example: ``` (scheduling.automaticRestart = true) (cpuPlatform = "Intel Skylake") ``` By default, each expression is an `AND` expression. However, you can include `AND` and `OR` expressions explicitly. For example: ``` (cpuPlatform = "Intel Skylake") OR (cpuPlatform = "Intel Broadwell") AND (scheduling.automaticRestart = true) ```
* `:maxResults` (*type:* `integer()`) - The maximum number of results per page that should be returned. If the number of available results is larger than `maxResults`, Compute Engine returns a `nextPageToken` that can be used to get the next page of results in subsequent list requests. Acceptable values are `0` to `500`, inclusive. (Default: `500`)
* `:orderBy` (*type:* `String.t`) - Sorts list results by a certain order. By default, results are returned in alphanumerical order based on the resource name. You can also sort results in descending order based on the creation timestamp using `orderBy="creationTimestamp desc"`. This sorts results based on the `creationTimestamp` field in reverse chronological order (newest result first). Use this to sort resources like operations so that the newest operation is returned first. Currently, only sorting by `name` or `creationTimestamp desc` is supported.
* `:pageToken` (*type:* `String.t`) - Specifies a page token to use. Set `pageToken` to the `nextPageToken` returned by a previous list request to get the next page of results.
* `:returnPartialSuccess` (*type:* `boolean()`) - Opt-in for partial success behavior which provides partial results in case of failure. The default value is false.
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.PacketMirroringList{}}` on success
* `{:error, info}` on failure
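  ## Example

  A hedged sketch (names are placeholders) showing how `pageToken` chains one
  list call to the next:

  ```elixir
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  {:ok, page1} =
    compute_packet_mirrorings_list(conn, "my-project", "us-central1", maxResults: 100)

  # If more results exist, pass nextPageToken back to fetch the next page.
  if page1.nextPageToken do
    {:ok, _page2} =
      compute_packet_mirrorings_list(conn, "my-project", "us-central1",
        pageToken: page1.nextPageToken
      )
  end
  ```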
"""
@spec compute_packet_mirrorings_list(
Tesla.Env.client(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.PacketMirroringList.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_list(
connection,
project,
region,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:filter => :query,
:maxResults => :query,
:orderBy => :query,
:pageToken => :query,
:returnPartialSuccess => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/projects/{project}/regions/{region}/packetMirrorings", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.PacketMirroringList{}])
end
@doc """
Patches the specified PacketMirroring resource with the data included in the request. This method supports PATCH semantics and uses JSON merge patch format and processing rules.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - Name of the region for this request.
* `packet_mirroring` (*type:* `String.t`) - Name of the PacketMirroring resource to patch.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:requestId` (*type:* `String.t`) - An optional request ID to identify requests. Specify a unique request ID so that if you must retry your request, the server will know to ignore the request if it has already been completed. For example, consider a situation where you make an initial request and the request times out. If you make the request again with the same request ID, the server can check if original operation with the same request ID was received, and if so, will ignore the second request. This prevents clients from accidentally creating duplicate commitments. The request ID must be a valid UUID with the exception that zero UUID is not supported ( 00000000-0000-0000-0000-000000000000).
* `:body` (*type:* `GoogleApi.Compute.V1.Model.PacketMirroring.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.Operation{}}` on success
* `{:error, info}` on failure
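  ## Example

  A sketch of merge-patch semantics: only the fields set on the body struct are
  changed. Names and the `enable` value are placeholder assumptions:

  ```elixir
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  # Disable mirroring without touching any other field of the resource.
  patch = %GoogleApi.Compute.V1.Model.PacketMirroring{enable: "FALSE"}

  {:ok, %GoogleApi.Compute.V1.Model.Operation{}} =
    compute_packet_mirrorings_patch(conn, "my-project", "us-central1", "my-mirroring",
      body: patch
    )
  ```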
"""
@spec compute_packet_mirrorings_patch(
Tesla.Env.client(),
String.t(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.Operation.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_patch(
connection,
project,
region,
packet_mirroring,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:requestId => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:patch)
|> Request.url("/projects/{project}/regions/{region}/packetMirrorings/{packetMirroring}", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1),
"packetMirroring" => URI.encode(packet_mirroring, &(URI.char_unreserved?(&1) || &1 == ?/))
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.Operation{}])
end
@doc """
Returns permissions that a caller has on the specified resource.
## Parameters
* `connection` (*type:* `GoogleApi.Compute.V1.Connection.t`) - Connection to server
* `project` (*type:* `String.t`) - Project ID for this request.
* `region` (*type:* `String.t`) - The name of the region for this request.
* `resource` (*type:* `String.t`) - Name or id of the resource for this request.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:userIp` (*type:* `String.t`) - Legacy name for parameter that has been superseded by `quotaUser`.
* `:body` (*type:* `GoogleApi.Compute.V1.Model.TestPermissionsRequest.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Compute.V1.Model.TestPermissionsResponse{}}` on success
* `{:error, info}` on failure
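  ## Example

  A hypothetical permission check; the permission strings are assumptions based
  on standard Compute Engine IAM naming, not values confirmed by this client:

  ```elixir
  conn = GoogleApi.Compute.V1.Connection.new("oauth2-access-token")

  req = %GoogleApi.Compute.V1.Model.TestPermissionsRequest{
    permissions: ["compute.packetMirrorings.get", "compute.packetMirrorings.delete"]
  }

  # The response echoes back the subset of permissions the caller holds.
  {:ok, %GoogleApi.Compute.V1.Model.TestPermissionsResponse{permissions: _granted}} =
    compute_packet_mirrorings_test_iam_permissions(
      conn,
      "my-project",
      "us-central1",
      "my-mirroring",
      body: req
    )
  ```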
"""
@spec compute_packet_mirrorings_test_iam_permissions(
Tesla.Env.client(),
String.t(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Compute.V1.Model.TestPermissionsResponse.t()}
| {:ok, Tesla.Env.t()}
| {:ok, list()}
| {:error, any()}
def compute_packet_mirrorings_test_iam_permissions(
connection,
project,
region,
resource,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:userIp => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url(
"/projects/{project}/regions/{region}/packetMirrorings/{resource}/testIamPermissions",
%{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"region" => URI.encode(region, &URI.char_unreserved?/1),
"resource" => URI.encode(resource, &URI.char_unreserved?/1)
}
)
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Compute.V1.Model.TestPermissionsResponse{}])
end
end
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.AdSense.V14.Model.AdUnit do
@moduledoc """
## Attributes
* `code` (*type:* `String.t`, *default:* `nil`) - Identity code of this ad unit, not necessarily unique across ad clients.
* `contentAdsSettings` (*type:* `GoogleApi.AdSense.V14.Model.AdUnitContentAdsSettings.t`, *default:* `nil`) - Settings specific to content ads (AFC) and highend mobile content ads (AFMC - deprecated).
* `customStyle` (*type:* `GoogleApi.AdSense.V14.Model.AdStyle.t`, *default:* `nil`) - Custom style information specific to this ad unit.
* `feedAdsSettings` (*type:* `GoogleApi.AdSense.V14.Model.AdUnitFeedAdsSettings.t`, *default:* `nil`) - Settings specific to feed ads (AFF) - deprecated.
* `id` (*type:* `String.t`, *default:* `nil`) - Unique identifier of this ad unit. This should be considered an opaque identifier; it is not safe to rely on it being in any particular format.
* `kind` (*type:* `String.t`, *default:* `adsense#adUnit`) - Kind of resource this is, in this case adsense#adUnit.
* `mobileContentAdsSettings` (*type:* `GoogleApi.AdSense.V14.Model.AdUnitMobileContentAdsSettings.t`, *default:* `nil`) - Settings specific to WAP mobile content ads (AFMC) - deprecated.
* `name` (*type:* `String.t`, *default:* `nil`) - Name of this ad unit.
* `savedStyleId` (*type:* `String.t`, *default:* `nil`) - ID of the saved ad style which holds this ad unit's style information.
  *   `status` (*type:* `String.t`, *default:* `nil`) - Status of this ad unit. Possible values are:
      *   `NEW`: Indicates that the ad unit was created within the last seven days and does not yet have any activity associated with it.
      *   `ACTIVE`: Indicates that there has been activity on this ad unit in the last seven days.
      *   `INACTIVE`: Indicates that there has been no activity on this ad unit in the last seven days.
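
  ## Example

  A sketch of building an ad unit struct by hand; the field values below are
  illustrative, not real AdSense identifiers:

  ```elixir
  ad_unit = %GoogleApi.AdSense.V14.Model.AdUnit{
    code: "ca-pub-1234567890",
    id: "ca-pub-1234567890:1234567890",
    name: "Sidebar ad",
    status: "ACTIVE"
  }
  ```

  In practice these structs are usually produced by decoding API responses rather
  than constructed manually.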
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:code => String.t(),
:contentAdsSettings => GoogleApi.AdSense.V14.Model.AdUnitContentAdsSettings.t(),
:customStyle => GoogleApi.AdSense.V14.Model.AdStyle.t(),
:feedAdsSettings => GoogleApi.AdSense.V14.Model.AdUnitFeedAdsSettings.t(),
:id => String.t(),
:kind => String.t(),
:mobileContentAdsSettings =>
GoogleApi.AdSense.V14.Model.AdUnitMobileContentAdsSettings.t(),
:name => String.t(),
:savedStyleId => String.t(),
:status => String.t()
}
field(:code)
field(:contentAdsSettings, as: GoogleApi.AdSense.V14.Model.AdUnitContentAdsSettings)
field(:customStyle, as: GoogleApi.AdSense.V14.Model.AdStyle)
field(:feedAdsSettings, as: GoogleApi.AdSense.V14.Model.AdUnitFeedAdsSettings)
field(:id)
field(:kind)
field(:mobileContentAdsSettings, as: GoogleApi.AdSense.V14.Model.AdUnitMobileContentAdsSettings)
field(:name)
field(:savedStyleId)
field(:status)
end
defimpl Poison.Decoder, for: GoogleApi.AdSense.V14.Model.AdUnit do
def decode(value, options) do
GoogleApi.AdSense.V14.Model.AdUnit.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.AdSense.V14.Model.AdUnit do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.AnalyticsData.Mixfile do
use Mix.Project
@version "0.10.0"
def project() do
[
app: :google_api_analytics_data,
version: @version,
elixir: "~> 1.6",
      build_embedded: Mix.env() == :prod,
      start_permanent: Mix.env() == :prod,
description: description(),
package: package(),
deps: deps(),
source_url: "https://github.com/googleapis/elixir-google-api/tree/master/clients/analytics_data"
]
end
def application() do
[extra_applications: [:logger]]
end
defp deps() do
[
{:google_gax, "~> 0.4"},
{:ex_doc, "~> 0.16", only: :dev}
]
end
defp description() do
"""
Google Analytics Data API client library. Accesses report data in Google Analytics.
"""
end
defp package() do
[
files: ["lib", "mix.exs", "README*", "LICENSE"],
maintainers: ["Jeff Ching", "Daniel Azuma"],
licenses: ["Apache 2.0"],
links: %{
"GitHub" => "https://github.com/googleapis/elixir-google-api/tree/master/clients/analytics_data",
"Homepage" => "https://developers.google.com/analytics/devguides/reporting/data/v1/"
}
]
end
end
defmodule Reaper.Decoder.Unknown do
@moduledoc """
Decoder implementation that always returns an error tuple
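
  For example, given a dataset with an unsupported source format (the struct below
  is a sketch, not a fully populated `SmartCity.Dataset`):

  ```elixir
  dataset = %SmartCity.Dataset{technical: %{sourceFormat: "xml"}}

  {:error, "", %RuntimeError{message: "xml is an invalid format"}} =
    Reaper.Decoder.Unknown.decode({:file, "data.xml"}, dataset)
  ```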
"""
@behaviour Reaper.Decoder
@impl Reaper.Decoder
def decode({:file, _filename}, %SmartCity.Dataset{technical: %{sourceFormat: other}}) do
{:error, "", %RuntimeError{message: "#{other} is an invalid format"}}
end
@impl Reaper.Decoder
def handle?(_source_format) do
true
end
end
defmodule PlangaWeb.ConversationControllerTest do
use PlangaWeb.ConnCase
alias Planga.Chat
alias Planga.Chat.Conversation
@create_attrs %{remote_id: "some remote_id"}
@update_attrs %{remote_id: "some updated remote_id"}
@invalid_attrs %{remote_id: nil}
def fixture(:conversation) do
{:ok, conversation} = Chat.create_conversation(@create_attrs)
conversation
end
setup %{conn: conn} do
{:ok, conn: put_req_header(conn, "accept", "application/json")}
end
describe "index" do
test "lists all conversations", %{conn: conn} do
conn = get(conn, conversation_path(conn, :index))
assert json_response(conn, 200)["data"] == []
end
end
describe "create conversation" do
test "renders conversation when data is valid", %{conn: conn} do
conn = post(conn, conversation_path(conn, :create), conversation: @create_attrs)
assert %{"id" => id} = json_response(conn, 201)["data"]
conn = get(conn, conversation_path(conn, :show, id))
assert json_response(conn, 200)["data"] == %{"id" => id, "remote_id" => "some remote_id"}
end
test "renders errors when data is invalid", %{conn: conn} do
conn = post(conn, conversation_path(conn, :create), conversation: @invalid_attrs)
assert json_response(conn, 422)["errors"] != %{}
end
end
describe "update conversation" do
setup [:create_conversation]
test "renders conversation when data is valid", %{
conn: conn,
conversation: %Conversation{id: id} = conversation
} do
conn =
put(conn, conversation_path(conn, :update, conversation), conversation: @update_attrs)
assert %{"id" => ^id} = json_response(conn, 200)["data"]
conn = get(conn, conversation_path(conn, :show, id))
assert json_response(conn, 200)["data"] == %{
"id" => id,
"remote_id" => "some updated remote_id"
}
end
test "renders errors when data is invalid", %{conn: conn, conversation: conversation} do
conn =
put(conn, conversation_path(conn, :update, conversation), conversation: @invalid_attrs)
assert json_response(conn, 422)["errors"] != %{}
end
end
describe "delete conversation" do
setup [:create_conversation]
test "deletes chosen conversation", %{conn: conn, conversation: conversation} do
conn = delete(conn, conversation_path(conn, :delete, conversation))
assert response(conn, 204)
assert_error_sent(404, fn ->
get(conn, conversation_path(conn, :show, conversation))
end)
end
end
defp create_conversation(_) do
conversation = fixture(:conversation)
{:ok, conversation: conversation}
end
end
defmodule Cforum.OpenClose do
  @moduledoc """
  Tracks the per-user open/closed state of threads.
  """

  alias Cforum.Repo
alias Cforum.OpenClose.State
def get_open_closed_state(user, thread),
do: Repo.get_by(State, user_id: user.user_id, thread_id: thread.thread_id)
def open_thread(user, thread) do
oc = get_open_closed_state(user, thread)
if oc != nil && oc.state != "open" do
Repo.delete(oc)
else
%State{}
|> State.changeset(%{user_id: user.user_id, thread_id: thread.thread_id, state: "open"})
|> Repo.insert()
end
end
def close_thread(user, thread) do
oc = get_open_closed_state(user, thread)
if oc != nil && oc.state != "closed" do
Repo.delete(oc)
else
%State{}
|> State.changeset(%{user_id: user.user_id, thread_id: thread.thread_id, state: "closed"})
|> Repo.insert()
end
end
end
| 25.53125 | 96 | 0.647491 |
03e82935b524dad90c26eda65ac5a19db430d248 | 89 | ex | Elixir | Chapter11/code/apps/elixir_drip_web/lib/elixir_drip_web/prometheus/endpoint_instrumenter.ex | sthagen/Mastering-Elixir | 1b52ee79afe6b2ae80767a5e55c2be51df3c4c1d | [
"MIT"
] | 28 | 2018-08-09T05:05:29.000Z | 2022-03-14T06:59:07.000Z | Chapter11/code/apps/elixir_drip_web/lib/elixir_drip_web/prometheus/endpoint_instrumenter.ex | sthagen/Mastering-Elixir | 1b52ee79afe6b2ae80767a5e55c2be51df3c4c1d | [
"MIT"
] | 1 | 2019-02-11T09:11:33.000Z | 2019-05-06T06:40:19.000Z | Chapter11/code/apps/elixir_drip_web/lib/elixir_drip_web/prometheus/endpoint_instrumenter.ex | sthagen/Mastering-Elixir | 1b52ee79afe6b2ae80767a5e55c2be51df3c4c1d | [
"MIT"
] | 8 | 2018-08-09T14:53:02.000Z | 2020-12-14T19:31:21.000Z | defmodule ElixirDripWeb.EndpointInstrumenter do
  @moduledoc """
  Instruments Phoenix endpoint calls with Prometheus metrics.
  """

  use Prometheus.PhoenixInstrumenter
end
# WARNING: DO NOT EDIT, AUTO-GENERATED CODE!
# See https://github.com/aws-beam/aws-codegen for more details.
defmodule AWS.CostExplorer do
@moduledoc """
The Cost Explorer API enables you to programmatically query your cost and usage
data.
You can query for aggregated data such as total monthly costs or total daily
usage. You can also query for granular data, such as the number of daily write
operations for Amazon DynamoDB database tables in your production environment.
Service Endpoint
The Cost Explorer API provides the following endpoint:
* `https://ce.us-east-1.amazonaws.com`
For information about costs associated with the Cost Explorer API, see [AWS Cost Management Pricing](http://aws.amazon.com/aws-cost-management/pricing/).
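
  ## Example

  A hedged sketch of querying monthly blended cost. The `client` value is an
  `%AWS.Client{}` configured elsewhere with your credentials, and the request map
  mirrors the wire-level `GetCostAndUsage` parameters:

  ```elixir
  {:ok, result, _http_response} =
    AWS.CostExplorer.get_cost_and_usage(client, %{
      "TimePeriod" => %{"Start" => "2021-01-01", "End" => "2021-02-01"},
      "Granularity" => "MONTHLY",
      "Metrics" => ["BlendedCost"]
    })
  ```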
"""
alias AWS.Client
alias AWS.Request
def metadata do
%AWS.ServiceMetadata{
abbreviation: "AWS Cost Explorer",
api_version: "2017-10-25",
content_type: "application/x-amz-json-1.1",
credential_scope: "us-east-1",
endpoint_prefix: "ce",
global?: true,
protocol: "json",
service_id: "Cost Explorer",
signature_version: "v4",
signing_name: "ce",
target_prefix: "AWSInsightsIndexService"
}
end
@doc """
Creates a new cost anomaly detection monitor with the requested type and monitor
specification.
"""
def create_anomaly_monitor(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "CreateAnomalyMonitor", input, options)
end
@doc """
Adds a subscription to a cost anomaly detection monitor.
You can use each subscription to define subscribers with email or SNS
notifications. Email subscribers can set a dollar threshold and a time frequency
for receiving notifications.
"""
def create_anomaly_subscription(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "CreateAnomalySubscription", input, options)
end
@doc """
Creates a new Cost Category with the requested name and rules.
"""
def create_cost_category_definition(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "CreateCostCategoryDefinition", input, options)
end
@doc """
Deletes a cost anomaly monitor.
"""
def delete_anomaly_monitor(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "DeleteAnomalyMonitor", input, options)
end
@doc """
Deletes a cost anomaly subscription.
"""
def delete_anomaly_subscription(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "DeleteAnomalySubscription", input, options)
end
@doc """
Deletes a Cost Category.
Expenses from this month going forward will no longer be categorized with this
Cost Category.
"""
def delete_cost_category_definition(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "DeleteCostCategoryDefinition", input, options)
end
@doc """
Returns the name, ARN, rules, definition, and effective dates of a Cost Category
that's defined in the account.
You have the option to use `EffectiveOn` to return a Cost Category that is
active on a specific date. If there is no `EffectiveOn` specified, you’ll see a
Cost Category that is effective on the current date. If Cost Category is still
effective, `EffectiveEnd` is omitted in the response.
"""
def describe_cost_category_definition(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "DescribeCostCategoryDefinition", input, options)
end
@doc """
Retrieves all of the cost anomalies detected on your account, during the time
period specified by the `DateInterval` object.
"""
def get_anomalies(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetAnomalies", input, options)
end
@doc """
Retrieves the cost anomaly monitor definitions for your account.
You can filter using a list of cost anomaly monitor Amazon Resource Names
(ARNs).
"""
def get_anomaly_monitors(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetAnomalyMonitors", input, options)
end
@doc """
Retrieves the cost anomaly subscription objects for your account.
You can filter using a list of cost anomaly monitor Amazon Resource Names
(ARNs).
"""
def get_anomaly_subscriptions(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetAnomalySubscriptions", input, options)
end
@doc """
Retrieves cost and usage metrics for your account.
You can specify which cost and usage-related metric, such as `BlendedCosts` or
`UsageQuantity`, that you want the request to return. You can also filter and
group your data by various dimensions, such as `SERVICE` or `AZ`, in a specific
time range. For a complete list of valid dimensions, see the
  [GetDimensionValues](https://docs.aws.amazon.com/aws-cost-management/latest/APIReference/API_GetDimensionValues.html)
  operation. The management account in an organization in AWS Organizations has
  access to all member accounts.
For information about filter limitations, see [Quotas and
restrictions](https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/billing-limits.html)
in the *Billing and Cost Management User Guide*.
"""
def get_cost_and_usage(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetCostAndUsage", input, options)
end
@doc """
Retrieves cost and usage metrics with resources for your account.
You can specify which cost and usage-related metric, such as `BlendedCosts` or
`UsageQuantity`, that you want the request to return. You can also filter and
group your data by various dimensions, such as `SERVICE` or `AZ`, in a specific
time range. For a complete list of valid dimensions, see the
  [GetDimensionValues](https://docs.aws.amazon.com/aws-cost-management/latest/APIReference/API_GetDimensionValues.html)
  operation. The management account in an organization in AWS Organizations has
  access to all member accounts. This API is currently available for the Amazon
Elastic Compute Cloud – Compute service only.
This is an opt-in only feature. You can enable this feature from the Cost
Explorer Settings page. For information on how to access the Settings page, see
[Controlling Access for Cost
Explorer](https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/ce-access.html)
in the *AWS Billing and Cost Management User Guide*.
"""
def get_cost_and_usage_with_resources(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetCostAndUsageWithResources", input, options)
end
@doc """
Retrieves an array of Cost Category names and values incurred cost.
If some Cost Category names and values are not associated with any cost, they
will not be returned by this API.
"""
def get_cost_categories(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetCostCategories", input, options)
end
@doc """
Retrieves a forecast for how much Amazon Web Services predicts that you will
spend over the forecast time period that you select, based on your past costs.
"""
def get_cost_forecast(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetCostForecast", input, options)
end
@doc """
Retrieves all available filter values for a specified filter over a period of
time.
You can search the dimension values for an arbitrary string.
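
  For example, to list the services that incurred cost in a period (parameter names
  follow the wire-level `GetDimensionValues` request; `client` is an `%AWS.Client{}`
  built elsewhere):

  ```elixir
  {:ok, result, _http_response} =
    AWS.CostExplorer.get_dimension_values(client, %{
      "TimePeriod" => %{"Start" => "2021-01-01", "End" => "2021-02-01"},
      "Dimension" => "SERVICE"
    })
  ```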
"""
def get_dimension_values(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetDimensionValues", input, options)
end
@doc """
Retrieves the reservation coverage for your account.
This enables you to see how much of your Amazon Elastic Compute Cloud, Amazon
ElastiCache, Amazon Relational Database Service, or Amazon Redshift usage is
covered by a reservation. An organization's management account can see the
coverage of the associated member accounts. This supports dimensions, Cost
Categories, and nested expressions. For any time period, you can filter data
about reservation usage by the following dimensions:
* AZ
* CACHE_ENGINE
* DATABASE_ENGINE
* DEPLOYMENT_OPTION
* INSTANCE_TYPE
* LINKED_ACCOUNT
* OPERATING_SYSTEM
* PLATFORM
* REGION
* SERVICE
* TAG
* TENANCY
To determine valid values for a dimension, use the `GetDimensionValues`
operation.
"""
def get_reservation_coverage(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetReservationCoverage", input, options)
end
@doc """
Gets recommendations for which reservations to purchase.
These recommendations could help you reduce your costs. Reservations provide a
discounted hourly rate (up to 75%) compared to On-Demand pricing.
AWS generates your recommendations by identifying your On-Demand usage during a
specific time period and collecting your usage into categories that are eligible
for a reservation. After AWS has these categories, it simulates every
combination of reservations in each category of usage to identify the best
number of each type of RI to purchase to maximize your estimated savings.
For example, AWS automatically aggregates your Amazon EC2 Linux, shared tenancy,
and c4 family usage in the US West (Oregon) Region and recommends that you buy
size-flexible regional reservations to apply to the c4 family usage. AWS
recommends the smallest size instance in an instance family. This makes it
easier to purchase a size-flexible RI. AWS also shows the equal number of
normalized units so that you can purchase any instance size that you want. For
this example, your RI recommendation would be for `c4.large` because that is the
smallest size instance in the c4 instance family.
"""
def get_reservation_purchase_recommendation(%Client{} = client, input, options \\ []) do
Request.request_post(
client,
metadata(),
"GetReservationPurchaseRecommendation",
input,
options
)
end
@doc """
Retrieves the reservation utilization for your account.
  The management account in an organization has access to member accounts. You can
filter data by dimensions in a time period. You can use `GetDimensionValues` to
determine the possible dimension values. Currently, you can group only by
`SUBSCRIPTION_ID`.
"""
def get_reservation_utilization(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetReservationUtilization", input, options)
end
@doc """
Creates recommendations that help you save cost by identifying idle and
underutilized Amazon EC2 instances.
Recommendations are generated to either downsize or terminate instances, along
with providing savings detail and metrics. For details on calculation and
function, see [Optimizing Your Cost with Rightsizing Recommendations](https://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/ce-rightsizing.html)
in the *AWS Billing and Cost Management User Guide*.
"""
def get_rightsizing_recommendation(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetRightsizingRecommendation", input, options)
end
@doc """
Retrieves the Savings Plans covered for your account.
This enables you to see how much of your cost is covered by a Savings Plan. An
organization’s management account can see the coverage of the associated member
accounts. This supports dimensions, Cost Categories, and nested expressions. For
any time period, you can filter data for Savings Plans usage with the following
dimensions:
* `LINKED_ACCOUNT`
* `REGION`
* `SERVICE`
* `INSTANCE_FAMILY`
To determine valid values for a dimension, use the `GetDimensionValues`
operation.
"""
def get_savings_plans_coverage(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetSavingsPlansCoverage", input, options)
end
@doc """
Retrieves your request parameters, Savings Plan Recommendations Summary and
Details.
"""
def get_savings_plans_purchase_recommendation(%Client{} = client, input, options \\ []) do
Request.request_post(
client,
metadata(),
"GetSavingsPlansPurchaseRecommendation",
input,
options
)
end
@doc """
Retrieves the Savings Plans utilization for your account across date ranges with
daily or monthly granularity.
  The management account in an organization has access to member accounts. You can
use `GetDimensionValues` in `SAVINGS_PLANS` to determine the possible dimension
values.
You cannot group by any dimension values for `GetSavingsPlansUtilization`.
"""
def get_savings_plans_utilization(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetSavingsPlansUtilization", input, options)
end
@doc """
Retrieves attribute data along with aggregate utilization and savings data for a
given time period.
  This doesn't support granular or grouped data (daily/monthly) in the response. You
  can't retrieve data by dates in a single response as you can with
  `GetSavingsPlanUtilization`, but you have the option to make multiple calls to
`GetSavingsPlanUtilizationDetails` by providing individual dates. You can use
`GetDimensionValues` in `SAVINGS_PLANS` to determine the possible dimension
values.
`GetSavingsPlanUtilizationDetails` internally groups data by `SavingsPlansArn`.
"""
def get_savings_plans_utilization_details(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetSavingsPlansUtilizationDetails", input, options)
end
@doc """
Queries for available tag keys and tag values for a specified period.
You can search the tag values for an arbitrary string.
"""
def get_tags(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetTags", input, options)
end
@doc """
Retrieves a forecast for how much Amazon Web Services predicts that you will use
over the forecast time period that you select, based on your past usage.
"""
def get_usage_forecast(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "GetUsageForecast", input, options)
end
@doc """
Returns the name, ARN, `NumberOfRules` and effective dates of all Cost
Categories defined in the account.
You have the option to use `EffectiveOn` to return a list of Cost Categories
that were active on a specific date. If there is no `EffectiveOn` specified,
you’ll see Cost Categories that are effective on the current date. If Cost
Category is still effective, `EffectiveEnd` is omitted in the response.
  `ListCostCategoryDefinitions` supports pagination. The request accepts a
  `MaxResults` value of up to 100.
"""
def list_cost_category_definitions(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "ListCostCategoryDefinitions", input, options)
end
@doc """
Modifies the feedback property of a given cost anomaly.
"""
def provide_anomaly_feedback(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "ProvideAnomalyFeedback", input, options)
end
@doc """
Updates an existing cost anomaly monitor.
  The changes made are applied going forward, and do not change anomalies
  detected in the past.
"""
def update_anomaly_monitor(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "UpdateAnomalyMonitor", input, options)
end
@doc """
Updates an existing cost anomaly monitor subscription.
"""
def update_anomaly_subscription(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "UpdateAnomalySubscription", input, options)
end
@doc """
Updates an existing Cost Category.
Changes made to the Cost Category rules will be used to categorize the current
month’s expenses and future expenses. This won’t change categorization for the
previous months.
"""
def update_cost_category_definition(%Client{} = client, input, options \\ []) do
Request.request_post(client, metadata(), "UpdateCostCategoryDefinition", input, options)
end
end
defmodule Timber.HTTPClients.Hackney do
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule OMG.WatcherInfo.Factory.DataHelper do
@moduledoc """
A data helper module with functions to generate useful data for testing. Unlike the factories,
the data generated in this module is not constrained to the sructures defined in the DB models.
"""
defmacro __using__(_opts) do
quote do
alias OMG.Eth.Encoding
alias OMG.Watcher.Utxo
require Utxo
# Generates a certain length of random bytes. Uniqueness not guaranteed so it's not recommended for identifiers.
def insecure_random_bytes(num_bytes) when num_bytes >= 0 and num_bytes <= 255 do
0..255 |> Enum.shuffle() |> Enum.take(num_bytes) |> :erlang.list_to_binary()
end
# creates event data specifically for the TxOutput.spend_utxos/3function
def spend_uxto_params_from_txoutput(txoutput) do
{Utxo.position(txoutput.blknum, txoutput.txindex, txoutput.oindex), txoutput.spending_tx_oindex,
txoutput.spending_txhash}
end
end
end
end
| 39.65 | 118 | 0.737074 |
03e8f8c19e8b8d202ae7d288d6afcdd2127baef6 | 2,637 | ex | Elixir | lib/timber/http_clients/hackney.ex | axelson/timber-elixir | def9de8ebbb64a6f6d5dd85f8958d8b24f8d6c31 | [
"0BSD"
] | null | null | null | lib/timber/http_clients/hackney.ex | axelson/timber-elixir | def9de8ebbb64a6f6d5dd85f8958d8b24f8d6c31 | [
"0BSD"
] | null | null | null | lib/timber/http_clients/hackney.ex | axelson/timber-elixir | def9de8ebbb64a6f6d5dd85f8958d8b24f8d6c31 | [
"0BSD"
] | null | null | null | defmodule Timber.HTTPClients.Hackney do
@moduledoc false
# An efficient HTTP client that leverages hackney, keep alive connections, and connection
# pools to communicate with the Timber API.
# ## Configuration
# ```elixir
# config :timber, :hackney_client,
# request_options: [
# connect_timeout: 5_000, # 5 seconds, timeout to connect
# recv_timeout: 20_000 # 20 seconds, timeout to receive a response
# ]
# ```
# * `:request_options` - Passed to `:hackney.request(method, url, headers, body, request_options)`.
alias Timber.HTTPClient
@behaviour HTTPClient
@default_request_options [
# 5 seconds, timeout to connect
connect_timeout: 5_000,
# 10 seconds, timeout to receive a response
recv_timeout: 10_000
]
@doc false
@impl HTTPClient
def async_request(method, url, headers, body) do
req_headers = Enum.map(headers, & &1)
req_opts =
get_request_options()
|> Keyword.merge(async: true)
:hackney.request(method, url, req_headers, body, req_opts)
end
@doc false
@impl HTTPClient
# Legacy response structure for older versions of `:hackney`
def handle_async_response(ref, {:hackney_response, ref, {:ok, status, _body}}) do
{:ok, status}
end
# New response structure for current versions of `:hackney`
def handle_async_response(ref, {:hackney_response, ref, {:status, status, _body}}) do
{:ok, status}
end
# New response structure for current versions of `:hackney`
def handle_async_response(ref, {:hackney_response, ref, {:status, status}}) do
{:ok, status}
end
# Return errors since that conforms to the spec
def handle_async_response(ref, {:hackney_response, ref, {:error, _error} = error_tuple}) do
error_tuple
end
# Pass other messages
def handle_async_response(_ref, _msg) do
:pass
end
@doc false
@impl HTTPClient
def wait_on_response(ref, timeout) do
receive do
{:hackney_response, ^ref, _response} = msg ->
handle_async_response(ref, msg)
after
timeout ->
:timeout
end
end
@doc false
@impl HTTPClient
def request(method, url, headers, body) do
req_headers = Enum.map(headers, & &1)
req_opts =
get_request_options()
|> Keyword.merge(with_body: true)
:hackney.request(method, url, req_headers, body, req_opts)
end
#
# Config
#
@spec config :: Keyword.t()
defp config, do: Application.get_env(:timber, :hackney_client, [])
@spec get_request_options() :: Keyword.t()
defp get_request_options(),
do: Keyword.get(config(), :request_options, @default_request_options)
end
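The ref-matched `receive` pattern used by `wait_on_response/2` above can be demonstrated without a real HTTP request: a simulated `:hackney_response` message is sent to `self()`, so the status tuple here is an assumption standing in for what hackney would deliver.

```elixir
# Self-contained sketch of the ref-matched receive in wait_on_response/2.
ref = make_ref()
send(self(), {:hackney_response, ref, {:status, 200, "OK"}})

result =
  receive do
    {:hackney_response, ^ref, {:status, status, _reason}} -> {:ok, status}
    {:hackney_response, ^ref, {:error, _} = error_tuple} -> error_tuple
  after
    1_000 -> :timeout
  end

{:ok, 200} = result
```

The pin (`^ref`) ensures only messages for this particular request are consumed, so responses from other concurrent requests stay in the mailbox.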
03e93074acc83abe709980e32a0ba296384ffcad | 2376 | exs | Elixir | test/shopify_api/rate_limiting/rest_tracker_test.exs | ProtoJazz/elixir-shopifyapi | Apache-2.0
defmodule ShopifyAPI.RateLimiting.RESTTrackerTest do
use ExUnit.Case
alias HTTPoison.Response
alias ShopifyAPI.AuthToken
alias ShopifyAPI.RateLimiting.RESTTracker
describe "api_hit_limit/2" do
test "returns a 2000 ms delay" do
call_limit_header = {"X-Shopify-Shop-Api-Call-Limit", "50/50"}
retry_after_header = {"Retry-After", "2.0"}
response = %Response{headers: [{"foo", "bar"}, call_limit_header, retry_after_header]}
token = %AuthToken{app_name: "test", shop_name: "shop"}
assert {0, 2000} == RESTTracker.api_hit_limit(token, response)
end
end
describe "update_api_call_limit/2" do
test "back off for 1000 ms when 0 is left" do
call_limit_header = {"X-Shopify-Shop-Api-Call-Limit", "50/50"}
response = %Response{headers: [{"foo", "bar"}, call_limit_header]}
token = %AuthToken{app_name: "test", shop_name: "shop"}
assert {0, 1000} == RESTTracker.update_api_call_limit(token, response)
end
test "does not back off if there is a limit left" do
call_limit_header = {"X-Shopify-Shop-Api-Call-Limit", "40/50"}
response = %Response{headers: [{"foo", "bar"}, call_limit_header]}
token = %AuthToken{app_name: "test", shop_name: "shop"}
assert {10, 0} == RESTTracker.update_api_call_limit(token, response)
end
end
describe "get/1" do
test "handles get without having set" do
now = ~U[2020-01-01 12:00:00.000000Z]
token = %AuthToken{app_name: "empty", shop_name: "empty"}
assert {40, 0} == RESTTracker.get(token, now, 1)
end
test "returns with a sleep after hitting limit" do
call_limit_header = {"X-Shopify-Shop-Api-Call-Limit", "50/50"}
retry_after_header = {"Retry-After", "2.0"}
response = %Response{headers: [{"foo", "bar"}, call_limit_header, retry_after_header]}
token = %AuthToken{app_name: "hit_limit", shop_name: "hit_limit"}
hit_time = ~U[2020-01-01 12:00:00.000000Z]
assert {0, 2000} == RESTTracker.api_hit_limit(token, response, hit_time)
assert {0, 2000} == RESTTracker.get(token, hit_time, 1)
wait_a_second = ~U[2020-01-01 12:00:01.100000Z]
assert {0, 900} == RESTTracker.get(token, wait_a_second, 1)
wait_two_seconds = ~U[2020-01-01 12:00:02.100000Z]
assert {0, 0} == RESTTracker.get(token, wait_two_seconds, 1)
end
end
end
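The tests above exercise headers like `{"X-Shopify-Shop-Api-Call-Limit", "40/50"}`, where the value encodes used and total calls. A minimal sketch of parsing that value (an assumption about how a tracker might derive the remaining budget, not the library's actual implementation):

```elixir
# Parse a "used/limit" call-limit header value into the remaining budget.
[used, limit] =
  "40/50"
  |> String.split("/")
  |> Enum.map(&String.to_integer/1)

remaining = limit - used
10 = remaining
```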
03e957c46288e7291ddf759187ef413ae4c364b6 | 29163 | exs | Elixir | test/hammox_test.exs | camilleryr/hammox | Apache-2.0
defmodule HammoxTest do
use ExUnit.Case, async: true
import Hammox
defmock(TestMock, for: Hammox.Test.Behaviour)
describe "protect/1" do
test "decorate all functions inside the module" do
assert %{other_foo_0: _, other_foo_1: _, foo_0: _} =
Hammox.protect(Hammox.Test.BehaviourImplementation)
end
test "decorates the function designated by the MFA tuple" do
fun = Hammox.protect({Hammox.Test.BehaviourImplementation, :foo, 0})
assert_raise(Hammox.TypeMatchError, fn -> fun.() end)
end
end
describe "protect/2" do
test "returns function protected from contract errors" do
fun = Hammox.protect({Hammox.Test.SmallImplementation, :foo, 0}, Hammox.Test.SmallBehaviour)
assert_raise(Hammox.TypeMatchError, fn -> fun.() end)
end
test "throws when typespec does not exist" do
assert_raise(Hammox.TypespecNotFoundError, fn ->
Hammox.protect(
{Hammox.Test.SmallImplementation, :nospec_fun, 0},
Hammox.Test.SmallBehaviour
)
end)
end
test "throws when behaviour module does not exist" do
assert_raise(ArgumentError, fn ->
Hammox.protect(
{Hammox.Test.SmallImplementation, :foo, 0},
NotExistModule
)
end)
end
test "throws when implementation module does not exist" do
assert_raise(ArgumentError, fn ->
Hammox.protect(
{NotExistModule, :foo, 0},
Hammox.Test.SmallBehaviour
)
end)
end
test "throws when implementation function does not exist" do
assert_raise(ArgumentError, fn ->
Hammox.protect(
{Hammox.Test.SmallImplementation, :nonexistent_fun, 0},
Hammox.Test.SmallBehaviour
)
end)
end
test "decorate multiple functions inside behaviour-implementation module" do
assert %{foo_0: _, other_foo_1: _} =
Hammox.protect(Hammox.Test.BehaviourImplementation,
foo: 0,
other_foo: 1
)
end
test "decorate all functions" do
assert %{foo_0: _, other_foo_0: _, other_foo_1: _} =
Hammox.protect(Hammox.Test.SmallImplementation, Hammox.Test.SmallBehaviour)
end
end
describe "protect/3" do
test "returns setup_all friendly map" do
assert %{foo_0: _, other_foo_1: _} =
Hammox.protect(Hammox.Test.SmallImplementation, Hammox.Test.SmallBehaviour,
foo: 0,
other_foo: 1
)
end
test "works with arity arrays" do
assert %{other_foo_0: _, other_foo_1: _} =
Hammox.protect(Hammox.Test.SmallImplementation, Hammox.Test.SmallBehaviour,
other_foo: [0, 1]
)
end
end
describe "union" do
test "pass first type" do
assert_pass(:foo_union, :a)
end
test "pass second type" do
assert_pass(:foo_union, :b)
end
test "fail" do
assert_fail(:foo_union, :c)
end
test "provides deepest stacktrace" do
assert_fail(:foo_uneven_union, %{a: "a"}, ~r/Map value/)
end
end
describe "any()" do
test "pass" do
assert_pass(:foo_any, :baz)
end
end
describe "none()" do
test "fail" do
assert_fail(:foo_none, :baz)
end
end
describe "atom()" do
test "pass" do
assert_pass(:foo_atom, :baz)
end
test "fail" do
assert_fail(:foo_atom, "baz")
end
end
describe "map()" do
test "pass" do
assert_pass(:foo_map, %{a: 1})
end
test "fail" do
assert_fail(:foo_map, :baz)
end
end
describe "pid()" do
test "pass" do
assert_pass(:foo_pid, spawn(fn -> nil end))
end
test "fail" do
assert_fail(:foo_pid, 1)
end
end
describe "port()" do
test "pass" do
{:ok, port} = :gen_tcp.listen(0, [])
assert_pass(:foo_port, port)
end
test "fail" do
assert_fail(:foo_port, :baz)
end
end
describe "reference()" do
test "pass" do
assert_pass(:foo_reference, Kernel.make_ref())
end
test "fail" do
assert_fail(:foo_reference, :baz)
end
end
describe "struct()" do
test "pass" do
assert_pass(:foo_struct, %Hammox.Test.Struct{foo: :bar})
end
test "fail" do
assert_fail(:foo_struct, %{foo: :bar})
end
end
describe "tuple()" do
test "pass empty" do
assert_pass(:foo_tuple, {})
end
test "pass 1-tuple" do
assert_pass(:foo_tuple, {:a})
end
test "fail" do
assert_fail(:foo_tuple, [])
end
end
describe "float()" do
test "pass" do
assert_pass(:foo_float, 0.0)
end
test "fail" do
assert_fail(:foo_float, 0)
end
end
describe "integer()" do
test "pass" do
assert_pass(:foo_integer, 0)
end
test "fail" do
assert_fail(:foo_integer, 0.0)
end
end
describe "neg_integer()" do
test "pass" do
assert_pass(:foo_neg_integer, -1)
end
test "fail float" do
assert_fail(:foo_neg_integer, -1.0)
end
test "fail zero" do
assert_fail(:foo_neg_integer, 0)
end
end
describe "non_neg_integer()" do
test "pass" do
assert_pass(:foo_non_neg_integer, 0)
end
test "fail float" do
assert_fail(:foo_non_neg_integer, 0.0)
end
test "fail" do
assert_fail(:foo_non_neg_integer, -1)
end
end
describe "pos_integer()" do
test "pass" do
assert_pass(:foo_pos_integer, 1)
end
test "fail float" do
assert_fail(:foo_pos_integer, 1.0)
end
test "fail zero" do
assert_fail(:foo_pos_integer, 0)
end
end
describe "list(type)" do
test "empty pass" do
assert_pass(:foo_list, [])
end
test "pass" do
assert_pass(:foo_list, [:a, :b])
end
test "fail" do
assert_fail(:foo_list, [:a, 1, :b])
end
end
describe "nonempty_list(type)" do
test "empty fail" do
assert_fail(:foo_nonempty_list, [])
end
test "pass" do
assert_pass(:foo_nonempty_list, [:a, :b])
end
test "fail" do
assert_fail(:foo_nonempty_list, [:a, 1, :b])
end
end
describe "maybe_improper_list(type1, type2)" do
test "empty list pass" do
assert_pass(:foo_maybe_improper_list, [])
end
test "proper list type pass" do
assert_pass(:foo_maybe_improper_list, [:a])
end
test "proper list type fail" do
assert_fail(:foo_maybe_improper_list, [:b])
end
test "improper list type fail" do
assert_fail(:foo_maybe_improper_list, [:a | :a])
end
test "improper list type pass" do
assert_pass(:foo_maybe_improper_list, [:a | :b])
end
end
describe "nonempty_improper_list(type1, type2)" do
test "empty list fail" do
assert_fail(:foo_nonempty_improper_list, [])
end
test "proper list fail" do
assert_fail(:foo_nonempty_improper_list, [:b])
end
test "improper list type fail" do
assert_fail(:foo_nonempty_improper_list, [:a | :a])
end
test "improper list pass" do
assert_pass(:foo_nonempty_improper_list, [:a | :b])
end
end
describe "nonempty_maybe_improper_list(type1, type2)" do
test "empty list fail" do
assert_fail(:foo_nonempty_maybe_improper_list, [])
end
test "proper list type pass" do
assert_pass(:foo_nonempty_maybe_improper_list, [:a])
end
test "proper list type fail" do
assert_fail(:foo_nonempty_maybe_improper_list, [:b])
end
test "improper list type fail" do
assert_fail(:foo_nonempty_maybe_improper_list, [:a | :a])
end
test "improper list type pass" do
assert_pass(:foo_nonempty_maybe_improper_list, [:a | :b])
end
end
describe "atom literal" do
test "pass" do
assert_pass(:foo_atom_literal, :ok)
end
test "fail" do
assert_fail(:foo_atom_literal, :other)
end
end
describe "empty bitstring literal" do
test "pass" do
assert_pass(:foo_empty_bitstring_literal, <<>>)
end
test "fail" do
assert_fail(:foo_empty_bitstring_literal, <<1>>)
end
end
describe "bitstring with size literal" do
test "pass" do
assert_pass(:foo_bitstring_size_literal, <<1::size(3)>>)
end
test "fail" do
assert_fail(:foo_bitstring_size_literal, <<1::size(4)>>)
end
end
describe "bitstring with unit literal" do
test "pass" do
assert_pass(:foo_bitstring_unit_literal, <<1::9>>)
end
test "fail" do
assert_fail(:foo_bitstring_unit_literal, <<1::7>>)
end
end
describe "bitstring with size and unit literal" do
test "pass" do
assert_pass(:foo_bitstring_size_unit_literal, <<1::8>>)
end
test "fail" do
assert_fail(:foo_bitstring_size_unit_literal, <<1::7>>)
end
end
describe "nullary function literal" do
test "pass" do
assert_pass(:foo_nullary_function_literal, fn -> nil end)
end
test "fail" do
assert_fail(:foo_nullary_function_literal, fn _ -> nil end)
end
end
describe "binary function literal" do
test "pass" do
assert_pass(:foo_binary_function_literal, fn _, _ -> nil end)
end
test "fail" do
assert_fail(:foo_binary_function_literal, fn _, _, _ -> nil end)
end
end
describe "any arity function literal" do
test "pass zero" do
assert_pass(:foo_any_arity_function_literal, fn -> nil end)
end
test "pass two" do
assert_pass(:foo_any_arity_function_literal, fn _, _ -> nil end)
end
test "fail non function" do
assert_fail(:foo_any_arity_function_literal, :fun)
end
end
describe "integer literal" do
test "pass" do
assert_pass(:foo_integer_literal, 1)
end
test "fail" do
assert_fail(:foo_integer_literal, 2)
end
end
describe "negative integer literal" do
test "pass" do
assert_pass(:foo_neg_integer_literal, -1)
end
test "fail" do
assert_fail(:foo_neg_integer_literal, 1)
end
end
describe "integer range literal" do
test "pass" do
assert_pass(:foo_integer_range_literal, 5)
end
test "fail" do
assert_fail(:foo_integer_range_literal, 11)
end
end
describe "list literal" do
test "empty pass" do
assert_pass(:foo_list_literal, [])
end
test "pass" do
assert_pass(:foo_list_literal, [:a, :b])
end
test "fail" do
assert_fail(:foo_list_literal, [:a, 1, :b])
end
end
describe "empty list literal" do
test "pass" do
assert_pass(:foo_empty_list_literal, [])
end
test "fail" do
assert_fail(:foo_empty_list_literal, [:a])
end
end
describe "nonempty any list literal" do
test "empty fail" do
assert_fail(:foo_nonempty_any_list_literal, [])
end
test "pass" do
assert_pass(:foo_nonempty_any_list_literal, [:a, :b])
end
test "pass with mixed element types" do
assert_pass(:foo_nonempty_any_list_literal, [:a, 1, :b])
end
end
describe "nonempty list literal" do
test "empty fail" do
assert_fail(:foo_nonempty_list_literal, [])
end
test "pass" do
assert_pass(:foo_nonempty_list_literal, [:a, :b])
end
test "fail" do
assert_fail(:foo_nonempty_list_literal, [:a, 1, :b])
end
end
describe "keyword list literal" do
test "empty pass" do
assert_pass(:foo_keyword_list_literal, [])
end
test "missing key pass" do
assert_pass(:foo_keyword_list_literal, key1: :bar)
end
test "different order pass" do
assert_pass(:foo_keyword_list_literal, key2: 2, key1: :bar)
end
test "unknown key fail" do
assert_fail(:foo_keyword_list_literal, key3: "bar")
end
test "wrong type fail" do
assert_fail(:foo_keyword_list_literal, key1: "bar")
end
end
describe "empty map literal" do
test "pass" do
assert_pass(:foo_empty_map_literal, %{})
end
test "fail" do
assert_fail(:foo_empty_map_literal, %{a: 1})
end
end
describe "map with required atom key literal" do
test "empty fail" do
assert_fail(:foo_map_required_atom_key_literal, %{})
end
test "pass" do
assert_pass(:foo_map_required_atom_key_literal, %{key: :bar})
end
test "unknown key fail" do
assert_fail(:foo_map_required_atom_key_literal, %{key: :bar, key2: :baz})
end
end
describe "map with required key literal" do
test "empty fail" do
assert_fail(:foo_map_required_key_literal, %{})
end
test "pass" do
assert_pass(:foo_map_required_key_literal, %{key: :bar})
end
test "pass multiple keys matching type" do
assert_pass(:foo_map_required_key_literal, %{key1: :bar, key2: :baz})
end
test "unknown key type fail" do
assert_fail(:foo_map_required_key_literal, %{key1: :bar, key2: 1})
end
end
describe "map with optional key literal" do
test "empty pass" do
assert_pass(:foo_map_optional_key_literal, %{})
end
test "pass" do
assert_pass(:foo_map_optional_key_literal, %{key: :bar})
end
test "unknown key type fail" do
assert_fail(:foo_map_optional_key_literal, %{key1: :bar, key2: 1})
end
end
describe "map with required and optional keys literal" do
test "empty fail" do
assert_fail(:foo_map_required_and_optional_key_literal, %{})
end
test "pass without optional" do
assert_pass(:foo_map_required_and_optional_key_literal, %{key: :bar})
end
test "pass multiple keys matching required type" do
assert_pass(:foo_map_required_and_optional_key_literal, %{key1: :bar, key2: :baz})
end
test "pass optional key type" do
assert_pass(:foo_map_required_and_optional_key_literal, %{:key1 => :bar, 1 => 2})
end
test "fail unknown type" do
assert_fail(:foo_map_required_and_optional_key_literal, %{key1: :bar, key2: []})
end
end
describe "map with overlapping required key types" do
test "empty fail" do
assert_fail(:foo_map_overlapping_required_types_literal, %{})
end
test "one key fulfilling both pass" do
assert_pass(:foo_map_overlapping_required_types_literal, %{foo: :bar})
end
end
describe "map with __struct__ key" do
test "empty fail" do
assert_fail(:foo_map_struct_key, %{})
end
test "pass" do
assert_pass(:foo_map_struct_key, %{
__struct__: :foo,
key: 42
})
end
test "fail with missing __struct__" do
assert_fail(:foo_map_struct_key, %{key: 42})
end
test "fail with incorrect __struct__" do
assert_fail(:foo_map_struct_key, %{
__struct__: "wrong type",
key: 42
})
end
test "fail with incorrect fields" do
assert_fail(:foo_map_struct_key, %{
__struct__: :foo,
key: "not a number"
})
end
test "fail with incorrect __struct__ and fields" do
assert_fail(:foo_map_struct_key, %{
__struct__: "wrong type",
key: "not a number"
})
end
end
describe "struct literal" do
test "fail map" do
assert_fail(:foo_struct_literal, %{foo: :bar})
end
test "fail different struct" do
assert_fail(:foo_struct_literal, %Hammox.Test.OtherStruct{})
end
test "pass default struct" do
assert_pass(:foo_struct_literal, %Hammox.Test.Struct{})
end
test "pass struct with fields" do
assert_pass(:foo_struct_literal, %Hammox.Test.Struct{foo: 1})
end
end
describe "struct with fields literal" do
test "fail map" do
assert_fail(:foo_struct_fields_literal, %{foo: 1})
end
test "fail default struct" do
assert_fail(:foo_struct_fields_literal, %Hammox.Test.Struct{})
end
test "pass struct with correct fields" do
assert_pass(:foo_struct_fields_literal, %Hammox.Test.Struct{foo: 1})
end
test "fail struct with incorrect fields" do
assert_fail(:foo_struct_fields_literal, %Hammox.Test.Struct{foo: "bar"})
end
end
describe "empty tuple" do
test "pass" do
assert_pass(:foo_empty_tuple_literal, {})
end
test "fail" do
assert_fail(:foo_empty_tuple_literal, {:foo})
end
end
describe "2-tuple" do
test "fail different size" do
assert_fail(:foo_two_tuple_literal, {:foo})
end
test "fail wrong type" do
assert_fail(:foo_two_tuple_literal, {:error, :reason})
end
test "pass" do
assert_pass(:foo_two_tuple_literal, {:ok, :details})
end
end
describe "term()" do
test "pass" do
assert_pass(:foo_term, :any)
end
end
describe "arity()" do
test "pass" do
assert_pass(:foo_arity, 100)
end
test "fail >255" do
assert_fail(:foo_arity, 300)
end
test "fail <0" do
assert_fail(:foo_arity, -1)
end
end
describe "as_boolean(type)" do
test "pass type" do
assert_pass(:foo_as_boolean, :ok)
end
test "fail wrong type" do
assert_fail(:foo_as_boolean, :error)
end
end
describe "binary()" do
test "pass string" do
assert_pass(:foo_binary, "abc")
end
test "fail bitstring" do
assert_fail(:foo_binary, <<1::7>>)
end
end
describe "bitstring()" do
test "pass string" do
assert_pass(:foo_bitstring, "abc")
end
test "pass bitstring" do
assert_pass(:foo_bitstring, <<1::7>>)
end
test "fail other" do
assert_fail(:foo_bitstring, 1)
end
end
describe "boolean()" do
test "pass true" do
assert_pass(:foo_boolean, true)
end
test "pass false" do
assert_pass(:foo_boolean, false)
end
test "fail nil" do
assert_fail(:foo_boolean, nil)
end
end
describe "byte()" do
test "pass" do
assert_pass(:foo_byte, 100)
end
test "fail >255" do
assert_fail(:foo_byte, 300)
end
test "fail <0" do
assert_fail(:foo_byte, -1)
end
end
describe "char()" do
test "pass" do
assert_pass(:foo_char, 0x100000)
end
test "fail >0x10FFFF" do
assert_fail(:foo_char, 0x200000)
end
test "fail <0" do
assert_fail(:foo_char, -1)
end
end
describe "charlist()" do
test "pass" do
assert_pass(:foo_charlist, [65])
end
test "fail" do
assert_fail(:foo_charlist, "A")
end
end
describe "nonempty_charlist()" do
test "pass" do
assert_pass(:foo_nonempty_charlist, [65])
end
test "fail empty" do
assert_fail(:foo_nonempty_charlist, [])
end
test "fail string" do
assert_fail(:foo_nonempty_charlist, "A")
end
end
describe "fun()" do
test "pass" do
assert_pass(:foo_fun, fn -> nil end)
end
test "fail" do
assert_fail(:foo_fun, :fun)
end
end
describe "function()" do
test "pass" do
assert_pass(:foo_function, fn -> nil end)
end
test "fail" do
assert_fail(:foo_function, :fun)
end
end
describe "identifier()" do
test "pass pid" do
assert_pass(:foo_identifier, spawn(fn -> nil end))
end
test "pass port" do
{:ok, port} = :gen_tcp.listen(0, [])
assert_pass(:foo_identifier, port)
end
test "pass reference" do
assert_pass(:foo_identifier, Kernel.make_ref())
end
test "fail integer" do
assert_fail(:foo_identifier, 1)
end
end
describe "iodata()" do
test "pass iolist" do
assert_pass(:foo_iodata, [[123 | "a"] | "ab"])
end
test "pass binary" do
assert_pass(:foo_iodata, "abc")
end
test "fail" do
assert_fail(:foo_iodata, [:a])
end
end
describe "iolist()" do
test "pass" do
assert_pass(:foo_iolist, [[123 | "a"] | "ab"])
end
test "fail" do
assert_fail(:foo_iolist, [:a])
end
end
describe "keyword()" do
test "pass" do
assert_pass(:foo_keyword, a: 1)
end
test "fail" do
assert_fail(:foo_keyword, [{1, 2}])
end
end
describe "keyword(type)" do
test "pass" do
assert_pass(:foo_keyword_type, a: 1)
end
test "fail" do
assert_fail(:foo_keyword_type, a: :b)
end
end
describe "list()" do
test "pass empty" do
assert_pass(:foo_list_any, [])
end
test "pass nonempty" do
assert_pass(:foo_list_any, [1])
end
test "fail" do
assert_fail(:foo_list_any, "")
end
end
describe "nonempty_list()" do
test "pass" do
assert_pass(:foo_nonempty_list_any, [1])
end
test "fail empty" do
assert_fail(:foo_nonempty_list_any, [])
end
end
describe "maybe_improper_list()" do
test "empty list pass" do
assert_pass(:foo_maybe_improper_list_any, [])
end
test "proper list type pass" do
assert_pass(:foo_maybe_improper_list_any, [:a])
end
test "improper list type pass" do
assert_pass(:foo_maybe_improper_list_any, [:a | :b])
end
test "not list fail" do
assert_fail(:foo_maybe_improper_list_any, "b")
end
end
describe "nonempty_maybe_improper_list()" do
test "empty list fail" do
assert_fail(:foo_nonempty_maybe_improper_list_any, [])
end
test "proper list pass" do
assert_pass(:foo_nonempty_maybe_improper_list_any, [:a])
end
test "improper list type pass" do
assert_pass(:foo_nonempty_maybe_improper_list_any, [:a | :b])
end
end
describe "mfa()" do
test "pass" do
assert_pass(:foo_mfa, {Enum, :map, 2})
end
test "fail" do
assert_fail(:foo_mfa, {Enum, :map, -1})
end
end
describe "module()" do
test "pass" do
assert_pass(:foo_module, Enum)
end
test "fail" do
assert_fail(:foo_module, "Enum")
end
end
describe "no_return()" do
test "fail" do
assert_fail(:foo_no_return, :foo)
end
end
describe "node()" do
test "pass" do
assert_pass(:foo_node, :node)
end
test "fail" do
assert_fail(:foo_node, "node")
end
end
describe "number()" do
test "pass integer" do
assert_pass(:foo_number, 1)
end
test "pass float" do
assert_pass(:foo_number, 1.0)
end
test "fail" do
assert_fail(:foo_number, "baz")
end
end
describe "timeout()" do
test "pass :infinity" do
assert_pass(:foo_timeout, :infinity)
end
test "fail other atoms" do
assert_fail(:foo_timeout, :foo)
end
test "pass non negative integer" do
assert_pass(:foo_timeout, 0)
end
test "fail negative integer" do
assert_fail(:foo_timeout, -1)
end
test "fail float" do
assert_fail(:foo_timeout, 1.0)
end
end
describe "remote type" do
test "fail" do
assert_fail(:foo_remote_type, :foo)
end
test "pass" do
assert_pass(:foo_remote_type, [1])
end
end
describe "remote type with param" do
test "pass" do
assert_pass(:foo_remote_type_with_arg, [[1]])
end
test "fail" do
assert_fail(:foo_remote_type_with_arg, [1])
end
end
describe "nonexistent remote module" do
test "fail" do
assert_fail(:foo_nonexistent_remote_module, :foo)
end
end
describe "nonexistent remote type" do
test "fail" do
assert_fail(:foo_nonexistent_remote_type, :foo)
end
end
describe "protocol remote type" do
test "pass" do
assert_pass(:foo_protocol_remote_type, [])
end
test "fail" do
assert_fail(:foo_protocol_remote_type, :a)
end
end
describe "user type" do
test "pass" do
assert_pass(:foo_user_type, [[:foo_type]])
end
test "fail" do
assert_fail(:foo_user_type, [[:other_type]])
end
end
describe "user type defined in behaviour" do
test "pass" do
assert_pass(:foo_behaviour_user_type, :foo_type)
end
test "fail" do
assert_fail(:foo_behaviour_user_type, :other_type)
end
end
describe "user type as annotated param" do
test "pass" do
TestMock |> expect(:foo_ann_type_user_type, fn _ -> :ok end)
assert :ok == TestMock.foo_ann_type_user_type(:foo_type)
end
test "fail" do
TestMock |> expect(:foo_ann_type_user_type, fn _ -> :ok end)
assert_raise(
Hammox.TypeMatchError,
fn -> TestMock.foo_ann_type_user_type(:other_type) end
)
end
end
describe "annotated return type" do
test "pass" do
assert_pass(:foo_annotated_return_type, :return_type)
end
test "fail" do
assert_fail(:foo_annotated_return_type, :other_type)
end
end
describe "annotated type in a container" do
test "pass" do
assert_pass(:foo_annotated_type_in_container, {:correct_type})
end
test "fail" do
assert_fail(:foo_annotated_type_in_container, {:incorrect_type})
end
end
describe "local type as remote type param" do
test "pass" do
assert_pass(:foo_remote_param_type, {:ok, :local})
end
test "fail" do
assert_fail(:foo_remote_param_type, {:ok, :other})
end
end
describe "arg type checking" do
test "no args pass" do
TestMock |> expect(:foo_no_arg, fn -> :ok end)
assert :ok == TestMock.foo_no_arg()
end
test "unnamed arg pass" do
TestMock |> expect(:foo_unnamed_arg, fn _arg -> :ok end)
assert :ok == TestMock.foo_unnamed_arg(:bar)
end
test "unnamed arg fail" do
TestMock |> expect(:foo_unnamed_arg, fn _arg -> :ok end)
assert_raise(
Hammox.TypeMatchError,
~r/1st argument value "bar" does not match 1st parameter's type atom()./,
fn -> TestMock.foo_unnamed_arg("bar") end
)
end
test "named arg pass" do
TestMock |> expect(:foo_named_arg, fn _arg -> :ok end)
assert :ok == TestMock.foo_named_arg(:bar)
end
test "named arg fail" do
TestMock |> expect(:foo_named_arg, fn _arg -> :ok end)
assert_raise(
Hammox.TypeMatchError,
~r/1st argument value "bar" does not match 1st parameter's type atom\(\) \("arg1"\)/,
fn -> TestMock.foo_named_arg("bar") end
)
end
test "named and unnamed arg pass" do
TestMock |> expect(:foo_named_and_unnamed_arg, fn _arg1, _arg2 -> :ok end)
assert :ok == TestMock.foo_named_and_unnamed_arg(:bar, 1)
end
test "named and unnamed arg fail" do
TestMock |> expect(:foo_named_and_unnamed_arg, fn _arg1, _arg2 -> :ok end)
assert_raise(
Hammox.TypeMatchError,
~r/2nd argument value "baz" does not match 2nd parameter's type number\(\) \("arg2"\)/,
fn -> TestMock.foo_named_and_unnamed_arg(:bar, "baz") end
)
end
test "remote type arg pass" do
TestMock |> expect(:foo_remote_type_arg, fn _ -> :ok end)
assert :ok == TestMock.foo_remote_type_arg([])
end
end
describe "multiple typespec for one function" do
test "passes first typespec" do
TestMock |> expect(:foo_multiple_typespec, fn _ -> :a end)
assert :a == TestMock.foo_multiple_typespec(:a)
end
test "passes second typespec" do
TestMock |> expect(:foo_multiple_typespec, fn _ -> :b end)
assert :b == TestMock.foo_multiple_typespec(:b)
end
test "fails mix of typespecs" do
TestMock |> expect(:foo_multiple_typespec, fn _ -> :b end)
assert_raise Hammox.TypeMatchError, fn -> TestMock.foo_multiple_typespec(:a) end
end
end
describe "nested parametrized types" do
test "pass" do
assert_pass(:foo_nested_param_types, :param)
end
test "fail" do
assert_fail(:foo_nested_param_types, :wrong_param)
end
end
describe "multiline parametrized types" do
test "works when param usage is on a line other than declaration line" do
assert_pass(:foo_multiline_param_type, %{value: :arg})
end
end
describe "private types" do
test "pass" do
assert_pass(:foo_private_type, :private_value)
end
test "fail" do
assert_fail(:foo_private_type, :other)
end
end
describe "opaque types" do
test "pass" do
assert_pass(:foo_opaque_type, :opaque_value)
end
test "fail" do
assert_fail(:foo_opaque_type, :other)
end
end
describe "expect/4" do
test "protects mocks" do
TestMock |> expect(:foo_none, fn -> :baz end)
assert_raise(Hammox.TypeMatchError, fn -> TestMock.foo_none() end)
end
end
describe "stub/3" do
test "protects stubs" do
TestMock |> stub(:foo_none, fn -> :baz end)
assert_raise(Hammox.TypeMatchError, fn -> TestMock.foo_none() end)
end
end
defp assert_pass(function_name, value) do
TestMock |> expect(function_name, fn -> value end)
assert value == apply(TestMock, function_name, [])
end
defp assert_fail(function_name, value) do
TestMock |> expect(function_name, fn -> value end)
assert_raise(Hammox.TypeMatchError, fn -> apply(TestMock, function_name, []) end)
end
defp assert_fail(function_name, value, expected_message) do
TestMock |> expect(function_name, fn -> value end)
assert_raise(Hammox.TypeMatchError, expected_message, fn ->
apply(TestMock, function_name, [])
end)
end
end
03e95df45edf02621b7a69fa7e4fae9dcdd7cd23 | 5848 | ex | Elixir | apps/admin_app/lib/admin_app_web/router.ex | saurabharch/avia | MIT
defmodule AdminAppWeb.Router do
use AdminAppWeb, :router
use Plug.ErrorHandler
use Sentry.Plug
pipeline :browser do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(:fetch_flash)
plug(:protect_from_forgery)
plug(:put_secure_browser_headers)
end
# This pipeline is just to avoid CSRF token.
# TODO: This needs to be removed when the token issue gets fixed in custom form
pipeline :avoid_csrf do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(:fetch_flash)
plug(:put_secure_browser_headers)
end
pipeline :api do
plug(:accepts, ["json"])
end
pipeline :authentication do
plug(AdminAppWeb.AuthenticationPipe)
end
scope "/", AdminAppWeb do
# Use the default browser stack
pipe_through([:browser, :authentication])
get("/", PageController, :index)
get("/orders/:category", OrderController, :index)
get("/orders", OrderController, :index)
get("/orders/:number/detail", OrderController, :show)
put("/orders/:id/packages/", OrderController, :update_package, as: :order_package)
put("/orders/:id/state", OrderController, :update_state, as: :order_state)
put("/orders/:id/cod-payment", OrderController, :cod_payment_update, as: :order_cod_update)
resources "/orders", OrderController, only: ~w[show create]a, param: "number" do
get("/cart", OrderController, :get, as: :cart)
post("/cart", OrderController, :remove_item, as: :cart)
post("/cart/edit", OrderController, :edit, as: :cart)
put("/cart/update", OrderController, :update_line_item, as: :cart)
put("/cart", OrderController, :add, as: :cart)
get("/address/search", OrderController, :address_search, as: :cart)
put("/address/search", OrderController, :address_attach, as: :cart)
get("/address/add", OrderController, :address_add_index, as: :cart)
post("/address/add", OrderController, :address_add, as: :cart)
end
resources("/tax_categories", TaxCategoryController, only: [:index, :new, :create])
resources("/stock_locations", StockLocationController)
resources("/option_types", OptionTypeController)
resources("/properties", PropertyController, except: [:show])
resources("/registrations", RegistrationController, only: [:new, :create])
resources("/session", SessionController, only: [:delete])
resources("/users", UserController)
resources("/roles", RoleController)
resources("/permissions", PermissionController)
resources("/variation_themes", VariationThemeController, except: [:show])
resources("/prototypes", PrototypeController, except: [:show])
resources("/products", ProductController)
resources("/product_brands", ProductBrandController)
resources("/payment_methods", PaymentMethodController)
resources("/zones", ZoneController, only: [:index, :new, :create, :edit, :update, :delete])
resources("/general_settings", GeneralSettingsController)
post("/payment-provider-inputs", PaymentMethodController, :payment_preferences)
get("/product/category", ProductController, :select_category)
post("/product-images/:product_id", ProductController, :add_images)
delete("/product-images/", ProductController, :delete_image)
get("/taxonomy", TaxonomyController, :show_default_taxonomy)
resources("/taxonomy", TaxonomyController, only: [:create])
get("/products/:product_id/property", ProductController, :index_property)
get("/products/:product_id/property/new", ProductController, :new_property)
get("/products/:product_id/property/:property_id/edit", ProductController, :edit_property)
post("/products/:product_id/property/create", ProductController, :create_property)
get("/dashboard", DashboardController, :index)
post(
"/products/:product_id/property/:property_id/update",
ProductController,
:update_property
)
delete(
"/products/:product_id/property/:property_id/delete",
ProductController,
:delete_property
)
get("/shipping-policy/new", ShippingPolicyController, :new)
get("/shipping-policy/:id/edit", ShippingPolicyController, :edit)
put("/shipping-policy/:id", ShippingPolicyController, :update)
get("/product/import/etsy", ProductImportController, :import_etsy)
get("/product/import/etsy/callback", ProductImportController, :oauth_callback)
get("/product/import/etsy/progress", ProductImportController, :import_progress)
end
scope "/", AdminAppWeb do
pipe_through(:avoid_csrf)
post("/products/variants/new", ProductController, :new_variant)
post("/product/stock", ProductController, :add_stock)
end
scope "/", AdminAppWeb do
pipe_through(:browser)
get("/orders/:number/show-invoice", OrderController, :show_invoice)
get("/orders/:number/show-packing-slip", OrderController, :show_packing_slip)
get("/orders/:number/download-packing-slip", OrderController, :download_packing_slip_pdf)
get("/orders/:number/download-invoice", OrderController, :download_invoice_pdf)
resources("/session", SessionController, only: [:new, :create, :edit, :update])
get("/password_reset", SessionController, :password_reset)
get("/password_recovery", SessionController, :verify)
post("/check_email", SessionController, :check_email)
end
# Other scopes may use custom stacks.
scope "/api", AdminAppWeb do
pipe_through(:api)
resources("/stock_locations", StockLocationController)
end
scope "/api", AdminAppWeb.TemplateApi do
pipe_through(:api)
resources("/option_types", OptionTypeController)
get("/categories/:taxon_id", TaxonomyController, :index)
get("/taxon/:taxon_id", TaxonomyController, :taxon_edit)
put("/taxonomy/update", TaxonomyController, :update_taxon)
post("/product_option_values/:id", OptionTypeController, :update)
end
end
| 41.183099 | 95 | 0.717339 |
03e96a5a4e552521885e93d070bd1e06e94c0fe3 | 59 | ex | Elixir | lib/op_api_web/views/layout_view.ex | operate-bsv/api.operatebsv.org | c31a5d2a050cd88d99a94ed2a0e3bcc987114220 | [
"MIT"
] | 3 | 2019-11-06T23:40:55.000Z | 2020-06-20T22:33:25.000Z | lib/op_api_web/views/layout_view.ex | operate-bsv/api.operatebsv.org | c31a5d2a050cd88d99a94ed2a0e3bcc987114220 | [
"MIT"
] | null | null | null | lib/op_api_web/views/layout_view.ex | operate-bsv/api.operatebsv.org | c31a5d2a050cd88d99a94ed2a0e3bcc987114220 | [
"MIT"
] | null | null | null | defmodule OpApiWeb.LayoutView do
use OpApiWeb, :view
end
| 14.75 | 32 | 0.79661 |
03e96d8900fc526cd3f679f22056c11e9e1f72fb | 570 | ex | Elixir | test/support/test/retry.ex | mathieuprog/absinthe_graphql_ws | d2a6be578459f7bb0dd81e64bb7e33cf6c580a01 | [
"MIT"
] | 26 | 2021-06-07T11:00:25.000Z | 2022-03-13T00:17:52.000Z | test/support/test/retry.ex | mathieuprog/absinthe_graphql_ws | d2a6be578459f7bb0dd81e64bb7e33cf6c580a01 | [
"MIT"
] | 9 | 2021-06-07T14:59:41.000Z | 2022-03-27T13:08:50.000Z | test/support/test/retry.ex | mathieuprog/absinthe_graphql_ws | d2a6be578459f7bb0dd81e64bb7e33cf6c580a01 | [
"MIT"
] | 7 | 2021-06-07T11:04:47.000Z | 2022-03-19T05:29:20.000Z | defmodule Test.Retry do
@moduledoc false
def retry(fun) when is_function(fun),
do: retry_for(5000, fun)
def retry_for(timeout, fun) when is_function(fun) and is_number(timeout) do
DateTime.utc_now()
|> DateTime.add(timeout, :millisecond)
|> retry_until(fun)
end
def retry_until(%DateTime{} = time, fun) when is_function(fun) do
fun.()
rescue
e ->
if DateTime.compare(time, DateTime.utc_now()) == :gt do
:timer.sleep(100)
retry_until(time, fun)
else
reraise e, __STACKTRACE__
end
end
end
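# Example usage in a test (the assertion is hypothetical, shown for
# illustration only):
#
#     import Test.Retry
#
#     test "eventually consistent read" do
#       retry(fn -> assert fetch_value() == :ready end)
#     end
#
# `retry/1` re-runs the function every 100ms until it stops raising or the
# 5000ms deadline passes, after which the last error is re-raised.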
| 22.8 | 77 | 0.645614 |
03e980d1f72abc9823a27de9db99add39a68ed36 | 14,508 | exs | Elixir | test/credo/check/warning/unused_list_operation_test.exs | hrzndhrn/credo | 71a7b24a5ca8e7a48416e0cdfb42cf8a0fef9593 | [
"MIT"
] | 4,590 | 2015-09-28T06:01:43.000Z | 2022-03-29T08:48:57.000Z | test/credo/check/warning/unused_list_operation_test.exs | hrzndhrn/credo | 71a7b24a5ca8e7a48416e0cdfb42cf8a0fef9593 | [
"MIT"
] | 890 | 2015-11-16T21:07:07.000Z | 2022-03-29T08:52:07.000Z | test/credo/check/warning/unused_list_operation_test.exs | hrzndhrn/credo | 71a7b24a5ca8e7a48416e0cdfb42cf8a0fef9593 | [
"MIT"
] | 479 | 2015-11-17T19:42:40.000Z | 2022-03-29T00:09:21.000Z | defmodule Credo.Check.Warning.UnusedListOperationTest do
use Credo.Test.Case
@described_check Credo.Check.Warning.UnusedListOperation
test "it should NOT report expected code" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
List.to_tuple(parameter1) + parameter2
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report when result is piped" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
List.to_tuple(parameter1)
|> some_where
parameter1
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when end of pipe AND return value" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
parameter1 + parameter2
|> List.to_tuple(parameter1)
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when inside of pipe" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
parameter1 + parameter2
|> List.to_tuple(parameter1)
|> some_func_who_knows_what_it_does
:ok
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when inside an assignment" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
offset = List.wrap(line)
parameter1 + parameter2 + offset
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when inside a condition" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
if List.wrap(x1) > List.wrap(x2) do
cond do
List.wrap(x3) == "" -> IO.puts("1")
List.wrap(x) == 15 -> IO.puts("2")
List.delete_at(x3, 1) == "b" -> IO.puts("2")
end
else
case List.wrap(x3) do
0 -> true
1 -> false
_ -> something
end
end
unless List.wrap(x4) == "" do
IO.puts "empty"
end
parameter1 + parameter2 + offset
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when inside a quote" do
"""
defmodule CredoSampleModule do
defp category_body(nil) do
quote do
__MODULE__
|> Module.split
|> List.delete_at(2)
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when inside of assignment" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
pos =
pos_string(issue.line_no, issue.column)
[
Output.issue_color(issue), "┃ ",
Output.check_tag(check), " ", priority |> Output.priority_arrow,
:normal, :white, " ", message,
]
|> IO.ANSI.format
|> IO.puts
if issue.column do
offset = List.wrap(line)
[
List.to_tuple(x, " "), :faint, List.to_tuple(w, ","),
]
|> IO.puts
end
[Output.issue_color(issue), :faint, "┃ "]
|> IO.ANSI.format
|> IO.puts
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when call is buried in else block but is the last call" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
[:this_actually_might_return, List.to_tuple(w, ","), :ok] # THIS is the last call!
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when call is buried in else block and is not the last call, but the result is assigned to a variable" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
result =
if issue.column do
IO.puts "."
else
[:this_goes_nowhere, List.to_tuple(w, ",")]
end
IO.puts "8"
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when buried in :if, :when and :fn 2" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
case check do
true -> false
_ ->
List.foldr(arr, [], fn(w) ->
[:this_might_return, List.to_tuple(w, ",")]
end)
end
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when :for and :case" do
"""
defmodule CredoSampleModule do
defp convert_parsers(parsers) do
for parser <- parsers do
case Atom.to_string(parser) do
"Elixir." <> _ -> parser
reference -> List.delete_at(reference)
end
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when part of a function call" do
"""
defmodule CredoSampleModule do
defp convert_parsers(parsers) do
for parser <- parsers do
case Atom.to_string(parser) do
"Elixir." <> _ -> parser
reference -> Module.concat(Plug.Parsers, List.delete_at(reference))
end
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when :for and :case 2" do
"""
defmodule CredoSampleModule do
defp convert_parsers(parsers) do
for segment <- List.keysort(bin, 1), segment != "", do: segment
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when in :after block" do
"""
defp my_function(fun, opts) do
try do
:fprof.analyse(
dest: analyse_dest,
totals: true,
details: Keyword.get(opts, :details, false),
callers: Keyword.get(opts, :callers, false),
sort: sorting
)
else
:ok ->
{_in, analysis_output} = StringIO.contents(analyse_dest)
List.wrap(analysis_output)
after
StringIO.close(analyse_dest)
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when in function call" do
"""
def my_function(url) when is_binary(url) do
if info.userinfo do
destructure [username, password], List.to_tuple(info.userinfo, ":")
end
List.foldl(opts, [], fn {_k, v} -> is_nil(v) end)
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when in function call 2" do
"""
defp print_process(pid_atom, count, own) do
IO.puts([?", List.to_tuple(own, "-")])
IO.write format_item(Path.to_tuple(path, item), List.zip(item))
print_row(["s", "B", "s", ".3f", "s"], [count, "", own, ""])
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when in list that is returned" do
"""
defp indent_line(str, indentation, with \\\\ " ") do
[List.to_tuple(with, indentation), str]
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when :fn is in the surrounding function calls arguments" do
"""
defmodule A do
def a do
Enum.each(Enum.with_index([]), [], fn a -> a end)
1
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
test "it should NOT report a violation when buried in :if, :when and :fn" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
case check do
true -> false
_ ->
list =
List.foldr(arr, [], fn(w) ->
[:this_goes_nowhere, List.to_tuple(w, ",")]
end)
end
end
IO.puts "x"
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> refute_issues()
end
##############################################################################
##############################################################################
test "it should report a violation" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
x = parameter1 + parameter2
List.delete_at(parameter1, x)
parameter1
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when end of pipe" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
parameter1 + parameter2
|> List.delete_at(parameter1)
parameter1
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when buried in :if" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
[
:this_goes_nowhere,
List.to_tuple(w, ",") # THIS is not the last_call!
]
IO.puts "."
else
IO.puts "x"
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when buried in :else" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
List.wrap(filename)
IO.puts "x"
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when buried in :if, :when and :fn 2" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
case check do
true -> false
_ ->
List.foldr(arr, [], fn(w) ->
[:this_goes_nowhere, x]
end)
end
end
IO.puts "x"
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when call is buried in else block but is not the last call" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
[:this_goes_nowhere, List.to_tuple(w, ",")] # THIS is not the last_call!
end
IO.puts
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue()
end
test "it should report a violation when call is buried in else block but is not the last call 2" do
"""
defmodule CredoSampleModule do
defp print_issue(%Issue{check: check, message: message, filename: filename, priority: priority} = issue, source_file) do
if issue.column do
IO.puts "."
else
[:this_goes_nowhere, List.to_tuple(w, ",")] # THIS is not the last_call!
IO.puts " "
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue(fn issue ->
assert "List.to_tuple" == issue.trigger
end)
end
test "it should report several violations" do
"""
defmodule CredoSampleModule do
def some_function(parameter1, parameter2) do
List.foldl(parameter1, [], &is_nil/1)
parameter1
end
def some_function2(parameter1, parameter2) do
List.foldr(parameter1, [], parameter2)
parameter1
end
def some_function3(parameter1, parameter2) do
List.foldr(parameter1, [], parameter2)
parameter1
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issues(fn issues ->
assert 3 == Enum.count(issues)
end)
end
test "it should report a violation when used incorrectly, even inside a :for" do
"""
defmodule CredoSampleModule do
defp something(bin) do
for segment <- List.keysort(segment1, 1), segment != "" do
List.flatten(segment, [:added_to_the_tail])
segment
end
end
end
"""
|> to_source_file
|> run_check(@described_check)
|> assert_issue(fn issue ->
assert "List.flatten" == issue.trigger
end)
end
end
| 26.046679 | 145 | 0.579404 |
03e9a108995e888edc9d4cbab1d69b6070582891 | 24,350 | exs | Elixir | test/ecto/query_test.exs | Anber/ecto | 2b903c8c6acb924f87746fe4d40cb4b42a7f0491 | [
"Apache-2.0"
] | null | null | null | test/ecto/query_test.exs | Anber/ecto | 2b903c8c6acb924f87746fe4d40cb4b42a7f0491 | [
"Apache-2.0"
] | null | null | null | test/ecto/query_test.exs | Anber/ecto | 2b903c8c6acb924f87746fe4d40cb4b42a7f0491 | [
"Apache-2.0"
] | null | null | null | Code.require_file "../support/eval_helpers.exs", __DIR__
defmodule Ecto.QueryTest do
use ExUnit.Case, async: true
import Support.EvalHelpers
import Ecto.Query
alias Ecto.Query
defmacrop macro_equal(column, value) do
quote do
unquote(column) == unquote(value)
end
end
defmacro macro_map(key) do
quote do
%{"1" => unquote(key),
"2" => unquote(key)}
end
end
describe "query building" do
test "allows macros" do
test_data = "test"
query = from(p in "posts") |> where([q], macro_equal(q.title, ^test_data))
assert "&0.title() == ^0" == Macro.to_string(hd(query.wheres).expr)
end
test "allows macros in select" do
key = "hello"
from(p in "posts", select: [macro_map(^key)])
end
defmacrop macrotest(x), do: quote(do: is_nil(unquote(x)) or unquote(x) == "A")
defmacrop deeper_macrotest(x), do: quote(do: macrotest(unquote(x)) or unquote(x) == "B")
test "allows macro in where" do
_ = from(p in "posts", where: p.title == "C" or macrotest(p.title))
_ = from(p in "posts", where: p.title == "C" or deeper_macrotest(p.title))
end
test "does not allow nils in comparison at compile time" do
assert_raise Ecto.Query.CompileError,
~r"comparison with nil is forbidden as it is unsafe", fn ->
quote_and_eval from p in "posts", where: p.id == nil
end
end
test "does not allow nils in comparison at runtime" do
assert_raise ArgumentError, ~r"comparison with nil is forbidden as it is unsafe", fn ->
Post |> where([p], p.title == ^nil)
end
end
end
describe "from" do
test "does not allow non-queryable" do
assert_raise Protocol.UndefinedError, fn ->
from(p in 123, []) |> select([p], p.title)
end
assert_raise UndefinedFunctionError, fn ->
from(p in NotASchema, []) |> select([p], p.title)
end
end
test "normalizes expressions" do
quote_and_eval(from("posts", []))
assert_raise ArgumentError, fn ->
quote_and_eval(from("posts", [123]))
end
assert_raise ArgumentError, fn ->
quote_and_eval(from("posts", 123))
end
end
end
describe "subqueries" do
test "builds a subquery struct" do
assert subquery("posts").query.from.source == {"posts", nil}
assert subquery(subquery("posts")).query.from.source == {"posts", nil}
assert subquery(subquery("posts").query).query.from.source == {"posts", nil}
end
test "prefix is not applied if left blank" do
assert subquery("posts").query.prefix == nil
assert subquery(subquery("posts")).query.prefix == nil
assert subquery(subquery("posts").query).query.prefix == nil
end
test "applies prefix to the subquery's query if provided" do
assert subquery("posts", prefix: "my_prefix").query.prefix == "my_prefix"
assert subquery(subquery("posts", prefix: "my_prefix")).query.prefix == "my_prefix"
assert subquery(subquery("posts", prefix: "my_prefix").query).query.prefix == "my_prefix"
end
end
describe "combinations" do
test "adds union expressions" do
union_query1 = from(p in "posts1")
union_query2 = from(p in "posts2")
query =
"posts"
|> union(union_query1)
|> union_all(union_query2)
assert {:union, ^union_query1} = query.combinations |> Enum.at(0)
assert {:union_all, ^union_query2} = query.combinations |> Enum.at(1)
end
test "adds except expressions" do
except_query1 = from(p in "posts1")
except_query2 = from(p in "posts2")
query =
"posts"
|> except(except_query1)
|> except_all(except_query2)
assert {:except, ^except_query1} = query.combinations |> Enum.at(0)
assert {:except_all, ^except_query2} = query.combinations |> Enum.at(1)
end
test "adds intersect expressions" do
intersect_query1 = from(p in "posts1")
intersect_query2 = from(p in "posts2")
query =
"posts"
|> intersect(intersect_query1)
|> intersect_all(intersect_query2)
assert {:intersect, ^intersect_query1} = query.combinations |> Enum.at(0)
assert {:intersect_all, ^intersect_query2} = query.combinations |> Enum.at(1)
end
end
describe "bindings" do
test "are not required by macros" do
_ = from(p in "posts") |> limit(1)
_ = from(p in "posts") |> order_by([asc: :title])
_ = from(p in "posts") |> where(title: "foo")
_ = from(p in "posts") |> having(title: "foo")
_ = from(p in "posts") |> offset(1)
_ = from(p in "posts") |> update(set: [title: "foo"])
_ = from(p in "posts") |> select([:title])
_ = from(p in "posts") |> group_by([:title])
_ = from(p in "posts") |> distinct(true)
_ = from(p in "posts") |> join(:inner, "comments")
end
test "must be a list of variables" do
assert_raise Ecto.Query.CompileError,
"binding list should contain only variables or `{as, var}` tuples, got: 0", fn ->
quote_and_eval select(%Query{}, [0], 1)
end
end
test "ignore unbound _ var" do
assert_raise Ecto.Query.CompileError, fn ->
quote_and_eval("posts" |> select([], _.x))
end
"posts" |> select([_], 0)
"posts" |> join(:inner, [], "comments") |> select([_, c], c.text)
"posts" |> join(:inner, [], "comments") |> select([p, _], p.title)
"posts" |> join(:inner, [], "comments") |> select([_, _], 0)
end
test "can be added through joins" do
from(c in "comments", join: p in "posts", select: {p.title, c.text})
"comments" |> join(:inner, [c], p in "posts", on: true) |> select([c, p], {p.title, c.text})
end
test "can be added through joins with a counter" do
base = join("comments", :inner, [c], p in "posts", on: true)
assert select(base, [{p, 1}], p) == select(base, [c, p], p)
end
test "raise on binding collision" do
assert_raise Ecto.Query.CompileError, "variable `x` is bound twice", fn ->
quote_and_eval("posts" |> from("comments") |> select([x, x], x.id))
end
end
test "raise on too many vars" do
assert from(a in %Query{}, [])
assert from([] in %Query{}, [])
assert from([a] in %Query{}, [])
assert_raise Ecto.Query.CompileError, fn ->
comment = "comments"
from([a, b] in comment, [])
end
end
end
describe "trailing bindings (...)" do
test "match on last bindings" do
query = "posts" |> join(:inner, [], "comments") |> join(:inner, [], "votes")
assert select(query, [..., v], v).select.expr ==
{:&, [], [2]}
assert select(query, [p, ..., v], {p, v}).select.expr ==
{:{}, [], [{:&, [], [0]}, {:&, [], [2]}]}
assert select(query, [p, c, v, ...], v).select.expr ==
{:&, [], [2]}
assert select(query, [..., c, v], {c, v}).select.expr ==
{:{}, [], [{:&, [], [1]}, {:&, [], [2]}]}
end
test "match on last bindings with multiple constructs" do
query =
"posts"
|> join(:inner, [], "comments")
|> where([..., c], c.public)
|> join(:inner, [], "votes")
|> select([..., v], v)
assert query.select.expr == {:&, [], [2]}
assert hd(query.wheres).expr == {{:., [], [{:&, [], [1]}, :public]}, [], []}
end
test "match on last bindings inside joins" do
query =
"posts"
|> join(:inner, [], "comments")
|> join(:inner, [..., c], v in "votes", on: c.id == v.id)
assert hd(tl(query.joins)).on.expr ==
{:==, [], [
{{:., [], [{:&, [], [1]}, :id]}, [], []},
{{:., [], [{:&, [], [2]}, :id]}, [], []}
]}
end
test "match on last bindings on keyword query" do
posts = "posts"
query = from [..., p] in posts, join: c in "comments", on: p.id == c.id
assert hd(query.joins).on.expr ==
{:==, [], [
{{:., [], [{:&, [], [0]}, :id]}, [], []},
{{:., [], [{:&, [], [1]}, :id]}, [], []}
]}
end
test "dynamic in :on takes new binding when ... is used" do
join_on = dynamic([p, ..., c], c.text == "Test Comment")
query = from p in "posts", join: c in "comments", on: ^join_on
assert inspect(query) ==
~s[#Ecto.Query<from p in \"posts\", join: c in \"comments\", on: c.text == \"Test Comment\">]
end
end
describe "named bindings" do
test "assigns a name to a join" do
query =
from(p in "posts",
join: b in "blogs",
join: c in "comments", as: :comment,
join: l in "links", on: l.valid, as: :link)
assert %{comment: 2, link: 3} == query.aliases
end
test "assigns a name to query source" do
query = from p in "posts", as: :post
assert %{post: 0} == query.aliases
assert %{as: :post} = query.from
end
test "assigns a name to query source in var" do
posts_source = "posts"
query = from p in posts_source, as: :post
assert %{post: 0} == query.aliases
assert %{as: :post} = query.from
end
test "assigns a name to a subquery source" do
posts_query = from p in "posts"
query = from p in subquery(posts_query), as: :post
assert %{post: 0} == query.aliases
assert %{as: :post} = query.from
end
test "assign to source fails when non-atom name passed" do
message = ~r"`as` must be a compile time atom, got: `\"post\"`"
assert_raise Ecto.Query.CompileError, message, fn ->
quote_and_eval(from(p in "posts", as: "post"))
end
end
test "crashes on duplicate as for keyword query" do
message = ~r"`as` keyword was given more than once"
assert_raise Ecto.Query.CompileError, message, fn ->
quote_and_eval(from(p in "posts", join: b in "blogs", as: :foo, as: :bar))
end
end
test "crashes on assigning the same name twice at compile time" do
message = ~r"alias `:foo` already exists"
assert_raise Ecto.Query.CompileError, message, fn ->
quote_and_eval(from(p in "posts", join: b in "blogs", as: :foo, join: c in "comments", as: :foo))
end
end
test "crashes on assigning the same name twice at runtime" do
message = ~r"alias `:foo` already exists"
assert_raise Ecto.Query.CompileError, message, fn ->
query = "posts"
from(p in query, join: b in "blogs", as: :foo, join: c in "comments", as: :foo)
end
end
test "crashes on assigning the same name twice when aliasing source" do
message = ~r"alias `:foo` already exists"
assert_raise Ecto.Query.CompileError, message, fn ->
query = from p in "posts", join: b in "blogs", as: :foo
from(p in query, as: :foo)
end
end
test "crashes on assigning the name to source when it already has one" do
message = ~r"can't apply alias `:foo`, binding in `from` is already aliased to `:post`"
assert_raise Ecto.Query.CompileError, message, fn ->
query = from p in "posts", as: :post
from(p in query, as: :foo)
end
end
test "match on binding by name" do
query =
"posts"
|> join(:inner, [p], c in "comments", as: :comment)
|> where([comment: c], c.id == 0)
assert inspect(query) ==
~s[#Ecto.Query<from p in \"posts\", join: c in \"comments\", as: :comment, on: true, where: c.id == 0>]
end
test "match on binding by name for source" do
query =
from(p in "posts", as: :post)
|> where([post: p], p.id == 0)
assert inspect(query) ==
~s[#Ecto.Query<from p in \"posts\", as: :post, where: p.id == 0>]
end
test "match on binding by name for source and join" do
query =
"posts"
|> from(as: :post)
|> join(:inner, [post: p], "comments", as: :comment)
|> update([comment: c], set: [id: c.id + 1])
assert inspect(query) ==
~s{#Ecto.Query<from p in "posts", as: :post, join: c in "comments", as: :comment, on: true, update: [set: [id: c.id + 1]]>}
end
test "match on binding by name with ... in the middle" do
query =
"posts"
|> join(:inner, [p], c in "comments")
|> join(:inner, [], a in "authors", as: :authors)
|> where([p, ..., authors: a], a.id == 0)
assert inspect(query) ==
~s[#Ecto.Query<from p in \"posts\", join: c in \"comments\", on: true, join: a in \"authors\", as: :authors, on: true, where: a.id == 0>]
end
test "referring to non-existing binding" do
assert_raise Ecto.QueryError, ~r"unknown bind name `:nope`", fn ->
"posts"
|> join(:inner, [p], c in "comments", as: :comment)
|> where([nope: c], c.id == 0)
end
end
test "named bind not in tail of the list" do
message = ~r"tuples must be at the end of the binding list"
assert_raise Ecto.Query.CompileError, message, fn ->
quote_and_eval(
"posts"
|> join(:inner, [p], c in "comments", as: :comment)
|> where([{:comment, c}, p], c.id == 0)
)
end
end
test "dynamic in :on takes new binding when alias is used" do
join_on = dynamic([p, comment: c], c.text == "Test Comment")
query = from p in "posts", join: c in "comments", as: :comment, on: ^join_on
assert inspect(query) ==
~s[#Ecto.Query<from p in \"posts\", join: c in \"comments\", as: :comment, on: c.text == \"Test Comment\">]
end
end
describe "prefixes" do
test "are supported on from and join" do
query = from "posts", prefix: "hello", join: "comments", prefix: "world"
assert query.from.prefix == "hello"
assert hd(query.joins).prefix == "world"
end
test "are supported on dynamic from" do
posts = "posts"
query = from posts, prefix: "hello"
assert query.from.prefix == "hello"
end
test "raises when conflicting with dynamic from" do
posts = from "posts", prefix: "hello"
message = "can't apply prefix `\"world\"`, `from` is already prefixed to `\"hello\"`"
assert_raise Ecto.Query.CompileError, message, fn ->
from posts, prefix: "world"
end
end
test "are expected to be compile-time strings" do
assert_raise Ecto.Query.CompileError, ~r"`prefix` must be a compile time string", fn ->
quote_and_eval(from "posts", prefix: 123)
end
assert_raise Ecto.Query.CompileError, ~r"`prefix` must be a compile time string", fn ->
quote_and_eval(from "posts", join: "comments", prefix: 123)
end
end
end
describe "hints" do
test "are supported on from and join" do
query = from "posts", hints: "hello", join: "comments", hints: ["world", "extra"]
assert query.from.hints == ["hello"]
assert hd(query.joins).hints == ["world", "extra"]
end
test "are supported on dynamic from" do
posts = "posts"
query = from posts, hints: "hello"
assert query.from.hints == ["hello"]
posts = from "posts", hints: "hello"
query = from posts, hints: "world"
assert query.from.hints == ["hello", "world"]
end
test "are expected to be compile-time strings or list of strings" do
assert_raise Ecto.Query.CompileError, ~r"`hints` must be a compile time string", fn ->
quote_and_eval(from "posts", hints: 123)
end
assert_raise Ecto.Query.CompileError, ~r"`hints` must be a compile time string", fn ->
quote_and_eval(from "posts", join: "comments", hints: 123)
end
end
end
describe "keyword queries" do
test "are supported through from/2" do
# queries need to be on the same line or == won't work
assert from(p in "posts", select: 1 < 2) == from(p in "posts", []) |> select([p], 1 < 2)
assert from(p in "posts", where: 1 < 2) == from(p in "posts", []) |> where([p], 1 < 2)
query = "posts"
assert (query |> select([p], p.title)) == from(p in query, select: p.title)
end
test "are built at compile time with binaries" do
quoted =
quote do
from(p in "posts",
join: b in "blogs",
join: c in "comments", on: c.text == "",
limit: 0,
where: p.id == 0 and b.id == 0 and c.id == 0,
select: p)
end
assert {:%{}, _, list} = Macro.expand(quoted, __ENV__)
assert List.keyfind(list, :__struct__, 0) == {:__struct__, Query}
end
test "are built at compile time with atoms" do
quoted =
quote do
from(p in Post,
join: b in Blog,
join: c in Comment, on: c.text == "",
limit: 0,
where: p.id == 0 and b.id == 0 and c.id == 0,
select: p)
end
assert {:%{}, _, list} = Macro.expand(quoted, __ENV__)
assert List.keyfind(list, :__struct__, 0) == {:__struct__, Query}
end
test "are built at compile time even with joins" do
from(c in "comments", join: p in "posts", on: c.text == "", select: c)
from(p in "posts", join: c in assoc(p, :comments), select: p)
message = ~r"`on` keyword must immediately follow a join"
assert_raise Ecto.Query.CompileError, message, fn ->
quote_and_eval(from(c in "comments", on: c.text == "", select: c))
end
end
end
describe "exclude/2" do
test "removes the given field" do
base = %Ecto.Query{}
query =
from(p in "posts",
join: b in "blogs",
join: c in "comments",
where: p.id == 0 and b.id == 0,
or_where: c.id == 0,
order_by: p.title,
union: from(p in "posts"),
union_all: from(p in "posts"),
except: from(p in "posts"),
intersect: from(p in "posts"),
limit: 2,
offset: 10,
group_by: p.author,
having: p.comments > 10,
distinct: p.category,
lock: "FOO",
select: p
)
# Pre-exclusion assertions
refute query.joins == base.joins
refute query.wheres == base.wheres
refute query.order_bys == base.order_bys
refute query.group_bys == base.group_bys
refute query.havings == base.havings
refute query.distinct == base.distinct
refute query.select == base.select
refute query.combinations == base.combinations
refute query.limit == base.limit
refute query.offset == base.offset
refute query.lock == base.lock
excluded_query =
query
|> exclude(:join)
|> exclude(:where)
|> exclude(:order_by)
|> exclude(:group_by)
|> exclude(:having)
|> exclude(:distinct)
|> exclude(:select)
|> exclude(:combinations)
|> exclude(:limit)
|> exclude(:offset)
|> exclude(:lock)
# Post-exclusion assertions
assert excluded_query.joins == base.joins
assert excluded_query.wheres == base.wheres
assert excluded_query.order_bys == base.order_bys
assert excluded_query.group_bys == base.group_bys
assert excluded_query.havings == base.havings
assert excluded_query.distinct == base.distinct
assert excluded_query.select == base.select
assert excluded_query.combinations == base.combinations
assert excluded_query.limit == base.limit
assert excluded_query.offset == base.offset
assert excluded_query.lock == base.lock
end
test "works on any queryable" do
query = "posts" |> exclude(:select)
assert query.from
refute query.select
end
test "does not set a non-existent field to nil" do
query = from(p in "posts", select: p)
msg = ~r"no function clause matching in Ecto.Query"
assert_raise FunctionClauseError, msg, fn ->
Ecto.Query.exclude(query, :fake_field)
end
end
test "does not reset :from" do
query = from(p in "posts", select: p)
msg = ~r"no function clause matching in Ecto.Query"
assert_raise FunctionClauseError, msg, fn ->
Ecto.Query.exclude(query, :from)
end
end
test "resets both preloads and assocs if :preloads is passed in" do
base = %Ecto.Query{}
query = from p in "posts", join: c in assoc(p, :comments), preload: [:author, comments: c]
refute query.preloads == base.preloads
refute query.assocs == base.assocs
excluded_query = query |> exclude(:preload)
assert excluded_query.preloads == base.preloads
assert excluded_query.assocs == base.assocs
end
test "removes join qualifiers" do
base = %Ecto.Query{}
inner_query = from p in "posts", inner_join: b in "blogs"
cross_query = from p in "posts", cross_join: b in "blogs"
left_query = from p in "posts", left_join: b in "blogs"
right_query = from p in "posts", right_join: b in "blogs"
full_query = from p in "posts", full_join: b in "blogs"
inner_lateral_query = from p in "posts", inner_lateral_join: b in "blogs"
left_lateral_query = from p in "posts", left_lateral_join: b in "blogs"
refute inner_query.joins == base.joins
refute cross_query.joins == base.joins
refute left_query.joins == base.joins
refute right_query.joins == base.joins
refute full_query.joins == base.joins
refute inner_lateral_query.joins == base.joins
refute left_lateral_query.joins == base.joins
excluded_inner_query = exclude(inner_query, :inner_join)
assert excluded_inner_query.joins == base.joins
excluded_cross_query = exclude(cross_query, :cross_join)
assert excluded_cross_query.joins == base.joins
excluded_left_query = exclude(left_query, :left_join)
assert excluded_left_query.joins == base.joins
excluded_right_query = exclude(right_query, :right_join)
assert excluded_right_query.joins == base.joins
excluded_full_query = exclude(full_query, :full_join)
assert excluded_full_query.joins == base.joins
excluded_inner_lateral_query = exclude(inner_lateral_query, :inner_lateral_join)
assert excluded_inner_lateral_query.joins == base.joins
excluded_left_lateral_query = exclude(left_lateral_query, :left_lateral_join)
assert excluded_left_lateral_query.joins == base.joins
end
end
describe "fragment/1" do
test "raises at runtime when interpolation is not a keyword list" do
assert_raise ArgumentError, ~r/fragment\(...\) does not allow strings to be interpolated/s, fn ->
clause = ["1 = ?"]
from p in "posts", where: fragment(^clause)
end
end
test "raises at runtime when interpolation is a binary string" do
assert_raise ArgumentError, ~r/fragment\(...\) does not allow strings to be interpolated/, fn ->
clause = "1 = ?"
from p in "posts", where: fragment(^clause)
end
end
test "keeps UTF-8 encoding" do
assert inspect(from p in "posts", where: fragment("héllò")) ==
~s[#Ecto.Query<from p in \"posts\", where: fragment("héllò")>]
end
end
describe "has_named_binding?/1" do
test "returns true if query has a named binding" do
query =
from(p in "posts", as: :posts,
join: b in "blogs",
join: c in "comments", as: :comment,
join: l in "links", on: l.valid, as: :link)
assert has_named_binding?(query, :posts)
assert has_named_binding?(query, :comment)
assert has_named_binding?(query, :link)
end
test "returns false if query does not have a named binding" do
query = from(p in "posts")
refute has_named_binding?(query, :posts)
end
test "returns false when query is a tuple, atom or binary" do
refute has_named_binding?({:foo, :bar}, :posts)
refute has_named_binding?(:foo, :posts)
refute has_named_binding?("foo", :posts)
end
test "casts queryable to query" do
assert_raise Protocol.UndefinedError, "protocol Ecto.Queryable not implemented for []", fn ->
has_named_binding?([], :posts)
end
end
end
end
| 33.866481 | 145 | 0.583696 |
03e9b31d2139563a10afd00c10b83e51f610fd38 | 111 | exs | Elixir | test/simple_state_machine_test.exs | RobertDober/lab42_simple_state_machine | 947c092212fefea18c512a881c7ff25a898b9177 | [
"Apache-2.0"
] | null | null | null | test/simple_state_machine_test.exs | RobertDober/lab42_simple_state_machine | 947c092212fefea18c512a881c7ff25a898b9177 | [
"Apache-2.0"
] | null | null | null | test/simple_state_machine_test.exs | RobertDober/lab42_simple_state_machine | 947c092212fefea18c512a881c7ff25a898b9177 | [
"Apache-2.0"
] | null | null | null | defmodule SimpleStateMachineTest do
use ExUnit.Case
doctest Lab42.SimpleStateMachine, import: true
end
| 15.857143 | 48 | 0.801802 |
03e9b6b4180df084cfc576556bd862eac6e1a42d | 1,779 | exs | Elixir | test/supermemo_test.exs | FrancoB411/supermemo | 8179c8f7a407c32518f01b8a8c706e21af457d78 | [
"MIT"
] | 18 | 2015-10-02T11:23:36.000Z | 2021-11-17T10:53:40.000Z | test/supermemo_test.exs | FrancoB411/supermemo | 8179c8f7a407c32518f01b8a8c706e21af457d78 | [
"MIT"
] | 3 | 2020-02-05T02:29:44.000Z | 2021-02-13T06:24:25.000Z | test/supermemo_test.exs | FrancoB411/supermemo | 8179c8f7a407c32518f01b8a8c706e21af457d78 | [
"MIT"
] | 4 | 2019-01-25T04:02:49.000Z | 2021-02-13T06:25:39.000Z | defmodule SupermemoTest do
use ExUnit.Case
use ExCheck
test "first iteration" do
assert Supermemo.set_interval(1, 0, 1, 2.5) == 1
end
test "second iteration" do
assert Supermemo.set_interval(1, 1, 1, 2.5) == 6
end
property :interval do
for_all x in int() do
implies x > 2 do
prior = Supermemo.set_interval(1, x - 1, x - 1, 2.5)
interval = Supermemo.set_interval(1, x, prior, 2.5)
expected = round(prior * 2.5)
interval == expected
end
end
end
property :efactor_with_score do
for_all x in such_that(xx in int(1, 100) when xx < 100) do
q = x / 100
base_ef = 2.5
expected = base_ef + (0.1 - (5 - (q * 5)) * (0.08 + (5 - (q * 5)) * 0.02))
adjusted = Supermemo.adjust_efactor(base_ef, q)
adjusted == expected
end
end
property :efactor_with_efactor do
for_all x in such_that(xx in int(130, 250) when xx < 250) do
ef = x / 100
q = 1.0
expected = ef + (0.1 - (5 - (q * 5)) * (0.08 + (5 - (q * 5)) * 0.02))
adjusted = Supermemo.adjust_efactor(ef, q)
adjusted == expected
end
end
property :efactor_with_efactor_and_zero_dot_eight do
for_all x in such_that(xx in int(130, 250) when xx < 250) do
ef = x / 100
q = 0.8
expected = ef + (0.1 - (5 - (q * 5)) * (0.08 + (5 - (q * 5)) * 0.02))
adjusted = Supermemo.adjust_efactor(ef, q)
adjusted == expected
end
end
property :efactor_with_efactor_and_zero_dot_four do
for_all x in such_that(xx in int(130, 250) when xx < 250) do
ef = x / 100
q = 0.4
expected = ef + (0.1 - (5 - (q * 5)) * (0.08 + (5 - (q * 5)) * 0.02))
adjusted = Supermemo.adjust_efactor(ef, q)
adjusted == expected
end
end
end
| 27.796875 | 80 | 0.572794 |
03e9c74e95edde9c97e74db205a8de4d43a9868f | 1,160 | exs | Elixir | test/serialization/module_name_type_provider_test.exs | jccf091/commanded | 5d68a2b1b7a222b6f204c48d886f3d2c9670f26a | [
"MIT"
] | 1 | 2022-02-20T10:42:07.000Z | 2022-02-20T10:42:07.000Z | test/serialization/module_name_type_provider_test.exs | jccf091/commanded | 5d68a2b1b7a222b6f204c48d886f3d2c9670f26a | [
"MIT"
] | null | null | null | test/serialization/module_name_type_provider_test.exs | jccf091/commanded | 5d68a2b1b7a222b6f204c48d886f3d2c9670f26a | [
"MIT"
] | null | null | null | defmodule Commanded.Serialization.ModuleNameTypeProviderTest do
use ExUnit.Case
alias Commanded.ExampleDomain.BankAccount.Events.BankAccountOpened
alias Commanded.Serialization.ModuleNameTypeProvider
@account_opened_type "Elixir.Commanded.ExampleDomain.BankAccount.Events.BankAccountOpened"
test "should convert an event to its type" do
account_opened = %BankAccountOpened{account_number: "ACC123", initial_balance: 1_000}
assert ModuleNameTypeProvider.to_string(account_opened) == @account_opened_type
end
test "should convert a type to its struct" do
assert ModuleNameTypeProvider.to_struct(@account_opened_type) == %BankAccountOpened{}
end
defmodule NamedEvent do
defstruct [:data]
end
defmodule AnotherNamedEvent do
defstruct [:data]
end
test "should convert module struct to type" do
assert "Elixir.Commanded.Serialization.ModuleNameTypeProviderTest.NamedEvent" ==
ModuleNameTypeProvider.to_string(%NamedEvent{})
assert "Elixir.Commanded.Serialization.ModuleNameTypeProviderTest.AnotherNamedEvent" ==
ModuleNameTypeProvider.to_string(%AnotherNamedEvent{})
end
end
| 33.142857 | 92 | 0.787069 |
03e9e17bd010b761cf12e1830d78dfdf7c70e604 | 2,559 | ex | Elixir | lib/ueberauth/strategy/esi/oauth.ex | joshuataylor/ueberauth_esi | 294430beea892cd8acb2f0fb0088d429f8905667 | [
"MIT"
] | null | null | null | lib/ueberauth/strategy/esi/oauth.ex | joshuataylor/ueberauth_esi | 294430beea892cd8acb2f0fb0088d429f8905667 | [
"MIT"
] | 2 | 2019-10-30T11:22:31.000Z | 2019-11-04T09:05:57.000Z | lib/ueberauth/strategy/esi/oauth.ex | joshuataylor/ueberauth_esi | 294430beea892cd8acb2f0fb0088d429f8905667 | [
"MIT"
] | 3 | 2019-10-29T10:42:30.000Z | 2020-03-21T19:26:27.000Z | defmodule Ueberauth.Strategy.ESI.OAuth do
@moduledoc """
An implementation of OAuth2 for ESI.
To add your `client_id` and `client_secret` include these values in your configuration.
config :ueberauth, Ueberauth.Strategy.ESI.OAuth,
client_id: System.get_env("ESI_CLIENT_ID"),
client_secret: System.get_env("ESI_CLIENT_SECRET")
"""
use OAuth2.Strategy
@defaults [
strategy: __MODULE__,
site: "https://login.eveonline.com",
authorize_url: "https://login.eveonline.com/oauth/authorize",
    token_url: "https://login.eveonline.com/oauth/token"
  ]
def get(token, url, headers \\ [], opts \\ []) do
[token: token]
|> client
|> put_param("client_secret", client().client_secret)
|> OAuth2.Client.get(url, headers, opts)
end
def get_authorization_access_token(token) do
client()
|> basic_auth()
|> post!(
Keyword.get(@defaults, :token_url),
%{grant_type: "authorization_code", code: token},
["Content-Type": "application/json"]
)
|> Map.get(:body)
# |> Map.get("access_token")
end
@doc """
Construct a client for requests to ESI.
Optionally include any OAuth2 options here to be merged with the defaults.
Ueberauth.Strategy.ESI.OAuth.client(redirect_uri: "http://localhost:4000/auth/ESI/callback")
This will be setup automatically for you in `Ueberauth.Strategy.ESI`.
These options are only useful for usage outside the normal callback phase of Ueberauth.
"""
def client(opts \\ []) do
config =
:ueberauth
|> Application.fetch_env!(Ueberauth.Strategy.ESI.OAuth)
|> check_config_key_exists(:client_id)
|> check_config_key_exists(:client_secret)
client_opts =
@defaults
|> Keyword.merge(config)
|> Keyword.merge(opts)
OAuth2.Client.new(client_opts)
end
@doc """
Provides the authorize url for the request phase of Ueberauth. No need to call this usually.
"""
def authorize_url!(params \\ [], opts \\ []) do
opts
|> client
|> OAuth2.Client.authorize_url!(params)
end
def authorize_url(client, params) do
OAuth2.Strategy.AuthCode.authorize_url(client, params)
end
defp check_config_key_exists(config, key) when is_list(config) do
unless Keyword.has_key?(config, key) do
      raise "#{inspect(key)} missing from config :ueberauth, Ueberauth.Strategy.ESI"
end
config
end
defp check_config_key_exists(_, _) do
raise "Config :ueberauth, Ueberauth.Strategy.ESI is not a keyword list, as expected"
end
end
| 29.413793 | 98 | 0.679562 |
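The moduledoc above covers client configuration; outside Ueberauth's normal callback phase, building an authorize URL with this module boils down to the following sketch (the redirect URI and scope values here are assumptions, not values from this module):

```elixir
# Hypothetical request-phase usage of Ueberauth.Strategy.ESI.OAuth;
# requires the :ueberauth config shown in the moduledoc to be present.
url =
  Ueberauth.Strategy.ESI.OAuth.authorize_url!(
    [scope: "publicData"],
    redirect_uri: "http://localhost:4000/auth/esi/callback"
  )

# `url` points at https://login.eveonline.com/oauth/authorize with the
# client_id, redirect_uri and scope encoded as query parameters.
```

The second argument is merged into the client options, which is why `redirect_uri` can be supplied per call, as the moduledoc suggests.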
03ea0cae2b7cda8cf6ade8005c3ce992801b485f | 68 | exs | Elixir | chapter6/ModulesAndFunctions-4.exs | matheustp/programming-elixir-1.3-exercises | f32ad2f6c53da11a24e602fc63299f65a3c48dfc | [
"MIT"
] | null | null | null | chapter6/ModulesAndFunctions-4.exs | matheustp/programming-elixir-1.3-exercises | f32ad2f6c53da11a24e602fc63299f65a3c48dfc | [
"MIT"
] | null | null | null | chapter6/ModulesAndFunctions-4.exs | matheustp/programming-elixir-1.3-exercises | f32ad2f6c53da11a24e602fc63299f65a3c48dfc | [
"MIT"
] | null | null | null | defmodule Sum do
def of(1), do: 1
def of(n), do: n + of(n-1)
end
| 17 | 28 | 0.573529 |
03ea0ea692045fc1fee6d223393933574063e50e | 5,684 | ex | Elixir | lib/ueberauth/strategy/facebook.ex | lesykos/ueberauth_facebook | 9e8a4cc48260372330969f6ecafad7bcaa4e954d | [
"MIT"
] | null | null | null | lib/ueberauth/strategy/facebook.ex | lesykos/ueberauth_facebook | 9e8a4cc48260372330969f6ecafad7bcaa4e954d | [
"MIT"
] | null | null | null | lib/ueberauth/strategy/facebook.ex | lesykos/ueberauth_facebook | 9e8a4cc48260372330969f6ecafad7bcaa4e954d | [
"MIT"
] | null | null | null | defmodule Ueberauth.Strategy.Facebook do
@moduledoc """
Facebook Strategy for Überauth.
"""
use Ueberauth.Strategy, default_scope: "email,public_profile",
profile_fields: "id,email,gender,link,locale,name,timezone,updated_time,verified",
uid_field: :id,
allowed_request_params: [
:auth_type,
:scope,
:locale,
:state,
:display
]
alias Ueberauth.Auth.Info
alias Ueberauth.Auth.Credentials
alias Ueberauth.Auth.Extra
@doc """
Handles initial request for Facebook authentication.
"""
def handle_request!(conn) do
allowed_params = conn
|> option(:allowed_request_params)
|> Enum.map(&to_string/1)
authorize_url = conn.params
|> maybe_replace_param(conn, "auth_type", :auth_type)
|> maybe_replace_param(conn, "scope", :default_scope)
|> maybe_replace_param(conn, "state", :state)
|> maybe_replace_param(conn, "display", :display)
|> Enum.filter(fn {k, _v} -> Enum.member?(allowed_params, k) end)
|> Enum.map(fn {k, v} -> {String.to_existing_atom(k), v} end)
|> Keyword.put(:redirect_uri, callback_url(conn))
|> Ueberauth.Strategy.Facebook.OAuth.authorize_url!
redirect!(conn, authorize_url)
end
@doc """
Handles the callback from Facebook.
"""
def handle_callback!(%Plug.Conn{params: %{"code" => code}} = conn) do
opts = [redirect_uri: callback_url(conn)]
try do
client = Ueberauth.Strategy.Facebook.OAuth.get_token!([code: code], opts)
token = client.token
if token.access_token == nil do
err = token.other_params["error"]
desc = token.other_params["error_description"]
set_errors!(conn, [error(err, desc)])
else
fetch_user(conn, client)
end
rescue
OAuth2.Error ->
set_errors!(conn, [error("invalid_code", "The code has been used or has expired")])
end
end
@doc false
def handle_callback!(conn) do
set_errors!(conn, [error("missing_code", "No code received")])
end
@doc false
def handle_cleanup!(conn) do
conn
|> put_private(:facebook_user, nil)
|> put_private(:facebook_token, nil)
end
@doc """
Fetches the uid field from the response.
"""
def uid(conn) do
uid_field =
conn
|> option(:uid_field)
|> to_string
conn.private.facebook_user[uid_field]
end
@doc """
Includes the credentials from the facebook response.
"""
def credentials(conn) do
token = conn.private.facebook_token
scopes = token.other_params["scope"] || ""
scopes = String.split(scopes, ",")
%Credentials{
expires: !!token.expires_at,
expires_at: token.expires_at,
scopes: scopes,
token: token.access_token
}
end
@doc """
Fetches the fields to populate the info section of the
`Ueberauth.Auth` struct.
"""
def info(conn) do
user = conn.private.facebook_user
%Info{
description: user["bio"],
email: user["email"],
first_name: user["first_name"],
image: fetch_image(user["id"]),
last_name: user["last_name"],
name: user["name"],
urls: %{
facebook: user["link"],
website: user["website"]
}
}
end
@doc """
Stores the raw information (including the token) obtained from
the facebook callback.
"""
def extra(conn) do
%Extra{
raw_info: %{
token: conn.private.facebook_token,
user: conn.private.facebook_user
}
}
end
defp fetch_image(uid) do
"https://graph.facebook.com/#{uid}/picture?type=large"
end
defp fetch_user(conn, client) do
conn = put_private(conn, :facebook_token, client.token)
query = user_query(conn, client.token)
path = "/me?#{query}"
case OAuth2.Client.get(client, path) do
{:ok, %OAuth2.Response{status_code: 401, body: _body}} ->
set_errors!(conn, [error("token", "unauthorized")])
{:ok, %OAuth2.Response{status_code: status_code, body: user}}
when status_code in 200..399 ->
put_private(conn, :facebook_user, user)
{:error, %OAuth2.Error{reason: reason}} ->
set_errors!(conn, [error("OAuth2", reason)])
end
end
defp user_query(conn, token) do
%{"appsecret_proof" => appsecret_proof(token)}
|> Map.merge(query_params(conn, :locale))
|> Map.merge(query_params(conn, :profile))
|> URI.encode_query
end
defp appsecret_proof(token) do
config = Application.get_env(:ueberauth, Ueberauth.Strategy.Facebook.OAuth)
client_secret = Keyword.get(config, :client_secret)
token.access_token
|> hmac(:sha256, client_secret)
|> Base.encode16(case: :lower)
end
defp hmac(data, type, key) do
:crypto.hmac(type, key, data)
end
defp query_params(conn, :profile) do
%{"fields" => option(conn, :profile_fields)}
end
defp query_params(conn, :locale) do
case option(conn, :locale) do
nil -> %{}
locale -> %{"locale" => locale}
end
end
defp option(conn, key) do
default = Keyword.get(default_options(), key)
conn
|> options
|> Keyword.get(key, default)
end
defp option(nil, conn, key), do: option(conn, key)
defp option(value, _conn, _key), do: value
defp maybe_replace_param(params, conn, name, config_key) do
if params[name] || is_nil(option(params[name], conn, config_key)) do
params
else
Map.put(
params,
name,
option(params[name], conn, config_key)
)
end
end
end
| 27.066667 | 108 | 0.614356 |
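The `appsecret_proof` helper above implements Facebook's documented HMAC-SHA256 proof of the app secret over the access token. As a standalone sketch (the token and secret here are made-up placeholders):

```elixir
# Equivalent of appsecret_proof/1 above, outside the module.
token = "EAAG-example-access-token"
secret = "example-app-secret"

proof =
  :crypto.hmac(:sha256, secret, token)
  |> Base.encode16(case: :lower)
```

On newer OTP releases `:crypto.hmac/3` has been removed in favour of `:crypto.mac(:hmac, :sha256, key, data)`, so the `hmac/3` helper in this module would need the same swap there.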
03ea24e826f41cd8254b985b210344893a8dbae6 | 190 | ex | Elixir | lib/conduit/blog/queries/article_by_slug.ex | rudyyazdi/conduit | 8defa60962482fb81f5093ea5d58b71a160db3c4 | [
"MIT"
] | null | null | null | lib/conduit/blog/queries/article_by_slug.ex | rudyyazdi/conduit | 8defa60962482fb81f5093ea5d58b71a160db3c4 | [
"MIT"
] | 2 | 2022-01-15T02:09:30.000Z | 2022-01-22T10:18:43.000Z | lib/conduit/blog/queries/article_by_slug.ex | rudyyazdi/conduit | 8defa60962482fb81f5093ea5d58b71a160db3c4 | [
"MIT"
] | null | null | null | defmodule Conduit.Blog.Queries.ArticleBySlug do
import Ecto.Query
alias Conduit.Blog.Projections.Article
def new(slug) do
from a in Article,
where: a.slug == ^slug
end
end
| 17.272727 | 47 | 0.721053 |
03ea58d02d73633862e1eca7c901d7da8ab0feb0 | 242 | exs | Elixir | exercises/rna-transcription/rna_transcription.exs | jerith/elixir | 9a3f2a2fbee26a7b6a6b3ad74a9e6d1ff2495ed4 | [
"Apache-2.0"
] | null | null | null | exercises/rna-transcription/rna_transcription.exs | jerith/elixir | 9a3f2a2fbee26a7b6a6b3ad74a9e6d1ff2495ed4 | [
"Apache-2.0"
] | null | null | null | exercises/rna-transcription/rna_transcription.exs | jerith/elixir | 9a3f2a2fbee26a7b6a6b3ad74a9e6d1ff2495ed4 | [
"Apache-2.0"
] | 1 | 2018-07-19T23:43:56.000Z | 2018-07-19T23:43:56.000Z | defmodule RNATranscription do
@doc """
Transcribes a character list representing DNA nucleotides to RNA
## Examples
iex> RNATranscription.to_rna('ACTG')
'UGAC'
"""
@spec to_rna([char]) :: [char]
  def to_rna(dna) do
    # Transcribe each DNA nucleotide to its RNA complement,
    # satisfying the doctest above ('ACTG' -> 'UGAC').
    Enum.map(dna, fn
      ?G -> ?C
      ?C -> ?G
      ?T -> ?A
      ?A -> ?U
    end)
  end
end
| 17.285714 | 66 | 0.681818 |
03ea6d5757b13c6f3aca6e951eb5133e8db0163a | 1,520 | ex | Elixir | clients/safe_browsing/lib/google_api/safe_browsing/v4/model/google_protobuf_empty.ex | yoshi-code-bot/elixir-google-api | cdb6032f01fac5ab704803113c39f2207e9e019d | [
"Apache-2.0"
] | null | null | null | clients/safe_browsing/lib/google_api/safe_browsing/v4/model/google_protobuf_empty.ex | yoshi-code-bot/elixir-google-api | cdb6032f01fac5ab704803113c39f2207e9e019d | [
"Apache-2.0"
] | null | null | null | clients/safe_browsing/lib/google_api/safe_browsing/v4/model/google_protobuf_empty.ex | yoshi-code-bot/elixir-google-api | cdb6032f01fac5ab704803113c39f2207e9e019d | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.SafeBrowsing.V4.Model.GoogleProtobufEmpty do
@moduledoc """
A generic empty message that you can re-use to avoid defining duplicated empty messages in your APIs. A typical example is to use it as the request or the response type of an API method. For instance: service Foo { rpc Bar(google.protobuf.Empty) returns (google.protobuf.Empty); }
## Attributes
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{}
end
defimpl Poison.Decoder, for: GoogleApi.SafeBrowsing.V4.Model.GoogleProtobufEmpty do
def decode(value, options) do
GoogleApi.SafeBrowsing.V4.Model.GoogleProtobufEmpty.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.SafeBrowsing.V4.Model.GoogleProtobufEmpty do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 36.190476 | 282 | 0.767763 |
03ea83fa04f0ce449ae5aaf9a7b8e58c9bf05640 | 482 | ex | Elixir | base/fc_support/lib/fc_support/control_flow.ex | fleadope/freshcom | 8d5944befaa6eea8d31e5f5995939be2a1a44262 | [
"BSD-3-Clause"
] | 46 | 2018-10-13T23:18:13.000Z | 2021-08-07T07:46:51.000Z | base/fc_support/lib/fc_support/control_flow.ex | fleadope/freshcom | 8d5944befaa6eea8d31e5f5995939be2a1a44262 | [
"BSD-3-Clause"
] | 25 | 2018-10-14T00:56:07.000Z | 2019-12-23T19:41:02.000Z | base/fc_support/lib/fc_support/control_flow.ex | fleadope/freshcom | 8d5944befaa6eea8d31e5f5995939be2a1a44262 | [
"BSD-3-Clause"
] | 5 | 2018-12-16T04:39:51.000Z | 2020-10-01T12:17:03.000Z | defmodule FCSupport.ControlFlow do
@doc """
  Unwrap the result out of the tagged tuple if the tag is `:ok`, otherwise
return the input.
"""
@spec unwrap_ok({:error, any}) :: {:error, any}
def unwrap_ok({:error, reason}), do: {:error, reason}
@spec unwrap_ok({:ok, any}) :: any
def unwrap_ok({:ok, result}), do: result
@spec unwrap_ok(any) :: any
def unwrap_ok(any), do: any
def tt_wrap({:error, any}), do: {:error, any}
def tt_wrap(any), do: {:ok, any}
end
| 28.352941 | 74 | 0.636929 |
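The specs above pin down `unwrap_ok/1`'s three clauses; a quick usage sketch (assuming the module compiles as shown):

```elixir
alias FCSupport.ControlFlow

ControlFlow.unwrap_ok({:ok, 42})        # => 42
ControlFlow.unwrap_ok({:error, :nope})  # => {:error, :nope}
ControlFlow.unwrap_ok(:passthrough)     # => :passthrough
```

`tt_wrap/1` is the inverse direction: it tags a bare value as `{:ok, value}` while leaving `{:error, _}` tuples untouched.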
03ea9200fdef830400d8bf135067d882ba3f3d42 | 701 | exs | Elixir | mix.exs | technicalcapt/phoenix_mjml | 7e18839fafb02c0ada44c1c6ade12d84ac01c842 | [
"MIT"
] | null | null | null | mix.exs | technicalcapt/phoenix_mjml | 7e18839fafb02c0ada44c1c6ade12d84ac01c842 | [
"MIT"
] | null | null | null | mix.exs | technicalcapt/phoenix_mjml | 7e18839fafb02c0ada44c1c6ade12d84ac01c842 | [
"MIT"
] | null | null | null | defmodule PhoenixMjml.Mixfile do
use Mix.Project
def project do
[app: :phoenix_mjml,
version: "0.2.1",
elixir: "~> 1.4",
deps: deps(),
package: package(),
description: description()]
end
def application do
[applications: [:phoenix, :uuid]]
end
defp deps do
[{:phoenix, "~> 1.2"},
{:phoenix_html, "~> 2.6"},
{:uuid, "~> 1.1"},
{:ex_doc, "~> 0.18", only: :dev, runtime: false}]
end
defp description do
"""
Phoenix Template Engine for Mjml
"""
end
defp package do
[name: :phoenix_mjml,
maintainers: ["MQuy"],
licenses: ["MIT"],
links: %{github: "https://github.com/MQuy/phoenix_mjml"}]
end
end
| 18.945946 | 62 | 0.564907 |
03eaa8956afb4b7ce5b66bb5c15162cb3c4247af | 5,723 | ex | Elixir | lib/bolt_sips/internals/pack_stream/message/encoder_v1.ex | cheerfulstoic/bolt_sips | e86d6443f69d59f6cc41ecae5d0718ed05ea4904 | [
"Apache-2.0"
] | null | null | null | lib/bolt_sips/internals/pack_stream/message/encoder_v1.ex | cheerfulstoic/bolt_sips | e86d6443f69d59f6cc41ecae5d0718ed05ea4904 | [
"Apache-2.0"
] | null | null | null | lib/bolt_sips/internals/pack_stream/message/encoder_v1.ex | cheerfulstoic/bolt_sips | e86d6443f69d59f6cc41ecae5d0718ed05ea4904 | [
"Apache-2.0"
] | null | null | null | defmodule Bolt.Sips.Internals.PackStream.Message.EncoderV1 do
@moduledoc false
use Bolt.Sips.Internals.PackStream.Message.Signatures
alias Bolt.Sips.Internals.PackStream.Message.Encoder
@valid_signatures [
@ack_failure_signature,
@discard_all_signature,
@init_signature,
@pull_all_signature,
@reset_signature,
@run_signature
]
@valid_message_types [
:ack_failure,
:discard_all,
:init,
:pull_all,
:reset,
:run
]
@doc """
Return the valid signatures for bolt V1
"""
@spec valid_signatures() :: [integer()]
def valid_signatures() do
@valid_signatures
end
@spec signature(Bolt.Sips.Internals.PackStream.Message.out_signature()) :: integer()
defp signature(:ack_failure), do: @ack_failure_signature
defp signature(:discard_all), do: @discard_all_signature
defp signature(:init), do: @init_signature
defp signature(:pull_all), do: @pull_all_signature
defp signature(:reset), do: @reset_signature
defp signature(:run), do: @run_signature
@doc """
Encode INIT message without auth token
"""
@spec encode({Bolt.Sips.Internals.PackStream.Message.out_signature(), list()}, integer()) ::
Bolt.Sips.Internals.PackStream.Message.encoded() | {:error, :not_implemented}
def encode({:init, []}, bolt_version) do
encode({:init, [{}]}, bolt_version)
end
@doc """
Encode INIT message with a valid auth token.
The auth token is tuple formated as: {user, password}
"""
def encode({:init, [auth]}, bolt_version) do
do_encode(:init, [Encoder.client_name(), auth_params(auth)], bolt_version)
end
@doc """
Encode RUN message with its data: statement and parameters
"""
def encode({:run, [statement]}, bolt_version) do
do_encode(:run, [statement, %{}], bolt_version)
end
@doc """
Encode messages that don't need any data formating
"""
def encode({message_type, data}, bolt_version) when message_type in @valid_message_types do
do_encode(message_type, data, bolt_version)
end
@doc """
Encode messages
# Supported messages
## INIT
  Usage: initialize the session.
Signature: `0x01`
Struct: `client_name` `auth_token`
with:
| data | type |
|-----|-----|
|client_name | string|
|auth_token | map: {scheme: string, principal: string, credentials: string}|
Examples (excluded from doctest because client_name changes at each bolt_sips version)
# without auth token
diex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
EncoderV1.encode({:init, []}, 1)
<<0x0, 0x10, 0xB2, 0x1, 0x8C, 0x42, 0x6F, 0x6C, 0x74, 0x65, 0x78, 0x2F, 0x30, 0x2E, 0x34,
0x2E, 0x30, 0xA0, 0x0, 0x0>>
# with auth token
# The auth token is tuple formated as: {user, password}
diex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
diex> EncoderV1.encode({:init, [{"neo4j", "password"}]})
<<0x0, 0x42, 0xB2, 0x1, 0x8C, 0x42, 0x6F, 0x6C, 0x74, 0x65, 0x78, 0x2F, 0x30, 0x2E, 0x34,
0x2E, 0x30, 0xA3, 0x8B, 0x63, 0x72, 0x65, 0x64, 0x65, 0x6E, 0x74, 0x69, 0x61, 0x6C, 0x73,
0x88, 0x70, 0x61, 0x73, 0x73, 0x77, 0x6F, 0x72, 0x64, 0x89, 0x70, 0x72, 0x69, 0x6E, 0x63,
0x69, 0x70, 0x61, 0x6C, 0x85, ...>>
## RUN
Usage: pass statement for execution to the server.
Signature: `0x10`
Struct: `statement` `parameters`
with:
| data | type |
|-----|-----|
| statement | string |
| parameters | map |
Examples
# without parameters
iex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
iex> EncoderV1.encode({:run, ["RETURN 1 AS num"]}, 1)
<<0x0, 0x13, 0xB2, 0x10, 0x8F, 0x52, 0x45, 0x54, 0x55, 0x52, 0x4E, 0x20, 0x31, 0x20, 0x41,
0x53, 0x20, 0x6E, 0x75, 0x6D, 0xA0, 0x0, 0x0>>
# with parameters
iex> EncoderV1.encode({:run, ["RETURN {num} AS num", %{num: 1}]}, 1)
<<0x0, 0x1D, 0xB2, 0x10, 0xD0, 0x13, 0x52, 0x45, 0x54, 0x55, 0x52, 0x4E, 0x20, 0x7B, 0x6E,
0x75, 0x6D, 0x7D, 0x20, 0x41, 0x53, 0x20, 0x6E, 0x75, 0x6D, 0xA1, 0x83, 0x6E, 0x75, 0x6D,
0x1, 0x0, 0x0>>
## ACK_FAILURE
Usage: Acknowledge a failure the server has sent.
Signature: `0x0E`
Struct: no data
Example
iex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
iex> EncoderV1.encode({:ack_failure, []}, 1)
<<0x0, 0x2, 0xB0, 0xE, 0x0, 0x0>>
## DISCARD_ALL
  Usage: Discard all remaining items from the active result stream.
Signature: `0x2F`
Struct: no data
Example
iex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
iex> EncoderV1.encode({:discard_all, []}, 1)
<<0x0, 0x2, 0xB0, 0x2F, 0x0, 0x0>>
## PULL_ALL
Usage: Retrieve all remaining items from the active result stream.
Signature: `0x3F`
Struct: no data
Example
iex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
iex> EncoderV1.encode({:pull_all, []}, 1)
<<0x0, 0x2, 0xB0, 0x3F, 0x0, 0x0>>
## RESET
Usage: Return the current session to a "clean" state.
Signature: `0x0F`
Struct: no data
Example
iex> alias Bolt.Sips.Internals.PackStream.Message.EncoderV1
iex> EncoderV1.encode({:reset, []}, 1)
<<0x0, 0x2, 0xB0, 0xF, 0x0, 0x0>>
"""
def encode(_data, _bolt_version) do
{:error, :not_implemented}
end
defp do_encode(message_type, data, bolt_version) do
signature = signature(message_type)
Encoder.encode_message(message_type, signature, data, bolt_version)
end
# Format the auth params
@spec auth_params({} | {String.t(), String.t()}) :: map()
defp auth_params({}), do: %{}
defp auth_params({username, password}) do
%{
scheme: "basic",
principal: username,
credentials: password
}
end
end
| 27.647343 | 96 | 0.65892 |
03eacc0cc08ba5f1e23f791bb4085813f1527f48 | 2,424 | exs | Elixir | elixir/sublist/sublist_test.exs | jkrukoff/Exercism | a58535afaef312b6ad45730eaa346f2c9f4c3056 | [
"MIT"
] | null | null | null | elixir/sublist/sublist_test.exs | jkrukoff/Exercism | a58535afaef312b6ad45730eaa346f2c9f4c3056 | [
"MIT"
] | null | null | null | elixir/sublist/sublist_test.exs | jkrukoff/Exercism | a58535afaef312b6ad45730eaa346f2c9f4c3056 | [
"MIT"
] | null | null | null | if !System.get_env("EXERCISM_TEST_EXAMPLES") do
Code.load_file("sublist.exs", __DIR__)
end
ExUnit.start()
ExUnit.configure(exclude: :pending, trace: true)
defmodule SublistTest do
use ExUnit.Case
test "empty equals empty" do
assert Sublist.compare([], []) == :equal
end
test "empty is a sublist of anything" do
assert Sublist.compare([], [nil]) == :sublist
end
test "anything is a superlist of empty" do
assert Sublist.compare([nil], []) == :superlist
end
test "1 is not 2" do
assert Sublist.compare([1], [2]) == :unequal
end
test "comparing massive equal lists" do
l = Enum.to_list(1..1_000_000)
assert Sublist.compare(l, l) == :equal
end
test "sublist at start" do
assert Sublist.compare([1, 2, 3], [1, 2, 3, 4, 5]) == :sublist
end
test "sublist in middle" do
assert Sublist.compare([4, 3, 2], [5, 4, 3, 2, 1]) == :sublist
end
test "sublist at end" do
assert Sublist.compare([3, 4, 5], [1, 2, 3, 4, 5]) == :sublist
end
test "partially matching sublist at start" do
assert Sublist.compare([1, 1, 2], [1, 1, 1, 2]) == :sublist
end
test "sublist early in huge list" do
assert Sublist.compare([3, 4, 5], Enum.to_list(1..1_000_000)) == :sublist
end
test "huge sublist not in huge list" do
assert Sublist.compare(Enum.to_list(10..1_000_001), Enum.to_list(1..1_000_000)) == :unequal
end
test "superlist at start" do
assert Sublist.compare([1, 2, 3, 4, 5], [1, 2, 3]) == :superlist
end
test "superlist in middle" do
assert Sublist.compare([5, 4, 3, 2, 1], [4, 3, 2]) == :superlist
end
test "superlist at end" do
assert Sublist.compare([1, 2, 3, 4, 5], [3, 4, 5]) == :superlist
end
test "1 and 2 does not contain 3" do
assert Sublist.compare([1, 2], [3]) == :unequal
end
test "partially matching superlist at start" do
assert Sublist.compare([1, 1, 1, 2], [1, 1, 2]) == :superlist
end
test "superlist early in huge list" do
assert Sublist.compare(Enum.to_list(1..1_000_000), [3, 4, 5]) == :superlist
end
test "strict equality needed" do
assert Sublist.compare([1], [1.0, 2]) == :unequal
end
test "recurring values sublist" do
assert Sublist.compare([1, 2, 1, 2, 3], [1, 2, 3, 1, 2, 1, 2, 3, 2, 1]) == :sublist
end
test "recurring values unequal" do
assert Sublist.compare([1, 2, 1, 2, 3], [1, 2, 3, 1, 2, 3, 2, 3, 2, 1]) == :unequal
end
end
| 26.347826 | 95 | 0.625413 |
03ead7bcb9efecc92df5d720ea94d5085977e79b | 718 | ex | Elixir | lib/myelin/cmd/agent/agent_spec.ex | neocortexlab/myelin | 0f352c90b41de61133402fe32474a880b544d199 | [
"Apache-2.0"
] | null | null | null | lib/myelin/cmd/agent/agent_spec.ex | neocortexlab/myelin | 0f352c90b41de61133402fe32474a880b544d199 | [
"Apache-2.0"
] | null | null | null | lib/myelin/cmd/agent/agent_spec.ex | neocortexlab/myelin | 0f352c90b41de61133402fe32474a880b544d199 | [
"Apache-2.0"
] | null | null | null | defmodule Myelin.Cmd.Agent.CmdSpec do
@moduledoc """
Command Spec for 'agent'
"""
def spec do
{
:agent, [
name: "agent",
about: "Operations with agent",
flags: [
new: [
short: "-n",
long: "--new",
help: "Create new agent",
multiple: false,
],
info: [
short: "-i",
long: "--info",
help: "Get info about agent",
multiple: false,
]
],
args: [
name: [
value_name: "NAME",
help: "Agent name",
required: true,
parser: :string
]
]
]
}
end
end
| 19.944444 | 41 | 0.384401 |
03eb1e5612b480806776ebf96eee63ac1dc5b1fd | 4,138 | ex | Elixir | lib/slackex/channels.ex | soriyath/slackex | ee972bb4ef61d6f0876f7da7cb9b944491681d26 | [
"MIT"
] | 30 | 2016-01-12T18:05:37.000Z | 2020-06-19T19:54:11.000Z | lib/slackex/channels.ex | soriyath/slackex | ee972bb4ef61d6f0876f7da7cb9b944491681d26 | [
"MIT"
] | 7 | 2016-01-12T18:23:35.000Z | 2020-11-30T13:26:47.000Z | lib/slackex/channels.ex | soriyath/slackex | ee972bb4ef61d6f0876f7da7cb9b944491681d26 | [
"MIT"
] | 5 | 2016-11-01T17:17:38.000Z | 2019-10-29T16:08:45.000Z | defmodule Slackex.Channels do
@moduledoc """
Get info on your team's Slack channels,
create or archive channels, invite users,
set the topic and purpose, and mark a
channel as read.
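
A couple of hypothetical calls (they assume the application is configured
with a valid Slack API token; the channel name is illustrative):

```elixir
# List all channels visible to the token's team.
Slackex.Channels.list()

# Create a channel, then archive it by its ID.
Slackex.Channels.create("elixir-lovers")
```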
"""
@doc """
This method archives a channel.
"""
def archive(channel, options \\ %{}) do
params = options |> Map.merge(%{channel: channel})
Slackex.request("channels.archive", params)
end
@doc """
This method is used to create a channel.
"""
def create(name, options \\ %{}) do
params = options |> Map.merge(%{name: name})
Slackex.request("channels.create", params)
end
@doc """
This method returns a portion of messages/events
from the specified channel. To read the entire
history for a channel, call the method with no
`latest` or `oldest` arguments, and then continue
paging through the results.
"""
def history(channel, options \\ %{}) do
params = options |> Map.merge(%{channel: channel})
Slackex.request("channels.history", params)
end
@doc """
"""
def info(channel, options \\ %{}) do
params = options |> Map.merge(%{channel: channel})
Slackex.request("channels.info", params)
end
@doc """
This method is used to invite a user to a channel.
The calling user must be a member of the channel.
"""
def invite(channel, user, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, user: user})
Slackex.request("channels.invite", params)
end
@doc """
This method is used to join a channel. If the
channel does not exist, it is created.
"""
def join(name, options \\ %{}) do
params = options |> Map.merge(%{name: name})
Slackex.request("channels.join", params)
end
@doc """
This method allows a user to remove another member
from a team channel.
"""
def kick(channel, user, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, user: user})
Slackex.request("channels.kick", params)
end
@doc """
This method is used to leave a channel.
"""
def leave(channel, options \\ %{}) do
params = options |> Map.merge(%{channel: channel})
Slackex.request("channels.leave", params)
end
@doc """
This method returns a list of all channels
in the team. This includes channels the
caller is in, channels they are not currently
in, and archived channels but does not include
private channels. The number of (non-deactivated)
members in each channel is also returned.
To retrieve a list of private channels, use groups.list
"""
def list(options \\ %{}) do
Slackex.request("channels.list", options)
end
@doc """
This method moves the read cursor in a channel.
"""
def mark(channel, timestamp, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, ts: timestamp})
Slackex.request("channels.mark", params)
end
@doc """
This method renames a team channel.
The only people who can rename a channel are team
admins, or the person that originally created the
channel. Others will receive a "not_authorized"
error.
"""
def rename(channel, new_name, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, name: new_name})
Slackex.request("channels.rename", params)
end
@doc """
This method is used to change the purpose of a
channel. The calling user must be a member of
the channel.
"""
def set_purpose(channel, purpose, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, purpose: purpose})
Slackex.request("channels.setPurpose", params)
end
@doc """
This method is used to change the topic of a
channel. The calling user must be a member of
the channel.
"""
def set_topic(channel, topic, options \\ %{}) do
params = options |> Map.merge(%{channel: channel, topic: topic})
Slackex.request("channels.setTopic", params)
end
@doc """
This method unarchives a channel. The calling
user is added to the channel.
"""
def unarchive(channel, options \\ %{}) do
params = options |> Map.merge(%{channel: channel})
Slackex.request("channels.unarchive", params)
end
end
| 29.140845 | 72 | 0.662397 |
03eb23fbbcc0b8ce4a149d836159ee439e910994 | 2,672 | ex | Elixir | lib/whistle/html.ex | boudra/whistle | f39f4e41bf2a69d692ffd3259f69088202cb4a23 | [
"MIT"
] | 59 | 2018-12-18T15:24:23.000Z | 2020-11-19T18:40:25.000Z | lib/whistle/html.ex | boudra/whistle | f39f4e41bf2a69d692ffd3259f69088202cb4a23 | [
"MIT"
] | 8 | 2019-02-05T22:36:17.000Z | 2019-03-20T21:13:42.000Z | lib/whistle/html.ex | boudra/whistle | f39f4e41bf2a69d692ffd3259f69088202cb4a23 | [
"MIT"
] | 3 | 2019-01-30T19:13:53.000Z | 2020-11-19T18:40:28.000Z | defmodule Whistle.Html do
@tags [
:div,
:meta,
:img,
:a,
:i,
:form,
:table,
:tr,
:td,
:tbody,
:thead,
:select,
:option,
:section,
:header,
:footer,
:nav,
:ul,
:ol,
:li,
:input,
:button,
:br,
:p,
:b,
:strong,
:center,
:span,
:html,
:body,
:head,
:script,
:link,
:h1,
:h2,
:h3,
:h4
]
for tag <- @tags do
tag_name = Atom.to_string(tag)
@doc """
Helper to create a `<#{tag_name}>` node.
```
iex> Html.#{tag_name}([], "text")
{"#{tag_name}", {[], [{0, "text"}]}}
```
"""
defmacro unquote(tag)(attributes \\ [], children \\ []) do
build_quoted_node(unquote(tag_name), attributes, children)
end
end
defmacro node(tag, attributes, children) do
build_quoted_node(tag, attributes, children)
end
@doc """
Create a link that will be handled by Whistle.
```
iex> Html.ahref("/chat/general", [], "go to general chat")
Html.a([href: "/chat/general", "data-whistle-href": true], "go to general chat")
```
"""
defmacro ahref(route, attributes, children) do
attributes =
quote do
unquote(attributes) ++ [href: unquote(route), "data-whistle-href": true]
end
build_quoted_node("a", attributes, children)
end
defmacro title(children) do
build_quoted_node("title", [], children)
end
def text(content) do
to_string(content)
end
def lazy(fun, args) do
{:lazy, {fun, args}}
end
def program(name, params) do
{:program, {name, params}}
end
@doc false
def build_children(children) when is_list(children) do
children
|> List.flatten()
|> Enum.with_index()
|> Enum.map(fn
{child = {index, _node}, _} when is_integer(index) ->
child
{node, index} ->
{index, node}
end)
end
def build_children(children) do
children
end
@doc false
def build_quoted_node(tag, attributes, node) when not is_list(node) do
build_quoted_node(tag, attributes, [node])
end
def build_quoted_node(tag, attributes, children) do
new_children =
if Macro.quoted_literal?(children) do
build_children(children)
else
quote do
unquote(children)
|> List.flatten()
|> Whistle.Html.build_children()
end
end
{tag, {attributes, new_children}}
end
@doc false
def build_node(tag, attributes, node) when not is_list(node) do
build_node(tag, attributes, [node])
end
def build_node(tag, attributes, children) do
{tag, {attributes, build_children(children)}}
end
end
| 18.555556 | 82 | 0.57747 |
03eb2e75f2ddc6c2729a449b827f9145556c2aaf | 218 | exs | Elixir | apps/ewallet_config/config/dev.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | apps/ewallet_config/config/dev.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | apps/ewallet_config/config/dev.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | use Mix.Config
config :ewallet_config, EWalletConfig.Repo,
adapter: Ecto.Adapters.Postgres,
url: {:system, "DATABASE_URL", "postgres://localhost/ewallet_dev"},
migration_timestamps: [type: :naive_datetime_usec]
| 31.142857 | 69 | 0.770642 |
03eb2f2f9823612de51b1ee6659b7c72d4fffcb4 | 1,686 | ex | Elixir | clients/docs/lib/google_api/docs/v1/model/table_column_properties.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/docs/lib/google_api/docs/v1/model/table_column_properties.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/docs/lib/google_api/docs/v1/model/table_column_properties.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Docs.V1.Model.TableColumnProperties do
@moduledoc """
The properties of a column in a table.
## Attributes
* `width` (*type:* `GoogleApi.Docs.V1.Model.Dimension.t`, *default:* `nil`) - The width of the column. Set when the column's `width_type` is
FIXED_WIDTH.
* `widthType` (*type:* `String.t`, *default:* `nil`) - The width type of the column.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:width => GoogleApi.Docs.V1.Model.Dimension.t(),
:widthType => String.t()
}
field(:width, as: GoogleApi.Docs.V1.Model.Dimension)
field(:widthType)
end
defimpl Poison.Decoder, for: GoogleApi.Docs.V1.Model.TableColumnProperties do
def decode(value, options) do
GoogleApi.Docs.V1.Model.TableColumnProperties.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Docs.V1.Model.TableColumnProperties do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 33.058824 | 144 | 0.723013 |
03eb36d3843ff3747c1d18a2acae4afd064cf9e0 | 662 | exs | Elixir | mix.exs | aforward-oss/phoenix_haml | 8bb904e694d58742f1c25162fc44eba1cb809d27 | [
"MIT"
] | null | null | null | mix.exs | aforward-oss/phoenix_haml | 8bb904e694d58742f1c25162fc44eba1cb809d27 | [
"MIT"
] | null | null | null | mix.exs | aforward-oss/phoenix_haml | 8bb904e694d58742f1c25162fc44eba1cb809d27 | [
"MIT"
] | null | null | null | defmodule PhoenixHaml.Mixfile do
use Mix.Project
def project do
[
app: :phoenix_haml,
version: "0.1.0-dev",
elixir: "~> 1.0.1 or ~> 1.1",
deps: deps,
package: [
contributors: ["Chris McCord"],
licenses: ["MIT"],
links: [github: "https://github.com/chrismccord/phoenix_haml"]
],
description: """
Phoenix Template Engine for Haml
"""
]
end
def application do
[applications: [:phoenix]]
end
defp deps do
[
{:phoenix, github: "phoenixframework/phoenix"},
{:cowboy, "~> 1.0.0", only: [:dev, :test]},
{:calliope, "~> 0.2.7"}
]
end
end
| 20.060606 | 70 | 0.536254 |
03eb3ac991c83d29def8272bb7e5396f9d098833 | 204 | exs | Elixir | test/test_helper.exs | jwarwick/lum_patterns_web | 66ecaa56d1311a21ed9d55c92267e4834be33882 | [
"MIT"
] | null | null | null | test/test_helper.exs | jwarwick/lum_patterns_web | 66ecaa56d1311a21ed9d55c92267e4834be33882 | [
"MIT"
] | null | null | null | test/test_helper.exs | jwarwick/lum_patterns_web | 66ecaa56d1311a21ed9d55c92267e4834be33882 | [
"MIT"
] | null | null | null | ExUnit.start
Mix.Task.run "ecto.create", ~w(-r LumPatternsWeb.Repo --quiet)
Mix.Task.run "ecto.migrate", ~w(-r LumPatternsWeb.Repo --quiet)
Ecto.Adapters.SQL.begin_test_transaction(LumPatternsWeb.Repo)
| 29.142857 | 63 | 0.769608 |
03eb4d3d4dc31e8fdeb284a60372e1790ff9681a | 1,907 | ex | Elixir | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/public_key_certificate.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/public_key_certificate.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/public_key_certificate.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudIot.V1.Model.PublicKeyCertificate do
@moduledoc """
A public key certificate format and data.
## Attributes
* `certificate` (*type:* `String.t`, *default:* `nil`) - The certificate data.
* `format` (*type:* `String.t`, *default:* `nil`) - The certificate format.
* `x509Details` (*type:* `GoogleApi.CloudIot.V1.Model.X509CertificateDetails.t`, *default:* `nil`) - [Output only] The certificate details. Used only for X.509 certificates.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:certificate => String.t() | nil,
:format => String.t() | nil,
:x509Details => GoogleApi.CloudIot.V1.Model.X509CertificateDetails.t() | nil
}
field(:certificate)
field(:format)
field(:x509Details, as: GoogleApi.CloudIot.V1.Model.X509CertificateDetails)
end
defimpl Poison.Decoder, for: GoogleApi.CloudIot.V1.Model.PublicKeyCertificate do
def decode(value, options) do
GoogleApi.CloudIot.V1.Model.PublicKeyCertificate.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudIot.V1.Model.PublicKeyCertificate do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 35.981132 | 177 | 0.725747 |
03eb5001cb56974f0cddcf8c9c3809f2cd8e28bc | 76 | exs | Elixir | demo/test/demo_web/views/layout_view_test.exs | ZmagoD/phoenix_bootstrap_form | 4ef6b54afd586d03f9eb21f1f5ef73c56e668e95 | [
"MIT"
] | 40 | 2017-09-07T03:25:23.000Z | 2022-02-27T04:47:14.000Z | demo/test/demo_web/views/layout_view_test.exs | ZmagoD/phoenix_bootstrap_form | 4ef6b54afd586d03f9eb21f1f5ef73c56e668e95 | [
"MIT"
] | 7 | 2017-09-06T23:56:36.000Z | 2019-09-06T09:49:16.000Z | demo/test/demo_web/views/layout_view_test.exs | ZmagoD/phoenix_bootstrap_form | 4ef6b54afd586d03f9eb21f1f5ef73c56e668e95 | [
"MIT"
] | 15 | 2017-09-29T09:16:11.000Z | 2022-01-19T22:48:14.000Z | defmodule DemoWeb.LayoutViewTest do
use DemoWeb.ConnCase, async: true
end
| 19 | 35 | 0.815789 |
03eb53e7e231e123da8c35c03682a2e4c834285a | 1,667 | ex | Elixir | lib/auto_api/capabilities/ignition_capability.ex | nonninz/auto-api-elixir | 53e11542043285e94bbb5a0a3b8ffff0b1b47167 | [
"MIT"
] | null | null | null | lib/auto_api/capabilities/ignition_capability.ex | nonninz/auto-api-elixir | 53e11542043285e94bbb5a0a3b8ffff0b1b47167 | [
"MIT"
] | null | null | null | lib/auto_api/capabilities/ignition_capability.ex | nonninz/auto-api-elixir | 53e11542043285e94bbb5a0a3b8ffff0b1b47167 | [
"MIT"
] | null | null | null | # AutoAPI
# The MIT License
#
# Copyright (c) 2018- High-Mobility GmbH (https://high-mobility.com)
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
defmodule AutoApi.IgnitionCapability do
@moduledoc """
Basic settings for Ignition Capability
iex> alias AutoApi.IgnitionCapability, as: E
iex> E.identifier
<<0x00, 0x35>>
iex> E.name
:ignition
iex> E.description
"Ignition"
iex> length(E.properties)
8
iex> List.first(E.properties)
{1, :status}
"""
@command_module AutoApi.IgnitionCommand
@state_module AutoApi.IgnitionState
use AutoApi.Capability, spec_file: "ignition.json"
end
| 37.044444 | 79 | 0.740852 |
03eb5b6796d1cd58741273985bffd4f8ded11e9e | 2,311 | ex | Elixir | lib/auto_api/states/text_input_state.ex | highmobility/hm-auto-api-elixir | 026c3f50871c56877a4acd5f39a8887118a87bb5 | [
"MIT"
] | 4 | 2018-01-19T16:11:10.000Z | 2019-12-13T16:35:10.000Z | lib/auto_api/states/text_input_state.ex | highmobility/auto-api-elixir | 026c3f50871c56877a4acd5f39a8887118a87bb5 | [
"MIT"
] | 5 | 2020-07-16T07:20:21.000Z | 2021-09-22T10:18:04.000Z | lib/auto_api/states/text_input_state.ex | highmobility/hm-auto-api-elixir | 026c3f50871c56877a4acd5f39a8887118a87bb5 | [
"MIT"
] | 1 | 2021-02-17T18:36:13.000Z | 2021-02-17T18:36:13.000Z | # AutoAPI
# The MIT License
#
# Copyright (c) 2018- High-Mobility GmbH (https://high-mobility.com)
#
# Permission is hereby granted, free of charge, to any person obtaining a copy
# of this software and associated documentation files (the "Software"), to deal
# in the Software without restriction, including without limitation the rights
# to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
# copies of the Software, and to permit persons to whom the Software is
# furnished to do so, subject to the following conditions:
#
# The above copyright notice and this permission notice shall be included in
# all copies or substantial portions of the Software.
#
# THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
# IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
# FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
# AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
# LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
# OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
# THE SOFTWARE.
defmodule AutoApi.TextInputState do
@moduledoc """
TextInput state
"""
alias AutoApi.State
use AutoApi.State, spec_file: "text_input.json"
@type t :: %__MODULE__{
text: State.property(String.t())
}
@doc """
Build state based on binary value
iex> text = "Rendezvous with Rama"
iex> size = byte_size(text)
iex> AutoApi.TextInputState.from_bin(<<1, size + 3::integer-16, 1, size::integer-16, text::binary>>)
%AutoApi.TextInputState{text: %AutoApi.Property{data: "Rendezvous with Rama"}}
"""
@spec from_bin(binary) :: __MODULE__.t()
def from_bin(bin) do
parse_bin_properties(bin, %__MODULE__{})
end
@doc """
Parse state to bin
iex> text = "Rendezvous with Rama"
iex> state = %AutoApi.TextInputState{text: %AutoApi.Property{data: text}}
iex> AutoApi.TextInputState.to_bin(state)
<<1, 23::integer-16, 1, 20::integer-16, 0x52, 0x65, 0x6E, 0x64, 0x65, 0x7A, 0x76, 0x6F, 0x75, 0x73, 0x20, 0x77, 0x69, 0x74, 0x68, 0x20, 0x52, 0x61, 0x6D, 0x61>>
"""
@spec to_bin(__MODULE__.t()) :: binary
def to_bin(%__MODULE__{} = state) do
parse_state_properties(state)
end
end
| 37.274194 | 165 | 0.717871 |
03eb6a61420d65e8bcaeaf51ad9213e4bff49f52 | 826 | ex | Elixir | test/support/conn_case.ex | Science-Adventurers/game-backend | 9c99609b5c9c8e350af04fa1e838c3fa31283e42 | [
"MIT"
] | 5 | 2017-02-21T17:19:51.000Z | 2017-02-24T14:46:00.000Z | test/support/conn_case.ex | Science-Adventurers/game-backend | 9c99609b5c9c8e350af04fa1e838c3fa31283e42 | [
"MIT"
] | null | null | null | test/support/conn_case.ex | Science-Adventurers/game-backend | 9c99609b5c9c8e350af04fa1e838c3fa31283e42 | [
"MIT"
] | null | null | null | defmodule Game.ConnCase do
@moduledoc """
This module defines the test case to be used by
tests that require setting up a connection.
Such tests rely on `Phoenix.ConnTest` and also
import other functionality to make it easier
to build and query models.
Finally, if the test case interacts with the database,
it cannot be async. For this reason, every test runs
inside a transaction which is reset at the beginning
of the test unless the test case is marked as async.
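
A hypothetical test module built on this case (the controller name, route,
and response are illustrative, not part of this application):

```elixir
defmodule Game.PageControllerTest do
  use Game.ConnCase

  # The setup block below puts `conn` into the test context.
  test "GET /", %{conn: conn} do
    conn = get conn, "/"
    assert html_response(conn, 200)
  end
end
```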
"""
use ExUnit.CaseTemplate
using do
quote do
# Import conveniences for testing with connections
use Phoenix.ConnTest
import Game.Router.Helpers
# The default endpoint for testing
@endpoint Game.Endpoint
end
end
setup _tags do
{:ok, conn: Phoenix.ConnTest.build_conn()}
end
end
| 23.6 | 56 | 0.717918 |
03eb710be1a8649315165f09f74e54ccf6cfa8a6 | 4,285 | exs | Elixir | test/phoenix/logger_test.exs | achalagarwal/phoenix | 6534f05beda6696c50ca007d02c922fa168083d7 | [
"MIT"
] | 1 | 2022-02-05T13:37:59.000Z | 2022-02-05T13:37:59.000Z | test/phoenix/logger_test.exs | achalagarwal/phoenix | 6534f05beda6696c50ca007d02c922fa168083d7 | [
"MIT"
] | null | null | null | test/phoenix/logger_test.exs | achalagarwal/phoenix | 6534f05beda6696c50ca007d02c922fa168083d7 | [
"MIT"
] | null | null | null | defmodule Phoenix.LoggerTest do
use ExUnit.Case, async: true
use RouterHelper
describe "filter_values/2 with discard strategy" do
test "in top level map" do
values = %{"foo" => "bar", "password" => "should_not_show"}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{"foo" => "bar", "password" => "[FILTERED]"}
end
test "when a map has secret key" do
values = %{"foo" => "bar", "map" => %{"password" => "should_not_show"}}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{"foo" => "bar", "map" => %{"password" => "[FILTERED]"}}
end
test "when a list has a map with secret" do
values = %{"foo" => "bar", "list" => [%{"password" => "should_not_show"}]}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{"foo" => "bar", "list" => [%{"password" => "[FILTERED]"}]}
end
test "does not filter structs" do
values = %{"foo" => "bar", "file" => %Plug.Upload{}}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{"foo" => "bar", "file" => %Plug.Upload{}}
values = %{"foo" => "bar", "file" => %{__struct__: "s"}}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{"foo" => "bar", "file" => %{:__struct__ => "s"}}
end
test "does not fail on atomic keys" do
values = %{:foo => "bar", "password" => "should_not_show"}
assert Phoenix.Logger.filter_values(values, ["password"]) ==
%{:foo => "bar", "password" => "[FILTERED]"}
end
end
describe "filter_values/2 with keep strategy" do
test "discards values not specified in params" do
values = %{"foo" => "bar", "password" => "abc123", "file" => %Plug.Upload{}}
assert Phoenix.Logger.filter_values(values, {:keep, []}) ==
%{"foo" => "[FILTERED]", "password" => "[FILTERED]", "file" => "[FILTERED]"}
end
test "keeps values that are specified in params" do
values = %{"foo" => "bar", "password" => "abc123", "file" => %Plug.Upload{}}
assert Phoenix.Logger.filter_values(values, {:keep, ["foo", "file"]}) ==
%{"foo" => "bar", "password" => "[FILTERED]", "file" => %Plug.Upload{}}
end
test "keeps all values under keys that are kept" do
values = %{"foo" => %{"bar" => 1, "baz" => 2}}
assert Phoenix.Logger.filter_values(values, {:keep, ["foo"]}) ==
%{"foo" => %{"bar" => 1, "baz" => 2}}
end
test "only filters leaf values" do
values = %{"foo" => %{"bar" => 1, "baz" => 2}, "ids" => [1, 2]}
assert Phoenix.Logger.filter_values(values, {:keep, []}) ==
%{"foo" => %{"bar" => "[FILTERED]", "baz" => "[FILTERED]"},
"ids" => ["[FILTERED]", "[FILTERED]"]}
end
end
describe "telemetry" do
def log_level(conn) do
case conn.path_info do
[] -> :debug
["warn" | _] -> :warn
["error" | _] -> :error
["false" | _] -> false
_ -> :info
end
end
test "invokes log level callback from Plug.Telemetry" do
opts =
Plug.Telemetry.init(
event_prefix: [:phoenix, :endpoint],
log: {__MODULE__, :log_level, []}
)
assert ExUnit.CaptureLog.capture_log(fn ->
Plug.Telemetry.call(conn(:get, "/"), opts)
end) =~ "[debug] GET /"
assert ExUnit.CaptureLog.capture_log(fn ->
Plug.Telemetry.call(conn(:get, "/warn"), opts)
end) =~ ~r"\[warn(ing)?\] ?GET /warn"
assert ExUnit.CaptureLog.capture_log(fn ->
Plug.Telemetry.call(conn(:get, "/error/404"), opts)
end) =~ "[error] GET /error/404"
assert ExUnit.CaptureLog.capture_log(fn ->
Plug.Telemetry.call(conn(:get, "/any"), opts)
end) =~ "[info] GET /any"
end
test "invokes log level from Plug.Telemetry" do
assert ExUnit.CaptureLog.capture_log(fn ->
opts = Plug.Telemetry.init(event_prefix: [:phoenix, :endpoint])
Plug.Telemetry.call(conn(:get, "/"), opts)
end) =~ "[info] GET /"
assert ExUnit.CaptureLog.capture_log(fn ->
opts = Plug.Telemetry.init(event_prefix: [:phoenix, :endpoint], log: false)
Plug.Telemetry.call(conn(:get, "/"), opts)
end) == ""
end
end
end
| 36.939655 | 88 | 0.546091 |
03ebc6bbb5b93e3425c0ef9725ff0b44e2e6ff6c | 22,869 | ex | Elixir | lib/elixir/lib/code/fragment.ex | felipelincoln/elixir | 6724c1d1819f2926dac561980b4beab281bbd3c2 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/code/fragment.ex | felipelincoln/elixir | 6724c1d1819f2926dac561980b4beab281bbd3c2 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/code/fragment.ex | felipelincoln/elixir | 6724c1d1819f2926dac561980b4beab281bbd3c2 | [
"Apache-2.0"
] | null | null | null | defmodule Code.Fragment do
@moduledoc """
This module provides conveniences for analyzing fragments of
textual code and extract available information whenever possible.
Most of the functions in this module provide a best-effort
and may not be accurate under all circumstances. Read each
documentation for more information.
This module should be considered experimental.
"""
@type position :: {line :: pos_integer(), column :: pos_integer()}
@doc """
Receives a string and returns the cursor context.
This function receives a string with an Elixir code fragment,
representing a cursor position, and based on the string, it
provides contextual information about said position. The
return of this function can then be used to provide tips,
suggestions, and autocompletion functionality.
This function provides a best-effort detection and may not be
accurate under all circumstances. See the "Limitations"
section below.
Consider adding a catch-all clause when handling the return
type of this function as new cursor information may be added
in future releases.
## Examples
iex> Code.Fragment.cursor_context("")
:expr
iex> Code.Fragment.cursor_context("hello_wor")
{:local_or_var, 'hello_wor'}
## Return values
* `{:alias, charlist}` - the context is an alias, potentially
a nested one, such as `Hello.Wor` or `HelloWor`
* `{:dot, inside_dot, charlist}` - the context is a dot
where `inside_dot` is either a `{:var, charlist}`, `{:alias, charlist}`,
`{:module_attribute, charlist}`, `{:unquoted_atom, charlist}` or a `dot`
itself. If a var is given, this may either be a remote call or a map
field access. Examples are `Hello.wor`, `:hello.wor`, `hello.wor`,
`Hello.nested.wor`, `hello.nested.wor`, and `@hello.world`
* `{:dot_arity, inside_dot, charlist}` - the context is a dot arity
where `inside_dot` is either a `{:var, charlist}`, `{:alias, charlist}`,
`{:module_attribute, charlist}`, `{:unquoted_atom, charlist}` or a `dot`
itself. If a var is given, it must be a remote arity. Examples are
`Hello.world/`, `:hello.world/`, `hello.world/2`, and `@hello.world/2`
* `{:dot_call, inside_dot, charlist}` - the context is a dot
call. This means parentheses or space have been added after the expression.
where `inside_dot` is either a `{:var, charlist}`, `{:alias, charlist}`,
`{:module_attribute, charlist}`, `{:unquoted_atom, charlist}` or a `dot`
itself. If a var is given, it must be a remote call. Examples are
`Hello.world(`, `:hello.world(`, `Hello.world `, `hello.world(`, `hello.world `,
and `@hello.world(`
* `:expr` - may be any expression. Autocompletion may suggest an alias,
local or var
* `{:local_or_var, charlist}` - the context is a variable or a local
(import or local) call, such as `hello_wor`
* `{:local_arity, charlist}` - the context is a local (import or local)
arity, such as `hello_world/`
* `{:local_call, charlist}` - the context is a local (import or local)
call, such as `hello_world(` and `hello_world `
* `{:module_attribute, charlist}` - the context is a module attribute, such
as `@hello_wor`
* `{:operator, charlist}` (since v1.13.0) - the context is an operator,
such as `+` or `==`. Note that textual operators, such as `when`, do not
appear as operators but rather as `:local_or_var`. `@` is never an
`:operator` and always a `:module_attribute`
* `{:operator_arity, charlist}` (since v1.13.0) - the context is an
operator arity, which is an operator followed by /, such as `+/`,
`not/` or `when/`
* `{:operator_call, charlist}` (since v1.13.0) - the context is an
operator call, which is an operator followed by space, such as
`left + `, `not ` or `x when `
* `:none` - no context possible
* `{:unquoted_atom, charlist}` - the context is an unquoted atom. This
can be any atom or an atom representing a module
## Limitations
* The current algorithm only considers the last line of the input.
This means it will also show suggestions inside strings, heredocs,
etc, which is intentional as it helps with doctests, references,
and more
* Context does not yet track `alias A.{B`, structs, or sigils
"""
@doc since: "1.13.0"
@spec cursor_context(List.Chars.t(), keyword()) ::
{:alias, charlist}
| {:dot, inside_dot, charlist}
| {:dot_arity, inside_dot, charlist}
| {:dot_call, inside_dot, charlist}
| :expr
| {:local_or_var, charlist}
| {:local_arity, charlist}
| {:local_call, charlist}
| {:module_attribute, charlist}
| {:operator, charlist}
| {:operator_arity, charlist}
| {:operator_call, charlist}
| :none
| {:unquoted_atom, charlist}
when inside_dot:
{:alias, charlist}
| {:dot, inside_dot, charlist}
| {:module_attribute, charlist}
| {:unquoted_atom, charlist}
| {:var, charlist}
def cursor_context(fragment, opts \\ [])
def cursor_context(binary, opts) when is_binary(binary) and is_list(opts) do
binary =
case :binary.matches(binary, "\n") do
[] ->
binary
matches ->
{position, _} = List.last(matches)
binary_part(binary, position + 1, byte_size(binary) - position - 1)
end
binary
|> String.to_charlist()
|> :lists.reverse()
|> codepoint_cursor_context(opts)
|> elem(0)
end
def cursor_context(charlist, opts) when is_list(charlist) and is_list(opts) do
charlist =
case charlist |> Enum.chunk_by(&(&1 == ?\n)) |> List.last([]) do
[?\n | _] -> []
rest -> rest
end
charlist
|> :lists.reverse()
|> codepoint_cursor_context(opts)
|> elem(0)
end
def cursor_context(other, opts) when is_list(opts) do
cursor_context(to_charlist(other), opts)
end
@operators '\\<>+-*/:=|&~^%!'
@starter_punctuation ',([{;'
@non_starter_punctuation ')]}"\'.$'
@space '\t\s'
@trailing_identifier '?!'
@non_identifier @trailing_identifier ++
@operators ++ @starter_punctuation ++ @non_starter_punctuation ++ @space
@textual_operators ~w(when not and or in)c
defp codepoint_cursor_context(reverse, _opts) do
{stripped, spaces} = strip_spaces(reverse, 0)
case stripped do
# It is empty
[] -> {:expr, 0}
# Token/AST only operators
[?>, ?= | rest] when rest == [] or hd(rest) != ?: -> {:expr, 0}
[?>, ?- | rest] when rest == [] or hd(rest) != ?: -> {:expr, 0}
# Two-digit containers
[?<, ?< | rest] when rest == [] or hd(rest) != ?< -> {:expr, 0}
# Ambiguity around :
[?: | rest] when rest == [] or hd(rest) != ?: -> unquoted_atom_or_expr(spaces)
# Dots
[?.] -> {:none, 0}
[?. | rest] when hd(rest) not in '.:' -> dot(rest, spaces + 1, '')
# It is a local or remote call with parens
[?( | rest] -> call_to_cursor_context(strip_spaces(rest, spaces + 1))
# A local arity definition
[?/ | rest] -> arity_to_cursor_context(strip_spaces(rest, spaces + 1))
# Starting a new expression
[h | _] when h in @starter_punctuation -> {:expr, 0}
# It is a local or remote call without parens
rest when spaces > 0 -> call_to_cursor_context({rest, spaces})
# It is an identifier
_ -> identifier_to_cursor_context(reverse, 0, false)
end
end
defp strip_spaces([h | rest], count) when h in @space, do: strip_spaces(rest, count + 1)
defp strip_spaces(rest, count), do: {rest, count}
defp unquoted_atom_or_expr(0), do: {{:unquoted_atom, ''}, 1}
defp unquoted_atom_or_expr(_), do: {:expr, 0}
defp arity_to_cursor_context({reverse, spaces}) do
case identifier_to_cursor_context(reverse, spaces, true) do
{{:local_or_var, acc}, count} -> {{:local_arity, acc}, count}
{{:dot, base, acc}, count} -> {{:dot_arity, base, acc}, count}
{{:operator, acc}, count} -> {{:operator_arity, acc}, count}
{_, _} -> {:none, 0}
end
end
defp call_to_cursor_context({reverse, spaces}) do
case identifier_to_cursor_context(reverse, spaces, true) do
{{:local_or_var, acc}, count} -> {{:local_call, acc}, count}
{{:dot, base, acc}, count} -> {{:dot_call, base, acc}, count}
{{:operator, acc}, count} -> {{:operator_call, acc}, count}
{_, _} -> {:none, 0}
end
end
defp identifier_to_cursor_context([?., ?., ?: | _], n, _), do: {{:unquoted_atom, '..'}, n + 3}
defp identifier_to_cursor_context([?., ?., ?. | _], n, _), do: {{:local_or_var, '...'}, n + 3}
defp identifier_to_cursor_context([?., ?: | _], n, _), do: {{:unquoted_atom, '.'}, n + 2}
defp identifier_to_cursor_context([?., ?. | _], n, _), do: {{:operator, '..'}, n + 2}
defp identifier_to_cursor_context(reverse, count, call_op?) do
case identifier(reverse, count) do
:none ->
{:none, 0}
:operator ->
operator(reverse, count, [], call_op?)
{:module_attribute, acc, count} ->
{{:module_attribute, acc}, count}
{:unquoted_atom, acc, count} ->
{{:unquoted_atom, acc}, count}
{:alias, rest, acc, count} ->
case strip_spaces(rest, count) do
{'.' ++ rest, count} when rest == [] or hd(rest) != ?. ->
nested_alias(rest, count + 1, acc)
_ ->
{{:alias, acc}, count}
end
{:identifier, _, acc, count} when call_op? and acc in @textual_operators ->
{{:operator, acc}, count}
{:identifier, rest, acc, count} ->
case strip_spaces(rest, count) do
{'.' ++ rest, count} when rest == [] or hd(rest) != ?. ->
dot(rest, count + 1, acc)
_ ->
{{:local_or_var, acc}, count}
end
end
end
defp identifier([?? | rest], count), do: check_identifier(rest, count + 1, [??])
defp identifier([?! | rest], count), do: check_identifier(rest, count + 1, [?!])
defp identifier(rest, count), do: check_identifier(rest, count, [])
defp check_identifier([h | t], count, acc) when h not in @non_identifier,
do: rest_identifier(t, count + 1, [h | acc])
defp check_identifier(_, _, _), do: :operator
defp rest_identifier([h | rest], count, acc) when h not in @non_identifier do
rest_identifier(rest, count + 1, [h | acc])
end
defp rest_identifier(rest, count, [?@ | acc]) do
case tokenize_identifier(rest, count, acc) do
{:identifier, _rest, acc, count} -> {:module_attribute, acc, count}
:none when acc == [] -> {:module_attribute, '', count}
_ -> :none
end
end
defp rest_identifier([?: | rest], count, acc) when rest == [] or hd(rest) != ?: do
case String.Tokenizer.tokenize(acc) do
{_, _, [], _, _, _} -> {:unquoted_atom, acc, count + 1}
_ -> :none
end
end
defp rest_identifier([?? | _], _count, _acc) do
:none
end
defp rest_identifier(rest, count, acc) do
tokenize_identifier(rest, count, acc)
end
defp tokenize_identifier(rest, count, acc) do
case String.Tokenizer.tokenize(acc) do
# Not actually an atom cause rest is not a :
{:atom, _, _, _, _, _} ->
:none
# Aliases must be ascii only
{:alias, _, _, _, false, _} ->
:none
{kind, _, [], _, _, extra} ->
if ?@ in extra do
:none
else
{kind, rest, acc, count}
end
_ ->
:none
end
end
defp nested_alias(rest, count, acc) do
{rest, count} = strip_spaces(rest, count)
case identifier_to_cursor_context(rest, count, true) do
{{:alias, prev}, count} -> {{:alias, prev ++ '.' ++ acc}, count}
_ -> {:none, 0}
end
end
defp dot(rest, count, acc) do
{rest, count} = strip_spaces(rest, count)
case identifier_to_cursor_context(rest, count, true) do
{{:local_or_var, var}, count} -> {{:dot, {:var, var}, acc}, count}
{{:unquoted_atom, _} = prev, count} -> {{:dot, prev, acc}, count}
{{:alias, _} = prev, count} -> {{:dot, prev, acc}, count}
{{:dot, _, _} = prev, count} -> {{:dot, prev, acc}, count}
{{:module_attribute, _} = prev, count} -> {{:dot, prev, acc}, count}
{_, _} -> {:none, 0}
end
end
defp operator([h | rest], count, acc, call_op?) when h in @operators do
operator(rest, count + 1, [h | acc], call_op?)
end
defp operator(rest, count, acc, call_op?) when acc in ~w(^^ ~~ ~)c do
{rest, dot_count} = strip_spaces(rest, count)
cond do
call_op? ->
{:none, 0}
match?([?. | rest] when rest == [] or hd(rest) != ?., rest) ->
dot(tl(rest), dot_count + 1, acc)
true ->
{{:operator, acc}, count}
end
end
defp operator(rest, count, acc, _call_op?) do
case :elixir_tokenizer.tokenize(acc, 1, 1, []) do
{:ok, _, [{:atom, _, _}]} ->
{{:unquoted_atom, tl(acc)}, count}
{:ok, _, [{_, _, op}]} ->
{rest, dot_count} = strip_spaces(rest, count)
cond do
Code.Identifier.unary_op(op) == :error and Code.Identifier.binary_op(op) == :error ->
:none
match?([?. | rest] when rest == [] or hd(rest) != ?., rest) ->
dot(tl(rest), dot_count + 1, acc)
true ->
{{:operator, acc}, count}
end
_ ->
{:none, 0}
end
end
@doc """
Receives a string and returns the surround context.
This function receives a string with an Elixir code fragment
and a `position`. It returns a map containing the beginning
and ending of the expression alongside its context, or `:none`
if there is nothing with a known context.
The difference between `cursor_context/2` and `surround_context/3`
is that the former assumes the expression in the code fragment
is incomplete. For example, `do` in `cursor_context/2` may be
a keyword or a variable or a local call, while `surround_context/3`
assumes the expression in the code fragment is complete, therefore
`do` would always be a keyword.
The `position` contains both the `line` and `column`, both starting
with the index of 1. The column must precede the surrounding expression.
For example, the expression `foo` will return something for the columns
1, 2, and 3, but not 4:
foo
^ column 1
foo
 ^ column 2
foo
  ^ column 3
foo
   ^ column 4
The returned map contains the column the expression starts and the
first column after the expression ends.
This function builds on top of `cursor_context/2`. Therefore
it also provides a best-effort detection and may not be accurate
under all circumstances. See the "Return values" section for more
information on the available contexts as well as the "Limitations"
section.
## Examples
iex> Code.Fragment.surround_context("foo", {1, 1})
%{begin: {1, 1}, context: {:local_or_var, 'foo'}, end: {1, 4}}
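Two more positions for the same kind of input (illustrative):

```elixir
# Column 4 is past the expression, so nothing surrounds it
Code.Fragment.surround_context("foo", {1, 4})
#=> :none

# Any column within "bar" resolves to the same surrounding expression
Code.Fragment.surround_context("foo bar", {1, 5})
#=> %{begin: {1, 5}, context: {:local_or_var, 'bar'}, end: {1, 8}}
```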
## Differences to `cursor_context/2`
In contrast to `cursor_context/2`, `surround_context/3` does not
return `dot_call`/`dot_arity` nor `operator_call`/`operator_arity`
contexts because they should behave the same as `dot` and `operator`
respectively in complete expressions.
On the other hand, it does make a distinction between `local_call`/
`local_arity` to `local_or_var`, since the latter can be a local or
variable.
Also note that `@` when not followed by any identifier is returned
as `{:operator, '@'}`, while it is a `{:module_attribute, ''}` in
`cursor_context/2`. Once again, this happens because `surround_context/3`
assumes the expression is complete, while `cursor_context/2` does not.
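For example (illustrative):

```elixir
# cursor_context/2 treats "do" as possibly incomplete input...
Code.Fragment.cursor_context("do")
#=> {:local_or_var, 'do'}

# ...while surround_context/3 treats it as a complete keyword
Code.Fragment.surround_context("do", {1, 1})
#=> :none

# A complete dot call surfaces as :dot rather than :dot_call
Code.Fragment.surround_context("Map.get(", {1, 5})
#=> %{begin: {1, 1}, context: {:dot, {:alias, 'Map'}, 'get'}, end: {1, 8}}
```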
"""
@doc since: "1.13.0"
@spec surround_context(List.Chars.t(), position(), keyword()) ::
%{begin: position, end: position, context: context} | :none
when context:
{:alias, charlist}
| {:dot, inside_dot, charlist}
| {:local_or_var, charlist}
| {:local_arity, charlist}
| {:local_call, charlist}
| {:module_attribute, charlist}
| {:operator, charlist}
| {:unquoted_atom, charlist},
inside_dot:
{:alias, charlist}
| {:dot, inside_dot, charlist}
| {:module_attribute, charlist}
| {:unquoted_atom, charlist}
| {:var, charlist}
def surround_context(fragment, position, options \\ [])
def surround_context(binary, {line, column}, opts) when is_binary(binary) do
binary
|> String.split("\n")
|> Enum.at(line - 1, '')
|> String.to_charlist()
|> position_surround_context(line, column, opts)
end
def surround_context(charlist, {line, column}, opts) when is_list(charlist) do
charlist
|> :string.split('\n', :all)
|> Enum.at(line - 1, '')
|> position_surround_context(line, column, opts)
end
def surround_context(other, position, opts) do
surround_context(to_charlist(other), position, opts)
end
defp position_surround_context(charlist, line, column, opts)
when is_integer(line) and line >= 1 and is_integer(column) and column >= 1 do
{reversed_pre, post} = string_reverse_at(charlist, column - 1, [])
{reversed_pre, post} = adjust_position(reversed_pre, post)
case take_identifier(post, []) do
{_, [], _} ->
maybe_operator(reversed_pre, post, line, opts)
{:identifier, reversed_post, rest} ->
{rest, _} = strip_spaces(rest, 0)
reversed = reversed_post ++ reversed_pre
case codepoint_cursor_context(reversed, opts) do
{{:alias, acc}, offset} ->
build_surround({:alias, acc}, reversed, line, offset)
{{:dot, _, [_ | _]} = dot, offset} ->
build_surround(dot, reversed, line, offset)
{{:local_or_var, acc}, offset} when hd(rest) == ?( ->
build_surround({:local_call, acc}, reversed, line, offset)
{{:local_or_var, acc}, offset} when hd(rest) == ?/ ->
build_surround({:local_arity, acc}, reversed, line, offset)
{{:local_or_var, acc}, offset} when acc in @textual_operators ->
build_surround({:operator, acc}, reversed, line, offset)
{{:local_or_var, acc}, offset} when acc not in ~w(do end after else catch rescue)c ->
build_surround({:local_or_var, acc}, reversed, line, offset)
{{:module_attribute, ''}, offset} ->
build_surround({:operator, '@'}, reversed, line, offset)
{{:module_attribute, acc}, offset} ->
build_surround({:module_attribute, acc}, reversed, line, offset)
{{:unquoted_atom, acc}, offset} ->
build_surround({:unquoted_atom, acc}, reversed, line, offset)
_ ->
maybe_operator(reversed_pre, post, line, opts)
end
{:alias, reversed_post, _rest} ->
reversed = reversed_post ++ reversed_pre
case codepoint_cursor_context(reversed, opts) do
{{:alias, acc}, offset} ->
build_surround({:alias, acc}, reversed, line, offset)
_ ->
:none
end
end
end
defp maybe_operator(reversed_pre, post, line, opts) do
case take_operator(post, []) do
{[], _rest} ->
:none
{reversed_post, _rest} ->
reversed = reversed_post ++ reversed_pre
case codepoint_cursor_context(reversed, opts) do
{{:operator, acc}, offset} ->
build_surround({:operator, acc}, reversed, line, offset)
{{:dot, _, [_ | _]} = dot, offset} ->
build_surround(dot, reversed, line, offset)
_ ->
:none
end
end
end
defp build_surround(context, reversed, line, offset) do
{post, reversed_pre} = enum_reverse_at(reversed, offset, [])
pre = :lists.reverse(reversed_pre)
pre_length = :string.length(pre) + 1
%{
context: context,
begin: {line, pre_length},
end: {line, pre_length + :string.length(post)}
}
end
defp take_identifier([h | t], acc) when h in @trailing_identifier,
do: {:identifier, [h | acc], t}
defp take_identifier([h | t], acc) when h not in @non_identifier,
do: take_identifier(t, [h | acc])
defp take_identifier(rest, acc) do
with {[?. | t], _} <- strip_spaces(rest, 0),
{[h | _], _} when h in ?A..?Z <- strip_spaces(t, 0) do
take_alias(rest, acc)
else
_ -> {:identifier, acc, rest}
end
end
defp take_alias([h | t], acc) when h not in @non_identifier,
do: take_alias(t, [h | acc])
defp take_alias(rest, acc) do
with {[?. | t], acc} <- move_spaces(rest, acc),
{[h | t], acc} when h in ?A..?Z <- move_spaces(t, [?. | acc]) do
take_alias(t, [h | acc])
else
_ -> {:alias, acc, rest}
end
end
defp take_operator([h | t], acc) when h in @operators, do: take_operator(t, [h | acc])
defp take_operator([h | t], acc) when h == ?., do: take_operator(t, [h | acc])
defp take_operator(rest, acc), do: {acc, rest}
# Unquoted atom handling
defp adjust_position(reversed_pre, [?: | post])
when hd(post) != ?: and (reversed_pre == [] or hd(reversed_pre) != ?:) do
{[?: | reversed_pre], post}
end
# Dot handling
defp adjust_position(reversed_pre, post) do
case move_spaces(post, reversed_pre) do
# If we are between spaces and a dot, move past the dot
{[?. | post], reversed_pre} when hd(post) != ?. and hd(reversed_pre) != ?. ->
{post, reversed_pre} = move_spaces(post, [?. | reversed_pre])
{reversed_pre, post}
_ ->
case strip_spaces(reversed_pre, 0) do
# If there is a dot to our left, make sure to move to the first character
{[?. | rest], _} when rest == [] or hd(rest) not in '.:' ->
{post, reversed_pre} = move_spaces(post, reversed_pre)
{reversed_pre, post}
_ ->
{reversed_pre, post}
end
end
end
defp move_spaces([h | t], acc) when h in @space, do: move_spaces(t, [h | acc])
defp move_spaces(t, acc), do: {t, acc}
defp string_reverse_at(charlist, 0, acc), do: {acc, charlist}
defp string_reverse_at(charlist, n, acc) do
case :unicode_util.gc(charlist) do
[gc | cont] when is_integer(gc) -> string_reverse_at(cont, n - 1, [gc | acc])
[gc | cont] when is_list(gc) -> string_reverse_at(cont, n - 1, :lists.reverse(gc, acc))
[] -> {acc, []}
end
end
defp enum_reverse_at([h | t], n, acc) when n > 0, do: enum_reverse_at(t, n - 1, [h | acc])
defp enum_reverse_at(rest, _, acc), do: {acc, rest}
end
| 34.493213 | 96 | 0.60195 |
03ebccb703f3c440db0e4e8b5cee1b1d56ca5b0d | 11,183 | exs | Elixir | test/cldr_units_test.exs | kianmeng/cldr_units | fae3ed1f658e6b4c5164c2ebbe786cc562014a8b | ["Apache-2.0"] | null | null | null | test/cldr_units_test.exs | kianmeng/cldr_units | fae3ed1f658e6b4c5164c2ebbe786cc562014a8b | ["Apache-2.0"] | null | null | null | test/cldr_units_test.exs | kianmeng/cldr_units | fae3ed1f658e6b4c5164c2ebbe786cc562014a8b | ["Apache-2.0"] | null | null | null | defmodule Cldr.UnitsTest do
use ExUnit.Case, async: true
test "new unit with multiple 'per' clauses" do
assert Cldr.Unit.new!(2, "curr-usd-per-meter-per-second").unit ==
"curr_usd_per_meter_per_second"
assert Cldr.Unit.new!(2, "curr-usd-per-mile-per-gallon").unit ==
"curr_usd_per_gallon_mile"
end
test "that centimetre conversion is correct" do
assert Cldr.Unit.convert(Cldr.Unit.new!(:millimeter, 300), :centimeter) ==
Cldr.Unit.new(:centimeter, 30.0)
end
test "that pluralization in non-en locales works" do
assert Cldr.Unit.Format.to_string!(1, MyApp.Cldr, locale: "de", unit: :microsecond) ==
"1 Mikrosekunde"
assert Cldr.Unit.Format.to_string!(123, MyApp.Cldr, locale: "de", unit: :microsecond) ==
"123 Mikrosekunden"
assert Cldr.Unit.Format.to_string!(1, MyApp.Cldr, locale: "de", unit: :pint) == "1 Pint"
assert Cldr.Unit.Format.to_string!(123, MyApp.Cldr, locale: "de", unit: :pint) == "123 Pints"
assert Cldr.Unit.Format.to_string!(1, MyApp.Cldr, locale: "de", unit: :century) ==
"1 Jahrhundert"
assert Cldr.Unit.Format.to_string!(123, MyApp.Cldr, locale: "de", unit: :century) ==
"123 Jahrhunderte"
end
test "locale option is passed to Cldr.Number.to_string" do
assert Cldr.Unit.Format.to_string!(1, MyApp.Cldr, format: :spellout, locale: "de", unit: :pint) ==
"eins Pint"
end
test "decimal" do
unit = Cldr.Unit.new!(Decimal.new("300"), :minute)
{:ok, hours} = Cldr.Unit.Conversion.convert(unit, :hour)
assert hours.unit == :hour
assert Decimal.equal?(5, Cldr.Unit.value(Cldr.Unit.round(hours)))
end
test "decimal functional conversion - celsius" do
celsius = Cldr.Unit.new!(Decimal.new("100"), :celsius)
{:ok, fahrenheit} = Cldr.Unit.Conversion.convert(celsius, :fahrenheit)
fahrenheit = Cldr.Unit.to_float_unit(fahrenheit)
assert fahrenheit.value == 212
end
test "decimal functional conversion - kelvin" do
celsius = Cldr.Unit.new!(Decimal.new("0"), :celsius)
{:ok, kelvin} = Cldr.Unit.Conversion.convert(celsius, :kelvin)
kelvin = Cldr.Unit.to_float_unit(kelvin)
assert kelvin.value == 273.15
end
test "decimal conversion without function" do
celsius = Cldr.Unit.new!(Decimal.new(100), :celsius)
{:ok, celsius2} = Cldr.Unit.Conversion.convert(celsius, :celsius)
assert Decimal.equal?(Cldr.Unit.value(celsius2), Decimal.new(100))
end
test "that to_string is invoked by the String.Chars protocol" do
unit = Cldr.Unit.new!(23, :foot)
assert to_string(unit) == "23 feet"
end
test "formatting a list" do
list = [Cldr.Unit.new!(23, :foot), Cldr.Unit.new!(5, :inch)]
assert Cldr.Unit.Format.to_string(list, MyApp.Cldr, []) == {:ok, "23 feet and 5 inches"}
end
test "localize a unit" do
unit = Cldr.Unit.new!(100, :meter)
assert Cldr.Unit.localize(unit, usage: :person, territory: :US) ==
[Cldr.Unit.new!(:inch, Ratio.new(21_617_278_211_378_380_800, 5_490_788_665_690_109))]
assert Cldr.Unit.localize(unit, usage: :person_height, territory: :US) ==
[
Cldr.Unit.new!(:foot, 328),
Cldr.Unit.new!(:inch, Ratio.new(5_534_023_222_111_776, 5_490_788_665_690_109))
]
assert Cldr.Unit.localize(unit, usage: :unknown, territory: :US) ==
{:error,
{Cldr.Unit.UnknownUsageError,
"The unit category :length does not define a usage :unknown"}}
end
test "localize a decimal unit" do
u = Cldr.Unit.new!(Decimal.new(20), :meter)
assert Cldr.Unit.localize(u, territory: :US) ==
[Cldr.Unit.new!(:foot, Ratio.new(360_287_970_189_639_680, 5_490_788_665_690_109))]
end
test "localize a ratio unit" do
u = Cldr.Unit.new!(:foot, Ratio.new(360_287_970_189_639_680, 5_490_788_665_690_109))
assert Cldr.Unit.localize(u, territory: :AU) == [Cldr.Unit.new!(:meter, 20)]
end
test "to_string a decimal unit" do
u = Cldr.Unit.new!(Decimal.new(20), :meter)
assert Cldr.Unit.Format.to_string(u) == {:ok, "20 meters"}
end
test "to_string a ratio unit" do
u = Cldr.Unit.new!(:foot, Ratio.new(360_287_970_189_639_680, 5_490_788_665_690_109))
assert Cldr.Unit.Format.to_string(u) == {:ok, "65.617 feet"}
end
test "inspection when non-default usage or non-default format options" do
assert inspect(Cldr.Unit.new!(:meter, 1)) == "#Cldr.Unit<:meter, 1>"
assert inspect(Cldr.Unit.new!(:meter, 1, usage: :road)) ==
"#Cldr.Unit<:meter, 1, usage: :road, format_options: []>"
assert inspect(Cldr.Unit.new!(:meter, 1, format_options: [round_nearest: 50])) ==
"#Cldr.Unit<:meter, 1, usage: :default, format_options: [round_nearest: 50]>"
end
test "that unit skeletons are used for formatting" do
unit = Cldr.Unit.new!(311, :meter, usage: :road)
localized = Cldr.Unit.localize(unit, MyApp.Cldr, territory: :SE)
assert localized ==
[Cldr.Unit.new!(:meter, 311, usage: :road, format_options: [round_nearest: 50])]
assert Cldr.Unit.Format.to_string!(localized) == "300 meters"
end
test "creating a compound unit" do
assert {:ok, unit} = Cldr.Unit.new("meter_per_kilogram", 1)
assert unit.usage == :default
end
test "to_string a compound unit" do
unit = Cldr.Unit.new!("meter_per_kilogram", 1)
assert {:ok, "1 meter per kilogram"} = Cldr.Unit.Format.to_string(unit)
end
test "to_string for a pattern with no substitutions when the unit value is 0, 1 or 2" do
unit = Cldr.Unit.new!(1, :hour)
assert Cldr.Unit.to_string(unit, locale: "he") == {:ok, "שעה"}
assert Cldr.Unit.to_string(unit, locale: "ar") == {:ok, "ساعة"}
unit = Cldr.Unit.new!(-1, :hour)
assert Cldr.Unit.to_string(unit, locale: "he") == {:ok, "-1 שעות"}
assert Cldr.Unit.to_string(unit, locale: "ar") == {:ok, "-١ ساعة"}
unit = Cldr.Unit.new!(3, :hour)
assert Cldr.Unit.to_string(unit, locale: "he") == {:ok, "3 שעות"}
end
test "to_string a complex compound unit" do
unit = Cldr.Unit.new!("square millimeter per cubic fathom", 3)
assert Cldr.Unit.Format.to_string(unit) == {:ok, "3 square millimeters per cubic fathom"}
end
test "to_string a binary prefixed unit" do
unit = Cldr.Unit.new!("gibibyte", 2)
assert Cldr.Unit.Format.to_string(unit) == {:ok, "2 gibibytes"}
end
test "to_string a per compound unit" do
unit = Cldr.Unit.new!("meter_per_square_kilogram", 1)
assert Cldr.Unit.Format.to_string(unit) == {:ok, "1 meter per square kilogram"}
unit = Cldr.Unit.new!("meter_per_square_kilogram", 2)
assert Cldr.Unit.Format.to_string(unit) == {:ok, "2 meters per square kilogram"}
end
test "localization with current process locales" do
assert Cldr.Unit.localize(Cldr.Unit.new!(2, :meter, usage: :person_height))
assert Cldr.Unit.localize(Cldr.Unit.new!(2, :meter, usage: :person_height), locale: "fr")
end
test "a multiplied unit to_string" do
unit = Cldr.Unit.new!("meter ampere volt", 3)
assert Cldr.Unit.Format.to_string(unit) == {:ok, "3 volt-meter-amperes"}
end
test "create a unit that is directly translatable but has no explicit conversion" do
assert {:ok, "1 kilowatt hour"} ==
Cldr.Unit.new!(1, :kilowatt_hour) |> Cldr.Unit.Format.to_string()
assert {:ok, "1 Kilowattstunde"} ==
Cldr.Unit.new!(1, :kilowatt_hour) |> Cldr.Unit.Format.to_string(locale: "de")
end
test "that a translatable unit name in binary form gets identified as translatable" do
assert {:ok, "1 kilowatt hour"} ==
Cldr.Unit.new!(1, "kilowatt_hour") |> Cldr.Unit.Format.to_string()
end
test "unit categories" do
assert Cldr.Unit.known_unit_categories() ==
[
:acceleration,
:angle,
:area,
:concentr,
:consumption,
:digital,
:duration,
:electric,
:energy,
:force,
:frequency,
:graphics,
:length,
:light,
:mass,
:power,
:pressure,
:speed,
:temperature,
:torque,
:volume
]
end
test "unit categories for" do
assert {:ok, _list} = Cldr.Unit.known_units_for_category(:volume)
assert Cldr.Unit.known_units_for_category(:invalid) ==
{:error,
{Cldr.Unit.UnknownUnitCategoryError, "The unit category :invalid is not known."}}
end
test "unit category for" do
assert Cldr.Unit.unit_category(:year) == {:ok, :duration}
end
test "display names" do
assert Cldr.Unit.display_name(:liter) == "liters"
assert Cldr.Unit.display_name(:liter, locale: "fr") == "litres"
assert Cldr.Unit.display_name(:liter, locale: "fr", style: :short) == "l"
assert Cldr.Unit.display_name(:liter, locale: "fr", style: :invalid) ==
{:error, {Cldr.UnknownFormatError, "The unit style :invalid is not known."}}
assert Cldr.Unit.display_name(:liter, locale: "xx", style: :short) ==
{:error, {Cldr.InvalidLanguageError, "The language \"xx\" is invalid"}}
assert Cldr.Unit.display_name(:invalid, locale: "fr", style: :short) ==
{:error, {Cldr.UnknownUnitError, "The unit :invalid is not known."}}
end
test "Unit of 1 retrieves a default pattern is plural category pattern does not exist" do
unit = Cldr.Unit.new!(1, :pascal)
assert Cldr.Unit.to_string(unit, locale: "de", style: :short) == {:ok, "1 Pa"}
end
test "Format a unit when there is no default backend" do
default = Application.get_env(:ex_cldr, :default_backend)
Application.put_env(:ex_cldr, :default_backend, nil)
assert MyApp.Cldr.Unit.to_string!(7.3, unit: :kilogram) == "7.3 kilograms"
Application.put_env(:ex_cldr, :default_backend, default)
end
test "Cldr.DisplayName protocol for Unit" do
assert Cldr.display_name(Cldr.Unit.new!(:foot, 1)) == "feet"
end
if function_exported?(Code, :fetch_docs, 1) do
test "that no module docs are generated for a backend" do
assert {:docs_v1, _, :elixir, _, :hidden, %{}, _} = Code.fetch_docs(NoDocs.Cldr)
end
test "that module docs are generated for a backend" do
assert {:docs_v1, _, :elixir, "text/markdown", %{"en" => _}, %{}, _} = Code.fetch_docs(MyApp.Cldr)
end
end
test "Cldr.Unit.from_map/1" do
assert Cldr.Unit.from_map(%{value: 1, unit: "kilogram"}) == Cldr.Unit.new(:kilogram, 1)
assert Cldr.Unit.from_map(%{"value" => 1, "unit" => "kilogram"}) == Cldr.Unit.new(:kilogram, 1)
assert Cldr.Unit.from_map(%{value: %{numerator: 3, denominator: 4}, unit: "kilogram"}) == Cldr.Unit.new(:kilogram, Ratio.new(3, 4))
assert Cldr.Unit.from_map(%{"value" => 1, unit: "kilogram"}) ==
{:error,
{Cldr.UnknownUnitError,
"The unit %{:unit => \"kilogram\", \"value\" => 1} is not known."}}
end
end
| 37.401338 | 133 | 0.636502 |
03ebd68b88ba42a5a697cde23695ed3fe8e93ac0 | 904 | exs | Elixir | test/day_14_disk_defragmentation_test.exs | scmx/advent-of-code-2017-elixir | 7435028bf7cd4d0c880144363aab2e3d80ab7ceb | ["MIT"] | 1 | 2018-11-26T11:34:41.000Z | 2018-11-26T11:34:41.000Z | test/day_14_disk_defragmentation_test.exs | scmx/advent-of-code-2017-elixir | 7435028bf7cd4d0c880144363aab2e3d80ab7ceb | ["MIT"] | null | null | null | test/day_14_disk_defragmentation_test.exs | scmx/advent-of-code-2017-elixir | 7435028bf7cd4d0c880144363aab2e3d80ab7ceb | ["MIT"] | 1 | 2018-01-23T08:12:26.000Z | 2018-01-23T08:12:26.000Z | defmodule Adventofcode.Day14DiskDefragmentationTest do
use Adventofcode.FancyCase
import Adventofcode.Day14DiskDefragmentation
describe "squares_count/1" do
test "flqrgnkx across the entire 128x128 grid 8108 squares are used" do
assert 8108 = "flqrgnkx" |> squares_count()
end
test "with puzzle input" do
with_puzzle_input("input/day_14_disk_defragmentation.txt", fn input ->
assert 8250 = input |> squares_count()
end)
end
end
describe "regions_count/1" do
test "flqrgnkx across the entire 128x128 grid there are 1242 groups" do
# IO.puts ""
# IO.puts(pretty_print(regions("flqrgnkx")))
assert 1242 = "flqrgnkx" |> regions_count()
end
test "with puzzle input" do
with_puzzle_input("input/day_14_disk_defragmentation.txt", fn input ->
assert 1113 = input |> regions_count()
end)
end
end
end
| 28.25 | 76 | 0.691372 |
03ebe10b3ac9ed336b3d22f73b04206185b24ecb | 1,961 | exs | Elixir | config/dev.exs | AstridMN/elixir1 | f62347179628937cdb63fc14db0b0a32c689e0ae | ["MIT"] | 3 | 2017-04-18T06:23:18.000Z | 2018-01-05T04:14:57.000Z | config/dev.exs | AstridMN/elixir1 | f62347179628937cdb63fc14db0b0a32c689e0ae | ["MIT"] | 27 | 2016-11-04T19:56:11.000Z | 2017-03-26T22:34:55.000Z | config/dev.exs | AstridMN/elixir1 | f62347179628937cdb63fc14db0b0a32c689e0ae | ["MIT"] | 3 | 2016-10-05T11:56:42.000Z | 2017-03-05T14:37:10.000Z | use Mix.Config
# For development, we disable any cache and enable
# debugging and code reloading.
#
# The watchers configuration can be used to run external
# watchers to your application. For example, we use it
# with brunch.io to recompile .js and .css sources.
config :microcrawler_webapp, MicrocrawlerWebapp.Endpoint,
http: [port: 4000],
debug_errors: true,
code_reloader: true,
check_origin: false,
watchers: [node: ["node_modules/brunch/bin/brunch", "watch", "--stdin",
cd: Path.expand("../", __DIR__)]]
# Watch static and templates for browser reloading.
config :microcrawler_webapp, MicrocrawlerWebapp.Endpoint,
live_reload: [
patterns: [
~r{priv/static/.*(js|css|png|jpeg|jpg|gif|svg)$},
~r{priv/gettext/.*(po)$},
~r{web/views/.*(ex)$},
~r{web/templates/.*(eex)$}
]
]
# Do not include metadata nor timestamps in development logs
config :logger, :console, format: "[$level] $message\n"
# Set a higher stacktrace during development. Avoid configuring such
# in production as building large stacktraces may be expensive.
config :phoenix, :stacktrace_depth, 20
# Configure your AMQP
config :amqp,
username: System.get_env("AMQP_USERNAME") || "guest",
password: System.get_env("AMQP_PASSWORD") || "guest",
vhost: System.get_env("AMQP_VHOST") || "/",
hostname: System.get_env("AMQP_HOSTNAME") || "localhost"
config :microcrawler_webapp, MicrocrawlerWebapp.Couchbase,
url: System.get_env("GAUC_URL") || "http://localhost:5000",
bucket: System.get_env("GAUC_BUCKET") || "default"
config :microcrawler_webapp, MicrocrawlerWebapp.Elasticsearch,
url: System.get_env("ELASTIC_URL") || "http://elastic:changeme@localhost:9200",
index: System.get_env("ELASTIC_INDEX") || "default",
doc_type: System.get_env("ELASTIC_DOC_TYPE") || "default"
config :microcrawler_webapp, MicrocrawlerWebapp.Users,
predefined: [
%{email: "montana@example.com", password: "wildhack"}
]
| 35.017857 | 81 | 0.711882 |
03ebecc5cdc7f57fef596c0b0a459c2c1985de4e | 2,228 | ex | Elixir | lib/html_sanitize_ex/scrubber/basic_html.ex | MikeAndrianov/html_sanitize_ex | 91ca3dd14c5ce5d6eb41fca47afea4a47ddce1c7 | ["MIT"] | null | null | null | lib/html_sanitize_ex/scrubber/basic_html.ex | MikeAndrianov/html_sanitize_ex | 91ca3dd14c5ce5d6eb41fca47afea4a47ddce1c7 | ["MIT"] | null | null | null | lib/html_sanitize_ex/scrubber/basic_html.ex | MikeAndrianov/html_sanitize_ex | 91ca3dd14c5ce5d6eb41fca47afea4a47ddce1c7 | ["MIT"] | null | null | null | defmodule HtmlSanitizeEx.Scrubber.BasicHTML do
@moduledoc """
Allows basic HTML tags to support user input for writing relatively
plain text but allowing headings, links, bold, and so on.
Does not allow styling, HTML5 tags, video embeds, etc. Links with the
`http`, `https` and `mailto` schemes are allowed (see `@valid_schemes`).
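## Example

A minimal sketch of applying this scrubber through the library's top-level
helper (`HtmlSanitizeEx.basic_html/1`, which uses this module):

```elixir
HtmlSanitizeEx.basic_html(~s(<h1>Hello</h1><script>alert("x")</script>))
```

Tags not covered below, such as `<script>`, are stripped, while the tags
allowed by the `Meta` calls in this module are kept.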
"""
require HtmlSanitizeEx.Scrubber.Meta
alias HtmlSanitizeEx.Scrubber.Meta
@valid_schemes ["http", "https", "mailto"]
# Removes any CDATA tags before the traverser/scrubber runs.
Meta.remove_cdata_sections_before_scrub()
Meta.strip_comments()
Meta.allow_tag_with_uri_attributes("a", ["href"], @valid_schemes)
Meta.allow_tag_with_these_attributes("a", ["name", "title"])
Meta.allow_tag_with_these_attributes("b", [])
Meta.allow_tag_with_these_attributes("blockquote", [])
Meta.allow_tag_with_these_attributes("br", [])
Meta.allow_tag_with_these_attributes("code", [])
Meta.allow_tag_with_these_attributes("del", [])
Meta.allow_tag_with_these_attributes("em", [])
Meta.allow_tag_with_these_attributes("h1", [])
Meta.allow_tag_with_these_attributes("h2", [])
Meta.allow_tag_with_these_attributes("h3", [])
Meta.allow_tag_with_these_attributes("h4", [])
Meta.allow_tag_with_these_attributes("h5", [])
Meta.allow_tag_with_these_attributes("hr", [])
Meta.allow_tag_with_these_attributes("i", [])
Meta.allow_tag_with_uri_attributes("img", ["src"], @valid_schemes)
Meta.allow_tag_with_these_attributes("img", [
"width",
"height",
"title",
"alt"
])
Meta.allow_tag_with_these_attributes("li", [])
Meta.allow_tag_with_these_attributes("ol", [])
Meta.allow_tag_with_these_attributes("p", [])
Meta.allow_tag_with_these_attributes("pre", [])
Meta.allow_tag_with_these_attributes("span", [])
Meta.allow_tag_with_these_attributes("strong", [])
Meta.allow_tag_with_these_attributes("table", [])
Meta.allow_tag_with_these_attributes("tbody", [])
Meta.allow_tag_with_these_attributes("td", [])
Meta.allow_tag_with_these_attributes("th", [])
Meta.allow_tag_with_these_attributes("thead", [])
Meta.allow_tag_with_these_attributes("tr", [])
Meta.allow_tag_with_these_attributes("u", [])
Meta.allow_tag_with_these_attributes("ul", [])
Meta.strip_everything_not_covered()
end
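
# Example usage (illustrative sketch, not part of the original file). Assuming
# the standard html_sanitize_ex public API, this scrubber backs
# `HtmlSanitizeEx.basic_html/1`:
#
#     HtmlSanitizeEx.basic_html(~s(<a href="https://x.io">x</a><script>evil()</script>))
#     # the <script> tag is stripped, while the allowed <a> tag survives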

# File: test/phoenix_live_view/controller_test.exs
# Repo: alanvardy/phoenix_live_view (MIT)

defmodule Phoenix.LiveView.ControllerTest do
use ExUnit.Case, async: true
use Phoenix.ConnTest
alias Phoenix.LiveViewTest.Endpoint
@endpoint Endpoint
setup do
{:ok, conn: Phoenix.ConnTest.build_conn()}
end
test "live renders from controller without session", %{conn: conn} do
conn = get(conn, "/controller/live-render-2")
assert html_response(conn, 200) =~ "session: %{}"
end
test "live renders from controller with session", %{conn: conn} do
conn = get(conn, "/controller/live-render-3")
assert html_response(conn, 200) =~ "session: %{\"custom\" => :session}"
end
test "when session data has atom keys, warns on live render", %{conn: conn} do
assert ExUnit.CaptureIO.capture_io(:stderr, fn ->
conn = get(conn, "/controller/live-render-4")
assert html_response(conn, 200) =~ "session: %{custom: :session}"
end) =~ "Phoenix.LiveView sessions require string keys, got: :custom"
end
test "live renders from controller with merged assigns", %{conn: conn} do
conn = get(conn, "/controller/live-render-5")
assert html_response(conn, 200) =~ "title: Dashboard"
end
end

# File: lib/telnet/metrics/client_instrumenter.ex
# Repo: oestrich/grapevine-telnet (MIT)

defmodule GrapevineTelnet.Metrics.ClientInstrumenter do
@moduledoc """
Instrumentation for the telnet client
"""
use Prometheus.Metric
require Logger
alias GrapevineTelnet.Presence
@doc false
def setup() do
events = [
[:start],
[:connection, :connected],
[:connection, :failed],
[:wont],
[:dont],
[:charset, :sent],
[:charset, :accepted],
[:charset, :rejected],
{[:gmcp, :sent], [:game_id]},
{[:gmcp, :received], [:game_id]},
[:line_mode, :sent],
[:mssp, :failed],
[:mssp, :option, :success],
[:mssp, :sent],
[:mssp, :text, :sent],
[:mssp, :text, :success],
[:term_type, :sent],
[:term_type, :details]
]
Enum.each(events, &setup_event/1)
setup_gauges()
end
defp setup_event({event, labels}) do
name = Enum.join(event, "_")
name = "telnet_#{name}"
Counter.declare(
name: String.to_atom("#{name}_total"),
help: "Total count of tracking for telnet event #{name}",
labels: labels
)
:telemetry.attach(name, [:telnet | event], &handle_event/4, nil)
end
defp setup_event(event) do
name = Enum.join(event, "_")
name = "telnet_#{name}"
Counter.declare(
name: String.to_atom("#{name}_total"),
help: "Total count of tracking for telnet event #{name}"
)
:telemetry.attach(name, [:telnet | event], &handle_event/4, nil)
end
defp setup_gauges() do
Gauge.declare(
name: :telnet_client_count,
help: "Number of live web clients"
)
:telemetry.attach(
"grapevine-client-online",
[:telnet, :clients, :online],
&handle_event/4,
nil
)
end
@doc """
  Dispatch the clients-online telemetry event.

  Called from the telemetry poller.
"""
def dispatch_client_count() do
:telemetry.execute([:telnet, :clients, :online], Presence.online_client_count(), %{})
end
def handle_event([:telnet, :clients, :online], count, _metadata, _config) do
Gauge.set([name: :telnet_client_count], count)
end
def handle_event([:telnet, :start], _count, %{host: host, port: port}, _config) do
Logger.debug(
fn ->
"Starting Telnet Client: #{host}:#{port}"
end,
type: :telnet
)
Counter.inc(name: :telnet_start_total)
end
def handle_event([:telnet, :connection, :connected], _count, _metadata, _config) do
Logger.debug("Connected to game", type: :telnet)
Counter.inc(name: :telnet_connection_connected_total)
end
def handle_event([:telnet, :connection, :failed], _count, metadata, _config) do
Logger.debug(
fn ->
"Could not connect to a game - #{inspect(metadata[:error])}"
end,
type: :telnet
)
Counter.inc(name: :telnet_connection_failed_total)
end
def handle_event([:telnet, :wont], _count, metadata, _config) do
Logger.debug(
fn ->
"Rejecting a WONT #{metadata[:byte]}"
end,
type: :telnet
)
Counter.inc(name: :telnet_wont_total)
end
def handle_event([:telnet, :dont], _count, metadata, _config) do
Logger.debug(
fn ->
"Rejecting a DO #{metadata[:byte]}"
end,
type: :telnet
)
Counter.inc(name: :telnet_dont_total)
end
def handle_event([:telnet, :charset, :sent], _count, _metadata, _config) do
Logger.debug("Responding to CHARSET", type: :telnet)
Counter.inc(name: :telnet_charset_sent_total)
end
def handle_event([:telnet, :charset, :accepted], _count, _metadata, _config) do
Logger.debug("Accepting charset", type: :telnet)
Counter.inc(name: :telnet_charset_accepted_total)
end
def handle_event([:telnet, :charset, :rejected], _count, _metadata, _config) do
Logger.debug("Rejecting charset", type: :telnet)
Counter.inc(name: :telnet_charset_rejected_total)
end
def handle_event([:telnet, :gmcp, :sent], _count, metadata, _config) do
Logger.debug("Responding to GMCP", type: :telnet)
Counter.inc(name: :telnet_gmcp_sent_total, labels: [metadata[:game_id]])
end
def handle_event([:telnet, :gmcp, :received], _count, metadata, _config) do
Logger.debug("Received GMCP Message", type: :telnet)
Counter.inc(name: :telnet_gmcp_received_total, labels: [metadata[:game_id]])
end
def handle_event([:telnet, :mssp, :sent], _count, _metadata, _config) do
Logger.debug("Sending MSSP via telnet option", type: :telnet)
Counter.inc(name: :telnet_mssp_sent_total)
end
def handle_event([:telnet, :line_mode, :sent], _count, _metadata, _config) do
Logger.debug("Responding to LINEMODE", type: :telnet)
Counter.inc(name: :telnet_line_mode_sent_total)
end
def handle_event([:telnet, :term_type, :sent], _count, _metadata, _config) do
Logger.debug("Responding to TTYPE", type: :telnet)
Counter.inc(name: :telnet_term_type_sent_total)
end
def handle_event([:telnet, :term_type, :details], _count, _metadata, _config) do
Logger.debug("Responding to TTYPE request", type: :telnet)
end
def handle_event([:telnet, :mssp, :option, :success], _count, state, _config) do
Logger.info("Received MSSP from #{state.host}:#{state.port} - option version", type: :telnet)
Counter.inc(name: :telnet_mssp_option_success_total)
end
def handle_event([:telnet, :mssp, :text, :sent], _count, _metadata, _config) do
Logger.debug("Sending a text version of mssp request", type: :telnet)
Counter.inc(name: :telnet_mssp_text_sent_total)
end
def handle_event([:telnet, :mssp, :text, :success], _count, state, _config) do
Logger.info("Received MSSP from #{state.host}:#{state.port} - text version", type: :telnet)
Counter.inc(name: :telnet_mssp_text_success_total)
end
def handle_event([:telnet, :mssp, :failed], _count, state, _config) do
Logger.debug(
fn ->
"Terminating connection to #{state.host}:#{state.port} due to no MSSP being sent"
end,
type: :telnet
)
Counter.inc(name: :telnet_mssp_failed_total)
end
end
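
# Example (illustrative, not part of the original file). The handlers attached
# in `setup/0` fire when matching events are emitted elsewhere; as in
# `dispatch_client_count/0` above, this module assumes the older telemetry
# style of passing a bare measurement value:
#
#     :telemetry.execute([:telnet, :connection, :connected], 1, %{})
#     # increments the :telnet_connection_connected_total counter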

# File: v01/ch12/hello3.edit3.exs
# Repo: oiax/elixir-primer (MIT)

defmodule Hello3 do
def message do
"Hello, world!"
end
def message(name) do
"Hello, #{name}!"
end
end
IO.puts Hello3.message("Alice")
IO.puts Hello3.message("Bob")
IO.puts Hello3.message

# File: apps/password_diceware_api_web/lib/password_diceware_api_web/router.ex
# Repo: sfat/phoenix-passphrase-diceware-generator (MIT)

defmodule PasswordDicewareApiWeb.Router do
use PasswordDicewareApiWeb, :router
pipeline :browser do
plug :accepts, ["html"]
plug :fetch_session
plug :fetch_flash
plug :protect_from_forgery
plug :put_secure_browser_headers
end
pipeline :api do
plug :accepts, ["json"]
end
scope "/", PasswordDicewareApiWeb do
pipe_through :browser
get "/", PageController, :index
end
# Other scopes may use custom stacks.
# scope "/api", PasswordDicewareApiWeb do
# pipe_through :api
# end
end

# File: apps/neoscan/priv/repo/migrations/20180607000003_vouts.exs
# Repo: decentralisedkev/neo-scan (MIT)

defmodule Neoscan.Repo.Migrations.Vouts do
use Ecto.Migration
def change do
create table(:vouts, primary_key: false) do
add(:transaction_hash, :binary, null: false, primary_key: true)
add(:n, :integer, null: false, primary_key: true)
add(:address_hash, :binary, null: false)
add(:asset_hash, :binary, null: false)
add(:value, :decimal, null: false)
add(:block_time, :naive_datetime, null: false)
add(:claimed, :boolean, null: false, default: false)
add(:spent, :boolean, null: false, default: false)
add(:start_block_index, :integer, null: false)
add(:end_block_index, :integer)
timestamps()
end
    # A partial index is used to fetch unclaimed NEO vouts
create(index(:vouts, [:address_hash, :asset_hash]))
create(index(:vouts, [:address_hash], where: "asset_hash = E'\\\\xC56F33FC6ECFCD0C225C4AB356FEE59390AF8560BE0E930FAEBE74A6DAFF7C9B' and claimed = false", name: "partial_vout_index"))
create(index(:vouts, [:address_hash, :spent]))
create(index(:vouts, [:address_hash, :claimed, :spent]))
create table(:vouts_queue, primary_key: false) do
add(:vin_transaction_hash, :binary, null: true)
add(:transaction_hash, :binary, null: false)
add(:n, :integer, null: false)
add(:claimed, :boolean, null: false, default: false)
add(:spent, :boolean, null: false, default: false)
add(:end_block_index, :integer)
add(:block_time, :naive_datetime, null: false)
timestamps()
end
execute """
CREATE OR REPLACE FUNCTION flush_vouts_queue()
RETURNS bool
LANGUAGE plpgsql
AS $body$
DECLARE
v_inserts int;
v_updates int;
v_prunes int;
BEGIN
IF NOT pg_try_advisory_xact_lock('vouts_queue'::regclass::oid::bigint) THEN
RAISE NOTICE 'skipping vouts_queue flush';
RETURN false;
END IF;
WITH
aggregated_queue AS (
SELECT (array_remove(array_agg(vin_transaction_hash), NULL))[1] as vin_transaction_hash,
transaction_hash, n, BOOL_OR(claimed) as claimed, BOOL_OR(spent) as spent, MAX(end_block_index) as end_block_index,
MAX(block_time) as block_time, MIN(inserted_at) as inserted_at, MAX(updated_at) as updated_at
FROM vouts_queue
GROUP BY transaction_hash, n
),
perform_updates AS (
UPDATE vouts
SET
claimed = vouts.claimed or aggregated_queue.claimed,
spent = vouts.spent or aggregated_queue.spent,
end_block_index = GREATEST(vouts.end_block_index, aggregated_queue.end_block_index)
FROM aggregated_queue
WHERE aggregated_queue.transaction_hash = vouts.transaction_hash and aggregated_queue.n = vouts.n
RETURNING aggregated_queue.vin_transaction_hash, aggregated_queue.transaction_hash, aggregated_queue.n, aggregated_queue.claimed,
aggregated_queue.spent, aggregated_queue.end_block_index, aggregated_queue.block_time,
aggregated_queue.inserted_at, aggregated_queue.updated_at, vouts.address_hash, vouts.asset_hash, vouts.value
),
perform_inserts AS (
INSERT INTO address_histories (address_hash, transaction_hash, asset_hash, value, block_time, inserted_at, updated_at)
SELECT address_hash, vin_transaction_hash, asset_hash, value * -1.0, block_time, inserted_at, updated_at
FROM perform_updates WHERE spent = true
RETURNING 1
),
perform_prune AS (
DELETE FROM vouts_queue USING perform_updates
WHERE vouts_queue.transaction_hash = perform_updates.transaction_hash AND
vouts_queue.n = perform_updates.n
RETURNING 1
)
SELECT
(SELECT count(*) FROM perform_updates) updates,
(SELECT count(*) FROM perform_inserts) inserts,
(SELECT count(*) FROM perform_prune) prunes
INTO v_updates, v_inserts, v_prunes;
RAISE NOTICE 'performed vouts_queue flush: % updates, % inserts, % prunes', v_updates, v_inserts, v_prunes;
RETURN true;
END;
$body$;
"""
execute """
CREATE OR REPLACE FUNCTION flush_vouts_queue_trigger() RETURNS TRIGGER LANGUAGE plpgsql AS $body$
BEGIN
IF random() < 0.01 THEN
PERFORM flush_vouts_queue();
END IF;
RETURN NULL;
END;
$body$;
"""
execute """
CREATE TRIGGER vouts_queue_trigger
AFTER INSERT ON vouts_queue
FOR EACH ROW EXECUTE PROCEDURE flush_vouts_queue_trigger();
"""
end
end
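
# Illustrative note (not part of the original migration): the trigger above
# flushes the queue probabilistically (on roughly 1% of inserts); it can also
# be invoked on demand from Elixir, assuming the app's Ecto repo is
# Neoscan.Repo:
#
#     Ecto.Adapters.SQL.query!(Neoscan.Repo, "SELECT flush_vouts_queue()", [])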

# File: lib/liblink/network/consul/agent.ex
# Repo: Xerpa/liblink (Apache-2.0)

# Copyright 2018 (c) Xerpa
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule Liblink.Network.Consul.Agent do
alias Liblink.Data.Consul.Service
alias Liblink.Network.Consul
@type check_option :: {:note, String.t()}
@spec service_register(Consul.t(), Service.t()) :: Tesla.Env.result()
def service_register(client = %Consul{}, service = %Service{}) do
payload = Jason.encode!(Service.to_consul(service))
Tesla.put(client.agent, "/v1/agent/service/register", payload)
end
@spec service_deregister(Consul.t(), String.t()) :: Tesla.Env.result()
def service_deregister(client = %Consul{}, service_id) when is_binary(service_id) do
Tesla.put(client.agent, "/v1/agent/service/deregister/#{service_id}", "")
end
@spec services(Consul.t()) :: Tesla.Env.result()
def services(client) do
Tesla.get(client.agent, "/v1/agent/services")
end
@spec service(Consul.t(), String.t()) :: Tesla.Env.result()
def service(client = %Consul{}, service_id) when is_binary(service_id) do
with {:ok, reply = %{status: 200}} <- services(client) do
{:ok,
Map.update(reply, :body, nil, fn body ->
body
|> Enum.filter(fn {service, _} -> service == service_id end)
|> Map.new()
end)}
end
end
@spec check_pass(Consul.t(), String.t()) :: Tesla.Env.result()
@spec check_pass(Consul.t(), String.t(), [check_option]) :: Tesla.Env.result()
def check_pass(client = %Consul{}, check_id, params \\ []) when is_binary(check_id) do
payload = Jason.encode!(Map.new(params))
Tesla.put(client.agent, "/v1/agent/check/pass/#{check_id}", payload)
end
@spec check_warn(Consul.t(), String.t()) :: Tesla.Env.result()
@spec check_warn(Consul.t(), String.t(), [check_option]) :: Tesla.Env.result()
def check_warn(client = %Consul{}, check_id, params \\ []) when is_binary(check_id) do
payload = Jason.encode!(Map.new(params))
Tesla.put(client.agent, "/v1/agent/check/warn/#{check_id}", payload)
end
@spec check_fail(Consul.t(), String.t()) :: Tesla.Env.result()
@spec check_fail(Consul.t(), String.t(), [check_option]) :: Tesla.Env.result()
def check_fail(client = %Consul{}, check_id, params \\ []) when is_binary(check_id) do
payload = Jason.encode!(Map.new(params))
Tesla.put(client.agent, "/v1/agent/check/fail/#{check_id}", payload)
end
@spec checks(Consul.t()) :: Tesla.Env.result()
def checks(client = %Consul{}) do
Tesla.get(client.agent, "/v1/agent/checks")
end
@spec check(Consul.t(), String.t()) :: Tesla.Env.result()
def check(client = %Consul{}, check_id) do
with {:ok, reply = %{status: 200}} <- checks(client) do
{:ok,
Map.update(reply, :body, nil, fn body ->
body
|> Enum.filter(fn {check, _} -> check == check_id end)
|> Map.new()
end)}
end
end
end
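
# Example usage (illustrative sketch, not part of the original file). Given a
# `%Liblink.Network.Consul{}` client built elsewhere in the app, a TTL health
# check (the check id "service:my-app" is hypothetical) is kept passing with:
#
#     Liblink.Network.Consul.Agent.check_pass(client, "service:my-app", note: "all good")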

# File: test/json_response_test.exs
# Repo: joaopealves/json_response (Apache-2.0)

defmodule JsonResponseTest do
use ExUnit.Case
doctest JsonResponse
test "greets the world" do
assert :world == :world
end
end

# File: config/prod.exs
# Repo: baymax42/rex (MIT)

use Mix.Config
# For production, don't forget to configure the url host
# to something meaningful, Phoenix uses this information
# when generating URLs.
#
# Note we also include the path to a cache manifest
# containing the digested version of static files. This
# manifest is generated by the `mix phx.digest` task,
# which you should run after static files are built and
# before starting your production server.
config :rex_web, RexWeb.Endpoint,
url: [host: "example.com", port: 80],
cache_static_manifest: "priv/static/cache_manifest.json"
# ## SSL Support
#
# To get SSL working, you will need to add the `https` key
# to the previous section and set your `:url` port to 443:
#
# config :rex_web, RexWeb.Endpoint,
# ...
# url: [host: "example.com", port: 443],
# https: [
# :inet6,
# port: 443,
# cipher_suite: :strong,
# keyfile: System.get_env("SOME_APP_SSL_KEY_PATH"),
# certfile: System.get_env("SOME_APP_SSL_CERT_PATH")
# ]
#
# The `cipher_suite` is set to `:strong` to support only the
# latest and more secure SSL ciphers. This means old browsers
# and clients may not be supported. You can set it to
# `:compatible` for wider support.
#
# `:keyfile` and `:certfile` expect an absolute path to the key
# and cert in disk or a relative path inside priv, for example
# "priv/ssl/server.key". For all supported SSL configuration
# options, see https://hexdocs.pm/plug/Plug.SSL.html#configure/1
#
# We also recommend setting `force_ssl` in your endpoint, ensuring
# no data is ever sent via http, always redirecting to https:
#
# config :rex_web, RexWeb.Endpoint,
# force_ssl: [hsts: true]
#
# Check `Plug.SSL` for all available options in `force_ssl`.
# Do not print debug messages in production
config :logger, level: :info
# Finally import the config/prod.secret.exs which loads secrets
# and configuration from environment variables.
import_config "prod.secret.exs"

# File: xarb/config/test.exs
# Repo: Erik-joh/examensarbete (MIT)

use Mix.Config
# Only in tests, remove the complexity from the password hashing algorithm
config :bcrypt_elixir, :log_rounds, 1
# Configure your database
#
# The MIX_TEST_PARTITION environment variable can be used
# to provide built-in test partitioning in CI environment.
# Run `mix help test` for more information.
config :xarb, Xarb.Repo,
username: "postgres",
password: "J9pth6Klm",
database: "xarb_test#{System.get_env("MIX_TEST_PARTITION")}",
hostname: "localhost",
pool: Ecto.Adapters.SQL.Sandbox
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :xarb, XarbWeb.Endpoint,
http: [port: 4002],
server: false
# Print only warnings and errors during test
config :logger, level: :warn

# File: lib/fatex_web/endpoint.ex
# Repo: vinicius-molina/FaTex (Apache-2.0)

defmodule FatexWeb.Endpoint do
use Phoenix.Endpoint, otp_app: :fatex
socket "/socket", FatexWeb.UserSocket,
websocket: true,
longpoll: false
socket "/live", Phoenix.LiveView.Socket
# Serve at "/" the static files from "priv/static" directory.
#
# You should set gzip to true if you are running phx.digest
# when deploying your static files in production.
plug Plug.Static,
at: "/",
from: :fatex,
gzip: false,
only: ~w(css fonts js favicon.ico robots.txt particles.json)
# Code reloading can be explicitly enabled under the
# :code_reloader configuration of your endpoint.
if code_reloading? do
socket "/phoenix/live_reload/socket", Phoenix.LiveReloader.Socket
plug Phoenix.LiveReloader
plug Phoenix.CodeReloader
end
plug Plug.RequestId
plug Plug.Telemetry, event_prefix: [:phoenix, :endpoint]
plug Plug.Parsers,
parsers: [:urlencoded, :multipart, :json],
pass: ["*/*"],
json_decoder: Phoenix.json_library()
plug Plug.MethodOverride
plug Plug.Head
# The session will be stored in the cookie and signed,
# this means its contents can be read but not tampered with.
# Set :encryption_salt if you would also like to encrypt it.
plug Plug.Session,
store: :cookie,
key: "_fatex_key",
signing_salt: "7pG1hBmO"
plug FatexWeb.Router
end

# File: mix.exs
# Repo: h4cc/elixir-badges (MIT)

defmodule ElixirBadges.Mixfile do
use Mix.Project
def project do
[app: :elixir_badges,
version: "0.0.1",
elixir: "~> 1.0",
elixirc_paths: elixirc_paths(Mix.env),
compilers: [:phoenix] ++ Mix.compilers,
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps]
end
# Configuration for the OTP application
#
# Type `mix help compile.app` for more information
def application do
[mod: {ElixirBadges, []},
applications: [:phoenix, :cowboy, :logger, :ecto, :httpotion]]
end
# Specifies which paths to compile per environment
defp elixirc_paths(:test), do: ["lib", "web", "test/support"]
defp elixirc_paths(_), do: ["lib", "web"]
# Specifies your project dependencies
#
# Type `mix help deps` for examples and options
defp deps do
[
{:phoenix, "~> 0.11"},
{:phoenix_ecto, "~> 0.3"},
{:postgrex, ">= 0.0.0"},
{:phoenix_live_reload, "~> 0.3"},
{:cowboy, "~> 1.0"},
{:ibrowse, github: "cmullaparthi/ibrowse", tag: "v4.1.1"},
{:httpotion, "~> 2.0"},
{:poison, "~> 1.4"}
]
end
end

# File: mix.exs
# Repo: xshadowlegendx/commanded-extreme-adapter (MIT)

defmodule Commanded.EventStore.Adapters.Extreme.Mixfile do
use Mix.Project
@version "1.1.0"
def project do
[
app: :commanded_extreme_adapter,
version: @version,
elixir: "~> 1.6",
elixirc_paths: elixirc_paths(Mix.env()),
consolidate_protocols: Mix.env() != :test,
description: description(),
docs: docs(),
package: package(),
build_embedded: Mix.env() == :prod,
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
def application do
[
extra_applications: [:logger]
]
end
defp elixirc_paths(:test),
do: [
"deps/commanded/test/event_store",
"deps/commanded/test/support",
"lib",
"test/support"
]
defp elixirc_paths(_), do: ["lib"]
defp deps do
[
{:commanded, git: "https://github.com/xshadowlegendx/commanded.git"},
{:extreme, "~> 0.13"},
# Optional dependencies
{:jason, "~> 1.2", optional: true},
# Test & build tooling
{:ex_doc, "~> 0.21", only: :dev},
{:mox, "~> 1.0", only: :test}
]
end
defp description do
"""
Extreme event store adapter for Commanded
"""
end
defp docs do
[
main: "Commanded.EventStore.Adapters.Extreme",
canonical: "http://hexdocs.pm/commanded_extreme_adapter",
source_ref: "v#{@version}",
extra_section: "GUIDES",
extras: [
"CHANGELOG.md",
"guides/Getting Started.md": [filename: "getting-started", title: "Extreme adapter"],
"guides/Testing.md": [title: "Testing"]
]
]
end
defp package do
[
files: [
"lib",
"mix.exs",
"README*",
"LICENSE*"
],
maintainers: ["Ben Smith"],
licenses: ["MIT"],
links: %{
"GitHub" => "https://github.com/commanded/commanded-extreme-adapter"
}
]
end
end

# File: lib/tarearbol/jobs/errand.ex
# Repo: am-kantox/tarearbol (MIT)

defmodule Tarearbol.Errand do
@moduledoc false
use Boundary, deps: [Tarearbol.Application, Tarearbol.Job, Tarearbol.Scheduler, Tarearbol.Utils]
require Logger
@typep job :: (() -> any()) | {module(), atom()}
@default_opts [
repeatedly: false
]
@msecs_per_day 1_000 * 60 * 60 * 24
@doc """
  Runs the task once after the given interval, or repeatedly when the
  `:repeatedly` option is set.
"""
@spec run_in(job(), Tarearbol.Utils.interval(), keyword()) :: :ok | Task.t()
def run_in(job, interval, opts \\ opts()) do
type =
if is_function(job, 0) and Function.info(job, :type) == {:type, :local},
do: :local,
else: :external
do_run_in(type, job, interval, opts)
end
@spec do_run_in(:local | :external, job(), Tarearbol.Utils.interval(), keyword()) ::
:ok | Task.t()
defp do_run_in(:external, job, interval, opts) do
{name, _opts} = Keyword.pop(opts, :name, Tarearbol.Utils.random_module_name("Job"))
Tarearbol.Scheduler.push(
name,
job,
Tarearbol.Utils.interval(interval, value: 0)
)
end
@deprecated "Use external function or `{m, f}` tuple as the job instead"
defp do_run_in(:local, job, interval, opts) do
Logger.warn(
"[DEPRECATED] spawning local functions is deprecated; use external function or `{m, f}` tuple as the job instead"
)
Tarearbol.Application.task!(fn ->
waiting_time = Tarearbol.Utils.interval(interval, value: 0)
Process.put(:job, {job, Tarearbol.Utils.add_interval(interval), opts})
Process.sleep(waiting_time)
result = Tarearbol.Job.ensure(job, opts)
Process.delete(:job)
cond do
opts[:next_run] -> run_at(job, opts[:next_run], opts)
opts[:repeatedly] -> run_in(job, interval, opts)
true -> result
end
end)
end
@doc """
Runs the task either once at the specified `%DateTime{}` or repeatedly
at the specified `%Time{}`.
"""
@spec run_at(Tarearbol.Job.job(), %DateTime{}, keyword()) :: :ok | Task.t()
def run_at(job, at, opts \\ opts())
def run_at(job, %DateTime{} = at, opts) do
interval = DateTime.diff(at, DateTime.utc_now(), :millisecond)
run_in(job, interval, run_in_opts(opts))
end
def run_at(job, %Time{} = at, opts) do
next =
case Time.diff(at, Time.utc_now(), :millisecond) do
# tomorrow at that time
msec when msec <= 0 ->
msec + @msecs_per_day
msec ->
msec
end
opts =
opts
|> Keyword.put_new(:next_run, at)
|> run_in_opts()
run_in(job, next, opts)
end
def run_at(job, at, opts) when is_binary(at),
do: run_at(job, DateTime.from_iso8601(at), opts)
@doc "Spawns the task by calling `run_in` with a zero interval"
@spec spawn((() -> any()) | {module(), atom(), list()}, keyword()) :: :ok | Task.t()
def spawn(job, opts \\ opts()), do: run_in(job, :none, opts)
##############################################################################
@spec opts() :: keyword()
defp opts, do: Application.get_env(:tarearbol, :default_scheduler_options, @default_opts)
@spec run_in_opts(keyword()) :: keyword()
defp run_in_opts(opts), do: Keyword.delete(opts, :repeatedly)
end
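A brief usage sketch of the scheduling API above. The enclosing module's name lies outside this excerpt, so `Tarearbol` is assumed below, and the jobs use the `{m, f}` tuple form recommended by the deprecation notice (`MyApp.Notifier` and `MyApp.Backup` are hypothetical):

```elixir
# Run a job once, five seconds from now:
Tarearbol.run_in({MyApp.Notifier, :ping}, 5_000)

# Run once at a fixed instant, or repeatedly at a fixed time of day;
# run_at/3 turns either value into a millisecond interval for run_in/3.
Tarearbol.run_at({MyApp.Backup, :run}, ~U[2024-01-01 03:30:00Z])
Tarearbol.run_at({MyApp.Backup, :run}, ~T[03:30:00])

# spawn/2 is run_in/3 with a zero interval:
Tarearbol.spawn({MyApp.Notifier, :ping})
```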
| 29.825688 | 119 | 0.610274 |
03ed51f49142cc958452a79dcc14755b7118e18b | 3,633 | ex | Elixir | lib/shopifex/plug/shopify_session.ex | smantzavinos/shopifex | 744d732f9237cdea1773056c412f3524e1f5ca4f | [
"Apache-2.0"
] | null | null | null | lib/shopifex/plug/shopify_session.ex | smantzavinos/shopifex | 744d732f9237cdea1773056c412f3524e1f5ca4f | [
"Apache-2.0"
] | null | null | null | lib/shopifex/plug/shopify_session.ex | smantzavinos/shopifex | 744d732f9237cdea1773056c412f3524e1f5ca4f | [
"Apache-2.0"
] | null | null | null | defmodule Shopifex.Plug.ShopifySession do
import Plug.Conn
import Phoenix.Controller
require Logger
def init(options) do
# initialize options
options
end
def call(conn, _) do
token = get_token_from_conn(conn) || Guardian.Plug.current_token(conn)
case Shopifex.Guardian.resource_from_token(token) do
{:ok, shop, claims} ->
locale = get_locale(conn, claims)
put_shop_in_session(conn, shop, locale)
_ ->
initiate_new_session(conn)
end
end
defp get_token_from_conn(%Plug.Conn{params: %{"token" => token}}), do: token
defp get_token_from_conn(_), do: nil
defp get_locale(%Plug.Conn{params: %{"locale" => locale}}, _token_claims), do: locale
defp get_locale(_conn, token_claims),
do: Map.get(token_claims, "loc", Application.get_env(:shopifex, :default_locale, "en"))
defp initiate_new_session(conn = %{params: %{"hmac" => hmac}}) do
hmac = String.upcase(hmac)
query_string =
String.split(conn.query_string, "&")
|> Enum.map(fn query ->
[key, value] = String.split(query, "=")
{key, value}
end)
|> Enum.filter(fn {key, _} ->
key != "hmac"
end)
|> Enum.map(fn {key, value} ->
"#{key}=#{value}"
end)
|> Enum.join("&")
our_hmac =
:crypto.hmac(
:sha256,
Application.fetch_env!(:shopifex, :secret),
query_string
)
|> Base.encode16()
if our_hmac == hmac do
conn
|> do_new_session()
else
Logger.info("Invalid HMAC, expected #{our_hmac}")
respond_invalid(conn)
end
end
defp initiate_new_session(conn), do: respond_invalid(conn)
defp do_new_session(conn = %{params: %{"shop" => shop_url} = params}) do
case Shopifex.Shops.get_shop_by_url(shop_url) do
nil ->
redirect_to_install(conn, shop_url)
shop ->
put_shop_in_session(
conn,
shop,
Map.get(params, "locale", Application.get_env(:shopifex, :default_locale, "en"))
)
end
end
defp redirect_to_install(conn, shop_url) do
Logger.info("Initiating shop installation for #{shop_url}")
install_url =
"https://#{shop_url}/admin/oauth/authorize?client_id=#{
Application.fetch_env!(:shopifex, :api_key)
}&scope=#{Application.fetch_env!(:shopifex, :scopes)}&redirect_uri=#{
Application.fetch_env!(:shopifex, :redirect_uri)
}"
conn
|> redirect(external: install_url)
|> halt()
end
def put_shop_in_session(conn, shop, locale \\ "en") do
Gettext.put_locale(locale)
# Create a new token right away for the next request
{:ok, token, claims} = Shopifex.Guardian.encode_and_sign(shop, %{"loc" => locale})
conn
|> Guardian.Plug.put_current_resource(shop)
|> Guardian.Plug.put_current_claims(claims)
|> Guardian.Plug.put_current_token(token)
|> Plug.Conn.put_private(:shop_url, shop.url)
|> Plug.Conn.put_private(:shop, shop)
|> Plug.Conn.put_private(:locale, locale)
end
defp respond_invalid(conn) do
conn
|> put_view(ShopifexWeb.AuthView)
|> put_layout({ShopifexWeb.LayoutView, "app.html"})
|> render("select-store.html")
|> halt()
end
@doc """
Returns the token for the current session in a plug which has
passed through the `:shopify_session` pipeline, or the
`Shopifex.Plug.ShopifySession` plug.
## Example
iex> session_token(conn)
"header.payload.signature"
"""
@spec session_token(Plug.Conn.t()) :: Guardian.Token.token() | nil
def session_token(conn), do: Guardian.Plug.current_token(conn)
end
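The HMAC verification performed in `initiate_new_session/1` can be reproduced standalone. Note that `:crypto.hmac/3`, used above, was removed in OTP 24; `:crypto.mac/4` is the modern equivalent. The secret and query string below are made-up values for illustration:

```elixir
secret = "my-shared-secret"   # hypothetical; the plug reads it from app config
query_string = "shop=example.myshopify.com&timestamp=1577836800"

# OTP <= 23 form (as in the plug above):
#   our_hmac = :crypto.hmac(:sha256, secret, query_string) |> Base.encode16()

# OTP 24+ equivalent:
our_hmac =
  :crypto.mac(:hmac, :sha256, secret, query_string)
  |> Base.encode16()

# The request is accepted when our_hmac == String.upcase(hmac_param),
# where hmac_param is the "hmac" query parameter stripped from the
# query string before signing.
```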
| 27.11194 | 91 | 0.638591 |
03eda113c4a2a05da03c642953a5ffd7e7beb89b | 11,376 | exs | Elixir | test/absinthe/strict_schema_test.exs | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 4,101 | 2016-03-02T03:49:20.000Z | 2022-03-31T05:46:01.000Z | test/absinthe/strict_schema_test.exs | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 889 | 2016-03-02T16:06:59.000Z | 2022-03-31T20:24:12.000Z | test/absinthe/strict_schema_test.exs | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 564 | 2016-03-02T07:49:59.000Z | 2022-03-06T14:40:59.000Z | defmodule Absinthe.StrictSchemaTest do
use Absinthe.Case, async: true
describe "directive strict adapter" do
test "can use camelcase external name" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @fooBarDirective(bazQux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @foo_bar_directive(bazQux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Unknown directive `foo_bar_directive'.",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used in argument" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @fooBarDirective(baz_qux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Unknown argument \"baz_qux\" on directive \"@fooBarDirective\".",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
end
describe "directive non-strict adapter" do
test "can use camelcase external name" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @fooBarDirective(bazQux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @foo_bar_directive(bazQux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name in argument" do
document = """
query ($input: FooBarInput!) {
fooBarQuery @fooBarDirective(baz_qux: $input) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
end
describe "query strict adapter" do
test "can use camelcase external name" do
document = """
query ($input: FooBarInput!) {
fooBarQuery(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used" do
document = """
query ($input: FooBarInput!) {
foo_bar_query(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Cannot query field \"foo_bar_query\" on type \"RootQueryType\". Did you mean to use an inline fragment on \"RootQueryType\"?",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used in argument" do
document = """
query ($input: FooBarInput!) {
fooBarQuery(baz_qux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Unknown argument \"baz_qux\" on field \"fooBarQuery\" of type \"RootQueryType\".",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
end
describe "query non-strict adapter" do
test "can use camelcase external name" do
document = """
query ($input: FooBarInput!) {
fooBarQuery(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name" do
document = """
query ($input: FooBarInput!) {
foo_bar_query(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naive_datetime
}
}
"""
variables = %{"input" => %{"naive_datetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"foo_bar_query" => %{"naive_datetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name in argument" do
document = """
query ($input: FooBarInput!) {
fooBarQuery(baz_qux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarQuery" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
end
describe "mutation strict adapter" do
test "can use camelcase external name" do
document = """
mutation ($input: FooBarInput!) {
fooBarMutation(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarMutation" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used" do
document = """
mutation ($input: FooBarInput!) {
foo_bar_mutation(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Cannot query field \"foo_bar_mutation\" on type \"RootMutationType\". Did you mean to use an inline fragment on \"RootMutationType\"?",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
test "returns an error when underscore external name used in argument" do
document = """
mutation ($input: FooBarInput!) {
fooBarMutation(baz_qux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_error_message(
"Unknown argument \"baz_qux\" on field \"fooBarMutation\" of type \"RootMutationType\".",
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.StrictLanguageConventions,
variables: variables
)
)
end
end
describe "mutation non-strict adapter" do
test "can use camelcase external name" do
document = """
mutation ($input: FooBarInput!) {
fooBarMutation(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarMutation" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name" do
document = """
mutation ($input: FooBarInput!) {
foo_bar_mutation(bazQux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naive_datetime
}
}
"""
variables = %{"input" => %{"naive_datetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"foo_bar_mutation" => %{"naive_datetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
test "can use underscore external name in argument" do
document = """
mutation ($input: FooBarInput!) {
fooBarMutation(baz_qux: $input) @fooBarDirective(bazQux: {naiveDatetime: "2017-01-27T20:31:56"}) {
naiveDatetime
}
}
"""
variables = %{"input" => %{"naiveDatetime" => "2017-01-27T20:31:55"}}
assert_data(
%{"fooBarMutation" => %{"naiveDatetime" => "2017-01-27T20:31:55"}},
run(document, Absinthe.Fixtures.StrictSchema,
adapter: Absinthe.Adapter.LanguageConventions,
variables: variables
)
)
end
end
end
| 30.255319 | 144 | 0.587992 |
03edce0bd4216a5cb029e0aea1a82cce49e73aa9 | 1,274 | exs | Elixir | .check.exs | clayscode/exzeitable | 312e7a0aebda51f5cd3ccee800c1d1affd9d4248 | [
"MIT"
] | 1 | 2021-10-18T00:55:47.000Z | 2021-10-18T00:55:47.000Z | .check.exs | EnzymeCorp/exzeitable | 74c02ab82de56aaf150006f05836a65a7d47697f | [
"MIT"
] | null | null | null | .check.exs | EnzymeCorp/exzeitable | 74c02ab82de56aaf150006f05836a65a7d47697f | [
"MIT"
] | null | null | null | [
## all available options with default values (see `mix check` docs for description)
# skipped: true,
# exit_status: true,
# parallel: true,
## list of tools (see `mix check` docs for defaults)
tools: [
## curated tools may be disabled (e.g. the check for compilation warnings)
# {:compiler, false},
## ...or adjusted (e.g. use one-line formatter for more compact credo output)
# {:credo, command: "mix credo --format oneline"},
## custom new tools may be added (mix tasks or arbitrary commands)
# {:my_mix_check, command: "mix release", env: %{"MIX_ENV" => "prod"}},
# {:my_arbitrary_check, command: "npm test", cd: "assets"},
{:cypress, command: "mix cypress.run", deps: [{:credo, status: :ok}]},
{:dialyzer, deps: [{:credo, status: :ok}]},
{:ex_doc, deps: [{:credo, status: :ok}]},
{:npm_test, false},
    {:ex_coveralls,
     command: "mix coveralls.html",
     require_files: ["test/test_helper.exs"],
     env: %{"MIX_ENV" => "test"},
     deps: [{:credo, status: :ok}]},
{:credo, command: "mix credo --strict"},
{:ex_unit, command: "mix test", env: %{"MIX_ENV" => "test"}, deps: [{:credo, status: :ok}]}
# {:my_arbitrary_script, command: ["my_script", "argument with spaces"], cd: "scripts"}
]
]
| 38.606061 | 95 | 0.612245 |
03edddf65e72a637222294b959127c61344c45ea | 2,347 | ex | Elixir | lib/day4/security_through_obscurity.ex | fboyer/advent_of_code_2016 | ffe02f093298fa60a5547dc6a9391c1acfb2b325 | [
"MIT"
] | null | null | null | lib/day4/security_through_obscurity.ex | fboyer/advent_of_code_2016 | ffe02f093298fa60a5547dc6a9391c1acfb2b325 | [
"MIT"
] | null | null | null | lib/day4/security_through_obscurity.ex | fboyer/advent_of_code_2016 | ffe02f093298fa60a5547dc6a9391c1acfb2b325 | [
"MIT"
] | null | null | null | defmodule AdventOfCode2016.SecurityThroughObscurity do
@number_of_letters_in_alphabet 26
@convert_between_ascii_and_alphabet 96
def sum_valid_rooms_sector_ids(data) do
data
|> String.split("\n", trim: true)
|> Stream.map(&parse_room/1)
|> Stream.filter(&is_valid?/1)
|> Stream.map(fn({_encrypted_name, sector_id, _checksum}) -> sector_id end)
|> Enum.sum
end
def parse_room(room) do
[encrypted_name, sector_id, checksum] = Regex.run(~r/(.+)-(\d+)\[(.+)\]/, room, capture: :all_but_first)
{encrypted_name, String.to_integer(sector_id), checksum}
end
def is_valid?({encrypted_name, _sector_id, checksum}) do
calculate_checksum(encrypted_name) == checksum
end
def calculate_checksum(encrypted_name) do
encrypted_name
|> String.replace("-", "")
|> String.graphemes
|> Enum.group_by(&(&1))
|> Enum.sort_by(fn({k, v}) -> {k, length(v)} end, fn({k1, v1}, {k2, v2}) ->
cond do
v1 < v2 -> false
v1 == v2 && k1 > k2 -> false
true -> true
end
end)
|> Stream.map(fn({k, _v}) -> k end)
|> Stream.take(5)
|> Enum.join
end
def find_real_room_name(data, search_term) do
match =
data
|> String.split("\n", trim: true)
|> Stream.map(&parse_room/1)
|> Stream.filter(&is_valid?/1)
|> Stream.map(&decipher_encrypted_name/1)
|> Enum.find(fn({deciphered_name, _sector_id, _checksum}) ->
String.contains?(deciphered_name, search_term)
end)
case match do
nil -> nil
_ -> elem(match, 1)
end
end
def decipher_encrypted_name({encrypted_name, sector_id, checksum}) do
deciphered_name =
encrypted_name
|> String.split("-")
|> Stream.map(&String.to_charlist/1)
|> Stream.map(fn(letters) -> rotate_letters_by_sector_id(letters, sector_id) end)
|> Stream.map(&to_string/1)
|> Enum.join(" ")
{deciphered_name, sector_id, checksum}
end
def rotate_letters_by_sector_id(letters, sector_id) do
letters
|> Stream.map(fn(letter) -> letter - @convert_between_ascii_and_alphabet end)
|> Stream.map(fn(letter) -> letter + sector_id end)
|> Stream.map(fn(letter) -> rem(letter, @number_of_letters_in_alphabet) end)
|> Enum.map(fn(letter) -> letter + @convert_between_ascii_and_alphabet end)
end
end
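The worked examples from the 2016 puzzle text exercise both halves of the module above:

```elixir
alias AdventOfCode2016.SecurityThroughObscurity, as: Rooms

room = Rooms.parse_room("aaaaa-bbb-z-y-x-123[abxyz]")
# => {"aaaaa-bbb-z-y-x", 123, "abxyz"}

Rooms.is_valid?(room)
# => true — five a's, three b's, then x/y/z tied and broken alphabetically

Rooms.decipher_encrypted_name({"qzmt-zixmtkozy-ivhz", 343, "zimth"})
# => {"very encrypted name", 343, "zimth"}
```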
| 30.089744 | 108 | 0.640392 |
03ee11837a6032aede2c7256072ce1f0dbbab4b3 | 2,394 | exs | Elixir | languages/elixir/exercises/concept/language-list/test/language_list_test.exs | AlexLeSang/v3 | 3d35961a961b5a2129b1d42f1d118972d9665357 | [
"MIT"
] | 3 | 2020-07-25T06:24:00.000Z | 2020-09-14T17:39:11.000Z | languages/elixir/exercises/concept/language-list/test/language_list_test.exs | AlexLeSang/v3 | 3d35961a961b5a2129b1d42f1d118972d9665357 | [
"MIT"
] | 45 | 2020-01-24T17:04:52.000Z | 2020-11-24T17:50:18.000Z | languages/elixir/exercises/concept/language-list/test/language_list_test.exs | AlexLeSang/v3 | 3d35961a961b5a2129b1d42f1d118972d9665357 | [
"MIT"
] | 1 | 2020-04-20T11:41:55.000Z | 2020-04-20T11:41:55.000Z | defmodule LanguageListTest do
use ExUnit.Case
alias LanguageList, as: LL
@languages ~w/Clojure Haskell Erlang F# Elixir/
@languages_reversed Enum.reverse(@languages)
# @tag :pending
test "new list" do
assert LL.new() == []
end
@tag :pending
test "count an empty list" do
assert LL.new() |> LL.count() == 0
end
@tag :pending
test "add a language to a list" do
language = "Elixir"
list = [language]
assert LL.new() |> LL.add(language) == list
end
@tag :pending
test "add several languages to a list" do
assert Enum.reduce(@languages, LL.new(), &LL.add(&2, &1)) == @languages_reversed
end
@tag :pending
test "remove on an empty list results in error" do
assert_raise ArgumentError, fn -> LL.new() |> LL.remove() end
end
@tag :pending
test "add then remove results in empty list" do
list =
LL.new()
|> LL.add("Elixir")
|> LL.remove()
assert list == []
end
@tag :pending
test "adding two languages, when removed, removes first item" do
list =
LL.new()
|> LL.add("F#")
|> LL.add("Elixir")
|> LL.remove()
    assert list == ~w/F#/
end
@tag :pending
test "first on an empty list raises an error" do
assert_raise ArgumentError, fn -> LL.new() |> LL.first() end
end
@tag :pending
test "add one language, then get the first" do
assert LL.new() |> LL.add("Elixir") |> LL.first() == "Elixir"
end
@tag :pending
test "add a few languages, then get the first" do
first =
LL.new()
|> LL.add("Elixir")
|> LL.add("Prolog")
|> LL.add("F#")
|> LL.first()
assert first == "F#"
end
@tag :pending
test "the count of a new list is 0" do
assert LL.new() |> LL.count() == 0
end
@tag :pending
test "the count of a one-language list is 1" do
count =
LL.new()
|> LL.add("Elixir")
|> LL.count()
assert count == 1
end
@tag :pending
test "the count of a multiple-item list is equal to its length" do
count =
@languages
|> Enum.reduce(LL.new(), &LL.add(&2, &1))
|> LL.count()
assert count == length(@languages)
end
@tag :pending
test "an exciting language list" do
assert LL.exciting_list?(@languages)
end
@tag :pending
test "not an exciting language list" do
refute LL.exciting_list?(~w/Java C JavaScript/)
end
end
| 20.817391 | 84 | 0.596909 |
03ee1d7351d255d5c2630a46be9e5a1a930453a2 | 1,370 | ex | Elixir | lib/cloak/otp/shadowsocks/shadowsocks.ex | roylez/cloak | 9205458be9055e7a93c7b2960824a28c9ad3ed87 | [
"MIT"
] | 51 | 2020-07-24T06:09:22.000Z | 2021-12-16T16:22:59.000Z | lib/cloak/otp/shadowsocks/shadowsocks.ex | roylez/cloak | 9205458be9055e7a93c7b2960824a28c9ad3ed87 | [
"MIT"
] | null | null | null | lib/cloak/otp/shadowsocks/shadowsocks.ex | roylez/cloak | 9205458be9055e7a93c7b2960824a28c9ad3ed87 | [
"MIT"
] | 15 | 2020-07-31T00:34:29.000Z | 2021-09-26T03:44:09.000Z | defmodule Cloak.Shadowsocks do
use DynamicSupervisor
import Cloak.Registry
@moduledoc """
Module to manage ports used
"""
@type account() :: Cloak.Account.t()
def start_link(_) do
DynamicSupervisor.start_link(__MODULE__, nil, name: __MODULE__)
end
def init(_) do
DynamicSupervisor.init(strategy: :one_for_one)
end
@doc """
Start a new port or reload a already start port
"""
@spec start_worker( account() ) :: { :ok, pid() } | { :error, term() }
def start_worker( %{ port: _, method: _, passwd: _ }=account ) do
DynamicSupervisor.start_child( __MODULE__, { Cloak.Shadowsocks.Worker, account })
end
def start_worker(_), do: nil
@doc """
Stops a port
"""
@spec stop_worker( integer() | account() | pid() ) :: :ok | { :error, :not_found }
def stop_worker( port ) when is_integer(port) do
case where({:worker, port}) do
nil -> :ok
pid ->
DynamicSupervisor.terminate_child( __MODULE__, pid )
end
end
def stop_worker( %{ port: port } ), do: stop_worker( port )
def stop_worker( pid ) when is_pid(pid), do: DynamicSupervisor.terminate_child( __MODULE__, pid )
def stop_worker( _ ), do: nil
@doc """
Returns currently opened port count
"""
@spec count_workers() :: integer()
def count_workers do
DynamicSupervisor.which_children(__MODULE__) |> length()
end
end
| 25.849057 | 99 | 0.659854 |
03ee3123cba73d979b0c0be4272f9c775889ead0 | 425 | exs | Elixir | 13.OrganizeProjects/op6/noaa/test/noaa_data_test.exs | kenspirit/programming-elixir-exercises | 35ca1f0cb17ca4040b395c2f9cc53e51918c99ca | [
"MIT"
] | 6 | 2019-10-25T21:51:11.000Z | 2022-03-23T02:11:38.000Z | 13.OrganizeProjects/op6/noaa/test/noaa_data_test.exs | kenspirit/programming-elixir-exercises | 35ca1f0cb17ca4040b395c2f9cc53e51918c99ca | [
"MIT"
] | null | null | null | 13.OrganizeProjects/op6/noaa/test/noaa_data_test.exs | kenspirit/programming-elixir-exercises | 35ca1f0cb17ca4040b395c2f9cc53e51918c99ca | [
"MIT"
] | 1 | 2020-06-16T12:37:43.000Z | 2020-06-16T12:37:43.000Z | defmodule NoaaDataTest do
use ExUnit.Case
doctest Noaa.NoaaData
import Noaa.NoaaData, only: [ extract_field: 2, parse_body: 2 ]
test "XML field data extraction" do
assert extract_field("<root><a>aa</a><b>bb</b></root>", "b") == %{ "b" => "bb" }
end
test "XML body extraction" do
assert parse_body("<root><a>aa</a><b>bb</b><c>cc</c></root>", ["a", "c"]) ==
%{ "a" => "aa", "c" => "cc" }
end
end
| 26.5625 | 84 | 0.576471 |
03ee7210e3a562c7c839c42c2cf4170a2a57ebdf | 4,167 | ex | Elixir | lib/mix/phoenix_ts_interface/compiler.ex | Ziinc/phoenix_ts_interface | bde31c6a348301907eaf5a264030ed9b45a25f43 | [
"MIT"
] | null | null | null | lib/mix/phoenix_ts_interface/compiler.ex | Ziinc/phoenix_ts_interface | bde31c6a348301907eaf5a264030ed9b45a25f43 | [
"MIT"
] | 1 | 2020-04-22T15:10:28.000Z | 2020-04-22T15:10:28.000Z | lib/mix/phoenix_ts_interface/compiler.ex | Ziinc/phoenix_ts_interface | bde31c6a348301907eaf5a264030ed9b45a25f43 | [
"MIT"
] | null | null | null | defmodule Mix.Compilers.Phoenix.TsInterface do
@moduledoc false
@manifest_vsn :v1
@doc """
This compiler works a little bit different than the usual in the sense that it does not compile sources,
it compiles Router modules into typescript files instead.
The big difference is that the ```source files``` are in fact ```source modules```
and to decide if an input is stale or not the compiler uses the module's md5 hash.
Also, considering that the compiler supports configuring the output folder, it
removes dests that does not have dests (I find it confusing too...)
For instance, if the output folder was configured to `web/static/js` the manifest
would have and entry pointing to this folder. If, after that, the user changes the
outuput folder, there will be no more source pointing to this output, so it will
be removed.
"""
def compile(manifest, mappings, force, callback) do
entries = read_manifest(manifest)
stale =
Enum.filter(mappings, fn {module, dest} ->
entry = find_manifest_entry(entries, module)
force || stale?(module, entry) || output_changed?(dest, entry)
end)
# Files to remove are the ones where the output in the mappings is diferent from
# the output in the entries
files_to_remove =
Enum.filter(entries, fn {module, _, dest} ->
Enum.any?(mappings, fn {mapping_module, mapping_out} ->
mapping_module == module && mapping_out != dest
end)
end)
# Entries to remove are the ones that are in the manifest but not in the mappings
entries_to_remove =
Enum.reject(entries, fn {module, _, _} ->
Enum.any?(mappings, fn {mapping_module, _} ->
mapping_module == module
end)
end)
compile(manifest, entries, stale, entries_to_remove, files_to_remove, callback)
end
defp compile(manifest, entries, stale, entries_to_remove, files_to_remove, callback) do
if stale == [] && entries_to_remove == [] && files_to_remove == [] do
:noop
else
Mix.Project.ensure_structure()
Enum.each(entries_to_remove ++ files_to_remove, &File.rm(elem(&1, 2)))
# Compile stale files and print the results
results =
for {module, output} <- stale do
log_result(output, callback.(module, output))
end
# New entries are the ones in the stale array
new? = fn {module, _, _} -> Enum.any?(stale, &(elem(&1, 0) == module)) end
entries = (entries -- entries_to_remove) |> Enum.filter(&(!new?.(&1)))
entries =
entries ++
Enum.map(stale, fn {module, dest} ->
{module, module.module_info[:md5], dest}
end)
write_manifest(manifest, :lists.usort(entries))
if :error in results do
Mix.raise("Encountered compilation errors")
end
:ok
end
end
@doc """
Cleans up compilation artifacts.
"""
def clean(manifest) do
read_manifest(manifest)
|> Enum.each(fn {_, _, output} -> File.rm(output) end)
end
defp stale?(_, {nil, nil, nil}), do: true
defp stale?(module, {_, hash, _}) do
module.module_info[:md5] != hash
end
defp output_changed?(dest, {_, _, manifest_dest}) do
dest != manifest_dest
end
defp find_manifest_entry(manifest_entries, module) do
Enum.find(manifest_entries, {nil, nil, nil}, &(elem(&1, 0) == module))
end
def read_manifest(manifest) do
case File.read(manifest) do
{:error, _} ->
[]
{:ok, content} ->
:erlang.binary_to_term(content) |> parse_manifest
end
end
defp parse_manifest({@manifest_vsn, entries}), do: entries
defp parse_manifest({version, _}) do
Mix.raise("Unsupported manifest version (#{version})")
end
defp write_manifest(manifest, entries) do
content = {@manifest_vsn, entries} |> :erlang.term_to_binary()
File.write(manifest, content)
end
defp log_result(output, result) do
case result do
:ok ->
Mix.shell().info("Generated #{output}")
:ok
{:error, error} ->
Mix.shell().info("Error generating #{output}\n#{inspect(error)}")
:error
end
end
end
| 29.553191 | 106 | 0.646748 |
03ee94f70f756bf0ebbacb4f9cc91379f9606f0c | 1,463 | ex | Elixir | lib/lv_template_web/router.ex | ustrajunior/lv_template | 633c85d8c5810a130bbf24077845dda49e82ca3f | [
"MIT"
] | null | null | null | lib/lv_template_web/router.ex | ustrajunior/lv_template | 633c85d8c5810a130bbf24077845dda49e82ca3f | [
"MIT"
] | null | null | null | lib/lv_template_web/router.ex | ustrajunior/lv_template | 633c85d8c5810a130bbf24077845dda49e82ca3f | [
"MIT"
] | null | null | null | defmodule LvTemplateWeb.Router do
use LvTemplateWeb, :router
pipeline :browser do
plug :accepts, ["html"]
plug :fetch_session
plug :fetch_live_flash
plug :put_root_layout, {LvTemplateWeb.LayoutView, :root}
plug :protect_from_forgery
plug :put_secure_browser_headers
end
pipeline :api do
plug :accepts, ["json"]
end
scope "/", LvTemplateWeb do
pipe_through :browser
get "/", PageController, :index
end
# Other scopes may use custom stacks.
# scope "/api", LvTemplateWeb do
# pipe_through :api
# end
# Enables LiveDashboard only for development
#
# If you want to use the LiveDashboard in production, you should put
# it behind authentication and allow only admins to access it.
# If your application does not have an admins-only section yet,
# you can use Plug.BasicAuth to set up some basic authentication
# as long as you are also using SSL (which you should anyway).
if Mix.env() in [:dev, :test] do
import Phoenix.LiveDashboard.Router
scope "/" do
pipe_through :browser
live_dashboard "/dashboard", metrics: LvTemplateWeb.Telemetry
end
end
# Enables the Swoosh mailbox preview in development.
#
# Note that preview only shows emails that were sent by the same
# node running the Phoenix server.
if Mix.env() == :dev do
scope "/dev" do
pipe_through :browser
forward "/mailbox", Plug.Swoosh.MailboxPreview
end
end
end
| 25.666667 | 70 | 0.699932 |
03eebd21756b1efcedeed5e08f36d2d89e51fdcf | 1,762 | exs | Elixir | rel/config.exs | mtrudel/beats | 6edf532bc02f2625190cbcb7f99a09fc6f58c5fd | [
"MIT"
] | 37 | 2018-05-19T17:45:46.000Z | 2022-01-18T11:03:36.000Z | rel/config.exs | mtrudel/beats | 6edf532bc02f2625190cbcb7f99a09fc6f58c5fd | [
"MIT"
] | 1 | 2020-10-08T09:53:15.000Z | 2020-10-08T09:53:15.000Z | rel/config.exs | mtrudel/beats | 6edf532bc02f2625190cbcb7f99a09fc6f58c5fd | [
"MIT"
] | 2 | 2018-11-20T18:16:08.000Z | 2021-02-09T15:14:28.000Z | # Import all plugins from `rel/plugins`
# They can then be used by adding `plugin MyPlugin` to
# either an environment, or release definition, where
# `MyPlugin` is the name of the plugin module.
Path.join(["rel", "plugins", "*.exs"])
|> Path.wildcard()
|> Enum.map(&Code.eval_file(&1))
use Mix.Releases.Config,
# This sets the default release built by `mix release`
default_release: :default,
# This sets the default environment used by `mix release`
default_environment: Mix.env()
# For a full list of config options for both releases
# and environments, visit https://hexdocs.pm/distillery/configuration.html
# You may define one or more environments in this file,
# an environment's settings will override those of a release
# when building in that environment, this combination of release
# and environment configuration is called a profile
environment :dev do
# If you are running Phoenix, you should make sure that
# server: true is set and the code reloader is disabled,
# even in dev mode.
# It is recommended that you build with MIX_ENV=prod and pass
# the --env flag to Distillery explicitly if you want to use
# dev mode.
set dev_mode: true
set include_erts: false
set cookie: :"p.VhIq*{?s$LE^yBskgqD!livyy:TU)VRCh$dww|kuICn[hruaRVz/LQ:wk!HE$8"
end
environment :prod do
set include_erts: true
set include_src: false
set cookie: :"h0.`&A{t1LYK}{c?8R1jp(eT6hV7HPmeFr(^q9T)BT8^:RWZ[F4yjym0{%>&$PAo"
end
# You may define one or more releases in this file.
# If you have not set a default release, or selected one
# when running `mix release`, the first release in the file
# will be used by default
release :beats do
set version: current_version(:beats)
set applications: [
:runtime_tools
]
end
| 32.62963 | 81 | 0.733258 |
03eebd4b081e20866e3876cb286362788dc786da | 278 | exs | Elixir | test/built_with_elixir_web/controllers/page_controller_test.exs | ospaarmann/built_with_elixir | 5919107c79f200b2035352c7ef9714f8a8f6ff4c | [
"MIT"
] | 8 | 2018-04-15T19:01:14.000Z | 2018-11-19T16:13:56.000Z | test/built_with_elixir_web/controllers/page_controller_test.exs | ospaarmann/built_with_elixir | 5919107c79f200b2035352c7ef9714f8a8f6ff4c | [
"MIT"
] | 8 | 2018-04-14T03:32:12.000Z | 2018-05-15T04:28:27.000Z | test/built_with_elixir_web/controllers/page_controller_test.exs | ospaarmann/built_with_elixir | 5919107c79f200b2035352c7ef9714f8a8f6ff4c | [
"MIT"
] | 2 | 2021-09-22T13:44:09.000Z | 2021-12-21T14:26:03.000Z | defmodule BuiltWithElixirWeb.PageControllerTest do
use BuiltWithElixirWeb.ConnCase
test "GET /", %{conn: conn} do
conn = get(conn, "/")
assert html_response(conn, 200) =~
"Built With Elixir - A place to discover projects built with Elixir."
end
end
| 25.272727 | 82 | 0.683453 |
03eee0b3160008d41df822d44c20631c7d16a439 | 1,412 | ex | Elixir | lib/handler.ex | faineance/tonic | 24140da47d9ccdfc3dd18b215d712b491157f3ec | [
"MIT"
] | null | null | null | lib/handler.ex | faineance/tonic | 24140da47d9ccdfc3dd18b215d712b491157f3ec | [
"MIT"
] | null | null | null | lib/handler.ex | faineance/tonic | 24140da47d9ccdfc3dd18b215d712b491157f3ec | [
"MIT"
] | null | null | null | defmodule Tonic.Handler do
def init({:tcp, :http}, req, opts) do
{:ok, req, opts}
end
def handle(req, state) do
{method, req} = :cowboy_req.method(req)
{param, req} = :cowboy_req.binding(:filename, req)
{:ok, req} = render(method, param, req)
{:ok, req, state}
end
def render("GET", :undefined, req) do
headers = [{"content-type", "text/html"}]
posts = File.ls! "priv/posts/"
title = "Index"
content = preview_posts posts, ""
body = EEx.eval_file "priv/templates/index.html.eex", [content: content, title: title]
:cowboy_req.reply(200, headers, body, req)
end
def render("GET", param, req) do
headers = [{"content-type", "text/html"}]
{:ok, file} = File.read "priv/posts/" <> param <> ".md"
title = String.capitalize(param)
content = Markdown.to_html(file)
body = EEx.eval_file "priv/templates/index.html.eex", [content: content, title: title]
:cowboy_req.reply(200, headers, body, req)
end
# Base case: an empty post list returns the accumulated index as-is.
def preview_posts([], index), do: index
def preview_posts [h|t], index do
{:ok, article} = File.read "priv/posts/" <> h
preview = String.slice article, 0, 400
html = Markdown.to_html preview
filename = String.slice(h, 0, String.length(h) - 3)
link = EEx.eval_file "priv/themes/link.html.eex", [filename: filename]
preview_posts t, index <> html <> link
end
def terminate(_reason, _req, _state) do
:ok
end
end
| 32.090909 | 90 | 0.631728 |
03eee81cbe640c415ba3137005065285e1f2a288 | 1,474 | ex | Elixir | clients/docs/lib/google_api/docs/v1/model/embedded_drawing_properties_suggestion_state.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/docs/lib/google_api/docs/v1/model/embedded_drawing_properties_suggestion_state.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/docs/lib/google_api/docs/v1/model/embedded_drawing_properties_suggestion_state.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Docs.V1.Model.EmbeddedDrawingPropertiesSuggestionState do
@moduledoc """
A mask that indicates which of the fields on the base
EmbeddedDrawingProperties
have been changed in this suggestion. For any field set to true, there is a
new suggested value.
## Attributes
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{}
end
defimpl Poison.Decoder, for: GoogleApi.Docs.V1.Model.EmbeddedDrawingPropertiesSuggestionState do
def decode(value, options) do
GoogleApi.Docs.V1.Model.EmbeddedDrawingPropertiesSuggestionState.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Docs.V1.Model.EmbeddedDrawingPropertiesSuggestionState do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 32.755556 | 96 | 0.774763 |
03eef14cddd5fef596afefdefd22200fea9be269 | 10,206 | ex | Elixir | apps/ex_wire/lib/ex_wire/kademlia/routing_table.ex | wolflee/mana | db66dac85addfaad98d40da5bd4082b3a0198bb1 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 152 | 2018-10-27T04:52:03.000Z | 2022-03-26T10:34:00.000Z | apps/ex_wire/lib/ex_wire/kademlia/routing_table.ex | wolflee/mana | db66dac85addfaad98d40da5bd4082b3a0198bb1 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 270 | 2018-04-14T07:34:57.000Z | 2018-10-25T18:10:45.000Z | apps/ex_wire/lib/ex_wire/kademlia/routing_table.ex | wolflee/mana | db66dac85addfaad98d40da5bd4082b3a0198bb1 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | 25 | 2018-10-27T12:15:13.000Z | 2022-01-25T20:31:14.000Z | defmodule ExWire.Kademlia.RoutingTable do
@moduledoc """
Module for working with current node's buckets
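A table is created from the current node and a UDP network client, and remote
peers are then added with `refresh_node/2`. A minimal usage sketch (the
`current_node`, `client_pid`, and `remote_node` variables below are
placeholders for values built elsewhere, not real peers):

```elixir
alias ExWire.Kademlia.RoutingTable

# Build a table for the local node, then track a discovered peer.
table = RoutingTable.new(current_node, client_pid)
table = RoutingTable.refresh_node(table, remote_node)

# Check whether the peer landed in one of the table's buckets.
RoutingTable.member?(table, remote_node)
```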
"""
alias ExWire.Handler.Params
alias ExWire.Kademlia.{Bucket, Node}
alias ExWire.Kademlia.Config, as: KademliaConfig
alias ExWire.Message.{FindNeighbours, Neighbours, Ping, Pong}
alias ExWire.{Network, Protocol}
alias ExWire.Struct.Endpoint
alias ExWire.Util.Timestamp
defstruct [
:current_node,
:buckets,
:network_client_name,
:expected_pongs,
:discovery_nodes,
:discovery_round
]
@type expected_pongs :: %{required(binary()) => {Node.t(), Node.t()}}
@type t :: %__MODULE__{
current_node: Node.t(),
buckets: [Bucket.t()],
network_client_name: pid() | atom(),
expected_pongs: expected_pongs(),
discovery_nodes: [Node.t()],
discovery_round: integer()
}
@doc """
Creates a new routing table.
## Examples
iex> node = %ExWire.Kademlia.Node{
...> key: <<115, 3, 97, 5, 230, 214, 202, 188, 202, 118, 204, 177, 15, 72, 13, 68,
...> 134, 100, 145, 57, 13, 239, 13, 175, 42, 38, 147, 127, 31, 18, 27, 226>>,
...> public_key: <<4, 108, 224, 89, 48, 199, 42, 188, 99, 44, 88, 226, 228, 50, 79,
...> 124, 126, 164, 120, 206, 192, 237, 79, 162, 82, 137, 130, 207, 52, 72, 48,
...> 148, 233, 203, 201, 33, 110, 122, 163, 73, 105, 18, 66, 87, 109, 85, 42, 42,
...> 86, 170, 234, 228, 38, 197, 48, 61, 237, 103, 124, 228, 85, 186, 26, 205,
...> 157>>,
...> endpoint: %ExWire.Struct.Endpoint{
...> ip: [1, 2, 3, 4],
...> tcp_port: 5,
...> udp_port: nil
...> }
...> }
iex> {:ok, network_client_pid} = ExWire.Adapter.UDP.start_link(:doctest, {ExWire.Network, []}, 35351)
iex> table = ExWire.Kademlia.RoutingTable.new(node, network_client_pid)
iex> table.buckets |> Enum.count
256
"""
@spec new(Node.t(), pid() | atom()) :: t()
def new(node = %Node{}, client_pid) do
initial_buckets = initialize_buckets()
%__MODULE__{
current_node: node,
buckets: initial_buckets,
network_client_name: client_pid,
expected_pongs: %{},
discovery_nodes: [],
discovery_round: 0
}
end
@doc """
Returns table's buckets.
"""
@spec buckets(t()) :: [Bucket.t()]
def buckets(%__MODULE__{buckets: buckets}), do: buckets
@doc """
Adds a node to the routing table.
"""
@spec refresh_node(t(), Node.t()) :: t()
def refresh_node(table = %__MODULE__{current_node: %Node{key: key}}, %Node{key: key}), do: table
def refresh_node(table = %__MODULE__{buckets: buckets}, node = %Node{}) do
node_bucket_id = bucket_id(table, node)
refresh_node_result =
buckets
|> Enum.at(node_bucket_id)
|> Bucket.refresh_node(node)
case refresh_node_result do
{:full_bucket, candidate_for_removal, _bucket} ->
ping(table, candidate_for_removal, node)
{_descr, _node, bucket} ->
replace_bucket(table, node_bucket_id, bucket)
end
end
@doc """
Removes a node from routing table.
"""
@spec remove_node(t(), Node.t()) :: t()
def remove_node(table = %__MODULE__{}, node = %Node{}) do
node_bucket_id = bucket_id(table, node)
updated_bucket =
table
|> bucket_at(node_bucket_id)
|> Bucket.remove_node(node)
replace_bucket(table, node_bucket_id, updated_bucket)
end
@doc """
Returns neighbours of a specified node.
"""
@spec neighbours(t(), FindNeighbours.t(), Endpoint.t()) :: [Node.t()]
def neighbours(
table = %__MODULE__{},
%FindNeighbours{target: public_key, timestamp: timestamp},
endpoint
) do
if timestamp < Timestamp.now() do
[]
else
node = Node.new(public_key, endpoint)
bucket_idx = bucket_id(table, node)
nearest_neighbors = nodes_at(table, bucket_idx)
found_nodes =
traverse(table, bucket_id: bucket_idx, number: bucket_capacity()) ++ nearest_neighbors
found_nodes
|> Enum.sort_by(&Node.common_prefix(&1, node), &>=/2)
|> Enum.take(bucket_capacity())
end
end
@doc """
Returns the current node's discovery nodes. It finds the nodes closest to the
current node and filters out nodes that were already used for node discovery.
"""
@spec discovery_nodes(t()) :: [Node.t()]
def discovery_nodes(table) do
filter = fn node ->
!Enum.member?(table.discovery_nodes, node)
end
nodes_number = KademliaConfig.concurrency()
closest_bucket_id = buckets_count() - 1
travers_opts = [bucket_id: closest_bucket_id, filter: filter, number: nodes_number]
nearest_neighbors = nodes_at(table, closest_bucket_id)
found_nodes = traverse(table, travers_opts) ++ nearest_neighbors
found_nodes
|> Enum.sort_by(&Node.common_prefix(&1, table.current_node), &>=/2)
|> Enum.take(nodes_number)
end
@doc """
Checks if a node exists in the routing table.
"""
@spec member?(t(), Node.t()) :: boolean()
def member?(%__MODULE__{buckets: buckets}, node = %Node{}) do
buckets |> Enum.any?(&Bucket.member?(&1, node))
end
@doc """
Returns the bucket id that a node belongs to in the routing table.
"""
@spec bucket_id(t(), Node.t()) :: integer()
def bucket_id(%__MODULE__{current_node: current_node}, node = %Node{}) do
node |> Node.common_prefix(current_node)
end
@doc """
Pings a node, saving it to the expected pongs.
"""
@spec ping(t(), Node.t()) :: t()
def ping(
table = %__MODULE__{
current_node: %Node{endpoint: current_endpoint},
network_client_name: network_client_name
},
node = %Node{endpoint: remote_endpoint},
replace_candidate \\ nil
) do
ping = Ping.new(current_endpoint, remote_endpoint)
{:sent_message, _, encoded_message} = Network.send(ping, network_client_name, remote_endpoint)
mdc = Protocol.message_mdc(encoded_message)
updated_pongs =
Map.put(
table.expected_pongs,
mdc,
{node, replace_candidate, ping.timestamp}
)
%{table | expected_pongs: updated_pongs}
end
@doc """
Removes expired pongs.
"""
@spec remove_expired_pongs(t()) :: t()
def remove_expired_pongs(table) do
now = Timestamp.now()
updated_pongs =
table.expected_pongs
|> Enum.reject(fn {_key, {_, _, timestamp}} ->
timestamp < now
end)
|> Map.new()
%{table | expected_pongs: updated_pongs}
end
@doc """
Handles Pong message.
There are two cases:
- If we were waiting for this pong (it's stored in the routing table) and it's not expired,
we refresh the stale node.
- If a pong is expired, we do nothing.
"""
@spec handle_pong(t(), Pong.t()) :: t()
def handle_pong(
table = %__MODULE__{expected_pongs: pongs},
%Pong{
hash: hash,
timestamp: timestamp
}
) do
{node, updated_pongs} = Map.pop(pongs, hash)
table_without_expected_pong = %{table | expected_pongs: updated_pongs}
if timestamp > Timestamp.now() do
case node do
{removal_candidate, _insertion_candidate, _} ->
refresh_node(table_without_expected_pong, removal_candidate)
_ ->
table_without_expected_pong
end
else
table_without_expected_pong
end
end
@spec handle_ping(t(), Params.t()) :: t()
def handle_ping(table, params) do
add_node_from_params(table, params)
end
@spec handle_neighbours(t(), Neighbours.t()) :: t()
def handle_neighbours(table, %Neighbours{timestamp: timestamp, nodes: nodes}) do
if timestamp > Timestamp.now() do
nodes
|> Enum.map(fn neighbour ->
Node.new(neighbour.node, neighbour.endpoint)
end)
|> Enum.reject(&member?(table, &1))
|> Enum.reduce(table, fn node, acc ->
ping(acc, node)
end)
else
table
end
end
@spec replace_bucket(t(), integer(), Bucket.t()) :: t()
def replace_bucket(table, idx, bucket) do
buckets =
table.buckets
|> List.replace_at(idx, bucket)
%{table | buckets: buckets}
end
@spec add_node_from_params(t(), Params.t()) :: t()
defp add_node_from_params(table, params) do
node = Node.from_handler_params(params)
refresh_node(table, node)
end
@spec bucket_at(t(), integer()) :: Bucket.t()
defp bucket_at(%__MODULE__{buckets: buckets}, id) do
Enum.at(buckets, id)
end
@spec traverse(t(), Keyword.t()) :: [Node.t()]
defp traverse(table, opts) do
bucket_id = Keyword.fetch!(opts, :bucket_id)
acc = opts[:acc] || []
step = opts[:step] || 1
required_nodes = opts[:number] || bucket_capacity()
filter_function = opts[:filter]
left_boundary = bucket_id - step
right_boundary = bucket_id + step
is_out_of_left_boundary = left_boundary < 0
is_out_of_right_boundary = right_boundary > buckets_count() - 1
left_nodes = if is_out_of_left_boundary, do: [], else: table |> nodes_at(left_boundary)
right_nodes = if is_out_of_right_boundary, do: [], else: table |> nodes_at(right_boundary)
found_nodes = left_nodes ++ right_nodes
filtered_nodes =
if filter_function,
do: Enum.filter(found_nodes, fn el -> filter_function.(el) end),
else: found_nodes
acc = acc ++ filtered_nodes
if (is_out_of_left_boundary && is_out_of_right_boundary) || Enum.count(acc) > required_nodes do
acc
else
opts =
opts
|> Keyword.put(:step, step + 1)
|> Keyword.put(:acc, acc)
traverse(table, opts)
end
end
@spec initialize_buckets() :: [Bucket.t()]
defp initialize_buckets() do
1..buckets_count()
|> Enum.map(fn num ->
Bucket.new(num)
end)
end
@spec nodes_at(t(), integer()) :: list(Node.t())
def nodes_at(table = %__MODULE__{}, bucket_id) do
table
|> bucket_at(bucket_id)
|> Bucket.nodes()
end
@spec buckets_count() :: unquote(KademliaConfig.id_size())
defp buckets_count do
KademliaConfig.id_size()
end
@spec bucket_capacity() :: unquote(KademliaConfig.bucket_size())
defp bucket_capacity do
KademliaConfig.bucket_size()
end
end
| 28.428969 | 107 | 0.627866 |
03eefb3e47a33497cf518d4b2323bd0118137b97 | 299 | ex | Elixir | archive/elavon/credit_card/void/request.ex | auroche/elavon | 8165a1b9abb4e73e214a75b22bc055b09f2a9119 | [
"MIT"
] | null | null | null | archive/elavon/credit_card/void/request.ex | auroche/elavon | 8165a1b9abb4e73e214a75b22bc055b09f2a9119 | [
"MIT"
] | null | null | null | archive/elavon/credit_card/void/request.ex | auroche/elavon | 8165a1b9abb4e73e214a75b22bc055b09f2a9119 | [
"MIT"
] | null | null | null | defmodule Elavon.CreditCard.Void.Request do
use Ecto.Schema
embedded_schema do
field :ssl_transaction_type, :string
field :ssl_merchant_id, :string
field :ssl_user_id, :string
field :ssl_pin, :string
field :ssl_txn_id, :string
end
end
| 24.916667 | 43 | 0.64214 |
03eefe05e66b70843b78879b7e871430a429dc0e | 1,166 | ex | Elixir | lib/twirp/protoc/cli.ex | jnatherley/twirp | bf558df38dce622708ffcc368008dcfcb30f63c1 | [
"Apache-2.0"
] | null | null | null | lib/twirp/protoc/cli.ex | jnatherley/twirp | bf558df38dce622708ffcc368008dcfcb30f63c1 | [
"Apache-2.0"
] | null | null | null | lib/twirp/protoc/cli.ex | jnatherley/twirp | bf558df38dce622708ffcc368008dcfcb30f63c1 | [
"Apache-2.0"
] | null | null | null | defmodule Twirp.Protoc.CLI do
@moduledoc false
# Almost all of this generation stuff is lifted from the elixir protobuf library.
# I don't love the way it's implemented, but it was the fastest path forward for
# supporting generation of services. I'm going to revisit this in the future
# because I barely understand how this code works.
def main(_) do
# https://groups.google.com/forum/#!topic/elixir-lang-talk/T5enez_BBTI
:io.setopts(:standard_io, encoding: :latin1)
bin = IO.binread(:all)
request = Protobuf.Decoder.decode(bin, Google.Protobuf.Compiler.CodeGeneratorRequest)
# debug
# raise inspect(request, limit: :infinity)
ctx =
%Protobuf.Protoc.Context{}
|> Protobuf.Protoc.CLI.parse_params(request.parameter)
|> Protobuf.Protoc.CLI.find_types(request.proto_file)
files =
request.proto_file
|> Enum.filter(fn desc -> Enum.member?(request.file_to_generate, desc.name) end)
|> Enum.map(fn desc -> Twirp.Protoc.Generator.generate(ctx, desc) end)
response = Google.Protobuf.Compiler.CodeGeneratorResponse.new(file: files)
IO.binwrite(Protobuf.Encoder.encode(response))
end
end
| 38.866667 | 89 | 0.720412 |
03eefe0df1e38f25fb80b51ff79ae068760af1b8 | 785 | ex | Elixir | lib/rocketpay/numbers.ex | nnmuniz/rocketpay | 3174bcb6bed1578a1cd96e3be90499cc87382b21 | [
"MIT"
] | 2 | 2021-03-09T22:58:09.000Z | 2021-03-28T12:22:11.000Z | lib/rocketpay/numbers.ex | nnmuniz/rocketpay | 3174bcb6bed1578a1cd96e3be90499cc87382b21 | [
"MIT"
] | 9 | 2021-02-28T20:29:58.000Z | 2021-03-26T01:28:01.000Z | lib/rocketpay/numbers.ex | nnmuniz/rocketpay | 3174bcb6bed1578a1cd96e3be90499cc87382b21 | [
"MIT"
] | 1 | 2021-03-28T11:56:37.000Z | 2021-03-28T11:56:37.000Z | defmodule Rocketpay.Numbers do
def sum_from_file(filename) do
"#{filename}.csv"
|> File.read()
|> handle_file()
end
# defp handle_file({:ok, result}) do
# result =
# result = String.split(result, ",")
# result = Enum.map(result, fn number -> String.to_integer(number) end)
# result = Enum.sum(result)
# result
# end
# The language is immutable, so the value must be rebound to the variable;
# the pipe operator composes the functions and accumulates the result.
defp handle_file({:ok, result}) do
result =
result
|> String.split(",")
|> Stream.map(fn number -> String.to_integer(number) end)
|> Enum.sum()
{:ok, %{result: result}}
end
defp handle_file({:error, _reason}), do: {:error, %{message: "Invalid file!"}}
end
| 25.322581 | 80 | 0.634395 |
03ef317664ca3bf03fea4b36e7a2a9a3a0efd703 | 1,133 | exs | Elixir | config/config.exs | szajbus/transformers | 567885bc8611af3fade6be6ca21791f6f6311519 | [
"MIT"
] | null | null | null | config/config.exs | szajbus/transformers | 567885bc8611af3fade6be6ca21791f6f6311519 | [
"MIT"
] | null | null | null | config/config.exs | szajbus/transformers | 567885bc8611af3fade6be6ca21791f6f6311519 | [
"MIT"
] | null | null | null | # This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure your application as:
#
# config :transformers, key: :value
#
# and access this configuration in your application as:
#
# Application.get_env(:transformers, :key)
#
# You can also configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env}.exs"
| 36.548387 | 73 | 0.753751 |
03ef3fce44ef6cd72e6e56e12fab2848e3cf6c09 | 1,382 | ex | Elixir | lib/api/graphql/mutations/session.ex | nunopolonia/psc-api | 2e358503851cc04cdaa89201a3f56586f8746736 | [
"MIT"
] | 1 | 2017-09-10T23:51:40.000Z | 2017-09-10T23:51:40.000Z | lib/api/graphql/mutations/session.ex | nunopolonia/psc-api | 2e358503851cc04cdaa89201a3f56586f8746736 | [
"MIT"
] | 24 | 2018-03-14T18:17:00.000Z | 2021-03-01T07:47:53.000Z | lib/api/graphql/mutations/session.ex | portosummerofcode/psc-api | 2e358503851cc04cdaa89201a3f56586f8746736 | [
"MIT"
] | null | null | null | defmodule Api.GraphQL.Mutations.Session do
use Absinthe.Schema.Notation
use Absinthe.Relay.Schema.Notation, :modern
alias Api.GraphQL.Middleware.{RequireAuthn}
alias Api.GraphQL.Resolvers
alias Api.Accounts
object :session_mutations do
@desc "Authenticates a user and returns a JWT"
field :authenticate, type: :string do
arg :email, non_null(:string)
arg :password, non_null(:string)
resolve Resolvers.run_with_args(
&Accounts.create_session/2,
[
[:args, :email],
[:args, :password],
]
)
end
@desc "Registers an user and returns a JWT"
field :register, type: :string do
arg :email, non_null(:string)
arg :password, non_null(:string)
resolve Resolvers.run(&Accounts.create_user/1)
end
@desc "Updates the current user"
field :update_me, type: :user do
arg :user, non_null(:user_input)
middleware RequireAuthn
resolve fn %{user: params}, %{context: %{current_user: current_user}} ->
Accounts.update_user(current_user, current_user.id, params)
end
end
@desc "Deletes account"
field :delete_account, type: :user do
middleware RequireAuthn
resolve fn _args, %{context: %{current_user: current_user}} ->
Accounts.delete_user(current_user, current_user.id)
end
end
end
end
| 25.592593 | 78 | 0.657742 |
03ef44602a5880408c7f22ff8bce9076740cc4a4 | 2,598 | ex | Elixir | clients/container/lib/google_api/container/v1/model/set_maintenance_policy_request.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | 1 | 2018-12-03T23:43:10.000Z | 2018-12-03T23:43:10.000Z | clients/container/lib/google_api/container/v1/model/set_maintenance_policy_request.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | clients/container/lib/google_api/container/v1/model/set_maintenance_policy_request.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.Container.V1.Model.SetMaintenancePolicyRequest do
@moduledoc """
SetMaintenancePolicyRequest sets the maintenance policy for a cluster.
## Attributes
* `clusterId` (*type:* `String.t`, *default:* `nil`) - The name of the cluster to update.
* `maintenancePolicy` (*type:* `GoogleApi.Container.V1.Model.MaintenancePolicy.t`, *default:* `nil`) - The maintenance policy to be set for the cluster. An empty field
clears the existing maintenance policy.
* `name` (*type:* `String.t`, *default:* `nil`) - The name (project, location, cluster id) of the cluster to set maintenance
policy.
Specified in the format 'projects/*/locations/*/clusters/*'.
* `projectId` (*type:* `String.t`, *default:* `nil`) - The Google Developers Console [project ID or project
number](https://support.google.com/cloud/answer/6158840).
* `zone` (*type:* `String.t`, *default:* `nil`) - The name of the Google Compute Engine
[zone](/compute/docs/zones#available) in which the cluster
resides.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:clusterId => String.t(),
:maintenancePolicy => GoogleApi.Container.V1.Model.MaintenancePolicy.t(),
:name => String.t(),
:projectId => String.t(),
:zone => String.t()
}
field(:clusterId)
field(:maintenancePolicy, as: GoogleApi.Container.V1.Model.MaintenancePolicy)
field(:name)
field(:projectId)
field(:zone)
end
defimpl Poison.Decoder, for: GoogleApi.Container.V1.Model.SetMaintenancePolicyRequest do
def decode(value, options) do
GoogleApi.Container.V1.Model.SetMaintenancePolicyRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Container.V1.Model.SetMaintenancePolicyRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 39.969231 | 171 | 0.712471 |
03ef57c1a82bc073b86be10b3a652ca3200049a5 | 12,995 | ex | Elixir | clients/cloud_trace/lib/google_api/cloud_trace/v1/api/projects.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/cloud_trace/lib/google_api/cloud_trace/v1/api/projects.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/cloud_trace/lib/google_api/cloud_trace/v1/api/projects.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudTrace.V1.Api.Projects do
@moduledoc """
API calls for all endpoints tagged `Projects`.
"""
alias GoogleApi.CloudTrace.V1.Connection
alias GoogleApi.Gax.{Request, Response}
@library_version Mix.Project.config() |> Keyword.get(:version, "")
@doc """
Sends new traces to Cloud Trace or updates existing traces. If the ID of a trace that you send matches that of an existing trace, any fields in the existing trace and its spans are overwritten by the provided values, and any new fields provided are merged with the existing trace data. If the ID does not match, a new trace is created.
## Parameters
* `connection` (*type:* `GoogleApi.CloudTrace.V1.Connection.t`) - Connection to server
* `project_id` (*type:* `String.t`) - Required. ID of the Cloud project where the trace data is stored.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:body` (*type:* `GoogleApi.CloudTrace.V1.Model.Traces.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.CloudTrace.V1.Model.Empty{}}` on success
* `{:error, info}` on failure
"""
@spec cloudtrace_projects_patch_traces(Tesla.Env.client(), String.t(), keyword(), keyword()) ::
{:ok, GoogleApi.CloudTrace.V1.Model.Empty.t()} | {:ok, Tesla.Env.t()} | {:error, any()}
def cloudtrace_projects_patch_traces(connection, project_id, optional_params \\ [], opts \\ []) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:patch)
|> Request.url("/v1/projects/{projectId}/traces", %{
"projectId" => URI.encode(project_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.CloudTrace.V1.Model.Empty{}])
end
@doc """
Gets a single trace by its ID.
## Parameters
* `connection` (*type:* `GoogleApi.CloudTrace.V1.Connection.t`) - Connection to server
* `project_id` (*type:* `String.t`) - Required. ID of the Cloud project where the trace data is stored.
* `trace_id` (*type:* `String.t`) - Required. ID of the trace to return.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.CloudTrace.V1.Model.Trace{}}` on success
* `{:error, info}` on failure
"""
@spec cloudtrace_projects_traces_get(
Tesla.Env.client(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.CloudTrace.V1.Model.Trace.t()} | {:ok, Tesla.Env.t()} | {:error, any()}
def cloudtrace_projects_traces_get(
connection,
project_id,
trace_id,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/projects/{projectId}/traces/{traceId}", %{
"projectId" => URI.encode(project_id, &URI.char_unreserved?/1),
"traceId" => URI.encode(trace_id, &(URI.char_unreserved?(&1) || &1 == ?/))
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.CloudTrace.V1.Model.Trace{}])
end
@doc """
Returns of a list of traces that match the specified filter conditions.
## Parameters
* `connection` (*type:* `GoogleApi.CloudTrace.V1.Connection.t`) - Connection to server
* `project_id` (*type:* `String.t`) - Required. ID of the Cloud project where the trace data is stored.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:endTime` (*type:* `DateTime.t`) - End of the time interval (inclusive) during which the trace data was collected from the application.
      * `:filter` (*type:* `String.t`) - Optional. A filter against labels for the request. By default, searches use prefix matching. To specify exact match, prepend a plus symbol (`+`) to the search term. Multiple terms are ANDed. Syntax: * `root:NAME_PREFIX` or `NAME_PREFIX`: Return traces where any root span starts with `NAME_PREFIX`. * `+root:NAME` or `+NAME`: Return traces where any root span's name is exactly `NAME`. * `span:NAME_PREFIX`: Return traces where any span starts with `NAME_PREFIX`. * `+span:NAME`: Return traces where any span's name is exactly `NAME`. * `latency:DURATION`: Return traces whose overall latency is greater than or equal to `DURATION`. Accepted units are nanoseconds (`ns`), milliseconds (`ms`), and seconds (`s`). Default is `ms`. For example, `latency:24ms` returns traces whose overall latency is greater than or equal to 24 milliseconds. * `label:LABEL_KEY`: Return all traces containing the specified label key (exact match, case-sensitive) regardless of the key:value pair's value (including empty values). * `LABEL_KEY:VALUE_PREFIX`: Return all traces containing the specified label key (exact match, case-sensitive) whose value starts with `VALUE_PREFIX`. Both a key and a value must be specified. * `+LABEL_KEY:VALUE`: Return all traces containing a key:value pair exactly matching the specified text. Both a key and a value must be specified. * `method:VALUE`: Equivalent to `/http/method:VALUE`. * `url:VALUE`: Equivalent to `/http/url:VALUE`.
* `:orderBy` (*type:* `String.t`) - Optional. Field used to sort the returned traces. Can be one of the following: * `trace_id` * `name` (`name` field of root span in the trace) * `duration` (difference between `end_time` and `start_time` fields of the root span) * `start` (`start_time` field of the root span) Descending order can be specified by appending `desc` to the sort field (for example, `name desc`). Only one sort field is permitted.
* `:pageSize` (*type:* `integer()`) - Optional. Maximum number of traces to return. If not specified or <= 0, the implementation selects a reasonable value. The implementation may return fewer traces than the requested page size.
* `:pageToken` (*type:* `String.t`) - Token identifying the page of results to return. If provided, use the value of the `next_page_token` field from a previous request.
* `:startTime` (*type:* `DateTime.t`) - Start of the time interval (inclusive) during which the trace data was collected from the application.
* `:view` (*type:* `String.t`) - Optional. Type of data returned for traces in the list. Default is `MINIMAL`.
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.CloudTrace.V1.Model.ListTracesResponse{}}` on success
* `{:error, info}` on failure
"""
@spec cloudtrace_projects_traces_list(Tesla.Env.client(), String.t(), keyword(), keyword()) ::
{:ok, GoogleApi.CloudTrace.V1.Model.ListTracesResponse.t()}
| {:ok, Tesla.Env.t()}
| {:error, any()}
def cloudtrace_projects_traces_list(connection, project_id, optional_params \\ [], opts \\ []) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:endTime => :query,
:filter => :query,
:orderBy => :query,
:pageSize => :query,
:pageToken => :query,
:startTime => :query,
:view => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/projects/{projectId}/traces", %{
"projectId" => URI.encode(project_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.CloudTrace.V1.Model.ListTracesResponse{}])
end
end
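The list endpoint above is typically driven through a Tesla connection. The sketch below is a hypothetical usage, not taken from this file: the module name `GoogleApi.CloudTrace.V1.Api.Projects`, the `Connection.new/1` constructor, and the token value are all assumptions to be checked against the client's documentation.

```elixir
# Hypothetical call to cloudtrace_projects_traces_list/4. Module name,
# connection setup, and token are assumed, not confirmed by this file.
conn = GoogleApi.CloudTrace.V1.Connection.new("ya29.example-oauth-token")

{:ok, %GoogleApi.CloudTrace.V1.Model.ListTracesResponse{} = page} =
  GoogleApi.CloudTrace.V1.Api.Projects.cloudtrace_projects_traces_list(
    conn,
    "my-gcp-project",
    filter: "+root:checkout latency:250ms",
    orderBy: "duration desc",
    pageSize: 20
  )
# A subsequent page would pass pageToken: page.nextPageToken.
```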
# =============================================================================
# File: apps/plant_monitor/test/oauth/refresh_token_service_test.exs
# Repo: bartoszgorka/PlantMonitor (MIT license)
# =============================================================================
defmodule PlantMonitor.OAuth.RefreshTokenServiceTest do
use PlantMonitor.DataCase
alias PlantMonitor.OAuth.RefreshToken
alias PlantMonitor.OAuth.RefreshTokenService
# GENERATE NEW TOKEN
test "[GENERATE_NEW_TOKEN] Prepare new refresh token" do
%{id: user_id} = insert(:user)
result = RefreshTokenService.generate_new_token(user_id)
assert {:ok, token} = result
assert String.length(token.secret_code) == 20
assert String.length(token.refresh_token) == 20
assert token.user_id == user_id
end
# FETCH TOKEN
test "[FETCH_TOKEN] No token found" do
%{id: user_id} = insert(:user)
token = insert(:refresh_token, %{user_id: user_id})
params = %{
secret_code: token.secret_code,
user_id: Ecto.UUID.generate()
}
result = RefreshTokenService.fetch_token(params, token.refresh_token)
refute result
end
test "[FETCH_TOKEN] Refresh token found" do
%{id: user_id} = insert(:user)
token = insert(:refresh_token, %{user_id: user_id})
params = %{
secret_code: token.secret_code,
user_id: user_id
}
result = RefreshTokenService.fetch_token(params, token.refresh_token)
assert token == result
end
# REFRESH TOKEN
test "[REFRESH_TOKEN] Receive new token" do
%{id: user_id} = insert(:user)
token = insert(:refresh_token, %{user_id: user_id})
result = RefreshTokenService.refresh_token(token)
assert {:ok, new_token} = result
assert token.id != new_token.id
assert token.user_id == new_token.user_id
refute Repo.get(RefreshToken, token.id)
end
end
# =============================================================================
# File: lib/zig/nif/dirty_io.ex
# Repo: wojtekmach/zigler (MIT license)
# =============================================================================
defmodule Zig.Nif.DirtyIO do
@moduledoc false
alias Zig.Nif.{Adapter, Synchronous}
@behaviour Adapter
@impl true
defdelegate zig_adapter(nif), to: Synchronous
@impl true
def nif_table_entries(nif) do
"""
e.ErlNifFunc{
.name = "#{nif.name}",
.arity = #{nif.arity},
.fptr = __#{nif.name}_shim__,
.flags = e.ERL_NIF_DIRTY_JOB_IO_BOUND,
},
"""
end
@impl true
defdelegate beam_adapter(nif), to: Synchronous
end
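To make the interpolation above concrete, here is an illustrative call. The shape of the `nif` argument (something exposing `:name` and `:arity`) is inferred from the field accesses in `nif_table_entries/1`; the values are invented.

```elixir
# Hypothetical input: a nif named :add with arity 2.
Zig.Nif.DirtyIO.nif_table_entries(%{name: :add, arity: 2})
# returns a Zig source fragment roughly like:
#
#   e.ErlNifFunc{
#     .name = "add",
#     .arity = 2,
#     .fptr = __add_shim__,
#     .flags = e.ERL_NIF_DIRTY_JOB_IO_BOUND,
#   },
```

The only difference from the `Synchronous` adapter it delegates to is the `ERL_NIF_DIRTY_JOB_IO_BOUND` flag, which tells the BEAM to run the NIF on a dirty I/O scheduler.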
# =============================================================================
# File: config/config.exs
# Repo: Kr00lIX/assertions (MIT license)
# =============================================================================
# This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure your application as:
#
# config :assertions, key: :value
#
# and access this configuration in your application as:
#
# Application.get_env(:assertions, :key)
#
# You can also configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env()}.exs"
# =============================================================================
# File: lib/delta_crdt/causal_context.ex
# Repo: evadne/delta_crdt_ex (MIT license)
# =============================================================================
defmodule DeltaCrdt.CausalContext do
@moduledoc false
defstruct dots: MapSet.new(),
maxima: %{}
def new(dots \\ [])
def new(%__MODULE__{} = cc), do: cc
def new([]), do: %__MODULE__{}
def new(dots) do
maxima =
dots
|> Enum.reduce(%{}, fn {i, x}, maxima ->
Map.update(maxima, i, x, fn y -> Enum.max([x, y]) end)
end)
%__MODULE__{
dots: MapSet.new(dots),
maxima: maxima
}
end
def next(%__MODULE__{} = cc, i) do
new_maxima = Map.update(cc.maxima, i, 0, fn x -> x + 1 end)
next_dot = {i, Map.get(new_maxima, i)}
new_dots = MapSet.put(cc.dots, next_dot)
{next_dot, %{cc | dots: new_dots, maxima: new_maxima}}
end
def dots(%__MODULE__{dots: dots}), do: dots
def join(cc1, cc2) do
new_dots = MapSet.union(cc1.dots, cc2.dots)
new_maxima =
Enum.reduce(cc1.maxima, cc2.maxima, fn {i, x}, maxima ->
Map.update(maxima, i, x, fn y -> Enum.max([x, y]) end)
end)
%__MODULE__{dots: new_dots, maxima: new_maxima}
end
def compress(%__MODULE__{} = cc) do
%{cc | dots: MapSet.new(cc.maxima)}
end
end
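A short sketch of how the operations above compose. The replica identifiers are invented; the behavior follows directly from the definitions of `new/1`, `next/2`, and `join/2` in this module.

```elixir
alias DeltaCrdt.CausalContext

# new/1 records each dot and tracks the highest counter seen per replica.
cc = CausalContext.new([{:replica_a, 1}, {:replica_a, 2}])
# cc.maxima == %{replica_a: 2}

# next/2 bumps :replica_a's maximum (2 -> 3) and adds the fresh dot.
{dot, cc} = CausalContext.next(cc, :replica_a)
# dot == {:replica_a, 3}

# join/2 unions the dot sets and keeps the per-replica maxima of both sides.
merged = CausalContext.join(cc, CausalContext.new([{:replica_b, 1}]))
# merged.maxima == %{replica_a: 3, replica_b: 1}
```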
# =============================================================================
# File: lib/elixir/test/elixir/kernel/alias_test.exs
# Repo: davidsulc/elixir (Apache-2.0 license)
# =============================================================================
Code.require_file "../test_helper.exs", __DIR__
alias Kernel.AliasTest.Nested, as: Nested
defmodule Nested do
def value, do: 1
end
defmodule Kernel.AliasTest do
use ExUnit.Case, async: true
test "alias Erlang" do
alias :lists, as: MyList
assert MyList.flatten([1, [2], 3]) == [1, 2, 3]
assert Elixir.MyList.Bar == :"Elixir.MyList.Bar"
assert MyList.Bar == :"Elixir.lists.Bar"
end
test "double alias" do
alias Kernel.AliasTest.Nested, as: Nested2
assert Nested.value == 1
assert Nested2.value == 1
end
test "overwritten alias" do
assert alias(List, as: Nested) == List
assert Nested.flatten([[13]]) == [13]
end
test "lexical" do
if true_fun() do
alias OMG, as: List, warn: false
else
alias ABC, as: List, warn: false
end
assert List.flatten([1, [2], 3]) == [1, 2, 3]
end
defp true_fun(), do: true
defmodule Elixir do
def sample, do: 1
end
test "nested elixir alias" do
assert Kernel.AliasTest.Elixir.sample == 1
end
test "multi-call" do
result = alias unquote(Inspect).{
Opts, Algebra,
}
assert result == [Inspect.Opts, Inspect.Algebra]
assert %Opts{} == %Inspect.Opts{}
assert Algebra.empty == :doc_nil
end
test "alias removal" do
alias __MODULE__.Foo
assert Foo == __MODULE__.Foo
alias Elixir.Foo
assert Foo == Elixir.Foo
alias Elixir.Bar
end
end
defmodule Kernel.AliasNestingGenerator do
defmacro create do
quote do
defmodule Parent do
def a, do: :a
end
defmodule Parent.Child do
def b, do: Parent.a
end
end
end
end
defmodule Kernel.AliasNestingTest do
use ExUnit.Case, async: true
require Kernel.AliasNestingGenerator
Kernel.AliasNestingGenerator.create
test "aliases nesting" do
assert Parent.a == :a
assert Parent.Child.b == :a
end
defmodule Nested do
def value, do: 2
end
test "aliases nesting with previous alias" do
assert Nested.value == 2
end
end
# Test case extracted from using records with aliases
# and @before_compile. We are basically testing that
# macro aliases are not leaking from the macro.
defmodule Macro.AliasTest.Definer do
defmacro __using__(_options) do
quote do
@before_compile unquote(__MODULE__)
end
end
defmacro __before_compile__(_env) do
quote do
defmodule First do
defstruct foo: :bar
end
defmodule Second do
defstruct baz: %First{}
end
end
end
end
defmodule Macro.AliasTest.Aliaser do
defmacro __using__(_options) do
quote do
alias Some.First
end
end
end
defmodule Macro.AliasTest.User do
use ExUnit.Case, async: true
use Macro.AliasTest.Definer
use Macro.AliasTest.Aliaser
test "has a struct defined from after compile" do
assert is_map struct(Macro.AliasTest.User.First, [])
assert is_map struct(Macro.AliasTest.User.Second, []).baz
end
end
# =============================================================================
# File: clients/elixir/generated/lib/adobe_experience_manager(aem)api/model/install_status.ex
# Repo: hoomaan-kh/swagger-aem (Apache-2.0 license)
# =============================================================================
# NOTE: This class is auto generated by OpenAPI Generator (https://openapi-generator.tech).
# https://openapi-generator.tech
# Do not edit the class manually.
# NOTE: Module renamed from `AdobeExperienceManager(AEM)API` -- parentheses are
# not valid in Elixir module names, so the generated name cannot compile.
defmodule AdobeExperienceManagerAEMAPI.Model.InstallStatus do
  @moduledoc """
  """
  @derive [Poison.Encoder]
  defstruct [
    :"status"
  ]
  @type t :: %__MODULE__{
          :"status" => AdobeExperienceManagerAEMAPI.Model.InstallStatusStatus.t() | nil
        }
end
defimpl Poison.Decoder, for: AdobeExperienceManagerAEMAPI.Model.InstallStatus do
  import AdobeExperienceManagerAEMAPI.Deserializer
  def decode(value, options) do
    value
    |> deserialize(:"status", :struct, AdobeExperienceManagerAEMAPI.Model.InstallStatusStatus, options)
  end
end
# =============================================================================
# File: mix.exs
# Repo: akoutmos/amqp (MIT license)
# =============================================================================
defmodule AMQP.Mixfile do
use Mix.Project
@version "1.4.0"
def project do
[
app: :amqp,
version: @version,
elixir: "~> 1.7",
description: description(),
package: package(),
source_url: "https://github.com/pma/amqp",
deps: deps(),
dialyzer: [
ignore_warnings: "dialyzer.ignore-warnings",
plt_add_deps: :transitive,
flags: [:error_handling, :race_conditions, :no_opaque, :underspecs]
],
docs: [
extras: ["README.md"],
main: "readme",
source_ref: "v#{@version}",
source_url: "https://github.com/pma/amqp"
]
]
end
def application do
[
applications: [:lager, :amqp_client],
mod: {AMQP.Application, []}
]
end
defp deps do
[
{:amqp_client, "~> 3.8.0"},
# Docs dependencies.
{:earmark, "~> 1.0", only: :docs},
{:ex_doc, "~> 0.15", only: :docs},
{:inch_ex, "~> 0.5", only: :docs},
# Dev dependencies.
{:dialyxir, "~> 0.5", only: :dev, runtime: false}
]
end
defp description do
"""
Idiomatic Elixir client for RabbitMQ.
"""
end
defp package do
[
files: ["lib", "mix.exs", "README.md", "LICENSE"],
maintainers: ["Paulo Almeida", "Eduardo Gurgel"],
licenses: ["MIT"],
links: %{"GitHub" => "https://github.com/pma/amqp"}
]
end
end
# =============================================================================
# File: clients/container/lib/google_api/container/v1/model/update_cluster_request.ex
# Repo: medikent/elixir-google-api (Apache-2.0 license)
# =============================================================================
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Container.V1.Model.UpdateClusterRequest do
@moduledoc """
UpdateClusterRequest updates the settings of a cluster.
## Attributes
* `clusterId` (*type:* `String.t`, *default:* `nil`) - Deprecated. The name of the cluster to upgrade.
This field has been deprecated and replaced by the name field.
* `name` (*type:* `String.t`, *default:* `nil`) - The name (project, location, cluster) of the cluster to update.
Specified in the format 'projects/*/locations/*/clusters/*'.
* `projectId` (*type:* `String.t`, *default:* `nil`) - Deprecated. The Google Developers Console [project ID or project
number](https://support.google.com/cloud/answer/6158840).
This field has been deprecated and replaced by the name field.
* `update` (*type:* `GoogleApi.Container.V1.Model.ClusterUpdate.t`, *default:* `nil`) - Required. A description of the update.
* `zone` (*type:* `String.t`, *default:* `nil`) - Deprecated. The name of the Google Compute Engine
[zone](/compute/docs/zones#available) in which the cluster
resides.
This field has been deprecated and replaced by the name field.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:clusterId => String.t(),
:name => String.t(),
:projectId => String.t(),
:update => GoogleApi.Container.V1.Model.ClusterUpdate.t(),
:zone => String.t()
}
field(:clusterId)
field(:name)
field(:projectId)
field(:update, as: GoogleApi.Container.V1.Model.ClusterUpdate)
field(:zone)
end
defimpl Poison.Decoder, for: GoogleApi.Container.V1.Model.UpdateClusterRequest do
def decode(value, options) do
GoogleApi.Container.V1.Model.UpdateClusterRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Container.V1.Model.UpdateClusterRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
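A hedged sketch of decoding a payload into the struct above. Poison's `as:` option together with the `Poison.Decoder` implementation generated by `ModelBase` is what routes nested fields through the `field/2` mappings; the JSON values here are invented.

```elixir
# Hypothetical payload; field names mirror the struct keys above.
json = ~s({"clusterId": "my-cluster", "zone": "us-central1-a", "update": {}})

request =
  Poison.decode!(json, as: %GoogleApi.Container.V1.Model.UpdateClusterRequest{})
# request.clusterId == "my-cluster"; the nested "update" map is decoded via
# the field(:update, as: ClusterUpdate) mapping declared in the module.
```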
# =============================================================================
# File: lib/stripe/coupons.ex
# Repo: blitzstudios/stripity-stripe (BSD-3-Clause license)
# =============================================================================
defmodule Stripe.Coupons do
@moduledoc """
Handles coupons to the Stripe API.
(API ref: https://stripe.com/docs/api#coupons)
Operations:
- create (TODO)
- retrieve
- update (TODO)
- delete (TODO)
- list (TODO)
"""
@endpoint "coupons"
@doc """
Retrieves the coupon with the given ID.
Returns a coupon if a valid coupon ID was provided.
Throws an error otherwise.
## Examples
```
params = "free-1-month"
{:ok, result} = Stripe.Coupons.retrieve params
```
"""
def retrieve(params) do
path = @endpoint <> "/" <> params
Stripe.make_request(:get, path, %{}, %{})
|> Stripe.Util.handle_stripe_response
end
end
# =============================================================================
# File: lib/spotify/models/context.ex
# Repo: chippers/spotify_web_api (MIT license)
# =============================================================================
defmodule Spotify.Context do
@moduledoc """
A Context object.
[Spotify Docs](https://beta.developer.spotify.com/documentation/web-api/reference/object-model/#context-object)
"""
@behaviour Spotify.ObjectModel
alias Spotify.ExternalUrls
@typedoc """
The object type, e.g. `artist`, `playlist`, `album`.
"""
@type type :: String.t
@typedoc """
A link to the Web API endpoint providing full details of the track.
"""
@type href :: String.t
@typedoc """
External URLs for this context.
"""
@type external_urls :: ExternalUrls.t
@typedoc """
The Spotify URI for the context.
"""
@type uri :: String.t
defstruct [
:type,
:href,
:external_urls,
:uri,
]
@typedoc """
The full Context object.
Contains all the values listed in the
[Spotify Docs](https://beta.developer.spotify.com/documentation/web-api/reference/object-model/#context-object)
"""
@type t :: %__MODULE__{
type: __MODULE__.type | nil,
href: __MODULE__.href | nil,
external_urls: __MODULE__.external_urls | nil,
uri: __MODULE__.uri | nil,
}
def as do
%__MODULE__{}
end
end
# =============================================================================
# File: lib/learn_kit/regression/score.ex
# Repo: davidrichey/elixir_learn_kit (MIT license)
# =============================================================================
defmodule LearnKit.Regression.Score do
@moduledoc """
Module for scoring regression models
"""
alias LearnKit.Math
defmacro __using__(_opts) do
quote do
@doc """
Returns the coefficient of determination R^2 of the prediction
## Parameters
- predictor: %LearnKit.Regression.Linear{}
## Examples
iex> predictor |> LearnKit.Regression.Linear.score
{:ok, 0.9876543209876543}
"""
@spec score(%LearnKit.Regression.Linear{factors: factors, results: results, coefficients: coefficients}) :: {:ok, number}
def score(regression = %_{factors: _, results: _, coefficients: _}) do
{
:ok,
calculate_score(regression)
}
end
      defp calculate_score(%_{coefficients: []}), do: raise("There was no fit for model")
defp calculate_score(regression = %_{coefficients: _, factors: _, results: results}) do
1.0 - sum_of_squared_errors(regression) / total_sum_of_squares(results)
end
defp prediction_error(regression, x, y) do
{:ok, prediction} = predict(regression, x)
y - prediction
end
defp sum_of_squared_errors(regression = %_{coefficients: _, factors: factors, results: results}) do
factors
|> Enum.zip(results)
|> Enum.reduce(0, fn {xi, yi}, acc ->
acc + squared_prediction_error(regression, xi, yi)
end)
end
defp total_sum_of_squares(list) do
mean_list = Math.mean(list)
Enum.reduce(list, 0, fn x, acc -> acc + :math.pow(x - mean_list, 2) end)
end
      defp squared_prediction_error(regression, x, y) do
regression
|> prediction_error(x, y)
|> :math.pow(2)
end
end
end
end
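The scoring code above implements the standard coefficient of determination, R² = 1 - SSE/TSS. Here is a self-contained numeric illustration of that formula using plain Elixir (no LearnKit structs involved; the data points are invented):

```elixir
# Observed results and model predictions (made-up values).
results = [3.0, 5.0, 7.0, 9.0]
predictions = [2.8, 5.1, 7.2, 8.9]

mean = Enum.sum(results) / length(results)

# SSE: sum of squared prediction errors, as in sum_of_squared_errors/1.
sse =
  Enum.zip(predictions, results)
  |> Enum.reduce(0.0, fn {p, y}, acc -> acc + :math.pow(y - p, 2) end)

# TSS: total sum of squares around the mean, as in total_sum_of_squares/1.
tss = Enum.reduce(results, 0.0, fn y, acc -> acc + :math.pow(y - mean, 2) end)

r_squared = 1.0 - sse / tss
# ~0.995 here: the predictions explain almost all of the variance.
```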
# =============================================================================
# File: test/daily_meals_web/views/error_view_test.exs
# Repo: vinolivae/daily_meals (MIT license)
# =============================================================================
defmodule DailyMealsWeb.ErrorViewTest do
use DailyMealsWeb.ConnCase, async: true
# Bring render/3 and render_to_string/3 for testing custom views
import Phoenix.View
test "renders 404.json" do
assert render(DailyMealsWeb.ErrorView, "404.json", []) == %{errors: %{detail: "Not Found"}}
end
test "renders 500.json" do
assert render(DailyMealsWeb.ErrorView, "500.json", []) ==
%{errors: %{detail: "Internal Server Error"}}
end
end
# =============================================================================
# File: kousa/lib/kousa/room.ex
# Repo: jamesql/dogehouse (MIT license)
# =============================================================================
defmodule Kousa.Room do
alias Kousa.Utils.RegUtils
alias Kousa.Utils.VoiceServerUtils
alias Beef.Users
alias Beef.Follows
alias Beef.Rooms
# note the following 2 module aliases are on the chopping block!
alias Beef.RoomPermissions
alias Beef.RoomBlocks
def set_auto_speaker(user_id, value) do
room = Rooms.get_room_by_creator_id(user_id)
if not is_nil(room) do
RegUtils.lookup_and_cast(Onion.RoomSession, room.id, {:set_auto_speaker, value})
end
end
def set_speak_requests(user_id, value) do
room = Rooms.get_room_by_creator_id(user_id)
if not is_nil(room) do
RegUtils.lookup_and_cast(
Onion.RoomSession,
room.id,
{:set_speak_requests, value}
)
RegUtils.lookup_and_cast(
Onion.RoomSession,
room.id,
{:send_ws_msg, :vscode, %{op: "accept_listeners_change", d: %{value: value}}}
)
end
end
@spec make_room_public(any, any) :: nil | :ok
def make_room_public(user_id, new_name) do
# this needs to be refactored if a user can have multiple rooms
case Beef.Rooms.set_room_privacy_by_creator_id(user_id, false, new_name) do
{1, [room]} ->
Onion.RoomSession.send_cast(
room.id,
{:send_ws_msg, :vscode,
%{op: "room_privacy_change", d: %{roomId: room.id, name: room.name, isPrivate: false}}}
)
_ ->
nil
end
end
@spec make_room_private(any, any) :: nil | :ok
def make_room_private(user_id, new_name) do
# this needs to be refactored if a user can have multiple rooms
case Rooms.set_room_privacy_by_creator_id(user_id, true, new_name) do
{1, [room]} ->
Onion.RoomSession.send_cast(
room.id,
{:send_ws_msg, :vscode,
%{op: "room_privacy_change", d: %{roomId: room.id, name: room.name, isPrivate: true}}}
)
_ ->
nil
end
end
def invite_to_room(user_id, user_id_to_invite) do
user = Beef.Users.get_by_id(user_id)
if not is_nil(user.currentRoomId) and
Follows.following_me?(user_id, user_id_to_invite) do
# @todo store room name in RoomSession to avoid db lookups
room = Rooms.get_room_by_id(user.currentRoomId)
if not is_nil(room) do
Onion.RoomSession.send_cast(
user.currentRoomId,
{:create_invite, user_id_to_invite,
%{
roomName: room.name,
displayName: user.displayName,
username: user.username,
avatarUrl: user.avatarUrl,
type: "invite"
}}
)
end
end
end
def block_from_room(user_id, user_id_to_block_from_room) do
with {status, room} when status in [:creator, :mod] <-
Rooms.get_room_status(user_id) do
if room.creatorId != user_id_to_block_from_room do
RoomBlocks.insert(%{
modId: user_id,
userId: user_id_to_block_from_room,
roomId: room.id
})
user_blocked = Beef.Users.get_by_id(user_id_to_block_from_room)
if user_blocked.currentRoomId == room.id do
leave_room(user_id_to_block_from_room, user_blocked.currentRoomId, true)
end
end
end
end
defp internal_set_listener(user_id_to_make_listener, room_id) do
RoomPermissions.make_listener(user_id_to_make_listener, room_id)
Kousa.Utils.RegUtils.lookup_and_cast(
Onion.RoomSession,
room_id,
{:speaker_removed, user_id_to_make_listener}
)
end
def set_listener(user_id, user_id_to_set_listener) do
if user_id == user_id_to_set_listener do
internal_set_listener(
user_id_to_set_listener,
Beef.Users.get_current_room_id(user_id_to_set_listener)
)
else
{status, room} = Rooms.get_room_status(user_id)
      is_creator = not is_nil(room) and user_id_to_set_listener == room.creatorId
if not is_creator and (status == :creator or status == :mod) do
internal_set_listener(
user_id_to_set_listener,
Beef.Users.get_current_room_id(user_id_to_set_listener)
)
end
end
end
@spec internal_set_speaker(any, any) :: nil | :ok | {:err, {:error, :not_found}}
def internal_set_speaker(user_id_to_make_speaker, room_id) do
with {:ok, _} <-
RoomPermissions.set_speaker?(user_id_to_make_speaker, room_id, true) do
case GenRegistry.lookup(
Onion.RoomSession,
room_id
) do
{:ok, session} ->
GenServer.cast(
session,
{:speaker_added, user_id_to_make_speaker,
Onion.UserSession.send_call!(user_id_to_make_speaker, {:get, :muted})}
)
err ->
{:err, err}
end
end
end
def make_speaker(user_id, user_id_to_make_speaker) do
with {status, room} when status in [:creator, :mod] <-
Rooms.get_room_status(user_id) do
internal_set_speaker(user_id_to_make_speaker, room.id)
end
end
def change_mod(user_id, user_id_to_change, value) when is_boolean(value) do
room = Rooms.get_room_by_creator_id(user_id)
if room do
RoomPermissions.set_is_mod(user_id_to_change, room.id, value)
Kousa.Utils.RegUtils.lookup_and_cast(
Onion.RoomSession,
room.id,
{:send_ws_msg, :vscode,
%{
op: "mod_changed",
d: %{roomId: room.id, userId: user_id_to_change}
}}
)
end
end
def change_room_creator(old_creator_id, new_creator_id, current_room_id \\ nil) do
# get current room id
current_room_id =
if is_nil(current_room_id),
do: Beef.Users.get_current_room_id(new_creator_id),
else: current_room_id
# get old creator's room id for validation
old_creator_room_id = Beef.Users.get_current_room_id(old_creator_id)
# validate
case {is_nil(current_room_id), new_creator_id == old_creator_id,
current_room_id == old_creator_room_id} do
{false, false, true} ->
case Rooms.replace_room_owner(old_creator_id, new_creator_id) do
{1, _} ->
internal_set_speaker(old_creator_id, current_room_id)
Onion.RoomSession.send_cast(
current_room_id,
{:send_ws_msg, :vscode,
%{op: "new_room_creator", d: %{roomId: current_room_id, userId: new_creator_id}}}
)
_ ->
nil
end
_ ->
nil
end
end
def join_vc_room(user_id, room, speaker? \\ nil) do
speaker? =
if is_nil(speaker?),
do:
room.creatorId == user_id or
RoomPermissions.speaker?(user_id, room.id),
else: speaker?
op =
if speaker?,
do: "join-as-speaker",
else: "join-as-new-peer"
Onion.VoiceRabbit.send(room.voiceServerId, %{
op: op,
d: %{roomId: room.id, peerId: user_id},
uid: user_id
})
end
def edit_room(user_id, new_name, new_description, is_private) do
with {:ok, room_id} <- Users.tuple_get_current_room_id(user_id) do
case Rooms.edit(room_id, %{
name: new_name,
description: new_description,
is_private: is_private
}) do
{:ok, _room} ->
RegUtils.lookup_and_cast(
Onion.RoomSession,
room_id,
{:new_room_details, new_name, new_description, is_private}
)
{:error, x} ->
{:error, Kousa.Utils.Errors.changeset_to_first_err_message_with_field_name(x)}
end
end
end
@spec create_room(String.t(), String.t(), String.t(), boolean(), String.t() | nil) ::
{:error, any}
| {:ok, %{room: atom | %{:id => any, :voiceServerId => any, optional(any) => any}}}
def create_room(user_id, room_name, room_description, is_private, user_id_to_invite \\ nil) do
room_id = Users.get_current_room_id(user_id)
if not is_nil(room_id) do
leave_room(user_id, room_id)
end
id = Ecto.UUID.generate()
case Rooms.create(%{
id: id,
name: room_name,
description: room_description,
creatorId: user_id,
numPeopleInside: 1,
voiceServerId: VoiceServerUtils.get_next_voice_server_id(),
isPrivate: is_private
}) do
{:ok, room} ->
{:ok, session} =
GenRegistry.lookup_or_start(
Onion.RoomSession,
id,
[
%{
room_id: id,
voice_server_id: room.voiceServerId
}
]
)
GenServer.cast(
session,
{:join_room_no_fan, user_id, Onion.UserSession.send_call!(user_id, {:get, :muted})}
)
Onion.VoiceRabbit.send(room.voiceServerId, %{
op: "create-room",
d: %{roomId: id},
uid: user_id
})
join_vc_room(user_id, room, true)
if not is_private do
Kousa.Follow.notify_followers_you_created_a_room(user_id, room)
end
if not is_nil(user_id_to_invite) do
Task.start(fn ->
Kousa.Room.invite_to_room(user_id, user_id_to_invite)
end)
end
{:ok, %{room: room}}
{:error, x} ->
{:error, Kousa.Utils.Errors.changeset_to_first_err_message_with_field_name(x)}
end
end
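
  # Illustrative usage sketch, not part of the original module. The names and
  # strings here are hypothetical; the return shape follows the @spec above:
  #
  #     case Kousa.Room.create_room(user_id, "lobby", "general chat", false) do
  #       {:ok, %{room: room}} -> room.id
  #       {:error, message} -> message
  #     end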
def join_room(user_id, room_id) do
currentRoomId = Beef.Users.get_current_room_id(user_id)
if currentRoomId == room_id do
%{room: Rooms.get_room_by_id(room_id)}
else
case Rooms.can_join_room(room_id, user_id) do
{:error, message} ->
%{error: message}
{:ok, room} ->
private_check =
if room.isPrivate do
case Kousa.Utils.RegUtils.lookup_and_call(
Onion.RoomSession,
room.id,
{:redeem_invite, user_id}
) do
{:ok, :error} -> {:err, "the room is private, ask someone inside to invite you"}
{:ok, :ok} -> {:ok}
_ -> {:err, "room session doesn't exist"}
end
else
{:ok}
end
case private_check do
{:err, m} ->
%{error: m}
_ ->
if currentRoomId do
leave_room(user_id, currentRoomId)
end
updated_user = Rooms.join_room(room, user_id)
Onion.RoomSession.send_cast(
room_id,
{:join_room, updated_user, Onion.UserSession.send_call!(user_id, {:get, :muted})}
)
          canSpeak = match?(%{roomPermissions: %{isSpeaker: true}}, updated_user)
join_vc_room(user_id, room, canSpeak || room.isPrivate)
%{room: room}
end
end
end
end
def leave_room(user_id, current_room_id \\ nil, blocked \\ false) do
current_room_id =
if is_nil(current_room_id),
do: Beef.Users.get_current_room_id(user_id),
else: current_room_id
if current_room_id do
case Rooms.leave_room(user_id, current_room_id) do
# the room should be destroyed
{:bye, room} ->
Onion.RoomSession.send_cast(current_room_id, {:destroy, user_id})
Onion.VoiceRabbit.send(room.voiceServerId, %{
op: "destroy-room",
uid: user_id,
d: %{peerId: user_id, roomId: current_room_id}
})
# the room stays alive with new room creator
x ->
case x do
{:new_creator_id, creator_id} ->
Onion.RoomSession.send_cast(
current_room_id,
{:send_ws_msg, :vscode,
%{op: "new_room_creator", d: %{roomId: current_room_id, userId: creator_id}}}
)
_ ->
nil
end
Onion.RoomSession.send_cast(
current_room_id,
{:leave_room, user_id}
)
end
Onion.UserSession.send_cast(
user_id,
{:send_ws_msg, :web,
%{op: "you_left_room", d: %{roomId: current_room_id, blocked: blocked}}}
)
end
end
end
| 29.065728 | 98 | 0.587223 |
03f0e5b988f7d1a330776853bd9eecbff15d6c2b | 1,682 | ex | Elixir | clients/cloud_search/lib/google_api/cloud_search/v1/model/metaline.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/cloud_search/lib/google_api/cloud_search/v1/model/metaline.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/cloud_search/lib/google_api/cloud_search/v1/model/metaline.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudSearch.V1.Model.Metaline do
@moduledoc """
A metaline is a list of properties that are displayed along with the search
result to provide context.
## Attributes
* `properties` (*type:* `list(GoogleApi.CloudSearch.V1.Model.DisplayedProperty.t)`, *default:* `nil`) - The list of displayed properties for the metaline. The maximum number of
properties is 5.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:properties => list(GoogleApi.CloudSearch.V1.Model.DisplayedProperty.t())
}
field(:properties, as: GoogleApi.CloudSearch.V1.Model.DisplayedProperty, type: :list)
end
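
# Illustrative decode sketch, not part of the generated file. The JSON sample
# is hypothetical; the module name comes from this file. Poison's `as:` option
# decodes into the struct, and the Decoder defimpl below post-processes fields:
#
#     ~s({"properties": []})
#     |> Poison.decode!(as: %GoogleApi.CloudSearch.V1.Model.Metaline{})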
defimpl Poison.Decoder, for: GoogleApi.CloudSearch.V1.Model.Metaline do
def decode(value, options) do
GoogleApi.CloudSearch.V1.Model.Metaline.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudSearch.V1.Model.Metaline do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 34.326531 | 180 | 0.74673 |
03f10cf4fc3d4a90b3dd8b24ab3fd7b1e121a50e | 1,721 | ex | Elixir | clients/analytics/lib/google_api/analytics/v3/model/goal_visit_time_on_site_details.ex | mocknen/elixir-google-api | dac4877b5da2694eca6a0b07b3bd0e179e5f3b70 | ["Apache-2.0"] | null | null | null | clients/analytics/lib/google_api/analytics/v3/model/goal_visit_time_on_site_details.ex | mocknen/elixir-google-api | dac4877b5da2694eca6a0b07b3bd0e179e5f3b70 | ["Apache-2.0"] | null | null | null | clients/analytics/lib/google_api/analytics/v3/model/goal_visit_time_on_site_details.ex | mocknen/elixir-google-api | dac4877b5da2694eca6a0b07b3bd0e179e5f3b70 | ["Apache-2.0"] | null | null | null |
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.Analytics.V3.Model.GoalVisitTimeOnSiteDetails do
@moduledoc """
Details for the goal of the type VISIT_TIME_ON_SITE.
## Attributes
- comparisonType (String.t): Type of comparison. Possible values are LESS_THAN or GREATER_THAN. Defaults to: `null`.
- comparisonValue (String.t): Value used for this comparison. Defaults to: `null`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:comparisonType => any(),
:comparisonValue => any()
}
field(:comparisonType)
field(:comparisonValue)
end
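
# Illustrative sketch, not part of the generated file. A goal matching visits
# longer than 300 seconds could be represented as below (values hypothetical,
# field names from this module):
#
#     %GoogleApi.Analytics.V3.Model.GoalVisitTimeOnSiteDetails{
#       comparisonType: "GREATER_THAN",
#       comparisonValue: "300"
#     }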
defimpl Poison.Decoder, for: GoogleApi.Analytics.V3.Model.GoalVisitTimeOnSiteDetails do
def decode(value, options) do
GoogleApi.Analytics.V3.Model.GoalVisitTimeOnSiteDetails.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Analytics.V3.Model.GoalVisitTimeOnSiteDetails do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 33.745098 | 118 | 0.750726 |
03f157f165eb65c37d03aee59c64a7b108902876 | 2,012 | ex | Elixir | lib/ex_aws/credentials_ini.ex | Frameio/ex_aws | 3b335b6ed7932b5cf991323d26cf5497e1e6c122 | ["Unlicense", "MIT"] | null | null | null | lib/ex_aws/credentials_ini.ex | Frameio/ex_aws | 3b335b6ed7932b5cf991323d26cf5497e1e6c122 | ["Unlicense", "MIT"] | null | null | null | lib/ex_aws/credentials_ini.ex | Frameio/ex_aws | 3b335b6ed7932b5cf991323d26cf5497e1e6c122 | ["Unlicense", "MIT"] | null | null | null |
if Code.ensure_loaded?(ConfigParser) do
defmodule ExAws.CredentialsIni do
def security_credentials(profile_name) do
shared_credentials = profile_from_shared_credentials(profile_name)
config_credentials = profile_from_config(profile_name)
Map.merge(config_credentials, shared_credentials)
end
def parse_ini_file({:ok, contents}, profile_name) do
contents
|> ConfigParser.parse_string
|> case do
{:ok, %{^profile_name => config}} ->
strip_key_prefix(config)
{:ok, %{}} ->
%{}
_ ->
%{}
end
end
def parse_ini_file(_, _), do: %{}
def strip_key_prefix(credentials) do
credentials
|> Map.take(~w(aws_access_key_id aws_secret_access_key aws_session_token region))
|> Map.new(fn({key, val}) ->
updated_key =
key
|> String.replace_leading("aws_", "")
|> String.to_existing_atom
{updated_key, val}
end)
end
def replace_token_key(credentials) do
case Map.pop(credentials, :session_token) do
{nil, credentials} ->
credentials
{token, credentials} ->
Map.put(credentials, :security_token, token)
end
end
defp profile_from_shared_credentials(profile_name) do
System.user_home
|> Path.join(".aws/credentials")
|> File.read
|> parse_ini_file(profile_name)
|> replace_token_key
end
defp profile_from_config(profile_name) do
section = case profile_name do
"default" -> "default"
other -> "profile #{other}"
end
System.user_home
|> Path.join(".aws/config")
|> File.read
|> parse_ini_file(section)
end
end
else
defmodule ExAws.CredentialsIni do
def security_credentials(_), do: raise "ConfigParser required to use"
def parse_ini_file(_, _), do: raise "ConfigParser required to use"
def replace_token_key(_), do: raise "ConfigParser required to use"
end
end
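
# Illustrative only, not part of the source. Given a ~/.aws/credentials file
# such as
#
#     [default]
#     aws_access_key_id = AKIAEXAMPLE
#     aws_secret_access_key = wJalrEXAMPLEKEY
#
# ExAws.CredentialsIni.security_credentials("default") would return a map like
# %{access_key_id: "AKIAEXAMPLE", secret_access_key: "wJalrEXAMPLEKEY"}: the
# aws_ prefix is stripped by strip_key_prefix/1, and a session token, if any,
# is renamed to :security_token by replace_token_key/1.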
| 27.561644 | 87 | 0.631213 |
03f1999d1e7b61edb83066f6563ebcba4e60b7f2 | 206 | ex | Elixir | web/queries/order_billing_address.ex | harry-gao/ex-cart | 573e7f977bb3b710d11618dd215d4ddd8f819fb3 | ["Apache-2.0"] | 356 | 2016-03-16T12:37:28.000Z | 2021-12-18T03:22:39.000Z | web/queries/order_billing_address.ex | harry-gao/ex-cart | 573e7f977bb3b710d11618dd215d4ddd8f819fb3 | ["Apache-2.0"] | 30 | 2016-03-16T09:19:10.000Z | 2021-01-12T08:10:52.000Z | web/queries/order_billing_address.ex | harry-gao/ex-cart | 573e7f977bb3b710d11618dd215d4ddd8f819fb3 | ["Apache-2.0"] | 72 | 2016-03-16T13:32:14.000Z | 2021-03-23T11:27:43.000Z |
defmodule Nectar.Query.OrderBillingAddress do
use Nectar.Query, model: Nectar.OrderBillingAddress
def for_order(order),
do: from o in Nectar.OrderBillingAddress, where: o.order_id == ^order.id
end
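
# Hypothetical usage, not part of this file (the Repo module name is assumed):
#
#     order
#     |> Nectar.Query.OrderBillingAddress.for_order()
#     |> Nectar.Repo.one()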
| 29.428571 | 76 | 0.776699 |
03f1a7b1ffe2e4285775c5a0e69bf38d3360d51f | 3,877 | ex | Elixir | apps/rig_outbound_gateway/lib/rig_outbound_gateway/kinesis/java_client.ex | maxglassie/reactive-interaction-gateway | 36b68fc75c71b9b4c3b6bd70fb11900c67172137 | ["Apache-2.0"] | null | null | null | apps/rig_outbound_gateway/lib/rig_outbound_gateway/kinesis/java_client.ex | maxglassie/reactive-interaction-gateway | 36b68fc75c71b9b4c3b6bd70fb11900c67172137 | ["Apache-2.0"] | null | null | null | apps/rig_outbound_gateway/lib/rig_outbound_gateway/kinesis/java_client.ex | maxglassie/reactive-interaction-gateway | 36b68fc75c71b9b4c3b6bd70fb11900c67172137 | ["Apache-2.0"] | null | null | null |
defmodule RigOutboundGateway.Kinesis.JavaClient do
@moduledoc """
Manages the external Java-based Kinesis client application.
In Java land this would've been named AmazonKinesisJavaClientManager.
"""
use Rig.Config, :custom_validation
use GenServer
require Logger
alias Rig.EventFilter
alias RigOutboundGateway.Kinesis.LogStream
@jinterface_version "1.8.1"
@restart_delay_ms 20_000
def start_link(opts) do
GenServer.start_link(__MODULE__, :ok, opts)
end
# Confex callback
defp validate_config!(config) do
# checking that the files actually exists is deferred to init (see check_paths/0),
# as System.cwd doesn't point to the umbrella root at this point.
otp_jar =
case Keyword.get(config, :otp_jar) do
nil ->
Path.join(:code.root_dir(), "lib/jinterface-#{@jinterface_version}/priv/OtpErlang.jar")
val ->
val
end
%{
enabled?: Keyword.fetch!(config, :enabled?),
client_jar: Keyword.fetch!(config, :client_jar),
otp_jar: otp_jar,
log_level: Keyword.fetch!(config, :log_level) || "",
kinesis_app_name: Keyword.fetch!(config, :kinesis_app_name),
kinesis_aws_region: Keyword.fetch!(config, :kinesis_aws_region),
kinesis_stream: Keyword.fetch!(config, :kinesis_stream),
kinesis_endpoint: Keyword.fetch!(config, :kinesis_endpoint),
dynamodb_endpoint: Keyword.fetch!(config, :dynamodb_endpoint)
}
end
@impl GenServer
def init(:ok) do
conf = config()
if conf.enabled? do
      # make sure the JInterface Jar file exists; if it doesn't, the match
      # below fails with a MatchError that carries the explanatory message
      true =
        File.exists?(conf.otp_jar) ||
          "Could not find OtpErlang.jar for JInterface #{@jinterface_version} at #{conf.otp_jar}. Does your Erlang distribution come with Java support enabled?"
send(self(), :run_java_client)
end
{:ok, %{}}
end
@doc """
Starts (and awaits) the Java-client for Amazon Kinesis.
  The process's stdout is not parsed for messages; it is only piped into RIG's
  log (see `LogStream`). Instead of using stdout to receive Kinesis messages
  from the Java client, the Java code uses JInterface to call back into this
  module directly (see `java_client_callback/1`). This ensures that message
  boundaries are kept (think newlines in messages) and that console log output
  doesn't interfere.
  """
@impl GenServer
def handle_info(:run_java_client, state) do
conf = config()
Logger.debug(fn -> "Starting Java-client for Kinesis.." end)
env = [
RIG_ERLANG_NAME: :erlang.node() |> Atom.to_string(),
RIG_ERLANG_COOKIE: :erlang.get_cookie() |> Atom.to_string(),
LOG_LEVEL: conf.log_level,
KINESIS_APP_NAME: conf.kinesis_app_name,
KINESIS_AWS_REGION: conf.kinesis_aws_region,
KINESIS_STREAM: conf.kinesis_stream,
KINESIS_ENDPOINT: conf.kinesis_endpoint,
KINESIS_DYNAMODB_ENDPOINT: conf.dynamodb_endpoint
]
# LogStream is used to pipe the Java logging output to RIG's logging output.
%Porcelain.Result{status: status} =
Porcelain.exec("java", java_args(), out: %LogStream{}, err: :out, env: env)
Logger.warn(fn ->
"Java-client for Kinesis is dead (exit code #{status}; restart in #{@restart_delay_ms} ms)."
end)
Process.send_after(self(), :run_java_client, @restart_delay_ms)
{:noreply, state}
end
defp java_args do
conf = config()
args = [
"-Djava.util.logging.SimpleFormatter.format=%4$s: %5$s%n",
"-Dexecutor=Elixir.RigOutboundGateway.Kinesis.JavaClient",
"-Dclient_name=kinesis-client",
"-cp",
"#{conf.client_jar}:#{conf.otp_jar}",
"com.accenture.rig.App"
]
Logger.info(fn -> "Exec: java #{Enum.join(args, " ")}" end)
args
end
@spec java_client_callback(data :: [{atom(), String.t()}, ...]) ::
:ok
def java_client_callback(data) do
data[:body]
|> Poison.decode!()
|> EventFilter.forward_event()
:ok
end
end
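
# Sketch of the callback contract; the payload below is hypothetical. The Java
# side RPCs java_client_callback/1 with a keyword list whose :body entry holds
# a raw Kinesis record as JSON, which is decoded and forwarded to the event
# filter:
#
#     RigOutboundGateway.Kinesis.JavaClient.java_client_callback(
#       body: ~s({"type": "greeting", "source": "tutorial"})
#     )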
| 30.769841 | 160 | 0.677844 |