## Source: clients/dfa_reporting/lib/google_api/dfa_reporting/v35/model/order.ex
## Repos: renovate-bot/elixir-google-api, swansoffiee/elixir-google-api, dazuma/elixir-google-api
## License: Apache-2.0

# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DFAReporting.V35.Model.Order do
@moduledoc """
Describes properties of a Planning order.
## Attributes
* `accountId` (*type:* `String.t`, *default:* `nil`) - Account ID of this order.
* `advertiserId` (*type:* `String.t`, *default:* `nil`) - Advertiser ID of this order.
* `approverUserProfileIds` (*type:* `list(String.t)`, *default:* `nil`) - IDs for users that have to approve documents created for this order.
* `buyerInvoiceId` (*type:* `String.t`, *default:* `nil`) - Buyer invoice ID associated with this order.
* `buyerOrganizationName` (*type:* `String.t`, *default:* `nil`) - Name of the buyer organization.
* `comments` (*type:* `String.t`, *default:* `nil`) - Comments in this order.
* `contacts` (*type:* `list(GoogleApi.DFAReporting.V35.Model.OrderContact.t)`, *default:* `nil`) - Contacts for this order.
* `id` (*type:* `String.t`, *default:* `nil`) - ID of this order. This is a read-only, auto-generated field.
* `kind` (*type:* `String.t`, *default:* `nil`) - Identifies what kind of resource this is. Value: the fixed string "dfareporting#order".
* `lastModifiedInfo` (*type:* `GoogleApi.DFAReporting.V35.Model.LastModifiedInfo.t`, *default:* `nil`) - Information about the most recent modification of this order.
* `name` (*type:* `String.t`, *default:* `nil`) - Name of this order.
* `notes` (*type:* `String.t`, *default:* `nil`) - Notes of this order.
* `planningTermId` (*type:* `String.t`, *default:* `nil`) - ID of the terms and conditions template used in this order.
* `projectId` (*type:* `String.t`, *default:* `nil`) - Project ID of this order.
* `sellerOrderId` (*type:* `String.t`, *default:* `nil`) - Seller order ID associated with this order.
* `sellerOrganizationName` (*type:* `String.t`, *default:* `nil`) - Name of the seller organization.
* `siteId` (*type:* `list(String.t)`, *default:* `nil`) - Site IDs this order is associated with.
* `siteNames` (*type:* `list(String.t)`, *default:* `nil`) - Free-form site names this order is associated with.
* `subaccountId` (*type:* `String.t`, *default:* `nil`) - Subaccount ID of this order.
* `termsAndConditions` (*type:* `String.t`, *default:* `nil`) - Terms and conditions of this order.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:accountId => String.t() | nil,
:advertiserId => String.t() | nil,
:approverUserProfileIds => list(String.t()) | nil,
:buyerInvoiceId => String.t() | nil,
:buyerOrganizationName => String.t() | nil,
:comments => String.t() | nil,
:contacts => list(GoogleApi.DFAReporting.V35.Model.OrderContact.t()) | nil,
:id => String.t() | nil,
:kind => String.t() | nil,
:lastModifiedInfo => GoogleApi.DFAReporting.V35.Model.LastModifiedInfo.t() | nil,
:name => String.t() | nil,
:notes => String.t() | nil,
:planningTermId => String.t() | nil,
:projectId => String.t() | nil,
:sellerOrderId => String.t() | nil,
:sellerOrganizationName => String.t() | nil,
:siteId => list(String.t()) | nil,
:siteNames => list(String.t()) | nil,
:subaccountId => String.t() | nil,
:termsAndConditions => String.t() | nil
}
field(:accountId)
field(:advertiserId)
field(:approverUserProfileIds, type: :list)
field(:buyerInvoiceId)
field(:buyerOrganizationName)
field(:comments)
field(:contacts, as: GoogleApi.DFAReporting.V35.Model.OrderContact, type: :list)
field(:id)
field(:kind)
field(:lastModifiedInfo, as: GoogleApi.DFAReporting.V35.Model.LastModifiedInfo)
field(:name)
field(:notes)
field(:planningTermId)
field(:projectId)
field(:sellerOrderId)
field(:sellerOrganizationName)
field(:siteId, type: :list)
field(:siteNames, type: :list)
field(:subaccountId)
field(:termsAndConditions)
end
defimpl Poison.Decoder, for: GoogleApi.DFAReporting.V35.Model.Order do
def decode(value, options) do
GoogleApi.DFAReporting.V35.Model.Order.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DFAReporting.V35.Model.Order do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
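
Since this model is consumed through `Poison` (per the `defimpl`s above), a short decoding sketch may help. The JSON payload and the variable names are illustrative, and the sketch assumes the client's `poison` dependency is available:

```elixir
# Hypothetical usage of the generated model (not part of the generated file):
# decode a JSON response body into an Order struct. Poison uses the field/3
# definitions above to map JSON keys onto struct fields.
json = ~s({"id": "12345", "name": "Q3 campaign", "kind": "dfareporting#order"})

order = Poison.decode!(json, as: %GoogleApi.DFAReporting.V35.Model.Order{})
# order.name is expected to be "Q3 campaign"
```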
## Source: lib/absinthe/middleware/telemetry.ex
## Repo: pulkit110/absinthe
## License: MIT

defmodule Absinthe.Middleware.Telemetry do
@moduledoc """
Gather and report telemetry about an individual field resolution
"""
@field_start [:absinthe, :resolve, :field, :start]
@field_stop [:absinthe, :resolve, :field, :stop]
@behaviour Absinthe.Middleware
@impl Absinthe.Middleware
def call(resolution, _) do
id = :erlang.unique_integer()
system_time = System.system_time()
start_time_mono = System.monotonic_time()
:telemetry.execute(
@field_start,
%{system_time: system_time},
%{id: id, telemetry_span_context: id, resolution: resolution}
)
%{
resolution
| middleware:
resolution.middleware ++
[
{{__MODULE__, :on_complete},
%{
id: id,
start_time_mono: start_time_mono,
middleware: resolution.middleware
}}
]
}
end
def on_complete(
%{state: :resolved} = resolution,
%{
id: id,
start_time_mono: start_time_mono,
middleware: middleware
}
) do
end_time_mono = System.monotonic_time()
:telemetry.execute(
@field_stop,
%{duration: end_time_mono - start_time_mono},
%{
id: id,
telemetry_span_context: id,
middleware: middleware,
resolution: resolution
}
)
resolution
end
end
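
A hedged sketch of consuming the events this middleware emits: attaching a `:telemetry` handler that logs field resolution time. The handler id and log format are illustrative choices, not prescribed by Absinthe:

```elixir
# Attach a handler for the [:absinthe, :resolve, :field, :stop] event
# emitted by on_complete/2 above. "log-field-timing" is an arbitrary id.
:telemetry.attach(
  "log-field-timing",
  [:absinthe, :resolve, :field, :stop],
  fn _event, %{duration: duration}, _metadata, _config ->
    # `duration` is in native time units (end_time_mono - start_time_mono),
    # so convert before logging.
    ms = System.convert_time_unit(duration, :native, :millisecond)
    IO.puts("resolved field in #{ms}ms")
  end,
  nil
)
```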
## Source: lib/hexedio/auth/login_attempts/auth_attempts.ex
## Repo: netsudo/hexedio
## License: MIT

defmodule Hexedio.LoginAttempt do
use Agent
alias Hexedio.Auth
@doc """
Starts the LoginAttempt agent to keep track of login attempts
"""
def start_link do
Agent.start_link(fn -> %{} end, name: __MODULE__)
end
@doc """
Get the number of attempts by a username
"""
def get(username) do
Agent.get(__MODULE__, &Map.get(&1, username))
end
@doc """
Adds or increments user when a login attempt is made
"""
def make(username) do
case Auth.get_user(username) do
nil -> {:error, "Incorrect username or password"}
%Auth.User{} -> get(username) |> update(username)
end
end
@doc """
Resets the user attempts back to 0
"""
def reset(username) do
Agent.update(
__MODULE__,
&Map.delete(&1, username)
)
:reset
end
defp update(nil, username) do
Agent.update(
__MODULE__,
&Map.put(&1, username, {1, expiry_date()})
)
end
defp update({i, _} = attempts, username) when i < 5 do
Agent.update(
__MODULE__,
&Map.put(&1, username, set_attempt(attempts))
)
end
defp update({i, %DateTime{} = expiry}, username) when i == 5 do
  case DateTime.compare(expiry, current_time()) do
    # Lockout window has passed: restart the count at 1 (matching
    # set_attempt/1) rather than putting 5 back, which would leave the
    # user with only one attempt per window indefinitely.
    :lt ->
      Agent.update(
        __MODULE__,
        &Map.put(&1, username, {1, expiry_date()})
      )
    :gt ->
      {:error, "Login limit exceeded"}
  end
end
defp set_attempt({attempts, %DateTime{} = expiry}) do
case DateTime.compare(expiry, current_time()) do
:lt -> {1, expiry_date()}
:gt -> {attempts + 1, expiry_date()}
end
end
defp current_time do
{_, datetime} = DateTime.now("Etc/UTC")
datetime
end
defp expiry_date do
DateTime.now("Etc/UTC")
|> elem(1)
# Add 15 minutes to the current date
|> DateTime.add(15*60, :second)
end
end
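
An illustrative call sequence for the agent above, assuming it is started (here directly, in an application it would sit under the supervision tree) and that `"alice"` resolves via `Hexedio.Auth.get_user/1` while `"unknown_user"` does not:

```elixir
{:ok, _pid} = Hexedio.LoginAttempt.start_link()

# Each login attempt records a try; the 6th within 15 minutes is refused.
Hexedio.LoginAttempt.make("alice")        # :ok (attempt 1)
Hexedio.LoginAttempt.get("alice")         # {1, %DateTime{}} - count and expiry
Hexedio.LoginAttempt.reset("alice")       # :reset - e.g. after a successful login
Hexedio.LoginAttempt.make("unknown_user") # {:error, "Incorrect username or password"}
```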
## Source: lib/nostrum/voice/session.ex
## Repo: phereford/nostrum
## License: MIT

defmodule Nostrum.Voice.Session do
@moduledoc false
alias Nostrum.Cache.{ChannelCache, GuildCache}
alias Nostrum.Constants
alias Nostrum.Shard.Stage.Producer
alias Nostrum.Struct.{VoiceState, VoiceWSState}
alias Nostrum.Voice
alias Nostrum.Voice.{Event, Payload}
require Logger
use GenServer
@gateway_qs "/?v=4"
@timeout_connect 10_000
@timeout_ws_upgrade 10_000
def start_link(%VoiceState{} = vs) do
GenServer.start_link(__MODULE__, vs)
end
def init(args) do
{:ok, nil, {:continue, args}}
end
def handle_continue(%VoiceState{} = voice, nil) do
Logger.metadata(
guild: ~s|"#{GuildCache.get!(voice.guild_id).name}"|,
channel: ~s|"#{ChannelCache.get!(voice.channel_id).name}"|
)
[host, port] = String.split(voice.gateway, ":")
gun_opts = %{protocols: [:http], retry: 1_000_000_000, tls_opts: Constants.gun_tls_opts()}
{:ok, worker} = :gun.open(:binary.bin_to_list(host), String.to_integer(port), gun_opts)
{:ok, :http} = :gun.await_up(worker, @timeout_connect)
stream = :gun.ws_upgrade(worker, @gateway_qs)
await_ws_upgrade(worker, stream)
state = %VoiceWSState{
conn_pid: self(),
conn: worker,
guild_id: voice.guild_id,
session: voice.session,
token: voice.token,
gateway: voice.gateway,
stream: stream,
last_heartbeat_ack: DateTime.utc_now(),
heartbeat_ack: true
}
Logger.debug(fn -> "Voice Websocket connection up on worker #{inspect(worker)}" end)
Voice.update_voice(voice.guild_id, session_pid: self())
{:noreply, state}
end
defp await_ws_upgrade(worker, stream) do
receive do
{:gun_upgrade, ^worker, ^stream, [<<"websocket">>], _headers} ->
:ok
{:gun_error, ^worker, ^stream, reason} ->
exit({:ws_upgrade_failed, reason})
after
@timeout_ws_upgrade ->
Logger.error("Voice Websocket upgrade failed after #{@timeout_ws_upgrade / 1000} seconds")
exit(:timeout)
end
end
def get_ws_state(pid) do
GenServer.call(pid, :ws_state)
end
def close_connection(pid) do
GenServer.cast(pid, :close)
end
def set_speaking(pid, speaking, timed_out \\ false) do
GenServer.cast(pid, {:speaking, speaking, timed_out})
end
def on_voice_ready(pid) do
GenServer.cast(pid, :voice_ready)
end
def handle_info({:gun_ws, _worker, stream, {:text, frame}}, state) do
# Jason.decode calls iodata_to_binary internally
payload = Jason.decode!(frame)
from_handle =
payload["op"]
|> Constants.atom_from_voice_opcode()
|> Event.handle(payload, state)
case from_handle do
{new_state, reply} ->
:ok = :gun.ws_send(state.conn, stream, {:text, reply})
{:noreply, new_state}
new_state ->
{:noreply, new_state}
end
end
def handle_info({:gun_ws, _conn, _stream, {:close, errno, reason}}, state) do
Logger.info("Voice websocket closed (errno #{errno}, reason #{inspect(reason)})")
    # If we received an errno of 4006, the session is no longer valid, so we must get a new session.
if errno == 4006 do
%VoiceState{channel_id: chan, self_mute: mute, self_deaf: deaf} =
Voice.get_voice(state.guild_id)
Voice.leave_channel(state.guild_id)
Voice.join_channel(state.guild_id, chan, mute, deaf)
end
{:noreply, state}
end
def handle_info(
{:gun_down, _conn, _proto, _reason, _killed_streams, _unprocessed_streams},
state
) do
# Try to cancel the internal timer, but
# do not explode if it was already cancelled.
:timer.cancel(state.heartbeat_ref)
{:noreply, state}
end
def handle_info({:gun_up, worker, _proto}, state) do
stream = :gun.ws_upgrade(worker, @gateway_qs)
await_ws_upgrade(worker, stream)
Logger.warn("Reconnected after connection broke")
{:noreply, %{state | heartbeat_ack: true}}
end
def handle_cast(:heartbeat, %{heartbeat_ack: false, heartbeat_ref: timer_ref} = state) do
Logger.warn("heartbeat_ack not received in time, disconnecting")
{:ok, :cancel} = :timer.cancel(timer_ref)
:gun.ws_send(state.conn, state.stream, :close)
{:noreply, state}
end
def handle_cast(:heartbeat, state) do
{:ok, ref} =
:timer.apply_after(state.heartbeat_interval |> trunc, :gen_server, :cast, [
state.conn_pid,
:heartbeat
])
:ok = :gun.ws_send(state.conn, state.stream, {:text, Payload.heartbeat_payload()})
{:noreply,
%{state | heartbeat_ref: ref, heartbeat_ack: false, last_heartbeat_send: DateTime.utc_now()}}
end
def handle_cast(:voice_ready, state) do
voice = Voice.get_voice(state.guild_id)
voice_ready = Payload.voice_ready_payload(voice)
Producer.notify(Producer, voice_ready, state)
{:noreply, state}
end
def handle_cast({:speaking, speaking, timed_out}, state) do
voice = Voice.update_voice(state.guild_id, speaking: speaking)
speaking_update = Payload.speaking_update_payload(voice, timed_out)
payload = Payload.speaking_payload(voice)
Producer.notify(Producer, speaking_update, state)
:ok = :gun.ws_send(state.conn, state.stream, {:text, payload})
{:noreply, state}
end
def handle_cast(:close, state) do
:gun.close(state.conn)
{:noreply, state}
end
def handle_call(:ws_state, _from, state) do
{:reply, state, state}
end
end
## Source: lib/ash/api/api.ex
## Repo: savish/ash
## License: MIT

defmodule Ash.Api do
@moduledoc """
An Api allows you to interact with your resources, and holds non-resource-specific configuration.
  For example, the JSON API extension adds Api-level configuration that lets you toggle authorization on/off
  for all resources in that Api. You include resources in an Api like so:
```elixir
defmodule MyApp.Api do
use Ash.Api
resources do
resource OneResource
resource SecondResource
end
end
```
Then you can interact through that Api with the actions that those resources expose.
For example: `MyApp.Api.create(changeset)`, or `MyApp.Api.read(query)`. Corresponding
actions must be defined in your resources in order to call them through the Api.
## Interface
The functions documented here can be used to call any action on any resource in the Api.
For example, `MyApi.read(Myresource, [...])`.
Additionally, you can define a `code_interface` on each resource to be exposed in the Api module.
See the resource DSL documentation for more.
"""
import Ash.OptionsHelpers, only: [merge_schemas: 3]
alias Ash.Actions.{Create, Destroy, Read, Update}
alias Ash.Error.Invalid.{
NoPrimaryAction,
NoSuchAction,
NoSuchResource
}
require Ash.Query
@type t() :: module
@type page_request ::
:next | :prev | :first | :last | integer
@global_opts [
verbose?: [
type: :boolean,
default: false,
doc: "Log engine operations (very verbose!)"
],
action: [
type: :any,
doc: "The action to use, either an Action struct or the name of the action"
],
authorize?: [
type: {:in, [true, false, nil]},
doc:
"If an actor option is provided (even if it is `nil`), authorization happens automatically. If not, this flag can be used to authorize with no user."
],
stacktraces?: [
type: :boolean,
default: false,
doc:
"For Ash errors, can be set to true to get a stacktrace for each error that occured. See the error_handling guide for more."
],
actor: [
type: :any,
doc:
"If an actor is provided, it will be used in conjunction with the authorizers of a resource to authorize access"
]
]
@read_opts_schema merge_schemas(
[
page: [
doc:
"Nested pagination options, see the section on pagination for more",
type: {:custom, __MODULE__, :page_opts, []}
],
return_query?: [
type: :boolean,
doc: """
If `true`, the query that was ultimately used is returned as a third tuple element.
The query goes through many potential changes during a request, potentially adding
authorization filters, or replacing relationships for other data layers with their
corresponding ids. This option can be used to get the true query that was sent to
the data layer.
""",
default: false
]
],
@global_opts,
"Global Options"
)
@doc false
def read_opts_schema, do: @read_opts_schema
@offset_page_opts [
offset: [
type: :non_neg_integer,
doc: "The number of records to skip from the beginning of the query"
],
limit: [
type: :pos_integer,
doc: "The number of records to include in the page"
],
filter: [
type: :any,
doc: """
A filter to apply for pagination purposes, that should not be considered in the full count.
This is used by the liveview paginator to only fetch the records that were *already* on the
page when refreshing data, to avoid pages jittering.
"""
],
count: [
type: :boolean,
doc: "Whether or not to return the page with a full count of all records"
]
]
@keyset_page_opts [
before: [
type: :string,
doc: "Get records that appear before the provided keyset (mutually exclusive with `after`)"
],
after: [
type: :string,
doc: "Get records that appear after the provided keyset (mutually exclusive with `before`)"
],
limit: [
type: :pos_integer,
doc: "How many records to include in the page"
],
filter: [
type: :any,
doc: "See the `filter` option for offset pagination, this behaves the same."
],
count: [
type: :boolean,
doc: "Whether or not to return the page with a full count of all records"
]
]
@doc false
def page_opts(page_opts) do
if page_opts == false do
{:ok, false}
else
if page_opts[:after] || page_opts[:before] do
validate_or_error(page_opts, @keyset_page_opts)
else
if page_opts[:offset] do
validate_or_error(page_opts, @offset_page_opts)
else
validate_or_error(page_opts, @keyset_page_opts)
end
end
end
end
defp validate_or_error(opts, schema) do
case Ash.OptionsHelpers.validate(opts, schema) do
{:ok, value} -> {:ok, value}
{:error, error} -> {:error, Exception.message(error)}
end
end
@load_opts_schema merge_schemas([], @global_opts, "Global Options")
@get_opts_schema [
load: [
type: :any,
doc: "Fields or relationships to load in the query. See `Ash.Query.load/2`"
],
tenant: [
type: :any,
doc: "The tenant to set on the query being run"
],
action: [
type: :atom,
doc: "The action to use for reading the data"
],
context: [
type: :any,
doc: "Context to be set on the query being run"
]
]
|> merge_schemas(@global_opts, "Global Options")
@shared_created_update_and_destroy_opts_schema [
return_notifications?: [
type: :boolean,
default: false,
doc: """
    Use this if you're running Ash actions inside your own transaction and still want notifications to be sent.
If a transaction is ongoing, and this is false, notifications will be discarded, otherwise
the return value is `{:ok, result, notifications}` (or `{:ok, notifications}`)
To send notifications later, use `Ash.Notifier.notify(notifications)`. It sends any notifications
that can be sent, and returns the rest.
"""
]
]
@create_opts_schema [
upsert?: [
type: :boolean,
default: false,
doc:
"If a conflict is found based on the primary key, the record is updated in the database (requires upsert support)"
]
]
|> merge_schemas(@global_opts, "Global Options")
|> merge_schemas(
@shared_created_update_and_destroy_opts_schema,
"Shared create/update/destroy Options"
)
@doc false
def create_opts_schema, do: @create_opts_schema
@update_opts_schema []
|> merge_schemas(@global_opts, "Global Options")
|> merge_schemas(
@shared_created_update_and_destroy_opts_schema,
"Shared create/update/destroy Options"
)
@doc false
def update_opts_schema, do: @update_opts_schema
@destroy_opts_schema []
|> merge_schemas(@global_opts, "Global Opts")
|> merge_schemas(
@shared_created_update_and_destroy_opts_schema,
"Shared create/update/destroy Options"
)
def destroy_opts_schema, do: @destroy_opts_schema
@doc """
Get a record by a primary key. See `c:get/3` for more.
"""
@callback get!(
resource :: Ash.Resource.t(),
id_or_filter :: term(),
params :: Keyword.t()
) ::
Ash.Resource.record() | no_return
@doc """
Get a record by a primary key.
For a resource with a composite primary key, pass a keyword list, e.g
`MyApi.get(MyResource, first_key: 1, second_key: 2)`
#{Ash.OptionsHelpers.docs(@get_opts_schema)}
"""
@callback get(
resource :: Ash.Resource.t(),
id_or_filter :: term(),
params :: Keyword.t()
) ::
{:ok, Ash.Resource.record()} | {:error, term}
@doc """
Run an ash query, raising on more than one result. See `c:read_one/2` for more.
"""
@callback read_one!(Ash.Query.t() | Ash.Resource.t(), params :: Keyword.t()) ::
Ash.Resource.record() | {Ash.Resource.record(), Ash.Query.t()} | no_return
@doc """
Run a query on a resource, but fail on more than one result
This is useful if you have a query that doesn't include a primary key
but you know that it will only ever return a single result
"""
@callback read_one(Ash.Query.t() | Ash.Resource.t(), params :: Keyword.t()) ::
{:ok, Ash.Resource.record()}
| {:ok, Ash.Resource.record(), Ash.Query.t()}
| {:error, term}
@doc """
Run an ash query. See `c:read/2` for more.
"""
@callback read!(Ash.Query.t() | Ash.Resource.t(), params :: Keyword.t()) ::
list(Ash.Resource.record())
| {list(Ash.Resource.record()), Ash.Query.t()}
| no_return
@doc """
Run a query on a resource.
For more information, on building a query, see `Ash.Query`.
#{Ash.OptionsHelpers.docs(@read_opts_schema)}
## Pagination
#### Limit/offset pagination
#{Ash.OptionsHelpers.docs(@offset_page_opts)}
#### Keyset pagination
#{Ash.OptionsHelpers.docs(@keyset_page_opts)}
"""
@callback read(Ash.Query.t(), params :: Keyword.t()) ::
{:ok, list(Ash.Resource.record())}
| {:ok, list(Ash.Resource.record()), Ash.Query.t()}
| {:error, term}
@doc """
Fetch a page relative to the provided page.
"""
@callback page!(Ash.Page.page(), page_request) ::
Ash.Page.page() | no_return
@doc """
Fetch a page relative to the provided page.
A page is the return value of a paginated action called via `c:read/2`.
"""
@callback page(Ash.Page.page(), page_request) ::
{:ok, Ash.Page.page()} | {:error, term}
@doc """
Load fields or relationships on already fetched records. See `c:load/3` for more information.
"""
@callback load!(
record_or_records :: Ash.Resource.record() | [Ash.Resource.record()],
query :: Ash.Query.t(),
opts :: Keyword.t()
) ::
Ash.Resource.record() | [Ash.Resource.record()] | no_return
@doc """
Load fields or relationships on already fetched records.
Accepts a list of non-loaded fields and loads them on the provided records or a query, in
which case the loaded fields of the query are used. Relationship loads can be nested, for
example: `MyApi.load(record, [posts: [:comments]])`.
#{Ash.OptionsHelpers.docs(@load_opts_schema)}
"""
@callback load(
record_or_records :: Ash.Resource.record() | [Ash.Resource.record()],
query :: Ash.Query.t(),
opts :: Keyword.t()
) ::
{:ok, Ash.Resource.record() | [Ash.Resource.record()]} | {:error, term}
@doc """
Create a record. See `c:create/2` for more information.
"""
@callback create!(Ash.Changeset.t(), params :: Keyword.t()) ::
Ash.Resource.record() | no_return
@doc """
Create a record.
#{Ash.OptionsHelpers.docs(@create_opts_schema)}
"""
@callback create(Ash.Changeset.t(), params :: Keyword.t()) ::
{:ok, Ash.Resource.record()} | {:error, term}
@doc """
Update a record. See `c:update/2` for more information.
"""
@callback update!(Ash.Changeset.t(), params :: Keyword.t()) ::
Ash.Resource.record() | no_return
@doc """
Update a record.
#{Ash.OptionsHelpers.docs(@update_opts_schema)}
"""
@callback update(Ash.Changeset.t(), params :: Keyword.t()) ::
{:ok, Ash.Resource.record()} | {:error, term}
@doc """
Destroy a record. See `c:destroy/2` for more information.
"""
@callback destroy!(Ash.Changeset.t() | Ash.Resource.record(), params :: Keyword.t()) ::
:ok | no_return
@doc """
Destroy a record.
#{Ash.OptionsHelpers.docs(@destroy_opts_schema)}
"""
@callback destroy(Ash.Changeset.t() | Ash.Resource.record(), params :: Keyword.t()) ::
:ok | {:error, term}
@doc """
Refetches a record by primary key. See `c:reload/1` for more.
"""
@callback reload!(record :: Ash.Resource.record(), params :: Keyword.t()) ::
Ash.Resource.record() | no_return
@doc """
Refetches a record by primary key.
"""
@callback reload(record :: Ash.Resource.record()) ::
{:ok, Ash.Resource.record()} | {:error, term}
alias Ash.Dsl.Extension
use Ash.Dsl, default_extensions: [extensions: [Ash.Api.Dsl]]
def handle_opts(_) do
quote do
@behaviour Ash.Api
end
end
def handle_before_compile(_) do
quote do
use Ash.Api.Interface
end
end
alias Ash.Dsl.Extension
def resource(api, resource) do
api
|> resource_references()
|> Enum.find(&(&1.resource == resource || &1.as == resource))
|> case do
nil -> {:error, NoSuchResource.exception(resource: resource)}
reference -> {:ok, reference.resource}
end
end
@spec resources(Ash.Api.t()) :: [Ash.Resource.t()]
def resources(api) do
api
|> Extension.get_entities([:resources])
|> Enum.map(& &1.resource)
end
@spec resource_references(Ash.Api.t()) :: [Ash.Api.ResourceReference.t()]
def resource_references(api) do
Extension.get_entities(api, [:resources])
end
@doc false
@spec get!(Ash.Api.t(), Ash.Resource.t(), term(), Keyword.t()) ::
Ash.Resource.record() | no_return
def get!(api, resource, id, opts \\ []) do
opts = Ash.OptionsHelpers.validate!(opts, @get_opts_schema)
api
|> get(resource, id, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec get(Ash.Api.t(), Ash.Resource.t(), term(), Keyword.t()) ::
{:ok, Ash.Resource.record() | nil} | {:error, term}
def get(api, resource, id, opts) do
with {:ok, opts} <- Ash.OptionsHelpers.validate(opts, @get_opts_schema),
{:ok, resource} <- Ash.Api.resource(api, resource),
{:ok, filter} <- Ash.Filter.get_filter(resource, id) do
query =
resource
|> Ash.Query.new(api)
|> Ash.Query.set_tenant(opts[:tenant])
|> Ash.Query.filter(^filter)
|> Ash.Query.load(opts[:load] || [])
|> Ash.Query.set_context(opts[:context] || %{})
query =
if Ash.DataLayer.data_layer_can?(query.resource, :limit) do
Ash.Query.limit(query, 2)
else
query
end
query
|> api.read(Keyword.take(opts, Keyword.keys(@read_opts_schema)))
|> case do
{:ok, %{results: [single_result]}} ->
{:ok, single_result}
{:ok, %{results: []}} ->
{:ok, nil}
{:ok, %{results: results}} ->
{:error,
Ash.Error.Invalid.MultipleResults.exception(
count: Enum.count(results),
query: query,
at_least?: true
)}
{:ok, [single_result]} ->
{:ok, single_result}
{:ok, []} ->
{:ok, nil}
{:error, error} ->
{:error, error}
{:ok, results} when is_list(results) ->
{:error,
Ash.Error.Invalid.MultipleResults.exception(count: Enum.count(results), query: query)}
end
else
{:error, error} ->
{:error, error}
end
end
def page!(api, keyset, request) do
{_, opts} = keyset.rerun
api
|> page(keyset, request)
|> unwrap_or_raise!(opts[:stacktraces?])
end
def page(_, %Ash.Page.Keyset{results: []} = page, :next) do
{:ok, page}
end
def page(_, %Ash.Page.Keyset{results: []} = page, :prev) do
{:ok, page}
end
def page(_, %Ash.Page.Keyset{}, n) when is_integer(n) do
{:error, "Cannot seek to a specific page with keyset based pagination"}
end
def page(
api,
%Ash.Page.Keyset{results: results, rerun: {query, opts}},
:next
) do
last_keyset =
results
|> :lists.last()
|> Map.get(:__metadata__)
|> Map.get(:keyset)
new_page_opts =
opts[:page]
|> Keyword.delete(:before)
|> Keyword.put(:after, last_keyset)
read(api, query, Keyword.put(opts, :page, new_page_opts))
end
def page(api, %Ash.Page.Keyset{results: results, rerun: {query, opts}}, :prev) do
first_keyset =
results
|> List.first()
|> Map.get(:__metadata__)
|> Map.get(:keyset)
new_page_opts =
opts[:page]
|> Keyword.put(:before, first_keyset)
|> Keyword.delete(:after)
read(api, query, Keyword.put(opts, :page, new_page_opts))
end
def page(api, %Ash.Page.Keyset{rerun: {query, opts}}, :first) do
page_opts =
if opts[:page][:count] do
[count: true]
else
[]
end
read(api, query, Keyword.put(opts, :page, page_opts))
end
def page(
api,
%Ash.Page.Offset{count: count, limit: limit, offset: offset, rerun: {query, opts}},
request
) do
page_opts =
case request do
:next ->
[offset: offset + limit, limit: limit]
:prev ->
[offset: max(offset - limit, 0), limit: limit]
:first ->
[offset: 0, limit: limit]
:last ->
if count do
[offset: count - limit, limit: limit]
else
[offset: 0, limit: limit]
end
page_num when is_integer(page_num) ->
[offset: (page_num - 1) * limit, limit: limit]
end
page_opts =
if opts[:page][:count] do
Keyword.put(page_opts, :count, true)
else
page_opts
end
if request == :last && !count do
{:error, "Cannot fetch last page without counting"}
else
read(api, query, Keyword.put(opts, :page, page_opts))
end
end
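
The offset arithmetic above can be exercised through any Api module. A hedged sketch, where `MyApp.Api` and `Post` are placeholder names rather than anything defined in this file:

```elixir
# Read the first page with a total count, then navigate relative to it.
{:ok, page} = MyApp.Api.read(Post, page: [limit: 20, count: true])

{:ok, _next} = MyApp.Api.page(page, :next)  # offset 20, limit 20
{:ok, _last} = MyApp.Api.page(page, :last)  # needs count: true, per the check above
{:ok, _p3}   = MyApp.Api.page(page, 3)      # page numbers only work for offset pagination
```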
@doc false
@spec load!(
Ash.Api.t(),
Ash.Resource.record() | list(Ash.Resource.record()),
Ash.Query.t() | list(atom | {atom, list()}),
Keyword.t()
) ::
list(Ash.Resource.record()) | Ash.Resource.record() | no_return
def load!(api, data, query, opts \\ []) do
opts = Ash.OptionsHelpers.validate!(opts, @load_opts_schema)
api
|> load(data, query, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec load(
Ash.Api.t(),
Ash.Resource.record() | list(Ash.Resource.record()),
Ash.Query.t() | list(atom | {atom, list()}),
Keyword.t()
) ::
{:ok, list(Ash.Resource.record()) | Ash.Resource.record()} | {:error, term}
def load(api, data, query, opts \\ [])
def load(_, [], _, _), do: {:ok, []}
def load(_, nil, _, _), do: {:ok, nil}
def load(_, {:error, error}, _, _), do: {:error, error}
def load(api, {:ok, values}, query, opts) do
load(api, values, query, opts)
end
def load(api, data, query, opts) when not is_list(data) do
api
|> load(List.wrap(data), query, opts)
|> case do
{:ok, %{results: [data]}} -> {:ok, data}
{:ok, [data]} -> {:ok, data}
{:error, error} -> {:error, error}
end
end
def load(api, [%resource{} = record | _] = data, query, opts) do
query =
case query do
%Ash.Query{} = query ->
Ash.Query.set_tenant(query, query.tenant || Map.get(record.__metadata__, :tenant))
keyword ->
resource
|> Ash.Query.new(api)
|> Ash.Query.set_tenant(Map.get(record.__metadata__, :tenant))
|> Ash.Query.load(keyword)
end
with %{valid?: true} <- query,
{:ok, action} <- get_action(query.resource, opts, :read, query.action),
{:ok, opts} <- Ash.OptionsHelpers.validate(opts, @load_opts_schema) do
Read.run(query, action, Keyword.put(opts, :initial_data, data))
else
{:error, error} ->
{:error, error}
%{errors: errors} ->
{:error, errors}
end
end
@doc false
@spec read!(Ash.Api.t(), Ash.Query.t() | Ash.Resource.t(), Keyword.t()) ::
list(Ash.Resource.record()) | no_return
def read!(api, query, opts \\ []) do
opts = Ash.OptionsHelpers.validate!(opts, @read_opts_schema)
api
|> read(query, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec read(Ash.Api.t(), Ash.Query.t() | Ash.Resource.t(), Keyword.t()) ::
{:ok, list(Ash.Resource.record()) | Ash.Page.page()} | {:error, term}
def read(api, query, opts \\ [])
def read(api, resource, opts) when is_atom(resource) do
read(api, Ash.Query.new(resource, api), opts)
end
def read(api, query, opts) do
query = Ash.Query.set_api(query, api)
with {:ok, opts} <- Ash.OptionsHelpers.validate(opts, @read_opts_schema),
{:ok, action} <- get_action(query.resource, opts, :read, query.action) do
Read.run(query, action, opts)
else
{:error, error} ->
{:error, error}
end
end
@doc false
def read_one!(api, query, opts) do
api
|> read_one(query, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
def read_one(api, query, opts) do
api
|> read(query, opts)
|> unwrap_one()
end
defp unwrap_one({:error, error}) do
{:error, error}
end
defp unwrap_one({:ok, result, query}) do
case unwrap_one({:ok, result}) do
{:ok, result} ->
{:ok, result, query}
{:error, %Ash.Error.Invalid.MultipleResults{} = error} ->
{:error, %{error | query: query}}
{:error, error} ->
{:error, error}
end
end
defp unwrap_one({:ok, result}) do
case unwrap_one(result) do
{:ok, result} ->
{:ok, result}
{:error, error} ->
{:error, error}
end
end
defp unwrap_one(%{results: results}) do
unwrap_one(results)
end
defp unwrap_one([]), do: {:ok, nil}
defp unwrap_one([result]), do: {:ok, result}
defp unwrap_one([_ | _] = results) do
error =
Ash.Error.Invalid.MultipleResults.exception(
count: Enum.count(results),
at_least?: true
)
{:error, error}
end
@doc false
@spec create!(Ash.Api.t(), Ash.Changeset.t(), Keyword.t()) ::
Ash.Resource.record() | no_return
def create!(api, changeset, opts) do
opts = Ash.OptionsHelpers.validate!(opts, @create_opts_schema)
api
|> create(changeset, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec create(Ash.Api.t(), Ash.Changeset.t(), Keyword.t()) ::
{:ok, Ash.Resource.record(), list(Ash.Notifier.Notification.t())}
| {:ok, Ash.Resource.record()}
| {:error, term}
def create(api, changeset, opts) do
with {:ok, opts} <- Ash.OptionsHelpers.validate(opts, @create_opts_schema),
{:ok, resource} <- Ash.Api.resource(api, changeset.resource),
{:ok, action} <- get_action(resource, opts, :create, changeset.action) do
Create.run(api, changeset, action, opts)
end
end
@doc false
def update!(api, changeset, opts) do
opts = Ash.OptionsHelpers.validate!(opts, @update_opts_schema)
api
|> update(changeset, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec update(Ash.Api.t(), Ash.Resource.record(), Keyword.t()) ::
{:ok, Ash.Resource.record(), list(Ash.Notifier.Notification.t())}
| {:ok, Ash.Resource.record()}
| {:error, term}
def update(api, changeset, opts) do
with {:ok, opts} <- Ash.OptionsHelpers.validate(opts, @update_opts_schema),
{:ok, resource} <- Ash.Api.resource(api, changeset.resource),
{:ok, action} <- get_action(resource, opts, :update, changeset.action) do
Update.run(api, changeset, action, opts)
end
end
@doc false
@spec destroy!(Ash.Api.t(), Ash.Changeset.t() | Ash.Resource.record(), Keyword.t()) ::
:ok | no_return
def destroy!(api, changeset, opts) do
opts = Ash.OptionsHelpers.validate!(opts, @destroy_opts_schema)
api
|> destroy(changeset, opts)
|> unwrap_or_raise!(opts[:stacktraces?])
end
@doc false
@spec destroy(Ash.Api.t(), Ash.Changeset.t() | Ash.Resource.record(), Keyword.t()) ::
{:ok, list(Ash.Notifier.Notification.t())} | :ok | {:error, term}
def destroy(api, %Ash.Changeset{resource: resource} = changeset, opts) do
with {:ok, opts} <- Ash.OptionsHelpers.validate(opts, @destroy_opts_schema),
{:ok, resource} <- Ash.Api.resource(api, resource),
{:ok, action} <- get_action(resource, opts, :destroy, changeset.action) do
Destroy.run(api, changeset, action, opts)
end
end
def destroy(api, record, opts) do
destroy(api, Ash.Changeset.new(record), opts)
end
defp get_action(resource, params, type, preset \\ nil) do
case Keyword.fetch(params, :action) do
{:ok, %_{} = action} ->
{:ok, action}
{:ok, nil} ->
if preset do
get_action(resource, Keyword.put(params, :action, preset), type)
else
get_action(resource, Keyword.delete(params, :action), type)
end
{:ok, action} ->
case Ash.Resource.Info.action(resource, action, type) do
nil ->
{:error, NoSuchAction.exception(resource: resource, action: action, type: type)}
action ->
{:ok, action}
end
:error ->
if preset do
get_action(resource, Keyword.put(params, :action, preset), type)
else
case Ash.Resource.Info.primary_action(resource, type) do
nil ->
if Ash.Resource.Info.resource?(resource) do
{:error, NoSuchResource.exception(resource: resource)}
else
{:error, NoPrimaryAction.exception(resource: resource, type: type)}
end
action ->
{:ok, action}
end
end
end
end
defp unwrap_or_raise!(:ok, _), do: :ok
defp unwrap_or_raise!({:ok, result}, _), do: result
defp unwrap_or_raise!({:ok, result, other}, _), do: {result, other}
defp unwrap_or_raise!({:error, error}, stacktraces?) do
exception = Ash.Error.to_ash_error(error)
exception =
if stacktraces? do
exception
else
Ash.Error.clear_stacktraces(exception)
end
case exception do
%{stacktraces?: _} ->
if stacktraces? do
reraise %{exception | stacktraces?: stacktraces?},
Map.get(exception.stacktrace || %{}, :stacktrace)
else
raise %{exception | stacktraces?: stacktraces?}
end
_ ->
raise exception
end
end
end

# File: apps/idp/priv/repo/migrations/20191027205337_create_states_table.exs
# Repo: lbrty/idp-backend (Apache-2.0)
defmodule Idp.Repo.Migrations.CreateStatesTable do
use Ecto.Migration
def change do
create table(:states, primary_key: false) do
add :id, :uuid, primary_key: true
add :name, :string
add :country_id, references(:countries, on_delete: :delete_all, type: :binary_id)
timestamps()
end
create index(:states, [:name])
create index(:states, [:country_id])
end
end
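
# On Postgres this migration corresponds roughly to the following DDL
# (a sketch; exact column types and timestamp options depend on the adapter):
#
#     CREATE TABLE states (
#       id uuid PRIMARY KEY,
#       name varchar(255),
#       country_id uuid REFERENCES countries(id) ON DELETE CASCADE,
#       inserted_at timestamp NOT NULL,
#       updated_at timestamp NOT NULL
#     );
#     CREATE INDEX states_name_index ON states (name);
#     CREATE INDEX states_country_id_index ON states (country_id);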

# File: lib/rotterdam/node.ex
# Repo: holandes22/rotterdam (MIT)
defmodule Rotterdam.Node do
defstruct id: nil,
created_at: nil,
updated_at: nil,
hostname: nil,
role: nil,
availability: nil,
nano_cpus: 0,
memory_bytes: 0,
state: nil,
leader: false,
reachability: nil,
addr: nil
def normalize(nodes) when is_list(nodes) do
Enum.map(nodes, &normalize(&1))
end
def normalize(%{"ManagerStatus" => manager_status} = node) do
%{
"Leader" => leader,
"Reachability" => reachability,
"Addr" => addr
} = manager_status
get_node(node)
|> Map.put(:leader, leader)
|> Map.put(:reachability, reachability)
|> Map.put(:addr, addr)
end
def normalize(node), do: get_node(node)
defp get_node(node) do
%{
"ID" => id,
"CreatedAt" => created_at,
"UpdatedAt" => updated_at,
"Description" => %{
"Hostname" => hostname,
"Resources" => %{
"NanoCPUs" => nano_cpus,
"MemoryBytes" => memory_bytes
}
},
"Spec" => %{
"Role" => role,
"Availability" => availability
},
"Status" => %{
"State" => state
}
} = node
%__MODULE__{
id: id,
created_at: created_at,
updated_at: updated_at,
hostname: hostname,
role: role,
availability: availability,
nano_cpus: nano_cpus,
memory_bytes: memory_bytes,
state: state
}
end
end
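
# Usage sketch: `normalize/1` flattens the nested Docker node payload into a
# flat struct; manager nodes additionally pick up leader/reachability/addr
# from "ManagerStatus". The payload below is made up, trimmed to the keys
# the function pattern-matches on:
#
#     %{
#       "ID" => "n1",
#       "CreatedAt" => "2017-01-01T00:00:00Z",
#       "UpdatedAt" => "2017-01-02T00:00:00Z",
#       "Description" => %{
#         "Hostname" => "node-1",
#         "Resources" => %{"NanoCPUs" => 4_000_000_000, "MemoryBytes" => 8_000_000_000}
#       },
#       "Spec" => %{"Role" => "worker", "Availability" => "active"},
#       "Status" => %{"State" => "ready"}
#     }
#     |> Rotterdam.Node.normalize()
#     #=> %Rotterdam.Node{id: "n1", hostname: "node-1", role: "worker",
#     #=>   availability: "active", state: "ready", nano_cpus: 4_000_000_000,
#     #=>   memory_bytes: 8_000_000_000, ...}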

# File: lib/supply_chain_web/live/index_live.ex
# Repo: BailianBlockchain/Bailian_Supply_Chain (MIT)
defmodule SupplyChainWeb.IndexLive do
use Phoenix.LiveView
def render(assigns) do
~L"""
<div>
<center>
<h2>验证</h2>
</center>
<label>File</label>
</div>
"""
end
def read_test_file() do
File.read!("credential_exp")
|> Poison.decode!()
|> StructTranslater.to_atom_struct()
end
  def mount(_params, _session, socket) do
if connected?(socket), do: :timer.send_interval(1000, self(), :tick)
{:ok, init(socket)}
end
def init(socket) do
socket
|> assign(:credential, read_test_file())
end
def handle_info(:tick, socket) do
{:noreply, socket}
end
def handle_event("nav", _path, socket) do
{:noreply, socket}
end
end

# File: mix.exs
# Repo: jnylen/auditor (MIT)
defmodule Auditor.MixProject do
use Mix.Project
@name :auditor
@version "0.1.0"
@deps [
{:differ, "~> 0.1.1"}
]
def project do
[
app: @name,
version: @version,
elixir: "~> 1.9",
start_permanent: Mix.env() == :prod,
deps: @deps
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
end

# File: test/hammer_test.exs
# Repo: noma4i/hammer (MIT)
defmodule HammerTest do
use ExUnit.Case, async: true
setup _context do
{:ok, _hammer_ets_pid} = Hammer.Backend.ETS.start_link()
{:ok, []}
end
test "make_rate_checker" do
check = Hammer.make_rate_checker("some-prefix:", 10000, 2)
assert {:allow, 1} = check.("aaa")
assert {:allow, 2} = check.("aaa")
assert {:deny, 2} = check.("aaa")
assert {:deny, 2} = check.("aaa")
assert {:allow, 1} = check.("bbb")
assert {:allow, 2} = check.("bbb")
assert {:deny, 2} = check.("bbb")
assert {:deny, 2} = check.("bbb")
end
  test "returns {:allow, 1} tuple on first access" do
assert {:allow, 1} = Hammer.check_rate("my-bucket", 10_000, 10)
end
  test "returns {:allow, 4} tuple on in-limit checks" do
assert {:allow, 1} = Hammer.check_rate("my-bucket", 10_000, 10)
assert {:allow, 2} = Hammer.check_rate("my-bucket", 10_000, 10)
assert {:allow, 3} = Hammer.check_rate("my-bucket", 10_000, 10)
assert {:allow, 4} = Hammer.check_rate("my-bucket", 10_000, 10)
end
test "returns expected tuples on mix of in-limit and out-of-limit checks" do
assert {:allow, 1} = Hammer.check_rate("my-bucket", 10_000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket", 10_000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket", 10_000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket", 10_000, 2)
end
test "returns expected tuples on 1000ms bucket check with a sleep in the middle" do
assert {:allow, 1} = Hammer.check_rate("my-bucket", 1000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket", 1000, 2)
:timer.sleep(1001)
assert {:allow, 1} = Hammer.check_rate("my-bucket", 1000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket", 1000, 2)
end
test "returns expected tuples on inspect_bucket" do
assert {:ok, {0, 2, _, nil, nil}} = Hammer.inspect_bucket("my-bucket1", 1000, 2)
assert {:allow, 1} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:ok, {1, 1, _, _, _}} = Hammer.inspect_bucket("my-bucket1", 1000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:allow, 1} = Hammer.check_rate("my-bucket2", 1000, 2)
assert {:ok, {2, 0, _, _, _}} = Hammer.inspect_bucket("my-bucket1", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:ok, {3, 0, ms_to_next_bucket, _, _}} = Hammer.inspect_bucket("my-bucket1", 1000, 2)
assert ms_to_next_bucket < 1000
end
test "returns expected tuples on delete_buckets" do
assert {:allow, 1} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:allow, 1} = Hammer.check_rate("my-bucket2", 1000, 2)
assert {:allow, 2} = Hammer.check_rate("my-bucket2", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket2", 1000, 2)
assert {:ok, 1} = Hammer.delete_buckets("my-bucket1")
assert {:allow, 1} = Hammer.check_rate("my-bucket1", 1000, 2)
assert {:deny, 2} = Hammer.check_rate("my-bucket2", 1000, 2)
assert {:ok, 0} = Hammer.delete_buckets("unknown-bucket")
end
end
defmodule ETSTest do
use ExUnit.Case
setup _context do
{:ok, _hammer_ets_pid} = Hammer.Backend.ETS.start_link()
{:ok, []}
end
test "count_hit" do
{stamp, key} = Hammer.Utils.stamp_key("one", 200_000)
assert {:ok, 1} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, 2} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, 3} = Hammer.Backend.ETS.count_hit(key, stamp)
end
test "get_bucket" do
{stamp, key} = Hammer.Utils.stamp_key("two", 200_000)
# With no hits
assert {:ok, nil} = Hammer.Backend.ETS.get_bucket(key)
# With one hit
assert {:ok, 1} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, {{_, "two"}, 1, _, _}} = Hammer.Backend.ETS.get_bucket(key)
# With two hits
assert {:ok, 2} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, {{_, "two"}, 2, _, _}} = Hammer.Backend.ETS.get_bucket(key)
end
test "delete_buckets" do
{stamp, key} = Hammer.Utils.stamp_key("three", 200_000)
# With no hits
assert {:ok, 0} = Hammer.Backend.ETS.delete_buckets("three")
# With three hits in same bucket
assert {:ok, 1} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, 2} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, 3} = Hammer.Backend.ETS.count_hit(key, stamp)
assert {:ok, 1} = Hammer.Backend.ETS.delete_buckets("three")
end
end
defmodule HammerBackendETSSupervisorTest do
use ExUnit.Case
test "supervisor starts correctly" do
assert {:ok, _pid} = Hammer.Backend.ETS.Supervisor.start_link()
end
end
defmodule UtilsTest do
use ExUnit.Case
test "timestamp" do
assert is_integer(Hammer.Utils.timestamp())
end
test "stamp_key" do
id = "test_one_two"
{stamp, key} = Hammer.Utils.stamp_key(id, 60_000)
assert is_integer(stamp)
assert is_tuple(key)
{bucket_number, b_id} = key
assert is_integer(bucket_number)
assert b_id == id
end
test "get_backend_module" do
# With :single and default backend config
assert Hammer.Utils.get_backend_module(:single) == Hammer.Backend.ETS
# With :single and configured backend config
Application.put_env(:hammer, :backend, {Hammer.Backend.SomeBackend, []})
assert Hammer.Utils.get_backend_module(:single) == Hammer.Backend.SomeBackend
# with a specific backend config
Application.put_env(:hammer, :backend, one: {Hammer.Backend.SomeBackend, []})
assert Hammer.Utils.get_backend_module(:one) == Hammer.Backend.SomeBackend
end
end

# File: config/config.exs
# Repo: Homepolish/ulid (MIT)

# This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure for your application as:
#
# config :ulid, key: :value
#
# And access this configuration in your application as:
#
# Application.get_env(:ulid, :key)
#
# Or configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env}.exs"

# File: clients/slides/lib/google_api/slides/v1/model/theme_color_pair.ex
# Repo: pojiro/elixir-google-api (Apache-2.0)

# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Slides.V1.Model.ThemeColorPair do
@moduledoc """
A pair mapping a theme color type to the concrete color it represents.
## Attributes
* `color` (*type:* `GoogleApi.Slides.V1.Model.RgbColor.t`, *default:* `nil`) - The concrete color corresponding to the theme color type above.
* `type` (*type:* `String.t`, *default:* `nil`) - The type of the theme color.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:color => GoogleApi.Slides.V1.Model.RgbColor.t() | nil,
:type => String.t() | nil
}
field(:color, as: GoogleApi.Slides.V1.Model.RgbColor)
field(:type)
end
defimpl Poison.Decoder, for: GoogleApi.Slides.V1.Model.ThemeColorPair do
def decode(value, options) do
GoogleApi.Slides.V1.Model.ThemeColorPair.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Slides.V1.Model.ThemeColorPair do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end

# File: test/functional/render/headings_test.exs
# Repo: RobertDober/earmark (Apache-2.0)
defmodule HeadingsTest do
use ExUnit.Case
alias Earmark.HtmlRenderer, as: Renderer
alias Earmark.Block.Heading
import Support.Helpers
defp render [level: level, content: content] do
Renderer.render([
%Heading{attrs: nil, level: level, content: content}],
updated_context()
)
end
defp updated_context do
Earmark.Context.update_context( context() )
end
def expected text, level: level do
"<h#{level}>#{text}</h#{level}>\n"
end
test "Basic Heading without inline markup" do
result = render( level: 1, content: "Plain Text" )
assert result == expected( "Plain Text", level: 1 )
end
test "Basic Heading without inline markup (level 6)" do
result = render( level: 6, content: "Plain Text" )
assert result == expected( "Plain Text", level: 6 )
end
test "Heading with emphasis" do
result = render( level: 6, content: "some _emphasis_ is a good thing" )
assert result == expected("some <em>emphasis</em> is a good thing", level: 6 )
end
test "Heading with strong" do
result = render( level: 2, content: "Elixir makes a **strong** impression" )
assert result == expected( "Elixir makes a <strong>strong</strong> impression", level: 2 )
end
test "Heading with code" do
result = render( level: 3, content: "Elixir `code` is beautiful" )
assert result == expected( ~s[Elixir <code class="inline">code</code> is beautiful], level: 3 )
end
end

# File: clients/cloud_functions/lib/google_api/cloud_functions/v1/model/expr.ex
# Repo: MasashiYokota/elixir-google-api (Apache-2.0)

# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudFunctions.V1.Model.Expr do
@moduledoc """
Represents a textual expression in the Common Expression Language (CEL) syntax. CEL is a C-like expression language. The syntax and semantics of CEL are documented at https://github.com/google/cel-spec. Example (Comparison): title: "Summary size limit" description: "Determines if a summary is less than 100 chars" expression: "document.summary.size() < 100" Example (Equality): title: "Requestor is owner" description: "Determines if requestor is the document owner" expression: "document.owner == request.auth.claims.email" Example (Logic): title: "Public documents" description: "Determine whether the document should be publicly visible" expression: "document.type != 'private' && document.type != 'internal'" Example (Data Manipulation): title: "Notification string" description: "Create a notification string with a timestamp." expression: "'New message received at ' + string(document.create_time)" The exact variables and functions that may be referenced within an expression are determined by the service that evaluates it. See the service documentation for additional information.
## Attributes
* `description` (*type:* `String.t`, *default:* `nil`) - Optional. Description of the expression. This is a longer text which describes the expression, e.g. when hovered over it in a UI.
* `expression` (*type:* `String.t`, *default:* `nil`) - Textual representation of an expression in Common Expression Language syntax.
* `location` (*type:* `String.t`, *default:* `nil`) - Optional. String indicating the location of the expression for error reporting, e.g. a file name and a position in the file.
* `title` (*type:* `String.t`, *default:* `nil`) - Optional. Title for the expression, i.e. a short string describing its purpose. This can be used e.g. in UIs which allow to enter the expression.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:description => String.t(),
:expression => String.t(),
:location => String.t(),
:title => String.t()
}
field(:description)
field(:expression)
field(:location)
field(:title)
end
defimpl Poison.Decoder, for: GoogleApi.CloudFunctions.V1.Model.Expr do
def decode(value, options) do
GoogleApi.CloudFunctions.V1.Model.Expr.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudFunctions.V1.Model.Expr do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
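
# Usage sketch (field values are illustrative, taken from the CEL examples in
# the moduledoc above):
#
#     %GoogleApi.CloudFunctions.V1.Model.Expr{
#       title: "Requestor is owner",
#       description: "Determines if requestor is the document owner",
#       expression: "document.owner == request.auth.claims.email"
#     }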

# File: elixir/beer-song-alternatives/beer_song_onefunc.ex
# Repo: nickbosch/exercism (MIT)
defmodule BeerSong do
@doc """
Get a single verse of the beer song
"""
@spec verse(integer) :: String.t()
def verse(n) do
    {how_many_bottles, action, remaining_bottles} =
      case n do
        2 -> {"2 bottles", "Take one down and pass it around", "1 bottle"}
        1 -> {"1 bottle", "Take it down and pass it around", "no more bottles"}
        0 -> {"no more bottles", "Go to the store and buy some more", "99 bottles"}
        _ -> {"#{n} bottles", "Take one down and pass it around", "#{n - 1} bottles"}
      end

    """
    #{String.capitalize(how_many_bottles)} of beer on the wall, #{how_many_bottles} of beer.
    #{action}, #{remaining_bottles} of beer on the wall.
"""
end
@doc """
Get the entire beer song for a given range of numbers of bottles.
"""
@spec lyrics(Range.t()) :: String.t()
def lyrics(range \\ 99..0) do
Enum.map_join(range, "\n", &verse/1)
end
end
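
# Usage sketch:
#
#     BeerSong.verse(0)
#     #=> "No more bottles of beer on the wall, no more bottles of beer.\n" <>
#     #=> "Go to the store and buy some more, 99 bottles of beer on the wall.\n"
#
#     BeerSong.lyrics(2..0)  # three verses; map_join/3 leaves a blank line between them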

# File: lib/elixir/test/elixir/dynamic_supervisor_test.exs
# Repo: spencerdcarlson/elixir (Apache-2.0)
Code.require_file("test_helper.exs", __DIR__)
defmodule DynamicSupervisorTest do
use ExUnit.Case, async: true
defmodule Simple do
use DynamicSupervisor
def init(args), do: args
end
test "can be supervised directly" do
children = [{DynamicSupervisor, strategy: :one_for_one, name: :dyn_sup_spec_test}]
assert {:ok, _} = Supervisor.start_link(children, strategy: :one_for_one)
assert DynamicSupervisor.which_children(:dyn_sup_spec_test) == []
end
test "multiple supervisors can be supervised and identified with simple child spec" do
{:ok, _} = Registry.start_link(keys: :unique, name: DynSup.Registry)
children = [
{DynamicSupervisor, strategy: :one_for_one, name: :simple_name},
{DynamicSupervisor, strategy: :one_for_one, name: {:global, :global_name}},
{DynamicSupervisor,
strategy: :one_for_one, name: {:via, Registry, {DynSup.Registry, "via_name"}}}
]
assert {:ok, supsup} = Supervisor.start_link(children, strategy: :one_for_one)
assert {:ok, no_name_dynsup} =
Supervisor.start_child(supsup, {DynamicSupervisor, strategy: :one_for_one})
assert DynamicSupervisor.which_children(:simple_name) == []
assert DynamicSupervisor.which_children({:global, :global_name}) == []
assert DynamicSupervisor.which_children({:via, Registry, {DynSup.Registry, "via_name"}}) == []
assert DynamicSupervisor.which_children(no_name_dynsup) == []
assert Supervisor.start_child(supsup, {DynamicSupervisor, strategy: :one_for_one}) ==
{:error, {:already_started, no_name_dynsup}}
end
describe "use/2" do
test "generates child_spec/1" do
assert Simple.child_spec([:hello]) == %{
id: Simple,
start: {Simple, :start_link, [[:hello]]},
type: :supervisor
}
defmodule Custom do
use DynamicSupervisor,
id: :id,
restart: :temporary,
shutdown: :infinity,
start: {:foo, :bar, []}
def init(arg), do: {:producer, arg}
end
assert Custom.child_spec([:hello]) == %{
id: :id,
restart: :temporary,
shutdown: :infinity,
start: {:foo, :bar, []},
type: :supervisor
}
end
end
describe "init/1" do
test "set default options" do
assert DynamicSupervisor.init(strategy: :one_for_one) ==
{:ok,
%{
strategy: :one_for_one,
intensity: 3,
period: 5,
max_children: :infinity,
extra_arguments: []
}}
end
end
describe "start_link/3" do
test "with non-ok init" do
Process.flag(:trap_exit, true)
assert DynamicSupervisor.start_link(Simple, {:ok, %{strategy: :unknown}}) ==
{:error, {:supervisor_data, {:invalid_strategy, :unknown}}}
assert DynamicSupervisor.start_link(Simple, {:ok, %{intensity: -1}}) ==
{:error, {:supervisor_data, {:invalid_intensity, -1}}}
assert DynamicSupervisor.start_link(Simple, {:ok, %{period: 0}}) ==
{:error, {:supervisor_data, {:invalid_period, 0}}}
assert DynamicSupervisor.start_link(Simple, {:ok, %{max_children: -1}}) ==
{:error, {:supervisor_data, {:invalid_max_children, -1}}}
assert DynamicSupervisor.start_link(Simple, {:ok, %{extra_arguments: -1}}) ==
{:error, {:supervisor_data, {:invalid_extra_arguments, -1}}}
assert DynamicSupervisor.start_link(Simple, :unknown) ==
{:error, {:bad_return, {Simple, :init, :unknown}}}
assert DynamicSupervisor.start_link(Simple, :ignore) == :ignore
end
test "with registered process" do
{:ok, pid} = DynamicSupervisor.start_link(Simple, {:ok, %{}}, name: __MODULE__)
# Sets up a link
{:links, links} = Process.info(self(), :links)
assert pid in links
# A name
assert Process.whereis(__MODULE__) == pid
# And the initial call
assert {:supervisor, DynamicSupervisorTest.Simple, 1} =
:proc_lib.translate_initial_call(pid)
# And shuts down
assert DynamicSupervisor.stop(__MODULE__) == :ok
end
test "with spawn_opt" do
{:ok, pid} =
DynamicSupervisor.start_link(strategy: :one_for_one, spawn_opt: [priority: :high])
assert Process.info(pid, :priority) == {:priority, :high}
end
test "sets initial call to the same as a regular supervisor" do
{:ok, pid} = Supervisor.start_link([], strategy: :one_for_one)
assert :proc_lib.initial_call(pid) == {:supervisor, Supervisor.Default, [:Argument__1]}
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert :proc_lib.initial_call(pid) == {:supervisor, Supervisor.Default, [:Argument__1]}
end
test "returns the callback module" do
{:ok, pid} = Supervisor.start_link([], strategy: :one_for_one)
assert :supervisor.get_callback_module(pid) == Supervisor.Default
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert :supervisor.get_callback_module(pid) == Supervisor.Default
end
end
## Code change
describe "code_change/3" do
test "with non-ok init" do
{:ok, pid} = DynamicSupervisor.start_link(Simple, {:ok, %{}})
assert fake_upgrade(pid, {:ok, %{strategy: :unknown}}) ==
{:error, {:error, {:supervisor_data, {:invalid_strategy, :unknown}}}}
assert fake_upgrade(pid, {:ok, %{intensity: -1}}) ==
{:error, {:error, {:supervisor_data, {:invalid_intensity, -1}}}}
assert fake_upgrade(pid, {:ok, %{period: 0}}) ==
{:error, {:error, {:supervisor_data, {:invalid_period, 0}}}}
assert fake_upgrade(pid, {:ok, %{max_children: -1}}) ==
{:error, {:error, {:supervisor_data, {:invalid_max_children, -1}}}}
assert fake_upgrade(pid, :unknown) == {:error, :unknown}
assert fake_upgrade(pid, :ignore) == :ok
end
test "with ok init" do
{:ok, pid} = DynamicSupervisor.start_link(Simple, {:ok, %{}})
{:ok, _} = DynamicSupervisor.start_child(pid, sleepy_worker())
assert %{active: 1} = DynamicSupervisor.count_children(pid)
assert fake_upgrade(pid, {:ok, %{max_children: 1}}) == :ok
assert %{active: 1} = DynamicSupervisor.count_children(pid)
assert DynamicSupervisor.start_child(pid, {Task, fn -> :ok end}) == {:error, :max_children}
end
defp fake_upgrade(pid, init_arg) do
:ok = :sys.suspend(pid)
:sys.replace_state(pid, fn state -> %{state | args: init_arg} end)
res = :sys.change_code(pid, :gen_server, 123, :extra)
:ok = :sys.resume(pid)
res
end
end
describe "start_child/2" do
test "supports old child spec" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = {Task, {Task, :start_link, [fn -> :ok end]}, :temporary, 5000, :worker, [Task]}
assert {:ok, pid} = DynamicSupervisor.start_child(pid, child)
assert is_pid(pid)
end
test "supports new child spec as tuple" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = %{id: Task, restart: :temporary, start: {Task, :start_link, [fn -> :ok end]}}
assert {:ok, pid} = DynamicSupervisor.start_child(pid, child)
assert is_pid(pid)
end
test "supports new child spec" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = {Task, fn -> Process.sleep(:infinity) end}
assert {:ok, pid} = DynamicSupervisor.start_child(pid, child)
assert is_pid(pid)
end
test "supports extra arguments" do
parent = self()
fun = fn -> send(parent, :from_child) end
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, extra_arguments: [fun])
child = %{id: Task, restart: :temporary, start: {Task, :start_link, []}}
assert {:ok, pid} = DynamicSupervisor.start_child(pid, child)
assert is_pid(pid)
assert_receive :from_child
end
test "with invalid child spec" do
assert DynamicSupervisor.start_child(:not_used, %{}) == {:error, {:invalid_child_spec, %{}}}
assert DynamicSupervisor.start_child(:not_used, {1, 2, 3, 4, 5, 6}) ==
{:error, {:invalid_mfa, 2}}
assert DynamicSupervisor.start_child(:not_used, %{id: 1, start: {Task, :foo, :bar}}) ==
{:error, {:invalid_mfa, {Task, :foo, :bar}}}
end
test "with different returns" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert {:ok, _, :extra} = DynamicSupervisor.start_child(pid, current_module_worker([:ok3]))
assert {:ok, _} = DynamicSupervisor.start_child(pid, current_module_worker([:ok2]))
assert :ignore = DynamicSupervisor.start_child(pid, current_module_worker([:ignore]))
assert {:error, :found} =
DynamicSupervisor.start_child(pid, current_module_worker([:error]))
assert {:error, :unknown} =
DynamicSupervisor.start_child(pid, current_module_worker([:unknown]))
end
test "with throw/error/exit" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert {:error, {{:nocatch, :oops}, [_ | _]}} =
DynamicSupervisor.start_child(pid, current_module_worker([:non_local, :throw]))
assert {:error, {%RuntimeError{}, [_ | _]}} =
DynamicSupervisor.start_child(pid, current_module_worker([:non_local, :error]))
assert {:error, :oops} =
DynamicSupervisor.start_child(pid, current_module_worker([:non_local, :exit]))
end
test "with max_children" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_children: 0)
assert {:error, :max_children} =
DynamicSupervisor.start_child(pid, current_module_worker([:ok2]))
end
test "temporary child is not restarted regardless of reason" do
child = current_module_worker([:ok2], restart: :temporary)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(pid)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :whatever)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(pid)
end
test "transient child is restarted unless normal/shutdown/{shutdown, _}" do
child = current_module_worker([:ok2], restart: :transient)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(pid)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, {:shutdown, :signal})
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(pid)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :whatever)
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(pid)
end
test "permanent child is restarted regardless of reason" do
child = current_module_worker([:ok2], restart: :permanent)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 100_000)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(pid)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, {:shutdown, :signal})
assert %{workers: 2, active: 2} = DynamicSupervisor.count_children(pid)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :whatever)
assert %{workers: 3, active: 3} = DynamicSupervisor.count_children(pid)
end
test "child is restarted with different values" do
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 100_000)
assert {:ok, child1} =
DynamicSupervisor.start_child(pid, current_module_worker([:restart, :ok2]))
assert [{:undefined, ^child1, :worker, [DynamicSupervisorTest]}] =
DynamicSupervisor.which_children(pid)
assert_kill(child1, :shutdown)
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(pid)
assert {:ok, child2} =
DynamicSupervisor.start_child(pid, current_module_worker([:restart, :ok3]))
assert [
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, ^child2, :worker, [DynamicSupervisorTest]}
] = DynamicSupervisor.which_children(pid)
assert_kill(child2, :shutdown)
assert %{workers: 2, active: 2} = DynamicSupervisor.count_children(pid)
assert {:ok, child3} =
DynamicSupervisor.start_child(pid, current_module_worker([:restart, :ignore]))
assert [
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]}
] = DynamicSupervisor.which_children(pid)
assert_kill(child3, :shutdown)
assert %{workers: 2, active: 2} = DynamicSupervisor.count_children(pid)
assert {:ok, child4} =
DynamicSupervisor.start_child(pid, current_module_worker([:restart, :error]))
assert [
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]}
] = DynamicSupervisor.which_children(pid)
assert_kill(child4, :shutdown)
assert %{workers: 3, active: 2} = DynamicSupervisor.count_children(pid)
assert {:ok, child5} =
DynamicSupervisor.start_child(pid, current_module_worker([:restart, :unknown]))
assert [
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]},
{:undefined, :restarting, :worker, [DynamicSupervisorTest]},
{:undefined, _, :worker, [DynamicSupervisorTest]}
] = DynamicSupervisor.which_children(pid)
assert_kill(child5, :shutdown)
assert %{workers: 4, active: 2} = DynamicSupervisor.count_children(pid)
end
test "restarting on init children counted in max_children" do
child = current_module_worker([:restart, :error], restart: :permanent)
opts = [strategy: :one_for_one, max_children: 1, max_restarts: 100_000]
{:ok, pid} = DynamicSupervisor.start_link(opts)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert %{workers: 1, active: 0} = DynamicSupervisor.count_children(pid)
child = current_module_worker([:restart, :ok2], restart: :permanent)
assert {:error, :max_children} = DynamicSupervisor.start_child(pid, child)
end
test "restarting on exit children counted in max_children" do
child = current_module_worker([:ok2], restart: :permanent)
opts = [strategy: :one_for_one, max_children: 1, max_restarts: 100_000]
{:ok, pid} = DynamicSupervisor.start_link(opts)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(pid)
child = current_module_worker([:ok2], restart: :permanent)
assert {:error, :max_children} = DynamicSupervisor.start_child(pid, child)
end
test "restarting a child with extra_arguments successfully restarts child" do
parent = self()
fun = fn ->
send(parent, :from_child)
Process.sleep(:infinity)
end
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one, extra_arguments: [fun])
child = %{id: Task, restart: :transient, start: {Task, :start_link, []}}
assert {:ok, child} = DynamicSupervisor.start_child(sup, child)
assert is_pid(child)
assert_receive :from_child
assert %{active: 1, workers: 1} = DynamicSupervisor.count_children(sup)
assert_kill(child, :oops)
assert_receive :from_child
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(sup)
end
test "child is restarted when trying again" do
child = current_module_worker([:try_again, self()], restart: :permanent)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 2)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_received {:try_again, true}
assert_kill(child_pid, :shutdown)
assert_receive {:try_again, false}
assert_receive {:try_again, true}
assert %{workers: 1, active: 1} = DynamicSupervisor.count_children(pid)
end
test "child triggers maximum restarts" do
Process.flag(:trap_exit, true)
child = current_module_worker([:restart, :error], restart: :permanent)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 1)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert_receive {:EXIT, ^pid, :shutdown}
end
test "child triggers maximum intensity when trying again" do
Process.flag(:trap_exit, true)
child = current_module_worker([:restart, :error], restart: :permanent)
{:ok, pid} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 10)
assert {:ok, child_pid} = DynamicSupervisor.start_child(pid, child)
assert_kill(child_pid, :shutdown)
assert_receive {:EXIT, ^pid, :shutdown}
end
def start_link(:ok3), do: {:ok, spawn_link(fn -> Process.sleep(:infinity) end), :extra}
def start_link(:ok2), do: {:ok, spawn_link(fn -> Process.sleep(:infinity) end)}
def start_link(:error), do: {:error, :found}
def start_link(:ignore), do: :ignore
def start_link(:unknown), do: :unknown
def start_link(:non_local, :throw), do: throw(:oops)
def start_link(:non_local, :error), do: raise("oops")
def start_link(:non_local, :exit), do: exit(:oops)
def start_link(:try_again, notify) do
if Process.get(:try_again) do
Process.put(:try_again, false)
send(notify, {:try_again, false})
{:error, :try_again}
else
Process.put(:try_again, true)
send(notify, {:try_again, true})
start_link(:ok2)
end
end
def start_link(:restart, value) do
if Process.get({:restart, value}) do
start_link(value)
else
Process.put({:restart, value}, true)
start_link(:ok2)
end
end
end
describe "terminate/2" do
test "terminates children with brutal kill" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = sleepy_worker(shutdown: :brutal_kill)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :killed}
assert_receive {:DOWN, _, :process, ^child2, :killed}
assert_receive {:DOWN, _, :process, ^child3, :killed}
end
test "terminates children with infinity shutdown" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = sleepy_worker(shutdown: :infinity)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :shutdown}
assert_receive {:DOWN, _, :process, ^child2, :shutdown}
assert_receive {:DOWN, _, :process, ^child3, :shutdown}
end
test "terminates children with infinity shutdown and abnormal reason" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
parent = self()
fun = fn ->
Process.flag(:trap_exit, true)
send(parent, :ready)
receive(do: (_ -> exit({:shutdown, :oops})))
end
child = Supervisor.child_spec({Task, fun}, shutdown: :infinity)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
assert_receive :ready
assert_receive :ready
assert_receive :ready
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, {:shutdown, :oops}}
assert_receive {:DOWN, _, :process, ^child2, {:shutdown, :oops}}
assert_receive {:DOWN, _, :process, ^child3, {:shutdown, :oops}}
end
test "terminates children with integer shutdown" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = sleepy_worker(shutdown: 1000)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :shutdown}
assert_receive {:DOWN, _, :process, ^child2, :shutdown}
assert_receive {:DOWN, _, :process, ^child3, :shutdown}
end
test "terminates children with integer shutdown and abnormal reason" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
parent = self()
fun = fn ->
Process.flag(:trap_exit, true)
send(parent, :ready)
receive(do: (_ -> exit({:shutdown, :oops})))
end
child = Supervisor.child_spec({Task, fun}, shutdown: 1000)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
assert_receive :ready
assert_receive :ready
assert_receive :ready
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, {:shutdown, :oops}}
assert_receive {:DOWN, _, :process, ^child2, {:shutdown, :oops}}
assert_receive {:DOWN, _, :process, ^child3, {:shutdown, :oops}}
end
test "terminates children with expired integer shutdown" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
parent = self()
fun = fn ->
Process.sleep(:infinity)
end
tmt = fn ->
Process.flag(:trap_exit, true)
send(parent, :ready)
Process.sleep(:infinity)
end
child_fun = Supervisor.child_spec({Task, fun}, shutdown: 1)
child_tmt = Supervisor.child_spec({Task, tmt}, shutdown: 1)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child_fun)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child_tmt)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child_fun)
assert_receive :ready
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :shutdown}
assert_receive {:DOWN, _, :process, ^child2, :killed}
assert_receive {:DOWN, _, :process, ^child3, :shutdown}
end
test "terminates children with permanent restart and normal reason" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
parent = self()
fun = fn ->
Process.flag(:trap_exit, true)
send(parent, :ready)
receive(do: (_ -> exit(:normal)))
end
child = Supervisor.child_spec({Task, fun}, shutdown: :infinity, restart: :permanent)
assert {:ok, child1} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child2} = DynamicSupervisor.start_child(sup, child)
assert {:ok, child3} = DynamicSupervisor.start_child(sup, child)
assert_receive :ready
assert_receive :ready
assert_receive :ready
Process.monitor(child1)
Process.monitor(child2)
Process.monitor(child3)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :normal}
assert_receive {:DOWN, _, :process, ^child2, :normal}
assert_receive {:DOWN, _, :process, ^child3, :normal}
end
test "terminates with mixed children" do
Process.flag(:trap_exit, true)
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
assert {:ok, child1} =
DynamicSupervisor.start_child(sup, sleepy_worker(shutdown: :infinity))
assert {:ok, child2} =
DynamicSupervisor.start_child(sup, sleepy_worker(shutdown: :brutal_kill))
Process.monitor(child1)
Process.monitor(child2)
assert_kill(sup, :shutdown)
assert_receive {:DOWN, _, :process, ^child1, :shutdown}
assert_receive {:DOWN, _, :process, ^child2, :killed}
end
end
describe "terminate_child/2" do
test "terminates child with brutal kill" do
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = sleepy_worker(shutdown: :brutal_kill)
assert {:ok, child_pid} = DynamicSupervisor.start_child(sup, child)
Process.monitor(child_pid)
assert :ok = DynamicSupervisor.terminate_child(sup, child_pid)
assert_receive {:DOWN, _, :process, ^child_pid, :killed}
assert {:error, :not_found} = DynamicSupervisor.terminate_child(sup, child_pid)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(sup)
end
test "terminates child with integer shutdown" do
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)
child = sleepy_worker(shutdown: 1000)
assert {:ok, child_pid} = DynamicSupervisor.start_child(sup, child)
Process.monitor(child_pid)
assert :ok = DynamicSupervisor.terminate_child(sup, child_pid)
assert_receive {:DOWN, _, :process, ^child_pid, :shutdown}
assert {:error, :not_found} = DynamicSupervisor.terminate_child(sup, child_pid)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(sup)
end
test "terminates restarting child" do
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one, max_restarts: 100_000)
child = current_module_worker([:restart, :error], restart: :permanent)
assert {:ok, child_pid} = DynamicSupervisor.start_child(sup, child)
assert_kill(child_pid, :shutdown)
assert :ok = DynamicSupervisor.terminate_child(sup, child_pid)
assert {:error, :not_found} = DynamicSupervisor.terminate_child(sup, child_pid)
assert %{workers: 0, active: 0} = DynamicSupervisor.count_children(sup)
end
end
defp sleepy_worker(opts \\ []) do
mfa = {Task, :start_link, [Process, :sleep, [:infinity]]}
Supervisor.child_spec(%{id: Task, start: mfa}, opts)
end
defp current_module_worker(args, opts \\ []) do
Supervisor.child_spec(%{id: __MODULE__, start: {__MODULE__, :start_link, args}}, opts)
end
defp assert_kill(pid, reason) do
ref = Process.monitor(pid)
Process.exit(pid, reason)
assert_receive {:DOWN, ^ref, _, _, _}
end
end
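The DynamicSupervisor behavior these tests pin down can be summarized in a minimal standalone round trip. This is an illustrative sketch, not part of the test suite; it uses only the public `DynamicSupervisor` and `Task` APIs exercised above.

```elixir
# Start a supervisor, attach a temporary Task child, inspect counts, then
# terminate the child by pid (a second terminate returns {:error, :not_found}).
{:ok, sup} = DynamicSupervisor.start_link(strategy: :one_for_one)

spec = %{
  id: Task,
  restart: :temporary,
  start: {Task, :start_link, [fn -> Process.sleep(:infinity) end]}
}

{:ok, child} = DynamicSupervisor.start_child(sup, spec)
%{active: 1, workers: 1} = DynamicSupervisor.count_children(sup)

:ok = DynamicSupervisor.terminate_child(sup, child)
{:error, :not_found} = DynamicSupervisor.terminate_child(sup, child)
```

Note that `count_children/1` returns a map with `:specs`, `:active`, `:supervisors`, and `:workers` keys, so the partial map match above only pins the two keys of interest.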
# File: test/facade/system/spacing_test.ex (repo: devilcoders/facade, license: MIT)
defmodule Facade.System.SpacingTest do
use Facade.ConnCase, async: true
alias Facade.System.Spacing
test "generates proper spacing classes list" do
assert Spacing.to_classes(p: 10, m: 10, px: [10, 5, 4]) == [
"p-10",
"m-10",
"px-10",
"sm:px-5",
"md:px-4"
]
end
end
# File: lib/sobelow/vuln/coherence.ex (repo: juancgalvis/sobelow, license: Apache-2.0)
defmodule Sobelow.Vuln.Coherence do
alias Sobelow.Config
alias Sobelow.Vuln
@uid 22
@finding_type "Vuln.Coherence: Known Vulnerable Dependency - Update Coherence"
use Sobelow.Finding
@vuln_vsn ["<=0.5.1"]
def run(root) do
plug_conf = root <> "/deps/coherence/mix.exs"
if File.exists?(plug_conf) do
vsn = Config.get_version(plug_conf)
case Version.parse(vsn) do
{:ok, vsn} ->
if Enum.any?(@vuln_vsn, fn v -> Version.match?(vsn, v) end) do
Vuln.print_finding(
plug_conf,
vsn,
"Coherence",
"Permissive parameters and privilege escalation",
"TBA - https://github.com/smpallen99/coherence/issues/270",
"Coherence"
)
end
_ ->
nil
end
end
end
def details() do
Sobelow.Vuln.details()
end
end
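The check in `run/1` above hinges on `Version.parse/1` and `Version.match?/2` from the Elixir standard library. A standalone sketch of that comparison (the version strings below are illustrative, chosen around the `@vuln_vsn` boundary):

```elixir
# A dependency version "matches" the vulnerable range when any requirement
# in the list accepts it -- the same Enum.any?/2 shape used in run/1.
vuln_vsn = ["<=0.5.1"]

{:ok, affected} = Version.parse("0.5.1")
true = Enum.any?(vuln_vsn, fn req -> Version.match?(affected, req) end)

{:ok, patched} = Version.parse("0.5.2")
false = Enum.any?(vuln_vsn, fn req -> Version.match?(patched, req) end)
```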
# File: apps/platform_runner/config/integration.exs (repo: kennyatpillar/hindsight, license: Apache-2.0)
import Config
config :platform_runner,
divo: "docker-compose.yml",
divo_wait: [dwell: 1_000, max_tries: 120]
config :ex_aws,
debug_requests: true,
access_key_id: "testing_access_key",
secret_access_key: "testing_secret_key",
region: "local"
config :ex_aws, :s3,
scheme: "http://",
region: "local",
host: %{
"local" => "localhost"
},
port: 9000
# File: test/base_test.exs (repo: joevandyk/result, license: MIT)
defmodule BaseTest do
@moduledoc false
use ExUnit.Case
import Brex.Result.Base
doctest Brex.Result.Base
def sgn(x) do
if x > 0 do
{:ok, "pos"}
else
{:error, "neg"}
end
end
def two_args(x, y), do: {:ok, x - y}
def inside_bind(x, f), do: x ~> f.()
test "bind" do
assert_raise FunctionClauseError, fn ->
bind(:ok, &sgn/1)
end
assert {:ok, "pos"} = bind({:ok, 1}, &sgn/1)
assert {:error, "neg"} = bind({:ok, -4}, &sgn/1)
assert {:error, -4} = bind({:error, -4}, &sgn/1)
assert {:ok, :bar} = bind({:ok, :foo}, fn _ -> {:ok, :bar} end)
assert {:error, :foo} = bind({:error, :foo}, fn _ -> {:ok, :bar} end)
assert_raise CaseClauseError, fn ->
bind({:ok, 2}, fn _ -> :ok end)
end
assert_raise CaseClauseError, fn ->
bind({:ok, 2}, & &1)
end
end
test "~>" do
assert_raise FunctionClauseError, fn ->
:ok ~> sgn
end
assert {:ok, "pos"} = {:ok, 1} ~> sgn
assert {:error, "neg"} = {:ok, -3} ~> sgn
assert {:error, 2} = {:error, 2} ~> sgn
assert {:ok, 1} = {:ok, 3} ~> two_args(2)
assert {:ok, 1} = {:ok, 3} ~> two_args(2)
assert {:ok, 1} = {:ok, 3} ~> two_args(2)
assert {:error, 1} = {:ok, 1} |> inside_bind(fn x -> {:error, x} end)
assert {:ok, 2} = {:ok, 1} |> inside_bind(fn x -> {:ok, x + 1} end)
assert {:error, 1} = {:error, 1} |> inside_bind(&{:ok, &1})
assert {:ok, [3, 4, 5]} =
{:ok, [1, 2, 3]} ~> Brex.Result.Mappers.map_while_success(&{:ok, &1 + 2})
assert_raise ArgumentError, fn ->
{:ok, 1} ~> (fn _ -> raise ArgumentError end).()
end
assert {:ok, 2} =
{:ok, 1} ~> (fn x -> if x > 1, do: raise(ArgumentError), else: {:ok, 2} end).()
end
test "fmap" do
assert_raise FunctionClauseError, fn ->
:ok
|> fmap(fn x -> x + 2 end)
end
assert {:ok, 8} =
{:ok, 6}
|> fmap(fn x -> x + 2 end)
assert {:error, 6} =
{:error, 6}
|> fmap(fn x -> x + 2 end)
end
test "ignore" do
assert :ok = ignore({:ok, 2})
assert :ok = ignore(:ok)
assert {:error, :not_found} = ignore({:error, :not_found})
end
test "extract!" do
assert 1 = extract!({:ok, 1})
assert_raise ArgumentError, fn -> extract!({:error, 1}) end
assert_raise FunctionClauseError, fn -> extract!(:ok) end
end
end
# File: test/support/mocks.ex (repo: daskycodes/trifolium, license: MIT)
Mox.defmock(Trifolium.HTTPClientMock, for: HTTPoison.Base)
# File: aoc21/lib/graph.ex (repo: sastels/advent-of-code-2021, license: MIT)
defmodule Graph do
@type graph_t :: map
@type vertex_t :: String.t()
@type edge_t :: {vertex_t(), vertex_t(), number()}
@spec new() :: graph_t()
def new(), do: %{}
@spec new(String.t()) :: graph_t()
def new(edge_data) do
edge_data
|> String.split(~r/\s+/, trim: true)
|> Enum.reduce(new(), &add_edge(&2, &1))
end
@spec vertices(graph_t()) :: list(vertex_t())
def vertices(graph), do: Map.keys(graph)
@spec add_vertex(graph_t(), vertex_t()) :: graph_t()
def add_vertex(graph, v) do
if Enum.member?(Graph.vertices(graph), v) do
graph
else
Map.put(graph, v, %{})
end
end
@spec delete_vertex(graph_t(), vertex_t()) :: graph_t()
def delete_vertex(graph, v) do
if Enum.member?(Graph.vertices(graph), v) do
graph
|> Map.keys()
|> Enum.reduce(graph, fn key, graph ->
Map.update!(graph, key, fn edges -> Map.delete(edges, v) end)
end)
|> Map.delete(v)
else
graph
end
end
# directed with weight
@spec add_directed_edge(graph_t(), edge_t()) :: graph_t()
def add_directed_edge(graph, {x, y, weight}) do
graph = graph |> add_vertex(x) |> add_vertex(y)
x_edges = graph |> Map.fetch!(x) |> Map.put(y, weight)
Map.put(graph, x, x_edges)
end
# undirected weight 1
@spec add_edge(graph_t(), String.t()) :: graph_t()
def add_edge(graph, line) do
[x, y] = String.split(line, "-")
graph |> add_directed_edge({x, y, 1}) |> add_directed_edge({y, x, 1})
end
@spec successor_vertices(graph_t(), vertex_t()) :: list(vertex_t())
def successor_vertices(graph, v) do
graph |> Map.fetch!(v) |> Map.keys()
end
end
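A quick usage sketch of the module above (not part of the original file; it assumes the `Graph` module is compiled as written, and the edge-string format follows `new/1`):

```elixir
# "start-a" and "a-end" become undirected, weight-1 edges.
graph = Graph.new("start-a\na-end")

["a", "end", "start"] = graph |> Graph.vertices() |> Enum.sort()
["end", "start"] = graph |> Graph.successor_vertices("a") |> Enum.sort()

# delete_vertex/2 drops the vertex and every edge that points at it.
pruned = Graph.delete_vertex(graph, "a")
["end", "start"] = pruned |> Graph.vertices() |> Enum.sort()
[] = Graph.successor_vertices(pruned, "start")
```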
# File: priv/templates/html/controller_test.exs (repo: parkerl/phoenix, license: MIT)
defmodule <%= module %>ControllerTest do
use <%= base %>.ConnCase
alias <%= module %>
@valid_params <%= singular %>: <%= inspect params %>
test "GET /<%= plural %>" do
conn = get conn(), <%= singular %>_path(conn, :index)
assert conn.resp_body =~ "Listing <%= plural %>"
end
test "GET /<%= plural %>/new" do
conn = get conn(), <%= singular %>_path(conn, :new)
assert conn.resp_body =~ "New <%= singular %>"
end
test "POST /<%= plural %>" do
conn = post conn(), <%= singular %>_path(conn, :create), @valid_params
assert redirected_to(conn) == <%= singular %>_path(conn, :index)
end
test "GET /<%= plural %>/:id" do
<%= singular %> = Repo.insert %<%= alias %>{}
conn = get conn(), <%= singular %>_path(conn, :show, <%= singular %>.id)
assert conn.resp_body =~ "Show <%= singular %>"
end
test "PUT /<%= plural %>/:id" do
<%= singular %> = Repo.insert %<%= alias %>{}
conn = put conn(), <%= singular %>_path(conn, :update, <%= singular %>.id), @valid_params
assert redirected_to(conn) == <%= singular %>_path(conn, :index)
end
test "DELETE /<%= plural %>/:id" do
<%= singular %> = Repo.insert %<%= alias %>{}
conn = delete conn(), <%= singular %>_path(conn, :delete, <%= singular %>.id)
assert redirected_to(conn) == <%= singular %>_path(conn, :index)
end
end
# File: test/test_helper.exs (repo: kianmeng/credo_naming, license: BSD-3-Clause)
Code.require_file("support/test_application.exs", __DIR__)
Credo.Test.Application.start([], [])
ExUnit.start()
check_version =
~w(1.6.5 1.7.0)
|> Enum.reduce([], fn version, acc ->
# allow -dev versions so we can test before the Elixir release.
if System.version() |> Version.match?("< #{version}-dev") do
acc ++ [needs_elixir: version]
else
acc
end
end)
exclude = Keyword.merge([to_be_implemented: true], check_version)
ExUnit.configure(exclude: exclude)
# credo:disable-for-next-line CredoNaming.Check.Warning.AvoidSpecificTermsInModuleNames
defmodule Credo.TestHelper do
defmacro __using__(_) do
quote do
use ExUnit.Case
import CredoSourceFileCase
import CredoCheckCase
end
end
end
defmodule CredoSourceFileCase do
alias Credo.Test.FilenameGenerator
def to_source_file(source) do
to_source_file(source, FilenameGenerator.next())
end
def to_source_file(source, filename) do
case Credo.SourceFile.parse(source, filename) do
%{status: :valid} = source_file ->
source_file
_ ->
raise "Source could not be parsed!"
end
end
def to_source_files(list) do
Enum.map(list, &to_source_file/1)
end
end
defmodule CredoCheckCase do
use ExUnit.Case
alias Credo.Execution.ExecutionIssues
alias Credo.SourceFile
def refute_issues(source_file, check \\ nil, params \\ []) do
issues = issues_for(source_file, check, create_config(), params)
assert [] == issues,
"There should be no issues, got #{Enum.count(issues)}: #{to_inspected(issues)}"
issues
end
def assert_issue(source_file, callback) when is_function(callback) do
assert_issue(source_file, nil, [], callback)
end
def assert_issue(source_file, check, callback) when is_function(callback) do
assert_issue(source_file, check, [], callback)
end
def assert_issue(source_file, check \\ nil, params \\ [], callback \\ nil) do
issues = issues_for(source_file, check, create_config(), params)
refute Enum.empty?(issues), "There should be one issue, got none."
assert Enum.count(issues) == 1,
"There should be only 1 issue, got #{Enum.count(issues)}: #{to_inspected(issues)}"
if callback do
issues |> List.first() |> callback.()
end
issues
end
def assert_issues(source_file, callback) when is_function(callback) do
assert_issues(source_file, nil, [], callback)
end
def assert_issues(source_file, check, callback) when is_function(callback) do
assert_issues(source_file, check, [], callback)
end
def assert_issues(source_file, check \\ nil, params \\ [], callback \\ nil) do
issues = issues_for(source_file, check, create_config(), params)
assert Enum.count(issues) > 0, "There should be multiple issues, got none."
assert Enum.count(issues) > 1,
"There should be more than one issue, got: #{to_inspected(issues)}"
if callback, do: callback.(issues)
issues
end
defp issues_for(source_files, nil, exec, _) when is_list(source_files) do
source_files
|> Enum.flat_map(&get_issues_from_source_file(&1, exec))
|> Enum.map(fn
%Credo.Issue{} = issue ->
issue
value ->
raise "Expected %Issue{}, got: #{inspect(value)}"
end)
end
defp issues_for(source_files, check, _exec, params)
when is_list(source_files) do
exec = create_config()
if check.run_on_all? do
:ok = check.run(source_files, exec, params)
source_files
|> Enum.flat_map(&(&1 |> get_issues_from_source_file(exec)))
else
source_files
|> check.run(params)
|> Enum.flat_map(&(&1 |> get_issues_from_source_file(exec)))
end
end
defp issues_for(%SourceFile{} = source_file, nil, exec, _) do
source_file |> get_issues_from_source_file(exec)
end
defp issues_for(%SourceFile{} = source_file, check, _exec, params) do
_issues = check.run(source_file, params)
end
def assert_trigger([issue], trigger), do: [assert_trigger(issue, trigger)]
def assert_trigger(issue, trigger) do
assert trigger == issue.trigger
issue
end
def to_inspected(value) do
value
|> Inspect.Algebra.to_doc(%Inspect.Opts{})
|> Inspect.Algebra.format(50)
|> Enum.join("")
end
defp create_config do
%Credo.Execution{}
|> Credo.Execution.ExecutionSourceFiles.start_server()
|> Credo.Execution.ExecutionIssues.start_server()
|> Credo.Execution.ExecutionTiming.start_server()
end
defp get_issues_from_source_file(source_file, exec) do
ExecutionIssues.get(exec, source_file)
end
end
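The `to_inspected/1` helper above leans on `Inspect.Algebra` to pretty-print arbitrary terms within a width budget. A minimal standalone sketch of the same pipeline (illustration only, not part of the repo; `50` is the same width limit used above, and `IO.iodata_to_binary/1` stands in for the `Enum.join/2` flattening step):

```elixir
# Render a term as an inspect-style string with a 50-column width budget,
# mirroring to_inspected/1 above. format/2 returns iodata, so flatten it.
doc = Inspect.Algebra.to_doc([a: 1, b: 2], %Inspect.Opts{})
rendered = doc |> Inspect.Algebra.format(50) |> IO.iodata_to_binary()
IO.puts(rendered)
```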
# ---- File: lib/hexpm/web/views/package_view.ex (repo: lau/hexpm, license: Apache-2.0) ----
defmodule Hexpm.Web.PackageView do
use Hexpm.Web, :view
def show_sort_info(nil), do: show_sort_info(:name)
def show_sort_info(:name), do: "Sort: Name"
def show_sort_info(:inserted_at), do: "Sort: Recently created"
def show_sort_info(:updated_at), do: "Sort: Recently updated"
def show_sort_info(:total_downloads), do: "Sort: Total downloads"
def show_sort_info(:recent_downloads), do: "Sort: Recent downloads"
def show_sort_info(_param), do: nil
def downloads_for_package(package, downloads) do
Map.get(downloads, package.id, %{"all" => 0, "recent" => 0})
end
def display_downloads(package_downloads, view) do
case view do
:recent_downloads ->
Map.get(package_downloads, "recent")
_ ->
Map.get(package_downloads, "all")
end
end
def display_downloads_for_opposite_views(package_downloads, view) do
case view do
:recent_downloads ->
downloads = display_downloads(package_downloads, :all) || 0
"total downloads: #{human_number_space(downloads)}"
_ ->
downloads = display_downloads(package_downloads, :recent_downloads) || 0
"recent downloads: #{human_number_space(downloads)}"
end
end
def display_downloads_view_title(view) do
case view do
:recent_downloads -> "recent downloads"
_ -> "total downloads"
end
end
def dep_snippet(:mix, package, release) do
version = snippet_version(:mix, release.version)
app_name = release.meta.app || package.name
organization = snippet_organization(package.organization.name)
if package.name == app_name do
"{:#{package.name}, \"#{version}\"#{organization}}"
else
"{#{app_name(:mix, app_name)}, \"#{version}\", hex: :#{package.name}#{organization}}"
end
end
def dep_snippet(:rebar, package, release) do
version = snippet_version(:rebar, release.version)
app_name = release.meta.app || package.name
if package.name == app_name do
"{#{package.name}, \"#{version}\"}"
else
"{#{app_name(:rebar, app_name)}, \"#{version}\", {pkg, #{package.name}}}"
end
end
def dep_snippet(:erlang_mk, package, release) do
version = snippet_version(:erlang_mk, release.version)
"dep_#{package.name} = hex #{version}"
end
def snippet_version(:mix, %Version{major: 0, minor: minor, patch: patch, pre: []}) do
"~> 0.#{minor}.#{patch}"
end
def snippet_version(:mix, %Version{major: major, minor: minor, pre: []}) do
"~> #{major}.#{minor}"
end
def snippet_version(:mix, %Version{major: major, minor: minor, patch: patch, pre: pre}) do
"~> #{major}.#{minor}.#{patch}#{pre_snippet(pre)}"
end
def snippet_version(other, %Version{major: major, minor: minor, patch: patch, pre: pre})
when other in [:rebar, :erlang_mk] do
"#{major}.#{minor}.#{patch}#{pre_snippet(pre)}"
end
defp snippet_organization("hexpm"), do: ""
defp snippet_organization(organization), do: ", organization: #{inspect(organization)}"
defp pre_snippet([]), do: ""
defp pre_snippet(pre) do
"-" <>
Enum.map_join(pre, ".", fn
int when is_integer(int) -> Integer.to_string(int)
string when is_binary(string) -> string
end)
end
@elixir_atom_chars ~r"^[a-zA-Z_][a-zA-Z_0-9]*$"
@erlang_atom_chars ~r"^[a-z][a-zA-Z_0-9]*$"
defp app_name(:mix, name) do
if Regex.match?(@elixir_atom_chars, name) do
":#{name}"
else
":#{inspect(name)}"
end
end
defp app_name(:rebar, name) do
if Regex.match?(@erlang_atom_chars, name) do
name
else
inspect(String.to_charlist(name))
end
end
def retirement_message(retirement) do
reason = ReleaseRetirement.reason_text(retirement.reason)
["Retired package"] ++
cond do
reason && retirement.message ->
["; reason: ", reason, " - ", retirement.message]
reason ->
["; reason: ", reason]
retirement.message ->
["; reason: ", retirement.message]
true ->
[]
end
end
def retirement_html(retirement) do
reason = ReleaseRetirement.reason_text(retirement.reason)
cond do
reason && retirement.message ->
[content_tag(:strong, "Retired package;"), " reason: ", reason, " - ", retirement.message]
reason ->
[content_tag(:strong, "Retired package;"), " reason: ", reason]
retirement.message ->
[content_tag(:strong, "Retired package;"), " reason: ", retirement.message]
true ->
[content_tag(:strong, "Retired package")]
end
end
end
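A standalone sanity check of the `snippet_version(:mix, ...)` rule above for a stable release (illustration only, using Elixir's built-in `Version`; the `"1.2.3"` input is a made-up example, not from the repo): the patch component is dropped and a pessimistic `~>` requirement is emitted.

```elixir
# Mirror snippet_version(:mix, ...) for a stable (non-0.x, no-pre) release:
# drop the patch and emit a pessimistic ~> major.minor requirement.
{:ok, v} = Version.parse("1.2.3")
requirement = "~> #{v.major}.#{v.minor}"
IO.puts(requirement)
# The requirement admits any later 1.x release...
IO.puts(Version.match?("1.5.0", requirement))
# ...but not the next major version.
IO.puts(Version.match?("2.0.0", requirement))
```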
| 28.792453 | 98 | 0.637178 |
e860cc9038beba3a66ab80c5e6de61f2c1922638 | 6,206 | exs | Elixir | test/exqlite/integration_test.exs | laszlohegedus/exqlite | ed0668228fc668cf6d49e1989614eaf02c5a2dd9 | [
"MIT"
] | null | null | null | test/exqlite/integration_test.exs | laszlohegedus/exqlite | ed0668228fc668cf6d49e1989614eaf02c5a2dd9 | [
"MIT"
] | null | null | null | test/exqlite/integration_test.exs | laszlohegedus/exqlite | ed0668228fc668cf6d49e1989614eaf02c5a2dd9 | [
"MIT"
] | null | null | null | defmodule Exqlite.IntegrationTest do
use ExUnit.Case
alias Exqlite.Connection
alias Exqlite.Sqlite3
alias Exqlite.Query
test "simple prepare execute and close" do
path = Temp.path!()
{:ok, db} = Sqlite3.open(path)
:ok = Sqlite3.execute(db, "create table test (id integer primary key, stuff text)")
:ok = Sqlite3.close(db)
{:ok, conn} = Connection.connect(database: path)
{:ok, query, _} =
%Exqlite.Query{statement: "SELECT * FROM test WHERE id = :id"}
|> Connection.handle_prepare([2], conn)
{:ok, _query, result, conn} = Connection.handle_execute(query, [2], [], conn)
assert result
{:ok, _, conn} = Connection.handle_close(query, [], conn)
assert conn
File.rm(path)
end
test "transaction handling with concurrent connections" do
path = Temp.path!()
{:ok, conn1} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
{:ok, conn2} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
{:ok, _result, conn1} = Connection.handle_begin([], conn1)
assert conn1.transaction_status == :transaction
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn1} = Connection.handle_execute(query, [], [], conn1)
{:ok, _result, conn1} = Connection.handle_rollback([], conn1)
assert conn1.transaction_status == :idle
{:ok, _result, conn2} = Connection.handle_begin([], conn2)
assert conn2.transaction_status == :transaction
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn2} = Connection.handle_execute(query, [], [], conn2)
{:ok, _result, conn2} = Connection.handle_rollback([], conn2)
assert conn2.transaction_status == :idle
File.rm(path)
end
test "handles busy correctly" do
path = Temp.path!()
{:ok, conn1} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory,
busy_timeout: 0
)
{:ok, conn2} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory,
busy_timeout: 0
)
{:ok, _result, conn1} = Connection.handle_begin([mode: :immediate], conn1)
assert conn1.transaction_status == :transaction
{:disconnect, _err, conn2} = Connection.handle_begin([mode: :immediate], conn2)
assert conn2.transaction_status == :idle
{:ok, _result, conn1} = Connection.handle_commit([mode: :immediate], conn1)
assert conn1.transaction_status == :idle
{:ok, _result, conn2} = Connection.handle_begin([mode: :immediate], conn2)
assert conn2.transaction_status == :transaction
{:ok, _result, conn2} = Connection.handle_commit([mode: :immediate], conn2)
assert conn2.transaction_status == :idle
Connection.disconnect(nil, conn1)
Connection.disconnect(nil, conn2)
File.rm(path)
end
test "transaction with interleaved connections" do
path = Temp.path!()
{:ok, conn1} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
{:ok, conn2} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
{:ok, _result, conn1} = Connection.handle_begin([mode: :immediate], conn1)
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn1} = Connection.handle_execute(query, [], [], conn1)
# transaction overlap
{:ok, _result, conn2} = Connection.handle_begin([], conn2)
assert conn2.transaction_status == :transaction
{:ok, _result, conn1} = Connection.handle_rollback([], conn1)
assert conn1.transaction_status == :idle
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn2} = Connection.handle_execute(query, [], [], conn2)
{:ok, _result, conn2} = Connection.handle_rollback([], conn2)
assert conn2.transaction_status == :idle
Connection.disconnect(nil, conn1)
Connection.disconnect(nil, conn2)
File.rm(path)
end
test "transaction handling with single connection" do
path = Temp.path!()
{:ok, conn1} =
Connection.connect(
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
{:ok, _result, conn1} = Connection.handle_begin([], conn1)
assert conn1.transaction_status == :transaction
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn1} = Connection.handle_execute(query, [], [], conn1)
{:ok, _result, conn1} = Connection.handle_rollback([], conn1)
assert conn1.transaction_status == :idle
{:ok, _result, conn1} = Connection.handle_begin([], conn1)
assert conn1.transaction_status == :transaction
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _query, _result, conn1} = Connection.handle_execute(query, [], [], conn1)
{:ok, _result, conn1} = Connection.handle_rollback([], conn1)
assert conn1.transaction_status == :idle
File.rm(path)
end
test "exceeding timeout" do
path = Temp.path!()
{:ok, conn} =
DBConnection.start_link(Connection,
idle_interval: 5_000,
database: path,
journal_mode: :wal,
cache_size: -64_000,
temp_store: :memory
)
query = %Query{statement: "create table foo(id integer, val integer)"}
{:ok, _, _} = DBConnection.execute(conn, query, [])
values = for i <- 1..10_001, do: "(#{i}, #{i})"
query = %Query{
statement: "insert into foo(id, val) values #{Enum.join(values, ",")}"
}
{:ok, _, _} = DBConnection.execute(conn, query, [])
query = %Query{statement: "select * from foo"}
{:ok, _, _} = DBConnection.execute(conn, query, [], timeout: 1)
File.rm(path)
end
end
# ---- File: web/views/layout_view.ex (repo: cavneb/elixir_casts, license: MIT) ----
defmodule ElixirCasts.LayoutView do
use ElixirCasts.Web, :view
end
# ---- File: clients/you_tube/lib/google_api/you_tube/v3/model/guide_category_list_response.ex (repo: nuxlli/elixir-google-api, license: Apache-2.0) ----
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.YouTube.V3.Model.GuideCategoryListResponse do
@moduledoc """
## Attributes
- etag (String.t): Etag of this resource. Defaults to: `null`.
- eventId (String.t): Serialized EventId of the request which produced this response. Defaults to: `null`.
- items ([GuideCategory]): A list of categories that can be associated with YouTube channels. In this map, the category ID is the map key, and its value is the corresponding guideCategory resource. Defaults to: `null`.
- kind (String.t): Identifies what kind of resource this is. Value: the fixed string \"youtube#guideCategoryListResponse\". Defaults to: `null`.
- nextPageToken (String.t): The token that can be used as the value of the pageToken parameter to retrieve the next page in the result set. Defaults to: `null`.
- pageInfo (PageInfo): Defaults to: `null`.
- prevPageToken (String.t): The token that can be used as the value of the pageToken parameter to retrieve the previous page in the result set. Defaults to: `null`.
- tokenPagination (TokenPagination): Defaults to: `null`.
- visitorId (String.t): The visitorId identifies the visitor. Defaults to: `null`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:etag => any(),
:eventId => any(),
:items => list(GoogleApi.YouTube.V3.Model.GuideCategory.t()),
:kind => any(),
:nextPageToken => any(),
:pageInfo => GoogleApi.YouTube.V3.Model.PageInfo.t(),
:prevPageToken => any(),
:tokenPagination => GoogleApi.YouTube.V3.Model.TokenPagination.t(),
:visitorId => any()
}
field(:etag)
field(:eventId)
field(:items, as: GoogleApi.YouTube.V3.Model.GuideCategory, type: :list)
field(:kind)
field(:nextPageToken)
field(:pageInfo, as: GoogleApi.YouTube.V3.Model.PageInfo)
field(:prevPageToken)
field(:tokenPagination, as: GoogleApi.YouTube.V3.Model.TokenPagination)
field(:visitorId)
end
defimpl Poison.Decoder, for: GoogleApi.YouTube.V3.Model.GuideCategoryListResponse do
def decode(value, options) do
GoogleApi.YouTube.V3.Model.GuideCategoryListResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.YouTube.V3.Model.GuideCategoryListResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# ---- File: lib/conduit/encoding/gzip.ex (repo: log4b/conduit, license: MIT) ----
defmodule Conduit.Encoding.GZip do
use Conduit.Encoding
@moduledoc """
Handles encoding to and from gzip.
"""
@doc """
Encodes the message body to gzip.
## Examples
iex> import Conduit.Message
iex> message =
iex> %Conduit.Message{}
iex> |> put_body("{}")
iex> |> Conduit.Encoding.GZip.encode([])
iex> :zlib.gunzip(message.body)
"{}"
"""
def encode(message, _opts) do
put_body(message, :zlib.gzip(message.body))
end
@doc """
Decodes the message body from gzip.
## Examples
iex> import Conduit.Message
iex> body = <<31, 139, 8, 0, 0, 0, 0, 0, 0, 3, 171, 174, 5, 0, 67, 191, 166, 163, 2, 0, 0, 0>>
iex> message =
iex> %Conduit.Message{}
iex> |> put_body(body)
iex> |> Conduit.Encoding.GZip.decode([])
iex> message.body
"{}"
"""
def decode(message, _opts) do
put_body(message, :zlib.gunzip(message.body))
end
end
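Both directions above delegate to Erlang's built-in `:zlib`. A standalone round-trip sketch, independent of Conduit (the JSON-ish body is a made-up example):

```elixir
# gzip then gunzip a body, as Conduit.Encoding.GZip does around put_body/2.
body = ~s({"k":"v"})
compressed = :zlib.gzip(body)
# The compressed form starts with the gzip magic bytes 0x1F 0x8B.
<<31, 139, _rest::binary>> = compressed
IO.puts(:zlib.gunzip(compressed) == body)
```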
# ---- File: lib/elixir/lib/uri.ex (repo: jmbejar/elixir, license: Apache-2.0) ----
defmodule URI do
@moduledoc """
Utilities for working with URIs.
This module provides functions for working with URIs (for example, parsing
URIs or encoding query strings). The functions in this module are implemented
according to [RFC 3986](https://tools.ietf.org/html/rfc3986).
URIs are structs behind the scenes. You can access the URI fields directly
but you should not create a new `URI` directly via the struct syntax. Instead
use the functions in this module.
"""
defstruct scheme: nil,
path: nil,
query: nil,
fragment: nil,
authority: nil,
userinfo: nil,
host: nil,
port: nil
@type t :: %__MODULE__{
authority: authority,
fragment: nil | binary,
host: nil | binary,
path: nil | binary,
port: nil | :inet.port_number(),
query: nil | binary,
scheme: nil | binary,
userinfo: nil | binary
}
@typedoc deprecated: "The authority field is deprecated"
@opaque authority :: nil | binary
defmodule Error do
defexception [:action, :reason, :part]
def message(%Error{action: action, reason: reason, part: part}) do
"cannot #{action} due to reason #{reason}: #{inspect(part)}"
end
end
import Bitwise
@reserved_characters ':/?#[]@!$&\'()*+,;='
@formatted_reserved_characters Enum.map_join(@reserved_characters, ", ", &<<?`, &1, ?`>>)
@doc """
Returns the default port for a given `scheme`.
If the scheme is unknown to the `URI` module, this function returns
`nil`. The default port for any scheme can be configured globally
via `default_port/2`.
## Examples
iex> URI.default_port("ftp")
21
iex> URI.default_port("ponzi")
nil
"""
@spec default_port(binary) :: nil | non_neg_integer
def default_port(scheme) when is_binary(scheme) do
:elixir_config.get({:uri, scheme}, nil)
end
@doc """
Registers the default `port` for the given `scheme`.
After this function is called, `port` will be returned by
`default_port/1` for the given scheme `scheme`. Note that this function
changes the default port for the given `scheme` *globally*, meaning for
every application.
It is recommended for this function to be invoked in your
application's start callback in case you want to register
new URIs.
"""
@spec default_port(binary, non_neg_integer) :: :ok
def default_port(scheme, port) when is_binary(scheme) and is_integer(port) and port >= 0 do
:elixir_config.put({:uri, scheme}, port)
end
@doc """
Encodes `enumerable` into a query string using `encoding`.
Takes an enumerable that enumerates as a list of two-element
tuples (for instance, a map or a keyword list) and returns a string
in the form of `key1=value1&key2=value2...`.
Keys and values can be any term that implements the `String.Chars`
protocol with the exception of lists, which are explicitly forbidden.
You can specify one of the following `encoding` strategies:
* `:www_form` - (default, since v1.12.0) keys and values are URL encoded as
per `encode_www_form/1`. This is the format typically used by browsers on
query strings and form data. It encodes " " as "+".
* `:rfc3986` - (since v1.12.0) the same as `:www_form` except it encodes
" " as "%20" according [RFC 3986](https://tools.ietf.org/html/rfc3986).
This is the best option if you are encoding in a non-browser situation,
since encoding spaces as "+" can be ambiguous to URI parsers. This can
inadvertently lead to spaces being interpreted as literal plus signs.
Encoding defaults to `:www_form` for backward compatibility.
## Examples
iex> query = %{"foo" => 1, "bar" => 2}
iex> URI.encode_query(query)
"bar=2&foo=1"
iex> query = %{"key" => "value with spaces"}
iex> URI.encode_query(query)
"key=value+with+spaces"
iex> query = %{"key" => "value with spaces"}
iex> URI.encode_query(query, :rfc3986)
"key=value%20with%20spaces"
iex> URI.encode_query(%{key: [:a, :list]})
** (ArgumentError) encode_query/2 values cannot be lists, got: [:a, :list]
"""
@spec encode_query(Enumerable.t(), :rfc3986 | :www_form) :: binary
def encode_query(enumerable, encoding \\ :www_form) do
Enum.map_join(enumerable, "&", &encode_kv_pair(&1, encoding))
end
defp encode_kv_pair({key, _}, _encoding) when is_list(key) do
raise ArgumentError, "encode_query/2 keys cannot be lists, got: #{inspect(key)}"
end
defp encode_kv_pair({_, value}, _encoding) when is_list(value) do
raise ArgumentError, "encode_query/2 values cannot be lists, got: #{inspect(value)}"
end
defp encode_kv_pair({key, value}, :rfc3986) do
encode(Kernel.to_string(key), &char_unreserved?/1) <>
"=" <> encode(Kernel.to_string(value), &char_unreserved?/1)
end
defp encode_kv_pair({key, value}, :www_form) do
encode_www_form(Kernel.to_string(key)) <> "=" <> encode_www_form(Kernel.to_string(value))
end
@doc """
Decodes `query` into a map.
Given a query string in the form of `key1=value1&key2=value2...`, this
function inserts each key-value pair in the query string as one entry in the
given `map`. Keys and values in the resulting map will be binaries. Keys and
values will be percent-unescaped.
You can specify one of the following `encoding` options:
* `:www_form` - (default, since v1.12.0) keys and values are decoded as per
`decode_www_form/1`. This is the format typically used by browsers on
query strings and form data. It decodes "+" as " ".
* `:rfc3986` - (since v1.12.0) keys and values are decoded as per
`decode/1`. The result is the same as `:www_form` except for leaving "+"
as is in line with [RFC 3986](https://tools.ietf.org/html/rfc3986).
Encoding defaults to `:www_form` for backward compatibility.
Use `query_decoder/1` if you want to iterate over each value manually.
## Examples
iex> URI.decode_query("foo=1&bar=2")
%{"bar" => "2", "foo" => "1"}
iex> URI.decode_query("percent=oh+yes%21", %{"starting" => "map"})
%{"percent" => "oh yes!", "starting" => "map"}
iex> URI.decode_query("percent=oh+yes%21", %{}, :rfc3986)
%{"percent" => "oh+yes!"}
"""
@spec decode_query(binary, %{optional(binary) => binary}, :rfc3986 | :www_form) :: %{
optional(binary) => binary
}
def decode_query(query, map \\ %{}, encoding \\ :www_form)
def decode_query(query, %_{} = dict, encoding) when is_binary(query) do
IO.warn(
"URI.decode_query/3 expects the second argument to be a map, other usage is deprecated"
)
decode_query_into_dict(query, dict, encoding)
end
def decode_query(query, map, encoding) when is_binary(query) and is_map(map) do
decode_query_into_map(query, map, encoding)
end
def decode_query(query, dict, encoding) when is_binary(query) do
IO.warn(
"URI.decode_query/3 expects the second argument to be a map, other usage is deprecated"
)
decode_query_into_dict(query, dict, encoding)
end
defp decode_query_into_map(query, map, encoding) do
case decode_next_query_pair(query, encoding) do
nil ->
map
{{key, value}, rest} ->
decode_query_into_map(rest, Map.put(map, key, value), encoding)
end
end
defp decode_query_into_dict(query, dict, encoding) do
case decode_next_query_pair(query, encoding) do
nil ->
dict
{{key, value}, rest} ->
# Avoid warnings about Dict being deprecated
dict_module = Dict
decode_query_into_dict(rest, dict_module.put(dict, key, value), encoding)
end
end
@doc """
Returns a stream of two-element tuples representing key-value pairs in the
given `query`.
Key and value in each tuple will be binaries and will be percent-unescaped.
You can specify one of the following `encoding` options:
* `:www_form` - (default, since v1.12.0) keys and values are decoded as per
`decode_www_form/1`. This is the format typically used by browsers on
query strings and form data. It decodes "+" as " ".
* `:rfc3986` - (since v1.12.0) keys and values are decoded as per
`decode/1`. The result is the same as `:www_form` except for leaving "+"
as is in line with [RFC 3986](https://tools.ietf.org/html/rfc3986).
Encoding defaults to `:www_form` for backward compatibility.
## Examples
iex> URI.query_decoder("foo=1&bar=2") |> Enum.to_list()
[{"foo", "1"}, {"bar", "2"}]
iex> URI.query_decoder("food=bread%26butter&drinks=tap%20water+please") |> Enum.to_list()
[{"food", "bread&butter"}, {"drinks", "tap water please"}]
iex> URI.query_decoder("food=bread%26butter&drinks=tap%20water+please", :rfc3986) |> Enum.to_list()
[{"food", "bread&butter"}, {"drinks", "tap water+please"}]
"""
@spec query_decoder(binary, :rfc3986 | :www_form) :: Enumerable.t()
def query_decoder(query, encoding \\ :www_form) when is_binary(query) do
Stream.unfold(query, &decode_next_query_pair(&1, encoding))
end
defp decode_next_query_pair("", _encoding) do
nil
end
defp decode_next_query_pair(query, encoding) do
{undecoded_next_pair, rest} =
case :binary.split(query, "&") do
[next_pair, rest] -> {next_pair, rest}
[next_pair] -> {next_pair, ""}
end
next_pair =
case :binary.split(undecoded_next_pair, "=") do
[key, value] ->
{decode_with_encoding(key, encoding), decode_with_encoding(value, encoding)}
[key] ->
{decode_with_encoding(key, encoding), ""}
end
{next_pair, rest}
end
defp decode_with_encoding(string, :www_form) do
decode_www_form(string)
end
defp decode_with_encoding(string, :rfc3986) do
decode(string)
end
@doc ~s"""
Checks if `character` is a reserved one in a URI.
As specified in [RFC 3986, section 2.2](https://tools.ietf.org/html/rfc3986#section-2.2),
the following characters are reserved: #{@formatted_reserved_characters}
## Examples
iex> URI.char_reserved?(?+)
true
"""
@spec char_reserved?(byte) :: boolean
def char_reserved?(character) do
character in @reserved_characters
end
@doc """
Checks if `character` is an unreserved one in a URI.
As specified in [RFC 3986, section 2.3](https://tools.ietf.org/html/rfc3986#section-2.3),
the following characters are unreserved:
* Alphanumeric characters: `A-Z`, `a-z`, `0-9`
* `~`, `_`, `-`, `.`
## Examples
iex> URI.char_unreserved?(?_)
true
"""
@spec char_unreserved?(byte) :: boolean
def char_unreserved?(character) do
character in ?0..?9 or character in ?a..?z or character in ?A..?Z or character in '~_-.'
end
@doc """
Checks if `character` is allowed unescaped in a URI.
This is the default used by `URI.encode/2` where both
[reserved](`char_reserved?/1`) and [unreserved characters](`char_unreserved?/1`)
are kept unescaped.
## Examples
iex> URI.char_unescaped?(?{)
false
"""
@spec char_unescaped?(byte) :: boolean
def char_unescaped?(character) do
char_reserved?(character) or char_unreserved?(character)
end
@doc """
Percent-escapes all characters that require escaping in `string`.
This means reserved characters, such as `:` and `/`, and the
so-called unreserved characters, which have the same meaning both
escaped and unescaped, won't be escaped by default.
See `encode_www_form/1` if you are interested in escaping reserved
characters too.
This function also accepts a `predicate` function as an optional
argument. If passed, this function will be called with each byte
in `string` as its argument and should return a truthy value (anything other
than `false` or `nil`) if the given byte should be left as is, or return a
falsy value (`false` or `nil`) if the character should be escaped. Defaults
to `URI.char_unescaped?/1`.
## Examples
iex> URI.encode("ftp://s-ite.tld/?value=put it+й")
"ftp://s-ite.tld/?value=put%20it+%D0%B9"
iex> URI.encode("a string", &(&1 != ?i))
"a str%69ng"
"""
@spec encode(binary, (byte -> as_boolean(term))) :: binary
def encode(string, predicate \\ &char_unescaped?/1)
when is_binary(string) and is_function(predicate, 1) do
for <<byte <- string>>, into: "", do: percent(byte, predicate)
end
@doc """
Encodes `string` as "x-www-form-urlencoded".
Note "x-www-form-urlencoded" is not specified as part of
RFC 3986. However, it is a commonly used format to encode
query strings and form data by browsers.
## Example
iex> URI.encode_www_form("put: it+й")
"put%3A+it%2B%D0%B9"
"""
@spec encode_www_form(binary) :: binary
def encode_www_form(string) when is_binary(string) do
for <<byte <- string>>, into: "" do
case percent(byte, &char_unreserved?/1) do
"%20" -> "+"
percent -> percent
end
end
end
defp percent(char, predicate) do
if predicate.(char) do
<<char>>
else
<<"%", hex(bsr(char, 4)), hex(band(char, 15))>>
end
end
defp hex(n) when n <= 9, do: n + ?0
defp hex(n), do: n + ?A - 10
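As a standalone sketch (hypothetical code, not part of the module), the `percent/2` and `hex/1` helpers above amount to the following: a byte rejected by the predicate becomes `%` followed by two uppercase hex digits.

```elixir
import Bitwise

# Illustrative copies of the private helpers above, for tracing only.
hex = fn
  n when n <= 9 -> n + ?0
  n -> n + ?A - 10
end

percent = fn byte ->
  <<"%", hex.(bsr(byte, 4)), hex.(band(byte, 15))>>
end

percent.(?\s)
# => "%20"
```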
@doc """
Percent-unescapes a URI.
## Examples
iex> URI.decode("https%3A%2F%2Felixir-lang.org")
"https://elixir-lang.org"
"""
@spec decode(binary) :: binary
def decode(uri) do
unpercent(uri, "", false)
end
@doc """
Decodes `string` as "x-www-form-urlencoded".
Note "x-www-form-urlencoded" is not specified as part of
RFC 3986. However, it is a commonly used format to encode
query strings and form data by browsers.
## Examples
iex> URI.decode_www_form("%3Call+in%2F")
"<all in/"
"""
@spec decode_www_form(binary) :: binary
def decode_www_form(string) when is_binary(string) do
unpercent(string, "", true)
end
defp unpercent(<<?+, tail::binary>>, acc, spaces = true) do
unpercent(tail, <<acc::binary, ?\s>>, spaces)
end
defp unpercent(<<?%, tail::binary>>, acc, spaces) do
with <<hex1, hex2, tail::binary>> <- tail,
dec1 when is_integer(dec1) <- hex_to_dec(hex1),
dec2 when is_integer(dec2) <- hex_to_dec(hex2) do
unpercent(tail, <<acc::binary, bsl(dec1, 4) + dec2>>, spaces)
else
_ -> unpercent(tail, <<acc::binary, ?%>>, spaces)
end
end
defp unpercent(<<head, tail::binary>>, acc, spaces) do
unpercent(tail, <<acc::binary, head>>, spaces)
end
defp unpercent(<<>>, acc, _spaces), do: acc
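Because of the `with`/`else` clause above, a `%` that is not followed by two hex digits simply passes through instead of raising; only well-formed escapes are decoded:

```elixir
URI.decode("%3Fa%ZZb")
# "%3F" decodes to "?", while the malformed "%ZZ" is left as-is
# => "?a%ZZb"
```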
@compile {:inline, hex_to_dec: 1}
defp hex_to_dec(n) when n in ?A..?F, do: n - ?A + 10
defp hex_to_dec(n) when n in ?a..?f, do: n - ?a + 10
defp hex_to_dec(n) when n in ?0..?9, do: n - ?0
defp hex_to_dec(_n), do: nil
@doc """
Creates a new URI struct from a URI or a string.
If a `%URI{}` struct is given, it returns `{:ok, uri}`. If a string is
given, it will parse and validate it. If the string is valid, it returns
`{:ok, uri}`, otherwise it returns `{:error, part}` with the invalid part
of the URI. For parsing URIs without further validation, see `parse/1`.
This function can parse both absolute and relative URLs. You can check
if a URI is absolute or relative by checking if the `scheme` field is
`nil` or not.
When a URI is given without a port, the value returned by `URI.default_port/1`
for the URI's scheme is used for the `:port` field. The scheme is also
normalized to lowercase.
## Examples
iex> URI.new("https://elixir-lang.org/")
{:ok, %URI{
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: 443,
query: nil,
scheme: "https",
userinfo: nil
}}
iex> URI.new("//elixir-lang.org/")
{:ok, %URI{
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}}
iex> URI.new("/foo/bar")
{:ok, %URI{
fragment: nil,
host: nil,
path: "/foo/bar",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}}
iex> URI.new("foo/bar")
{:ok, %URI{
fragment: nil,
host: nil,
path: "foo/bar",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}}
iex> URI.new("//[fe80::]/")
{:ok, %URI{
fragment: nil,
host: "fe80::",
path: "/",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}}
iex> URI.new("https:?query")
{:ok, %URI{
fragment: nil,
host: nil,
path: nil,
port: 443,
query: "query",
scheme: "https",
userinfo: nil
}}
iex> URI.new("/invalid_greater_than_in_path/>")
{:error, ">"}
Giving an existing URI simply returns it wrapped in a tuple:
iex> {:ok, uri} = URI.new("https://elixir-lang.org/")
iex> URI.new(uri)
{:ok, %URI{
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: 443,
query: nil,
scheme: "https",
userinfo: nil
}}
"""
@doc since: "1.13.0"
@spec new(t() | String.t()) :: {:ok, t()} | {:error, String.t()}
def new(%URI{} = uri), do: {:ok, uri}
def new(binary) when is_binary(binary) do
case :uri_string.parse(binary) do
%{} = map -> {:ok, uri_from_map(map)}
{:error, :invalid_uri, term} -> {:error, Kernel.to_string(term)}
end
end
@doc """
Similar to `new/1` but raises `URI.Error` if an invalid string is given.
## Examples
iex> URI.new!("https://elixir-lang.org/")
%URI{
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: 443,
query: nil,
scheme: "https",
userinfo: nil
}
iex> URI.new!("/invalid_greater_than_in_path/>")
** (URI.Error) cannot parse due to reason invalid_uri: ">"
Giving an existing URI simply returns it:
iex> uri = URI.new!("https://elixir-lang.org/")
iex> URI.new!(uri)
%URI{
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: 443,
query: nil,
scheme: "https",
userinfo: nil
}
"""
@doc since: "1.13.0"
@spec new!(t() | String.t()) :: t()
def new!(%URI{} = uri), do: uri
def new!(binary) when is_binary(binary) do
case :uri_string.parse(binary) do
%{} = map ->
uri_from_map(map)
{:error, reason, part} ->
raise Error, action: :parse, reason: reason, part: Kernel.to_string(part)
end
end
defp uri_from_map(%{path: ""} = map), do: uri_from_map(%{map | path: nil})
defp uri_from_map(map) do
uri = Map.merge(%URI{}, map)
case map do
%{scheme: scheme} ->
scheme = String.downcase(scheme, :ascii)
case map do
%{port: _} ->
%{uri | scheme: scheme}
%{} ->
case default_port(scheme) do
nil -> %{uri | scheme: scheme}
port -> %{uri | scheme: scheme, port: port}
end
end
%{} ->
uri
end
end
@doc """
Parses a URI into its components, without further validation.
This function can parse both absolute and relative URLs. You can check
if a URI is absolute or relative by checking if the `scheme` field is
nil or not. Furthermore, this function expects both absolute and
relative URIs to be well-formed and does not perform any validation.
See the "Examples" section below. Use `new/1` if you want more strict
validation.
When a URI is given without a port, the value returned by `URI.default_port/1`
for the URI's scheme is used for the `:port` field. The scheme is also
normalized to lowercase.
If a `%URI{}` struct is given to this function, this function returns it
unmodified.
> Note: this function sets the field :authority for backwards
> compatibility reasons but it is deprecated.
## Examples
iex> URI.parse("https://elixir-lang.org/")
%URI{
authority: "elixir-lang.org",
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: 443,
query: nil,
scheme: "https",
userinfo: nil
}
iex> URI.parse("//elixir-lang.org/")
%URI{
authority: "elixir-lang.org",
fragment: nil,
host: "elixir-lang.org",
path: "/",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}
iex> URI.parse("/foo/bar")
%URI{
authority: nil,
fragment: nil,
host: nil,
path: "/foo/bar",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}
iex> URI.parse("foo/bar")
%URI{
authority: nil,
fragment: nil,
host: nil,
path: "foo/bar",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}
In contrast to `URI.new/1`, this function will parse poorly-formed
URIs, for example:
iex> URI.parse("/invalid_greater_than_in_path/>")
%URI{
authority: nil,
fragment: nil,
host: nil,
path: "/invalid_greater_than_in_path/>",
port: nil,
query: nil,
scheme: nil,
userinfo: nil
}
Another example is a URI with brackets in query strings. It is accepted
by `parse/1` but it will be refused by `new/1`:
iex> URI.parse("/?foo[bar]=baz")
%URI{
authority: nil,
fragment: nil,
host: nil,
path: "/",
port: nil,
query: "foo[bar]=baz",
scheme: nil,
userinfo: nil
}
"""
@spec parse(t | binary) :: t
def parse(%URI{} = uri), do: uri
def parse(string) when is_binary(string) do
# From https://tools.ietf.org/html/rfc3986#appendix-B
# Parts: 12 3 4 5 6 7 8 9
regex = ~r{^(([a-z][a-z0-9\+\-\.]*):)?(//([^/?#]*))?([^?#]*)(\?([^#]*))?(#(.*))?}i
parts = Regex.run(regex, string)
destructure [
_full,
# 1
_scheme_with_colon,
# 2
scheme,
# 3
authority_with_slashes,
# 4
_authority,
# 5
path,
# 6
query_with_question_mark,
# 7
_query,
# 8
_fragment_with_hash,
# 9
fragment
],
parts
path = nillify(path)
scheme = nillify(scheme)
query = nillify_query(query_with_question_mark)
{authority, userinfo, host, port} = split_authority(authority_with_slashes)
scheme = scheme && String.downcase(scheme)
port = port || (scheme && default_port(scheme))
%URI{
scheme: scheme,
path: path,
query: query,
fragment: fragment,
authority: authority,
userinfo: userinfo,
host: host,
port: port
}
end
defp nillify_query("?" <> query), do: query
defp nillify_query(_other), do: nil
# Split an authority into its userinfo, host and port parts.
#
# Note that the host field is returned *without* [] even if, according to
# RFC3986 grammar, a native IPv6 address requires them.
defp split_authority("") do
{nil, nil, nil, nil}
end
defp split_authority("//") do
{"", nil, "", nil}
end
defp split_authority("//" <> authority) do
regex = ~r/(^(.*)@)?(\[[a-zA-Z0-9:.]*\]|[^:]*)(:(\d*))?/
components = Regex.run(regex, authority)
destructure [_, _, userinfo, host, _, port], components
userinfo = nillify(userinfo)
host = if nillify(host), do: host |> String.trim_leading("[") |> String.trim_trailing("]")
port = if nillify(port), do: String.to_integer(port)
{authority, userinfo, host, port}
end
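As the comment above notes, the parsed host comes back without the square brackets that RFC 3986 requires around a literal IPv6 address:

```elixir
uri = URI.parse("//[fe80::1]:8080/path")
{uri.host, uri.port}
# => {"fe80::1", 8080}
```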
# Regex.run returns empty strings sometimes. We want
# to replace those with nil for consistency.
defp nillify(""), do: nil
defp nillify(other), do: other
@doc """
Returns the string representation of the given [URI struct](`t:t/0`).
## Examples
iex> uri = URI.parse("http://google.com")
iex> URI.to_string(uri)
"http://google.com"
iex> uri = URI.parse("foo://bar.baz")
iex> URI.to_string(uri)
"foo://bar.baz"
"""
@spec to_string(t) :: binary
defdelegate to_string(uri), to: String.Chars.URI
@doc ~S"""
Merges two URIs.
This function merges two URIs as per
[RFC 3986, section 5.2](https://tools.ietf.org/html/rfc3986#section-5.2).
## Examples
iex> URI.merge(URI.parse("http://google.com"), "/query") |> to_string()
"http://google.com/query"
iex> URI.merge("http://example.com", "http://google.com") |> to_string()
"http://google.com"
"""
@spec merge(t | binary, t | binary) :: t
def merge(uri, rel)
def merge(%URI{host: nil}, _rel) do
raise ArgumentError, "you must merge onto an absolute URI"
end
def merge(_base, %URI{scheme: rel_scheme} = rel) when rel_scheme != nil do
%{rel | path: remove_dot_segments_from_path(rel.path)}
end
def merge(base, %URI{host: host} = rel) when host != nil do
%{rel | scheme: base.scheme, path: remove_dot_segments_from_path(rel.path)}
end
def merge(%URI{} = base, %URI{path: nil} = rel) do
%{base | query: rel.query || base.query, fragment: rel.fragment}
end
def merge(%URI{} = base, %URI{} = rel) do
new_path = merge_paths(base.path, rel.path)
%{base | path: new_path, query: rel.query, fragment: rel.fragment}
end
def merge(base, rel) do
merge(parse(base), parse(rel))
end
defp merge_paths(nil, rel_path), do: merge_paths("/", rel_path)
defp merge_paths(_, "/" <> _ = rel_path), do: remove_dot_segments_from_path(rel_path)
defp merge_paths(base_path, rel_path) do
[_ | base_segments] = path_to_segments(base_path)
path_to_segments(rel_path)
|> Kernel.++(base_segments)
|> remove_dot_segments([])
|> Enum.join("/")
end
defp remove_dot_segments_from_path(nil) do
nil
end
defp remove_dot_segments_from_path(path) do
path
|> path_to_segments()
|> remove_dot_segments([])
|> Enum.join("/")
end
defp remove_dot_segments([], [head, ".." | acc]), do: remove_dot_segments([], [head | acc])
defp remove_dot_segments([], acc), do: acc
defp remove_dot_segments(["." | tail], acc), do: remove_dot_segments(tail, acc)
defp remove_dot_segments([head | tail], ["..", ".." | _] = acc),
do: remove_dot_segments(tail, [head | acc])
defp remove_dot_segments(segments, [_, ".." | acc]), do: remove_dot_segments(segments, acc)
defp remove_dot_segments([head | tail], acc), do: remove_dot_segments(tail, [head | acc])
defp path_to_segments(path) do
path |> String.split("/") |> Enum.reverse()
end
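`remove_dot_segments/2` implements the algorithm from RFC 3986, section 5.2.4, so relative references that climb with `..` resolve the way a browser would:

```elixir
URI.merge("http://example.com/a/b/c", "../../g") |> URI.to_string()
# => "http://example.com/g"
```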
@doc """
Appends `query` to the given `uri`.
The given `query` is not automatically encoded; use `encode/2` or `encode_www_form/1`.
## Examples
iex> URI.append_query(URI.parse("http://example.com/"), "x=1") |> URI.to_string()
"http://example.com/?x=1"
iex> URI.append_query(URI.parse("http://example.com/?x=1"), "y=2") |> URI.to_string()
"http://example.com/?x=1&y=2"
iex> URI.append_query(URI.parse("http://example.com/?x=1"), "x=2") |> URI.to_string()
"http://example.com/?x=1&x=2"
"""
@doc since: "1.14.0"
@spec append_query(t(), binary()) :: t()
def append_query(%URI{} = uri, query) when is_binary(query) and uri.query in [nil, ""] do
%{uri | query: query}
end
def append_query(%URI{} = uri, query) when is_binary(query) do
%{uri | query: uri.query <> "&" <> query}
end
end
defimpl String.Chars, for: URI do
def to_string(%{host: host, path: path} = uri)
when host != nil and is_binary(path) and
path != "" and binary_part(path, 0, 1) != "/" do
raise ArgumentError,
":path in URI must be empty or an absolute path if URL has a :host, got: #{inspect(uri)}"
end
def to_string(%{scheme: scheme, port: port, path: path, query: query, fragment: fragment} = uri) do
uri =
case scheme && URI.default_port(scheme) do
^port -> %{uri | port: nil}
_ -> uri
end
# Based on https://tools.ietf.org/html/rfc3986#section-5.3
authority = extract_authority(uri)
IO.iodata_to_binary([
if(scheme, do: [scheme, ?:], else: []),
if(authority, do: ["//" | authority], else: []),
if(path, do: path, else: []),
if(query, do: ["?" | query], else: []),
if(fragment, do: ["#" | fragment], else: [])
])
end
defp extract_authority(%{host: nil, authority: authority}) do
authority
end
defp extract_authority(%{host: host, userinfo: userinfo, port: port}) do
# According to the grammar at
# https://tools.ietf.org/html/rfc3986#appendix-A, a "host" can have a colon
# in it only if it's an IPv6 or "IPvFuture" address, so if there's a colon
# in the host we can safely surround it with [].
[
if(userinfo, do: [userinfo | "@"], else: []),
if(String.contains?(host, ":"), do: ["[", host | "]"], else: host),
if(port, do: [":" | Integer.to_string(port)], else: [])
]
end
end
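`extract_authority/1` re-adds the brackets stripped during parsing, so IPv6 URIs round-trip through `to_string/1`:

```elixir
URI.parse("//[fe80::]/") |> URI.to_string()
# => "//[fe80::]/"
```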
# lib/course_01_web/views/planning_view.ex (geirilja/web_dev, MIT)
defmodule Course01Web.PlanningView do
  use Course01Web, :view
end
# test/sengo_web/views/page_view_test.exs (henrikcoll/sengo, MIT)
defmodule SengoWeb.PageViewTest do
  use SengoWeb.ConnCase, async: true
end
# lib/mix/oasis/router.ex (hou8/oasis, MIT)
defmodule Mix.Oasis.Router do
@moduledoc false
defstruct [
:http_verb,
:path_schema,
:body_schema,
:header_schema,
:cookie_schema,
:query_schema,
:url,
:operation_id,
:name_space,
:pre_plug_module,
:plug_module,
:request_validator,
:plug_parsers,
:security
]
@check_parameter_fields [
"query",
"cookie",
"header",
"path"
]
@spec_ext_name_space "x-oasis-name-space"
@spec_ext_router "x-oasis-router"
@spec_ext_key_to_assigns "x-oasis-key-to-assigns"
@spec_ext_signed_headers "x-oasis-signed-headers"
def generate_files_by_paths_spec(apps, %{"paths" => paths_spec} = spec, opts)
when is_map(paths_spec) do
opts = put_opts(spec, opts)
{name_space_from_paths, paths_spec} = Map.pop(paths_spec, @spec_ext_name_space)
{plug_files, routers} = generate_plug_files(apps, paths_spec, name_space_from_paths, opts)
router_file = generate_router_file(routers, name_space_from_paths, opts)
[router_file | plug_files]
end
defp put_opts(%{"paths" => paths_spec} = spec, opts) do
# Use the :router option from the command line if given,
# otherwise fall back to the Paths Object's "x-oasis-router" extension field.
router = opts[:router] || Map.get(paths_spec, @spec_ext_router)
opts
|> Keyword.put(:router, router)
|> Keyword.put(:global_security, global_security(spec))
end
defp generate_plug_files(apps, paths_spec, name_space_from_paths, opts) do
paths_spec
|> Map.keys()
|> Enum.reduce({[], []}, fn url, acc ->
iterate_paths_spec_by_url(apps, paths_spec, url, acc, name_space_from_paths, opts)
end)
end
defp iterate_paths_spec_by_url(
apps,
paths_spec,
"/" <> _ = url,
{plug_files_acc, routers_acc},
name_space,
opts
) do
# `paths_spec` may contain extension keys starting with `x-oasis-`;
# assume any key starting with "/" is a proper URL expression.
{plug_files, routers} =
paths_spec
|> Map.get(url, %{})
|> Map.take(Oasis.Spec.Path.supported_http_verbs())
|> Enum.reduce({[], []}, fn {http_verb, operation}, {plug_files_to_url, routers_to_url} ->
operation = Map.put_new(operation, @spec_ext_name_space, name_space)
{plug_files, router} = new(apps, url, http_verb, operation, opts)
{
plug_files ++ plug_files_to_url,
[router | routers_to_url]
}
end)
{
plug_files ++ plug_files_acc,
routers ++ routers_acc
}
end
defp iterate_paths_spec_by_url(_apps, _paths_spec, _url, acc, _name_space, _opts), do: acc
defp generate_router_file(routers, name_space, opts) do
name_space = opts[:name_space] || name_space
{name_space, dir} = Mix.Oasis.name_space(name_space)
module_name = Keyword.get(opts, :router) || "Router"
{module_name, file_name} = Mix.Oasis.module_alias(module_name)
target = Path.join([dir, file_name])
routers = Enum.sort(routers, &(&1.url < &2.url))
{:eex, target, "router.ex", Module.concat([name_space, module_name]), %{routers: routers}}
end
defp new(apps, url, http_verb, operation, opts) do
router =
Map.merge(
%__MODULE__{
http_verb: http_verb,
url: url
},
build_operation(operation, opts)
)
build_files_to_generate(apps, router, opts)
end
defp build_operation(operation, opts) do
operation
|> merge_parameters_to_operation()
|> merge_request_body_to_operation()
|> merge_security_to_operation(opts)
|> merge_others_to_operation(opts)
end
defp merge_parameters_to_operation(%{"parameters" => parameters} = operation) do
router =
@check_parameter_fields
|> Enum.reduce(%{}, fn location, acc ->
parameters_to_location = Map.get(parameters, location)
params_to_schema = group_schemas_by_location(location, parameters_to_location)
to_schema_opt(params_to_schema, location, acc)
end)
{router, operation}
end
defp merge_parameters_to_operation(operation), do: {%{}, operation}
defp merge_request_body_to_operation(
{acc, %{"requestBody" => %{"content" => content} = request_body} = operation}
)
when is_map(content) do
content =
Enum.reduce(content, %{}, fn {content_type, media}, acc ->
schema = Map.get(media, "schema")
if schema != nil do
media = Map.put(media, "schema", %ExJsonSchema.Schema.Root{schema: schema})
Map.put(acc, content_type, media)
else
acc
end
end)
if content == %{} do
# skip if no "schema" defined in Media Type Object.
{acc, operation}
else
body_schema = put_required_if_exists(request_body, %{"content" => content})
{
Map.put(acc, :body_schema, body_schema),
operation
}
end
end
defp merge_request_body_to_operation({acc, operation}), do: {acc, operation}
defp put_required_if_exists(%{"required" => required}, map)
when is_boolean(required) and is_map(map) do
Map.put(map, "required", required)
end
defp put_required_if_exists(_, map), do: map
defp group_schemas_by_location(location, parameters)
when location in @check_parameter_fields and is_list(parameters) do
Enum.reduce(parameters, %{}, fn param, acc ->
map_parameter(param, acc)
end)
end
defp group_schemas_by_location(_location, _parameters), do: nil
defp map_parameter(%{"name" => name, "schema" => schema} = parameter, acc) do
parameter =
put_required_if_exists(parameter, %{"schema" => %ExJsonSchema.Schema.Root{schema: schema}})
Map.merge(acc, %{name => parameter})
end
defp map_parameter(_, acc), do: acc
defp to_schema_opt(nil, _, acc), do: acc
defp to_schema_opt(params, _, acc) when params == %{} do
acc
end
defp to_schema_opt(params, "query", acc) do
Map.put(acc, :query_schema, params)
end
defp to_schema_opt(params, "cookie", acc) do
Map.put(acc, :cookie_schema, params)
end
defp to_schema_opt(params, "header", acc) do
Map.put(acc, :header_schema, params)
end
defp to_schema_opt(params, "path", acc) do
Map.put(acc, :path_schema, params)
end
defp merge_security_to_operation({acc, operation}, opts) do
security =
case opts[:global_security] do
{nil, security_schemes} ->
Oasis.Spec.Security.build(operation, security_schemes)
{global_security, _} ->
global_security
end
{
Map.put(acc, :security, security),
operation
}
end
defp merge_others_to_operation({acc, operation}, opts) do
# Use the input `name_space` option from the command line if it exists
name_space_from_spec = Map.get(operation, @spec_ext_name_space)
name_space = opts[:name_space] || name_space_from_spec
acc
|> Map.put(:operation_id, Map.get(operation, "operationId"))
|> Map.put(:name_space, name_space)
end
defp build_files_to_generate(apps, router, opts) do
{files, router} =
apps
|> may_inject_request_validator(router)
|> may_inject_plug_parsers()
|> template_plugs_in_pair()
{security_files, router} = may_inject_plug_security(apps, router, opts)
files =
files
|> Kernel.++(security_files)
|> Enum.map(fn(file) ->
Tuple.append(file, router)
end)
{
files,
router
}
end
defp may_inject_plug_parsers(
%__MODULE__{http_verb: http_verb, body_schema: %{"content" => content}} = router
)
when http_verb in ["post", "put", "patch", "delete"] do
Map.put(router, :plug_parsers, inject_plug_parsers(content))
end
defp may_inject_plug_parsers(router), do: router
defp inject_plug_parsers(content) do
opts = [
parsers: MapSet.new(),
pass: ["*/*"],
body_reader: {Oasis.CacheRawBodyReader, :read_body, []}
]
opts =
Enum.reduce(content, opts, fn {content_type, _}, opts ->
map_plug_parsers(content_type, opts)
end)
parsers = MapSet.to_list(opts[:parsers])
if parsers == [] do
nil
else
opts = Keyword.put(opts, :parsers, parsers)
~s|plug(\nPlug.Parsers, #{inspect(opts)})|
end
end
defp map_plug_parsers("application/x-www-form-urlencoded" <> _, acc) do
parsers = MapSet.put(acc[:parsers], :urlencoded)
Keyword.put(acc, :parsers, parsers)
end
defp map_plug_parsers("multipart/form-data" <> _, acc) do
parsers = MapSet.put(acc[:parsers], :multipart)
Keyword.put(acc, :parsers, parsers)
end
defp map_plug_parsers("multipart/mixed" <> _, acc) do
parsers = MapSet.put(acc[:parsers], :multipart)
Keyword.put(acc, :parsers, parsers)
end
defp map_plug_parsers("application/json" <> _, acc) do
parsers = MapSet.put(acc[:parsers], :json)
acc
|> Keyword.put(:parsers, parsers)
|> Keyword.put(:json_decoder, Jason)
end
defp map_plug_parsers(_, acc), do: acc
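For illustration (hypothetical input, not taken from any real spec), a `requestBody` content map that mixes JSON and multipart media types accumulates both parsers; the exact parser order is unspecified because they pass through a `MapSet`:

```elixir
content = %{
  "application/json; charset=utf-8" => %{"schema" => %{}},
  "multipart/form-data" => %{"schema" => %{}}
}

# inject_plug_parsers/1 above would then emit roughly:
#
#   plug(
#   Plug.Parsers, [parsers: [:json, :multipart], pass: ["*/*"],
#     body_reader: {Oasis.CacheRawBodyReader, :read_body, []},
#     json_decoder: Jason])
```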
defp may_inject_request_validator(apps, %{body_schema: body_schema} = router)
when is_map(body_schema) do
inject_request_validator(apps, router)
end
defp may_inject_request_validator(apps, %{query_schema: query_schema} = router)
when is_map(query_schema) do
inject_request_validator(apps, router)
end
defp may_inject_request_validator(apps, %{header_schema: header_schema} = router)
when is_map(header_schema) do
inject_request_validator(apps, router)
end
defp may_inject_request_validator(apps, %{cookie_schema: cookie_schema} = router)
when is_map(cookie_schema) do
inject_request_validator(apps, router)
end
defp may_inject_request_validator(_apps, router), do: router
defp inject_request_validator(apps, router) do
content =
Mix.Oasis.eval_from(apps, "priv/templates/oas.gen.plug/plug/request_validator.exs",
router: router
)
Map.put(router, :request_validator, content)
end
defp may_inject_plug_security(_apps, %{security: nil} = router, _opts) do
{[], router}
end
defp may_inject_plug_security(apps, %{security: security} = router, opts) when is_list(security) do
{security, files} =
Enum.reduce(security, [], fn {security_name, security_scheme}, acc ->
security_scheme = map_security_scheme(apps, security_name, security_scheme, router, opts)
acc ++ [security_scheme]
end)
|> Enum.unzip()
router = Map.put(router, :security, security)
{files, router}
end
defp map_security_scheme(apps, security_name, %{"scheme" => "bearer"} = security_scheme, %{name_space: name_space}, opts) do
# Prefer the `x-oasis-name-space` field in the security scheme object over the operation-level one,
# but a `name_space` option given on the `oas.gen.plug` command line still takes the highest priority.
name_space_from_spec = Map.get(security_scheme, @spec_ext_name_space, name_space)
name_space = opts[:name_space] || name_space_from_spec
{name_space, dir} = Mix.Oasis.name_space(name_space)
{module_name, file_name} = Mix.Oasis.module_alias(security_name)
security_module = Module.concat([name_space, module_name])
key_to_assigns = Map.get(security_scheme, @spec_ext_key_to_assigns)
content =
Mix.Oasis.eval_from(apps, "priv/templates/oas.gen.plug/plug/bearer_auth.exs",
security: security_module,
key_to_assigns: key_to_assigns
)
{
content,
{:new_eex, Path.join(dir, file_name), "bearer_token.ex", security_module}
}
end
defp map_security_scheme(apps, security_name, %{"scheme" => "hmac-" <> algorithm} = security_scheme, %{name_space: name_space}, opts) do
# Prefer the `x-oasis-name-space` field in the security scheme object over the operation-level one,
# but a `name_space` option given on the `oas.gen.plug` command line still takes the highest priority.
name_space_from_spec = Map.get(security_scheme, @spec_ext_name_space, name_space)
name_space = opts[:name_space] || name_space_from_spec
{name_space, dir} = Mix.Oasis.name_space(name_space)
{module_name, file_name} = Mix.Oasis.module_alias(security_name)
security_module = Module.concat([name_space, module_name])
signed_headers = Map.get(security_scheme, @spec_ext_signed_headers)
content =
Mix.Oasis.eval_from(apps, "priv/templates/oas.gen.plug/plug/hmac_auth.exs",
algorithm: String.to_existing_atom(algorithm),
security: security_module,
signed_headers: signed_headers
)
{
content,
{:new_eex, Path.join(dir, file_name), "hmac_token.ex", security_module}
}
end
defp template_plugs_in_pair(%{name_space: name_space} = router) do
{name_space, dir} = Mix.Oasis.name_space(name_space)
{module_name, plug_file_name} = Mix.Oasis.module_alias(router)
{pre_plug_module_name, pre_plug_file_name} = Mix.Oasis.module_alias("Pre#{module_name}")
pre_plug_module = Module.concat([name_space, pre_plug_module_name])
plug_module = Module.concat([name_space, module_name])
router =
router
|> Map.put(:plug_module, plug_module)
|> Map.put(:pre_plug_module, pre_plug_module)
{
[
{:eex, Path.join([dir, pre_plug_file_name]), "pre_plug.ex", pre_plug_module},
{:new_eex, Path.join([dir, plug_file_name]), "plug.ex", plug_module}
],
router
}
end
defp global_security(spec) do
security_schemes = Oasis.Spec.Security.security_schemes(spec)
{
Oasis.Spec.Security.build(spec, security_schemes),
security_schemes
}
end
end
# test/ja_serializer/builder/resource_object_test.exs (mbta/ja_serializer, Apache-2.0)
defmodule JaSerializer.Builder.ResourceObjectTest do
use ExUnit.Case
defmodule ArticleSerializer do
use JaSerializer
def type, do: "articles"
attributes [:title, :body]
end
test "single resource object built correctly" do
a1 = %TestModel.Article{id: "a1", title: "a1", body: "a1"}
context = %{data: a1, conn: %{}, serializer: ArticleSerializer, opts: %{}}
primary_resource = JaSerializer.Builder.ResourceObject.build(context)
assert %{id: "a1", attributes: attributes} = primary_resource
assert [_, _] = attributes
# Formatted
json = JaSerializer.format(ArticleSerializer, a1)
assert %{"attributes" => attributes} = json["data"]
fields = Map.keys(attributes)
assert "title" in fields
assert "body" in fields
end
test "sparse fieldset returns only specified fields" do
a1 = %TestModel.Article{id: "a1", title: "a1", body: "a1"}
fields = %{"articles" => "title"}
context = %{
data: a1,
conn: %{},
serializer: ArticleSerializer,
opts: %{fields: fields}
}
primary_resource = JaSerializer.Builder.ResourceObject.build(context)
assert %{id: "a1", attributes: attributes} = primary_resource
assert [_] = attributes
# Formatted
json = JaSerializer.format(ArticleSerializer, a1, %{}, fields: fields)
assert %{"attributes" => attributes} = json["data"]
fields = Map.keys(attributes)
assert "title" in fields
refute "body" in fields
end
end
# test/nabo/parser/front_test.exs (douglascorrea/nabo, MIT)
defmodule Nabo.Parser.FrontTest do
use ExUnit.Case, async: true
import Nabo.Parser.Front, only: [parse: 2]
test "parse/2 with successful case" do
json = """
{
"title": "Title",
"slug": "slug",
"published_at": "2017-01-01T01:02:03Z",
"draft": false,
"categories": ["cat-1", "cat-2"]
}
"""
assert {:ok, metadata} = parse(json, [])
assert metadata.title == "Title"
assert metadata.slug == "slug"
assert metadata.published_at == DateTime.from_naive!(~N[2017-01-01 01:02:03], "Etc/UTC")
assert metadata.draft? == false
assert %{"categories" => ["cat-1", "cat-2"]} = metadata.extras
end
test "parse/2 with title missing" do
json = """
{
"slug": "slug",
"published_at": "2017-01-01T01:02:03Z",
"draft": false,
"categories": ["cat-1", "cat-2"]
}
"""
assert {:error, reason} = parse(json, [])
assert reason == "\"title\" has to be set"
end
test "parse/2 with slug missing" do
json = """
{
"title": "Title",
"published_at": "2017-01-01T01:02:03Z",
"draft": false,
"categories": ["cat-1", "cat-2"]
}
"""
assert {:error, reason} = parse(json, [])
assert reason == "\"slug\" has to be set"
end
test "parse/2 with published_at missing" do
json = """
{
"title": "Title",
"slug": "slug",
"draft": false,
"categories": ["cat-1", "cat-2"]
}
"""
assert {:error, reason} = parse(json, [])
assert reason == "\"published_at\" has to be set"
end
test "parse/2 with bad format published_at" do
json = """
{
"title": "Title",
"slug": "slug",
"published_at": "Fri, 21 Nov 1997 09:55:06 -0600",
"draft": false,
"categories": ["cat-1", "cat-2"]
}
"""
assert {:error, reason} = parse(json, [])
assert reason ==
"\"published_at\" has to be in ISO-8601 format, got: \"Fri, 21 Nov 1997 09:55:06 -0600\""
end
test "parse/2 with bad JSON" do
json = """
{
"title": "Title",
"slug": "slug",
"published_at": "Fri, 21 Nov 1997 09:55:06 -0600",
"draft": false,
}
"""
assert {:error, reason} = parse(json, [])
assert %Jason.DecodeError{} = reason
end
end
# lib/fuschia_web/plugs/locale.ex (cciuenf/fuschia, BSD-3-Clause)
defmodule FuschiaWeb.LocalePlug do
@moduledoc """
Plug to handle gettext locale from request header
"""
import Plug.Conn
@default_locale "en"
@spec init(Plug.opts()) :: Plug.opts()
def init(default), do: default
@spec call(Plug.Conn.t(), Plug.opts()) :: Plug.Conn.t()
def call(conn, _opts) do
req_locale =
conn
|> get_req_header("accept-language")
|> List.first()
locale = req_locale || @default_locale
Gettext.put_locale(FuschiaWeb.Gettext, locale)
conn
end
end
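A minimal usage sketch for the plug above. The router module, pipeline name, and scope are illustrative (not taken from the Fuschia codebase); `Gettext.put_locale/2` only affects the current process, so the plug must run in each request-handling process, which a router pipeline guarantees:

```elixir
# Hypothetical Phoenix-style router showing where the plug would sit.
defmodule FuschiaWeb.Router do
  use Phoenix.Router

  pipeline :api do
    plug :accepts, ["json"]
    # Reads the "accept-language" header on every request and stores
    # the locale for this request process via Gettext.put_locale/2.
    plug FuschiaWeb.LocalePlug
  end

  scope "/api" do
    pipe_through :api
  end
end
```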
| 19.52 | 51 | 0.657787 |
e8622c5b315897da319cb4c478072463404fc178 | 207 | ex | Elixir | lib/excoveralls_linter/coverage_ratio.ex | miros/excoveralls_linter | 661e9d4019d9f8842340c172d78341e8822d4d6c | [
"Apache-2.0"
] | 4 | 2019-11-25T15:32:45.000Z | 2020-01-29T23:27:45.000Z | lib/excoveralls_linter/coverage_ratio.ex | miros/excoveralls_linter | 661e9d4019d9f8842340c172d78341e8822d4d6c | [
"Apache-2.0"
] | null | null | null | lib/excoveralls_linter/coverage_ratio.ex | miros/excoveralls_linter | 661e9d4019d9f8842340c172d78341e8822d4d6c | [
"Apache-2.0"
] | null | null | null | defmodule ExCoverallsLinter.CoverageRatio do
@type t :: float
def round(coverage) when is_integer(coverage), do: coverage
def round(coverage) when is_float(coverage), do: Float.round(coverage, 3)
end
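The module above passes integers through untouched and trims floats to three decimal places, for example:

```elixir
alias ExCoverallsLinter.CoverageRatio

CoverageRatio.round(100)      # => 100 (integer clause: returned unchanged)
CoverageRatio.round(87.65432) # => 87.654 (float clause: Float.round/2 to 3 places)
```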
| 29.571429 | 75 | 0.768116 |
e8624fd869e0d19d95a027f6bd940ab5df252c9f | 406 | ex | Elixir | lib/top5_2_web/views/note_view.ex | rpillar/Top5_Elixir2 | 9f3a9a0315c5dc53cb53aab93deadccdb697a868 | [
"MIT"
] | 1 | 2019-11-11T21:48:20.000Z | 2019-11-11T21:48:20.000Z | lib/top5_2_web/views/note_view.ex | rpillar/Top5_Elixir2 | 9f3a9a0315c5dc53cb53aab93deadccdb697a868 | [
"MIT"
] | 2 | 2021-03-09T09:26:25.000Z | 2021-05-09T08:58:51.000Z | lib/top5_2_web/views/note_view.ex | rpillar/Top5_Elixir2 | 9f3a9a0315c5dc53cb53aab93deadccdb697a868 | [
"MIT"
] | null | null | null | defmodule Top52Web.NoteView do
use Top52Web, :view
alias Top52Web.NoteView
def render("index.json", %{notes: notes}) do
%{data: render_many(notes, NoteView, "note.json")}
end
def render("show.json", %{note: note}) do
%{data: render_one(note, NoteView, "note.json")}
end
def render("note.json", %{note: note}) do
%{id: note.id, note: note.note, task_id: note.task_id}
end
end
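For reference, rendering through the view produces the shapes below. The field names come from `render("note.json", ...)` above; the sample values are illustrative, and in the real application `note` would normally be an Ecto schema struct rather than a bare map:

```elixir
note = %{id: 1, note: "Check deadline", task_id: 42}

Phoenix.View.render(Top52Web.NoteView, "note.json", note: note)
# => %{id: 1, note: "Check deadline", task_id: 42}

Phoenix.View.render(Top52Web.NoteView, "show.json", note: note)
# => %{data: %{id: 1, note: "Check deadline", task_id: 42}}
```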
=== lib/vintage_net/route/interface_info.ex | danielspofford/vintage_net | Apache-2.0 ===

defmodule VintageNet.Route.InterfaceInfo do
alias VintageNet.Interface.Classification
@moduledoc false
defstruct default_gateway: nil,
weight: 0,
ip_subnets: [],
interface_type: :unknown,
status: :disconnected
@type t :: %__MODULE__{
default_gateway: :inet.ip_address() | nil,
weight: Classification.weight(),
ip_subnets: [{:inet.ip_address(), VintageNet.prefix_length()}],
interface_type: Classification.interface_type(),
status: Classification.connection_status()
}
# @spec metric(t(), [Classification.prioritization()]) :: :disabled | pos_integer()
@spec metric(t(), [
{:_ | :ethernet | :local | :mobile | :unknown | :wifi,
:_ | :disconnected | :internet | :lan}
]) :: :disabled | pos_integer()
def metric(info, prioritization) do
Classification.compute_metric(info.interface_type, info.status, info.weight, prioritization)
end
end
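A sketch of how a route calculator might use this struct. The prioritization list shape follows the `metric/2` spec above (`{interface_type, status}` pairs, with `:_` as a wildcard); the concrete metric value depends on `Classification.compute_metric/4`, so no result is assumed here:

```elixir
alias VintageNet.Route.InterfaceInfo

info = %InterfaceInfo{
  default_gateway: {192, 168, 1, 1},
  ip_subnets: [{{192, 168, 1, 100}, 24}],
  interface_type: :ethernet,
  status: :internet,
  weight: 0
}

# Prefer internet-connected interfaces of any type, then LAN-only ones.
prioritization = [{:_, :internet}, {:_, :lan}]

# Returns :disabled or a positive integer metric for the routing table.
InterfaceInfo.metric(info, prioritization)
```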
=== lib/aws/generated/app_config.ex | salemove/aws-elixir | Apache-2.0 ===

# WARNING: DO NOT EDIT, AUTO-GENERATED CODE!
# See https://github.com/aws-beam/aws-codegen for more details.
defmodule AWS.AppConfig do
@moduledoc """
AWS AppConfig
Use AWS AppConfig, a capability of AWS Systems Manager, to create, manage, and
quickly deploy application configurations.
AppConfig supports controlled deployments to applications of any size and
includes built-in validation checks and monitoring. You can use AppConfig with
applications hosted on Amazon EC2 instances, AWS Lambda, containers, mobile
applications, or IoT devices.
To prevent errors when deploying application configurations, especially for
production systems where a simple typo could cause an unexpected outage,
AppConfig includes validators. A validator provides a syntactic or semantic
check to ensure that the configuration you want to deploy works as intended. To
validate your application configuration data, you provide a schema or a Lambda
function that runs against the configuration. The configuration deployment or
update can only proceed when the configuration data is valid.
During a configuration deployment, AppConfig monitors the application to ensure
that the deployment is successful. If the system encounters an error, AppConfig
rolls back the change to minimize impact for your application users. You can
configure a deployment strategy for each application or environment that
includes deployment criteria, including velocity, bake time, and alarms to
monitor. Similar to error monitoring, if a deployment triggers an alarm,
AppConfig automatically rolls back to the previous version.
AppConfig supports multiple use cases. Here are some examples.
* **Application tuning**: Use AppConfig to carefully introduce
changes to your application that can only be tested with production traffic.
* **Feature toggle**: Use AppConfig to turn on new features that
require a timely deployment, such as a product launch or announcement.
* **Allow list**: Use AppConfig to allow premium subscribers to
access paid content.
* **Operational issues**: Use AppConfig to reduce stress on your
application when a dependency or other external factor impacts the system.
This reference is intended to be used with the [AWS AppConfig User Guide](http://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig.html).
"""
alias AWS.Client
alias AWS.Request
def metadata do
%AWS.ServiceMetadata{
abbreviation: nil,
api_version: "2019-10-09",
content_type: "application/x-amz-json-1.1",
credential_scope: nil,
endpoint_prefix: "appconfig",
global?: false,
protocol: "rest-json",
service_id: "AppConfig",
signature_version: "v4",
signing_name: "appconfig",
target_prefix: nil
}
end
@doc """
An application in AppConfig is a logical unit of code that provides capabilities
for your customers.
For example, an application can be a microservice that runs on Amazon EC2
instances, a mobile application installed by your users, a serverless
application using Amazon API Gateway and AWS Lambda, or any system you run on
behalf of others.
"""
def create_application(%Client{} = client, input, options \\ []) do
url_path = "/applications"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
Information that enables AppConfig to access the configuration source.
Valid configuration sources include Systems Manager (SSM) documents, SSM
Parameter Store parameters, and Amazon S3 objects. A configuration profile
includes the following information.
* The Uri location of the configuration data.
* The AWS Identity and Access Management (IAM) role that provides
access to the configuration data.
* A validator for the configuration data. Available validators
include either a JSON Schema or an AWS Lambda function.
For more information, see [Create a Configuration and a Configuration Profile](http://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig-creating-configuration-and-profile.html)
in the *AWS AppConfig User Guide*.
"""
def create_configuration_profile(%Client{} = client, application_id, input, options \\ []) do
url_path = "/applications/#{URI.encode(application_id)}/configurationprofiles"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
A deployment strategy defines important criteria for rolling out your
configuration to the designated targets.
A deployment strategy includes: the overall duration required, a percentage of
targets to receive the deployment during each interval, an algorithm that
defines how percentage grows, and bake time.
"""
def create_deployment_strategy(%Client{} = client, input, options \\ []) do
url_path = "/deploymentstrategies"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
For each application, you define one or more environments.
An environment is a logical deployment group of AppConfig targets, such as
applications in a `Beta` or `Production` environment. You can also define
environments for application subcomponents such as the `Web`, `Mobile` and
`Back-end` components for your application. You can configure Amazon CloudWatch
alarms for each environment. The system monitors alarms during a configuration
deployment. If an alarm is triggered, the system rolls back the configuration.
"""
def create_environment(%Client{} = client, application_id, input, options \\ []) do
url_path = "/applications/#{URI.encode(application_id)}/environments"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
Create a new configuration in the AppConfig configuration store.
"""
def create_hosted_configuration_version(
%Client{} = client,
application_id,
configuration_profile_id,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}/hostedconfigurationversions"
{headers, input} =
[
{"ContentType", "Content-Type"},
{"Description", "Description"},
{"LatestVersionNumber", "Latest-Version-Number"}
]
|> Request.build_params(input)
query_params = []
options =
Keyword.put(
options,
:response_header_parameters,
[
{"Application-Id", "ApplicationId"},
{"Configuration-Profile-Id", "ConfigurationProfileId"},
{"Content-Type", "ContentType"},
{"Description", "Description"},
{"Version-Number", "VersionNumber"}
]
)
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
Delete an application.
Deleting an application does not delete a configuration from a host.
"""
def delete_application(%Client{} = client, application_id, input, options \\ []) do
url_path = "/applications/#{URI.encode(application_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Delete a configuration profile.
Deleting a configuration profile does not delete a configuration from a host.
"""
def delete_configuration_profile(
%Client{} = client,
application_id,
configuration_profile_id,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Delete a deployment strategy.
Deleting a deployment strategy does not delete a configuration from a host.
"""
def delete_deployment_strategy(%Client{} = client, deployment_strategy_id, input, options \\ []) do
url_path = "/deployementstrategies/#{URI.encode(deployment_strategy_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Delete an environment.
Deleting an environment does not delete a configuration from a host.
"""
def delete_environment(%Client{} = client, application_id, environment_id, input, options \\ []) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Delete a version of a configuration from the AppConfig configuration store.
"""
def delete_hosted_configuration_version(
%Client{} = client,
application_id,
configuration_profile_id,
version_number,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}/hostedconfigurationversions/#{URI.encode(version_number)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Retrieve information about an application.
"""
def get_application(%Client{} = client, application_id, options \\ []) do
url_path = "/applications/#{URI.encode(application_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Receive information about a configuration.
AWS AppConfig uses the value of the `ClientConfigurationVersion` parameter to
identify the configuration version on your clients. If you don’t send
`ClientConfigurationVersion` with each call to `GetConfiguration`, your clients
receive the current configuration. You are charged each time your clients
receive a configuration.
To avoid excess charges, we recommend that you include the
`ClientConfigurationVersion` value with every call to `GetConfiguration`. This
value must be saved on your client. Subsequent calls to `GetConfiguration` must
pass this value by using the `ClientConfigurationVersion` parameter.
"""
def get_configuration(
%Client{} = client,
application,
configuration,
environment,
client_configuration_version \\ nil,
client_id,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application)}/environments/#{URI.encode(environment)}/configurations/#{
URI.encode(configuration)
}"
headers = []
query_params = []
query_params =
if !is_nil(client_id) do
[{"client_id", client_id} | query_params]
else
query_params
end
query_params =
if !is_nil(client_configuration_version) do
[{"client_configuration_version", client_configuration_version} | query_params]
else
query_params
end
options =
Keyword.put(
options,
:response_header_parameters,
[
{"Configuration-Version", "ConfigurationVersion"},
{"Content-Type", "ContentType"}
]
)
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Retrieve information about a configuration profile.
"""
def get_configuration_profile(
%Client{} = client,
application_id,
configuration_profile_id,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Retrieve information about a configuration deployment.
"""
def get_deployment(
%Client{} = client,
application_id,
deployment_number,
environment_id,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}/deployments/#{
URI.encode(deployment_number)
}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Retrieve information about a deployment strategy.
A deployment strategy defines important criteria for rolling out your
configuration to the designated targets. A deployment strategy includes: the
overall duration required, a percentage of targets to receive the deployment
during each interval, an algorithm that defines how percentage grows, and bake
time.
"""
def get_deployment_strategy(%Client{} = client, deployment_strategy_id, options \\ []) do
url_path = "/deploymentstrategies/#{URI.encode(deployment_strategy_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Retrieve information about an environment.
An environment is a logical deployment group of AppConfig applications, such as
applications in a `Production` environment or in an `EU_Region` environment.
Each configuration deployment targets an environment. You can enable one or more
Amazon CloudWatch alarms for an environment. If an alarm is triggered during a
deployment, AppConfig rolls back the configuration.
"""
def get_environment(%Client{} = client, application_id, environment_id, options \\ []) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Get information about a specific configuration version.
"""
def get_hosted_configuration_version(
%Client{} = client,
application_id,
configuration_profile_id,
version_number,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}/hostedconfigurationversions/#{URI.encode(version_number)}"
headers = []
query_params = []
options =
Keyword.put(
options,
:response_header_parameters,
[
{"Application-Id", "ApplicationId"},
{"Configuration-Profile-Id", "ConfigurationProfileId"},
{"Content-Type", "ContentType"},
{"Description", "Description"},
{"Version-Number", "VersionNumber"}
]
)
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
List all applications in your AWS account.
"""
def list_applications(%Client{} = client, max_results \\ nil, next_token \\ nil, options \\ []) do
url_path = "/applications"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Lists the configuration profiles for an application.
"""
def list_configuration_profiles(
%Client{} = client,
application_id,
max_results \\ nil,
next_token \\ nil,
options \\ []
) do
url_path = "/applications/#{URI.encode(application_id)}/configurationprofiles"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
List deployment strategies.
"""
def list_deployment_strategies(
%Client{} = client,
max_results \\ nil,
next_token \\ nil,
options \\ []
) do
url_path = "/deploymentstrategies"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Lists the deployments for an environment.
"""
def list_deployments(
%Client{} = client,
application_id,
environment_id,
max_results \\ nil,
next_token \\ nil,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}/deployments"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
List the environments for an application.
"""
def list_environments(
%Client{} = client,
application_id,
max_results \\ nil,
next_token \\ nil,
options \\ []
) do
url_path = "/applications/#{URI.encode(application_id)}/environments"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
View a list of configurations stored in the AppConfig configuration store by
version.
"""
def list_hosted_configuration_versions(
%Client{} = client,
application_id,
configuration_profile_id,
max_results \\ nil,
next_token \\ nil,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}/hostedconfigurationversions"
headers = []
query_params = []
query_params =
if !is_nil(next_token) do
[{"next_token", next_token} | query_params]
else
query_params
end
query_params =
if !is_nil(max_results) do
[{"max_results", max_results} | query_params]
else
query_params
end
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Retrieves the list of key-value tags assigned to the resource.
"""
def list_tags_for_resource(%Client{} = client, resource_arn, options \\ []) do
url_path = "/tags/#{URI.encode(resource_arn)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:get,
url_path,
query_params,
headers,
nil,
options,
200
)
end
@doc """
Starts a deployment.
"""
def start_deployment(%Client{} = client, application_id, environment_id, input, options \\ []) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}/deployments"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
201
)
end
@doc """
Stops a deployment.
This API action works only on deployments that have a status of `DEPLOYING`.
This action moves the deployment to a status of `ROLLED_BACK`.
"""
def stop_deployment(
%Client{} = client,
application_id,
deployment_number,
environment_id,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}/deployments/#{
URI.encode(deployment_number)
}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
202
)
end
@doc """
Metadata to assign to an AppConfig resource.
Tags help organize and categorize your AppConfig resources. Each tag consists of
a key and an optional value, both of which you define. You can specify a maximum
of 50 tags for a resource.
"""
def tag_resource(%Client{} = client, resource_arn, input, options \\ []) do
url_path = "/tags/#{URI.encode(resource_arn)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Deletes a tag key and value from an AppConfig resource.
"""
def untag_resource(%Client{} = client, resource_arn, input, options \\ []) do
url_path = "/tags/#{URI.encode(resource_arn)}"
headers = []
{query_params, input} =
[
{"TagKeys", "tagKeys"}
]
|> Request.build_params(input)
Request.request_rest(
client,
metadata(),
:delete,
url_path,
query_params,
headers,
input,
options,
204
)
end
@doc """
Updates an application.
"""
def update_application(%Client{} = client, application_id, input, options \\ []) do
url_path = "/applications/#{URI.encode(application_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:patch,
url_path,
query_params,
headers,
input,
options,
200
)
end
@doc """
Updates a configuration profile.
"""
def update_configuration_profile(
%Client{} = client,
application_id,
configuration_profile_id,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:patch,
url_path,
query_params,
headers,
input,
options,
200
)
end
@doc """
Updates a deployment strategy.
"""
def update_deployment_strategy(%Client{} = client, deployment_strategy_id, input, options \\ []) do
url_path = "/deploymentstrategies/#{URI.encode(deployment_strategy_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:patch,
url_path,
query_params,
headers,
input,
options,
200
)
end
@doc """
Updates an environment.
"""
def update_environment(%Client{} = client, application_id, environment_id, input, options \\ []) do
url_path =
"/applications/#{URI.encode(application_id)}/environments/#{URI.encode(environment_id)}"
headers = []
query_params = []
Request.request_rest(
client,
metadata(),
:patch,
url_path,
query_params,
headers,
input,
options,
200
)
end
@doc """
Uses the validators in a configuration profile to validate a configuration.
"""
def validate_configuration(
%Client{} = client,
application_id,
configuration_profile_id,
input,
options \\ []
) do
url_path =
"/applications/#{URI.encode(application_id)}/configurationprofiles/#{
URI.encode(configuration_profile_id)
}/validators"
headers = []
{query_params, input} =
[
{"ConfigurationVersion", "configuration_version"}
]
|> Request.build_params(input)
Request.request_rest(
client,
metadata(),
:post,
url_path,
query_params,
headers,
input,
options,
204
)
end
end
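As the `get_configuration/7` docs above note, clients should echo back `ClientConfigurationVersion` on later calls to avoid receiving (and being charged for) unchanged configuration data. A hedged usage sketch with the aws-elixir client; the credentials and resource names are placeholders, and the exact success-tuple shape may vary by aws-elixir version:

```elixir
client = AWS.Client.create("your-access-key-id", "your-secret-access-key", "us-east-1")

# First call: no version yet, so the full configuration body is returned.
{:ok, config, response} =
  AWS.AppConfig.get_configuration(
    client,
    "my-app",        # application
    "my-profile",    # configuration
    "production",    # environment
    nil,             # client_configuration_version (none on the first call)
    "client-1234"    # client_id
  )

# Save the Configuration-Version response header and pass it as
# client_configuration_version on subsequent calls; unchanged data
# then comes back empty instead of being re-transferred.
```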
=== lib/geolix/adapter.ex | tcitworld/geolix | Apache-2.0 ===

defmodule Geolix.Adapter do
@moduledoc """
Adapter behaviour module.
"""
@optional_callbacks [
database_workers: 1,
load_database: 1,
metadata: 1,
unload_database: 1
]
@doc """
Returns the adapter processes to be supervised.
If no automatic supervision should take place or it is intended to use an
adapter specific supervisor (e.g. using the application config) this callback
should be either unimplemented or return an empty list.
"""
@callback database_workers(database :: Geolix.database()) :: [
:supervisor.child_spec() | {module, term} | module
]
@doc """
Loads a given database into Geolix.
Requires at least the fields `:id` and `:adapter`. Any other required
fields depend on the adapter's requirements.
"""
@callback load_database(database :: Geolix.database()) :: :ok | :delayed | {:error, term}
@doc """
Looks up IP information.
"""
@callback lookup(ip :: :inet.ip_address(), opts :: Keyword.t(), database :: Geolix.database()) ::
map | nil
@doc """
Returns metadata information for a database if available.
"""
@callback metadata(database :: Geolix.database()) :: map | nil
@doc """
Unloads a given database from Geolix.
"""
@callback unload_database(database :: Geolix.database()) :: :ok
end
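A minimal adapter sketch implementing the behaviour above. Only `lookup/3` is mandatory (the other callbacks are listed under `@optional_callbacks`); the module name and the `:data` key in the database map are hypothetical:

```elixir
defmodule MyApp.FakeAdapter do
  @moduledoc "Illustrative adapter that resolves IPs from a static in-memory map."
  @behaviour Geolix.Adapter

  @impl Geolix.Adapter
  def lookup(ip, _opts, %{data: data} = _database) do
    # Fall back to a stub result when the IP is not in the static map.
    Map.get(data, ip, %{ip: ip, source: :fake})
  end
end

# Registered like any other Geolix database:
# Geolix.load_database(%{id: :fake, adapter: MyApp.FakeAdapter, data: %{}})
```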
=== test/parsers/rss2_test.exs | pollingj/itunes-parser | MIT ===

defmodule ITunesParser.Test.Parsers.RSS2 do
use ExUnit.Case
alias ITunesParser.XmlNode
alias ITunesParser.Parsers.RSS2
setup do
sample1 = XmlNode.from_file("test/fixtures/rss2/sample1.xml")
big_sample = XmlNode.from_file("test/fixtures/rss2/bigsample.xml")
{:ok, [sample1: sample1, big_sample: big_sample]}
end
test "valid?", %{sample1: sample1} do
wrong_doc = XmlNode.from_file("test/fixtures/wrong.xml")
assert RSS2.valid?(sample1) == true
assert RSS2.valid?(wrong_doc) == false
end
test "parse_podcast", %{sample1: sample1, big_sample: big_sample} do
meta = RSS2.parse_podcast(sample1)
assert meta == %ITunesParser.Podcast{
description: "Podcasters talking about issues that affect podcast producers. A podcast about podcasting.",
link: "http://podcastersroundtable.com",
title: "Podcasters' Roundtable - Podcaster Discussing Podcasting",
image: %ITunesParser.Image{
href: "http://podcastersroundtable.com/wp-content/uploads/2012/07/Podcasters_Roundtable_ITUNES_image1400x1400.jpg"
},
copyright: "Ray Ortega 2015",
language: "en-US"
}
meta = RSS2.parse_podcast(big_sample)
assert meta == %ITunesParser.Podcast{
description: "Podcasters talking about issues that affect podcast producers. A podcast about podcasting.",
link: "http://podcastersroundtable.com",
title: "Podcasters' Roundtable - Podcaster Discussing Podcasting",
copyright: "Ray Ortega 2015",
language: "en-US"
}
end
#test "parse_entry", %{big_sample: big_sample} do
# entry = XmlNode.first(big_sample, "/rss/channel/item")
# |> RSS2.parse_entry
# assert entry == %ITunesParser.Episode{
# author: nil,
# categories: [ "elixir" ],
# description: "<p>I previously <a href=\"http://blog.drewolson.org/the-value-of-explicitness/\">wrote</a> about explicitness in Elixir. One of my favorite ways the language embraces explicitness is in its distinction between eager and lazy operations on collections. Any time you use the <code>Enum</code> module, you're performing an eager operation. Your collection will be transformed/mapped/enumerated immediately. When you use</p>",
# enclosure: nil,
# guid: "9b68a5a7-4ab0-420e-8105-0462357fa1f1",
# link: "http://blog.drewolson.org/elixir-streams/",
# enclosure: %ITunesParser.Enclosure{
# url: "http://www.tutorialspoint.com/mp3s/tutorial.mp3",
# length: "12216320",
# type: "audio/mpeg"
# },
# publication_date: %Timex.DateTime{
# calendar: :gregorian,
# day: 8,
# hour: 13,
# minute: 43,
# month: 6,
# millisecond: 0,
# second: 5,
# timezone: %Timex.TimezoneInfo{
# abbreviation: "UTC",
# from: :min,
# full_name: "UTC",
# offset_std: 0,
# offset_utc: 0,
# until: :max
# },
# year: 2015
# },
# title: "Elixir Streams"
# }
#end
#test "parse_entries", %{sample1: sample1} do
# [item1, item2] = RSS2.parse_entries(sample1)
#
# assert item1.title == "RSS Tutorial"
# assert item1.link == "http://www.w3schools.com/webservices"
# assert item1.description == "New RSS tutorial on W3Schools"
# assert item2.title == "XML Tutorial"
# assert item2.link == "http://www.w3schools.com/xml"
# assert item2.description == "New XML tutorial on W3Schools"
#end
#test "parse", %{sample1: sample1} do
# feed = RSS2.parse(sample1)
# assert feed == %ITunesParser.Feed{
# entries: [
# %ITunesParser.Episode{
# description: "New RSS tutorial on W3Schools",
# link: "http://www.w3schools.com/webservices",
# title: "RSS Tutorial"},
# %ITunesParser.Episode{
# description: "New XML tutorial on W3Schools",
# link: "http://www.w3schools.com/xml",
# title: "XML Tutorial"}],
# meta: %ITunesParser.Podcast{
# description: "Free web building tutorials",
# link: "http://www.w3schools.com",
# title: "W3Schools Home Page",
# image: %ITunesParser.Image{
# title: "Test Image",
# description: "test image...",
# url: "http://localhost/image"
# },
# last_build_date: %Timex.DateTime{
# calendar: :gregorian, day: 16,
# hour: 9, minute: 54, month: 8, millisecond: 0, second: 5,
# timezone: %Timex.TimezoneInfo{
# abbreviation: "UTC", from: :min,
# full_name: "UTC",
# offset_std: 0,
# offset_utc: 0,
# until: :max},
# year: 2015},
# }
# }
#end
end
| 36.092308 | 443 | 0.622336 |
e8629f8063c66812cad0922054cdcf1de04f2c64 | 267 | exs | Elixir | priv/repo/migrations/20170913114958_remove_github_event_source_field.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 275 | 2015-06-23T00:20:51.000Z | 2021-08-19T16:17:37.000Z | priv/repo/migrations/20170913114958_remove_github_event_source_field.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 1,304 | 2015-06-26T02:11:54.000Z | 2019-12-12T21:08:00.000Z | priv/repo/migrations/20170913114958_remove_github_event_source_field.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 140 | 2016-01-01T18:19:47.000Z | 2020-11-22T06:24:47.000Z | defmodule CodeCorps.Repo.Migrations.RemoveGithubEventSourceField do
use Ecto.Migration
def up do
alter table(:github_events) do
remove :source
end
end
def down do
alter table(:github_events) do
add :source, :string
end
end
end
| 16.6875 | 67 | 0.696629 |
e862c0a0b12f4ad42b78be3976929e10906f90cc | 7,189 | exs | Elixir | test/redix/protocol_test.exs | SoCal-Software-Labs/safe-redix | 6ac3d42c104ee3a2bcd5d726aaca1474e95cc29f | [
"MIT"
] | 968 | 2015-08-17T14:14:57.000Z | 2022-03-29T03:39:17.000Z | test/redix/protocol_test.exs | SoCal-Software-Labs/safe-redix | 6ac3d42c104ee3a2bcd5d726aaca1474e95cc29f | [
"MIT"
] | 192 | 2015-08-17T20:39:57.000Z | 2022-03-23T08:48:36.000Z | test/redix/protocol_test.exs | SoCal-Software-Labs/safe-redix | 6ac3d42c104ee3a2bcd5d726aaca1474e95cc29f | [
"MIT"
] | 145 | 2015-08-17T20:38:22.000Z | 2022-03-04T22:59:47.000Z | defmodule Redix.ProtocolTest do
use ExUnit.Case, async: true
use ExUnitProperties
alias Redix.{Error, Protocol.ParseError}
doctest Redix.Protocol
describe "pack/1" do
import Redix.Protocol, only: [pack: 1]
test "empty array" do
assert IO.iodata_to_binary(pack([])) == "*0\r\n"
end
test "regular strings" do
assert IO.iodata_to_binary(pack(["foo", "bar"])) == "*2\r\n$3\r\nfoo\r\n$3\r\nbar\r\n"
assert IO.iodata_to_binary(pack(["with spaces "])) == "*1\r\n$12\r\nwith spaces \r\n"
end
test "unicode" do
str = "føø"
size = byte_size(str)
assert IO.iodata_to_binary(pack([str])) == "*1\r\n$#{size}\r\n#{str}\r\n"
end
test "raw bytes (non printable)" do
assert IO.iodata_to_binary(pack([<<1, 2>>])) == <<"*1\r\n$2\r\n", 1, 2, "\r\n">>
end
end
describe "parse/1" do
import Redix.Protocol, only: [parse: 1]
property "simple strings" do
check all string <- string(:alphanumeric),
split_command <- random_splits("+#{string}\r\n"),
split_command_with_rest = append_to_last(split_command, "rest") do
assert parse_with_continuations(split_command) == {:ok, string, ""}
assert parse_with_continuations(split_command_with_rest) == {:ok, string, "rest"}
end
end
property "errors" do
check all error_message <- string(:alphanumeric),
split_command <- random_splits("-#{error_message}\r\n"),
split_command_with_rest = append_to_last(split_command, "rest") do
error = %Error{message: error_message}
assert parse_with_continuations(split_command) == {:ok, error, ""}
error = %Error{message: error_message}
assert parse_with_continuations(split_command_with_rest) == {:ok, error, "rest"}
end
end
property "integers" do
check all int <- integer(),
split_command <- random_splits(":#{int}\r\n"),
split_command_with_rest = append_to_last(split_command, "rest") do
assert parse_with_continuations(split_command) == {:ok, int, ""}
assert parse_with_continuations(split_command_with_rest) == {:ok, int, "rest"}
end
end
property "bulk strings" do
check all bin <- binary(),
bin_size = byte_size(bin),
command = "$#{bin_size}\r\n#{bin}\r\n",
split_command <- random_splits(command),
split_command_with_rest = append_to_last(split_command, "rest") do
assert parse_with_continuations(split_command) == {:ok, bin, ""}
assert parse_with_continuations(split_command_with_rest) == {:ok, bin, "rest"}
end
# nil
check all split_command <- random_splits("$-1\r\n"),
split_command_with_rest = append_to_last(split_command, "rest"),
max_runs: 10 do
assert parse_with_continuations(split_command) == {:ok, nil, ""}
assert parse_with_continuations(split_command_with_rest) == {:ok, nil, "rest"}
end
end
property "arrays" do
assert parse("*0\r\n") == {:ok, [], ""}
assert parse("*2\r\n$3\r\nfoo\r\n$3\r\nbar\r\n") == {:ok, ["foo", "bar"], ""}
# Mixed types
arr = "*5\r\n:1\r\n:2\r\n:3\r\n:4\r\n$6\r\nfoobar\r\n"
assert parse(arr) == {:ok, [1, 2, 3, 4, "foobar"], ""}
# Says it has only 1 value, has 2
assert parse("*1\r\n:1\r\n:2\r\n") == {:ok, [1], ":2\r\n"}
command = "*-1\r\n"
check all split_command <- random_splits(command),
split_command_with_rest = append_to_last(split_command, "rest") do
assert parse_with_continuations(split_command) == {:ok, nil, ""}
assert parse_with_continuations(split_command_with_rest) == {:ok, nil, "rest"}
end
# Null values (of different types) in the array
arr = "*4\r\n$3\r\nfoo\r\n$-1\r\n$3\r\nbar\r\n*-1\r\n"
assert parse(arr) == {:ok, ["foo", nil, "bar", nil], ""}
# Nested
arr = "*2\r\n*3\r\n:1\r\n:2\r\n:3\r\n*2\r\n+Foo\r\n-ERR Bar\r\n"
assert parse(arr) == {:ok, [[1, 2, 3], ["Foo", %Error{message: "ERR Bar"}]], ""}
payload = ["*", "1\r", "\n", "+OK", "\r\nrest"]
assert parse_with_continuations(payload) == {:ok, ["OK"], "rest"}
payload = ["*2", "\r\n*1", "\r\n", "+", "OK\r\n", ":1", "\r\n"]
assert parse_with_continuations(payload) == {:ok, [["OK"], 1], ""}
payload = ["*2\r\n+OK\r\n", "+OK\r\nrest"]
assert parse_with_continuations(payload) == {:ok, ~w(OK OK), "rest"}
end
test "raising when the binary value has no type specifier" do
msg = ~s[invalid type specifier ("f")]
assert_raise ParseError, msg, fn -> parse("foo\r\n") end
assert_raise ParseError, msg, fn -> parse("foo bar baz") end
end
    test "when the binary is an invalid integer" do
assert_raise ParseError, ~S(expected integer, found: "\r"), fn -> parse(":\r\n") end
assert_raise ParseError, ~S(expected integer, found: "\r"), fn -> parse(":-\r\n") end
assert_raise ParseError, ~S(expected CRLF, found: "a"), fn -> parse(":43a\r\n") end
assert_raise ParseError, ~S(expected integer, found: "f"), fn -> parse(":foo\r\n") end
end
end
describe "parse_multi/2" do
import Redix.Protocol, only: [parse_multi: 2]
test "enough elements" do
data = "+OK\r\n+OK\r\n+OK\r\n"
assert parse_multi(data, 3) == {:ok, ~w(OK OK OK), ""}
assert parse_multi(data, 2) == {:ok, ~w(OK OK), "+OK\r\n"}
end
test "not enough data" do
data = ["+OK\r\n+OK\r\n", "+", "OK", "\r\nrest"]
assert parse_with_continuations(data, &parse_multi(&1, 3)) == {:ok, ~w(OK OK OK), "rest"}
end
end
defp random_splits(splittable_part) do
bytes = for <<byte <- splittable_part>>, do: byte
bytes
|> Enum.map(fn _byte -> frequency([{2, false}, {1, true}]) end)
|> List.replace_at(length(bytes) - 1, constant(false))
|> fixed_list()
|> map(fn split_points -> split_command(bytes, split_points, [""]) end)
end
defp split_command([byte | bytes], [true | split_points], [current | acc]) do
split_command(bytes, split_points, ["", <<current::binary, byte>> | acc])
end
defp split_command([byte | bytes], [false | split_points], [current | acc]) do
split_command(bytes, split_points, [<<current::binary, byte>> | acc])
end
defp split_command([], [], acc) do
Enum.reverse(acc)
end
defp append_to_last(parts, unsplittable_part) do
{first_parts, [last_part]} = Enum.split(parts, -1)
first_parts ++ [last_part <> unsplittable_part]
end
defp parse_with_continuations(data) do
parse_with_continuations(data, &Redix.Protocol.parse/1)
end
defp parse_with_continuations([data], parser_fun) do
parser_fun.(data)
end
defp parse_with_continuations([first | rest], parser_fun) do
import ExUnit.Assertions
{rest, [last]} = Enum.split(rest, -1)
assert {:continuation, cont} = parser_fun.(first)
last_cont =
Enum.reduce(rest, cont, fn data, cont_acc ->
assert {:continuation, cont_acc} = cont_acc.(data)
cont_acc
end)
last_cont.(last)
end
end
| 35.413793 | 95 | 0.601892 |
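The `pack/1` tests above exercise RESP framing: a command is an array header `*<n>\r\n` followed by one bulk string `$<len>\r\n<bytes>\r\n` per element. As an illustrative standalone sketch (not part of the Redix source), the same framing can be produced with a small anonymous function:

```elixir
# Minimal RESP command encoder, mirroring the format asserted in the
# pack/1 tests ("*2\r\n$3\r\nfoo\r\n$3\r\nbar\r\n" for ["foo", "bar"]).
encode = fn parts ->
  body = Enum.map(parts, fn p -> "$#{byte_size(p)}\r\n#{p}\r\n" end)
  IO.iodata_to_binary(["*#{length(parts)}\r\n" | body])
end

# byte_size/1 (not String.length/1) keeps multi-byte UTF-8 and raw
# binaries correct, just as the "unicode" and "raw bytes" tests require.
encoded = encode.(["GET", "key"])
```

Using `byte_size/1` is the same design choice the tests pin down: RESP lengths are byte counts, so `"føø"` encodes with its byte size, not its grapheme count.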
e86309436b2fab4c573c9052d4f189fb31bd2bd1 | 1,119 | ex | Elixir | lib/ex_jira/utils/query_params.ex | dullandwitless/elixir-ex_jira | d9a2f45fe47c70e1b5f1e1dd2ba2c3e5175c4bc2 | [
"MIT"
] | 2 | 2019-09-18T00:43:54.000Z | 2019-10-24T21:35:02.000Z | lib/ex_jira/utils/query_params.ex | dullandwitless/elixir-ex_jira | d9a2f45fe47c70e1b5f1e1dd2ba2c3e5175c4bc2 | [
"MIT"
] | 5 | 2018-05-21T15:47:45.000Z | 2022-02-11T16:21:00.000Z | lib/ex_jira/utils/query_params.ex | dullandwitless/elixir-ex_jira | d9a2f45fe47c70e1b5f1e1dd2ba2c3e5175c4bc2 | [
"MIT"
] | 6 | 2018-01-04T01:40:04.000Z | 2022-02-11T16:02:27.000Z | defmodule ExJira.QueryParams do
@moduledoc """
Helper module to convert parameters passed as a keyword list into a querystring.
"""
@doc """
Converts a keyword list and a list of atoms into a querystring containing only
entries that exist in both the keyword list and the list of atoms.
## Examples
iex> ExJira.QueryParams.convert([asdf: 123, hjkl: 456], [:asdf, :hjkl])
"asdf=123&hjkl=456"
iex> ExJira.QueryParams.convert([asdf: 123, hjkl: 456], [:asdf])
"asdf=123"
iex> ExJira.QueryParams.convert([asdf: 123, hjkl: 456], [:hjkl])
"hjkl=456"
iex> ExJira.QueryParams.convert([asdf: 123, hjkl: 456], [:qwerty])
""
iex> ExJira.QueryParams.convert([asdf: 123, hjkl: 456], [])
""
"""
@spec convert([{atom, String.t()}], [atom]) :: String.t()
def convert(_params, []), do: ""
def convert(params, [h | t]) do
amp =
case t do
[] -> ""
_ -> "&"
end
if Keyword.has_key?(params, h) do
"#{to_string(h)}=#{params[h]}#{amp}#{convert(params, t)}"
else
convert(params, t)
end
end
end
| 24.866667 | 82 | 0.589812 |
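The doctests above show `convert/2` producing a querystring from whitelisted keys. As an illustrative sketch (not from the ExJira source; the URL and parameter names are made up), the same filter-and-join idea can be expressed non-recursively and appended to a request URL:

```elixir
# Keep only whitelisted keys, then join as "k=v&k=v" — the same result
# ExJira.QueryParams.convert/2 builds recursively.
params = [startAt: 0, maxResults: 50, other: :ignored]
allowed = [:startAt, :maxResults]

query =
  params
  |> Enum.filter(fn {k, _v} -> k in allowed end)
  |> Enum.map_join("&", fn {k, v} -> "#{k}=#{v}" end)

# Hypothetical endpoint, for illustration only.
url = "https://example.atlassian.net/rest/api/2/project?" <> query
```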
e86338a362e94f57b398d9988570a1a0ffb6f083 | 4,156 | ex | Elixir | lib/absinthe/plug/graphiql/assets.ex | autopilothq/absinthe_plug | c7a87f7e1f0ecc88be9563dc7f9d55ea96e5a781 | [
"MIT"
] | null | null | null | lib/absinthe/plug/graphiql/assets.ex | autopilothq/absinthe_plug | c7a87f7e1f0ecc88be9563dc7f9d55ea96e5a781 | [
"MIT"
] | null | null | null | lib/absinthe/plug/graphiql/assets.ex | autopilothq/absinthe_plug | c7a87f7e1f0ecc88be9563dc7f9d55ea96e5a781 | [
"MIT"
] | 1 | 2019-04-23T04:22:52.000Z | 2019-04-23T04:22:52.000Z | defmodule Absinthe.Plug.GraphiQL.Assets do
@moduledoc """
"""
@config Application.get_env(:absinthe_plug, Absinthe.Plug.GraphiQL)
@default_config [
source: :smart,
local_url_path: "/absinthe_graphiql",
local_directory: "priv/static/absinthe_graphiql",
local_source: ":package/:alias",
remote_source: "https://cdn.jsdelivr.net/npm/:package@:version/:file",
]
@react_version "15.6.1"
@assets [
{"whatwg-fetch", "2.0.3", [
{"fetch.min.js", "fetch.js"},
]},
{"react", @react_version, [
{"dist/react.min.js", "react.js"},
]},
{"react-dom", @react_version, [
{"dist/react-dom.min.js", "react-dom.js"},
]},
{"bootstrap", "3.3.7", [
{"dist/fonts/glyphicons-halflings-regular.eot", "fonts/glyphicons-halflings-regular.eot"},
{"dist/fonts/glyphicons-halflings-regular.ttf", "fonts/glyphicons-halflings-regular.ttf"},
{"dist/fonts/glyphicons-halflings-regular.woff2", "fonts/glyphicons-halflings-regular.woff2"},
{"dist/fonts/glyphicons-halflings-regular.svg", "fonts/glyphicons-halflings-regular.svg"},
{"dist/css/bootstrap.min.css", "css/bootstrap.css"},
]},
{"graphiql", "0.11.10", [
"graphiql.css",
{"graphiql.min.js", "graphiql.js"},
]},
{"@absinthe/graphiql-workspace", "1.1.5", [
"graphiql-workspace.css",
{"graphiql-workspace.min.js", "graphiql-workspace.js"}
]},
# Used by graphql-playground
{"typeface-source-code-pro", "0.0.44", [
{"index.css", "index.css"},
{"files/source-code-pro-latin-400.woff2", "files/source-code-pro-latin-400.woff2"},
{"files/source-code-pro-latin-700.woff2", "files/source-code-pro-latin-700.woff2"},
]},
# Used by graphql-playground
{"typeface-open-sans", "0.0.44", [
{"index.css", "index.css"},
{"files/open-sans-latin-300.woff2", "files/open-sans-latin-300.woff2"},
{"files/open-sans-latin-400.woff2", "files/open-sans-latin-400.woff2"},
{"files/open-sans-latin-600.woff2", "files/open-sans-latin-600.woff2"},
{"files/open-sans-latin-700.woff2", "files/open-sans-latin-700.woff2"},
]},
{"@absinthe/graphql-playground", "1.1.3", [
{"build/static/css/middleware.css", "playground.css"},
{"build/static/js/middleware.js", "playground.js"}
]},
{"@absinthe/socket-graphiql", "0.1.1", [
{"compat/umd/index.js", "socket-graphiql.js"},
]},
]
def assets_config do
case @config do
nil ->
@default_config
config ->
Keyword.merge(@default_config, Keyword.get(config, :assets, []))
end
end
def get_assets do
reduce_assets(
%{},
&Map.put(
&2,
build_asset_path(:local_source, &1),
asset_source_url(assets_config()[:source], &1)
)
)
end
def get_remote_asset_mappings do
reduce_assets(
[],
&(&2 ++ [{
local_asset_path(&1),
asset_source_url(:remote, &1)
}])
)
end
defp reduce_assets(initial, reducer) do
Enum.reduce(@assets, initial, fn {package, version, files}, acc ->
Enum.reduce(files, acc, &reducer.({package, version, &1}, &2))
end)
end
defp local_asset_path(asset) do
Path.join(assets_config()[:local_directory], build_asset_path(:local_source, asset))
end
defp asset_source_url(:smart, asset) do
if File.exists?(local_asset_path(asset)) do
asset_source_url(:local, asset)
else
asset_source_url(:remote, asset)
end
end
defp asset_source_url(:local, asset) do
Path.join(assets_config()[:local_url_path], build_asset_path(:local_source, asset))
end
defp asset_source_url(:remote, asset) do
build_asset_path(:remote_source, asset)
end
defp build_asset_path(source, {package, version, {real_path, aliased_path}}) do
assets_config()[source]
|> String.replace(":package", package)
|> String.replace(":version", version)
|> String.replace(":file", real_path)
|> String.replace(":alias", aliased_path)
end
defp build_asset_path(source, {package, version, path}) do
build_asset_path(source, {package, version, {path, path}})
end
end
| 32.217054 | 100 | 0.63282 |
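`build_asset_path/2` above resolves URL templates by substituting the `:package`, `:version`, `:file`, and `:alias` placeholders. An illustrative standalone sketch of that substitution, using the module's own `remote_source` template and one of its real asset entries:

```elixir
# The default remote_source template from the config above.
template = "https://cdn.jsdelivr.net/npm/:package@:version/:file"

# Substitute placeholders the same way build_asset_path/2 does,
# using the react 15.6.1 entry from the @assets list.
url =
  template
  |> String.replace(":package", "react")
  |> String.replace(":version", "15.6.1")
  |> String.replace(":file", "dist/react.min.js")
</imports>
```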
e8633fbfc27ca1cd36c0dcb1fd9a9c63702fa6f7 | 5,377 | ex | Elixir | lib/event_store/storage/appender.ex | davydog187/eventstore | 085602a8cfae7401e6d89472a053fd52f586832f | [
"MIT"
] | 576 | 2017-11-03T14:11:07.000Z | 2022-03-29T06:18:47.000Z | lib/event_store/storage/appender.ex | davydog187/eventstore | 085602a8cfae7401e6d89472a053fd52f586832f | [
"MIT"
] | 129 | 2017-11-08T06:10:20.000Z | 2021-09-15T16:18:14.000Z | lib/event_store/storage/appender.ex | davydog187/eventstore | 085602a8cfae7401e6d89472a053fd52f586832f | [
"MIT"
] | 118 | 2017-11-14T14:10:09.000Z | 2022-03-28T13:13:56.000Z | defmodule EventStore.Storage.Appender do
@moduledoc false
require Logger
alias EventStore.RecordedEvent
alias EventStore.Sql.Statements
@doc """
Append the given list of events to storage.
Events are inserted in batches of 1,000 within a single transaction. This is
due to PostgreSQL's limit of 65,535 parameters for a single statement.
Returns `:ok` on success, `{:error, reason}` on failure.
"""
def append(conn, stream_id, events, opts) do
stream_uuid = stream_uuid(events)
try do
events
|> Stream.map(&encode_uuids/1)
|> Stream.chunk_every(1_000)
|> Enum.each(fn batch ->
event_count = length(batch)
with :ok <- insert_event_batch(conn, stream_id, stream_uuid, batch, event_count, opts) do
Logger.debug("Appended #{event_count} event(s) to stream #{inspect(stream_uuid)}")
:ok
else
{:error, error} -> throw({:error, error})
end
end)
catch
{:error, error} = reply ->
Logger.warn(
"Failed to append events to stream #{inspect(stream_uuid)} due to: " <> inspect(error)
)
reply
end
end
@doc """
Link the given list of existing event ids to another stream in storage.
Returns `:ok` on success, `{:error, reason}` on failure.
"""
def link(conn, stream_id, event_ids, opts \\ [])
def link(conn, stream_id, event_ids, opts) do
{schema, opts} = Keyword.pop(opts, :schema)
try do
event_ids
|> Stream.map(&encode_uuid/1)
|> Stream.chunk_every(1_000)
|> Enum.each(fn batch ->
event_count = length(batch)
parameters =
batch
|> Stream.with_index(1)
|> Enum.flat_map(fn {event_id, index} -> [index, event_id] end)
params = [stream_id, event_count] ++ parameters
with :ok <- insert_link_events(conn, params, event_count, schema, opts) do
Logger.debug("Linked #{length(event_ids)} event(s) to stream")
:ok
else
{:error, error} -> throw({:error, error})
end
end)
catch
{:error, error} = reply ->
Logger.warn("Failed to link events to stream due to: #{inspect(error)}")
reply
end
end
defp encode_uuids(%RecordedEvent{} = event) do
%RecordedEvent{event_id: event_id, causation_id: causation_id, correlation_id: correlation_id} =
event
%RecordedEvent{
event
| event_id: encode_uuid(event_id),
causation_id: encode_uuid(causation_id),
correlation_id: encode_uuid(correlation_id)
}
end
defp encode_uuid(nil), do: nil
defp encode_uuid(value), do: UUID.string_to_binary!(value)
defp insert_event_batch(conn, stream_id, stream_uuid, events, event_count, opts) do
{schema, opts} = Keyword.pop(opts, :schema)
{expected_version, opts} = Keyword.pop(opts, :expected_version)
statement =
case expected_version do
:any_version -> Statements.insert_events_any_version(schema, stream_id, event_count)
_expected_version -> Statements.insert_events(schema, stream_id, event_count)
end
stream_id_or_uuid = stream_id || stream_uuid
parameters = [stream_id_or_uuid, event_count] ++ build_insert_parameters(events)
conn
|> Postgrex.query(statement, parameters, opts)
|> handle_response()
end
defp build_insert_parameters(events) do
events
|> Enum.with_index(1)
|> Enum.flat_map(fn {%RecordedEvent{} = event, index} ->
%RecordedEvent{
event_id: event_id,
event_type: event_type,
causation_id: causation_id,
correlation_id: correlation_id,
data: data,
metadata: metadata,
created_at: created_at,
stream_version: stream_version
} = event
[
event_id,
event_type,
causation_id,
correlation_id,
data,
metadata,
created_at,
index,
stream_version
]
end)
end
defp insert_link_events(conn, params, event_count, schema, opts) do
statement = Statements.insert_link_events(schema, event_count)
conn
|> Postgrex.query(statement, params, opts)
|> handle_response()
end
defp handle_response({:ok, %Postgrex.Result{num_rows: 0}}), do: {:error, :not_found}
defp handle_response({:ok, %Postgrex.Result{}}), do: :ok
defp handle_response({:error, %Postgrex.Error{} = error}) do
%Postgrex.Error{postgres: %{code: error_code, constraint: constraint}} = error
case {error_code, constraint} do
{:foreign_key_violation, _constraint} ->
{:error, :not_found}
{:unique_violation, "events_pkey"} ->
{:error, :duplicate_event}
{:unique_violation, "stream_events_pkey"} ->
{:error, :duplicate_event}
{:unique_violation, "ix_streams_stream_uuid"} ->
# EventStore.Streams.Stream will retry when it gets this
# error code. That will always work because on the second
# time around, the stream will have been made, so the
# race to create the stream will have been resolved.
{:error, :duplicate_stream_uuid}
{:unique_violation, _constraint} ->
{:error, :wrong_expected_version}
{error_code, _constraint} ->
{:error, error_code}
end
end
defp stream_uuid([event | _]), do: event.stream_uuid
end
| 28.601064 | 100 | 0.639576 |
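`Appender.append/4` above chunks events into batches of 1,000 because each event contributes 9 bind parameters (see `build_insert_parameters/1`) and PostgreSQL caps a single statement at 65,535 parameters. An illustrative sketch of that arithmetic, detached from the database:

```elixir
# 2_500 fake events, chunked exactly as Appender does (it uses
# Stream.chunk_every/2; Enum.chunk_every/2 is equivalent on a list).
events = Enum.to_list(1..2_500)
batches = Enum.chunk_every(events, 1_000)

# Each event adds 9 bind parameters in build_insert_parameters/1,
# so a 1_000-event batch uses ~9_000 parameters — far below 65_535.
params_per_batch = Enum.map(batches, &(length(&1) * 9))
```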
e8639952372146a8b8008a85f942c73b6c8c51b6 | 3,377 | ex | Elixir | clients/service_directory/lib/google_api/service_directory/v1beta1/model/expr.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | clients/service_directory/lib/google_api/service_directory/v1beta1/model/expr.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | clients/service_directory/lib/google_api/service_directory/v1beta1/model/expr.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.ServiceDirectory.V1beta1.Model.Expr do
@moduledoc """
Represents a textual expression in the Common Expression Language (CEL)
syntax. CEL is a C-like expression language. The syntax and semantics of CEL
are documented at https://github.com/google/cel-spec.
Example (Comparison):
title: "Summary size limit"
description: "Determines if a summary is less than 100 chars"
expression: "document.summary.size() < 100"
Example (Equality):
title: "Requestor is owner"
description: "Determines if requestor is the document owner"
expression: "document.owner == request.auth.claims.email"
Example (Logic):
title: "Public documents"
description: "Determine whether the document should be publicly visible"
expression: "document.type != 'private' && document.type != 'internal'"
Example (Data Manipulation):
title: "Notification string"
description: "Create a notification string with a timestamp."
expression: "'New message received at ' + string(document.create_time)"
The exact variables and functions that may be referenced within an expression
are determined by the service that evaluates it. See the service
documentation for additional information.
## Attributes
* `description` (*type:* `String.t`, *default:* `nil`) - Optional. Description of the expression. This is a longer text which
describes the expression, e.g. when hovered over it in a UI.
* `expression` (*type:* `String.t`, *default:* `nil`) - Textual representation of an expression in Common Expression Language
syntax.
* `location` (*type:* `String.t`, *default:* `nil`) - Optional. String indicating the location of the expression for error
reporting, e.g. a file name and a position in the file.
* `title` (*type:* `String.t`, *default:* `nil`) - Optional. Title for the expression, i.e. a short string describing
its purpose. This can be used e.g. in UIs which allow to enter the
expression.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:description => String.t(),
:expression => String.t(),
:location => String.t(),
:title => String.t()
}
field(:description)
field(:expression)
field(:location)
field(:title)
end
defimpl Poison.Decoder, for: GoogleApi.ServiceDirectory.V1beta1.Model.Expr do
def decode(value, options) do
GoogleApi.ServiceDirectory.V1beta1.Model.Expr.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.ServiceDirectory.V1beta1.Model.Expr do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 37.10989 | 129 | 0.710394 |
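The moduledoc above describes the four optional fields of a CEL `Expr`. As an illustrative sketch (a plain map standing in for the generated struct, so it runs without the library), the "Summary size limit" example from the moduledoc looks like:

```elixir
# Plain map mirroring GoogleApi.ServiceDirectory.V1beta1.Model.Expr,
# populated with the moduledoc's comparison example.
expr = %{
  title: "Summary size limit",
  description: "Determines if a summary is less than 100 chars",
  expression: "document.summary.size() < 100",
  location: nil
}
```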
e863a510ea76ebdaab1bd0e6e4c6630a61b4bf81 | 5,457 | ex | Elixir | lib/tzdata/data_loader.ex | noizu/tzdata | bc9003ec5d23fe7c5dcdbfe0858b16d1080a7a09 | [
"MIT"
] | null | null | null | lib/tzdata/data_loader.ex | noizu/tzdata | bc9003ec5d23fe7c5dcdbfe0858b16d1080a7a09 | [
"MIT"
] | 1 | 2018-12-30T12:15:35.000Z | 2018-12-30T12:15:35.000Z | lib/tzdata/data_loader.ex | koan-ci/tzdata | bc9003ec5d23fe7c5dcdbfe0858b16d1080a7a09 | [
"MIT"
] | null | null | null | defmodule Tzdata.DataLoader do
require Logger
@compile :nowarn_deprecated_function
# Can poll for newest version of tz data and can download
# and extract it.
@download_url "https://data.iana.org/time-zones/tzdata-latest.tar.gz"
def download_new(url \\ @download_url) do
Logger.debug("Tzdata downloading new data from #{url}")
set_latest_remote_poll_date()
{:ok, 200, headers, client_ref} = :hackney.get(url, [], "", [follow_redirect: true])
{:ok, body} = :hackney.body(client_ref)
content_length = byte_size(body)
{:ok, last_modified} = last_modified_from_headers(headers)
new_dir_name =
"#{data_dir()}/tmp_downloads/#{content_length}_#{:random.uniform(100_000_000)}/"
File.mkdir_p!(new_dir_name)
target_filename = "#{new_dir_name}latest.tar.gz"
File.write!(target_filename, body)
extract(target_filename, new_dir_name)
release_version = release_version_for_dir(new_dir_name)
Logger.debug("Tzdata data downloaded. Release version #{release_version}.")
{:ok, content_length, release_version, new_dir_name, last_modified}
end
defp extract(filename, target_dir) do
:erl_tar.extract(filename, [:compressed, {:cwd, target_dir}])
# remove tar.gz file after extraction
File.rm!(filename)
end
def release_version_for_dir(dir_name) do
# 100 lines should be more than enough to get the first Release line
release_string =
"#{dir_name}/NEWS"
|> File.stream!()
|> Stream.filter(fn string -> Regex.match?(~r/Release/, string) end)
|> Enum.take(100)
|> hd
|> String.replace(~r/\s*$/, "")
captured =
Regex.named_captures(
~r/Release[\s]+(?<version>[^\s]+)[\s]+-[\s]+(?<timestamp>.+)/m,
release_string
)
captured["version"]
end
def last_modified_of_latest_available(url \\ @download_url) do
set_latest_remote_poll_date()
case :hackney.head(url, [], "", []) do
{:ok, 200, headers} ->
last_modified_from_headers(headers)
_ ->
{:error, :did_not_get_ok_response}
end
end
def latest_file_size(url \\ @download_url) do
set_latest_remote_poll_date()
case latest_file_size_by_head(url) do
{:ok, size} ->
{:ok, size}
_ ->
Logger.debug("Could not get latest tzdata file size by HEAD request. Trying GET request.")
latest_file_size_by_get(url)
end
end
defp latest_file_size_by_get(url) do
case :hackney.get(url, [], "", []) do
{:ok, 200, _headers, client_ref} ->
{:ok, body} = :hackney.body(client_ref)
{:ok, byte_size(body)}
{:ok, _status, _headers, client_ref} ->
:hackney.skip_body(client_ref)
{:error, :did_not_get_ok_response}
_ ->
{:error, :did_not_get_ok_response}
end
end
defp latest_file_size_by_head(url) do
:hackney.head(url, [], "", [])
|> do_latest_file_size_by_head
end
defp do_latest_file_size_by_head({:error, error}), do: {:error, error}
defp do_latest_file_size_by_head({_tag, resp_code, _headers}) when resp_code != 200,
do: {:error, :did_not_get_ok_response}
defp do_latest_file_size_by_head({_tag, _resp_code, headers}) do
headers
|> content_length_from_headers
end
defp content_length_from_headers(headers) do
case value_from_headers(headers, "Content-Length") do
{:ok, content_length} -> {:ok, content_length |> String.to_integer()}
{:error, reason} -> {:error, reason}
end
end
defp last_modified_from_headers(headers) do
value_from_headers(headers, "Last-Modified")
end
defp value_from_headers(headers, key) do
header =
headers
|> Enum.filter(fn {k, _v} -> k == key end)
|> List.first()
case header do
nil -> {:error, :not_found}
{_, value} -> {:ok, value}
_ -> {:error, :unexpected_headers}
end
end
def set_latest_remote_poll_date do
{y, m, d} = current_date_utc()
File.write!(remote_poll_file_name(), "#{y}-#{m}-#{d}")
end
def latest_remote_poll_date do
latest_remote_poll_file_exists?() |> do_latest_remote_poll_date
end
defp do_latest_remote_poll_date(_file_exists = true) do
File.stream!(remote_poll_file_name())
|> Enum.to_list()
|> return_value_for_file_list
end
defp do_latest_remote_poll_date(_file_exists = false), do: {:unknown, nil}
defp return_value_for_file_list([]), do: {:unknown, nil}
defp return_value_for_file_list([one_line]) do
date =
one_line
|> String.split("-")
|> Enum.map(&(Integer.parse(&1) |> elem(0)))
|> List.to_tuple()
{:ok, date}
end
defp return_value_for_file_list(_) do
raise "latest_remote_poll.txt contains more than 1 line. It should contain exactly 1 line. Remove the file latest_remote_poll.txt in order to resolve the problem."
end
defp latest_remote_poll_file_exists?, do: File.exists?(remote_poll_file_name())
defp current_date_utc, do: :calendar.universal_time() |> elem(0)
def days_since_last_remote_poll do
{tag, date} = latest_remote_poll_date()
case tag do
:ok ->
days_today = :calendar.date_to_gregorian_days(current_date_utc())
days_latest = :calendar.date_to_gregorian_days(date)
{:ok, days_today - days_latest}
_ ->
{tag, date}
end
end
def remote_poll_file_name do
data_dir() <> "/latest_remote_poll.txt"
end
defp data_dir, do: Tzdata.Util.data_dir()
end
| 29.181818 | 167 | 0.6674 |
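`days_since_last_remote_poll/0` above subtracts two `:calendar.date_to_gregorian_days/1` values to get an elapsed-day count. An illustrative sketch of that arithmetic with fixed dates (the real function uses today's UTC date and the persisted poll date):

```elixir
# Gregorian day numbers are monotonically increasing integers, so a
# simple subtraction yields the number of days between two dates.
days_today = :calendar.date_to_gregorian_days({2024, 1, 10})
days_latest = :calendar.date_to_gregorian_days({2024, 1, 3})

elapsed = days_today - days_latest
```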
e863beb3222efe73b0da650d6158604af63132ff | 10,177 | exs | Elixir | lib/elixir/test/elixir/kernel/raise_test.exs | RyanBard/elixir | 3e0f3b47cf26aa121470141b9a1aa55a366c066e | [
"Apache-2.0"
] | 2 | 2018-11-15T06:38:14.000Z | 2018-11-17T18:03:14.000Z | lib/elixir/test/elixir/kernel/raise_test.exs | RyanBard/elixir | 3e0f3b47cf26aa121470141b9a1aa55a366c066e | [
"Apache-2.0"
] | null | null | null | lib/elixir/test/elixir/kernel/raise_test.exs | RyanBard/elixir | 3e0f3b47cf26aa121470141b9a1aa55a366c066e | [
"Apache-2.0"
] | null | null | null | Code.require_file("../test_helper.exs", __DIR__)
defmodule Kernel.RaiseTest do
use ExUnit.Case, async: true
# Silence warnings
defp atom, do: RuntimeError
defp binary, do: "message"
defp opts, do: [message: "message"]
defp struct, do: %RuntimeError{message: "message"}
@trace [{:foo, :bar, 0, []}]
test "raise message" do
assert_raise RuntimeError, "message", fn ->
raise "message"
end
assert_raise RuntimeError, "message", fn ->
var = binary()
raise var
end
end
test "raise with no arguments" do
assert_raise RuntimeError, fn ->
raise RuntimeError
end
assert_raise RuntimeError, fn ->
var = atom()
raise var
end
end
test "raise with arguments" do
assert_raise RuntimeError, "message", fn ->
raise RuntimeError, message: "message"
end
assert_raise RuntimeError, "message", fn ->
atom = atom()
opts = opts()
raise atom, opts
end
end
test "raise existing exception" do
assert_raise RuntimeError, "message", fn ->
raise %RuntimeError{message: "message"}
end
assert_raise RuntimeError, "message", fn ->
var = struct()
raise var
end
end
test "reraise message" do
try do
reraise "message", @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
try do
var = binary()
reraise var, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
end
test "reraise with no arguments" do
try do
reraise RuntimeError, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
try do
var = atom()
reraise var, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
end
test "reraise with arguments" do
try do
reraise RuntimeError, [message: "message"], @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
try do
atom = atom()
opts = opts()
reraise atom, opts, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
end
test "reraise existing exception" do
try do
reraise %RuntimeError{message: "message"}, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
try do
var = struct()
reraise var, @trace
flunk("should not reach")
rescue
RuntimeError ->
assert @trace == __STACKTRACE__
end
end
describe "rescue" do
test "runtime error" do
result =
try do
raise "an exception"
rescue
RuntimeError -> true
catch
:error, _ -> false
end
assert result
result =
try do
raise "an exception"
rescue
AnotherError -> true
catch
:error, _ -> false
end
refute result
end
test "named runtime error" do
result =
try do
raise "an exception"
rescue
x in [RuntimeError] -> Exception.message(x)
catch
:error, _ -> false
end
assert result == "an exception"
end
test "named runtime or argument error" do
result =
try do
raise "an exception"
rescue
x in [ArgumentError, RuntimeError] -> Exception.message(x)
catch
:error, _ -> false
end
assert result == "an exception"
end
test "with higher precedence than catch" do
result =
try do
raise "an exception"
rescue
_ -> true
catch
_, _ -> false
end
assert result
end
test "argument error from erlang" do
result =
try do
:erlang.error(:badarg)
rescue
ArgumentError -> true
end
assert result
end
test "argument error from elixir" do
result =
try do
raise ArgumentError, ""
rescue
ArgumentError -> true
end
assert result
end
test "catch-all variable" do
result =
try do
raise "an exception"
rescue
x -> Exception.message(x)
end
assert result == "an exception"
end
test "catch-all underscore" do
result =
try do
raise "an exception"
rescue
_ -> true
end
assert result
end
test "catch-all unused variable" do
result =
try do
raise "an exception"
rescue
_any -> true
end
assert result
end
test "catch-all with \"x in _\" syntax" do
result =
try do
raise "an exception"
rescue
exception in _ ->
Exception.message(exception)
end
assert result == "an exception"
end
end
describe "normalize" do
test "wrap custom Erlang error" do
result =
try do
:erlang.error(:sample)
rescue
x in [ErlangError] -> Exception.message(x)
end
assert result == "Erlang error: :sample"
end
test "undefined function error" do
result =
try do
DoNotExist.for_sure()
rescue
x in [UndefinedFunctionError] -> Exception.message(x)
end
assert result ==
"function DoNotExist.for_sure/0 is undefined (module DoNotExist is not available)"
end
test "function clause error" do
result =
try do
zero(1)
rescue
x in [FunctionClauseError] -> Exception.message(x)
end
assert result == "no function clause matching in Kernel.RaiseTest.zero/1"
end
test "badarg error" do
result =
try do
:erlang.error(:badarg)
rescue
x in [ArgumentError] -> Exception.message(x)
end
assert result == "argument error"
end
test "tuple badarg error" do
result =
try do
:erlang.error({:badarg, [1, 2, 3]})
rescue
x in [ArgumentError] -> Exception.message(x)
end
assert result == "argument error: [1, 2, 3]"
end
test "badarith error" do
result =
try do
:erlang.error(:badarith)
rescue
x in [ArithmeticError] -> Exception.message(x)
end
assert result == "bad argument in arithmetic expression"
end
test "badarity error" do
fun = fn x -> x end
string = "#{inspect(fun)} with arity 1 called with 2 arguments (1, 2)"
result =
try do
fun.(1, 2)
rescue
x in [BadArityError] -> Exception.message(x)
end
assert result == string
end
test "badfun error" do
# Avoid "invalid function call" warning
x = fn -> :example end
result =
try do
x.().(2)
rescue
x in [BadFunctionError] -> Exception.message(x)
end
assert result == "expected a function, got: :example"
end
test "badmatch error" do
x = :example
result =
try do
^x = zero(0)
rescue
x in [MatchError] -> Exception.message(x)
end
assert result == "no match of right hand side value: 0"
end
test "bad key error" do
result =
try do
%{%{} | foo: :bar}
rescue
x in [KeyError] -> Exception.message(x)
end
assert result == "key :foo not found"
result =
try do
%{}.foo
rescue
x in [KeyError] -> Exception.message(x)
end
assert result == "key :foo not found in: %{}"
end
test "bad map error" do
result =
try do
%{zero(0) | foo: :bar}
rescue
x in [BadMapError] -> Exception.message(x)
end
assert result == "expected a map, got: 0"
end
test "bad boolean error" do
result =
try do
1 and true
rescue
x in [BadBooleanError] -> Exception.message(x)
end
assert result == "expected a boolean on left-side of \"and\", got: 1"
end
test "case clause error" do
x = :example
result =
try do
case zero(0) do
^x -> nil
end
rescue
x in [CaseClauseError] -> Exception.message(x)
end
assert result == "no case clause matching: 0"
end
test "cond clause error" do
result =
try do
cond do
!zero(0) -> :ok
end
rescue
x in [CondClauseError] -> Exception.message(x)
end
assert result == "no cond clause evaluated to a truthy value"
end
test "try clause error" do
f = fn -> :example end
result =
try do
try do
f.()
rescue
_exception ->
:ok
else
:other ->
:ok
end
rescue
x in [TryClauseError] -> Exception.message(x)
end
assert result == "no try clause matching: :example"
end
test "undefined function error as Erlang error" do
result =
try do
DoNotExist.for_sure()
rescue
x in [ErlangError] -> Exception.message(x)
end
assert result ==
"function DoNotExist.for_sure/0 is undefined (module DoNotExist is not available)"
end
end
defmacrop exceptions do
[ErlangError]
end
test "with macros" do
result =
try do
DoNotExist.for_sure()
rescue
x in exceptions() -> Exception.message(x)
end
assert result ==
"function DoNotExist.for_sure/0 is undefined (module DoNotExist is not available)"
end
defp zero(0), do: 0
end
| 20.39479 | 97 | 0.535128 |
e86400b0de2c0e25a4e4df65fbcee78580870834 | 167 | exs | Elixir | priv/repo/migrations/20211028122202_grace_period_end.exs | plausible-insights/plausible | 88173342b9e969894879bfb2e8d203426f6a1b1c | ["MIT"] | stars: 984 (2019-09-02..2020-06-08) | issues: 24 (2019-09-10..2020-06-08) | forks: 51 (2019-09-03..2020-06-07) |
defmodule Plausible.Repo.Migrations.GracePeriodEnd do
use Ecto.Migration
def change do
alter table(:users) do
add :grace_period, :map
end
end
end
| 16.7 | 53 | 0.712575 |
e8642ccb8241e8c03f793db49290650d9c1d23a5 | 254 | ex | Elixir | testData/org/elixir_lang/parser_definition/at_bracket_operation_parsing_test_case/MatchedQualifiedIdentifier.ex | keyno63/intellij-elixir | 4033e319992c53ddd42a683ee7123a97b5e34f02 | ["Apache-2.0"] | stars: 1,668 (2015-01-03..2022-03-25) | issues: 2,018 (2015-01-01..2022-03-31) | forks: 145 (2015-01-15..2021-12-22) |
@Module.function[key: value]
@Module.function [key: value]
@Module.function[()]
@Module.function [()]
@Module.function[matched_expression]
@Module.function [matched_expression]
@Module.function[matched_expression,]
@Module.function [matched_expression,]
| 28.222222 | 38 | 0.787402 |
e86438c798bda12e5fb49a29e6422f9551312446 | 2,805 | ex | Elixir | assets/node_modules/phoenix/priv/templates/phx.gen.presence/presence.ex | xiongxin/web_chat | 4123887993083058a358358595c06970a5ac873f | ["MIT"] | stars: 8 (2019-06-02..2021-08-11) | issues: 7 (2019-05-15..2020-06-10) | forks: 1 (2019-06-02..2019-06-02) |
defmodule <%= module %> do
@moduledoc """
Provides presence tracking to channels and processes.
See the [`Phoenix.Presence`](http://hexdocs.pm/phoenix/Phoenix.Presence.html)
docs for more details.
## Usage
Presences can be tracked in your channel after joining:
defmodule <%= base %>.MyChannel do
use <%= base %>Web, :channel
alias <%= base %>.Presence
def join("some:topic", _params, socket) do
send(self(), :after_join)
{:ok, assign(socket, :user_id, ...)}
end
def handle_info(:after_join, socket) do
push socket, "presence_state", Presence.list(socket)
{:ok, _} = Presence.track(socket, socket.assigns.user_id, %{
online_at: inspect(System.system_time(:seconds))
})
{:noreply, socket}
end
end
In the example above, `Presence.track` is used to register this
channel's process as a presence for the socket's user ID, with
a map of metadata. Next, the current presence list for
the socket's topic is pushed to the client as a `"presence_state"` event.
Finally, a diff of presence join and leave events will be sent to the
client as they happen in real-time with the "presence_diff" event.
See `Phoenix.Presence.list/2` for details on the presence data structure.
## Fetching Presence Information
The `fetch/2` callback is triggered when using `list/1`
and serves as a mechanism to fetch presence information a single time,
before broadcasting the information to all channel subscribers.
This prevents N query problems and gives you a single place to group
isolated data fetching to extend presence metadata.
The function receives a topic and map of presences and must return a
map of data matching the Presence data structure:
%{"123" => %{metas: [%{status: "away", phx_ref: ...}],
"456" => %{metas: [%{status: "online", phx_ref: ...}]}
The `:metas` key must be kept, but you can extend the map of information
to include any additional information. For example:
def fetch(_topic, entries) do
users = entries |> Map.keys() |> Accounts.get_users_map(entries)
# => %{"123" => %{name: "User 123"}, "456" => %{name: nil}}
for {key, %{metas: metas}} <- entries, into: %{} do
{key, %{metas: metas, user: users[key]}}
end
end
The function above fetches all users from the database who
have registered presences for the given topic. The fetched
information is then extended with a `:user` key of the user's
information, while maintaining the required `:metas` field from the
original presence data.
"""
use Phoenix.Presence, otp_app: <%= inspect otp_app %>,
pubsub_server: <%= inspect binding()[:pubsub_server] %>
end
| 37.905405 | 79 | 0.662032 |
e86454b96cf2a9d46471137cbf09e87ccca213ed | 2,620 | exs | Elixir | test/acceptance/emphasis_test.exs | RichMorin/earmark | e65fcf67345c84c23d237c732e5c174246662c68 | ["Apache-1.1"] | stars: null | issues: null | forks: null |
defmodule Acceptance.EmphasisTest do
use ExUnit.Case
import Support.Helpers, only: [as_html: 1, as_html: 2]
# describe "Emphasis" do
test "important" do
markdown = "*foo bar*\n"
html = "<p><em>foo bar</em></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "important quotes" do
markdown = "a*\"foo\"*\n"
html = "<p>a<em>"foo"</em></p>\n"
messages = []
assert as_html(markdown, smartypants: false) == {:ok, html, messages}
end
test "important _" do
markdown = "_foo bar_\n"
html = "<p><em>foo bar</em></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "dont get confused" do
markdown = "_foo*\n"
html = "<p>_foo*</p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "that should make you smile" do
markdown = "_foo_bar_baz_\n"
html = "<p><em>foo_bar_baz</em></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "stronger" do
markdown = "**foo bar**\n"
html = "<p><strong>foo bar</strong></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "stronger inside" do
markdown = "foo**bar**\n"
html = "<p>foo<strong>bar</strong></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "stronger together" do
markdown = "__foo bar__\n"
html = "<p><strong>foo bar</strong></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "let no evil underscores divide us" do
markdown = "**foo__bar**\n"
html = "<p><strong>foo__bar</strong></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "strong **and** stronger" do
markdown = "*(**foo**)*\n"
html = "<p><em>(<strong>foo</strong>)</em></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "stronger **and** strong" do
markdown = "**(*foo*)**\n"
html = "<p><strong>(<em>foo</em>)</strong></p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
test "one is not strong enough" do
markdown = "foo*\n"
html = "<p>foo*</p>\n"
messages = []
assert as_html(markdown) == {:ok, html, messages}
end
# end
end
# SPDX-License-Identifier: Apache-2.0
| 24.485981 | 75 | 0.537405 |
e8646f03feabafc8716b0fe7cec6385bd3f34233 | 198 | ex | Elixir | lib/phone/gn.ex | net/phone | 18e1356d2f8d32fe3f95638c3c44bceab0164fb2 | ["Apache-2.0"] | stars: null | issues: null | forks: null |
defmodule Phone.GN do
@moduledoc false
use Helper.Country
def regex, do: ~r/^(224)()(.{8})/
def country, do: "Guinea"
def a2, do: "GN"
def a3, do: "GIN"
matcher :regex, ["224"]
end
| 15.230769 | 35 | 0.59596 |
e864a19d957a85e25863b363313a4b3bb77c1822 | 365 | exs | Elixir | test/web/controllers/api/v1/upload_signature_controller_test.exs | palindrom615/firestorm | 0690493c9dcae5c04c63c5321532a7db923e5be7 | ["MIT"] | stars: null | issues: 4 (2021-03-01..2022-02-10) | forks: 1 (2020-03-20..2020-03-20) |
defmodule FirestormWeb.Web.Api.V1.UploadSignatureControllerTest do
use FirestormWeb.Web.ConnCase
test "POST /", %{conn: conn} do
upload = %{
"filename" => "image.jpg",
"mimetype" => "image/jpeg"
}
conn = post conn, "/api/v1/upload_signature", upload: upload
response = json_response(conn, 201)["data"]
assert response
end
end
| 26.071429 | 66 | 0.657534 |
e864be8ba9364ab7be1a1b8b6e2e454900b301ea | 665 | ex | Elixir | lib/dark_matter/deps.ex | dark-elixir/dark_matter | 3f70edf4220ad1c066489110ef30880a143522fd | ["Apache-2.0"] | stars: 2 (2020-12-01..2021-05-29) | issues: null | forks: 2 (2020-09-02..2021-04-22) |
defmodule DarkMatter.Deps do
@moduledoc """
Utils for working with deps.
"""
@moduledoc since: "1.1.0"
@doc """
Check if a given `dependency` has a version matching the `semversion`.
## Examples
iex> version_match?(:dark_matter, ">= 0.0.0")
true
iex> version_match?(:dark_matter, "<= 0.0.0")
false
"""
@spec version_match?(atom(), String.t()) :: boolean()
def version_match?(dependency, semversion) when is_atom(dependency) and is_binary(semversion) do
case Application.spec(dependency, :vsn) do
nil -> false
version when is_list(version) -> Version.match?("#{version}", semversion)
end
end
end
| 25.576923 | 98 | 0.646617 |
e864fd2d3d0b9ce409721187f002f2a1c755ebdf | 1,419 | ex | Elixir | example_json_parse/lib/example_json_parse.ex | cuevacreativa/Domo | 5f2f5ff3cb57dfe774408dcae6ccb5b79d1a3089 | ["MIT"] | stars: null | issues: null | forks: null |
defmodule ExampleJsonParse do
@moduledoc """
Parses JSON and validates data to conform to model types.
"""
alias Core.Product
alias Core.Product.Image
alias JsonReply.ProductCatalog
def parse_valid_file do
parse("product-catalog.json")
end
def parse_invalid_file do
{:error, message} = parse("product-catalog-invalid.json")
IO.puts("=== ERROR ===")
Enum.each(message, fn {field, error} -> IO.puts("#{field}:\n#{error}") end)
end
defp parse(file_path) do
binary = File.read!(file_path)
with {:ok, map} <- Jason.decode(binary),
catalog = MapShaper.from_map(%ProductCatalog{}, map, &maybe_remove_locale/1),
{:ok, catalog} <- ProductCatalog.ensure_type(catalog) do
{:ok, to_products_list(catalog)}
end
end
defp maybe_remove_locale(%{"en-US" => value}), do: value
defp maybe_remove_locale(value), do: value
defp to_products_list(%ProductCatalog{} = catalog) do
image_by_id =
catalog.image_assets
|> Enum.group_by(& &1.id)
|> Enum.map(fn {key, list} -> {key, list |> List.first() |> Map.drop([:id])} end)
|> Enum.into(%{})
Enum.map(catalog.product_entries, fn entry ->
image = image_by_id[entry.image_asset_id]
entry
|> Map.from_struct()
|> Map.drop([:image_asset_id])
|> Map.put(:image, struct!(Image, Map.from_struct(image)))
|> Product.new!()
end)
end
end
| 27.288462 | 87 | 0.641297 |
e8650671441b0aacdbc696a228fccbb00912b4e6 | 1,152 | ex | Elixir | clients/service_directory/lib/google_api/service_directory/v1beta1/connection.ex | jechol/elixir-google-api | 0290b683dfc6491ca2ef755a80bc329378738d03 | ["Apache-2.0"] | stars: null | issues: 1 (2020-12-18..2020-12-18) | forks: 1 (2020-10-04..2020-10-04) |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.ServiceDirectory.V1beta1.Connection do
@moduledoc """
Handle Tesla connections for GoogleApi.ServiceDirectory.V1beta1.
"""
@type t :: Tesla.Env.client()
use GoogleApi.Gax.Connection,
scopes: [
# View and manage your data across Google Cloud Platform services
"https://www.googleapis.com/auth/cloud-platform"
],
otp_app: :google_api_service_directory,
base_url: "https://servicedirectory.googleapis.com/"
end
| 34.909091 | 74 | 0.748264 |
e8650fe92cb62dbdbc44658ba3bb4f5b0ebfe7d3 | 24,176 | ex | Elixir | lib/sweet_xml.ex | gilacost/sweet_xml | cc740334a5e2060a98195b05ef55575fc3718b75 | ["MIT"] | stars: null | issues: null | forks: null |
defmodule SweetXpath do
defmodule Priv do
@moduledoc false
@doc false
def self_val(val), do: val
end
defstruct path: ".",
is_value: true,
is_list: false,
is_keyword: false,
is_optional: false,
cast_to: false,
transform_fun: &Priv.self_val/1,
namespaces: []
end
defmodule SweetXml do
@moduledoc ~S"""
`SweetXml` is a thin wrapper around `:xmerl`. It allows you to convert a
string or xmlElement record as defined in `:xmerl` to an elixir value such
as `map`, `list`, `char_list`, or any combination of these.
For normal sized documents, `SweetXml` primarily exposes 3 functions
* `SweetXml.xpath/2` - return a value based on the xpath expression
* `SweetXml.xpath/3` - similar to above but allowing nesting of mapping
* `SweetXml.xmap/2` - return a map with keywords mapped to values returned
from xpath
For something larger, `SweetXml` mainly exposes 1 function
* `SweetXml.stream_tags/3` - stream a given tag or a list of tags, and
optionally "discard" some dom elements in order to free memory during
streaming for big files which cannot fit entirely in memory
## Examples
Simple Xpath
iex> import SweetXml
iex> doc = "<h1><a>Some linked title</a></h1>"
iex> doc |> xpath(~x"//a/text()")
'Some linked title'
Nested Mapping
iex> import SweetXml
iex> doc = "<body><header><p>Message</p><ul><li>One</li><li><a>Two</a></li></ul></header></body>"
iex> doc |> xpath(~x"//header", message: ~x"./p/text()", a_in_li: ~x".//li/a/text()"l)
%{a_in_li: ['Two'], message: 'Message'}
Streaming
iex> import SweetXml
iex> doc = ["<ul><li>l1</li><li>l2", "</li><li>l3</li></ul>"]
iex> SweetXml.stream_tags(doc, :li)
...> |> Stream.map(fn {:li, doc} ->
...> doc |> SweetXml.xpath(~x"./text()")
...> end)
...> |> Enum.to_list
['l1', 'l2', 'l3']
For more examples please see help for each individual functions
## The ~x Sigil
Notice in the above examples, we used the expression `~x"//a/text()"` to
define the path. The reason is it allows us to more precisely specify what
is being returned.
* `~x"//some/path"`
without any modifiers, `xpath/2` will return the value of the entity if
the entity is of type `xmlText`, `xmlAttribute`, `xmlPI`, `xmlComment`
as defined in `:xmerl`
* `~x"//some/path"e`
`e` stands for (e)ntity. This forces `xpath/2` to return the entity with
which you can further chain your `xpath/2` call
* `~x"//some/path"l`
'l' stands for (l)ist. This forces `xpath/2` to return a list. Without
`l`, `xpath/2` will only return the first element of the match
* `~x"//some/path"el` - mix of the above
* `~x"//some/path"k`
'k' stands for (K)eyword. This forces `xpath/2` to return a Keyword instead of a Map.
* `~x"//some/path"s`
's' stands for (s)tring. This forces `xpath/2` to return the value as
string instead of a char list.
* `~x"//some/path"o`
'o' stands for (O)ptional. This allows the path to not exist, and will return nil.
* `~x"//some/path"sl` - string list.
Notice also in the examples section, we always import SweetXml first. This
makes `x_sigil` available in the current scope. Without it, instead of using
`~x`, you can do the following
iex> doc = "<h1><a>Some linked title</a></h1>"
iex> doc |> SweetXml.xpath(%SweetXpath{path: '//a/text()', is_value: true, cast_to: false, is_list: false, is_keyword: false})
'Some linked title'
Note the use of char_list in the path definition.
"""
require Record
Record.defrecord(:xmlDecl, Record.extract(:xmlDecl, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(
:xmlAttribute,
Record.extract(:xmlAttribute, from_lib: "xmerl/include/xmerl.hrl")
)
Record.defrecord(
:xmlNamespace,
Record.extract(:xmlNamespace, from_lib: "xmerl/include/xmerl.hrl")
)
Record.defrecord(:xmlNsNode, Record.extract(:xmlNsNode, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(:xmlElement, Record.extract(:xmlElement, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(:xmlText, Record.extract(:xmlText, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(:xmlComment, Record.extract(:xmlComment, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(:xmlPI, Record.extract(:xmlPI, from_lib: "xmerl/include/xmerl.hrl"))
Record.defrecord(
:xmlDocument,
Record.extract(:xmlDocument, from_lib: "xmerl/include/xmerl.hrl")
)
Record.defrecord(:xmlObj, Record.extract(:xmlObj, from_lib: "xmerl/include/xmerl.hrl"))
@doc ~s"""
`sigil_x/2` simply returns a `SweetXpath` struct, with modifiers converted to
boolean fields
iex> SweetXml.sigil_x("//some/path", 'e')
%SweetXpath{path: '//some/path', is_value: false, cast_to: false, is_list: false, is_keyword: false}
or you can simply import and use the `~x` expression
iex> import SweetXml
iex> ~x"//some/path"e
%SweetXpath{path: '//some/path', is_value: false, cast_to: false, is_list: false, is_keyword: false}
Valid modifiers are `e`, `s`, `l`, `k`, `o`, `i` and `f`. Below is the full explanation
* `~x"//some/path"`
without any modifiers, `xpath/2` will return the value of the entity if
the entity is of type `xmlText`, `xmlAttribute`, `xmlPI`, `xmlComment`
as defined in `:xmerl`
* `~x"//some/path"e`
`e` stands for (e)ntity. This forces `xpath/2` to return the entity with
which you can further chain your `xpath/2` call
* `~x"//some/path"l`
'l' stands for (l)ist. This forces `xpath/2` to return a list. Without
`l`, `xpath/2` will only return the first element of the match
* `~x"//some/path"el` - mix of the above
* `~x"//some/path"k`
'k' stands for (K)eyword. This forces `xpath/2` to return a Keyword instead of a Map.
* `~x"//some/path"s`
's' stands for (s)tring. This forces `xpath/2` to return the value as
string instead of a char list.
* `~x"//some/path"o`
'o' stands for (O)ptional. This allows the path to not exist, and will return nil.
* `~x"//some/path"sl` - string list.
* `~x"//some/path"i`
'i' stands for (i)nteger. This forces `xpath/2` to return the value as
integer instead of a char list.
* `~x"//some/path"f`
'f' stands for (f)loat. This forces `xpath/2` to return the value as
float instead of a char list.
* `~x"//some/path"il` - integer list
"""
def sigil_x(path, modifiers \\ '') do
%SweetXpath{
path: String.to_charlist(path),
is_value: not (?e in modifiers),
is_list: ?l in modifiers,
is_keyword: ?k in modifiers,
is_optional: ?o in modifiers,
cast_to:
cond do
?s in modifiers -> :string
?S in modifiers -> :soft_string
?i in modifiers -> :integer
?I in modifiers -> :soft_integer
?f in modifiers -> :float
?F in modifiers -> :soft_float
:otherwise -> false
end
}
end
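@doc """
Returns a copy of the given `SweetXpath` with an XML namespace `prefix`/`uri`
pair registered, so the prefix can be used inside the path expression.

A minimal sketch of usage (the document, the `ns` prefix and the URI below are
illustrative only, not fixed by the library):

```elixir
import SweetXml

doc = ~s(<x:root xmlns:x="http://example.com/ns"><x:item>1</x:item></x:root>)

# Register "ns" for the query, then match the namespaced element.
doc
|> xpath(add_namespace(~x"//ns:item/text()", "ns", "http://example.com/ns"))
```
"""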
def add_namespace(xpath, prefix, uri) do
%SweetXpath{
xpath
| namespaces: [
{to_charlist(prefix), to_charlist(uri)}
| xpath.namespaces
]
}
end
@doc """
`doc` can be
- a byte list (iodata)
- a binary
- any enumerable of binaries (for instance `File.stream!/3` result)
`options` are `xmerl` options described here [http://www.erlang.org/doc/man/xmerl_scan.html](http://www.erlang.org/doc/man/xmerl_scan.html),
see [the erlang tutorial](http://www.erlang.org/doc/apps/xmerl/xmerl_examples.html) for usage.
When `doc` is an enumerable, the `:cont_fun` option cannot be given.
Return an `xmlElement` record
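A minimal sketch (the XML below is illustrative):

```elixir
require SweetXml

# Parse a binary and read the root element's tag name
# via the xmlElement record accessor.
doc = SweetXml.parse("<a><b>1</b></a>")
SweetXml.xmlElement(doc, :name)
# => :a
```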
"""
def parse(doc), do: parse(doc, [])
def parse(doc, options) when is_binary(doc) do
doc |> :erlang.binary_to_list() |> parse(options)
end
def parse([c | _] = doc, options) when is_integer(c) do
{parsed_doc, _} = :xmerl_scan.string(doc, options)
parsed_doc
end
def parse(doc_enum, options) do
{parsed_doc, _} = :xmerl_scan.string('', options ++ continuation_opts(doc_enum))
parsed_doc
end
@doc """
Most common usage of streaming: stream a given tag or a list of tags, and
optionally "discard" some dom elements in order to free memory during streaming
for big files which cannot fit entirely in memory.
Note that each matched tag produces it's own tree. If a given tag appears in
the discarded options, it is ignored.
- `doc` is an enumerable, data will be pulled during the result stream
enumeration. e.g. `File.stream!("some_file.xml")`
- `tags` is an atom or a list of atoms you want to extract. Each stream element
will be `{:tagname, xmlelem}`. e.g. :li, :header
- `options[:discard]` is the list of tag which will be discarded:
not added to its parent DOM.
Examples:
iex> import SweetXml
iex> doc = ["<ul><li>l1</li><li>l2", "</li><li>l3</li></ul>"]
iex> SweetXml.stream_tags(doc, :li, discard: [:li])
...> |> Stream.map(fn {:li, doc} -> doc |> SweetXml.xpath(~x"./text()") end)
...> |> Enum.to_list
['l1', 'l2', 'l3']
iex> SweetXml.stream_tags(doc, [:ul, :li])
...> |> Stream.map(fn {_, doc} -> doc |> SweetXml.xpath(~x"./text()") end)
...> |> Enum.to_list
['l1', 'l2', 'l3', nil]
Be careful if you set `options[:discard]`. If any of the discarded tags is nested
inside a kept tag, you will not be able to access them.
Examples:
iex> import SweetXml
iex> doc = ["<header>", "<title>XML</title", "><header><title>Nested</title></header></header>"]
iex> SweetXml.stream_tags(doc, :header)
...> |> Stream.map(fn {_, doc} -> SweetXml.xpath(doc, ~x".//title/text()") end)
...> |> Enum.to_list
['Nested', 'XML']
iex> SweetXml.stream_tags(doc, :header, discard: [:title])
...> |> Stream.map(fn {_, doc} -> SweetXml.xpath(doc, ~x"./title/text()") end)
...> |> Enum.to_list
[nil, nil]
"""
def stream_tags(doc, tags, options \\ []) do
tags = if is_atom(tags), do: [tags], else: tags
{discard_tags, xmerl_options} =
if options[:discard] do
{options[:discard], Keyword.delete(options, :discard)}
else
{[], options}
end
doc
|> stream(fn emit ->
[
hook_fun: fn
entity, xstate when Record.is_record(entity, :xmlElement) ->
name = xmlElement(entity, :name)
if length(tags) == 0 or name in tags do
emit.({name, entity})
end
{entity, xstate}
entity, xstate ->
{entity, xstate}
end,
acc_fun: fn
entity, acc, xstate when Record.is_record(entity, :xmlElement) ->
if xmlElement(entity, :name) in discard_tags do
{acc, xstate}
else
{[entity | acc], xstate}
end
entity, acc, xstate ->
{[entity | acc], xstate}
end
] ++ xmerl_options
end)
end
@doc """
Create an element stream from a xml `doc`.
This is a lower level API compared to `SweetXml.stream_tags`. You can use
the `options_callback` argument to get fine control of what data to be streamed.
- `doc` is an enumerable, data will be pulled during the result stream
enumeration. e.g. `File.stream!("some_file.xml")`
- `options_callback` is an anonymous function `fn emit -> xmerl_opts` use it to
define your :xmerl callbacks and put data into the stream using
`emit.(elem)` in the callbacks.
For example, here you define a stream of all `xmlElement` :
iex> import Record
iex> doc = ["<h1", "><a>Som", "e linked title</a><a>other</a></h1>"]
iex> SweetXml.stream(doc, fn emit ->
...> [
...> hook_fun: fn
...> entity, xstate when is_record(entity, :xmlElement)->
...> emit.(entity)
...> {entity, xstate}
...> entity, xstate ->
...> {entity,xstate}
...> end
...> ]
...> end) |> Enum.count
3
"""
def stream(doc, options_callback) when is_binary(doc) do
stream([doc], options_callback)
end
def stream([c | _] = doc, options_callback) when is_integer(c) do
stream([IO.iodata_to_binary(doc)], options_callback)
end
def stream(doc, options_callback) do
Stream.resource(
fn ->
{parent, ref} = waiter = {self(), make_ref()}
opts = options_callback.(fn e -> send(parent, {:event, ref, e}) end)
pid = spawn_link(fn -> :xmerl_scan.string('', opts ++ continuation_opts(doc, waiter)) end)
{ref, pid, Process.monitor(pid)}
end,
fn {ref, pid, monref} = acc ->
receive do
{:DOWN, ^monref, _, _, _} ->
## !!! maybe do something when reason !== :normal
{:halt, :parse_ended}
{:event, ^ref, event} ->
{[event], acc}
{:wait, ^ref} ->
send(pid, {:continue, ref})
{[], acc}
end
end,
fn
:parse_ended ->
:ok
{ref, pid, monref} ->
Process.demonitor(monref)
flush_halt(pid, ref)
end
)
end
@doc ~S"""
`xpath` allows you to query an xml document with xpath.
The second argument to xpath is a `SweetXpath` struct. The optional third
argument is a keyword list, such that the value of each keyword is also
either a `SweetXpath` or a list with head being a `SweetXpath` and tail being
another keyword list exactly like before. Please see examples below for better
understanding.
## Examples
Simple
iex> import SweetXml
iex> doc = "<h1><a>Some linked title</a></h1>"
iex> doc |> xpath(~x"//a/text()")
'Some linked title'
With optional mapping
iex> import SweetXml
iex> doc = "<body><header><p>Message</p><ul><li>One</li><li><a>Two</a></li></ul></header></body>"
iex> doc |> xpath(~x"//header", message: ~x"./p/text()", a_in_li: ~x".//li/a/text()"l)
%{a_in_li: ['Two'], message: 'Message'}
With optional mapping and nesting
iex> import SweetXml
iex> doc = "<body><header><p>Message</p><ul><li>One</li><li><a>Two</a></li></ul></header></body>"
iex> doc
...> |> xpath(
...> ~x"//header",
...> ul: [
...> ~x"./ul",
...> a: ~x"./li/a/text()"
...> ]
...> )
%{ul: %{a: 'Two'}}
"""
def xpath(parent, spec) when not is_tuple(parent) do
parent |> parse |> xpath(spec)
end
def xpath(
parent,
%SweetXpath{is_list: true, is_value: true, cast_to: cast, is_optional: is_opt?} = spec
) do
get_current_entities(parent, spec)
|> Enum.map(&(_value(&1) |> to_cast(cast, is_opt?)))
|> spec.transform_fun.()
end
def xpath(parent, %SweetXpath{is_list: true, is_value: false} = spec) do
get_current_entities(parent, spec) |> spec.transform_fun.()
end
def xpath(
parent,
%SweetXpath{is_list: false, is_value: true, cast_to: string_type, is_optional: is_opt?} =
spec
)
when string_type in [:string, :soft_string] do
spec = %SweetXpath{spec | is_list: true}
get_current_entities(parent, spec)
|> Enum.map(&(_value(&1) |> to_cast(string_type, is_opt?)))
|> Enum.join()
|> spec.transform_fun.()
end
def xpath(
parent,
%SweetXpath{is_list: false, is_value: true, cast_to: cast, is_optional: is_opt?} = spec
) do
get_current_entities(parent, spec)
|> _value
|> to_cast(cast, is_opt?)
|> spec.transform_fun.()
end
def xpath(parent, %SweetXpath{is_list: false, is_value: false} = spec) do
get_current_entities(parent, spec) |> spec.transform_fun.()
end
def xpath(parent, sweet_xpath, subspec) do
if sweet_xpath.is_list do
current_entities = xpath(parent, sweet_xpath)
Enum.map(current_entities, fn entity -> xmap(entity, subspec, sweet_xpath) end)
else
current_entity = xpath(parent, sweet_xpath)
xmap(current_entity, subspec, sweet_xpath)
end
end
@doc ~S"""
`xmap` returns a map in which each value is the result of an `xpath` query.
As with `xpath`, the mapping structure can be nested. See `xpath` for more
detail.
## Examples
Simple
iex> import SweetXml
iex> doc = "<h1><a>Some linked title</a></h1>"
iex> doc |> xmap(a: ~x"//a/text()")
%{a: 'Some linked title'}
With optional mapping
iex> import SweetXml
iex> doc = "<body><header><p>Message</p><ul><li>One</li><li><a>Two</a></li></ul></header></body>"
iex> doc |> xmap(message: ~x"//p/text()", a_in_li: ~x".//li/a/text()"l)
%{a_in_li: ['Two'], message: 'Message'}
With optional mapping and nesting
iex> import SweetXml
iex> doc = "<body><header><p>Message</p><ul><li>One</li><li><a>Two</a></li></ul></header></body>"
iex> doc
...> |> xmap(
...> message: ~x"//p/text()",
...> ul: [
...> ~x"//ul",
...> a: ~x"./li/a/text()"
...> ]
...> )
%{message: 'Message', ul: %{a: 'Two'}}
iex> doc
...> |> xmap(
...> message: ~x"//p/text()",
...> ul: [
...> ~x"//ul"k,
...> a: ~x"./li/a/text()"
...> ]
...> )
%{message: 'Message', ul: [a: 'Two']}
iex> doc
...> |> xmap([
...> message: ~x"//p/text()",
...> ul: [
...> ~x"//ul",
...> a: ~x"./li/a/text()"
...> ]
...> ], true)
[message: 'Message', ul: %{a: 'Two'}]
"""
def xmap(parent, mapping), do: xmap(parent, mapping, %{is_keyword: false})
def xmap(nil, _, %{is_optional: true}), do: nil
def xmap(parent, [], atom) when is_atom(atom), do: xmap(parent, [], %{is_keyword: atom})
def xmap(_, [], %{is_keyword: false}), do: %{}
def xmap(_, [], %{is_keyword: true}), do: []
def xmap(parent, [{label, spec} | tail], is_keyword) when is_list(spec) do
[sweet_xpath | subspec] = spec
result = xmap(parent, tail, is_keyword)
put_in(result[label], xpath(parent, sweet_xpath, subspec))
end
def xmap(parent, [{label, sweet_xpath} | tail], is_keyword) do
result = xmap(parent, tail, is_keyword)
put_in(result[label], xpath(parent, sweet_xpath))
end
@doc """
Tags a `%SweetXpath{}` with `fun`, which is applied to the result at the end of the `xpath` query.
## Examples
iex> import SweetXml
iex> string_to_range = fn str ->
...> [first, last] = str |> String.split("-", trim: true) |> Enum.map(&String.to_integer/1)
...> first..last
...> end
iex> doc = "<weather><zone><name>north</name><wind-speed>5-15</wind-speed></zone></weather>"
iex> doc
...> |> xpath(
...> ~x"//weather/zone"l,
...> name: ~x"//name/text()"s |> transform_by(&String.capitalize/1),
...> wind_speed: ~x"./wind-speed/text()"s |> transform_by(string_to_range)
...> )
[%{name: "North", wind_speed: 5..15}]
"""
def transform_by(%SweetXpath{} = sweet_xpath, fun) when is_function(fun) do
%{sweet_xpath | transform_fun: fun}
end
defp _value(entity) do
cond do
is_record?(entity, :xmlText) ->
xmlText(entity, :value)
is_record?(entity, :xmlComment) ->
xmlComment(entity, :value)
is_record?(entity, :xmlPI) ->
xmlPI(entity, :value)
is_record?(entity, :xmlAttribute) ->
xmlAttribute(entity, :value)
is_record?(entity, :xmlObj) ->
xmlObj(entity, :value)
true ->
entity
end
end
defp is_record?(data, kind) do
is_tuple(data) and tuple_size(data) > 0 and :erlang.element(1, data) == kind
end
defp continuation_opts(enum, waiter \\ nil) do
[
{
:continuation_fun,
fn xcont, xexc, xstate ->
case :xmerl_scan.cont_state(xstate).({:cont, []}) do
{:halted, _acc} ->
xexc.(xstate)
{:suspended, bin, cont} ->
case waiter do
nil ->
:ok
{parent, ref} ->
# continuation behaviour: pause and wait for the stream's decision
send(parent, {:wait, ref})
receive do
# stream continuation fun has been called: parse to find more elements
{:continue, ^ref} ->
:ok
# stream halted: halt the underlying stream and exit parsing process
{:halt, ^ref} ->
cont.({:halt, []})
exit(:normal)
end
end
xcont.(bin, :xmerl_scan.cont_state(cont, xstate))
{:done, _} ->
xexc.(xstate)
end
end,
&Enumerable.reduce(split_by_whitespace(enum), &1, fn bin, _ -> {:suspend, bin} end)
},
{
:close_fun,
# ensure reaching the end of the XML halts the underlying binary stream (in case more bytes remain after the document)
fn xstate ->
:xmerl_scan.cont_state(xstate).({:halt, []})
xstate
end
}
]
end
defp split_by_whitespace(enum) do
reducer = fn
:last, prev ->
{[:erlang.binary_to_list(prev)], :done}
bin, prev ->
bin = if prev === "", do: bin, else: IO.iodata_to_binary([prev, bin])
case split_last_whitespace(bin) do
:white_bin -> {[], bin}
{head, tail} -> {[:erlang.binary_to_list(head)], tail}
end
end
Stream.concat(enum, [:last]) |> Stream.transform("", reducer)
end
defp split_last_whitespace(bin), do: split_last_whitespace(byte_size(bin) - 1, bin)
defp split_last_whitespace(0, _), do: :white_bin
defp split_last_whitespace(size, bin) do
case bin do
<<_::binary-size(size), h>> <> tail when h == ?\s or h == ?\n or h == ?\r or h == ?\t ->
{head, _} = :erlang.split_binary(bin, size + 1)
{head, tail}
_ ->
split_last_whitespace(size - 1, bin)
end
end
defp flush_halt(pid, ref) do
receive do
{:event, ^ref, _} ->
# flush all emitted elems after :halt
flush_halt(pid, ref)
{:wait, ^ref} ->
# tell the continuation function to halt the underlying stream
send(pid, {:halt, ref})
end
end
defp get_current_entities(parent, %SweetXpath{path: path, is_list: true, namespaces: namespaces}) do
:xmerl_xpath.string(path, parent, namespace: namespaces) |> List.wrap()
end
defp get_current_entities(parent, %SweetXpath{
path: path,
is_list: false,
namespaces: namespaces
}) do
ret = :xmerl_xpath.string(path, parent, namespace: namespaces)
if is_record?(ret, :xmlObj) do
ret
else
List.first(ret)
end
end
defp to_cast(value, false, _is_opt?), do: value
defp to_cast(nil, _, true), do: nil
defp to_cast(value, :string, _is_opt?), do: to_string(value)
defp to_cast(value, :integer, _is_opt?), do: String.to_integer(to_string(value))
defp to_cast(value, :float, _is_opt?) do
{float, _} = Float.parse(to_string(value))
float
end
defp to_cast(value, :soft_string, is_opt?) do
if String.Chars.impl_for(value) do
to_string(value)
else
if is_opt?, do: nil, else: ""
end
end
defp to_cast(value, :soft_integer, is_opt?) do
if String.Chars.impl_for(value) do
case Integer.parse(to_string(value)) do
:error -> if is_opt?, do: nil, else: 0
{int, _} -> int
end
else
if is_opt?, do: nil, else: 0
end
end
defp to_cast(value, :soft_float, is_opt?) do
if String.Chars.impl_for(value) do
case Float.parse(to_string(value)) do
:error -> if is_opt?, do: nil, else: 0.0
{float, _} -> float
end
else
if is_opt?, do: nil, else: 0.0
end
end
end
| 30.719187 | 142 | 0.582396 |
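The `xpath`/`xmap`/`transform_by` functions above form SweetXml's public query API. A minimal end-to-end sketch, assuming the `sweet_xml` package is compiled and available (the document contents and variable names are illustrative):

```elixir
# Minimal usage sketch of the API defined above; assumes the sweet_xml
# package is available so `import SweetXml` brings in ~x and friends.
import SweetXml

doc = "<post><title>Hello</title><tags><tag>xml</tag><tag>elixir</tag></tags></post>"

# The "s" modifier casts the match to a string; "l" asks for a list of matches.
title = doc |> xpath(~x"//title/text()"s)

# xmap/2 builds a map of sub-queries; transform_by/2 post-processes a value.
result =
  doc
  |> xmap(
    title: ~x"//title/text()"s |> transform_by(&String.upcase/1),
    tags: ~x"//tags/tag/text()"sl
  )

# title  => "Hello"
# result => %{title: "HELLO", tags: ["xml", "elixir"]}
```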
e8653bd6d2f41f20b6b72138f8416498ba7ddc5a | 5,857 | exs | Elixir | apps/blockchain/test/blockchain/contract/create_contract_test.exs | atoulme/mana | cff3fd96c23feaaeb9fe32df3c0d35ee6dc548a5 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | null | null | null | apps/blockchain/test/blockchain/contract/create_contract_test.exs | atoulme/mana | cff3fd96c23feaaeb9fe32df3c0d35ee6dc548a5 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | null | null | null | apps/blockchain/test/blockchain/contract/create_contract_test.exs | atoulme/mana | cff3fd96c23feaaeb9fe32df3c0d35ee6dc548a5 | [
"ECL-2.0",
"Apache-2.0",
"MIT-0",
"MIT"
] | null | null | null | defmodule Blockchain.Contract.CreateContractTest do
use ExUnit.Case
doctest Blockchain.Contract.CreateContract
alias Blockchain.{Account, Contract}
alias Blockchain.Account.Repo
alias ExthCrypto.Hash.Keccak
alias EVM.{MachineCode, SubState}
alias MerklePatriciaTree.{DB, Trie}
setup do
db = MerklePatriciaTree.Test.random_ets_db(:contract_test)
{:ok, %{db: db}}
end
describe "execute/1" do
test "creates a new contract", %{db: db} do
account = %Account{balance: 11, nonce: 5}
state =
db
|> Trie.new()
|> Account.put_account(<<0x10::160>>, account)
params = %Contract.CreateContract{
account_repo: Repo.new(state),
sender: <<0x10::160>>,
originator: <<0x10::160>>,
available_gas: 100_000_000,
gas_price: 1,
endowment: 5,
init_code: init_code(),
stack_depth: 5,
block_header: %Block.Header{nonce: 1}
}
{_, {account_repo, gas, sub_state, _output}} = Contract.create(params)
state = Repo.commit(account_repo).state
expected_root_hash =
<<9, 235, 32, 146, 153, 242, 209, 192, 224, 61, 214, 174, 48, 24, 148, 28, 51, 254, 7, 82,
58, 82, 220, 157, 29, 159, 203, 51, 52, 240, 37, 122>>
assert state == %Trie{db: {DB.ETS, :contract_test}, root_hash: expected_root_hash}
assert gas == 99_993_576
assert SubState.empty?(sub_state)
addresses = [<<0x10::160>>, Account.Address.new(<<0x10::160>>, 5)]
actual_accounts = Account.get_accounts(state, addresses)
expected_accounts = [
%Account{balance: 6, nonce: 5},
%Account{
balance: 5,
nonce: 0,
code_hash:
<<243, 247, 169, 254, 54, 79, 170, 185, 59, 33, 109, 165, 10, 50, 20, 21, 79, 34, 160,
162, 180, 21, 178, 58, 132, 200, 22, 158, 139, 99, 110, 227>>
}
]
assert actual_accounts == expected_accounts
contract_address = Account.Address.new(<<0x10::160>>, 5)
assert Account.get_machine_code(state, contract_address) == {:ok, <<0x08::256>>}
assert state |> Trie.Inspector.all_keys() |> Enum.count() == 2
end
test "does not create contract if address already exists with nonce", %{db: db} do
account = %Account{balance: 11, nonce: 5}
account_address = <<0x10::160>>
collision_account = %Account{nonce: 1, code_hash: init_code_result_hash()}
collision_account_address = Account.Address.new(account_address, account.nonce)
state =
db
|> Trie.new()
|> Account.put_account(account_address, account)
|> Account.put_account(collision_account_address, collision_account)
params = %Contract.CreateContract{
account_repo: Repo.new(state),
sender: account_address,
originator: account_address,
available_gas: 100_000_000,
gas_price: 1,
endowment: 5,
init_code: init_code(),
stack_depth: 5,
block_header: %Block.Header{nonce: 1}
}
{:error, {account_repo, 0, sub_state, _output}} = Contract.create(params)
assert state == account_repo.state
assert SubState.empty?(sub_state)
end
test "does not create contract if address has code", %{db: db} do
account = %Account{balance: 11, nonce: 5}
account_address = <<0x10::160>>
collision_account = %Account{code_hash: init_code_result_hash(), nonce: 0}
collision_account_address = Account.Address.new(account_address, account.nonce)
state =
db
|> Trie.new()
|> Account.put_account(account_address, account)
|> Account.put_account(collision_account_address, collision_account)
params = %Contract.CreateContract{
account_repo: Repo.new(state),
sender: account_address,
originator: account_address,
available_gas: 100_000_000,
gas_price: 1,
endowment: 5,
init_code: init_code(),
stack_depth: 5,
block_header: %Block.Header{nonce: 1}
}
{:error, {account_repo, 0, sub_state, _output}} = Contract.create(params)
assert state == account_repo.state
assert SubState.empty?(sub_state)
end
test "creates a contract even if the address already has a balance", %{db: db} do
account = %Account{balance: 10, nonce: 2}
contract_account = %Account{balance: 10}
contract_address = Account.Address.new(<<0x10::160>>, account.nonce)
state =
db
|> Trie.new()
|> Account.put_account(<<0x10::160>>, account)
|> Account.put_account(contract_address, contract_account)
params = %Contract.CreateContract{
account_repo: Repo.new(state),
sender: <<0x10::160>>,
originator: <<0x10::160>>,
available_gas: 100_000_000,
gas_price: 1,
endowment: 5,
init_code: init_code(),
stack_depth: 5,
block_header: %Block.Header{nonce: 1}
}
{_, {account_repo, _gas, _sub_state, _output}} = Contract.create(params)
state = Repo.commit(account_repo).state
addresses = [<<0x10::160>>, Account.Address.new(<<0x10::160>>, 2)]
actual_accounts = Account.get_accounts(state, addresses)
expected_accounts = [
%Account{
balance: 5,
nonce: 2
},
%Account{
balance: 15,
code_hash: init_code_result_hash()
}
]
assert actual_accounts == expected_accounts
end
end
defp init_code_result_hash do
Keccak.kec(<<8::256>>)
end
defp init_code do
assembly = [
:push1,
3,
:push1,
5,
:add,
:push1,
0x00,
:mstore,
:push1,
32,
:push1,
0,
:return
]
MachineCode.compile(assembly)
end
end
| 29.580808 | 98 | 0.606112 |
e8654700838dd81a19eb6174881cbedd364e611f | 52 | exs | Elixir | config/config.exs | chungwong/timex | bcd2504119f5c11ada7455d19726b5a49254dabf | [
"MIT"
] | null | null | null | config/config.exs | chungwong/timex | bcd2504119f5c11ada7455d19726b5a49254dabf | [
"MIT"
] | null | null | null | config/config.exs | chungwong/timex | bcd2504119f5c11ada7455d19726b5a49254dabf | [
"MIT"
] | null | null | null | use Mix.Config
config :timex, default_locale: "en"
| 13 | 35 | 0.75 |
e8655642781c68c07b5057d6bca309ed8baf03ec | 273 | exs | Elixir | config/test.exs | boqolo/boqolo | ba9a2ebcf379f5056b3836756179b49163f91d72 | [
"BSD-3-Clause"
] | null | null | null | config/test.exs | boqolo/boqolo | ba9a2ebcf379f5056b3836756179b49163f91d72 | [
"BSD-3-Clause"
] | null | null | null | config/test.exs | boqolo/boqolo | ba9a2ebcf379f5056b3836756179b49163f91d72 | [
"BSD-3-Clause"
] | null | null | null | use Mix.Config
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :rayyan_site, RayyanSiteWeb.Endpoint,
http: [port: 4002],
server: false
# Print only warnings and errors during test
config :logger, level: :warn
| 24.818182 | 56 | 0.74359 |
e8656098b06b52927899547b714c4973176cddde | 400 | exs | Elixir | priv/repo/migrations/20210811165243_create_blogs.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 7 | 2021-07-14T15:45:55.000Z | 2022-01-25T11:13:01.000Z | priv/repo/migrations/20210811165243_create_blogs.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 10 | 2021-08-09T15:54:05.000Z | 2022-02-17T04:18:38.000Z | priv/repo/migrations/20210811165243_create_blogs.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | [
"Apache-2.0"
] | 5 | 2021-07-23T05:54:35.000Z | 2022-01-28T04:14:51.000Z | defmodule Basic.Repo.Migrations.CreateBlogs do
use Ecto.Migration
def change do
create table(:blogs) do
add :post_id, :string
add :user_id, :integer
add :title, :string
add :image, :text
add :tags, :string
add :body, :text
add :likes, :integer
add :views, :integer
add :deleted_at, :naive_datetime
timestamps()
end
end
end
| 19.047619 | 46 | 0.6125 |
e86562a89e7a216bb3dc306d5e7294c3cda5a581 | 2,627 | ex | Elixir | lib/coherence_assent/web/controllers/auth_controller.ex | danschultzer/coherence_assent | 538e7e4aba3146c9bf4ac7798fea4b8a0ff099d5 | [
"Unlicense",
"MIT"
] | 22 | 2017-09-15T17:52:31.000Z | 2018-10-07T02:36:27.000Z | lib/coherence_assent/web/controllers/auth_controller.ex | danschultzer/coherence_oauth2 | 538e7e4aba3146c9bf4ac7798fea4b8a0ff099d5 | [
"Unlicense",
"MIT"
] | 15 | 2017-11-01T15:39:37.000Z | 2019-03-11T18:02:04.000Z | lib/coherence_assent/web/controllers/auth_controller.ex | danschultzer/coherence_oauth2 | 538e7e4aba3146c9bf4ac7798fea4b8a0ff099d5 | [
"Unlicense",
"MIT"
] | 9 | 2017-09-18T20:48:06.000Z | 2018-12-05T15:24:24.000Z | defmodule CoherenceAssent.AuthController do
@moduledoc false
use CoherenceWeb, :controller
alias CoherenceAssent.Callback
alias CoherenceAssent.Controller
import Phoenix.Naming, only: [humanize: 1]
plug Coherence.RequireLogin when action in ~w(delete)a
def index(conn, %{"provider" => provider}) do
config = provider
|> get_config!()
|> Keyword.put(:redirect_uri, redirect_uri(conn, provider))
{:ok, %{conn: conn, url: url}} = call_strategy!(config,
:authorize_url,
[conn, config])
redirect(conn, external: url)
end
def callback(conn, %{"provider" => provider} = params) do
config = get_config!(provider)
params = %{"redirect_uri" => redirect_uri(conn, provider)}
|> Map.merge(params)
config
|> call_strategy!(:callback, [conn, config, params])
|> callback_handler(provider, params)
end
def delete(conn, %{"provider" => provider}) do
conn
|> Coherence.current_user()
|> CoherenceAssent.UserIdentities.delete_identity_from_user(provider)
|> case do
{:ok, _} ->
msg = CoherenceAssent.Messages.backend().authentication_has_been_removed(%{provider: humanize(provider)})
put_flash(conn, :info, msg)
{:error, %{errors: [user: {"needs password", []}]}} ->
msg = CoherenceAssent.Messages.backend().identity_cannot_be_removed_missing_user_password()
put_flash(conn, :error, msg)
end
|> redirect(to: Controller.get_route(conn, :registration_path, :edit))
end
defp redirect_uri(conn, provider) do
Controller.get_route(conn, :coherence_assent_auth_url, :callback, [provider])
end
defp callback_handler({:ok, %{conn: conn, user: user}}, provider, params) do
conn
|> Coherence.current_user()
|> Callback.handler(provider, user)
|> Controller.callback_response(conn, provider, user, params)
end
defp callback_handler({:error, %{error: error}}, _provider, _params),
do: raise error
def get_config!(provider) do
config = provider |> CoherenceAssent.config()
config
|> case do
nil -> nil
list -> Enum.into(list, %{})
end
|> case do
%{strategy: _} -> config
%{} -> raise "No :strategy set for :#{provider} configuration!"
nil -> raise "No provider configuration available for #{provider}."
end
end
defp call_strategy!(config, method, arguments) do
apply(config[:strategy], method, arguments)
end
end
| 32.432099 | 116 | 0.62086 |
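As `get_config!/1` above shows, each provider entry must define a `:strategy` key, otherwise an error is raised. A hypothetical provider configuration satisfying that contract (the key layout is inferred from `CoherenceAssent.config/1`; the module name and credential values are placeholders, not real settings):

```elixir
# Hypothetical config/config.exs entry; :strategy is mandatory or
# get_config!/1 raises "No :strategy set for ... configuration!".
# Provider name, strategy module, and credentials are placeholders.
config :coherence_assent, :providers,
  github: [
    client_id: System.get_env("GITHUB_CLIENT_ID"),
    client_secret: System.get_env("GITHUB_CLIENT_SECRET"),
    strategy: CoherenceAssent.Strategy.Github
  ]
```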
e865b9c077a8d7f035920f651a60d714cc7e070e | 818 | exs | Elixir | blinky/config/config.exs | rob-brown/ElixirTraining2018 | 9724cc4961d767a1ba2450240e026b46ad5a0f1b | [
"MIT"
] | 2 | 2018-02-01T22:56:09.000Z | 2020-01-20T19:57:48.000Z | blinky/config/config.exs | rob-brown/ElixirTraining2018 | 9724cc4961d767a1ba2450240e026b46ad5a0f1b | [
"MIT"
] | null | null | null | blinky/config/config.exs | rob-brown/ElixirTraining2018 | 9724cc4961d767a1ba2450240e026b46ad5a0f1b | [
"MIT"
] | null | null | null | use Mix.Config
# Use shoehorn to start the main application. See the shoehorn
# docs for separating out critical OTP applications such as those
# involved with firmware updates.
config :shoehorn,
init: [:nerves_runtime, :nerves_network],
app: Mix.Project.config()[:app]
config :nerves_network,
regulatory_domain: "US"
key_mgmt = System.get_env("NERVES_NETWORK_KEY_MGMT") || "NONE"
config :nerves_network, :default,
wlan0: [
ssid: System.get_env("NERVES_NETWORK_SSID") || "Tracy",
psk: System.get_env("NERVES_NETWORK_PSK") || "",
key_mgmt: String.to_atom(key_mgmt)
],
eth0: [
ipv4_address_method: :dhcp
]
# Import target specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.Project.config[:target]}.exs"
| 29.214286 | 65 | 0.734719 |
e865dffb26c3095fb54f518f2b40aae570f691e7 | 449 | exs | Elixir | throwaway/hello/config/prod.secret.exs | BryanJBryce/programming-phoenix | ced3b5c383ada2575237d377430ef0a2743da0a2 | [
"MIT"
] | null | null | null | throwaway/hello/config/prod.secret.exs | BryanJBryce/programming-phoenix | ced3b5c383ada2575237d377430ef0a2743da0a2 | [
"MIT"
] | null | null | null | throwaway/hello/config/prod.secret.exs | BryanJBryce/programming-phoenix | ced3b5c383ada2575237d377430ef0a2743da0a2 | [
"MIT"
] | null | null | null | use Mix.Config
# In this file, we keep production configuration that
# you likely want to automate and keep it away from
# your version control system.
config :hello, Hello.Endpoint,
secret_key_base: "DRMNKXMxDAafM8SS7JrhU82VOtMTwEVzS/vtQHuku1RAFj/CX/Q3e+0YpHXAaR3B"
# Configure your database
config :hello, Hello.Repo,
adapter: Ecto.Adapters.Postgres,
username: "postgres",
password: "postgres",
database: "hello_prod",
pool_size: 20
| 28.0625 | 85 | 0.772829 |
e8661f2ef25b0123807bd878564e65b5a6b8cf1c | 1,322 | exs | Elixir | test/event_test.exs | tinfoil/poxa | a1c3501ecf1df9873a5c7018dc7da0abe4556cec | [
"MIT"
] | null | null | null | test/event_test.exs | tinfoil/poxa | a1c3501ecf1df9873a5c7018dc7da0abe4556cec | [
"MIT"
] | null | null | null | test/event_test.exs | tinfoil/poxa | a1c3501ecf1df9873a5c7018dc7da0abe4556cec | [
"MIT"
] | null | null | null | defmodule Poxa.EventTest do
use ExUnit.Case, async: true
import Mimic
alias Poxa.Event
alias Poxa.SocketId
setup :verify_on_exit!
setup do
stub(SocketId)
{:ok, _} = GenEvent.start_link(name: Poxa.Event)
:ok
end
defmodule TestHandler do
use GenEvent
def handle_event(%{event: :failed_handling}, _pid) do
:error
end
def handle_event(event_data, pid) do
send pid, event_data
{:ok, pid}
end
end
test "notifies handler with a socket_id" do
:ok = Event.add_handler({TestHandler, self()}, self())
:ok = Event.notify(:successful_handling, %{socket_id: :socket_id, data: :map})
assert_receive %{event: :successful_handling,
data: :map,
socket_id: :socket_id}
end
test "notifies handler without a socket_id" do
:ok = Event.add_handler({TestHandler, self()}, self())
expect(SocketId, :mine, fn -> :socket_id end)
:ok = Event.notify(:successful_handling, %{data: :map})
assert_receive %{event: :successful_handling, data: :map, socket_id: :socket_id}
end
test "handler failures are notified" do
:ok = Event.add_handler({TestHandler, self()}, self())
:ok = Event.notify(:failed_handling, %{socket_id: :socket_id})
assert_receive {:gen_event_EXIT, _, _}
end
end
| 25.921569 | 84 | 0.652799 |
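The `TestHandler` in the suite above illustrates the `GenEvent` contract that `Poxa.Event` relies on: `handle_event/2` receives the event map plus the installed handler state. A standalone sketch of the same pattern (module and message names are illustrative, and note that `GenEvent` is deprecated in current Elixir releases):

```elixir
# Minimal GenEvent handler mirroring the TestHandler pattern above.
# GenEvent is deprecated in recent Elixir versions; this only sketches
# the contract the tests exercise. Names are illustrative.
defmodule AuditHandler do
  use GenEvent

  # Forward every event map to the subscriber pid kept as handler state.
  def handle_event(event_data, subscriber) do
    send(subscriber, {:audited, event_data})
    {:ok, subscriber}
  end
end

# Example wiring:
#   {:ok, manager} = GenEvent.start_link()
#   :ok = GenEvent.add_handler(manager, AuditHandler, self())
#   GenEvent.notify(manager, %{event: :ping})
#   # the calling process then receives {:audited, %{event: :ping}}
```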
e86622b8b3046eecabeb9a167bfca2ae0b44be26 | 1,646 | ex | Elixir | lib/live_sup_web/live/setup/components/dashboard_component.ex | livesup-dev/livesup | eaf9ffc78d3043bd9e3408f0f4df26ed16eb8446 | [
"Apache-2.0",
"MIT"
] | null | null | null | lib/live_sup_web/live/setup/components/dashboard_component.ex | livesup-dev/livesup | eaf9ffc78d3043bd9e3408f0f4df26ed16eb8446 | [
"Apache-2.0",
"MIT"
] | 3 | 2022-02-23T15:51:48.000Z | 2022-03-14T22:52:43.000Z | lib/live_sup_web/live/setup/components/dashboard_component.ex | livesup-dev/livesup | eaf9ffc78d3043bd9e3408f0f4df26ed16eb8446 | [
"Apache-2.0",
"MIT"
] | null | null | null | defmodule LiveSupWeb.SetupLive.Components.DashboardComponent do
use LiveSupWeb, :live_component
alias LiveSup.Core.Projects
@impl true
def update(%{project: project} = assigns, socket) do
changeset = Projects.change(project)
{:ok,
socket
|> assign(assigns)
|> assign(:changeset, changeset)}
end
@impl true
def render(assigns) do
~H"""
<div>
<h3 class="text-xl text-gray-600 dark:text-gray-200 text-center">Create a project</h3>
<.form
let={f}
for={@changeset}
id="project-form"
phx-target={@myself}
phx-change="validate"
phx-submit="save">
<%= label f, :name %>
<%= text_input f, :name %>
<%= error_tag f, :name %>
<div>
<%= submit "Save", phx_disable_with: "Saving..." %>
</div>
</.form>
</div>
"""
end
def handle_event("save", %{"project" => project_params}, socket) do
save_project(socket, :new, project_params)
end
@impl true
def handle_event("validate", %{"project" => project_params}, socket) do
changeset =
socket.assigns.project
|> Projects.change(project_params)
|> Map.put(:action, :validate)
{:noreply, assign(socket, :changeset, changeset)}
end
defp save_project(socket, :new, project_params) do
case Projects.create(project_params) do
{:ok, _project} ->
{:noreply,
socket
|> put_flash(:info, "Project created successfully")}
{:error, %Ecto.Changeset{} = changeset} ->
{:noreply, assign(socket, changeset: changeset)}
end
end
end
| 24.567164 | 92 | 0.586877 |
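A live component like the one above is mounted from a parent LiveView template via `live_component`; a hypothetical call site follows (the id and assign names are illustrative, and the parent must supply a `:project` struct compatible with `Projects.change/1`):

```elixir
# Hypothetical HEEx snippet in a parent LiveView template.
# The id and the @project assign are illustrative.
<.live_component
  module={LiveSupWeb.SetupLive.Components.DashboardComponent}
  id="setup-dashboard"
  project={@project}
/>
```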
e86655788cd800c7b7f51ba3ec0fad7bb647c09d | 2,966 | exs | Elixir | test/ecto/query/builder/join_test.exs | christhekeele/ecto | 8d4a9b4e3230bf98694ed652972bd127ac2f7460 | [
"Apache-2.0"
] | null | null | null | test/ecto/query/builder/join_test.exs | christhekeele/ecto | 8d4a9b4e3230bf98694ed652972bd127ac2f7460 | [
"Apache-2.0"
] | null | null | null | test/ecto/query/builder/join_test.exs | christhekeele/ecto | 8d4a9b4e3230bf98694ed652972bd127ac2f7460 | [
"Apache-2.0"
] | null | null | null | defmodule Ecto.Query.Builder.JoinTest do
use ExUnit.Case, async: true
import Ecto.Query.Builder.Join
doctest Ecto.Query.Builder.Join
import Ecto.Query
defmacro join_macro(left, right) do
quote do
fragment("? <> ?", unquote(left), unquote(right))
end
end
test "expands macros as sources" do
left = "left"
right = "right"
assert %{joins: [_]} = join("posts", :inner, [p], c in join_macro(^left, ^right), true)
end
test "accepts keywords on :on" do
assert %{joins: [join]} =
join("posts", :inner, [p], c in "comments", [post_id: p.id, public: true])
assert Macro.to_string(join.on.expr) ==
"&1.post_id() == &0.id() and &1.public() == %Ecto.Query.Tagged{tag: nil, type: {1, :public}, value: true}"
assert join.on.params == []
end
test "accepts queries on interpolation" do
qual = :left
source = "comments"
assert %{joins: [%{source: {"comments", nil}}]} =
join("posts", qual, [p], c in ^source, true)
qual = :right
source = Comment
assert %{joins: [%{source: {nil, Comment}}]} =
join("posts", qual, [p], c in ^source, true)
qual = :right
source = {"user_comments", Comment}
assert %{joins: [%{source: {"user_comments", Comment}}]} =
join("posts", qual, [p], c in ^source, true)
qual = :inner
source = from c in "comments", where: c.public
assert %{joins: [%{source: %Ecto.Query{from: {"comments", nil}}}]} =
join("posts", qual, [p], c in ^source, true)
end
test "accepts interpolation on :on" do
assert %{joins: [join]} =
join("posts", :inner, [p], c in "comments", ^[post_id: 1, public: true])
assert Macro.to_string(join.on.expr) == "&1.post_id() == ^0 and &1.public() == ^1"
assert join.on.params == [{1, {1, :post_id}}, {true, {1, :public}}]
dynamic = dynamic([p, c], c.post_id == p.id and c.public == ^true)
assert %{joins: [join]} =
join("posts", :inner, [p], c in "comments", ^dynamic)
assert Macro.to_string(join.on.expr) == "&1.post_id() == &0.id() and &1.public() == ^0"
assert join.on.params == [{true, {1, :public}}]
end
test "accepts interpolation on assoc/2 field" do
assoc = :comments
join("posts", :left, [p], c in assoc(p, ^assoc), true)
end
test "raises on invalid qualifier" do
assert_raise ArgumentError,
~r/invalid join qualifier `:whatever`/, fn ->
qual = :whatever
join("posts", qual, [p], c in "comments", true)
end
end
test "raises on invalid interpolation" do
assert_raise Protocol.UndefinedError, fn ->
source = 123
join("posts", :left, [p], c in ^source, true)
end
end
test "raises on invalid assoc/2" do
assert_raise Ecto.Query.CompileError,
~r/you passed the variable \`field_var\` to \`assoc\/2\`/, fn ->
escape(quote do assoc(join_var, field_var) end, nil, nil)
end
end
end
| 32.593407 | 117 | 0.584626 |
e8668a7b7abdef1b1ba8599aec3b9bdd145f8681 | 6,540 | exs | Elixir | spec/narou_query_spec.exs | harukikubota/narou_wrapper | d8b4279ca57760f44bc3da8fbb70e2927a987809 | [
"MIT"
] | null | null | null | spec/narou_query_spec.exs | harukikubota/narou_wrapper | d8b4279ca57760f44bc3da8fbb70e2927a987809 | [
"MIT"
] | 2 | 2021-01-15T17:14:26.000Z | 2021-01-16T12:58:06.000Z | spec/narou_query_spec.exs | harukikubota/narou | d8b4279ca57760f44bc3da8fbb70e2927a987809 | [
"MIT"
] | null | null | null | defmodule NarouQuerySpec do
use ESpec
use Narou.Query
describe "from" do
context "from good_param default" do
subject do: from :novel
it do: is_expected() |> to(have type: :novel)
end
context "from good_param with initialize option" do
subject do: from :novel, limit: 10, st: 10
it do: is_expected() |> to(have limit: 10)
it do: is_expected() |> to(have st: 10)
end
context "from good_param with query" do
subject do: from :novel, select: [:t], where: [ncode: "n2267be"], order: :old
it do: is_expected() |> to(have select: [:t])
it do: is_expected() |> to(have where: %{ncode: "n2267be"})
it do: is_expected() |> to(have order: :old)
end
context "from extend query" do
let! :base_query, do: from(:novel)
let! :extended_query, do: from(base_query(), limit: 10, select: :t)
it do: expect extended_query() |> to(have type: :novel)
it do: expect extended_query() |> to(have limit: 10)
it do: expect extended_query() |> to(have select: [:t])
end
end
describe "select" do
context "novel good_param" do
let! :s, do: from :novel
let :q_one, do: s() |> select(:t)
let :q_many, do: q_one() |> select([:w, :n])
it do: expect q_one() |> to(have select: [:t])
it do: expect q_many() |> to(have select: [:t, :w, :n])
end
context "novel bad_param" do
let! :s, do: from :novel
let :q_str, do: s() |> select("hoge")
let :q_atm, do: s() |> select(Hoge)
it do: expect q_str() |> to(be_error_result())
it do: expect q_atm() |> to(be_error_result())
end
context "user good_param" do
let! :s, do: from :user
let :q_one, do: s() |> select(:n)
let :q_many, do: q_one() |> select([:u, :y])
it do: expect q_one() |> to(have select: [:n])
it do: expect q_many() |> to(have select: [:n, :u, :y])
end
context "user bad_param" do
let! :s, do: from :user
let :q_str, do: s() |> select("hoge")
let :q_atm, do: s() |> select(Hoge)
it do: expect q_str() |> to(be_error_result())
it do: expect q_atm() |> to(be_error_result())
end
context "not_suppert types" do
subject do: from(:rank) |> select(:a)
it do: is_expected() |> to(match_pattern {:error, "The API type :rank does not support `select` queries."})
end
end
describe "where" do
context "novel good_param" do
let! :s, do: from :novel
let :q_one, do: s() |> where(ncode: "n2267be")
let :q_many, do: q_one() |> where(userid: 1, genre: 1)
it do: expect q_one() |> Map.get(:where) |> to(have ncode: "n2267be")
it do: expect q_many() |> Map.get(:where) |> to(have userid: 1)
it do: expect q_many() |> Map.get(:where) |> to(have genre: 1)
end
# context "novel bad_param" , do: # pass
context "rank good_param" do
let :s, do: from :rank
let :q_date, do: s() |> where(y: 2020, m: 12, d: 31)
it do: expect q_date() |> Map.get(:where) |> to(have y: 2020)
it do: expect q_date() |> Map.get(:where) |> to(have m: 12)
it do: expect q_date() |> Map.get(:where) |> to(have d: 31)
it do: expect s() |> where(t: :d) |> Map.get(:where) |> to(have t: :d)
it do: expect s() |> where(t: :w) |> Map.get(:where) |> to(have t: :w)
it do: expect s() |> where(t: :m) |> Map.get(:where) |> to(have t: :m)
it do: expect s() |> where(t: :q) |> Map.get(:where) |> to(have t: :q)
end
context "rank bad_param" do
let! :s, do: from :rank
let :bad_date, do: s() |> where(y: 2020, m: 2, d: 31)
let :bad_type, do: s() |> where(t: :hoge)
let :bad_key, do: s() |> where(hoge: :hoge)
it do: expect bad_date() |> to(match_pattern {:error, [{:error, :where, :by, "invalid date."}]})
it do: expect bad_type() |> to(match_pattern {:error, [{:error, :where, :by, "must be one of [:d, :w, :m, :q]"}]})
it do: expect bad_key() |> to(match_pattern {:error, [{:error, :where, :by, "Unexpected keys [:hoge]"}]})
end
context "rankin good_param" do
let! :s, do: from :rankin
let :large, do: s() |> where(ncode: "N1234ABC")
let :small, do: s() |> where(ncode: "n0956z")
it do: expect large() |> Map.get(:where) |> to(have ncode: "N1234ABC")
it do: expect small() |> Map.get(:where) |> to(have ncode: "n0956z")
end
context "rankin bad_param" do
subject do: from(:rankin) |> where(ncode: "NHOGEHOGE")
it do: is_expected() |> to(match_pattern {:error, [{:error, :where, :by, "invalid NCode `NHOGEHOGE`."}]})
end
context "user good_param" do
let! :s, do: from :user
let :q_one, do: s() |> where(userid: 123456)
let :q_many, do: q_one() |> where(name1st: "あ", minnovel: 1)
it do: expect q_one() |> Map.get(:where) |> to(have userid: 123456)
it do: expect q_many() |> Map.get(:where) |> to(have name1st: "あ")
it do: expect q_many() |> Map.get(:where) |> to(have minnovel: 1)
end
# context "user bad_param" , do: # pass
end
describe "order" do
context "novel good_param" do
let! :s, do: from :novel
let :q_order, do: s() |> order(:hyoka)
let :overrided, do: s() |> order(:old)
it do: expect q_order() |> to(have order: :hyoka)
it do: expect overrided() |> to(have order: :old)
end
context "novel bad_param" do
let! :s, do: from :novel
let :bad_key, do: s() |> order(:hoge)
let :bad_val, do: s() |> order("hoge")
it do: expect bad_key() |> to(be_error_result())
it do: expect bad_val() |> to(be_error_result())
end
context "user good_param" do
let! :s, do: from :user
let :q_order, do: s() |> order(:novelcnt)
let :overrided, do: s() |> order(:old)
it do: expect q_order() |> to(have order: :novelcnt)
it do: expect overrided() |> to(have order: :old)
end
context "user bad_param" do
let! :s, do: from :user
let :bad_key, do: s() |> order(:hoge)
let :bad_val, do: s() |> order("hoge")
it do: expect bad_key() |> to(be_error_result())
it do: expect bad_val() |> to(be_error_result())
end
context "not_suppert types" do
subject do: from(:rank) |> order(:not)
it do: is_expected() |> to(match_pattern {:error, "The API type :rank does not support `order` queries."})
end
end
end
# ---- lib/redix_clustered/options.ex (repo: PRX/redix-clustered, license: MIT) ----
defmodule RedixClustered.Options do
@namespace "namespace"
@pool_size "pool_size"
@default_pool_size 1
@redix_opts "redix_opts"
@redix_keys [:host, :port, :username, :password, :timeout]
@redix_request_opts "redix_request_opts"
@redix_request_key :request_opts
def init(opts) do
cluster_name = cluster_name(opts)
:ets.new(cluster_name, [:set, :protected, :named_table])
:ets.insert(cluster_name, {@namespace, get_non_blank(opts, :namespace)})
:ets.insert(cluster_name, {@pool_size, get_number(opts, :pool_size, @default_pool_size)})
:ets.insert(cluster_name, {@redix_opts, take_non_blank(opts, @redix_keys)})
:ets.insert(cluster_name, {@redix_request_opts, Keyword.get(opts, @redix_request_key, [])})
end
def cluster_name(opts) when is_list(opts), do: cluster_name(Keyword.get(opts, :name))
def cluster_name(nil), do: :redix_clustered
def cluster_name(name), do: :"redix_clustered_#{name}"
def clone_cluster_name(o) when is_list(o), do: clone_cluster_name(Keyword.get(o, :name))
def clone_cluster_name(nil), do: :clone
def clone_cluster_name(name), do: :"#{name}_clone"
def registry_name(opts), do: :"#{cluster_name(opts)}_registry"
def slots_name(opts), do: :"#{cluster_name(opts)}_slots"
def pool_name(opts), do: :"#{cluster_name(opts)}_pool"
def get(name, key) do
case :ets.lookup(cluster_name(name), key) do
[{_key, val}] -> val
_ -> nil
end
end
def namespace(name), do: get(name, @namespace)
def pool_size(name), do: get(name, @pool_size)
def redix_opts(name), do: get(name, @redix_opts)
def redix_request_opts(name), do: get(name, @redix_request_opts)
defp get_non_blank(opts, key) do
case Keyword.get(opts, key) do
"" -> nil
val -> val
end
end
defp get_number(opts, key, default) do
case Keyword.get(opts, key, default) do
"" -> default
"" <> str -> String.to_integer(str)
num -> num
end
end
defp take_non_blank(opts, keys) do
opts
|> Keyword.take(keys)
|> Enum.reject(fn {_key, val} ->
is_nil(val) || val == ""
end)
end
end
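The private helpers above do the option normalization: numeric options may arrive as integers or env-var strings, and blank connection options are dropped before being handed to Redix. A standalone sketch of that behavior (`OptsSketch` is a made-up module name for illustration):

```elixir
defmodule OptsSketch do
  # mirrors get_number/3: "" falls back to the default, strings are parsed
  def get_number(opts, key, default) do
    case Keyword.get(opts, key, default) do
      "" -> default
      "" <> str -> String.to_integer(str)
      num -> num
    end
  end

  # mirrors take_non_blank/2: keep only the wanted keys with usable values
  def take_non_blank(opts, keys) do
    opts
    |> Keyword.take(keys)
    |> Enum.reject(fn {_key, val} -> is_nil(val) || val == "" end)
  end
end

5 = OptsSketch.get_number([pool_size: "5"], :pool_size, 1)
1 = OptsSketch.get_number([pool_size: ""], :pool_size, 1)

[host: "localhost"] =
  OptsSketch.take_non_blank([host: "localhost", password: nil], [:host, :password])
```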
# ---- clients/o_auth2/mix.exs (repo: kaaboaye/elixir-google-api, license: Apache-2.0) ----
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.OAuth2.Mixfile do
use Mix.Project
@version "0.10.0"
def project() do
[
app: :google_api_o_auth2,
version: @version,
elixir: "~> 1.6",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
description: description(),
package: package(),
deps: deps(),
source_url: "https://github.com/googleapis/elixir-google-api/tree/master/clients/o_auth2"
]
end
def application() do
[extra_applications: [:logger]]
end
defp deps() do
[
{:google_gax, "~> 0.2"},
{:ex_doc, "~> 0.16", only: :dev}
]
end
defp description() do
"""
Google OAuth2 API client library. Obtains end-user authorization grants for use with other Google APIs.
"""
end
defp package() do
[
files: ["lib", "mix.exs", "README*", "LICENSE"],
maintainers: ["Jeff Ching", "Daniel Azuma"],
licenses: ["Apache 2.0"],
links: %{
"GitHub" => "https://github.com/googleapis/elixir-google-api/tree/master/clients/o_auth2",
"Homepage" => "https://developers.google.com/accounts/docs/OAuth2"
}
]
end
end
# ---- lib/supabase_surface/components/icons/icon_shopping_cart.ex (repo: treebee/supabase-surface, license: Apache-2.0) ----
defmodule SupabaseSurface.Components.Icons.IconShoppingCart do
use SupabaseSurface.Components.Icon
@impl true
def render(assigns) do
icon_size = IconContainer.get_size(assigns.size)
~F"""
<IconContainer assigns={assigns}>
{Feathericons.shopping_cart(width: icon_size, height: icon_size)}
</IconContainer>
"""
end
end
# ---- lib/exbee/adapters/adapter.ex (repo: msm/exbee, license: Apache-2.0) ----
defmodule Exbee.Adapter do
@moduledoc """
Interface to a serial connection.
"""
@type adapter_option ::
{:speed, non_neg_integer}
| {:data_bits, 5..8}
| {:stop_bits, 1..2}
| {:parity, :none | :even | :odd | :space | :mark}
| {:flow_control, :none | :hardware | :software}
@callback enumerate() :: map
@callback start_link() :: {:ok, pid} | {:error, term}
@callback open(pid, binary, [adapter_option]) :: :ok | {:error, term}
@callback write(pid, binary | [byte]) :: :ok | {:error, term}
@callback stop(pid) :: :ok | {:error, term}
end
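Since the behaviour above only specifies callback shapes, a fake in-memory adapter is enough to exercise it. A hedged sketch: `FakeAdapter` and the trimmed behaviour copy below are made up for illustration; the real library ships hardware-backed adapters.

```elixir
defmodule SerialAdapter do
  # trimmed copy of the contract, so the sketch is self-contained
  @callback start_link() :: {:ok, pid} | {:error, term}
  @callback open(pid, binary, keyword) :: :ok | {:error, term}
  @callback write(pid, binary | [byte]) :: :ok | {:error, term}
  @callback stop(pid) :: :ok | {:error, term}
end

defmodule FakeAdapter do
  @behaviour SerialAdapter

  @impl true
  def start_link, do: Agent.start_link(fn -> %{device: nil, buffer: <<>>} end)

  @impl true
  def open(pid, device, _opts), do: Agent.update(pid, &%{&1 | device: device})

  @impl true
  def write(pid, data) when is_binary(data),
    do: Agent.update(pid, &%{&1 | buffer: &1.buffer <> data})

  def write(pid, data) when is_list(data), do: write(pid, :binary.list_to_bin(data))

  @impl true
  def stop(pid), do: Agent.stop(pid)

  # inspection helper for the example, not part of the behaviour
  def buffer(pid), do: Agent.get(pid, & &1.buffer)
end

{:ok, pid} = FakeAdapter.start_link()
:ok = FakeAdapter.open(pid, "ttyUSB0", speed: 9600)
:ok = FakeAdapter.write(pid, <<0x7E, 0x00>>)
IO.inspect(FakeAdapter.buffer(pid)) # prints <<126, 0>>
:ok = FakeAdapter.stop(pid)
```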
# ---- http_practice/mix.exs (repo: TheBrockstar/elixir-projects, license: MIT) ----
defmodule HttpPractice.MixProject do
use Mix.Project
def project do
[
app: :http_practice,
version: "0.1.0",
elixir: "~> 1.8",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
# Run "mix help deps" to learn about dependencies.
defp deps do
[
# {:dep_from_hexpm, "~> 0.3.0"},
# {:dep_from_git, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"}
{:floki, "~> 0.11.0"}
]
end
end
# ---- test/parser_test.exs (repo: tera-insights/angularjs-template-migrate, license: Apache-2.0) ----
defmodule ParserTest do
use ExUnit.Case
doctest Parser
test "greets the world" do
assert Parser.hello() == :world
end
end
# ---- clients/bigtable_admin/lib/google_api/bigtable_admin/v2/metadata.ex (repo: renovate-bot/elixir-google-api, license: Apache-2.0) ----
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.BigtableAdmin.V2 do
@moduledoc """
API client metadata for GoogleApi.BigtableAdmin.V2.
"""
@discovery_revision "20211202"
def discovery_revision(), do: @discovery_revision
end
# ---- lib/temple/parser/temple_namespace_void.ex (repo: mhanberg/dsl, license: MIT) ----
defmodule Temple.Parser.TempleNamespaceVoid do
@moduledoc false
@behaviour Temple.Parser
@impl Temple.Parser
def applicable?({{:., _, [{:__aliases__, _, [:Temple]}, name]}, _meta, _args}) do
name in Temple.Parser.void_elements_aliases()
end
def applicable?(_), do: false
@impl Temple.Parser
def run({name, meta, args}) do
# rebinds `name`: this match extracts the bare element atom from the qualified `Temple.<element>` call
{:., _, [{:__aliases__, _, [:Temple]}, name]} = name
Temple.Parser.VoidElementsAliases.run({name, meta, args})
end
end
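`applicable?/1` above pattern-matches the AST of a qualified `Temple.<element>` call. A quick standalone look at that AST shape (`:img` is just an example element, not necessarily one of the library's void elements):

```elixir
ast = quote do: Temple.img(src: "a.png")

# A qualified call quotes to a dot-call tuple: the left side is the
# `Temple` alias, the right side is the bare element atom.
{{:., _, [{:__aliases__, _, [:Temple]}, name]}, _meta, _args} = ast
IO.inspect(name) # prints :img
```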
# ---- exercises/concept/wine-cellar/test/test_helper.exs (repo: devtayls/elixir, license: MIT) ----
ExUnit.start()
ExUnit.configure(exclude: :pending, trace: true, seed: 0)
# ---- broker/lib/broker_web/views/api_view.ex (repo: mikehelmick/broker-prototype, license: Apache-2.0) ----
defmodule BrokerWeb.ApiView do
use BrokerWeb, :view
def render("add_type.json", %{type: type}) do
%{type: type}
end
def render("list_types.json", %{types: types}) do
types
end
def render("emit.json", %{event: event}) do
%{id: CloudEvent.id(event)}
end
def render("set_trigger.json", %{trigger: trigger}) do
trigger
end
def render("list_triggers.json", %{triggers: triggers}) do
triggers
end
end
# ---- lib/supabase_surface/components/icons/icon_bell.ex (repo: treebee/supabase-surface, license: Apache-2.0) ----
defmodule SupabaseSurface.Components.Icons.IconBell do
use SupabaseSurface.Components.Icon
@impl true
def render(assigns) do
icon_size = IconContainer.get_size(assigns.size)
~F"""
<IconContainer assigns={assigns}>
{Feathericons.bell(width: icon_size, height: icon_size)}
</IconContainer>
"""
end
end
# ---- microservice/test/controllers/v1/contact_controller_test.exs (repo: eduardobarbiero/microservice-contacts, license: MIT) ----
defmodule Microservice.V1.ContactControllerTest do
use Microservice.ConnCase
alias Microservice.Contact
@valid_attrs %{email: "some content", fone: "some content", name: "some content"}
@invalid_attrs %{}
setup do
conn = conn() |> put_req_header("accept", "application/json")
{:ok, conn: conn}
end
test "lists all entries on index", %{conn: conn} do
conn = get conn, contact_path(conn, :index)
assert json_response(conn, 200)["data"] == []
end
test "shows chosen resource", %{conn: conn} do
contact = Repo.insert! %Contact{}
conn = get conn, contact_path(conn, :show, contact)
assert json_response(conn, 200)["data"] == %{"id" => contact.id,
"name" => contact.name,
"fone" => contact.fone,
"email" => contact.email}
end
test "does not show resource and instead throw error when id is nonexistent", %{conn: conn} do
assert_raise Ecto.NoResultsError, fn ->
get conn, contact_path(conn, :show, -1)
end
end
test "creates and renders resource when data is valid", %{conn: conn} do
conn = post conn, contact_path(conn, :create), contact: @valid_attrs
assert json_response(conn, 201)["data"]["id"]
assert Repo.get_by(Contact, @valid_attrs)
end
test "does not create resource and renders errors when data is invalid", %{conn: conn} do
conn = post conn, contact_path(conn, :create), contact: @invalid_attrs
assert json_response(conn, 422)["errors"] != %{}
end
test "updates and renders chosen resource when data is valid", %{conn: conn} do
contact = Repo.insert! %Contact{}
conn = put conn, contact_path(conn, :update, contact), contact: @valid_attrs
assert json_response(conn, 200)["data"]["id"]
assert Repo.get_by(Contact, @valid_attrs)
end
test "does not update chosen resource and renders errors when data is invalid", %{conn: conn} do
contact = Repo.insert! %Contact{}
conn = put conn, contact_path(conn, :update, contact), contact: @invalid_attrs
assert json_response(conn, 422)["errors"] != %{}
end
test "deletes chosen resource", %{conn: conn} do
contact = Repo.insert! %Contact{}
conn = delete conn, contact_path(conn, :delete, contact)
assert response(conn, 204)
refute Repo.get(Contact, contact.id)
end
end
# ---- mix.exs (repo: aforward/gen_template_library, license: MIT) ----
defmodule GenTemplateLibrary.Mixfile do
use Mix.Project
@name :gen_template_library
@version "0.1.1"
@deps [
{:mix_templates, ">0.0.0", app: false},
{:version_tasks, "~> 0.11"},
{:ex_doc, ">0.0.0", only: [:dev, :test]}
]
@maintainers ["Andrew Forward <aforward@gmail.com>"]
@github "https://github.com/aforward/#{@name}"
@home @github
@description """
A template for building Elixir libraries (projects without any state, just functions).
This project is an extremely pared-down version of `mix new «project_name»`
based on Dave Thomas' view of simplifying our Elixir code. Let's see
how this plays out.
"""
@docs [
main: "GenTemplateLibrary",
extras: ["README.md"]
]
@aliases []
# ------------------------------------------------------------
def project do
in_production = Mix.env() == :prod
[
app: @name,
version: @version,
deps: @deps,
aliases: @aliases,
elixir: "~> 1.6",
package: package(),
source_url: @github,
homepage_url: @home,
docs: @docs,
description: @description,
build_embedded: in_production,
start_permanent: in_production
]
end
defp package do
[
name: @name,
files: [
"lib",
"mix.exs",
"README.md",
"LICENSE.md",
"template",
"template/$PROJECT_NAME$/.gitignore",
"template/$PROJECT_NAME$/.formatter.exs"
],
maintainers: @maintainers,
licenses: ["Apache 2.0"],
links: %{
"GitHub" => @github
}
# extra: %{ "type" => "a_template_for_mix_gen" },
]
end
end
# ---- lib/oli/delivery/paywall.ex (repo: candert1/oli-torus, license: MIT) ----
defmodule Oli.Delivery.Paywall do
import Ecto.Query, warn: false
require Logger
alias Oli.Repo
alias Oli.Accounts.User
alias Oli.Delivery.Paywall.Payment
alias Oli.Delivery.Paywall.Discount
alias Oli.Delivery.Sections.Section
alias Oli.Delivery.Sections
alias Oli.Delivery.Sections.Enrollment
alias Oli.Delivery.Sections.Blueprint
alias Oli.Institutions.Institution
@maximum_batch_size 500
@doc """
Determines if a user can access a course section, taking into account paywall settings.
"""
def can_access?(_, %Section{requires_payment: false}), do: true
def can_access?(%User{id: id} = user, %Section{slug: slug, requires_payment: true} = section) do
if Sections.is_instructor?(user, slug) or Sections.is_admin?(user, slug) do
true
else
enrollment = Sections.get_enrollment(slug, id)
# A student can access a paywalled section if the following two conditions hold:
# 1. They are enrolled in the section
# 2. They have either made a payment OR they are within the grace period (if there is one)
!is_nil(enrollment) and (has_paid?(enrollment) or within_grace_period?(enrollment, section))
end
end
defp has_paid?(%Enrollment{id: id}) do
query =
from(
p in Payment,
where: p.enrollment_id == ^id,
limit: 1,
select: p
)
case Repo.all(query) do
[] -> false
_ -> true
end
end
defp within_grace_period?(_, %Section{has_grace_period: false}), do: false
defp within_grace_period?(%Enrollment{inserted_at: inserted_at}, %Section{
grace_period_days: days,
grace_period_strategy: strategy,
start_date: start_date
}) do
case strategy do
:relative_to_section ->
case start_date do
nil -> false
_ -> Date.compare(Date.utc_today(), Date.add(start_date, days)) == :lt
end
:relative_to_student ->
Date.compare(Date.utc_today(), Date.add(inserted_at, days)) == :lt
end
end
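Both strategies above reduce to "is today still before the window's end date?". A standalone sketch of the date arithmetic with made-up values (7 grace days from a 2024-01-01 start):

```elixir
start_date = ~D[2024-01-01]
grace_period_days = 7
window_end = Date.add(start_date, grace_period_days)

# a day inside the grace window compares as :lt
:lt = Date.compare(~D[2024-01-05], window_end)

# the first day at the window end no longer compares as :lt
:eq = Date.compare(~D[2024-01-08], window_end)
```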
@doc """
Generates a batch of payment codes (aka deferred payments).
Returns {:ok, [%Payment{}]} on successful creation.
Can return one of the following specific error conditions:
{:error, {:invalid_batch_size}} - when the batch size is not valid
{:error, {:invalid_product}} - when the product slug does not reference a valid product
{:error, e} - on a database error encountered during creatinon of the payment
"""
def create_payment_codes(_, number_of_codes) when number_of_codes <= 0,
do: {:error, {:invalid_batch_size}}
def create_payment_codes(_, number_of_codes) when number_of_codes > @maximum_batch_size,
do: {:error, {:invalid_batch_size}}
def create_payment_codes(product_slug, number_of_codes) do
case Blueprint.get_active_blueprint(product_slug) do
nil ->
{:error, {:invalid_product}}
%Section{} = section ->
create_codes_for_section(section, number_of_codes)
end
end
defp create_codes_for_section(%Section{id: id, amount: amount}, number_of_codes) do
now = DateTime.utc_now()
Repo.transaction(fn _ ->
case unique_codes(number_of_codes) do
{:ok, codes} ->
result =
Enum.reverse(codes)
|> Enum.reduce_while([], fn code, all ->
case create_payment(%{
type: :deferred,
code: code,
generation_date: now,
application_date: nil,
amount: amount,
section_id: id,
enrollment_id: nil
}) do
{:ok, payment} -> {:cont, [payment | all]}
{:error, e} -> {:halt, {:error, e}}
end
end)
case result do
{:error, e} -> Repo.rollback(e)
all -> all
end
{:error, e} ->
Repo.rollback(e)
end
end)
end
defp unique_codes(count) do
# Generate a batch of unique integer codes, in one query
query =
Ecto.Adapters.SQL.query(
Oli.Repo,
"SELECT * FROM (SELECT trunc(random() * (10000000000 - 100000000) + 100000000) AS new_id
FROM generate_series(1, #{count})) AS x
WHERE x.new_id NOT IN (SELECT code FROM payments WHERE type = 'deferred')",
[]
)
case query do
{:ok, %{num_rows: ^count, rows: rows}} ->
{:ok, List.flatten(rows) |> Enum.map(fn c -> trunc(c) end)}
{:error, e} ->
Logger.error("could not generate random codes: #{inspect(e)}")
{:error, "could not generate random codes"}
end
end
@doc """
Given a section (blueprint or enrollable), calculate the cost to use it for
a specific institution, taking into account any product-wide and product-specific discounts
this institution has.
Returns {:ok, %Money{}} or {:error, reason}
"""
def calculate_product_cost(
%Section{requires_payment: false},
_
),
do: {:ok, Money.new(:USD, 0)}
def calculate_product_cost(
%Section{requires_payment: true, amount: amount},
nil
),
do: {:ok, amount}
def calculate_product_cost(
%Section{requires_payment: true, id: id, amount: amount},
%Institution{id: institution_id}
) do
discounts =
from(d in Discount,
where:
(is_nil(d.section_id) and d.institution_id == ^institution_id) or
(d.section_id == ^id and d.institution_id == ^institution_id),
select: d
)
|> Repo.all()
# Remove any institution-wide discounts if an institution and section specific discount exists
discounts =
case Enum.any?(discounts, fn d -> !is_nil(d.section_id) end) do
true ->
Enum.filter(discounts, fn d -> !is_nil(d.section_id) end)
false ->
discounts
end
# Now calculate the product cost, taking into account a discount
case discounts do
[] ->
{:ok, amount}
[%Discount{type: :percentage, percentage: percentage}] ->
Money.mult(amount, percentage)
[%Discount{amount: amount}] ->
{:ok, amount}
end
end
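The discount resolution above can be isolated from Ecto and Money. A hedged standalone sketch (`DiscountSketch`, plain maps, and float amounts are all stand-ins for illustration; it mirrors the source's `Money.mult(amount, percentage)` semantics):

```elixir
defmodule DiscountSketch do
  def cost(amount, discounts) do
    # section-specific discounts take precedence over institution-wide ones
    discounts =
      if Enum.any?(discounts, & &1[:section_id]) do
        Enum.filter(discounts, & &1[:section_id])
      else
        discounts
      end

    case discounts do
      [] -> amount
      [%{type: :percentage, percentage: p}] -> amount * p
      [%{amount: fixed}] -> fixed
    end
  end
end

# no discount: pay the full amount
200.0 = DiscountSketch.cost(200.0, [])

# institution-wide percentage discount scales the amount
100.0 = DiscountSketch.cost(200.0, [%{type: :percentage, percentage: 0.5, section_id: nil}])

# section-specific fixed discount replaces the amount outright
50.0 = DiscountSketch.cost(200.0, [%{amount: 50.0, section_id: 42}])
```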
@doc """
Redeems a payment code for a given course section.
Returns {:ok, %Payment{}} on success, otherwise:
{:error, {:already_paid}} if the student has already paid for this section
{:error, {:not_enrolled}} if the student is not enrolled in the section
{:error, {:unknown_section}} when the section slug does not pertain to a valid section
{:error, {:unknown_code}} when no deferred payment record is found for `code`
{:error, {:invalid_code}} when the code is invalid, whether it has already been redeemed or
if it doesn't pertain to this section or blueprint product
"""
def redeem_code(human_readable_code, %User{} = user, section_slug) do
case Payment.from_human_readable(human_readable_code) do
{:ok, code} ->
case Sections.get_section_by_slug(section_slug) do
nil ->
{:error, {:unknown_section}}
%Section{blueprint_id: blueprint_id, id: id} = section ->
case Repo.get_by(Payment, code: code) do
nil ->
{:error, {:unknown_code}}
%Payment{
type: :deferred,
application_date: nil,
section_id: ^id,
enrollment_id: nil
} = payment ->
apply_payment(payment, user, section)
%Payment{
type: :deferred,
application_date: nil,
section_id: ^blueprint_id,
enrollment_id: nil
} = payment ->
apply_payment(payment, user, section)
_ ->
{:error, {:invalid_code}}
end
end
_ ->
{:error, {:invalid_code}}
end
end
defp apply_payment(payment, user, section) do
case Sections.get_enrollment(section.slug, user.id) do
nil ->
{:error, {:not_enrolled}}
%{id: id} ->
case Repo.get_by(Payment, enrollment_id: id) do
nil ->
update_payment(payment, %{
enrollment_id: id,
pending_user_id: user.id,
pending_section_id: section.id,
application_date: DateTime.utc_now()
})
_ ->
{:error, {:already_paid}}
end
end
end
@doc """
List all payments for a product, joined with the enrollment (user and section) if
the payment has been applied.
"""
def list_payments(product_slug) do
case Oli.Delivery.Sections.get_section_by_slug(product_slug) do
nil ->
[]
%Section{id: id} ->
query =
from(
p in Payment,
left_join: e in Enrollment,
on: e.id == p.enrollment_id,
left_join: u in User,
on: e.user_id == u.id,
left_join: s2 in Section,
on: e.section_id == s2.id,
where: p.section_id == ^id,
select: %{payment: p, section: s2, user: u}
)
Repo.all(query)
end
end
@doc """
Retrieve a payment for a specific provider and id.
"""
def get_provider_payment(provider_type, provider_id) do
query =
from(
p in Payment,
where: p.provider_type == ^provider_type and p.provider_id == ^provider_id,
select: p
)
Repo.one(query)
end
@doc """
Creates a new pending payment, ensuring that no other payments exists for this user
and section.
"""
def create_pending_payment(%User{id: user_id}, %Section{id: section_id}, attrs) do
Oli.Repo.transaction(fn _ ->
query =
from(
p in Payment,
where: p.pending_section_id == ^section_id and p.pending_user_id == ^user_id
)
case Oli.Repo.one(query) do
# No payment record found for this user in this section
nil ->
case create_payment(
Map.merge(attrs, %{pending_user_id: user_id, pending_section_id: section_id})
) do
{:ok, r} -> r
{:error, e} -> Oli.Repo.rollback(e)
end
# A payment found, but this payment was never finalized. We will reuse this
# payment record.
%Payment{enrollment_id: nil, application_date: nil} = p ->
case update_payment(p, attrs) do
{:ok, r} -> r
{:error, e} -> Oli.Repo.rollback(e)
end
_ ->
Oli.Repo.rollback({:payment_already_exists})
end
end)
end
@doc """
Creates a payment.
## Examples
iex> create_payment(%{field: value})
{:ok, %Payment{}}
iex> create_payment(%{field: bad_value})
{:error, %Ecto.Changeset{}}
"""
def create_payment(attrs \\ %{}) do
%Payment{}
|> Payment.changeset(attrs)
|> Repo.insert()
end
def update_payment(%Payment{} = p, attrs) do
p
|> Payment.changeset(attrs)
|> Repo.update()
end
@doc """
Creates a discount.
## Examples
iex> create_discount(%{field: value})
{:ok, %Discount{}}
iex> create_discount(%{field: bad_value})
{:error, %Ecto.Changeset{}}
"""
def create_discount(attrs \\ %{}) do
%Discount{}
|> Discount.changeset(attrs)
|> Repo.insert()
end
end
# ---- lib/ash_policy_authorizer/check/relating_to_actor.ex (repo: jonathanstiansen/ash_policy_authorizer, license: MIT) ----
defmodule AshPolicyAuthorizer.Check.RelatingToActor do
@moduledoc false
use AshPolicyAuthorizer.SimpleCheck
@impl true
def describe(opts) do
"relating this.#{opts[:relationship]} to the actor"
end
@impl true
def match?(nil, _, _), do: false
def match?(actor, %{changeset: %Ash.Changeset{} = changeset}, opts) do
resource = changeset.resource
relationship = Ash.Resource.relationship(resource, opts[:relationship])
if Ash.Changeset.changing_relationship?(changeset, relationship.name) do
case Map.get(changeset.relationships, relationship.name) do
%{replace: replacing} ->
Enum.any?(List.wrap(replacing), fn replacing ->
Map.fetch(replacing, relationship.destination_field) ==
Map.fetch(actor, relationship.destination_field)
end)
%{add: adding} ->
Enum.any?(List.wrap(adding), fn adding ->
Map.fetch(adding, relationship.destination_field) ==
Map.fetch(actor, relationship.destination_field)
end)
_ ->
false
end
else
false
end
end
def match?(_, _, _), do: false
end
# ---- config/prod.secret.exs (repo: Dhall777/signinsheet, license: BSD-3-Clause) ----
# In this file, we load production configuration and secrets
# from environment variables. You can also hardcode secrets,
# although such is generally not recommended and you have to
# remember to add this file to your .gitignore.
use Mix.Config
database_url =
System.get_env("DATABASE_URL") ||
raise """
environment variable DATABASE_URL is missing.
For example: ecto://USER:PASS@HOST/DATABASE
"""
config :signinsheet, Signinsheet.Repo,
# ssl: true,
url: database_url,
pool_size: String.to_integer(System.get_env("POOL_SIZE") || "10")
secret_key_base =
System.get_env("SECRET_KEY_BASE") ||
raise """
environment variable SECRET_KEY_BASE is missing.
You can generate one by calling: mix phx.gen.secret
"""
config :signinsheet, SigninsheetWeb.Endpoint,
http: [
port: String.to_integer(System.get_env("PORT") || "4000"),
transport_options: [socket_opts: [:inet6]]
],
secret_key_base: secret_key_base
# ## Using releases (Elixir v1.9+)
#
# If you are doing OTP releases, you need to instruct Phoenix
# to start each relevant endpoint:
#
# config :signinsheet, SigninsheetWeb.Endpoint, server: true
#
# Then you can assemble a release by calling `mix release`.
# See `mix help release` for more information.
| 30.214286 | 67 | 0.724192 |
e867fcb3697cf263c46692790e4337c05354b8eb | 828 | exs | Elixir | lib/elixir/test/elixir/bitwise_test.exs | Nicd/elixir | e62ef92a4be1b562033d35b2d822cc9d6c661077 | ["Apache-2.0"] | 4 | 2016-04-05T05:51:36.000Z | 2019-10-31T06:46:35.000Z | lib/elixir/test/elixir/bitwise_test.exs | Nicd/elixir | e62ef92a4be1b562033d35b2d822cc9d6c661077 | ["Apache-2.0"] | null | null | null | lib/elixir/test/elixir/bitwise_test.exs | Nicd/elixir | e62ef92a4be1b562033d35b2d822cc9d6c661077 | ["Apache-2.0"] | 5 | 2015-02-01T06:01:19.000Z | 2019-08-29T09:02:35.000Z |
Code.require_file "test_helper.exs", __DIR__
defmodule Bitwise.FunctionsTest do
use ExUnit.Case, async: true
use Bitwise, skip_operators: true
test :bnot do
assert bnot(1) == -2
end
test :band do
assert band(1, 1) == 1
end
test :bor do
assert bor(0, 1) == 1
end
test :bxor do
assert bxor(1, 1) == 0
end
test :bsl do
assert bsl(1, 1) == 2
end
test :bsr do
assert bsr(1, 1) == 0
end
end
defmodule Bitwise.OperatorsTest do
use ExUnit.Case, async: true
use Bitwise, only_operators: true
test :bnot do
assert ~~~1 == -2
end
test :band do
assert 1 &&& 1 == 1
end
test :bor do
assert 0 ||| 1 == 1
end
test :bxor do
assert 1 ^^^ 1 == 0
end
test :bsl do
assert 1 <<< 1 == 2
end
test :bsr do
assert 1 >>> 1 == 0
end
end
| 13.8 | 44 | 0.582126 |
e868007be6a4fc736271ea80474053a93f0eafda | 3,548 | ex | Elixir | apps/re_integrations/lib/salesforce/job_queue.ex | ruby2elixir/emcasa-backend | 70d7f4f233555417941ffa6ada84cf8740c21dd2 | ["MIT"] | 4 | 2019-11-01T16:29:31.000Z | 2020-10-10T21:20:12.000Z | apps/re_integrations/lib/salesforce/job_queue.ex | eduardomartines/emcasa-backend | 70d7f4f233555417941ffa6ada84cf8740c21dd2 | ["MIT"] | null | null | null | apps/re_integrations/lib/salesforce/job_queue.ex | eduardomartines/emcasa-backend | 70d7f4f233555417941ffa6ada84cf8740c21dd2 | ["MIT"] | 5 | 2019-11-04T21:25:45.000Z | 2020-02-13T23:49:36.000Z |
defmodule ReIntegrations.Salesforce.JobQueue do
@moduledoc """
Module for processing jobs related with emcasa/salesforce's api.
"""
use EctoJob.JobQueue, table_name: "salesforce_jobs", schema_prefix: "re_integrations"
require Logger
alias Ecto.Multi
alias ReIntegrations.{
Repo,
Routific,
Salesforce,
Salesforce.Mapper
}
def perform(%Multi{} = multi, %{"type" => "monitor_routific_job", "job_id" => id}) do
multi
|> Multi.run(:get_job_status, fn _repo, _changes -> get_routific_job(id) end)
|> Multi.merge(fn %{get_job_status: payload} ->
enqueue_routific_insert_events(Ecto.Multi.new(), payload)
end)
|> Multi.merge(fn %{get_job_status: payload} ->
enqueue_routific_update_unserved(Ecto.Multi.new(), payload)
end)
|> Multi.run(:send_notification, fn _repo, %{get_job_status: payload} ->
Salesforce.report_scheduled_tours(payload)
end)
|> Repo.transaction()
|> handle_error()
end
def perform(%Multi{} = multi, %{
"type" => "insert_event",
"opportunity_id" => opportunity_id,
"route_id" => route_id,
"event" => event
}) do
multi
|> Multi.run(:insert_event, fn _repo, _changes -> Salesforce.insert_event(event) end)
|> __MODULE__.enqueue(:update_opportunity, %{
"type" => "update_opportunity",
"id" => opportunity_id,
"opportunity" => %{stage: :visit_scheduled, route_unserved_reason: "", route_id: route_id}
})
|> Repo.transaction()
|> handle_error()
end
def perform(%Multi{} = multi, %{
"type" => "update_opportunity",
"id" => id,
"opportunity" => payload
}) do
multi
|> Multi.run(:update_opportunity, fn _repo, _changes ->
Salesforce.update_opportunity(id, payload)
end)
|> Repo.transaction()
|> handle_error()
end
defp get_routific_job(job_id) do
with {:ok, payload} <- Routific.get_job_status(job_id) do
{:ok, payload}
else
{:error, error} -> {:error, error}
{status, _data} -> {:error, status}
end
end
defp enqueue_routific_insert_events(multi, payload),
do:
Enum.reduce(payload.solution, multi, fn route, multi ->
enqueue_calendar_insert_events(multi, route, payload)
end)
defp enqueue_calendar_insert_events(multi, {calendar_uuid, [_depot | events]}, payload),
do:
events
|> Enum.filter(&(not Map.get(&1, :break)))
|> Enum.reduce(multi, fn event, multi ->
__MODULE__.enqueue(multi, "schedule_#{event.id}", %{
"type" => "insert_event",
"event" => Mapper.Routific.build_event(event, calendar_uuid, payload),
"route_id" => payload.id,
"opportunity_id" => event.id
})
end)
defp enqueue_routific_update_unserved(multi, payload),
do:
Enum.reduce(payload.unserved, multi, fn {opportunity_id, reason}, multi ->
__MODULE__.enqueue(multi, "update_#{opportunity_id}", %{
"type" => "update_opportunity",
"id" => opportunity_id,
"opportunity" => %{
route_unserved_reason: reason,
route_id: payload.id
}
})
end)
defp handle_error({:ok, result}), do: {:ok, result}
defp handle_error({:error, :get_job_status, :pending, _changes} = result),
do: result
defp handle_error(error) do
Sentry.capture_message("error when performing Salesforce.JobQueue",
extra: %{error: error}
)
raise "Error when performing Salesforce.JobQueue"
end
end
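As a usage sketch, a monitoring job for a Routific run would be enqueued through the `enqueue/3` helper that `use EctoJob.JobQueue` generates on this module. The job id below is a placeholder, not a value from the code above:

```elixir
# Hypothetical sketch: enqueue a job that the monitor_routific_job
# clause of perform/2 will later pick up.
Ecto.Multi.new()
|> ReIntegrations.Salesforce.JobQueue.enqueue(:monitor, %{
  "type" => "monitor_routific_job",
  "job_id" => "routific-job-id-placeholder"
})
|> ReIntegrations.Repo.transaction()
```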
| 29.815126 | 96 | 0.628805 |
e8680bd6d9f2cadc19a4f4f5740c3ff91e3d6154 | 583 | exs | Elixir | test/views/error_view_test.exs | linx-iot/linx_controller | 1a227ebff85180f0781fef7fd0371e2fcb94cfeb | ["MIT"] | null | null | null | test/views/error_view_test.exs | linx-iot/linx_controller | 1a227ebff85180f0781fef7fd0371e2fcb94cfeb | ["MIT"] | null | null | null | test/views/error_view_test.exs | linx-iot/linx_controller | 1a227ebff85180f0781fef7fd0371e2fcb94cfeb | ["MIT"] | null | null | null |
defmodule LinxCntrlr.ErrorViewTest do
use LinxCntrlr.ConnCase, async: true
# Bring render/3 and render_to_string/3 for testing custom views
import Phoenix.View
test "renders 404.html" do
assert render_to_string(LinxCntrlr.ErrorView, "404.html", []) ==
"Page not found"
end
  test "renders 500.html" do
assert render_to_string(LinxCntrlr.ErrorView, "500.html", []) ==
"Server internal error"
end
  test "renders any other" do
assert render_to_string(LinxCntrlr.ErrorView, "505.html", []) ==
"Server internal error"
end
end
| 26.5 | 68 | 0.684391 |
e8682d4c8172e1833d81e908e1c3fd6cbe5465be | 524 | ex | Elixir | lib/pton/exceptions.ex | casey-chow/pton.co | c794dc6903326dc827f5cbd5c7a8e35868a0fa45 | ["MIT"] | null | null | null | lib/pton/exceptions.ex | casey-chow/pton.co | c794dc6903326dc827f5cbd5c7a8e35868a0fa45 | ["MIT"] | 4 | 2017-10-18T15:52:20.000Z | 2017-10-19T00:18:38.000Z | lib/pton/exceptions.ex | casey-chow/pton.co | c794dc6903326dc827f5cbd5c7a8e35868a0fa45 | ["MIT"] | null | null | null |
defmodule Pton.QuotaExceededError do
@moduledoc """
Raised when the user exceeds a given resource quota.
This exception is raised by `Pton.Redirection.create_link/2`.
"""
defexception [:message, plug_status: :too_many_requests]
def exception do
msg = "The current user has exceeded their hard quota for the number " <>
"of links created in their lifetime. Try deleting some links or " <>
"contacting the administrator of this site."
%Pton.QuotaExceededError{message: msg}
end
end
| 32.75 | 78 | 0.715649 |
e86831cfd4833ca724ff93a5169fa9a4def66a1a | 255 | ex | Elixir | apps/cucumber_expressions/lib/string_reducer.ex | Ajwah/ex_cucumber | f2b9cf06caeef624c66424ae6160f274dc133fc6 | ["Apache-2.0"] | 2 | 2021-05-18T18:20:05.000Z | 2022-02-13T00:15:06.000Z | apps/cucumber_expressions/lib/string_reducer.ex | Ajwah/ex_cucumber | f2b9cf06caeef624c66424ae6160f274dc133fc6 | ["Apache-2.0"] | 2 | 2021-04-22T00:28:17.000Z | 2021-05-19T21:04:20.000Z | apps/cucumber_expressions/lib/string_reducer.ex | Ajwah/ex_cucumber | f2b9cf06caeef624c66424ae6160f274dc133fc6 | ["Apache-2.0"] | 4 | 2021-04-14T03:07:45.000Z | 2021-12-12T21:23:59.000Z |
defmodule StringReducer do
def reduce(str, initial_val, fun), do: traverse(str, initial_val, fun)
defp traverse("", acc, _), do: acc
defp traverse(<<char::utf8, rest::binary>>, acc, fun),
do: traverse(rest, fun.(<<char::utf8>>, acc), fun)
end
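A quick sketch of how `reduce/3` behaves, assuming the module above is compiled: it folds over the string one UTF-8 codepoint at a time, passing each one-character binary to the reducer.

```elixir
# Count the vowels in a string by folding over each codepoint.
vowels =
  StringReducer.reduce("hello world", 0, fn ch, acc ->
    if ch in ["a", "e", "i", "o", "u"], do: acc + 1, else: acc
  end)

IO.inspect(vowels)
# => 3
```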
| 28.333333 | 72 | 0.658824 |
e868dd39558d038a7b27c9e44d9c20fbab94575b | 1,477 | ex | Elixir | lib/bamboo/adapters/test_adapter.ex | tslcjames/bamboo | d10fb1d51737d1b505de5462a2e25988f13744b4 | ["MIT"] | null | null | null | lib/bamboo/adapters/test_adapter.ex | tslcjames/bamboo | d10fb1d51737d1b505de5462a2e25988f13744b4 | ["MIT"] | null | null | null | lib/bamboo/adapters/test_adapter.ex | tslcjames/bamboo | d10fb1d51737d1b505de5462a2e25988f13744b4 | ["MIT"] | 1 | 2018-09-05T09:17:27.000Z | 2018-09-05T09:17:27.000Z |
defmodule Bamboo.TestAdapter do
@moduledoc """
Used for testing email delivery
No emails are sent, instead a message is sent to the current process and can
be asserted on with helpers from `Bamboo.Test`.
## Example config
# Typically done in config/test.exs
config :my_app, MyApp.Mailer,
adapter: Bamboo.TestAdapter
# Define a Mailer. Typically in lib/my_app/mailer.ex
defmodule MyApp.Mailer do
use Bamboo.Mailer, otp_app: :my_app
end
"""
@behaviour Bamboo.Adapter
@doc false
def deliver(email, _config) do
email = clean_assigns(email)
    send(test_process(), {:delivered_email, email})
end
defp test_process do
Application.get_env(:bamboo, :shared_test_process) || self()
end
def handle_config(config) do
case config[:deliver_later_strategy] do
nil ->
Map.put(config, :deliver_later_strategy, Bamboo.ImmediateDeliveryStrategy)
Bamboo.ImmediateDeliveryStrategy ->
config
_ ->
raise ArgumentError, """
Bamboo.TestAdapter requires that the deliver_later_strategy is
Bamboo.ImmediateDeliveryStrategy
Instead it got: #{inspect config[:deliver_later_strategy]}
Please remove the deliver_later_strategy from your config options, or
set it to Bamboo.ImmediateDeliveryStrategy.
"""
end
end
@doc false
def clean_assigns(email) do
%{email | assigns: :assigns_removed_for_testing}
end
end
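A sketch of how the adapter is typically exercised from a test. The `MyApp.Mailer` and `MyApp.Email.welcome_email/0` names are assumptions for illustration, not part of Bamboo itself; `assert_delivered_email/1` comes from `Bamboo.Test`:

```elixir
defmodule MyApp.MailerTest do
  use ExUnit.Case, async: true
  # Brings in assert_delivered_email/1 and related helpers.
  use Bamboo.Test

  test "the welcome email is delivered" do
    email = MyApp.Email.welcome_email()

    # With Bamboo.TestAdapter configured, delivery only sends a
    # {:delivered_email, email} message to the test process.
    MyApp.Mailer.deliver_now(email)

    assert_delivered_email email
  end
end
```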
| 26.375 | 82 | 0.693297 |
e8691d22813bf6941f35d895561794cbdd866a1a | 5,157 | ex | Elixir | deps/earmark/lib/earmark/scanner.ex | alex-philippov/jwt-google-tokens | 18aa3eccc9a2c1a6016b2fe53b0bc6b1612a04b7 | ["Apache-2.0"] | null | null | null | deps/earmark/lib/earmark/scanner.ex | alex-philippov/jwt-google-tokens | 18aa3eccc9a2c1a6016b2fe53b0bc6b1612a04b7 | ["Apache-2.0"] | null | null | null | deps/earmark/lib/earmark/scanner.ex | alex-philippov/jwt-google-tokens | 18aa3eccc9a2c1a6016b2fe53b0bc6b1612a04b7 | ["Apache-2.0"] | 1 | 2019-11-23T12:09:14.000Z | 2019-11-23T12:09:14.000Z |
defmodule Earmark.Scanner do
import Earmark.Helpers.StringHelpers, only: [behead: 2]
@backtix_rgx ~r/\A(`+)(.*)/
@blockquote_rgx ~r/\A>(?!\S)/
@code_fence_rgx ~r/\A(\s*)~~~/
@headline_rgx ~r/\A(\#{1,6})(\s+)(.*)/
@id_close_rgx ~r/\[(.*?)\](?!:)/
@id_open_rgx ~r/\A(\s{0,3})\[(.*?)\]:\s+(.*)\z/
@indent_rgx ~r/\A\s{4,}/
@list_item_rgx ~r/\A(\s{0,3})(\d+\.|\*|-)\s+/
@ruler_rgx ~r/\A \s{0,3} (?:([-_*])\s?)(?:\1\s?){2,} \z/x
@under_l1_head_rgx ~r/\A=+\s*\z/
@under_l2_head_rgx ~r/\A-{1,2}\s*\z/
@text_rgx ~r/(?:[^`]|\\`)*/
defmodule Backtix, do: defstruct count: 1
defmodule Blockquote, do: defstruct []
defmodule CodeFence, do: defstruct []
defmodule Headline, do: defstruct level: 1..6
defmodule IdClose, do: defstruct id: "content of [...]"
defmodule IdOpen, do: defstruct id: "content of [...]", href: "word after ]:\\s+"
defmodule Indent, do: defstruct count: 4
defmodule ListItem, do: defstruct type: :ul_ol, bullet: "* or - or empty"
defmodule RulerFat, do: defstruct []
defmodule RulerMedium, do: defstruct []
defmodule RulerThin, do: defstruct []
defmodule Text, do: defstruct content: ""
defmodule UnderHeadline, do: defstruct level: 1..2
@type token :: %Backtix{} | %Blockquote{} | %CodeFence{} | %Headline{} | %IdClose{} | %IdOpen{} | %Indent{} | %ListItem{} | %RulerFat{} | %RulerMedium{} | %RulerThin{} | %Text{} | %UnderHeadline{}
@type tokens :: list(token)
@type t_continuation :: {token, String.t, boolean()}
@spec scan_line( String.t ) :: tokens
@doc """
Scans a line into a list of tokens
"""
def scan_line line do
scan_line_into_tokens( line, [], true )
|> Enum.reverse
end
@spec scan_line_into_tokens( String.t, tokens, boolean() ) :: tokens
# Empty Line
defp scan_line_into_tokens "", [], _beg do
[]
end
# Line consumed
defp scan_line_into_tokens( "", tokens, _beg), do: tokens
# Line not consumed
defp scan_line_into_tokens line, tokens, beg do
{token, rest, still_at_beg} = scan_next_token( line, beg )
scan_line_into_tokens( rest, [token|tokens], still_at_beg )
end
@spec scan_next_token( String.t, boolean ) :: false | t_continuation
defp scan_next_token line, beg_of_line
defp scan_next_token line, true do
cond do
Regex.run( @blockquote_rgx, line ) ->
{%Blockquote{}, behead(line, 1), false}
matches = Regex.run( @list_item_rgx, line) ->
[content, ws, bullet] = matches
prefixed_with_ws(line, ws) ||
{make_list_item(bullet), behead(line,content), false}
matches = Regex.run( @id_open_rgx, line ) ->
[_content, ws, id, rest ] = matches
prefixed_with_ws(line, ws) ||
{%IdOpen{id: id}, rest, false}
      Regex.run( @under_l1_head_rgx, line ) ->
        {%UnderHeadline{level: 1}, "", false}
      Regex.run( @under_l2_head_rgx, line ) ->
        {%UnderHeadline{level: 2}, "", false}
matches = Regex.run( @code_fence_rgx, line ) ->
[_line, ws] = matches
prefixed_with_ws(line, ws) ||
{%CodeFence{}, behead(line, 3), false}
matches = Regex.run( @indent_rgx, line ) ->
count = String.length(hd matches)
{%Indent{count: count}, behead(line, count), false}
matches = Regex.run( @headline_rgx, line ) ->
[_line, levelstr, _ws, rest] = matches
{%Headline{level: String.length(levelstr)}, rest, false}
matches = Regex.run( @ruler_rgx, line ) ->
[_content, type] = matches
{make_ruler_from(type), "", false}
true ->
scan_next_token( line, false )
end
end
defp scan_next_token line, false do
scan_token_not_at_beg( line )
|> Tuple.append( false )
end
  @spec scan_token_not_at_beg( String.t ) :: {} | {token, String.t}
defp scan_token_not_at_beg line do
cond do
matches = Regex.run( @backtix_rgx, line ) ->
[_line, backtix, rest] = matches
{%Backtix{count: String.length(backtix)}, rest}
# matches = Regex.run( @id_close_rgx, line ) ->
# [text, id, does_open] = matches
# {%IdDef{id: id, type:
# (if does_open == "", do: "close", else: "open")
# }, behead(line, text)}
matches = Regex.run( @text_rgx, line ) ->
text = hd matches
{%Text{content: text}, behead(line, text)}
true -> {}
end
end
@spec make_ruler_from( String.t ) :: token
defp make_ruler_from type do
case type do
"*" -> %RulerFat{}
"_" -> %RulerMedium{}
"-" -> %RulerThin{}
end
end
@spec make_list_item( String.t ) :: %ListItem{}
defp make_list_item bullet do
case bullet do
"*" -> %ListItem{type: :ul, bullet: "*"}
"-" -> %ListItem{type: :ul, bullet: "-"}
_ -> %ListItem{type: :ol, bullet: ""}
end
end
@spec prefixed_with_ws(String.t, String.t) :: false | { %Text{}, String.t, true}
defp prefixed_with_ws line, ws do
if ws == "" do
false
else
rest = behead( line, ws )
{%Text{content: ws}, rest, true}
end
end
end
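A brief illustration of the scanner's output, assuming the module above is loaded (the token structs are nested under `Earmark.Scanner`):

```elixir
alias Earmark.Scanner
alias Earmark.Scanner.{Headline, Text}

# An ATX headline is split into a Headline token plus the remaining text.
[%Headline{level: 2}, %Text{content: "Title"}] = Scanner.scan_line("## Title")

# A plain line yields a single Text token.
[%Text{content: "just words"}] = Scanner.scan_line("just words")
```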
| 33.487013 | 199 | 0.586387 |
e8691ee4c058a64fa749aa4782607595e9ea9f22 | 4,273 | ex | Elixir | clients/slides/lib/google_api/slides/v1/model/page.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/slides/lib/google_api/slides/v1/model/page.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/slides/lib/google_api/slides/v1/model/page.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Slides.V1.Model.Page do
@moduledoc """
A page in a presentation.
## Attributes
* `layoutProperties` (*type:* `GoogleApi.Slides.V1.Model.LayoutProperties.t`, *default:* `nil`) - Layout specific properties. Only set if page_type = LAYOUT.
* `masterProperties` (*type:* `GoogleApi.Slides.V1.Model.MasterProperties.t`, *default:* `nil`) - Master specific properties. Only set if page_type = MASTER.
* `notesProperties` (*type:* `GoogleApi.Slides.V1.Model.NotesProperties.t`, *default:* `nil`) - Notes specific properties. Only set if page_type = NOTES.
* `objectId` (*type:* `String.t`, *default:* `nil`) - The object ID for this page. Object IDs used by
Page and
PageElement share the same namespace.
* `pageElements` (*type:* `list(GoogleApi.Slides.V1.Model.PageElement.t)`, *default:* `nil`) - The page elements rendered on the page.
* `pageProperties` (*type:* `GoogleApi.Slides.V1.Model.PageProperties.t`, *default:* `nil`) - The properties of the page.
* `pageType` (*type:* `String.t`, *default:* `nil`) - The type of the page.
* `revisionId` (*type:* `String.t`, *default:* `nil`) - The revision ID of the presentation containing this page. Can be used in
update requests to assert that the presentation revision hasn't changed
since the last read operation. Only populated if the user has edit access
to the presentation.
The format of the revision ID may change over time, so it should be treated
opaquely. A returned revision ID is only guaranteed to be valid for 24
hours after it has been returned and cannot be shared across users. If the
revision ID is unchanged between calls, then the presentation has not
changed. Conversely, a changed ID (for the same presentation and user)
usually means the presentation has been updated; however, a changed ID can
also be due to internal factors such as ID format changes.
* `slideProperties` (*type:* `GoogleApi.Slides.V1.Model.SlideProperties.t`, *default:* `nil`) - Slide specific properties. Only set if page_type = SLIDE.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:layoutProperties => GoogleApi.Slides.V1.Model.LayoutProperties.t(),
:masterProperties => GoogleApi.Slides.V1.Model.MasterProperties.t(),
:notesProperties => GoogleApi.Slides.V1.Model.NotesProperties.t(),
:objectId => String.t(),
:pageElements => list(GoogleApi.Slides.V1.Model.PageElement.t()),
:pageProperties => GoogleApi.Slides.V1.Model.PageProperties.t(),
:pageType => String.t(),
:revisionId => String.t(),
:slideProperties => GoogleApi.Slides.V1.Model.SlideProperties.t()
}
field(:layoutProperties, as: GoogleApi.Slides.V1.Model.LayoutProperties)
field(:masterProperties, as: GoogleApi.Slides.V1.Model.MasterProperties)
field(:notesProperties, as: GoogleApi.Slides.V1.Model.NotesProperties)
field(:objectId)
field(:pageElements, as: GoogleApi.Slides.V1.Model.PageElement, type: :list)
field(:pageProperties, as: GoogleApi.Slides.V1.Model.PageProperties)
field(:pageType)
field(:revisionId)
field(:slideProperties, as: GoogleApi.Slides.V1.Model.SlideProperties)
end
defimpl Poison.Decoder, for: GoogleApi.Slides.V1.Model.Page do
def decode(value, options) do
GoogleApi.Slides.V1.Model.Page.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Slides.V1.Model.Page do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 50.869048 | 161 | 0.717061 |
e86939f5fa2e983a075f37c5b09e540fdb55d273 | 1,308 | exs | Elixir | inventory_management_system/inventory_management_system.exs | ChrisWilding/advent-of-code-2018 | 144a60875c66eb79f4c57aff50bdaf1d236d2656 | ["Apache-2.0"] | 1 | 2019-05-14T04:43:23.000Z | 2019-05-14T04:43:23.000Z | inventory_management_system/inventory_management_system.exs | ChrisWilding/advent-of-code-2018 | 144a60875c66eb79f4c57aff50bdaf1d236d2656 | ["Apache-2.0"] | null | null | null | inventory_management_system/inventory_management_system.exs | ChrisWilding/advent-of-code-2018 | 144a60875c66eb79f4c57aff50bdaf1d236d2656 | ["Apache-2.0"] | null | null | null |
defmodule InventoryManagementSystem do
def run("1", file) do
{two, three} =
file
|> String.split()
|> Enum.map(&InventoryManagementSystem.count/1)
|> Enum.reduce({0, 0}, fn list, acc ->
Enum.reduce(list, acc, fn n, {two, three} ->
          cond do
            n == 2 -> {two + 1, three}
            n == 3 -> {two, three + 1}
            # A letter repeated four or more times doesn't affect the checksum.
            true -> {two, three}
          end
end)
end)
two * three
end
def run("2", file) do
ids = file |> String.split()
[{a, b} | _] = for a <- ids, b <- ids, one_letter_difference(a, b), do: {a, b}
Enum.zip(String.graphemes(a), String.graphemes(b))
|> Enum.filter(fn {a, b} -> a == b end)
|> Enum.map(fn {a, _} -> a end)
|> Enum.join()
end
def count(string) do
string
|> String.graphemes()
|> Enum.reduce(%{}, fn letter, acc -> Map.update(acc, letter, 1, &(&1 + 1)) end)
|> Enum.filter(fn {_, v} -> v > 1 end)
|> Enum.group_by(fn {_, v} -> v end)
|> Enum.map(fn {k, _} -> k end)
end
  def one_letter_difference(a, b) do
    # Box IDs in the input all share the same length, so exactly one
    # differing letter means every position but one matches.
    matching =
      Enum.zip(String.graphemes(a), String.graphemes(b))
      |> Enum.count(fn {x, y} -> x == y end)

    matching == String.length(a) - 1
  end
end
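As a sanity check (a sketch, assuming the module above is compiled), the AoC 2018 day 2 sample for part 1 produces the expected checksum:

```elixir
# Four IDs contain some letter exactly twice and three contain some
# letter exactly three times, so the checksum is 4 * 3 = 12.
sample = "abcdef bababc abbcde abcccd aabcdd abcdee ababab"
12 = InventoryManagementSystem.run("1", sample)
```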
[part, file] = System.argv()
contents = File.read!(file)
output = InventoryManagementSystem.run(part, contents)
IO.puts(output)
| 25.647059 | 84 | 0.535933 |
e8695702d6db303971a1ce02624ff362e31fdea4 | 17,637 | ex | Elixir | clients/slides/lib/google_api/slides/v1/api/presentations.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/slides/lib/google_api/slides/v1/api/presentations.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null | clients/slides/lib/google_api/slides/v1/api/presentations.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | ["Apache-2.0"] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Slides.V1.Api.Presentations do
@moduledoc """
API calls for all endpoints tagged `Presentations`.
"""
alias GoogleApi.Slides.V1.Connection
alias GoogleApi.Gax.{Request, Response}
@library_version Mix.Project.config() |> Keyword.get(:version, "")
@doc """
Applies one or more updates to the presentation.
Each request is validated before
being applied. If any request is not valid, then the entire request will
fail and nothing will be applied.
Some requests have replies to
give you some information about how they are applied. Other requests do
not need to return information; these each return an empty reply.
The order of replies matches that of the requests.
For example, suppose you call batchUpdate with four updates, and only the
third one returns information. The response would have two empty replies:
the reply to the third request, and another empty reply, in that order.
Because other users may be editing the presentation, the presentation
might not exactly reflect your changes: your changes may
be altered with respect to collaborator changes. If there are no
collaborators, the presentation should reflect your changes. In any case,
the updates in your request are guaranteed to be applied together
atomically.
## Parameters
* `connection` (*type:* `GoogleApi.Slides.V1.Connection.t`) - Connection to server
* `presentation_id` (*type:* `String.t`) - The presentation to apply the updates to.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:body` (*type:* `GoogleApi.Slides.V1.Model.BatchUpdatePresentationRequest.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Slides.V1.Model.BatchUpdatePresentationResponse{}}` on success
* `{:error, info}` on failure
"""
@spec slides_presentations_batch_update(Tesla.Env.client(), String.t(), keyword(), keyword()) ::
{:ok, GoogleApi.Slides.V1.Model.BatchUpdatePresentationResponse.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def slides_presentations_batch_update(
connection,
presentation_id,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url("/v1/presentations/{presentationId}:batchUpdate", %{
"presentationId" => URI.encode(presentation_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.Slides.V1.Model.BatchUpdatePresentationResponse{}]
)
end
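For orientation, a hedged sketch of calling this endpoint. The access token, presentation ID, and request payload below are placeholders, and the request map mirrors the Slides batchUpdate `replaceAllText` shape rather than anything in this file:

```elixir
# Hypothetical usage: replace template text across a presentation.
conn = GoogleApi.Slides.V1.Connection.new("oauth2-access-token-placeholder")

{:ok, _response} =
  GoogleApi.Slides.V1.Api.Presentations.slides_presentations_batch_update(
    conn,
    "presentation-id-placeholder",
    body: %GoogleApi.Slides.V1.Model.BatchUpdatePresentationRequest{
      requests: [
        %{
          replaceAllText: %{
            containsText: %{text: "{{customer}}", matchCase: true},
            replaceText: "Acme Corp"
          }
        }
      ]
    }
  )
```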
@doc """
Creates a blank presentation using the title given in the request. If a
`presentationId` is provided, it is used as the ID of the new presentation.
Otherwise, a new ID is generated. Other fields in the request, including
any provided content, are ignored.
Returns the created presentation.
## Parameters
* `connection` (*type:* `GoogleApi.Slides.V1.Connection.t`) - Connection to server
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:body` (*type:* `GoogleApi.Slides.V1.Model.Presentation.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Slides.V1.Model.Presentation{}}` on success
* `{:error, info}` on failure
"""
@spec slides_presentations_create(Tesla.Env.client(), keyword(), keyword()) ::
{:ok, GoogleApi.Slides.V1.Model.Presentation.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def slides_presentations_create(connection, optional_params \\ [], opts \\ []) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url("/v1/presentations", %{})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Slides.V1.Model.Presentation{}])
end
@doc """
Gets the latest version of the specified presentation.
## Parameters
* `connection` (*type:* `GoogleApi.Slides.V1.Connection.t`) - Connection to server
* `presentation_id` (*type:* `String.t`) - The ID of the presentation to retrieve.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.Slides.V1.Model.Presentation{}}` on success
* `{:error, info}` on failure
"""
@spec slides_presentations_get(Tesla.Env.client(), String.t(), keyword(), keyword()) ::
{:ok, GoogleApi.Slides.V1.Model.Presentation.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def slides_presentations_get(connection, presentation_id, optional_params \\ [], opts \\ []) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/presentations/{+presentationId}", %{
"presentationId" => URI.encode(presentation_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Slides.V1.Model.Presentation{}])
end
@doc """
Gets the latest version of the specified page in the presentation.
## Parameters
* `connection` (*type:* `GoogleApi.Slides.V1.Connection.t`) - Connection to server
* `presentation_id` (*type:* `String.t`) - The ID of the presentation to retrieve.
* `page_object_id` (*type:* `String.t`) - The object ID of the page to retrieve.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
  * `opts` (*type:* `keyword()`) - Call options

  ## Returns

  * `{:ok, %GoogleApi.Slides.V1.Model.Page{}}` on success
  * `{:error, info}` on failure
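
  ## Example

  A hedged sketch; the connection setup and both IDs below are placeholders,
  not part of this module.

  ```elixir
  # Fetch a single slide page; "p1" is a placeholder page object ID.
  connection = GoogleApi.Slides.V1.Connection.new("<access-token>")

  {:ok, page} =
    GoogleApi.Slides.V1.Api.Presentations.slides_presentations_pages_get(
      connection,
      "my-presentation-id",
      "p1"
    )

  # A Page struct carries the page's elements (shapes, images, and so on).
  IO.inspect(length(page.pageElements || []), label: "element count")
  ```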
"""
@spec slides_presentations_pages_get(
Tesla.Env.client(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Slides.V1.Model.Page.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def slides_presentations_pages_get(
connection,
presentation_id,
page_object_id,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query
    }

    request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/presentations/{presentationId}/pages/{pageObjectId}", %{
"presentationId" => URI.encode(presentation_id, &URI.char_unreserved?/1),
"pageObjectId" => URI.encode(page_object_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
      |> Request.library_version(@library_version)

    connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Slides.V1.Model.Page{}])
  end

  @doc """
  Generates a thumbnail of the latest version of the specified page in the
  presentation and returns a URL to the thumbnail image.

  This request counts as an [expensive read request](/slides/limits) for
  quota purposes.

  ## Parameters

* `connection` (*type:* `GoogleApi.Slides.V1.Connection.t`) - Connection to server
* `presentation_id` (*type:* `String.t`) - The ID of the presentation to retrieve.
* `page_object_id` (*type:* `String.t`) - The object ID of the page whose thumbnail to retrieve.
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:"thumbnailProperties.mimeType"` (*type:* `String.t`) - The optional mime type of the thumbnail image.
If you don't specify the mime type, the default mime type will be PNG.
* `:"thumbnailProperties.thumbnailSize"` (*type:* `String.t`) - The optional thumbnail image size.
If you don't specify the size, the server chooses a default size of the
image.
  * `opts` (*type:* `keyword()`) - Call options

  ## Returns

  * `{:ok, %GoogleApi.Slides.V1.Model.Thumbnail{}}` on success
  * `{:error, info}` on failure
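
  ## Example

  A hedged sketch showing how the dotted optional parameters can be passed as
  quoted keyword keys; the token and IDs are placeholders.

  ```elixir
  connection = GoogleApi.Slides.V1.Connection.new("<access-token>")

  {:ok, thumbnail} =
    GoogleApi.Slides.V1.Api.Presentations.slides_presentations_pages_get_thumbnail(
      connection,
      "my-presentation-id",
      "p1",
      "thumbnailProperties.mimeType": "PNG",
      "thumbnailProperties.thumbnailSize": "MEDIUM"
    )

  # The returned struct carries a short-lived URL to the rendered image.
  IO.puts(thumbnail.contentUrl)
  ```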
"""
@spec slides_presentations_pages_get_thumbnail(
Tesla.Env.client(),
String.t(),
String.t(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.Slides.V1.Model.Thumbnail.t()}
| {:ok, Tesla.Env.t()}
| {:error, Tesla.Env.t()}
def slides_presentations_pages_get_thumbnail(
connection,
presentation_id,
page_object_id,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:"thumbnailProperties.mimeType" => :query,
:"thumbnailProperties.thumbnailSize" => :query
    }

    request =
Request.new()
|> Request.method(:get)
|> Request.url("/v1/presentations/{presentationId}/pages/{pageObjectId}/thumbnail", %{
"presentationId" => URI.encode(presentation_id, &URI.char_unreserved?/1),
"pageObjectId" => URI.encode(page_object_id, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
      |> Request.library_version(@library_version)

    connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.Slides.V1.Model.Thumbnail{}])
end
end

# =============================================================================
# File: bracket-push/bracket_push_test.exs
# Repo: ravanscafi/exercism-elixir (MIT), commit 0f5c8c923166a0a795c323c7e2d6ccc9da572fcf
# =============================================================================

if !System.get_env("EXERCISM_TEST_EXAMPLES") do
  Code.load_file("bracket_push.exs", __DIR__)
end

ExUnit.start()

# ExUnit.configure exclude: :pending, trace: true

defmodule BracketPushTest do
  use ExUnit.Case

  # @tag :pending
  test "empty string" do
    assert BracketPush.check_brackets("")
  end

  # @tag :pending
  test "appropriate bracketing in a set of brackets" do
    assert BracketPush.check_brackets("{}")
  end

  # @tag :pending
  test "unclosed brackets" do
    refute BracketPush.check_brackets("{{")
  end

  # @tag :pending
  test "more than one pair of brackets" do
    assert BracketPush.check_brackets("{}[]")
  end

  @tag :pending
  test "brackets are out of order" do
    refute BracketPush.check_brackets("}{")
  end

  @tag :pending
  test "nested brackets" do
    assert BracketPush.check_brackets("{[()]}")
  end

  @tag :pending
  test "unbalanced nested brackets" do
    refute BracketPush.check_brackets("{[}]")
  end

  @tag :pending
  test "bracket closure with deeper nesting" do
    refute BracketPush.check_brackets("{[)][]}")
  end

  @tag :pending
  test "bracket closure in a long string of brackets" do
    assert BracketPush.check_brackets("{[]([()])}")
  end

  @tag :pending
  test "should ignore non-bracket characters" do
    assert BracketPush.check_brackets("{hello[]([a()])b}c")
  end

  @tag :pending
  test "string with newlines" do
    assert BracketPush.check_brackets("[]\n{()}\n[(({}))]\n")
  end
end

# =============================================================================
# File: apps/quickstart_web/test/quickstart_web/views/error_view_test.exs
# Repo: jdcumpson/elixir-phoenix-ssr-redux-postgres (MIT), commit 20ecf8bac1322be273ed6441bb586b8574998286
# =============================================================================

defmodule QuickstartWeb.ErrorViewTest do
  use QuickstartWeb.ConnCase, async: true

  # Bring render/3 and render_to_string/3 for testing custom views
  import Phoenix.View

  test "renders 404.html" do
    assert render_to_string(QuickstartWeb.ErrorView, "404.html", []) == "Not Found"
  end

  test "renders 500.html" do
    assert render_to_string(QuickstartWeb.ErrorView, "500.html", []) == "Internal Server Error"
  end
end

# =============================================================================
# File: priv/repo/migrations/20180903223853_create_inquries.exs
# Repo: zephraph/readtome (MIT), commit 64a5f773bdc3c19d9c5ac50a04aa14e446e36c55
# =============================================================================

defmodule Readtome.Repo.Migrations.Createinquiries do
  use Ecto.Migration

  def change do
    create table(:inquiries) do
      add :type, :string
      add :user_id, references(:users, on_delete: :nothing)
      add :book_instance_id, references(:book_instances, on_delete: :nothing)

      timestamps()
    end

    create index(:inquiries, [:user_id])
    create index(:inquiries, [:book_instance_id])
  end
end