hexsha stringlengths 40 40 | size int64 2 991k | ext stringclasses 2 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 4 208 | max_stars_repo_name stringlengths 6 106 | max_stars_repo_head_hexsha stringlengths 40 40 | max_stars_repo_licenses list | max_stars_count int64 1 33.5k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 4 208 | max_issues_repo_name stringlengths 6 106 | max_issues_repo_head_hexsha stringlengths 40 40 | max_issues_repo_licenses list | max_issues_count int64 1 16.3k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 4 208 | max_forks_repo_name stringlengths 6 106 | max_forks_repo_head_hexsha stringlengths 40 40 | max_forks_repo_licenses list | max_forks_count int64 1 6.91k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 2 991k | avg_line_length float64 1 36k | max_line_length int64 1 977k | alphanum_fraction float64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1cb75919574df513c1a59e77c28f52b33ba4f73b | 1,036 | exs | Elixir | mix.exs | PillarTechnology/smartcitiesdata | 9420a26820e38267513cd1bfa82c7f5583222bb1 | [
"Apache-2.0"
] | null | null | null | mix.exs | PillarTechnology/smartcitiesdata | 9420a26820e38267513cd1bfa82c7f5583222bb1 | [
"Apache-2.0"
] | null | null | null | mix.exs | PillarTechnology/smartcitiesdata | 9420a26820e38267513cd1bfa82c7f5583222bb1 | [
"Apache-2.0"
] | null | null | null | defmodule Smartcitiesdata.MixProject do
use Mix.Project
def project do
[
apps_path: "apps",
start_permanent: Mix.env() == :prod,
deps: deps(),
aliases: aliases(),
docs: docs(),
description: description()
]
end
defp deps, do: []
defp aliases do
[
test: "cmd mix test --color",
"test.e2e": "cmd --app e2e mix test.integration --color --include e2e",
sobelow: "cmd --app andi mix sobelow -i Config.HTTPS --skip --compact --exit low"
]
end
defp description, do: "A data ingestion and processing platform for the next generation."
defp docs() do
[
main: "readme",
source_url: "https://github.com/smartcitiesdata/smartcitiesdata.git",
extras: [
"README.md",
"apps/andi/README.md",
"apps/reaper/README.md",
"apps/valkyrie/README.md",
"apps/odo/README.md",
"apps/discovery_streams/README.md",
"apps/forklift/README.md",
"apps/flair/README.md"
]
]
end
end
| 23.545455 | 91 | 0.587838 |
1cb781a4d8c1e96ea29dd7bdd9b8dd7dd5feb7c5 | 67 | ex | Elixir | lib/onsor_web/views/admin/material_view.ex | ashkan18/onsor | 73b75b24f638f1a425de8ebf4454df971040e9f2 | [
"MIT"
] | null | null | null | lib/onsor_web/views/admin/material_view.ex | ashkan18/onsor | 73b75b24f638f1a425de8ebf4454df971040e9f2 | [
"MIT"
] | 4 | 2021-03-09T00:47:04.000Z | 2022-02-10T15:15:28.000Z | lib/onsor_web/views/admin/material_view.ex | ashkan18/onsor | 73b75b24f638f1a425de8ebf4454df971040e9f2 | [
"MIT"
] | null | null | null | defmodule OnsorWeb.Admin.MaterialView do
use OnsorWeb, :view
end
| 16.75 | 40 | 0.80597 |
1cb7ac20e8f6c5c34e7a3cba91a86c5cb76c6944 | 3,220 | ex | Elixir | clients/speech/lib/google_api/speech/v1/model/operation.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | 1 | 2018-12-03T23:43:10.000Z | 2018-12-03T23:43:10.000Z | clients/speech/lib/google_api/speech/v1/model/operation.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | clients/speech/lib/google_api/speech/v1/model/operation.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.Speech.V1.Model.Operation do
@moduledoc """
This resource represents a long-running operation that is the result of a
network API call.
## Attributes
* `done` (*type:* `boolean()`, *default:* `nil`) - If the value is `false`, it means the operation is still in progress.
If `true`, the operation is completed, and either `error` or `response` is
available.
* `error` (*type:* `GoogleApi.Speech.V1.Model.Status.t`, *default:* `nil`) - The error result of the operation in case of failure or cancellation.
* `metadata` (*type:* `map()`, *default:* `nil`) - Service-specific metadata associated with the operation. It typically
contains progress information and common metadata such as create time.
Some services might not provide such metadata. Any method that returns a
long-running operation should document the metadata type, if any.
* `name` (*type:* `String.t`, *default:* `nil`) - The server-assigned name, which is only unique within the same service that
originally returns it. If you use the default HTTP mapping, the
`name` should be a resource name ending with `operations/{unique_id}`.
* `response` (*type:* `map()`, *default:* `nil`) - The normal response of the operation in case of success. If the original
method returns no data on success, such as `Delete`, the response is
`google.protobuf.Empty`. If the original method is standard
`Get`/`Create`/`Update`, the response should be the resource. For other
methods, the response should have the type `XxxResponse`, where `Xxx`
is the original method name. For example, if the original method name
is `TakeSnapshot()`, the inferred response type is
`TakeSnapshotResponse`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:done => boolean(),
:error => GoogleApi.Speech.V1.Model.Status.t(),
:metadata => map(),
:name => String.t(),
:response => map()
}
field(:done)
field(:error, as: GoogleApi.Speech.V1.Model.Status)
field(:metadata, type: :map)
field(:name)
field(:response, type: :map)
end
defimpl Poison.Decoder, for: GoogleApi.Speech.V1.Model.Operation do
def decode(value, options) do
GoogleApi.Speech.V1.Model.Operation.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Speech.V1.Model.Operation do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 43.513514 | 150 | 0.700311 |
1cb7c4d8da0502fc538ceff34f6a0892936f7117 | 7,111 | exs | Elixir | test/transaction_client_invite_test.exs | BrendanBall/elixir-sippet | 877edcbbc8d8ba5b6b41684c20041510c410aad3 | [
"BSD-3-Clause"
] | 54 | 2017-04-26T03:15:56.000Z | 2022-02-08T00:22:11.000Z | test/transaction_client_invite_test.exs | BrendanBall/elixir-sippet | 877edcbbc8d8ba5b6b41684c20041510c410aad3 | [
"BSD-3-Clause"
] | 21 | 2017-06-19T08:00:33.000Z | 2022-01-19T10:38:11.000Z | test/transaction_client_invite_test.exs | BrendanBall/elixir-sippet | 877edcbbc8d8ba5b6b41684c20041510c410aad3 | [
"BSD-3-Clause"
] | 22 | 2017-06-19T08:15:34.000Z | 2022-03-22T13:56:20.000Z | defmodule Sippet.Transactions.Client.Invite.Test do
use ExUnit.Case, async: false
alias Sippet.Message
alias Sippet.Transactions.Client
alias Sippet.Transactions.Client.State
alias Sippet.Transactions.Client.Invite
import Mock
defmacro action_timeout(actions, delay) do
quote do
unquote(actions)
|> Enum.count(fn x ->
interval = unquote(delay)
case x do
{:state_timeout, ^interval, _data} -> true
_otherwise -> false
end
end)
end
end
setup do
request =
"""
INVITE sip:bob@biloxi.com SIP/2.0
Via: SIP/2.0/UDP pc33.atlanta.com;branch=z9hG4bK776asdhds
Max-Forwards: 70
To: Bob <sip:bob@biloxi.com>
From: Alice <sip:alice@atlanta.com>;tag=1928301774
Call-ID: a84b4c76e66710@pc33.atlanta.com
CSeq: 314159 INVITE
Contact: <sip:alice@pc33.atlanta.com>
"""
|> Message.parse!()
transaction = Client.Key.new(request)
state = State.new(request, transaction, :sippet)
{:ok, %{request: request, transaction: transaction, state: state}}
end
test "client transaction data", %{transaction: transaction} do
assert transaction.branch == "z9hG4bK776asdhds"
assert transaction.method == :invite
end
test "client invite calling state",
%{request: request, transaction: transaction, state: state} do
# test if the retry timer has been started for unreliable transports, and
# if the received request is sent to the core
with_mocks [
{Sippet.Router, [],
[
send_transport_message: fn _, _, _ -> :ok end
]},
{Sippet, [],
[
reliable?: fn _, _ -> false end
]}
] do
{:keep_state_and_data, actions} = Invite.calling(:enter, :none, state)
assert action_timeout(actions, 600)
assert called(Sippet.reliable?(:sippet, request))
assert called(Sippet.Router.send_transport_message(:sippet, request, transaction))
end
# test if the timeout timer has been started for reliable transports
with_mocks [
{Sippet.Router, [],
[
send_transport_message: fn _, _, _ -> :ok end
]},
{Sippet, [],
[
reliable?: fn _, _ -> true end
]}
] do
{:keep_state_and_data, actions} = Invite.calling(:enter, :none, state)
assert action_timeout(actions, 64 * 600)
assert called(Sippet.reliable?(:sippet, request))
assert called(Sippet.Router.send_transport_message(:sippet, request, transaction))
end
# test timer expiration for unreliable transports
with_mocks [
{Sippet.Router, [],
[
send_transport_message: fn _, _, _ -> :ok end
]},
{Sippet, [],
[
reliable?: fn _, _ -> false end
]}
] do
{:keep_state_and_data, actions} = Invite.calling(:state_timeout, {1200, 1200}, state)
assert action_timeout(actions, 2400)
assert called(Sippet.Router.send_transport_message(:sippet, request, transaction))
end
# test timeout and errors
with_mock Sippet.Router, to_core: fn _, _, _ -> :ok end do
{:stop, :shutdown, _data} = Invite.calling(:state_timeout, {6000, 64 * 600}, state)
{:stop, :shutdown, _data} = Invite.calling(:cast, {:error, :uh_oh}, state)
end
## test state transitions that depend on the reception of responses with
## different status codes
with_mock Sippet.Router, to_core: fn _, _, _ -> :ok end do
response = Message.to_response(request, 100)
{:next_state, :proceeding, _data} =
Invite.calling(:cast, {:incoming_response, response}, state)
response = Message.to_response(request, 200)
{:stop, :normal, _data} = Invite.calling(:cast, {:incoming_response, response}, state)
response = Message.to_response(request, 400)
{:next_state, :completed, _data} =
Invite.calling(:cast, {:incoming_response, response}, state)
end
end
test "client invite proceeding state",
%{request: request, state: state} do
# check state transitions depending on the received responses
with_mock Sippet.Router, to_core: fn _, _, _ -> :ok end do
:keep_state_and_data = Invite.proceeding(:enter, :calling, state)
response = Message.to_response(request, 180)
{:keep_state, _data} = Invite.proceeding(:cast, {:incoming_response, response}, state)
assert called(Sippet.Router.to_core(:sippet, :receive_response, [response, :_]))
response = Message.to_response(request, 200)
{:stop, :normal, _data} = Invite.proceeding(:cast, {:incoming_response, response}, state)
assert called(Sippet.Router.to_core(:sippet, :receive_response, [response, :_]))
response = Message.to_response(request, 400)
{:next_state, :completed, _data} =
Invite.proceeding(:cast, {:incoming_response, response}, state)
assert called(Sippet.Router.to_core(:sippet, :receive_response, [response, :_]))
end
# this is not part of the standard, but may occur in exceptional cases
with_mock Sippet.Router, to_core: fn _, _, _ -> :ok end do
{:stop, :shutdown, _data} = Invite.proceeding(:cast, {:error, :uh_oh}, state)
assert called(Sippet.Router.to_core(:sippet, :receive_error, :_))
end
end
test "client invite completed state",
%{request: request, transaction: transaction, state: state} do
# test the ACK request creation
with_mocks [
{Sippet.Router, [],
[
send_transport_message: fn _, _, _ -> :ok end
]},
{Sippet, [],
[
reliable?: fn _, _ -> false end
]}
] do
last_response = Message.to_response(request, 400)
%{extras: extras} = state
extras = extras |> Map.put(:last_response, last_response)
{:keep_state, data, actions} =
Invite.completed(:enter, :proceeding, %{state | extras: extras})
assert action_timeout(actions, 32000)
%{extras: %{ack: ack}} = data
assert :ack == ack.start_line.method
assert :ack == ack.headers.cseq |> elem(1)
# ACK is retransmitted case another response comes in
:keep_state_and_data = Invite.completed(:cast, {:incoming_response, last_response}, data)
assert called(Sippet.Router.send_transport_message(:sippet, ack, transaction))
end
# reliable transports don't keep the completed state
with_mocks [
{Sippet.Router, [],
[
send_transport_message: fn _, _, _ -> :ok end
]},
{Sippet, [],
[
reliable?: fn _, _ -> true end
]}
] do
last_response = Message.to_response(request, 400)
%{extras: extras} = state
extras = extras |> Map.put(:last_response, last_response)
{:stop, :normal, data} = Invite.completed(:enter, :proceeding, %{state | extras: extras})
%{extras: %{ack: ack}} = data
assert :ack == ack.start_line.method
assert :ack == ack.headers.cseq |> elem(1)
end
# check state completion after timer D
{:stop, :normal, nil} = Invite.completed(:state_timeout, nil, nil)
end
end
| 32.176471 | 95 | 0.634651 |
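The `action_timeout/2` macro at the top of the test above simply counts `:gen_statem` actions of the form `{:state_timeout, interval, data}` whose interval matches the expected delay. A minimal illustration with a hypothetical action list:

```elixir
actions = [
  {:state_timeout, 600, %{}},    # counted: state timeout with interval 600
  {:next_event, :cast, :retry},  # ignored: not a state timeout
  {:state_timeout, 1_200, %{}}   # ignored: different interval
]

# With the macro in scope, action_timeout(actions, 600) evaluates to 1,
# which is truthy, so `assert action_timeout(actions, 600)` passes.
```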
1cb7e3019e01525af2824868e6a64e6d2e7c3484 | 7,880 | ex | Elixir | apps/site/lib/site_web/views/partial_view.ex | thecristen/dotcom | b8dce8683c1a58e631dff42dd3b012786ddccadb | [
"MIT"
] | null | null | null | apps/site/lib/site_web/views/partial_view.ex | thecristen/dotcom | b8dce8683c1a58e631dff42dd3b012786ddccadb | [
"MIT"
] | 65 | 2021-05-06T18:38:33.000Z | 2022-03-28T20:50:04.000Z | apps/site/lib/site_web/views/partial_view.ex | thecristen/dotcom | b8dce8683c1a58e631dff42dd3b012786ddccadb | [
"MIT"
] | null | null | null | defmodule SiteWeb.PartialView do
@moduledoc """
Handles rendering of partial components and CMS content.
"""
use SiteWeb, :view
alias CMS.{Partial.Teaser, Repo}
alias Plug.Conn
alias Routes.Route
alias SiteWeb.CMS.TeaserView
alias SiteWeb.PartialView.SvgIconWithCircle
import SiteWeb.CMSView, only: [file_description: 1]
import SiteWeb.CMS.ParagraphView, only: [render_paragraph: 2]
import SiteWeb.CMS.TeaserView, only: [transit_tag: 1, handle_fields: 2]
defdelegate fa_icon_for_file_type(mime), to: Site.FontAwesomeHelpers
@spec clear_selector_link(map()) :: Phoenix.HTML.Safe.t()
def clear_selector_link(%{clearable?: true, selected: selected} = assigns)
when not is_nil(selected) do
link to: update_url(assigns.conn, [{assigns.query_key, nil}]) do
[
"(clear",
content_tag(:span, [" ", assigns.placeholder_text], class: "sr-only"),
")"
]
end
end
def clear_selector_link(_assigns) do
""
end
@doc """
Returns the suffix to be shown in the stop selector.
"""
@spec stop_selector_suffix(Conn.t(), Stops.Stop.id_t(), String.t() | nil) :: iodata
def stop_selector_suffix(conn, stop_id, text \\ "")
def stop_selector_suffix(%Conn{assigns: %{route: %Route{type: 2}}} = conn, stop_id, text) do
if zone = conn.assigns.zone_map[stop_id] do
["Zone ", zone]
else
text || ""
end
end
def stop_selector_suffix(
%Conn{assigns: %{route: %Route{id: "Green"}, stops_on_routes: stops}},
stop_id,
text
) do
GreenLine.branch_ids()
|> Enum.flat_map(&(stop_id |> GreenLine.stop_on_route?(&1, stops) |> green_branch_name(&1)))
|> Enum.intersperse(",")
|> green_line_stop_selector_suffix(text)
end
def stop_selector_suffix(_conn, _stop_id, text) do
text || ""
end
@spec green_line_stop_selector_suffix(iodata, String.t() | nil) :: String.t() | iodata
defp green_line_stop_selector_suffix([], nil), do: ""
defp green_line_stop_selector_suffix([], <<text::binary>>), do: text
defp green_line_stop_selector_suffix(iodata, _), do: iodata
@spec green_branch_name(boolean, Route.id_t()) :: [String.t() | nil]
defp green_branch_name(stop_on_green_line?, route_id)
defp green_branch_name(true, route_id), do: [display_branch_name(route_id)]
defp green_branch_name(false, _), do: []
@doc """
Pulls out the branch name of a Green Line route ID.
"""
@spec display_branch_name(Route.id_t()) :: String.t() | nil
def display_branch_name(<<"Green-", branch::binary>>), do: branch
def display_branch_name(_), do: nil
@doc """
Abstraction layer for displaying a list of teasers outside of the CMS app.
List type is determined by the first teaser's :type value. Avoid creating
a single list with multiple, mixed content types.
"""
@spec render_teasers([Teaser.t()], Conn.t(), Keyword.t()) :: Phoenix.HTML.safe()
def render_teasers(teasers, conn, opts \\ [])
def render_teasers([], _, _), do: {:safe, []}
def render_teasers(teasers, conn, opts) do
display_fields =
teasers
|> List.first()
|> Map.get(:type)
|> handle_fields(opts[:fields])
TeaserView.render(
"_teaser_list.html",
teasers: teasers,
fields: display_fields,
list_class: Keyword.get(opts, :list_class, ""),
conn: conn
)
end
@doc """
Renders homepage news entries. Takes two options:
size: :large | :small
class: class to apply to the link
"""
@spec news_entry(Teaser.t(), Keyword.t()) :: Phoenix.HTML.Safe.t()
def news_entry(entry, opts \\ []) do
size = Keyword.get(opts, :size, :small)
color = transit_tag(entry)
opts = [{:color, color} | opts]
link(
[
news_date(entry, size),
content_tag(
:div,
raw(entry.title),
class: "c-news-entry__title c-news-entry__title--#{size}"
)
],
to: news_path(entry, SiteWeb.Endpoint),
class: news_entry_class(opts),
id: entry.id
)
end
@spec news_path(Teaser.t(), module | Conn.t()) :: String.t()
defp news_path(%Teaser{path: url}, conn), do: cms_static_page_path(conn, url)
@spec news_date(Teaser.t(), :large | :small) :: Phoenix.HTML.Safe.t()
defp news_date(%Teaser{date: date}, size), do: do_news_date(date, size)
@spec do_news_date(Date.t(), :large | :small) :: Phoenix.HTML.Safe.t()
defp do_news_date(date, size) do
content_tag(
:div,
[
content_tag(
:span,
Timex.format!(date, "{Mshort}"),
class: "c-news-entry__month c-news-entry__month--#{size}"
),
content_tag(
:span,
Timex.format!(date, "{0D}"),
class: "c-news-entry__day--#{size}"
)
],
class: "c-news-entry__date c-news-entry__date--#{size} u-small-caps"
)
end
@spec news_entry_class(Keyword.t()) :: String.t()
defp news_entry_class(opts) do
size = Keyword.get(opts, :size, :small)
class = Keyword.get(opts, :class, "")
color = Keyword.get(opts, :color)
Enum.join(
[
"c-news-entry",
"c-news-entry--#{size}",
"c-news-entry--#{color}",
class
],
" "
)
end
@doc """
Renders a specific CMS paragraph, provided an alias to it and the conn.
"""
@spec paragraph(String.t(), Conn.t()) :: Phoenix.HTML.Safe.t()
def paragraph(path, conn) do
path
|> Repo.get_paragraph(conn.query_params)
|> render_paragraph(conn)
end
@doc """
Renders the All/Current/Planned filters for an alerts list.
The second argument is a keyword list that configures the link generated
for each filter: `:method` names the path helper to call, `:item` is the
mode or route passed to it, and `:params` holds extra query parameters.
"""
@spec alert_time_filters(atom, Keyword.t()) :: Phoenix.HTML.Safe.t()
def alert_time_filters(current_timeframe, path_opts) do
[
content_tag(:h3, "Filter by type", class: "h3"),
content_tag(
:div,
Enum.map([nil, :current, :upcoming], &time_filter(&1, current_timeframe, path_opts)),
class: "m-alerts__time-filters"
)
]
end
@spec time_filter(atom, atom, Keyword.t()) :: Phoenix.HTML.Safe.t()
defp time_filter(filter, current_timeframe, path_opts) do
path_method = Keyword.fetch!(path_opts, :method)
# item can be an atom (representing a mode) or a Route
item = Keyword.fetch!(path_opts, :item) |> get_item_value()
path_params =
path_opts
|> Keyword.get(:params, [])
|> time_filter_params(filter)
path =
apply(SiteWeb.Router.Helpers, path_method, [SiteWeb.Endpoint, :show, item, path_params])
filter
|> time_filter_text()
|> link(
class: time_filter_class(filter, current_timeframe),
to: path
)
end
@spec get_item_value(Route.t() | atom) :: Route.id_t() | atom
defp get_item_value(route_or_mode) when is_atom(route_or_mode), do: route_or_mode
defp get_item_value(route_or_mode), do: route_or_mode.id
@spec time_filter_text(atom) :: String.t()
defp time_filter_text(nil), do: "All Alerts"
defp time_filter_text(:current), do: "Current Alerts"
defp time_filter_text(:upcoming), do: "Planned Service Alerts"
@spec time_filter_params(Keyword.t(), atom) :: Keyword.t()
defp time_filter_params(params, nil), do: params
defp time_filter_params(params, timeframe)
when timeframe in [:current, :upcoming],
do: Keyword.put(params, :alerts_timeframe, timeframe)
@spec time_filter_class(atom, atom) :: [String.t()]
defp time_filter_class(filter, current_timeframe) do
["m-alerts__time-filter" | time_filter_selected_class(filter, current_timeframe)]
end
@spec time_filter_selected_class(atom, atom) :: [String.t()]
defp time_filter_selected_class(filter, filter) do
[" ", "m-alerts__time-filter--selected"]
end
defp time_filter_selected_class(_, _) do
[]
end
end
| 30.661479 | 96 | 0.650888 |
1cb843ba07f85b846f0cb580352bb510a9c1423b | 11,401 | exs | Elixir | lib/mix/test/mix/tasks/escript_test.exs | spencerdcarlson/elixir | 23d75ecdf58df80969e12f4420282238e19219a1 | [
"Apache-2.0"
] | 1 | 2021-12-16T20:32:28.000Z | 2021-12-16T20:32:28.000Z | lib/mix/test/mix/tasks/escript_test.exs | spencerdcarlson/elixir | 23d75ecdf58df80969e12f4420282238e19219a1 | [
"Apache-2.0"
] | 1 | 2020-09-14T16:23:33.000Z | 2021-03-25T17:38:59.000Z | lib/mix/test/mix/tasks/escript_test.exs | spencerdcarlson/elixir | 23d75ecdf58df80969e12f4420282238e19219a1 | [
"Apache-2.0"
] | 1 | 2020-11-25T02:22:55.000Z | 2020-11-25T02:22:55.000Z | Code.require_file("../../test_helper.exs", __DIR__)
defmodule Mix.Tasks.EscriptTest do
use MixTest.Case
defmodule Escript do
def project do
[
app: :escript_test,
version: "0.0.1",
escript: [main_module: EscriptTest, name: "escript_test", embed_elixir: true]
]
end
end
defmodule EscriptWithDebugInfo do
def project do
[
app: :escript_test_with_debug_info,
version: "0.0.1",
escript: [main_module: EscriptTest, strip_beams: false]
]
end
end
defmodule EscriptWithPath do
def project do
[
app: :escript_test_with_path,
version: "0.0.1",
escript: [
app: nil,
embed_elixir: true,
main_module: EscriptTest,
path: Path.join("ebin", "escript_test_with_path")
]
]
end
end
defmodule EscriptWithDeps do
def project do
[
app: :escript_test_with_deps,
version: "0.0.1",
escript: [main_module: EscriptTest],
deps: [{:ok, path: fixture_path("deps_status/deps/ok")}]
]
end
end
defmodule EscriptErlangWithDeps do
def project do
[
app: :escript_test_erlang_with_deps,
version: "0.0.1",
language: :erlang,
escript: [main_module: :escript_test],
deps: [{:ok, path: fixture_path("deps_status/deps/ok")}]
]
end
def application do
[applications: [], extra_applications: [:crypto, elixir: :optional]]
end
end
defmodule EscriptErlangMainModule do
def project do
[
app: :escript_test_erlang_main_module,
version: "0.0.1",
escript: [main_module: :escript_test]
]
end
end
defmodule EscriptWithUnknownMainModule do
def project do
[
app: :escript_test_with_unknown_main_module,
version: "0.0.1",
escript: [main_module: BogusEscriptTest]
]
end
end
defmodule EscriptConsolidated do
def project do
[
app: :escript_test_consolidated,
build_embedded: true,
version: "0.0.1",
escript: [main_module: EscriptTest]
]
end
end
test "generate escript" do
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
assert_received {:mix_shell, :info, ["Generated escript escript_test with MIX_ENV=dev"]}
assert System.cmd("escript", ["escript_test"]) == {"TEST\n", 0}
assert count_abstract_code("escript_test") == 0
end)
end
test "generate escript with --no-compile option" do
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
Mix.Tasks.Compile.run([])
purge([EscriptTest])
Mix.Tasks.Escript.Build.run(["--no-compile"])
assert_received {:mix_shell, :info, ["Generated escript escript_test with MIX_ENV=dev"]}
end)
end
test "generate escript with compile config" do
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
File.mkdir_p!("config")
File.write!("config/config.exs", ~S"""
import Config
config :foobar, :value, "COMPILE #{config_env()} TARGET #{config_target()}"
""")
Mix.Tasks.Escript.Build.run([])
assert_received {:mix_shell, :info, ["Generated escript escript_test with MIX_ENV=dev"]}
assert System.cmd("escript", ["escript_test"]) == {"COMPILE dev TARGET host\n", 0}
assert count_abstract_code("escript_test") == 0
end)
end
test "generate escript with runtime config" do
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
File.mkdir_p!("config")
File.write!("config/config.exs", """
[foobar: [value: "OLD", other: %{}]]
""")
File.write!("config/runtime.exs", ~S"""
import Config
config :foobar, :value, "RUNTIME #{config_env()} TARGET #{config_target()}"
""")
Mix.Tasks.Escript.Build.run([])
assert_received {:mix_shell, :info, ["Generated escript escript_test with MIX_ENV=dev"]}
assert System.cmd("escript", ["escript_test"]) == {"RUNTIME dev TARGET host\n", 0}
assert count_abstract_code("escript_test") == 0
end)
end
test "generate escript with debug information" do
Mix.Project.push(EscriptWithDebugInfo)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
msg = "Generated escript escript_test_with_debug_info with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^msg]}
assert System.cmd("escript", ["escript_test_with_debug_info"]) == {"TEST\n", 0}
assert count_abstract_code("escript_test_with_debug_info") > 0
end)
end
defp count_abstract_code(escript_filename) do
escript_filename
|> read_beams()
|> Enum.count(fn {_, beam} -> get_abstract_code(beam) end)
end
defp read_beams(escript_filename) do
# :zip.unzip/2 cannot unzip an escript unless we remove the escript header
zip_data = remove_escript_header(File.read!(escript_filename))
{:ok, tuples} = :zip.unzip(zip_data, [:memory])
for {filename, beam} <- tuples, Path.extname(filename) == ".beam" do
{filename, beam}
end
end
defp remove_escript_header(escript_data) do
{offset, _length} = :binary.match(escript_data, "\nPK")
zip_start = offset + 1
binary_part(escript_data, zip_start, byte_size(escript_data) - zip_start)
end
defp get_abstract_code(beam) do
case :beam_lib.chunks(beam, [:abstract_code]) do
{:ok, {_, [{:abstract_code, {_, abstract_code}}]}} -> abstract_code
_ -> nil
end
end
test "generate escript with path" do
Mix.Project.push(EscriptWithPath)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
message = "Generated escript ebin/escript_test_with_path with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^message]}
assert System.cmd("escript", ["ebin/escript_test_with_path"]) == {"TEST\n", 0}
end)
end
test "generate escript with deps" do
Mix.Project.push(EscriptWithDeps)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
message = "Generated escript escript_test_with_deps with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^message]}
assert System.cmd("escript", ["escript_test_with_deps"]) == {"TEST\n", 0}
end)
after
purge([Ok.MixProject])
end
test "generate escript with Erlang and deps" do
Mix.Project.push(EscriptErlangWithDeps)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
message = "Generated escript escript_test_erlang_with_deps with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^message]}
assert System.cmd("escript", ["escript_test_erlang_with_deps", "arg1", "arg2"]) ==
{~s(["arg1","arg2"]), 0}
end)
after
purge([Ok.MixProject])
end
test "generate escript with Erlang main module" do
Mix.Project.push(EscriptErlangMainModule)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
message = "Generated escript escript_test_erlang_main_module with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^message]}
assert System.cmd("escript", ["escript_test_erlang_main_module", "arg1", "arg2"]) ==
{~s([<<"arg1">>,<<"arg2">>]), 0}
end)
after
purge([Ok.MixProject])
end
test "generate escript with consolidated protocols" do
Mix.Project.push(EscriptConsolidated)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Build.run([])
message = "Generated escript escript_test_consolidated with MIX_ENV=dev"
assert_received {:mix_shell, :info, [^message]}
assert System.cmd("escript", ["escript_test_consolidated", "Enumerable"]) == {"true\n", 0}
end)
end
test "escript install and uninstall" do
File.rm_rf!(tmp_path(".mix/escripts"))
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
# check that no escripts are installed
Mix.Tasks.Escript.run([])
assert_received {:mix_shell, :info, ["No escripts currently installed."]}
# build and install our escript
send(self(), {:mix_shell_input, :yes?, true})
Mix.Tasks.Escript.Install.run([])
# check that it shows in the list
Mix.Tasks.Escript.run([])
assert_received {:mix_shell, :info, ["* escript_test"]}
refute_received {:mix_shell, :info, ["* escript_test.bat"]}
# check uninstall confirmation
send(self(), {:mix_shell_input, :yes?, false})
Mix.Tasks.Escript.Uninstall.run(["escript_test"])
assert File.regular?(tmp_path(".mix/escripts/escript_test"))
# uninstall the escript
send(self(), {:mix_shell_input, :yes?, true})
Mix.Tasks.Escript.Uninstall.run(["escript_test"])
refute File.regular?(tmp_path(".mix/escripts/escript_test"))
refute File.regular?(tmp_path(".mix/escripts/escript_test.bat"))
# check that no escripts remain
Mix.Tasks.Escript.run([])
assert_received {:mix_shell, :info, ["No escripts currently installed."]}
end)
end
test "escript install and uninstall --force" do
File.rm_rf!(tmp_path(".mix/escripts"))
Mix.Project.push(Escript)
in_fixture("escript_test", fn ->
Mix.Tasks.Escript.Install.run(["--force"])
# check that it shows in the list
Mix.Tasks.Escript.run([])
assert_received {:mix_shell, :info, ["* escript_test"]}
refute_received {:mix_shell, :info, ["* escript_test.bat"]}
# uninstall the escript
Mix.Tasks.Escript.Uninstall.run(["escript_test", "--force"])
# check that no escripts remain
Mix.Tasks.Escript.run([])
assert_received {:mix_shell, :info, ["No escripts currently installed."]}
end)
end
test "escript invalid install" do
# Install our escript
send(self(), {:mix_shell_input, :yes?, true})
message = "The given path does not point to an escript, installation aborted"
assert_raise Mix.Error, message, fn ->
Mix.Tasks.Escript.Install.run([__ENV__.file])
end
end
test "escript invalid main module" do
Mix.Project.push(EscriptWithUnknownMainModule)
in_fixture("escript_test", fn ->
message =
"Could not generate escript, module Elixir.BogusEscriptTest defined as :main_module could not be loaded"
assert_raise Mix.Error, message, fn ->
Mix.Tasks.Escript.Build.run([])
end
end)
end
test "escript.install from Git" do
in_fixture("git_repo", fn ->
File.write!("lib/git_repo.ex", """
defmodule GitRepo do
def main(_argv) do
IO.puts("TEST")
end
end
""")
File.write!("mix.exs", """
defmodule GitRepo.MixProject do
use Mix.Project
def project do
[app: :git_repo, version: "0.1.0", escript: [main_module: GitRepo]]
end
end
""")
System.cmd("git", ~w[add .])
System.cmd("git", ~w[commit -m "ok"])
send(self(), {:mix_shell_input, :yes?, true})
Mix.Tasks.Escript.Install.run(["git", File.cwd!()])
assert_received {:mix_shell, :info, ["Generated escript git_repo with MIX_ENV=prod"]}
escript_path = Path.join([tmp_path(".mix"), "escripts", "git_repo"])
assert System.cmd("escript", [escript_path]) == {"TEST\n", 0}
end)
after
purge([GitRepo, GitRepo.MixProject])
end
end
| 29.233333 | 112 | 0.639154 |
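The `remove_escript_header/1` helper in the test above exploits the escript layout: a plain-text header (shebang plus emulator flags) followed by a zip archive, whose local-file-header magic bytes are `PK`. A standalone sketch of the same trick using a fabricated binary (real escript headers are longer, and the match assumes `"\nPK"` does not occur earlier in the header):

```elixir
# Fabricated escript-like binary: text header, then zip bytes.
escript_data = "#!/usr/bin/env escript\n%% header lines\nPK\x03\x04(zip bytes...)"

# Find the newline immediately preceding the zip magic, then slice it off.
{offset, _length} = :binary.match(escript_data, "\nPK")
zip_start = offset + 1
zip_data = binary_part(escript_data, zip_start, byte_size(escript_data) - zip_start)

# zip_data now starts with "PK", so :zip.unzip/2 can parse it in memory.
```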
1cb850c069eee1c8d36fafd966ba213d8c692622 | 2,494 | exs | Elixir | benchmarks/hs_benchmark.exs | blagh/joken | eafc407603cb28f12b1a95b11a57da89903bb3b6 | ["Apache-2.0"] | 421 | 2015-01-06T12:06:00.000Z | 2018-11-17T09:12:27.000Z | benchmarks/hs_benchmark.exs | blagh/joken | eafc407603cb28f12b1a95b11a57da89903bb3b6 | ["Apache-2.0"] | 178 | 2015-02-24T15:40:25.000Z | 2018-11-06T16:32:27.000Z | benchmarks/hs_benchmark.exs | blagh/joken | eafc407603cb28f12b1a95b11a57da89903bb3b6 | ["Apache-2.0"] | 71 | 2015-02-24T00:52:34.000Z | 2018-10-30T17:34:59.000Z |
defmodule HS256Auth do
use Joken.Config, default_key: :hs256
end
defmodule HS384Auth do
use Joken.Config, default_key: :hs384
end
defmodule HS512Auth do
use Joken.Config, default_key: :hs512
end
defmodule HS256AuthVerify do
use Joken.Config, default_key: :hs256
def token_config do
%{}
|> add_claim("name", fn -> "John Doe" end, &(&1 == "John Doe"))
|> add_claim("test", fn -> true end, &(&1 == true))
|> add_claim("age", fn -> 666 end, &(&1 > 18))
|> add_claim("simple time test", fn -> 1 end, &(Joken.current_time() > &1))
end
end
defmodule HS384AuthVerify do
use Joken.Config, default_key: :hs384
def token_config do
%{}
|> add_claim("name", fn -> "John Doe" end, &(&1 == "John Doe"))
|> add_claim("test", fn -> true end, &(&1 == true))
|> add_claim("age", fn -> 666 end, &(&1 > 18))
|> add_claim("simple time test", fn -> 1 end, &(Joken.current_time() > &1))
end
end
defmodule HS512AuthVerify do
use Joken.Config, default_key: :hs512
def token_config do
%{}
|> add_claim("name", fn -> "John Doe" end, &(&1 == "John Doe"))
|> add_claim("test", fn -> true end, &(&1 == true))
|> add_claim("age", fn -> 666 end, &(&1 > 18))
|> add_claim("simple time test", fn -> 1 end, &(Joken.current_time() > &1))
end
end
hs256_token =
"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJhZ2UiOjY2NiwibmFtZSI6IkpvaG4gRG9lIiwic2ltcGxlIHRpbWUgdGVzdCI6MSwidGVzdCI6dHJ1ZX0.AxM6-iOez0tM35N6hSxr_LWe9LC28c4MeoRvEIi4Gtw"
hs384_token =
"eyJhbGciOiJIUzM4NCIsInR5cCI6IkpXVCJ9.eyJhZ2UiOjY2NiwibmFtZSI6IkpvaG4gRG9lIiwic2ltcGxlIHRpbWUgdGVzdCI6MSwidGVzdCI6dHJ1ZX0.35wYGZk5Dzka_BMzeplo9sz0q_BDwg_C2m-_xqp-6RBVU7qyhudAwy8hFY1Dxti_"
hs512_token =
"eyJhbGciOiJIUzUxMiIsInR5cCI6IkpXVCJ9.eyJhZ2UiOjY2NiwibmFtZSI6IkpvaG4gRG9lIiwic2ltcGxlIHRpbWUgdGVzdCI6MSwidGVzdCI6dHJ1ZX0.P7Og_ODvM94PPXettTtalgiGtxwj7oBoDk_4zj08o3kRZPQCDqNy4lHanoEhY-CTS-CPbJKivelnxMGBJ-3x5A"
Benchee.run(
%{
"HS256 generate and sign" => fn -> HS256Auth.generate_and_sign() end,
"HS384 generate and sign" => fn -> HS384Auth.generate_and_sign() end,
"HS512 generate and sign" => fn -> HS512Auth.generate_and_sign() end
},
time: 5
)
Benchee.run(
%{
"HS256 verify and validate" => fn -> HS256AuthVerify.verify_and_validate(hs256_token) end,
"HS384 verify and validate" => fn -> HS384AuthVerify.verify_and_validate(hs384_token) end,
"HS512 verify and validate" => fn -> HS512AuthVerify.verify_and_validate(hs512_token) end
},
time: 5
)
| 33.253333 | 211 | 0.706496 |
1cb86d52b1ce90b9a7f5282d6e54805ee34c9947 | 302 | ex | Elixir | lib/dune/shims/atom.ex | functional-rewire/dune | 9a7d3fd182e5b29e0bfb0b2a97daba468231546c | ["MIT"] | 58 | 2021-09-19T09:06:36.000Z | 2022-02-03T11:19:38.000Z | lib/dune/shims/atom.ex | functional-rewire/dune | 9a7d3fd182e5b29e0bfb0b2a97daba468231546c | ["MIT"] | null | null | null | lib/dune/shims/atom.ex | functional-rewire/dune | 9a7d3fd182e5b29e0bfb0b2a97daba468231546c | ["MIT"] | 1 | 2021-09-26T14:56:18.000Z | 2021-09-26T14:56:18.000Z |
defmodule Dune.Shims.Atom do
@moduledoc false
alias Dune.AtomMapping
def to_string(env, atom) when is_atom(atom) do
AtomMapping.to_string(env.atom_mapping, atom)
end
def to_charlist(env, atom) when is_atom(atom) do
__MODULE__.to_string(env, atom) |> String.to_charlist()
end
end
| 21.571429 | 59 | 0.738411 |
1cb889f19d50c62b52bf1b55471e831bc82ddfbb | 1,709 | ex | Elixir | lib/livebook/notebook/cell/elixir.ex | byjpr/livebook | 350cd13853eb4c2357203b365d9f6fb59f0601f4 | ["Apache-2.0"] | null | null | null | lib/livebook/notebook/cell/elixir.ex | byjpr/livebook | 350cd13853eb4c2357203b365d9f6fb59f0601f4 | ["Apache-2.0"] | null | null | null | lib/livebook/notebook/cell/elixir.ex | byjpr/livebook | 350cd13853eb4c2357203b365d9f6fb59f0601f4 | ["Apache-2.0"] | null | null | null |
defmodule Livebook.Notebook.Cell.Elixir do
@moduledoc false
# A cell with Elixir code.
#
# It consists of text content that the user can edit
# and produces some output once evaluated.
defstruct [:id, :source, :outputs, :disable_formatting, :reevaluate_automatically]
alias Livebook.Utils
alias Livebook.Notebook.Cell
@type t :: %__MODULE__{
id: Cell.id(),
source: String.t(),
outputs: list(output()),
disable_formatting: boolean(),
reevaluate_automatically: boolean()
}
@typedoc """
For more details on output types see `t:Kino.Output.t/0`.
"""
@type output ::
:ignored
# Regular text, adjacent such outputs can be treated as a whole
| binary()
# Standalone text block
| {:text, binary()}
# Markdown content
| {:markdown, binary()}
# HTML content
| {:html, binary()}
# A raw image in the given format
| {:image, content :: binary(), mime_type :: binary()}
# Vega-Lite graphic
| {:vega_lite_static, spec :: map()}
# Vega-Lite graphic with dynamic data
| {:vega_lite_dynamic, widget_process :: pid()}
# Interactive data table
| {:table_dynamic, widget_process :: pid()}
# Internal output format for errors
| {:error, message :: binary(), type :: :other | :runtime_restart_required}
@doc """
Returns an empty cell.
"""
@spec new() :: t()
def new() do
%__MODULE__{
id: Utils.random_id(),
source: "",
outputs: [],
disable_formatting: false,
reevaluate_automatically: false
}
end
end
| 28.483333 | 85 | 0.578701 |
1cb88a05e7407c7d68a27599c36967f19d9b1591 | 1,484 | exs | Elixir | day17/task1.exs | joskov/advent2016 | cdade8f0f47bf67b9dd29f8e25f4e8351cada309 | ["MIT"] | 2 | 2016-12-26T20:44:04.000Z | 2017-02-10T19:59:55.000Z | day17/task1.exs | joskov/advent2016 | cdade8f0f47bf67b9dd29f8e25f4e8351cada309 | ["MIT"] | null | null | null | day17/task1.exs | joskov/advent2016 | cdade8f0f47bf67b9dd29f8e25f4e8351cada309 | ["MIT"] | null | null | null |
defmodule Task do
@input "qljzarfv"
def calculate do
calculate([{@input, 0, 0}], 0)
end
def calculate([], index) do
IO.puts "Dead end on index #{index}"
end
def calculate(list, index) do
IO.puts "Working on index #{index}"
list = Enum.reduce(list, [], &add_possible/2)
calculate(list, index + 1)
end
def add_possible(old, acc) do
possible(old) ++ acc
end
# Reached the vault at (3, 3): print the passcode + path, then deliberately
# raise ArithmeticError (1 / 0) to halt the breadth-first search.
def possible({input, 3, 3}) do
IO.puts(input)
1 / 0
end
def possible({input, _x, _y} = old) do
hash = :crypto.hash(:md5, input) |> Base.encode16(case: :lower) |> String.graphemes
all_moves
|> Enum.filter(&(check_move(hash, &1)))
|> Enum.map(&(complete_move(old, &1)))
|> Enum.filter(&allowed_move?/1)
end
def complete_move({input, x, y}, {c, _i, dx, dy}) do
{input <> c, x + dx, y + dy}
end
def all_moves do
[{"U", 0, 0, -1}, {"D", 1, 0, 1}, {"L", 2, -1, 0}, {"R", 3, 1, 0}]
end
def check_move(hash, {_c, i, _dx, _dy}) do
Enum.at(hash, i) |> open?
end
def allowed_move?({_i, x, _y}) when x < 0, do: false
def allowed_move?({_i, x, _y}) when x > 3, do: false
def allowed_move?({_i, _x, y}) when y < 0, do: false
def allowed_move?({_i, _x, y}) when y > 3, do: false
def allowed_move?(_), do: true
def open?("b"), do: true
def open?("c"), do: true
def open?("d"), do: true
def open?("e"), do: true
def open?("f"), do: true
def open?(_), do: false
end
result = Task.calculate
IO.inspect result
| 24.733333 | 87 | 0.576146 |
1cb893db2e23327e3055c552f258844040e7256e | 620 | ex | Elixir | lib/yaws_ex.ex | evbogdanov/yaws_ex | bc3095f5bb38661453dfe9a593b2b9a9c363e4a9 | ["BSD-3-Clause"] | 1 | 2019-09-11T00:06:47.000Z | 2019-09-11T00:06:47.000Z | lib/yaws_ex.ex | evbogdanov/yaws_ex | bc3095f5bb38661453dfe9a593b2b9a9c363e4a9 | ["BSD-3-Clause"] | null | null | null | lib/yaws_ex.ex | evbogdanov/yaws_ex | bc3095f5bb38661453dfe9a593b2b9a9c363e4a9 | ["BSD-3-Clause"] | null | null | null |
defmodule YawsEx do
use Application
alias YawsEx.Yaws
def start(_type, _args) do
import Supervisor.Spec, warn: false
# Start supervisor without children
{:ok, sup_pid} = Supervisor.start_link([], strategy: :one_for_one)
# Obtain Yaws configuration
{:ok, server_conf, global_conf, child_specs} = Yaws.get_conf()
# Start Yaws under my supervisor
child_specs
|> Enum.each(fn(child) -> Supervisor.start_child(sup_pid, child) end)
# Apply Yaws configuration
Yaws.set_conf(global_conf, server_conf)
# Don't forget to return supervisor pid
{:ok, sup_pid}
end
end
| 22.962963 | 73 | 0.698387 |
1cb89649431404ed5d0a150679a413d6d47ac44d | 87 | ex | Elixir | lib/mipha/mailer.ex | ZPVIP/mipha | a7df054f72eec7de88b60d94c501488375bdff6a | ["MIT"] | 156 | 2018-06-01T19:52:32.000Z | 2022-02-03T10:58:10.000Z | lib/mipha/mailer.ex | ZPVIP/mipha | a7df054f72eec7de88b60d94c501488375bdff6a | ["MIT"] | 139 | 2018-07-10T01:57:23.000Z | 2021-08-02T21:29:24.000Z | lib/mipha/mailer.ex | ZPVIP/mipha | a7df054f72eec7de88b60d94c501488375bdff6a | ["MIT"] | 29 | 2018-07-17T08:43:45.000Z | 2021-12-14T13:45:30.000Z |
defmodule Mipha.Mailer do
@moduledoc false
use Bamboo.Mailer, otp_app: :mipha
end
| 14.5 | 36 | 0.758621 |
1cb8c660c7f35e8c23062a8beac4baf337c0dc70 | 340 | ex | Elixir | lib/ex_prometheus_io/supervisor.ex | kennyballou/ex_prometheus_io | da2a3bf02ee083ba1caef3673ebef0e708b9b9ee | ["Apache-2.0"] | 2 | 2016-03-16T12:47:48.000Z | 2016-03-16T19:16:04.000Z | lib/ex_prometheus_io/supervisor.ex | kennyballou/ex_prometheus_io | da2a3bf02ee083ba1caef3673ebef0e708b9b9ee | ["Apache-2.0"] | null | null | null | lib/ex_prometheus_io/supervisor.ex | kennyballou/ex_prometheus_io | da2a3bf02ee083ba1caef3673ebef0e708b9b9ee | ["Apache-2.0"] | null | null | null |
defmodule ExPrometheusIo.Supervisor do
use Supervisor
def start_link() do
Supervisor.start_link(__MODULE__, [], name: __MODULE__)
end
def init(_) do
children = [
supervisor(Task.Supervisor, [[name: ExPrometheusIo.QuerySupervisor]])
]
opts = [strategy: :one_for_one]
supervise(children, opts)
end
end
| 18.888889 | 75 | 0.691176 |
1cb8f1094041aa330707bfe11288e075a9c84856 | 77 | ex | Elixir | raspberry_phoenix/web/views/page_view.ex | raymondboswel/raspberry_phoenix | ec0594c765db4405f2ec1f876ef9f76e685fcea4 | ["Apache-2.0"] | null | null | null | raspberry_phoenix/web/views/page_view.ex | raymondboswel/raspberry_phoenix | ec0594c765db4405f2ec1f876ef9f76e685fcea4 | ["Apache-2.0"] | null | null | null | raspberry_phoenix/web/views/page_view.ex | raymondboswel/raspberry_phoenix | ec0594c765db4405f2ec1f876ef9f76e685fcea4 | ["Apache-2.0"] | null | null | null |
defmodule RaspberryPhoenix.PageView do
use RaspberryPhoenix.Web, :view
end
| 19.25 | 38 | 0.831169 |
1cb93c4a07381165ed452713b2e74d8da9303871 | 1,292 | ex | Elixir | apps/snitch_core/lib/core/data/schema/images/image.ex | Acrecio/avia | 54d264fc179b5b5f17d174854bdca063e1d935e9 | ["MIT"] | 456 | 2018-09-20T02:40:59.000Z | 2022-03-07T08:53:48.000Z | apps/snitch_core/lib/core/data/schema/images/image.ex | Acrecio/avia | 54d264fc179b5b5f17d174854bdca063e1d935e9 | ["MIT"] | 273 | 2018-09-19T06:43:43.000Z | 2021-08-07T12:58:26.000Z | apps/snitch_core/lib/core/data/schema/images/image.ex | Acrecio/avia | 54d264fc179b5b5f17d174854bdca063e1d935e9 | ["MIT"] | 122 | 2018-09-26T16:32:46.000Z | 2022-03-13T11:44:19.000Z |
defmodule Snitch.Data.Schema.Image do
@moduledoc """
Models an Image.
"""
use Snitch.Data.Schema
alias Ecto.Nanoid
@type t :: %__MODULE__{}
schema "snitch_images" do
field(:name, Nanoid)
field(:image_url, :string)
field(:image, :any, virtual: true)
field(:is_default, :boolean, default: false)
timestamps()
end
@doc """
Returns an `image` changeset.
"""
@spec changeset(t, map) :: Ecto.Changeset.t()
def changeset(%__MODULE__{} = image, params) do
image
|> cast(params, [:image, :is_default])
|> put_name_and_url()
end
def update_changeset(%__MODULE__{} = image, params) do
image
|> cast(params, [:is_default])
|> put_change(:is_default, params.is_default)
end
@doc """
Returns an `image` changeset for creating a new image.
"""
@spec create_changeset(t, map) :: Ecto.Changeset.t()
def create_changeset(%__MODULE__{} = image, params) do
image
|> cast(params, [:image, :is_default])
|> put_name_and_url()
end
def put_name_and_url(changeset) do
case changeset do
%Ecto.Changeset{
valid?: true,
changes: %{image: %{filename: name, url: url}}
} ->
changeset
|> put_change(:name, name)
|> put_change(:image_url, url)
_ ->
changeset
end
end
end
| 21.898305 | 56 | 0.611455 |
1cb94254184410606eafe26a2ad88e2aec15203d | 97 | ex | Elixir | lib/bluetooth_websocket_server_web/views/layout_view.ex | highmobility/bluetooth-websocket-server | 512a6fab0db1821ddb99426c1bececa06b0b8dc1 | ["MIT"] | 10 | 2016-06-24T16:23:56.000Z | 2021-02-16T07:56:51.000Z | lib/bluetooth_websocket_server_web/views/layout_view.ex | highmobility/bluetooth-websocket-server | 512a6fab0db1821ddb99426c1bececa06b0b8dc1 | ["MIT"] | 2 | 2017-10-15T11:28:54.000Z | 2017-10-19T11:58:40.000Z | lib/bluetooth_websocket_server_web/views/layout_view.ex | highmobility/bluetooth-websocket-server | 512a6fab0db1821ddb99426c1bececa06b0b8dc1 | ["MIT"] | 2 | 2017-10-07T10:08:16.000Z | 2020-11-01T08:44:18.000Z |
defmodule BluetoothWebsocketServerWeb.LayoutView do
use BluetoothWebsocketServerWeb, :view
end
| 24.25 | 51 | 0.876289 |
1cb9607ff5ecca0fb01c4df2d50169061d174a4d | 4,614 | ex | Elixir | apps/snitch_core/lib/core/data/schema/package_item.ex | saurabharch/avia | 74a82a95cf8bfe8143d1fce8136a3bb7ffc9467c | ["MIT"] | 1 | 2018-12-01T18:13:55.000Z | 2018-12-01T18:13:55.000Z | apps/snitch_core/lib/core/data/schema/package_item.ex | saurabharch/avia | 74a82a95cf8bfe8143d1fce8136a3bb7ffc9467c | ["MIT"] | null | null | null | apps/snitch_core/lib/core/data/schema/package_item.ex | saurabharch/avia | 74a82a95cf8bfe8143d1fce8136a3bb7ffc9467c | ["MIT"] | null | null | null |
defmodule Snitch.Data.Schema.PackageItem do
@moduledoc """
Models a PackageItem, a `Package` is composed of many `PackageItem`s.
## Fulfillment
There are two kinds of fulfillments:
1. `immediate`, there is enough stock "on hand" at the origin stock location.
- `:backordered` is set to `false`
2. `deffered`, there is not enough stock "on hand" (possibly even none),
***and*** the origin stock location allows backorders on the requested
variant/product.
- `:backordered` is set to `true`
For a detailed explanation of how backorders work, please refer the [guide on
setting up stock locations](#).
> The *origin stock location* is the location which would ship the package
containing this package-item.
### Immediate Fulfillment
If the `:backordered?` field is `false`, the package item immediately fulfills
the line item.
This also implies the following:
```
package_item.delta = 0
```
### Deferred Fulfillment
If the `:backordered?` field is `true`, the package item _will fulfill_ the
line item, in the future. The (parent) package cannot be immediately shipped.
This also implies the following:
```
package_item.delta = package_item.line_item.quantity - currently_on_hand
```
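As a concrete illustration of the two cases (the field values below are hypothetical numbers for a line item ordering 5 units, not data from any real order):
```
# origin has 5 or more units on hand -> immediate fulfillment
%PackageItem{quantity: 5, delta: 0, backordered?: false}
# origin has only 3 units on hand, backorders allowed -> deferred
%PackageItem{quantity: 3, delta: 2, backordered?: true}
```
In the deferred case, `delta` is `5 - 3 = 2` and the package ships once the
remaining 2 units arrive on hand.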
"""
use Snitch.Data.Schema
alias Ecto.Nanoid
alias Snitch.Data.Schema.{LineItem, Package, Product}
@typedoc """
Every fulfilled `LineItem` get shipped in as `PackageItem` in a `Package`.
## Fields
### `:quantity`
The number of units (of this item) that are currently "on hand" at the stock
location. The package can be shipped only when this becomes equal to the
quantity ordered.
When the item is immediately fulfilled, this is the same as the line item's
quantity.
Otherwise, this is the number of units that are currently "on hand" at the
origin stock location.
### `:delta`
The difference between the line item's ordered quantity and the number of
units currently "on hand" at the origin stock location.
### `:tax`
The tax levied over (or included in) the cost of the line item, as applicable
when the line item is sold from the `:origin` stock location.
This does not include any shipping tax components.
### `:shipping_tax`
The sum of all shipping taxes that apply for the shipping of this item from
the `origin` stock location.
"""
@type t :: %__MODULE__{}
# TODO: :backordered could be made a virtual field...
schema "snitch_package_items" do
field(:number, Nanoid, autogenerate: true)
field(:state, :string)
field(:quantity, :integer, default: 0)
# The field should be tracked in some other way
field(:delta, :integer, default: 0)
field(:backordered?, :boolean)
field(:tax, Money.Ecto.Composite.Type)
field(:shipping_tax, Money.Ecto.Composite.Type)
belongs_to(:product, Product)
belongs_to(:line_item, LineItem)
belongs_to(:package, Package)
has_one(:order, through: [:package, :order])
timestamps()
end
@create_fields ~w(state delta quantity line_item_id product_id package_id tax shipping_tax)a
@required_fields ~w(state quantity line_item_id product_id tax)a
@update_fields ~w(state quantity delta tax shipping_tax)a
@doc """
Returns a `PackageItem` changeset to create a new `package_item`.
"""
@spec create_changeset(t, map) :: Ecto.Changeset.t()
def create_changeset(%__MODULE__{} = package_item, params) do
package_item
|> cast(params, @create_fields)
|> validate_required(@required_fields)
|> foreign_key_constraint(:line_item_id)
|> foreign_key_constraint(:product_id)
|> foreign_key_constraint(:package_id)
|> unique_constraint(:number)
|> common_changeset()
end
@doc """
Returns a `PackageItem` changeset to update the `package_item`.
"""
@spec update_changeset(t, map) :: Ecto.Changeset.t()
def update_changeset(%__MODULE__{} = package_item, params) do
package_item
|> cast(params, @update_fields)
|> common_changeset()
end
defp common_changeset(package_item_changeset) do
package_item_changeset
|> validate_number(:quantity, greater_than: -1)
|> validate_number(:delta, greater_than: -1)
|> validate_amount(:tax)
|> validate_amount(:shipping_tax)
|> set_backordered()
end
defp set_backordered(%Ecto.Changeset{valid?: true} = changeset) do
case fetch_field(changeset, :delta) do
{_, delta} when delta == 0 ->
put_change(changeset, :backordered?, false)
{_, delta} when delta > 0 ->
put_change(changeset, :backordered?, true)
_ ->
changeset
end
end
defp set_backordered(%Ecto.Changeset{} = cs), do: cs
end
| 31.60274 | 94 | 0.701344 |
1cb96d0e3ea505b32d1c380010e092762fffba97 | 2,907 | exs | Elixir | test/json_schema_test_suite/draft7/optional/format/uri_test.exs | hrzndhrn/json_xema | 955eab7b0919d144b38364164d90275201c89474 | ["MIT"] | 54 | 2019-03-10T19:51:07.000Z | 2021-12-23T07:31:09.000Z | test/json_schema_test_suite/draft7/optional/format/uri_test.exs | hrzndhrn/json_xema | 955eab7b0919d144b38364164d90275201c89474 | ["MIT"] | 36 | 2018-05-20T09:13:20.000Z | 2021-03-14T15:22:03.000Z | test/json_schema_test_suite/draft7/optional/format/uri_test.exs | hrzndhrn/json_xema | 955eab7b0919d144b38364164d90275201c89474 | ["MIT"] | 3 | 2019-04-12T09:08:51.000Z | 2019-12-04T01:23:56.000Z |
defmodule JsonSchemaTestSuite.Draft7.Optional.Format.UriTest do
use ExUnit.Case
import JsonXema, only: [valid?: 2]
describe ~s|validation of URIs| do
setup do
%{schema: JsonXema.new(%{"format" => "uri"})}
end
test ~s|a valid URL with anchor tag|, %{schema: schema} do
assert valid?(schema, "http://foo.bar/?baz=qux#quux")
end
test ~s|a valid URL with anchor tag and parentheses|, %{schema: schema} do
assert valid?(schema, "http://foo.com/blah_(wikipedia)_blah#cite-1")
end
test ~s|a valid URL with URL-encoded stuff|, %{schema: schema} do
assert valid?(schema, "http://foo.bar/?q=Test%20URL-encoded%20stuff")
end
test ~s|a valid puny-coded URL |, %{schema: schema} do
assert valid?(schema, "http://xn--nw2a.xn--j6w193g/")
end
test ~s|a valid URL with many special characters|, %{schema: schema} do
assert valid?(schema, "http://-.~_!$&'()*+,;=:%40:80%2f::::::@example.com")
end
test ~s|a valid URL based on IPv4|, %{schema: schema} do
assert valid?(schema, "http://223.255.255.254")
end
test ~s|a valid URL with ftp scheme|, %{schema: schema} do
assert valid?(schema, "ftp://ftp.is.co.za/rfc/rfc1808.txt")
end
test ~s|a valid URL for a simple text file|, %{schema: schema} do
assert valid?(schema, "http://www.ietf.org/rfc/rfc2396.txt")
end
test ~s|a valid URL |, %{schema: schema} do
assert valid?(schema, "ldap://[2001:db8::7]/c=GB?objectClass?one")
end
test ~s|a valid mailto URI|, %{schema: schema} do
assert valid?(schema, "mailto:John.Doe@example.com")
end
test ~s|a valid newsgroup URI|, %{schema: schema} do
assert valid?(schema, "news:comp.infosystems.www.servers.unix")
end
test ~s|a valid tel URI|, %{schema: schema} do
assert valid?(schema, "tel:+1-816-555-1212")
end
test ~s|a valid URN|, %{schema: schema} do
assert valid?(schema, "urn:oasis:names:specification:docbook:dtd:xml:4.1.2")
end
test ~s|an invalid protocol-relative URI Reference|, %{schema: schema} do
refute valid?(schema, "//foo.bar/?baz=qux#quux")
end
test ~s|an invalid relative URI Reference|, %{schema: schema} do
refute valid?(schema, "/abc")
end
test ~s|an invalid URI|, %{schema: schema} do
refute valid?(schema, "\\\\WINDOWS\\fileshare")
end
test ~s|an invalid URI though valid URI reference|, %{schema: schema} do
refute valid?(schema, "abc")
end
test ~s|an invalid URI with spaces|, %{schema: schema} do
refute valid?(schema, "http:// shouldfail.com")
end
test ~s|an invalid URI with spaces and missing scheme|, %{schema: schema} do
refute valid?(schema, ":// should fail")
end
test ~s|an invalid URI with comma in scheme|, %{schema: schema} do
refute valid?(schema, "bar,baz:foo")
end
end
end
| 31.597826 | 82 | 0.628139 |
1cb96fa421164563410f512da37bf237a135bdb0 | 1,331 | exs | Elixir | apps/reaper/test/unit/reaper/persistence_test.exs | smartcitiesdata/smartcitiesdata | c926c25003a8ee2d09b933c521c49f674841c0b6 | ["Apache-2.0"] | 26 | 2019-09-20T23:54:45.000Z | 2020-08-20T14:23:32.000Z | apps/reaper/test/unit/reaper/persistence_test.exs | smartcitiesdata/smartcitiesdata | c926c25003a8ee2d09b933c521c49f674841c0b6 | ["Apache-2.0"] | 757 | 2019-08-15T18:15:07.000Z | 2020-09-18T20:55:31.000Z | apps/reaper/test/unit/reaper/persistence_test.exs | smartcitiesdata/smartcitiesdata | c926c25003a8ee2d09b933c521c49f674841c0b6 | ["Apache-2.0"] | 9 | 2019-11-12T16:43:46.000Z | 2020-03-25T16:23:16.000Z |
defmodule Reaper.RecorderTest do
use ExUnit.Case
use Placebo
alias Reaper.Persistence
@ingestion_id "whatever"
@timestamp "does not matter"
@redix Reaper.Application.redis_client()
test "persists to redis using appropriate prefix" do
expect Redix.command(@redix, ["SET", "reaper:derived:#{@ingestion_id}", any()]), return: nil
Persistence.record_last_fetched_timestamp(@ingestion_id, @timestamp)
end
test "retrieves last_processed_index from redis" do
expect Redix.command!(@redix, ["GET", "reaper:#{@ingestion_id}:last_processed_index"]), return: "1"
Persistence.get_last_processed_index(@ingestion_id)
end
test "persists last_processed_index to redis" do
expect Redix.command!(@redix, ["SET", "reaper:#{@ingestion_id}:last_processed_index", 1]), return: "OK"
Persistence.record_last_processed_index(@ingestion_id, 1)
end
test "deletes last_processed_index from redis" do
expect Redix.command!(@redix, ["DEL", "reaper:#{@ingestion_id}:last_processed_index"]), return: 1
Persistence.remove_last_processed_index(@ingestion_id)
end
test "persists to redis json with timestamp" do
expect Redix.command(@redix, ["SET", any(), "{\"timestamp\": \"#{@timestamp}\"}"]), return: nil
Persistence.record_last_fetched_timestamp(@ingestion_id, @timestamp)
end
end
| 35.972973 | 107 | 0.736289 |
1cb970045eb1f4b7961364b589302ffad841a5fc | 387 | exs | Elixir | elixir/elixir-sips/samples/boltun_playground/mix.exs | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | ["MIT"] | 2 | 2015-12-09T02:16:51.000Z | 2021-07-26T22:53:43.000Z | elixir/elixir-sips/samples/boltun_playground/mix.exs | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | ["MIT"] | null | null | null | elixir/elixir-sips/samples/boltun_playground/mix.exs | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | ["MIT"] | 1 | 2016-05-08T18:40:31.000Z | 2016-05-08T18:40:31.000Z |
defmodule BoltunPlayground.Mixfile do
use Mix.Project
def project do
[app: :boltun_playground,
version: "0.0.1",
elixir: "~> 1.0",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps]
end
def application do
[applications: [:logger, :boltun]]
end
defp deps do
[
{:boltun, "~> 0.0.4"}
]
end
end
| 16.826087 | 39 | 0.581395 |
1cb9b7c18be1b22a50a271f01e9743decb6a9383 | 1,715 | ex | Elixir | apps/customer/lib/customer/es/index.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | ["MIT"] | 102 | 2017-05-21T18:24:04.000Z | 2022-03-10T12:53:20.000Z | apps/customer/lib/customer/es/index.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | ["MIT"] | 2 | 2017-05-21T01:53:30.000Z | 2017-12-01T00:27:06.000Z | apps/customer/lib/customer/es/index.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | ["MIT"] | 18 | 2017-05-22T09:51:36.000Z | 2021-09-24T00:57:01.000Z |
defmodule Customer.Es.Index do
import Tirexs.Manage.Aliases
alias Tirexs.HTTP
alias Tirexs.Resources
alias Tirexs.Resources.Indices
alias Customer.Blank
alias Customer.Es
def name_type(model) do
model
|> to_string
|> String.downcase
end
def name_index(model) do
"es_#{name_type(model)}"
end
def name_reindex(index) do
"#{index}_#{time_suffix}"
end
def reindex(model), do: reindex(model, nil)
def reindex(model, data) do
index = name_index(model)
new_index = name_reindex(index)
case get_aliases(index) do
nil ->
upsert_index(model, index, data, new_index)
old_index ->
upsert_index(model, index, data, new_index, old_index)
HTTP.delete("#{old_index}")
end
:ok
end
defp time_suffix do
Timex.now |> Timex.format!("%Y%m%d%H%M%S%f", :strftime)
end
defp upsert_index(model, index, data, new_index, old_index \\ nil) do
model.es_create_index(new_index)
unless Blank.blank?(data) do
case Es.Document.put_document(data, new_index) do
:ok -> upsert_aliases(aliase_query(index, new_index, old_index))
error -> IO.inspect error
end
end
end
defp upsert_aliases(alias_query) do
Es.Logger.ppdebug alias_query
Resources.bump(alias_query)._aliases
end
defp aliase_query(index, new_index, old_index) do
aliases do
add index: new_index, alias: index
if old_index, do: remove index: old_index, alias: index
end
end
defp get_aliases(index) do
case HTTP.get(Indices._aliases(index)) do
{:ok, _, map} ->
map
|> Map.keys
|> List.first
|> to_string
_ -> nil
end
end
end
| 21.708861 | 73 | 0.650729 |
1cb9b863923ad1f0edba593c2fb1f5559b7db1ac | 228 | exs | Elixir | priv/repo/migrations/20160213223945_add_user_to_projects.exs | tuvistavie/projare | e776b2d326fed97e0dbf62530674fe688ff73ab8 | ["MIT"] | 3 | 2016-03-06T12:23:01.000Z | 2017-03-21T18:22:07.000Z | priv/repo/migrations/20160213223945_add_user_to_projects.exs | tuvistavie/projare | e776b2d326fed97e0dbf62530674fe688ff73ab8 | ["MIT"] | null | null | null | priv/repo/migrations/20160213223945_add_user_to_projects.exs | tuvistavie/projare | e776b2d326fed97e0dbf62530674fe688ff73ab8 | ["MIT"] | null | null | null |
defmodule Projare.Repo.Migrations.AddUserToProjects do
use Ecto.Migration
def change do
alter table(:projects) do
add :author_id, :integer, null: false
end
create index(:projects, [:author_id])
end
end
| 19 | 54 | 0.70614 |
1cb9cf6cc8769ecfb734df9ef998ef1e6e9afb68 | 950 | exs | Elixir | config/config.exs | enoliglesias/blogix | cd679490f5198c5c8c9e42330a8bf96bffa59e40 | ["MIT"] | 1 | 2016-05-18T10:19:14.000Z | 2016-05-18T10:19:14.000Z | config/config.exs | enoliglesias/blogix | cd679490f5198c5c8c9e42330a8bf96bffa59e40 | ["MIT"] | null | null | null | config/config.exs | enoliglesias/blogix | cd679490f5198c5c8c9e42330a8bf96bffa59e40 | ["MIT"] | null | null | null |
# This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
#
# This configuration file is loaded before any dependency and
# is restricted to this project.
use Mix.Config
# Configures the endpoint
config :blogix, Blogix.Endpoint,
url: [host: "localhost"],
root: Path.dirname(__DIR__),
secret_key_base: "JHSOAf1fgpzY4LUjaH9S2i1rUAb3DsqNWtAWwSOHn75b24Kv0tE8CRw2kg6Bl0Dr",
render_errors: [accepts: ~w(html json)],
pubsub: [name: Blogix.PubSub,
adapter: Phoenix.PubSub.PG2]
# Configures Elixir's Logger
config :logger, :console,
format: "$time $metadata[$level] $message\n",
metadata: [:request_id]
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env}.exs"
# Configure phoenix generators
config :phoenix, :generators,
migration: true,
binary_id: false
| 31.666667 | 86 | 0.757895 |
1cba23a5f5d72730fe904a3f9ab69d02957d99cb | 1,835 | exs | Elixir | clients/recommendation_engine/mix.exs | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | ["Apache-2.0"] | null | null | null | clients/recommendation_engine/mix.exs | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | ["Apache-2.0"] | null | null | null | clients/recommendation_engine/mix.exs | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | ["Apache-2.0"] | null | null | null |
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.RecommendationEngine.Mixfile do
use Mix.Project
@version "0.4.4"
def project() do
[
app: :google_api_recommendation_engine,
version: @version,
elixir: "~> 1.6",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
description: description(),
package: package(),
deps: deps(),
source_url: "https://github.com/googleapis/elixir-google-api/tree/master/clients/recommendation_engine"
]
end
def application() do
[extra_applications: [:logger]]
end
defp deps() do
[
{:google_gax, "~> 0.4"},
{:ex_doc, "~> 0.16", only: :dev}
]
end
defp description() do
"""
Recommendations AI (Beta) client library.
"""
end
defp package() do
[
files: ["lib", "mix.exs", "README*", "LICENSE"],
maintainers: ["Jeff Ching", "Daniel Azuma"],
licenses: ["Apache 2.0"],
links: %{
"GitHub" => "https://github.com/googleapis/elixir-google-api/tree/master/clients/recommendation_engine",
"Homepage" => "https://cloud.google.com/recommendations-ai/docs"
}
]
end
end
| 27.38806 | 112 | 0.659401 |
1cba29242f8f5a2f1a09d53e801de1a599966ac5 | 1,824 | ex | Elixir | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/list_devices_response.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | ["Apache-2.0"] | null | null | null | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/list_devices_response.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | ["Apache-2.0"] | null | null | null | clients/cloud_iot/lib/google_api/cloud_iot/v1/model/list_devices_response.ex | leandrocp/elixir-google-api | a86e46907f396d40aeff8668c3bd81662f44c71e | ["Apache-2.0"] | null | null | null |
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.CloudIot.V1.Model.ListDevicesResponse do
  @moduledoc """
  Response for `ListDevices`.

  ## Attributes

  - devices ([Device]): The devices that match the request. Defaults to: `null`.
  - nextPageToken (String.t): If not empty, indicates that there may be more devices that match the request; this value should be passed in a new `ListDevicesRequest`. Defaults to: `null`.
  """

  use GoogleApi.Gax.ModelBase

  @type t :: %__MODULE__{
          :devices => list(GoogleApi.CloudIot.V1.Model.Device.t()),
          :nextPageToken => any()
        }

  field(:devices, as: GoogleApi.CloudIot.V1.Model.Device, type: :list)
  field(:nextPageToken)
end

defimpl Poison.Decoder, for: GoogleApi.CloudIot.V1.Model.ListDevicesResponse do
  def decode(value, options) do
    GoogleApi.CloudIot.V1.Model.ListDevicesResponse.decode(value, options)
  end
end

defimpl Poison.Encoder, for: GoogleApi.CloudIot.V1.Model.ListDevicesResponse do
  def encode(value, options) do
    GoogleApi.Gax.ModelBase.encode(value, options)
  end
end
| 35.764706 | 198 | 0.74068 |
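The `nextPageToken` field above implies a simple pagination contract: request again with the returned token until it comes back empty. A dependency-free sketch of that loop — `fetch_page` is a hypothetical stand-in for the actual API call, not part of the generated client:

```elixir
# Hypothetical stand-in for the API call; returns one page plus a token.
fetch_page = fn
  nil -> %{devices: ["device-a", "device-b"], nextPageToken: "t1"}
  "t1" -> %{devices: ["device-c"], nextPageToken: nil}
end

# Accumulate pages until nextPageToken comes back empty.
collect = fn collect, token, acc ->
  %{devices: devices, nextPageToken: next} = fetch_page.(token)
  acc = acc ++ devices

  if next, do: collect.(collect, next, acc), else: acc
end

IO.inspect(collect.(collect, nil, []))
# ["device-a", "device-b", "device-c"]
```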
1cba29c408ba6b323492597e2d21159094eb7ad4 | 8,086 | ex | Elixir | clients/deployment_manager/lib/google_api/deployment_manager/v2/api/manifests.ex | linjunpop/elixir-google-api | 444cb2b2fb02726894535461a474beddd8b86db4 | [
"Apache-2.0"
] | null | null | null | clients/deployment_manager/lib/google_api/deployment_manager/v2/api/manifests.ex | linjunpop/elixir-google-api | 444cb2b2fb02726894535461a474beddd8b86db4 | [
"Apache-2.0"
] | null | null | null | clients/deployment_manager/lib/google_api/deployment_manager/v2/api/manifests.ex | linjunpop/elixir-google-api | 444cb2b2fb02726894535461a474beddd8b86db4 | [
"Apache-2.0"
] | null | null | null | # Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.DeploymentManager.V2.Api.Manifests do
@moduledoc """
API calls for all endpoints tagged `Manifests`.
"""
alias GoogleApi.DeploymentManager.V2.Connection
alias GoogleApi.Gax.{Request, Response}
@doc """
Gets information about a specific manifest.
## Parameters
- connection (GoogleApi.DeploymentManager.V2.Connection): Connection to server
- project (String.t): The project ID for this request.
- deployment (String.t): The name of the deployment for this request.
- manifest (String.t): The name of the manifest for this request.
- optional_params (KeywordList): [optional] Optional parameters
- :alt (String.t): Data format for the response.
- :fields (String.t): Selector specifying which fields to include in a partial response.
- :key (String.t): API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
- :oauth_token (String.t): OAuth 2.0 token for the current user.
- :prettyPrint (boolean()): Returns response with indentations and line breaks.
- :quotaUser (String.t): An opaque string that represents a user for quota purposes. Must not exceed 40 characters.
- :userIp (String.t): Deprecated. Please use quotaUser instead.
## Returns
{:ok, %GoogleApi.DeploymentManager.V2.Model.Manifest{}} on success
{:error, info} on failure
"""
@spec deploymentmanager_manifests_get(
Tesla.Env.client(),
String.t(),
String.t(),
String.t(),
keyword()
) :: {:ok, GoogleApi.DeploymentManager.V2.Model.Manifest.t()} | {:error, Tesla.Env.t()}
def deploymentmanager_manifests_get(
connection,
project,
deployment,
manifest,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:alt => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:userIp => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/{project}/global/deployments/{deployment}/manifests/{manifest}", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"deployment" => URI.encode(deployment, &URI.char_unreserved?/1),
"manifest" => URI.encode(manifest, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
connection
|> Connection.execute(request)
|> Response.decode(opts ++ [struct: %GoogleApi.DeploymentManager.V2.Model.Manifest{}])
end
@doc """
Lists all manifests for a given deployment.
## Parameters
- connection (GoogleApi.DeploymentManager.V2.Connection): Connection to server
- project (String.t): The project ID for this request.
- deployment (String.t): The name of the deployment for this request.
- optional_params (KeywordList): [optional] Optional parameters
- :alt (String.t): Data format for the response.
- :fields (String.t): Selector specifying which fields to include in a partial response.
- :key (String.t): API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
- :oauth_token (String.t): OAuth 2.0 token for the current user.
- :prettyPrint (boolean()): Returns response with indentations and line breaks.
- :quotaUser (String.t): An opaque string that represents a user for quota purposes. Must not exceed 40 characters.
- :userIp (String.t): Deprecated. Please use quotaUser instead.
- :filter (String.t): A filter expression that filters resources listed in the response. The expression must specify the field name, a comparison operator, and the value that you want to use for filtering. The value must be a string, a number, or a boolean. The comparison operator must be either =, !=, >, or <. For example, if you are filtering Compute Engine instances, you can exclude instances named example-instance by specifying name != example-instance. You can also filter nested fields. For example, you could specify scheduling.automaticRestart = false to include instances only if they are not scheduled for automatic restarts. You can use filtering on nested fields to filter based on resource labels. To filter on multiple expressions, provide each separate expression within parentheses. For example, (scheduling.automaticRestart = true) (cpuPlatform = \"Intel Skylake\"). By default, each expression is an AND expression. However, you can include AND and OR expressions explicitly. For example, (cpuPlatform = \"Intel Skylake\") OR (cpuPlatform = \"Intel Broadwell\") AND (scheduling.automaticRestart = true).
- :maxResults (integer()): The maximum number of results per page that should be returned. If the number of available results is larger than maxResults, Compute Engine returns a nextPageToken that can be used to get the next page of results in subsequent list requests. Acceptable values are 0 to 500, inclusive. (Default: 500)
- :orderBy (String.t): Sorts list results by a certain order. By default, results are returned in alphanumerical order based on the resource name. You can also sort results in descending order based on the creation timestamp using orderBy=\"creationTimestamp desc\". This sorts results based on the creationTimestamp field in reverse chronological order (newest result first). Use this to sort resources like operations so that the newest operation is returned first. Currently, only sorting by name or creationTimestamp desc is supported.
- :pageToken (String.t): Specifies a page token to use. Set pageToken to the nextPageToken returned by a previous list request to get the next page of results.
## Returns
{:ok, %GoogleApi.DeploymentManager.V2.Model.ManifestsListResponse{}} on success
{:error, info} on failure
"""
@spec deploymentmanager_manifests_list(Tesla.Env.client(), String.t(), String.t(), keyword()) ::
{:ok, GoogleApi.DeploymentManager.V2.Model.ManifestsListResponse.t()}
| {:error, Tesla.Env.t()}
def deploymentmanager_manifests_list(
connection,
project,
deployment,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:alt => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:userIp => :query,
:filter => :query,
:maxResults => :query,
:orderBy => :query,
:pageToken => :query
}
request =
Request.new()
|> Request.method(:get)
|> Request.url("/{project}/global/deployments/{deployment}/manifests", %{
"project" => URI.encode(project, &URI.char_unreserved?/1),
"deployment" => URI.encode(deployment, &URI.char_unreserved?/1)
})
|> Request.add_optional_params(optional_params_config, optional_params)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.DeploymentManager.V2.Model.ManifestsListResponse{}]
)
end
end
| 51.833333 | 1,213 | 0.703067 |
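Both endpoints above interpolate path parameters through `URI.encode/2` with `&URI.char_unreserved?/1`, which percent-encodes everything outside the RFC 3986 unreserved set. A quick standalone illustration with a made-up project ID:

```elixir
# Spaces and slashes are outside the unreserved set, so both get escaped.
project = "my project/1"
encoded = URI.encode(project, &URI.char_unreserved?/1)

IO.puts(encoded)
# my%20project%2F1
```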
1cba344e24a0228103a40f992a385e8e79520026 | 677 | exs | Elixir | .iex.exs | lee-dohm/staff-notes | 07186e8407f1955876fa2dee2dbbfd0bbac91333 | [
"MIT"
] | 1 | 2020-01-26T18:08:40.000Z | 2020-01-26T18:08:40.000Z | .iex.exs | lee-dohm/staff-notes | 07186e8407f1955876fa2dee2dbbfd0bbac91333 | [
"MIT"
] | 36 | 2017-12-23T20:22:07.000Z | 2018-05-10T09:16:59.000Z | .iex.exs | lee-dohm/staff-notes | 07186e8407f1955876fa2dee2dbbfd0bbac91333 | [
"MIT"
] | null | null | null | import_file_if_available "~/.iex.exs"
use Phoenix.HTML
import Ecto.Query
import Phoenix.HTML.Safe, only: [to_iodata: 1]
alias StaffNotes.Accounts
alias StaffNotes.Accounts.Organization
alias StaffNotes.Accounts.Team
alias StaffNotes.Accounts.User
alias StaffNotes.Ecto.Markdown
alias StaffNotes.Notes.Member
alias StaffNotes.Notes.Note
alias StaffNotes.Repo
alias StaffNotesWeb.AuthController
alias StaffNotesWeb.PageController
alias StaffNotesWeb.UserController
alias StaffNotesWeb.AvatarHelpers
alias StaffNotesWeb.ErrorView
alias StaffNotesWeb.LayoutView
alias StaffNotesWeb.PageView
alias StaffNotesWeb.Primer
alias StaffNotesWeb.UserView
import StaffNotes.IEx.Helpers
| 25.074074 | 46 | 0.865583 |
1cba4ac88b7c4b5e21e682a9ddcd4e0d9b9c32a9 | 3,092 | ex | Elixir | apps/omg_eth/test/support/dev_geth.ex | Pongch/elixir-omg | 8a33c246898b49cba62b847e0989d9b6c89f5106 | [
"Apache-2.0"
] | null | null | null | apps/omg_eth/test/support/dev_geth.ex | Pongch/elixir-omg | 8a33c246898b49cba62b847e0989d9b6c89f5106 | [
"Apache-2.0"
] | null | null | null | apps/omg_eth/test/support/dev_geth.ex | Pongch/elixir-omg | 8a33c246898b49cba62b847e0989d9b6c89f5106 | [
"Apache-2.0"
] | null | null | null | # Copyright 2018 OmiseGO Pte Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule OMG.Eth.DevGeth do
@moduledoc """
Helper module for deployment of contracts to dev geth.
"""
@doc """
Run geth in temp dir, kill it with SIGKILL when done.
"""
require Logger
alias OMG.Eth
def start do
{:ok, _} = Application.ensure_all_started(:briefly)
{:ok, _} = Application.ensure_all_started(:erlexec)
{:ok, _} = Application.ensure_all_started(:ethereumex)
{:ok, homedir} = Briefly.create(directory: true)
geth_pid = launch("geth --dev --dev.period=1 --rpc --rpcapi=personal,eth,web3 --datadir #{homedir} 2>&1")
{:ok, :ready} = Eth.WaitFor.eth_rpc()
on_exit = fn -> stop(geth_pid) end
{:ok, on_exit}
end
defp stop(pid) do
# NOTE: monitor is required to stop_and_wait, don't know why? `monitor: true` on run doesn't work
_ = Process.monitor(pid)
{:exit_status, 35_072} = Exexec.stop_and_wait(pid)
:ok
end
# PRIVATE
defp log_geth_output(line) do
_ = Logger.debug(fn -> "geth: " <> line end)
line
end
defp launch(cmd) do
_ = Logger.debug(fn -> "Starting geth" end)
{:ok, geth_proc, _ref, [{:stream, geth_out, _stream_server}]} =
Exexec.run(cmd, stdout: :stream, kill_command: "pkill -9 geth")
wait_for_geth_start(geth_out)
_ =
if Application.get_env(:omg_eth, :geth_logging_in_debug) do
%Task{} =
fn ->
geth_out |> Enum.each(&log_geth_output/1)
end
|> Task.async()
end
geth_proc
end
def wait_for_start(outstream, look_for, timeout) do
# Monitors the stdout coming out of a process for signal of successful startup
waiting_task_function = fn ->
outstream
|> Stream.map(&log_geth_output/1)
|> Stream.take_while(fn line -> not String.contains?(line, look_for) end)
|> Enum.to_list()
end
waiting_task_function
|> Task.async()
|> Task.await(timeout)
:ok
end
defp wait_for_geth_start(geth_out) do
wait_for_start(geth_out, "IPC endpoint opened", 15_000)
end
def maybe_mine(false), do: :noop
def maybe_mine(true), do: mine_eth_dev_block()
def mine_eth_dev_block do
{:ok, [addr | _]} = Ethereumex.HttpClient.eth_accounts()
txmap = %{from: addr, to: addr, value: "0x1"}
{:ok, txhash} = Ethereumex.HttpClient.eth_send_transaction(txmap)
# Dev geth is mining every second, that's why we need to wait longer than 1 s for receipt
{:ok, _receipt} = txhash |> Eth.Encoding.from_hex() |> Eth.WaitFor.eth_receipt(2_000)
end
end
| 28.897196 | 109 | 0.670763 |
1cba84818b6dc1170b083b00da955a4a98861866 | 1,959 | ex | Elixir | lib/absinthe/phase/schema/validation/names_must_be_valid.ex | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 2 | 2021-04-22T23:45:04.000Z | 2021-05-07T01:01:15.000Z | lib/absinthe/phase/schema/validation/names_must_be_valid.ex | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 2 | 2019-03-07T00:26:03.000Z | 2019-08-19T17:30:30.000Z | lib/absinthe/phase/schema/validation/names_must_be_valid.ex | zoldar/absinthe | 72ff9f91fcc0a261f9965cf8120c7c72ff6e4c7c | [
"MIT"
] | 1 | 2019-01-18T20:49:03.000Z | 2019-01-18T20:49:03.000Z | defmodule Absinthe.Phase.Schema.Validation.NamesMustBeValid do
@moduledoc false
use Absinthe.Phase
alias Absinthe.Blueprint
alias Absinthe.Blueprint.Schema
@valid_name_regex ~r/[_A-Za-z][_0-9A-Za-z]*/
def run(bp, _) do
bp = Blueprint.prewalk(bp, &validate_names/1)
{:ok, bp}
end
defp validate_names(%{name: nil} = entity) do
entity
end
defp validate_names(%struct{name: name} = entity) do
if valid_name?(name) do
entity
else
kind = struct_to_kind(struct)
detail = %{artifact: "#{kind} name", value: entity.name}
entity |> put_error(error(entity, detail))
end
end
defp validate_names(entity) do
entity
end
  defp valid_name?(name) do
    case Regex.run(@valid_name_regex, name) do
      [match] -> match == name
      nil -> false
    end
  end
defp error(object, data) do
%Absinthe.Phase.Error{
message: explanation(data),
locations: [object.__reference__.location],
phase: __MODULE__,
extra: data
}
end
defp struct_to_kind(Schema.InputValueDefinition), do: "argument"
defp struct_to_kind(Schema.FieldDefinition), do: "field"
defp struct_to_kind(Schema.DirectiveDefinition), do: "directive"
defp struct_to_kind(Schema.ScalarTypeDefinition), do: "scalar"
defp struct_to_kind(Schema.ObjectTypeDefinition), do: "object"
defp struct_to_kind(Schema.InputObjectTypeDefinition), do: "input object"
defp struct_to_kind(_), do: "type"
@description """
Name does not match possible #{inspect(@valid_name_regex)} regex.
> Names in GraphQL are limited to this ASCII subset of possible characters to
> support interoperation with as many other systems as possible.
Reference: https://graphql.github.io/graphql-spec/June2018/#sec-Names
"""
def explanation(%{artifact: artifact, value: value}) do
artifact_name = String.capitalize(artifact)
"""
#{artifact_name} #{inspect(value)} has invalid characters.
#{@description}
"""
end
end
| 26.12 | 79 | 0.700357 |
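The validator's core check — a name is valid only when the regex match covers the whole string — can be reproduced standalone. The `nil` branch below is a defensive addition for names with no match at all (e.g. `"123"`), which a bare `[match] = Regex.run(...)` pattern match would crash on:

```elixir
valid_name_regex = ~r/[_A-Za-z][_0-9A-Za-z]*/

valid? = fn name ->
  case Regex.run(valid_name_regex, name) do
    [match] -> match == name
    nil -> false
  end
end

IO.inspect(valid?.("userName"))  # true
IO.inspect(valid?.("user-name")) # false: "-" is outside the allowed set
IO.inspect(valid?.("9lives"))    # false: must not start with a digit
IO.inspect(valid?.("123"))       # false: no match at all
```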
1cba9b16e83a413fe9dd966ef944452c182b137b | 581 | ex | Elixir | lib/ingram_marketplace/model/validation_parameters_result.ex | fbettag/ingram_marketplace.ex | 1c63d391707058fb8cf58fdefd54e2ade97acf4b | [
"MIT"
] | null | null | null | lib/ingram_marketplace/model/validation_parameters_result.ex | fbettag/ingram_marketplace.ex | 1c63d391707058fb8cf58fdefd54e2ade97acf4b | [
"MIT"
] | null | null | null | lib/ingram_marketplace/model/validation_parameters_result.ex | fbettag/ingram_marketplace.ex | 1c63d391707058fb8cf58fdefd54e2ade97acf4b | [
"MIT"
] | null | null | null | defmodule Ingram.Marketplace.Model.ValidationParametersResult do
@moduledoc """
The result of validation of the product-specific activation parameters.
"""
@derive [Poison.Encoder]
defstruct [
:data
]
@type t :: %__MODULE__{
:data => [ValidationParameterItem] | nil
}
end
defimpl Poison.Decoder, for: Ingram.Marketplace.Model.ValidationParametersResult do
import Ingram.Marketplace.Deserializer
def decode(value, options) do
value
|> deserialize(:data, :list, Ingram.Marketplace.Model.ValidationParameterItem, options)
end
end
| 24.208333 | 91 | 0.726334 |
1cbaa54345c185d022567b9f28d5e1952bfeb259 | 517 | ex | Elixir | apps/scraper/lib/scraper/cron.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | [
"MIT"
] | 102 | 2017-05-21T18:24:04.000Z | 2022-03-10T12:53:20.000Z | apps/scraper/lib/scraper/cron.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | [
"MIT"
] | 2 | 2017-05-21T01:53:30.000Z | 2017-12-01T00:27:06.000Z | apps/scraper/lib/scraper/cron.ex | JaiMali/job_search-1 | 5fe1afcd80aa5d55b92befed2780cd6721837c88 | [
"MIT"
] | 18 | 2017-05-22T09:51:36.000Z | 2021-09-24T00:57:01.000Z | defmodule Scraper.Cron do
use GenServer
require Logger
# NOTE: everyweek
@period 7 * 24 * 60 * 60 * 1000
def start_link do
GenServer.start_link(__MODULE__, %{})
end
def init(state) do
set_schedule_work()
{:ok, state}
end
def handle_info(:work, state) do
Logger.info("scraper executes #{DateTime.utc_now()}")
Scraper.Caller.perform()
set_schedule_work()
{:noreply, state}
end
defp set_schedule_work() do
Process.send_after(self(), :work, @period)
end
end
| 17.233333 | 57 | 0.661509 |
1cbaad8a2c9cbfc0b9e7b2cb3d6175210c03db12 | 2,490 | ex | Elixir | clients/domains/lib/google_api/domains/v1alpha2/model/configure_contact_settings_request.ex | kyleVsteger/elixir-google-api | 3a0dd498af066a4361b5b0fd66ffc04a57539488 | [
"Apache-2.0"
] | 1 | 2021-10-01T09:20:41.000Z | 2021-10-01T09:20:41.000Z | clients/domains/lib/google_api/domains/v1alpha2/model/configure_contact_settings_request.ex | kyleVsteger/elixir-google-api | 3a0dd498af066a4361b5b0fd66ffc04a57539488 | [
"Apache-2.0"
] | null | null | null | clients/domains/lib/google_api/domains/v1alpha2/model/configure_contact_settings_request.ex | kyleVsteger/elixir-google-api | 3a0dd498af066a4361b5b0fd66ffc04a57539488 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Domains.V1alpha2.Model.ConfigureContactSettingsRequest do
@moduledoc """
Request for the `ConfigureContactSettings` method.
## Attributes
* `contactNotices` (*type:* `list(String.t)`, *default:* `nil`) - The list of contact notices that the caller acknowledges. The notices needed here depend on the values specified in `contact_settings`.
* `contactSettings` (*type:* `GoogleApi.Domains.V1alpha2.Model.ContactSettings.t`, *default:* `nil`) - Fields of the `ContactSettings` to update.
* `updateMask` (*type:* `String.t`, *default:* `nil`) - Required. The field mask describing which fields to update as a comma-separated list. For example, if only the registrant contact is being updated, the `update_mask` would be `"registrant_contact"`.
* `validateOnly` (*type:* `boolean()`, *default:* `nil`) - Validate the request without actually updating the contact settings.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:contactNotices => list(String.t()) | nil,
:contactSettings => GoogleApi.Domains.V1alpha2.Model.ContactSettings.t() | nil,
:updateMask => String.t() | nil,
:validateOnly => boolean() | nil
}
field(:contactNotices, type: :list)
field(:contactSettings, as: GoogleApi.Domains.V1alpha2.Model.ContactSettings)
field(:updateMask)
field(:validateOnly)
end
defimpl Poison.Decoder, for: GoogleApi.Domains.V1alpha2.Model.ConfigureContactSettingsRequest do
def decode(value, options) do
GoogleApi.Domains.V1alpha2.Model.ConfigureContactSettingsRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Domains.V1alpha2.Model.ConfigureContactSettingsRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 44.464286 | 258 | 0.738956 |
1cbae75e259b0654817cbb7b8baccb064a4e9e55 | 1,129 | ex | Elixir | integration_test/temple_demo/test/support/channel_case.ex | rktjmp/temple | 6fe46cbb4998477d76147fa95c9fd9c7841545ef | [
"MIT"
] | 366 | 2019-07-12T22:43:35.000Z | 2022-03-31T07:26:25.000Z | integration_test/temple_demo/test/support/channel_case.ex | rktjmp/temple | 6fe46cbb4998477d76147fa95c9fd9c7841545ef | [
"MIT"
] | 114 | 2019-07-12T19:49:10.000Z | 2022-03-23T10:48:48.000Z | integration_test/temple_demo/test/support/channel_case.ex | rktjmp/temple | 6fe46cbb4998477d76147fa95c9fd9c7841545ef | [
"MIT"
] | 17 | 2019-07-23T15:16:34.000Z | 2021-02-25T02:42:02.000Z | defmodule TempleDemoWeb.ChannelCase do
@moduledoc """
This module defines the test case to be used by
channel tests.
Such tests rely on `Phoenix.ChannelTest` and also
import other functionality to make it easier
to build common data structures and query the data layer.
Finally, if the test case interacts with the database,
we enable the SQL sandbox, so changes done to the database
are reverted at the end of every test. If you are using
PostgreSQL, you can even run database tests asynchronously
by setting `use TempleDemoWeb.ChannelCase, async: true`, although
this option is not recommended for other databases.
"""
use ExUnit.CaseTemplate
using do
quote do
# Import conveniences for testing with channels
import Phoenix.ChannelTest
import TempleDemoWeb.ChannelCase
# The default endpoint for testing
@endpoint TempleDemoWeb.Endpoint
end
end
setup tags do
:ok = Ecto.Adapters.SQL.Sandbox.checkout(TempleDemo.Repo)
unless tags[:async] do
Ecto.Adapters.SQL.Sandbox.mode(TempleDemo.Repo, {:shared, self()})
end
:ok
end
end
| 27.536585 | 72 | 0.731621 |
1cbae7cf309fb7bb4d99bce8a6bb53c16f73598c | 1,492 | ex | Elixir | clients/content/lib/google_api/content/v2/model/refund_reason.ex | kaaboaye/elixir-google-api | 1896784c4342151fd25becd089a5beb323eff567 | [
"Apache-2.0"
] | null | null | null | clients/content/lib/google_api/content/v2/model/refund_reason.ex | kaaboaye/elixir-google-api | 1896784c4342151fd25becd089a5beb323eff567 | [
"Apache-2.0"
] | null | null | null | clients/content/lib/google_api/content/v2/model/refund_reason.ex | kaaboaye/elixir-google-api | 1896784c4342151fd25becd089a5beb323eff567 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Content.V2.Model.RefundReason do
  @moduledoc """
  ## Attributes

  * `description` (*type:* `String.t`, *default:* `nil`) - Description of the reason.
  * `reasonCode` (*type:* `String.t`, *default:* `nil`) - Code of the refund reason.
  """

  use GoogleApi.Gax.ModelBase

  @type t :: %__MODULE__{
          :description => String.t(),
          :reasonCode => String.t()
        }

  field(:description)
  field(:reasonCode)
end

defimpl Poison.Decoder, for: GoogleApi.Content.V2.Model.RefundReason do
  def decode(value, options) do
    GoogleApi.Content.V2.Model.RefundReason.decode(value, options)
  end
end

defimpl Poison.Encoder, for: GoogleApi.Content.V2.Model.RefundReason do
  def encode(value, options) do
    GoogleApi.Gax.ModelBase.encode(value, options)
  end
end
1cbaf281b4424b3d5817711a892846c511b0763a | 827 | exs | Elixir | umbrella2/apps/umbrella2_app1/mix.exs | hirotnk/excoveralls_post_sample | 3a1bffa632964ff99d2fe31d4adf793f5154fbb2 | [
"MIT"
] | null | null | null | umbrella2/apps/umbrella2_app1/mix.exs | hirotnk/excoveralls_post_sample | 3a1bffa632964ff99d2fe31d4adf793f5154fbb2 | [
"MIT"
] | null | null | null | umbrella2/apps/umbrella2_app1/mix.exs | hirotnk/excoveralls_post_sample | 3a1bffa632964ff99d2fe31d4adf793f5154fbb2 | [
"MIT"
] | null | null | null | defmodule Umbrella2App1.MixProject do
use Mix.Project
def project do
[
app: :umbrella2_app1,
version: "0.1.0",
build_path: "../../_build",
config_path: "../../config/config.exs",
deps_path: "../../deps",
lockfile: "../../mix.lock",
elixir: "~> 1.11",
start_permanent: Mix.env() == :prod,
deps: deps(),
test_coverage: [tool: ExCoveralls]
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
# Run "mix help deps" to learn about dependencies.
defp deps do
[
# {:dep_from_hexpm, "~> 0.3.0"},
# {:dep_from_git, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"},
# {:sibling_app_in_umbrella, in_umbrella: true}
]
end
end
| 23.628571 | 88 | 0.579202 |
1cbb0cb37dd39373575199ee24c33b6a7c4933de | 1,412 | exs | Elixir | config/config.exs | MarbilleJuntado/phoenix-discuss-app | 7d10149f7f503385bcb673940bf59f60590f1254 | [
"MIT"
] | 1 | 2018-04-21T16:04:31.000Z | 2018-04-21T16:04:31.000Z | config/config.exs | MarbilleJuntado/phoenix-discuss-app | 7d10149f7f503385bcb673940bf59f60590f1254 | [
"MIT"
] | null | null | null | config/config.exs | MarbilleJuntado/phoenix-discuss-app | 7d10149f7f503385bcb673940bf59f60590f1254 | [
"MIT"
] | null | null | null | # This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
#
# This configuration file is loaded before any dependency and
# is restricted to this project.
use Mix.Config
# General application configuration
config :discuss,
ecto_repos: [Discuss.Repo]
# Configures the endpoint
config :discuss, Discuss.Endpoint,
url: [host: "localhost"],
secret_key_base: "p/mNbQpv7Ox/zRFATAqZF2AzJbgbErJFVphrHAUQ6IvHp3cIDCOaCeTYaRrNp4uD",
render_errors: [view: Discuss.ErrorView, accepts: ~w(html json)],
pubsub: [name: Discuss.PubSub,
adapter: Phoenix.PubSub.PG2]
# Configures Elixir's Logger
config :logger, :console,
format: "$time $metadata[$level] $message\n",
metadata: [:request_id]
# Configure Ueberauth
config :ueberauth, Ueberauth,
  providers: [
    github: { Ueberauth.Strategy.Github, [] },
    google: { Ueberauth.Strategy.Google, [] }
  ]

config :ueberauth, Ueberauth.Strategy.Github.OAuth,
  client_id: System.get_env("GITHUB_CLIENT_ID"),
  client_secret: System.get_env("GITHUB_CLIENT_SECRET")

config :ueberauth, Ueberauth.Strategy.Google.OAuth,
  client_id: System.get_env("DISCUSS_GOOGLE_ID"),
  client_secret: System.get_env("DISCUSS_GOOGLE_SECRET")

# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"
1cbb32a968de425e4b31818d242533923e780053 | 5,061 | ex | Elixir | lib/ex_doc/refs.ex | kelvinst/ex_doc | 609d9765dd6f098dc298e5d6db6430859ee934ec | [
"Apache-2.0",
"CC-BY-4.0"
] | 1,206 | 2015-01-02T02:05:12.000Z | 2022-03-29T17:18:10.000Z | lib/ex_doc/refs.ex | kelvinst/ex_doc | 609d9765dd6f098dc298e5d6db6430859ee934ec | [
"Apache-2.0",
"CC-BY-4.0"
] | 1,266 | 2015-01-03T03:26:04.000Z | 2022-03-31T09:43:53.000Z | lib/ex_doc/refs.ex | LaudateCorpus1/ex_doc | 0612bac649cbb44ef09c20dedcf7e588a150606c | [
"Apache-2.0",
"CC-BY-4.0"
] | 300 | 2015-01-03T04:07:24.000Z | 2022-03-29T08:10:56.000Z | defmodule ExDoc.Refs do
@moduledoc false
# A read-through cache of documentation references.
#
# A given ref is always associated with a module. If we don't have a ref
# in the cache we fetch the module's docs chunk and fill in the cache.
#
# If the module does not have the docs chunk, we fetch it's functions,
# callbacks and types from other sources.
@typep entry() :: {ref(), visibility()}
@typep ref() ::
{:module, module()}
| {kind(), module(), name :: atom(), arity()}
@typep kind() :: :function | :callback | :type
@typep visibility() :: :hidden | :public | :undefined
@name __MODULE__
use GenServer
@spec start_link(any()) :: GenServer.on_start()
def start_link(arg) do
GenServer.start_link(__MODULE__, arg, name: @name)
end
@spec init(any()) :: {:ok, nil}
def init(_) do
:ets.new(@name, [:named_table, :public, :set])
{:ok, nil}
end
@spec clear() :: :ok
def clear() do
:ets.delete_all_objects(@name)
:ok
end
@spec get_visibility(ref()) :: visibility()
def get_visibility(ref) do
case lookup(ref) do
{:ok, visibility} ->
visibility
:error ->
fetch(ref)
end
end
defp lookup(ref) do
case :ets.lookup(@name, ref) do
[{^ref, visibility}] ->
{:ok, visibility}
[] ->
:error
end
rescue
_ ->
:error
end
@spec insert([entry()]) :: :ok
def insert(entries) do
true = :ets.insert(@name, entries)
:ok
end
@spec insert_from_chunk(module, tuple()) :: :ok
def insert_from_chunk(module, result) do
entries = fetch_entries(module, result)
insert(entries)
:ok
end
defp fetch({:module, module} = ref) do
entries = fetch_entries(module, ExDoc.Utils.Code.fetch_docs(module))
insert(entries)
Map.get(Map.new(entries), ref, :undefined)
end
defp fetch({_kind, module, _name, _arity} = ref) do
with module_visibility <- fetch({:module, module}),
true <- module_visibility in [:public, :hidden],
{:ok, visibility} <- lookup(ref) do
visibility
else
_ ->
:undefined
end
end
defp fetch_entries(module, result) do
case result do
{:docs_v1, _, _, _, module_doc, _, docs} ->
module_visibility = visibility(module_doc)
for {{kind, name, arity}, _, _, doc, metadata} <- docs do
ref_kind = to_ref_kind(kind)
visibility = visibility(module_doc, {ref_kind, name, doc})
for arity <- (arity - (metadata[:defaults] || 0))..arity do
{{ref_kind, module, name, arity}, visibility}
end
end
|> List.flatten()
|> Enum.concat([
{{:module, module}, module_visibility}
| to_refs(types(module, [:typep]), module, :type, :hidden)
])
{:error, _} ->
if Code.ensure_loaded?(module) do
(to_refs(exports(module), module, :function) ++
to_refs(callbacks(module), module, :callback) ++
to_refs(types(module, [:type, :opaque]), module, :type) ++
to_refs(types(module, [:typep]), module, :type, :hidden))
|> Enum.concat([{{:module, module}, :public}])
else
[{{:module, module}, :undefined}]
end
end
end
defguardp has_no_docs(doc) when doc == :none or doc == %{}
defp starts_with_underscore?(name), do: hd(Atom.to_charlist(name)) == ?_
defp visibility(:hidden),
do: :hidden
defp visibility(_module_doc),
do: :public
defp visibility(_module_doc, {kind, _name, _doc})
when kind not in [:callback, :function, :type],
do: raise(ArgumentError, "Unknown kind #{inspect(kind)}")
defp visibility(:hidden, {_kind, _name, _doc}),
do: :hidden
defp visibility(_, {_kind, _name, :hidden}),
do: :hidden
defp visibility(_, {kind, name, doc}) when has_no_docs(doc) do
cond do
kind in [:callback, :type] ->
:public
kind == :function and starts_with_underscore?(name) ->
:hidden
kind == :function ->
:public
end
end
defp visibility(_, {_, _, _}) do
:public
end
defp to_ref_kind(:macro), do: :function
defp to_ref_kind(:macrocallback), do: :callback
defp to_ref_kind(other), do: other
defp exports(module) do
if function_exported?(module, :__info__, 1) do
module.__info__(:functions) ++ module.__info__(:macros)
else
module.module_info(:exports)
end
end
defp callbacks(module) do
if function_exported?(module, :behaviour_info, 1) do
module.behaviour_info(:callbacks)
else
[]
end
end
defp types(module, kind_list) do
case Code.Typespec.fetch_types(module) do
{:ok, list} ->
for {kind, {name, _, args}} <- list,
kind in kind_list do
{name, length(args)}
end
:error ->
[]
end
end
defp to_refs(list, module, kind, visibility \\ :public) do
for {name, arity} <- list do
{{kind, module, name, arity}, visibility}
end
end
end
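The `visibility/2` clauses above form a small decision table: a hidden module doc hides every entry, an entry whose own doc is marked hidden stays hidden, and undocumented entries default to public unless they are functions with a leading underscore. A standalone sketch of the same rules follows; the module name `VisibilitySketch` is illustrative only and not part of the original file.

```elixir
# Hypothetical restatement of the `visibility/2` decision table above;
# the module name is illustrative only.
defmodule VisibilitySketch do
  # A hidden module doc hides everything the module contains.
  def visibility(:hidden, _entry), do: :hidden

  # An entry whose own doc is marked hidden stays hidden.
  def visibility(_module_doc, {_kind, _name, :hidden}), do: :hidden

  # Undocumented functions with a leading underscore are hidden;
  # undocumented callbacks and types default to public.
  def visibility(_module_doc, {:function, name, doc})
      when doc == :none or doc == %{} do
    if String.starts_with?(Atom.to_string(name), "_"), do: :hidden, else: :public
  end

  def visibility(_module_doc, _entry), do: :public
end
```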
# --- file: test/rondo/test_test.exs | repo: exstruct/rondo | license: MIT ---
defmodule Test.Rondo.Test do
use Test.Rondo.Case
context :render_shallow do
defmodule First do
use Rondo.Component
def render(_) do
throw :IT_RENDERED
end
end
defmodule Second do
use Rondo.Component
def render(_) do
el(First)
end
end
after
"call" ->
{app, _store} = render_shallow(Second, %TestStore{})
assert_path app, [], %{type: First}
end
context :find_element do
defmodule Component do
use Rondo.Component
def render(_) do
nest(100)
end
defp nest(0) do
el("Item", %{key: :foo})
end
defp nest(count) when rem(count, 2) == 0 do
el("Container", nil, [nest(count - 1)])
end
defp nest(count) do
el("Item", %{key: :foo}, [nest(count - 1)])
end
end
after
"render" ->
{app, _store} = render(Component)
elements = find_element(app, fn
(%{key: :foo}) ->
true
(_) ->
false
end)
assert length(elements) == 51
end
context :action_error do
defmodule Action do
def affordance(_) do
%{}
end
end
defmodule Component do
use Rondo.Component
def state(_, _) do
%{
thing: create_store()
}
end
def render(_) do
el("Item", %{action: action(ref([:thing]), Action)})
end
end
after
"put_action_error" ->
{app, store} = render(Component)
store = TestStore.put_action_error(store, "You goofed!")
{:ok, ref} = fetch_path(app, [:props, :action, :ref])
{:error, _error, _app, _store} = submit_action(app, store, ref, 1)
end
end
# --- file: clients/cloud_kms/lib/google_api/cloud_kms/v1/model/asymmetric_decrypt_request.ex | repo: mocknen/elixir-google-api | license: Apache-2.0 ---
# Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.CloudKMS.V1.Model.AsymmetricDecryptRequest do
@moduledoc """
Request message for KeyManagementService.AsymmetricDecrypt.
## Attributes
- ciphertext (binary()): Required. The data encrypted with the named CryptoKeyVersion's public key using OAEP. Defaults to: `null`.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:ciphertext => any()
}
field(:ciphertext)
end
defimpl Poison.Decoder, for: GoogleApi.CloudKMS.V1.Model.AsymmetricDecryptRequest do
def decode(value, options) do
GoogleApi.CloudKMS.V1.Model.AsymmetricDecryptRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudKMS.V1.Model.AsymmetricDecryptRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
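The moduledoc above notes that `ciphertext` carries data encrypted against the CryptoKeyVersion's public key using OAEP. A minimal sketch of how a request struct like this is typically built and serialized — assuming the `:poison` dependency the `defimpl` blocks above target, and assuming the binary is base64-encoded for JSON transport (both assumptions, not guarantees from this file):

```elixir
# Illustrative sketch only; relies on the surrounding module plus the
# :poison dependency targeted by the defimpl blocks above.
request = %GoogleApi.CloudKMS.V1.Model.AsymmetricDecryptRequest{
  # JSON cannot carry raw bytes, so the ciphertext is assumed to be
  # base64-encoded before serialization.
  ciphertext: Base.encode64(<<1, 2, 3, 4>>)
}

json = Poison.encode!(request)
```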
# --- file: lib/oli_web/live/community_live/members_index_view.ex | repo: malav2110/oli-torus | license: MIT ---
defmodule OliWeb.CommunityLive.MembersIndexView do
use Surface.LiveView, layout: {OliWeb.LayoutView, "live.html"}
use OliWeb.Common.SortableTable.TableHandlers
alias Oli.Groups
alias OliWeb.Common.{Breadcrumb, Filter, Listing}
alias OliWeb.CommunityLive.{ShowView, MembersTableModel}
alias OliWeb.Router.Helpers, as: Routes
data title, :string, default: "Community Members"
data breadcrumbs, :any
data query, :string, default: ""
data total_count, :integer, default: 0
data offset, :integer, default: 0
data limit, :integer, default: 20
data sort, :string, default: "sort"
data page_change, :string, default: "page_change"
data show_bottom_paging, :boolean, default: false
data additional_table_class, :string, default: ""
@table_filter_fn &__MODULE__.filter_rows/3
@table_push_patch_path &__MODULE__.live_path/2
def filter_rows(socket, query, _filter) do
Enum.filter(socket.assigns.members, &member_filter(&1.name, &1.email, query))
end
def live_path(socket, params) do
Routes.live_path(socket, __MODULE__, socket.assigns.community_id, params)
end
def breadcrumb(community_id) do
ShowView.breadcrumb(community_id) ++
[
Breadcrumb.new(%{
full_title: "Members",
link: Routes.live_path(OliWeb.Endpoint, __MODULE__, community_id)
})
]
end
def mount(%{"community_id" => community_id}, _session, socket) do
members = Groups.list_community_members(community_id)
{:ok, table_model} = MembersTableModel.new(members)
{:ok,
assign(socket,
breadcrumbs: breadcrumb(community_id),
members: members,
community_id: community_id,
table_model: table_model,
total_count: length(members)
)}
end
def render(assigns) do
~F"""
<div class="d-flex p-3 justify-content-between">
<Filter
change="change_search"
reset="reset_search"
apply="apply_search"
query={@query}/>
</div>
<div class="p-4">
<Listing
filter={@query}
table_model={@table_model}
total_count={@total_count}
offset={@offset}
limit={@limit}
sort={@sort}
page_change={@page_change}
show_bottom_paging={@show_bottom_paging}
additional_table_class={@additional_table_class}/>
</div>
"""
end
def handle_event("remove", %{"id" => id}, socket) do
socket = clear_flash(socket)
case Groups.delete_community_account(%{
community_id: socket.assigns.community_id,
user_id: id
}) do
{:ok, _community_account} ->
members = Groups.list_community_members(socket.assigns.community_id)
{:ok, table_model} = MembersTableModel.new(members)
socket =
put_flash(socket, :info, "Community member successfully removed.")
|> assign(
members: members,
table_model: table_model,
total_count: length(members)
)
{:noreply,
push_patch(socket,
to:
@table_push_patch_path.(
socket,
get_patch_params(
socket.assigns.table_model,
socket.assigns.offset,
socket.assigns.query,
socket.assigns.filter
)
),
replace: true
)}
{:error, _error} ->
{:noreply, put_flash(socket, :error, "Community member couldn't be removed.")}
end
end
defp member_filter(nil, email, query),
do: String.contains?(String.downcase(email), String.downcase(query))
defp member_filter(name, email, query) do
String.contains?(String.downcase(name), String.downcase(query)) or
String.contains?(String.downcase(email), String.downcase(query))
end
end
# --- file: lib/codes/codes_t26.ex | repo: badubizzle/icd_code | license: Apache-2.0 ---
defmodule IcdCode.ICDCode.Codes_T26 do
alias IcdCode.ICDCode
def _T2600XA do
%ICDCode{full_code: "T2600XA",
category_code: "T26",
short_code: "00XA",
full_name: "Burn of unspecified eyelid and periocular area, initial encounter",
short_name: "Burn of unspecified eyelid and periocular area, initial encounter",
category_name: "Burn of unspecified eyelid and periocular area, initial encounter"
}
end
def _T2600XD do
%ICDCode{full_code: "T2600XD",
category_code: "T26",
short_code: "00XD",
full_name: "Burn of unspecified eyelid and periocular area, subsequent encounter",
short_name: "Burn of unspecified eyelid and periocular area, subsequent encounter",
category_name: "Burn of unspecified eyelid and periocular area, subsequent encounter"
}
end
def _T2600XS do
%ICDCode{full_code: "T2600XS",
category_code: "T26",
short_code: "00XS",
full_name: "Burn of unspecified eyelid and periocular area, sequela",
short_name: "Burn of unspecified eyelid and periocular area, sequela",
category_name: "Burn of unspecified eyelid and periocular area, sequela"
}
end
def _T2601XA do
%ICDCode{full_code: "T2601XA",
category_code: "T26",
short_code: "01XA",
full_name: "Burn of right eyelid and periocular area, initial encounter",
short_name: "Burn of right eyelid and periocular area, initial encounter",
category_name: "Burn of right eyelid and periocular area, initial encounter"
}
end
def _T2601XD do
%ICDCode{full_code: "T2601XD",
category_code: "T26",
short_code: "01XD",
full_name: "Burn of right eyelid and periocular area, subsequent encounter",
short_name: "Burn of right eyelid and periocular area, subsequent encounter",
category_name: "Burn of right eyelid and periocular area, subsequent encounter"
}
end
def _T2601XS do
%ICDCode{full_code: "T2601XS",
category_code: "T26",
short_code: "01XS",
full_name: "Burn of right eyelid and periocular area, sequela",
short_name: "Burn of right eyelid and periocular area, sequela",
category_name: "Burn of right eyelid and periocular area, sequela"
}
end
def _T2602XA do
%ICDCode{full_code: "T2602XA",
category_code: "T26",
short_code: "02XA",
full_name: "Burn of left eyelid and periocular area, initial encounter",
short_name: "Burn of left eyelid and periocular area, initial encounter",
category_name: "Burn of left eyelid and periocular area, initial encounter"
}
end
def _T2602XD do
%ICDCode{full_code: "T2602XD",
category_code: "T26",
short_code: "02XD",
full_name: "Burn of left eyelid and periocular area, subsequent encounter",
short_name: "Burn of left eyelid and periocular area, subsequent encounter",
category_name: "Burn of left eyelid and periocular area, subsequent encounter"
}
end
def _T2602XS do
%ICDCode{full_code: "T2602XS",
category_code: "T26",
short_code: "02XS",
full_name: "Burn of left eyelid and periocular area, sequela",
short_name: "Burn of left eyelid and periocular area, sequela",
category_name: "Burn of left eyelid and periocular area, sequela"
}
end
def _T2610XA do
%ICDCode{full_code: "T2610XA",
category_code: "T26",
short_code: "10XA",
full_name: "Burn of cornea and conjunctival sac, unspecified eye, initial encounter",
short_name: "Burn of cornea and conjunctival sac, unspecified eye, initial encounter",
category_name: "Burn of cornea and conjunctival sac, unspecified eye, initial encounter"
}
end
def _T2610XD do
%ICDCode{full_code: "T2610XD",
category_code: "T26",
short_code: "10XD",
full_name: "Burn of cornea and conjunctival sac, unspecified eye, subsequent encounter",
short_name: "Burn of cornea and conjunctival sac, unspecified eye, subsequent encounter",
category_name: "Burn of cornea and conjunctival sac, unspecified eye, subsequent encounter"
}
end
def _T2610XS do
%ICDCode{full_code: "T2610XS",
category_code: "T26",
short_code: "10XS",
full_name: "Burn of cornea and conjunctival sac, unspecified eye, sequela",
short_name: "Burn of cornea and conjunctival sac, unspecified eye, sequela",
category_name: "Burn of cornea and conjunctival sac, unspecified eye, sequela"
}
end
def _T2611XA do
%ICDCode{full_code: "T2611XA",
category_code: "T26",
short_code: "11XA",
full_name: "Burn of cornea and conjunctival sac, right eye, initial encounter",
short_name: "Burn of cornea and conjunctival sac, right eye, initial encounter",
category_name: "Burn of cornea and conjunctival sac, right eye, initial encounter"
}
end
def _T2611XD do
%ICDCode{full_code: "T2611XD",
category_code: "T26",
short_code: "11XD",
full_name: "Burn of cornea and conjunctival sac, right eye, subsequent encounter",
short_name: "Burn of cornea and conjunctival sac, right eye, subsequent encounter",
category_name: "Burn of cornea and conjunctival sac, right eye, subsequent encounter"
}
end
def _T2611XS do
%ICDCode{full_code: "T2611XS",
category_code: "T26",
short_code: "11XS",
full_name: "Burn of cornea and conjunctival sac, right eye, sequela",
short_name: "Burn of cornea and conjunctival sac, right eye, sequela",
category_name: "Burn of cornea and conjunctival sac, right eye, sequela"
}
end
def _T2612XA do
%ICDCode{full_code: "T2612XA",
category_code: "T26",
short_code: "12XA",
full_name: "Burn of cornea and conjunctival sac, left eye, initial encounter",
short_name: "Burn of cornea and conjunctival sac, left eye, initial encounter",
category_name: "Burn of cornea and conjunctival sac, left eye, initial encounter"
}
end
def _T2612XD do
%ICDCode{full_code: "T2612XD",
category_code: "T26",
short_code: "12XD",
full_name: "Burn of cornea and conjunctival sac, left eye, subsequent encounter",
short_name: "Burn of cornea and conjunctival sac, left eye, subsequent encounter",
category_name: "Burn of cornea and conjunctival sac, left eye, subsequent encounter"
}
end
def _T2612XS do
%ICDCode{full_code: "T2612XS",
category_code: "T26",
short_code: "12XS",
full_name: "Burn of cornea and conjunctival sac, left eye, sequela",
short_name: "Burn of cornea and conjunctival sac, left eye, sequela",
category_name: "Burn of cornea and conjunctival sac, left eye, sequela"
}
end
def _T2620XA do
%ICDCode{full_code: "T2620XA",
category_code: "T26",
short_code: "20XA",
full_name: "Burn with resulting rupture and destruction of unspecified eyeball, initial encounter",
short_name: "Burn with resulting rupture and destruction of unspecified eyeball, initial encounter",
category_name: "Burn with resulting rupture and destruction of unspecified eyeball, initial encounter"
}
end
def _T2620XD do
%ICDCode{full_code: "T2620XD",
category_code: "T26",
short_code: "20XD",
full_name: "Burn with resulting rupture and destruction of unspecified eyeball, subsequent encounter",
short_name: "Burn with resulting rupture and destruction of unspecified eyeball, subsequent encounter",
category_name: "Burn with resulting rupture and destruction of unspecified eyeball, subsequent encounter"
}
end
def _T2620XS do
%ICDCode{full_code: "T2620XS",
category_code: "T26",
short_code: "20XS",
full_name: "Burn with resulting rupture and destruction of unspecified eyeball, sequela",
short_name: "Burn with resulting rupture and destruction of unspecified eyeball, sequela",
category_name: "Burn with resulting rupture and destruction of unspecified eyeball, sequela"
}
end
def _T2621XA do
%ICDCode{full_code: "T2621XA",
category_code: "T26",
short_code: "21XA",
full_name: "Burn with resulting rupture and destruction of right eyeball, initial encounter",
short_name: "Burn with resulting rupture and destruction of right eyeball, initial encounter",
category_name: "Burn with resulting rupture and destruction of right eyeball, initial encounter"
}
end
def _T2621XD do
%ICDCode{full_code: "T2621XD",
category_code: "T26",
short_code: "21XD",
full_name: "Burn with resulting rupture and destruction of right eyeball, subsequent encounter",
short_name: "Burn with resulting rupture and destruction of right eyeball, subsequent encounter",
category_name: "Burn with resulting rupture and destruction of right eyeball, subsequent encounter"
}
end
def _T2621XS do
%ICDCode{full_code: "T2621XS",
category_code: "T26",
short_code: "21XS",
full_name: "Burn with resulting rupture and destruction of right eyeball, sequela",
short_name: "Burn with resulting rupture and destruction of right eyeball, sequela",
category_name: "Burn with resulting rupture and destruction of right eyeball, sequela"
}
end
def _T2622XA do
%ICDCode{full_code: "T2622XA",
category_code: "T26",
short_code: "22XA",
full_name: "Burn with resulting rupture and destruction of left eyeball, initial encounter",
short_name: "Burn with resulting rupture and destruction of left eyeball, initial encounter",
category_name: "Burn with resulting rupture and destruction of left eyeball, initial encounter"
}
end
def _T2622XD do
%ICDCode{full_code: "T2622XD",
category_code: "T26",
short_code: "22XD",
full_name: "Burn with resulting rupture and destruction of left eyeball, subsequent encounter",
short_name: "Burn with resulting rupture and destruction of left eyeball, subsequent encounter",
category_name: "Burn with resulting rupture and destruction of left eyeball, subsequent encounter"
}
end
def _T2622XS do
%ICDCode{full_code: "T2622XS",
category_code: "T26",
short_code: "22XS",
full_name: "Burn with resulting rupture and destruction of left eyeball, sequela",
short_name: "Burn with resulting rupture and destruction of left eyeball, sequela",
category_name: "Burn with resulting rupture and destruction of left eyeball, sequela"
}
end
def _T2630XA do
%ICDCode{full_code: "T2630XA",
category_code: "T26",
short_code: "30XA",
full_name: "Burns of other specified parts of unspecified eye and adnexa, initial encounter",
short_name: "Burns of other specified parts of unspecified eye and adnexa, initial encounter",
category_name: "Burns of other specified parts of unspecified eye and adnexa, initial encounter"
}
end
def _T2630XD do
%ICDCode{full_code: "T2630XD",
category_code: "T26",
short_code: "30XD",
full_name: "Burns of other specified parts of unspecified eye and adnexa, subsequent encounter",
short_name: "Burns of other specified parts of unspecified eye and adnexa, subsequent encounter",
category_name: "Burns of other specified parts of unspecified eye and adnexa, subsequent encounter"
}
end
def _T2630XS do
%ICDCode{full_code: "T2630XS",
category_code: "T26",
short_code: "30XS",
full_name: "Burns of other specified parts of unspecified eye and adnexa, sequela",
short_name: "Burns of other specified parts of unspecified eye and adnexa, sequela",
category_name: "Burns of other specified parts of unspecified eye and adnexa, sequela"
}
end
def _T2631XA do
%ICDCode{full_code: "T2631XA",
category_code: "T26",
short_code: "31XA",
full_name: "Burns of other specified parts of right eye and adnexa, initial encounter",
short_name: "Burns of other specified parts of right eye and adnexa, initial encounter",
category_name: "Burns of other specified parts of right eye and adnexa, initial encounter"
}
end
def _T2631XD do
%ICDCode{full_code: "T2631XD",
category_code: "T26",
short_code: "31XD",
full_name: "Burns of other specified parts of right eye and adnexa, subsequent encounter",
short_name: "Burns of other specified parts of right eye and adnexa, subsequent encounter",
category_name: "Burns of other specified parts of right eye and adnexa, subsequent encounter"
}
end
def _T2631XS do
%ICDCode{full_code: "T2631XS",
category_code: "T26",
short_code: "31XS",
full_name: "Burns of other specified parts of right eye and adnexa, sequela",
short_name: "Burns of other specified parts of right eye and adnexa, sequela",
category_name: "Burns of other specified parts of right eye and adnexa, sequela"
}
end
def _T2632XA do
%ICDCode{full_code: "T2632XA",
category_code: "T26",
short_code: "32XA",
full_name: "Burns of other specified parts of left eye and adnexa, initial encounter",
short_name: "Burns of other specified parts of left eye and adnexa, initial encounter",
category_name: "Burns of other specified parts of left eye and adnexa, initial encounter"
}
end
def _T2632XD do
%ICDCode{full_code: "T2632XD",
category_code: "T26",
short_code: "32XD",
full_name: "Burns of other specified parts of left eye and adnexa, subsequent encounter",
short_name: "Burns of other specified parts of left eye and adnexa, subsequent encounter",
category_name: "Burns of other specified parts of left eye and adnexa, subsequent encounter"
}
end
def _T2632XS do
%ICDCode{full_code: "T2632XS",
category_code: "T26",
short_code: "32XS",
full_name: "Burns of other specified parts of left eye and adnexa, sequela",
short_name: "Burns of other specified parts of left eye and adnexa, sequela",
category_name: "Burns of other specified parts of left eye and adnexa, sequela"
}
end
def _T2640XA do
%ICDCode{full_code: "T2640XA",
category_code: "T26",
short_code: "40XA",
full_name: "Burn of unspecified eye and adnexa, part unspecified, initial encounter",
short_name: "Burn of unspecified eye and adnexa, part unspecified, initial encounter",
category_name: "Burn of unspecified eye and adnexa, part unspecified, initial encounter"
}
end
def _T2640XD do
%ICDCode{full_code: "T2640XD",
category_code: "T26",
short_code: "40XD",
full_name: "Burn of unspecified eye and adnexa, part unspecified, subsequent encounter",
short_name: "Burn of unspecified eye and adnexa, part unspecified, subsequent encounter",
category_name: "Burn of unspecified eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2640XS do
%ICDCode{full_code: "T2640XS",
category_code: "T26",
short_code: "40XS",
full_name: "Burn of unspecified eye and adnexa, part unspecified, sequela",
short_name: "Burn of unspecified eye and adnexa, part unspecified, sequela",
category_name: "Burn of unspecified eye and adnexa, part unspecified, sequela"
}
end
def _T2641XA do
%ICDCode{full_code: "T2641XA",
category_code: "T26",
short_code: "41XA",
full_name: "Burn of right eye and adnexa, part unspecified, initial encounter",
short_name: "Burn of right eye and adnexa, part unspecified, initial encounter",
category_name: "Burn of right eye and adnexa, part unspecified, initial encounter"
}
end
def _T2641XD do
%ICDCode{full_code: "T2641XD",
category_code: "T26",
short_code: "41XD",
full_name: "Burn of right eye and adnexa, part unspecified, subsequent encounter",
short_name: "Burn of right eye and adnexa, part unspecified, subsequent encounter",
category_name: "Burn of right eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2641XS do
%ICDCode{full_code: "T2641XS",
category_code: "T26",
short_code: "41XS",
full_name: "Burn of right eye and adnexa, part unspecified, sequela",
short_name: "Burn of right eye and adnexa, part unspecified, sequela",
category_name: "Burn of right eye and adnexa, part unspecified, sequela"
}
end
def _T2642XA do
%ICDCode{full_code: "T2642XA",
category_code: "T26",
short_code: "42XA",
full_name: "Burn of left eye and adnexa, part unspecified, initial encounter",
short_name: "Burn of left eye and adnexa, part unspecified, initial encounter",
category_name: "Burn of left eye and adnexa, part unspecified, initial encounter"
}
end
def _T2642XD do
%ICDCode{full_code: "T2642XD",
category_code: "T26",
short_code: "42XD",
full_name: "Burn of left eye and adnexa, part unspecified, subsequent encounter",
short_name: "Burn of left eye and adnexa, part unspecified, subsequent encounter",
category_name: "Burn of left eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2642XS do
%ICDCode{full_code: "T2642XS",
category_code: "T26",
short_code: "42XS",
full_name: "Burn of left eye and adnexa, part unspecified, sequela",
short_name: "Burn of left eye and adnexa, part unspecified, sequela",
category_name: "Burn of left eye and adnexa, part unspecified, sequela"
}
end
def _T2650XA do
%ICDCode{full_code: "T2650XA",
category_code: "T26",
short_code: "50XA",
full_name: "Corrosion of unspecified eyelid and periocular area, initial encounter",
short_name: "Corrosion of unspecified eyelid and periocular area, initial encounter",
category_name: "Corrosion of unspecified eyelid and periocular area, initial encounter"
}
end
def _T2650XD do
%ICDCode{full_code: "T2650XD",
category_code: "T26",
short_code: "50XD",
full_name: "Corrosion of unspecified eyelid and periocular area, subsequent encounter",
short_name: "Corrosion of unspecified eyelid and periocular area, subsequent encounter",
category_name: "Corrosion of unspecified eyelid and periocular area, subsequent encounter"
}
end
def _T2650XS do
%ICDCode{full_code: "T2650XS",
category_code: "T26",
short_code: "50XS",
full_name: "Corrosion of unspecified eyelid and periocular area, sequela",
short_name: "Corrosion of unspecified eyelid and periocular area, sequela",
category_name: "Corrosion of unspecified eyelid and periocular area, sequela"
}
end
def _T2651XA do
%ICDCode{full_code: "T2651XA",
category_code: "T26",
short_code: "51XA",
full_name: "Corrosion of right eyelid and periocular area, initial encounter",
short_name: "Corrosion of right eyelid and periocular area, initial encounter",
category_name: "Corrosion of right eyelid and periocular area, initial encounter"
}
end
def _T2651XD do
%ICDCode{full_code: "T2651XD",
category_code: "T26",
short_code: "51XD",
full_name: "Corrosion of right eyelid and periocular area, subsequent encounter",
short_name: "Corrosion of right eyelid and periocular area, subsequent encounter",
category_name: "Corrosion of right eyelid and periocular area, subsequent encounter"
}
end
def _T2651XS do
%ICDCode{full_code: "T2651XS",
category_code: "T26",
short_code: "51XS",
full_name: "Corrosion of right eyelid and periocular area, sequela",
short_name: "Corrosion of right eyelid and periocular area, sequela",
category_name: "Corrosion of right eyelid and periocular area, sequela"
}
end
def _T2652XA do
%ICDCode{full_code: "T2652XA",
category_code: "T26",
short_code: "52XA",
full_name: "Corrosion of left eyelid and periocular area, initial encounter",
short_name: "Corrosion of left eyelid and periocular area, initial encounter",
category_name: "Corrosion of left eyelid and periocular area, initial encounter"
}
end
def _T2652XD do
%ICDCode{full_code: "T2652XD",
category_code: "T26",
short_code: "52XD",
full_name: "Corrosion of left eyelid and periocular area, subsequent encounter",
short_name: "Corrosion of left eyelid and periocular area, subsequent encounter",
category_name: "Corrosion of left eyelid and periocular area, subsequent encounter"
}
end
def _T2652XS do
%ICDCode{full_code: "T2652XS",
category_code: "T26",
short_code: "52XS",
full_name: "Corrosion of left eyelid and periocular area, sequela",
short_name: "Corrosion of left eyelid and periocular area, sequela",
category_name: "Corrosion of left eyelid and periocular area, sequela"
}
end
def _T2660XA do
%ICDCode{full_code: "T2660XA",
category_code: "T26",
short_code: "60XA",
full_name: "Corrosion of cornea and conjunctival sac, unspecified eye, initial encounter",
short_name: "Corrosion of cornea and conjunctival sac, unspecified eye, initial encounter",
category_name: "Corrosion of cornea and conjunctival sac, unspecified eye, initial encounter"
}
end
def _T2660XD do
%ICDCode{full_code: "T2660XD",
category_code: "T26",
short_code: "60XD",
full_name: "Corrosion of cornea and conjunctival sac, unspecified eye, subsequent encounter",
short_name: "Corrosion of cornea and conjunctival sac, unspecified eye, subsequent encounter",
category_name: "Corrosion of cornea and conjunctival sac, unspecified eye, subsequent encounter"
}
end
def _T2660XS do
%ICDCode{full_code: "T2660XS",
category_code: "T26",
short_code: "60XS",
full_name: "Corrosion of cornea and conjunctival sac, unspecified eye, sequela",
short_name: "Corrosion of cornea and conjunctival sac, unspecified eye, sequela",
category_name: "Corrosion of cornea and conjunctival sac, unspecified eye, sequela"
}
end
def _T2661XA do
%ICDCode{full_code: "T2661XA",
category_code: "T26",
short_code: "61XA",
full_name: "Corrosion of cornea and conjunctival sac, right eye, initial encounter",
short_name: "Corrosion of cornea and conjunctival sac, right eye, initial encounter",
category_name: "Corrosion of cornea and conjunctival sac, right eye, initial encounter"
}
end
def _T2661XD do
%ICDCode{full_code: "T2661XD",
category_code: "T26",
short_code: "61XD",
full_name: "Corrosion of cornea and conjunctival sac, right eye, subsequent encounter",
short_name: "Corrosion of cornea and conjunctival sac, right eye, subsequent encounter",
category_name: "Corrosion of cornea and conjunctival sac, right eye, subsequent encounter"
}
end
def _T2661XS do
%ICDCode{full_code: "T2661XS",
category_code: "T26",
short_code: "61XS",
full_name: "Corrosion of cornea and conjunctival sac, right eye, sequela",
short_name: "Corrosion of cornea and conjunctival sac, right eye, sequela",
category_name: "Corrosion of cornea and conjunctival sac, right eye, sequela"
}
end
def _T2662XA do
%ICDCode{full_code: "T2662XA",
category_code: "T26",
short_code: "62XA",
full_name: "Corrosion of cornea and conjunctival sac, left eye, initial encounter",
short_name: "Corrosion of cornea and conjunctival sac, left eye, initial encounter",
category_name: "Corrosion of cornea and conjunctival sac, left eye, initial encounter"
}
end
def _T2662XD do
%ICDCode{full_code: "T2662XD",
category_code: "T26",
short_code: "62XD",
full_name: "Corrosion of cornea and conjunctival sac, left eye, subsequent encounter",
short_name: "Corrosion of cornea and conjunctival sac, left eye, subsequent encounter",
category_name: "Corrosion of cornea and conjunctival sac, left eye, subsequent encounter"
}
end
def _T2662XS do
%ICDCode{full_code: "T2662XS",
category_code: "T26",
short_code: "62XS",
full_name: "Corrosion of cornea and conjunctival sac, left eye, sequela",
short_name: "Corrosion of cornea and conjunctival sac, left eye, sequela",
category_name: "Corrosion of cornea and conjunctival sac, left eye, sequela"
}
end
def _T2670XA do
%ICDCode{full_code: "T2670XA",
category_code: "T26",
short_code: "70XA",
full_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, initial encounter",
short_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, initial encounter",
category_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, initial encounter"
}
end
def _T2670XD do
%ICDCode{full_code: "T2670XD",
category_code: "T26",
short_code: "70XD",
full_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, subsequent encounter",
short_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, subsequent encounter",
category_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, subsequent encounter"
}
end
def _T2670XS do
%ICDCode{full_code: "T2670XS",
category_code: "T26",
short_code: "70XS",
full_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, sequela",
short_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, sequela",
category_name: "Corrosion with resulting rupture and destruction of unspecified eyeball, sequela"
}
end
def _T2671XA do
%ICDCode{full_code: "T2671XA",
category_code: "T26",
short_code: "71XA",
full_name: "Corrosion with resulting rupture and destruction of right eyeball, initial encounter",
short_name: "Corrosion with resulting rupture and destruction of right eyeball, initial encounter",
category_name: "Corrosion with resulting rupture and destruction of right eyeball, initial encounter"
}
end
def _T2671XD do
%ICDCode{full_code: "T2671XD",
category_code: "T26",
short_code: "71XD",
full_name: "Corrosion with resulting rupture and destruction of right eyeball, subsequent encounter",
short_name: "Corrosion with resulting rupture and destruction of right eyeball, subsequent encounter",
category_name: "Corrosion with resulting rupture and destruction of right eyeball, subsequent encounter"
}
end
def _T2671XS do
%ICDCode{full_code: "T2671XS",
category_code: "T26",
short_code: "71XS",
full_name: "Corrosion with resulting rupture and destruction of right eyeball, sequela",
short_name: "Corrosion with resulting rupture and destruction of right eyeball, sequela",
category_name: "Corrosion with resulting rupture and destruction of right eyeball, sequela"
}
end
def _T2672XA do
%ICDCode{full_code: "T2672XA",
category_code: "T26",
short_code: "72XA",
full_name: "Corrosion with resulting rupture and destruction of left eyeball, initial encounter",
short_name: "Corrosion with resulting rupture and destruction of left eyeball, initial encounter",
category_name: "Corrosion with resulting rupture and destruction of left eyeball, initial encounter"
}
end
def _T2672XD do
%ICDCode{full_code: "T2672XD",
category_code: "T26",
short_code: "72XD",
full_name: "Corrosion with resulting rupture and destruction of left eyeball, subsequent encounter",
short_name: "Corrosion with resulting rupture and destruction of left eyeball, subsequent encounter",
category_name: "Corrosion with resulting rupture and destruction of left eyeball, subsequent encounter"
}
end
def _T2672XS do
%ICDCode{full_code: "T2672XS",
category_code: "T26",
short_code: "72XS",
full_name: "Corrosion with resulting rupture and destruction of left eyeball, sequela",
short_name: "Corrosion with resulting rupture and destruction of left eyeball, sequela",
category_name: "Corrosion with resulting rupture and destruction of left eyeball, sequela"
}
end
def _T2680XA do
%ICDCode{full_code: "T2680XA",
category_code: "T26",
short_code: "80XA",
full_name: "Corrosions of other specified parts of unspecified eye and adnexa, initial encounter",
short_name: "Corrosions of other specified parts of unspecified eye and adnexa, initial encounter",
category_name: "Corrosions of other specified parts of unspecified eye and adnexa, initial encounter"
}
end
def _T2680XD do
%ICDCode{full_code: "T2680XD",
category_code: "T26",
short_code: "80XD",
full_name: "Corrosions of other specified parts of unspecified eye and adnexa, subsequent encounter",
short_name: "Corrosions of other specified parts of unspecified eye and adnexa, subsequent encounter",
category_name: "Corrosions of other specified parts of unspecified eye and adnexa, subsequent encounter"
}
end
def _T2680XS do
%ICDCode{full_code: "T2680XS",
category_code: "T26",
short_code: "80XS",
full_name: "Corrosions of other specified parts of unspecified eye and adnexa, sequela",
short_name: "Corrosions of other specified parts of unspecified eye and adnexa, sequela",
category_name: "Corrosions of other specified parts of unspecified eye and adnexa, sequela"
}
end
def _T2681XA do
%ICDCode{full_code: "T2681XA",
category_code: "T26",
short_code: "81XA",
full_name: "Corrosions of other specified parts of right eye and adnexa, initial encounter",
short_name: "Corrosions of other specified parts of right eye and adnexa, initial encounter",
category_name: "Corrosions of other specified parts of right eye and adnexa, initial encounter"
}
end
def _T2681XD do
%ICDCode{full_code: "T2681XD",
category_code: "T26",
short_code: "81XD",
full_name: "Corrosions of other specified parts of right eye and adnexa, subsequent encounter",
short_name: "Corrosions of other specified parts of right eye and adnexa, subsequent encounter",
category_name: "Corrosions of other specified parts of right eye and adnexa, subsequent encounter"
}
end
def _T2681XS do
%ICDCode{full_code: "T2681XS",
category_code: "T26",
short_code: "81XS",
full_name: "Corrosions of other specified parts of right eye and adnexa, sequela",
short_name: "Corrosions of other specified parts of right eye and adnexa, sequela",
category_name: "Corrosions of other specified parts of right eye and adnexa, sequela"
}
end
def _T2682XA do
%ICDCode{full_code: "T2682XA",
category_code: "T26",
short_code: "82XA",
full_name: "Corrosions of other specified parts of left eye and adnexa, initial encounter",
short_name: "Corrosions of other specified parts of left eye and adnexa, initial encounter",
category_name: "Corrosions of other specified parts of left eye and adnexa, initial encounter"
}
end
def _T2682XD do
%ICDCode{full_code: "T2682XD",
category_code: "T26",
short_code: "82XD",
full_name: "Corrosions of other specified parts of left eye and adnexa, subsequent encounter",
short_name: "Corrosions of other specified parts of left eye and adnexa, subsequent encounter",
category_name: "Corrosions of other specified parts of left eye and adnexa, subsequent encounter"
}
end
def _T2682XS do
%ICDCode{full_code: "T2682XS",
category_code: "T26",
short_code: "82XS",
full_name: "Corrosions of other specified parts of left eye and adnexa, sequela",
short_name: "Corrosions of other specified parts of left eye and adnexa, sequela",
category_name: "Corrosions of other specified parts of left eye and adnexa, sequela"
}
end
def _T2690XA do
%ICDCode{full_code: "T2690XA",
category_code: "T26",
short_code: "90XA",
full_name: "Corrosion of unspecified eye and adnexa, part unspecified, initial encounter",
short_name: "Corrosion of unspecified eye and adnexa, part unspecified, initial encounter",
category_name: "Corrosion of unspecified eye and adnexa, part unspecified, initial encounter"
}
end
def _T2690XD do
%ICDCode{full_code: "T2690XD",
category_code: "T26",
short_code: "90XD",
full_name: "Corrosion of unspecified eye and adnexa, part unspecified, subsequent encounter",
short_name: "Corrosion of unspecified eye and adnexa, part unspecified, subsequent encounter",
category_name: "Corrosion of unspecified eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2690XS do
%ICDCode{full_code: "T2690XS",
category_code: "T26",
short_code: "90XS",
full_name: "Corrosion of unspecified eye and adnexa, part unspecified, sequela",
short_name: "Corrosion of unspecified eye and adnexa, part unspecified, sequela",
category_name: "Corrosion of unspecified eye and adnexa, part unspecified, sequela"
}
end
def _T2691XA do
%ICDCode{full_code: "T2691XA",
category_code: "T26",
short_code: "91XA",
full_name: "Corrosion of right eye and adnexa, part unspecified, initial encounter",
short_name: "Corrosion of right eye and adnexa, part unspecified, initial encounter",
category_name: "Corrosion of right eye and adnexa, part unspecified, initial encounter"
}
end
def _T2691XD do
%ICDCode{full_code: "T2691XD",
category_code: "T26",
short_code: "91XD",
full_name: "Corrosion of right eye and adnexa, part unspecified, subsequent encounter",
short_name: "Corrosion of right eye and adnexa, part unspecified, subsequent encounter",
category_name: "Corrosion of right eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2691XS do
%ICDCode{full_code: "T2691XS",
category_code: "T26",
short_code: "91XS",
full_name: "Corrosion of right eye and adnexa, part unspecified, sequela",
short_name: "Corrosion of right eye and adnexa, part unspecified, sequela",
category_name: "Corrosion of right eye and adnexa, part unspecified, sequela"
}
end
def _T2692XA do
%ICDCode{full_code: "T2692XA",
category_code: "T26",
short_code: "92XA",
full_name: "Corrosion of left eye and adnexa, part unspecified, initial encounter",
short_name: "Corrosion of left eye and adnexa, part unspecified, initial encounter",
category_name: "Corrosion of left eye and adnexa, part unspecified, initial encounter"
}
end
def _T2692XD do
%ICDCode{full_code: "T2692XD",
category_code: "T26",
short_code: "92XD",
full_name: "Corrosion of left eye and adnexa, part unspecified, subsequent encounter",
short_name: "Corrosion of left eye and adnexa, part unspecified, subsequent encounter",
category_name: "Corrosion of left eye and adnexa, part unspecified, subsequent encounter"
}
end
def _T2692XS do
%ICDCode{full_code: "T2692XS",
category_code: "T26",
short_code: "92XS",
full_name: "Corrosion of left eye and adnexa, part unspecified, sequela",
short_name: "Corrosion of left eye and adnexa, part unspecified, sequela",
category_name: "Corrosion of left eye and adnexa, part unspecified, sequela"
}
end
end
| 46.252142 | 120 | 0.667646 |
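The zero-arity accessors above follow a strict `_<full_code>/0` naming scheme, which makes runtime dispatch from a code string straightforward. A minimal standalone sketch of that lookup idea — the `ICDCode` stand-in struct, `DemoCodes` module, and `fetch/1` helper are illustrative, not part of the original file:

```elixir
defmodule ICDCode do
  # Minimal stand-in for the struct the generated accessors return.
  defstruct [:full_code, :category_code, :short_code,
             :full_name, :short_name, :category_name]
end

defmodule DemoCodes do
  # One generated-style accessor, mirroring the module above.
  def _T2690XS do
    %ICDCode{full_code: "T2690XS", category_code: "T26", short_code: "90XS",
             full_name: "Corrosion of unspecified eye and adnexa, part unspecified, sequela"}
  end

  # Hypothetical helper: resolve "T2690XS" to _T2690XS/0 at runtime.
  def fetch(code) when is_binary(code) do
    fun = String.to_existing_atom("_" <> code)

    if function_exported?(__MODULE__, fun, 0) do
      {:ok, apply(__MODULE__, fun, [])}
    else
      :error
    end
  rescue
    # String.to_existing_atom/1 raises when no such accessor was compiled
    ArgumentError -> :error
  end
end
```

Using `String.to_existing_atom/1` rather than `String.to_atom/1` keeps untrusted code strings from growing the atom table.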
1cb75919574df513c1a59e77c28f52b33ba4f73b-next-row 1cbb9bce09c4a1b1227449d7b380547e20e7b0e1 | 165 | exs | Elixir | priv/repo/migrations/20210811165241_drop_old_tables.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | ["Apache-2.0"] | 7 | 2021-07-14T15:45:55.000Z | 2022-01-25T11:13:01.000Z | priv/repo/migrations/20210811165241_drop_old_tables.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | ["Apache-2.0"] | 10 | 2021-08-09T15:54:05.000Z | 2022-02-17T04:18:38.000Z | priv/repo/migrations/20210811165241_drop_old_tables.exs | ysaito8015/communitex | d469447a62029d59883d95df4df3c9b09e0022e2 | ["Apache-2.0"] | 5 | 2021-07-23T05:54:35.000Z | 2022-01-28T04:14:51.000Z |
defmodule Basic.Repo.Migrations.DropOldTables do
use Ecto.Migration
def change do
drop_if_exists table(:members)
drop_if_exists table(:blogs)
end
end
| 18.333333 | 48 | 0.763636 |
1cbbb431bf10796a28cdd01b27c0be2ad6e27889 | 1,076 | ex | Elixir | lib/geocoder/cache.ex | Chervychnyk/geocoder | 8b566b42dda4d75228e2a52a62fd1b3f2208f135 | ["MIT"] | null | null | null | lib/geocoder/cache.ex | Chervychnyk/geocoder | 8b566b42dda4d75228e2a52a62fd1b3f2208f135 | ["MIT"] | null | null | null | lib/geocoder/cache.ex | Chervychnyk/geocoder | 8b566b42dda4d75228e2a52a62fd1b3f2208f135 | ["MIT"] | null | null | null |
defmodule Geocoder.Cache do
use Nebulex.Cache,
otp_app: :geocoder,
adapter: Nebulex.Adapters.Local
@precision Application.get_env(:geocoder, :latlng_precision, 11)
def geocode(opts) when is_list(opts) do
opts |> Keyword.get(:address) |> geocode()
end
def geocode(location) when is_binary(location) do
link = encode(location)
with key when is_binary(key) <- get(link),
%Geocoder.Coords{} = coords <- get(key) do
{:just, coords}
else
_ -> :nothing
end
end
def link(from, %Geocoder.Coords{lat: lat, lon: lon} = coords) do
key = encode({lat, lon}, @precision)
link = encode(from[:address] || from[:latlng], @precision)
link |> set(key) |> set(coords)
end
defp encode(location, option \\ nil)
defp encode({lat, lon}, precision) do
Geohash.encode(:erlang.float(lat), :erlang.float(lon), precision)
end
defp encode(location, _) when is_binary(location) do
location
|> String.downcase()
|> String.replace(~r/[^\w]/, "")
|> String.trim()
|> Base.encode64()
end
end
| 24.454545 | 69 | 0.637546 |
1cbc0293d22020a47aef7ceefcbcde2d08d58bb7 | 6,399 | ex | Elixir | test/fixture_files/language.ex | bitchef/exocco | b0e93b01794d71c0458f52907b2f66b03e3f207e | ["MIT"] | null | null | null | test/fixture_files/language.ex | bitchef/exocco | b0e93b01794d71c0458f52907b2f66b03e3f207e | ["MIT"] | 1 | 2016-02-09T12:27:42.000Z | 2016-02-09T12:27:42.000Z | test/fixture_files/language.ex | bitchef/exocco | b0e93b01794d71c0458f52907b2f66b03e3f207e | ["MIT"] | null | null | null |
defmodule Language do
defstruct name: "", symbol: "", literate: false, multistart: nil, multiend: nil,
commentMatcher: nil, dividerText: nil, dividerHtml: nil
@type t :: %Language{ name: String.t, symbol: String.t, literate: boolean,
multistart: String.t, multiend: String.t, commentMatcher: Regex.t,
dividerText: String.t, dividerHtml: Regex.t}
# List of languages supported by Exocco.
#
# To support a new language,
# map a file extension to the name of its Pygments lexer following the same syntax.
# Then rebuild the project.
defmacrop mLanguages do
quote do
languages = [
".applescript": %Language{name: "applescript", symbol: "--"},
".as": %Language{name: "actionscript", symbol: "//"},
".bat": %Language{name: "dos", symbol: "@?rem"},
".btm": %Language{name: "dos", symbol: "@?rem"},
".c": %Language{name: "c", symbol: "//", multistart: "/*", multiend: "*/"},
".clj": %Language{name: "clojure", symbol: ";"},
".cls": %Language{name: "tex", symbol: "%"},
".cmake": %Language{name: "cmake", symbol: "#"},
".cmd": %Language{name: "dos", symbol: "@?rem"},
".coffee": %Language{name: "coffeescript", symbol: "#"},
".cpp": %Language{name: "cpp", symbol: "//", multistart: "/*", multiend: "*/"},
".cs": %Language{name: "cs", symbol: "//"},
".cson": %Language{name: "coffeescript", symbol: "#"},
".d": %Language{name: "d", symbol: "//"},
".dtx": %Language{name: "tex", symbol: "%"},
".erl": %Language{name: "erlang", symbol: "%"},
".eex": %Language{name: "elixir", symbol: "#"},
".ex": %Language{name: "elixir", symbol: "#"},
".exs": %Language{name: "elixir", symbol: "#"},
".frag": %Language{name: "glsl", symbol: "//"},
".glsl": %Language{name: "glsl", symbol: "//"},
".go": %Language{name: "go", symbol: "//"},
".groovy": %Language{name: "groovy", symbol: "//"},
".h": %Language{name: "c", symbol: "//", multistart: "/*", multiend: "*/"},
".hrl": %Language{name: "erlang", symbol: "%"},
".hs": %Language{name: "haskell", symbol: "--", multistart: "{-", multiend: "-}"},
".ini": %Language{name: "ini", symbol: ";"},
".js": %Language{name: "javascript", symbol: "//", multistart: "/*", multiend: "*/"},
".java": %Language{name: "java", symbol: "//", multistart: "/*", multiend: "*/"},
".latex": %Language{name: "tex", symbol: "%"},
".less": %Language{name: "less", symbol: "//"},
".lisp": %Language{name: "lisp", symbol: ";"},
".litcoffee": %Language{name: "coffeescript", symbol: "#", literate: true},
".ls": %Language{name: "coffeescript", symbol: "#"},
".lua": %Language{name: "lua", symbol: "--", multistart: "--[[", multiend: "]]--"},
".n": %Language{name: "nemerle", symbol: "//"},
".m": %Language{name: "objectivec", symbol: "//"},
".mel": %Language{name: "mel", symbol: "//"},
".markdown": %Language{name: "markdown"},
".md": %Language{name: "markdown"},
".mm": %Language{name: "objectivec", symbol: "//"},
".p": %Language{name: "delphi", symbol: "//"},
".pas": %Language{name: "delphi", symbol: "//"},
".php": %Language{name: "php", symbol: "//"},
".pl": %Language{name: "perl", symbol: "#"},
".pm": %Language{name: "perl", symbol: "#"},
".pod": %Language{name: "perl", symbol: "#"},
".pp": %Language{name: "delphi", symbol: "//"},
".py": %Language{name: "python", symbol: "#"},
".rb": %Language{name: "ruby", symbol: "#", multistart: "=begin", multiend: "=end"},
".tex": %Language{name: "tex", symbol: "%"},
".scala": %Language{name: "scala", symbol: "//"},
".scm": %Language{name: "scheme", symbol: ";;", multistart: "#|", multiend: "|#"},
".scpt": %Language{name: "applescript", symbol: "--"},
".scss": %Language{name: "scss", symbol: "//"},
".sh": %Language{name: "bash", symbol: "#"},
".sql": %Language{name: "sql", symbol: "--"},
".sty": %Language{name: "tex", symbol: "%"},
".t": %Language{name: "perl", symbol: "#"},
".vala": %Language{name: "vala", symbol: "//"},
".vapi": %Language{name: "vala", symbol: "//"},
".vbe": %Language{name: "vbscript", symbol: "'"},
".vbs": %Language{name: "vbscript", symbol: "'"},
".vert": %Language{name: "glsl", symbol: "//"},
".vhdl": %Language{name: "vhdl", symbol: "--"},
".r": %Language{name: "r", symbol: "#"},
".rc": %Language{name: "rust", symbol: "//"},
".rs": %Language{name: "rust", symbol: "//"},
".wsc": %Language{name: "vbscript", symbol: "'"},
".wsf": %Language{name: "vbscript", symbol: "'"}
]
# Build out the appropriate matchers and delimiters for each language.
for { ext,lang } <- languages, do: { ext, %Language{ lang|
# Regex to test if a line starts with a comment symbol.
commentMatcher: Regex.compile!("^\s*#{lang.symbol}+\s?" ),
# The dividing token we feed into Pygments, to delimit the boundaries between
# sections.
dividerText: "\n#{lang.symbol}DIVIDER\n",
# The mirror of `dividerText` that we expect Pygments to return. We can split
# on this to recover the original sections.
dividerHtml: Regex.compile!("\n*<span class=\"c1?\">#{lang.symbol}DIVIDER</span>\n*"),
} }
end
end #defmacrop
# Get language specifications from a file extension.
@spec from_file_ext(fileExtension :: String.t) :: %Language{} | nil
def from_file_ext(fileExtension) do
extension = String.downcase(fileExtension) |> String.to_atom
mLanguages[extension]
end #from_file_ext
end #defmodule Language
| 57.133929 | 102 | 0.490858 |
1cbc2bf4644621f4012adf031d346926f36f9aa5 | 1,084 | exs | Elixir | apps/commuter_rail_boarding/test/uploader/s3_test.exs | paulswartz/commuter_rail_boarding | 6be34c192d6a1ee980307d9f3d027bf4cdafa53f | ["MIT"] | 1 | 2022-01-30T20:53:07.000Z | 2022-01-30T20:53:07.000Z | apps/commuter_rail_boarding/test/uploader/s3_test.exs | paulswartz/commuter_rail_boarding | 6be34c192d6a1ee980307d9f3d027bf4cdafa53f | ["MIT"] | 47 | 2021-05-05T10:31:05.000Z | 2022-03-30T22:18:14.000Z | apps/commuter_rail_boarding/test/uploader/s3_test.exs | paulswartz/commuter_rail_boarding | 6be34c192d6a1ee980307d9f3d027bf4cdafa53f | ["MIT"] | 1 | 2021-05-14T00:35:08.000Z | 2021-05-14T00:35:08.000Z |
defmodule Uploader.S3Test do
@moduledoc false
use ExUnit.Case
@moduletag :capture_log
import Uploader.S3
setup do
old_config = Application.get_env(:commuter_rail_boarding, Uploader.S3)
on_exit(fn ->
Application.put_env(:commuter_rail_boarding, Uploader.S3, old_config)
end)
config =
Keyword.merge(
old_config || [],
requestor: __MODULE__.MockAws,
bucket: "test_bucket"
)
Application.put_env(:commuter_rail_boarding, Uploader.S3, config)
:ok
end
describe "upload/1" do
test "uploads to a configured S3 bucket" do
assert :ok = upload("filename", "binary")
assert_received {:aws_request, request}
assert request.path == "filename"
assert request.bucket == "test_bucket"
assert request.body == "binary"
assert request.headers["content-type"] == "application/json"
assert request.headers["x-amz-acl"] == "public-read"
end
end
defmodule MockAws do
def request!(request) do
send(self(), {:aws_request, request})
:ok
end
end
end
| 24.088889 | 75 | 0.656827 |
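The test above swaps the real AWS client for `MockAws` through application config, so `upload/2` can be exercised without network access. A standalone sketch of the same dependency-injection idea with the collaborator passed as an argument instead of read from config (module names here are illustrative):

```elixir
defmodule UploaderSketch do
  # The requestor module is injected, so tests can substitute a fake.
  def upload(filename, body, requestor) do
    requestor.request!(%{path: filename, body: body, bucket: "test_bucket"})
  end
end

defmodule FakeAws do
  # Test double: records nothing, just echoes the path back.
  def request!(request), do: {:ok, request.path}
end

{:ok, "file.json"} = UploaderSketch.upload("file.json", "{}", FakeAws)
```

Because any module exporting `request!/1` satisfies the contract, the production client and the fake are interchangeable.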
1cbc340ef0d7e7207663b255e3244bb90c54bfcf | 3,948 | exs | Elixir | test/sap/combinators/server_error_test.exs | slogsdon/sap | 766f06cfac8a04772affd977a88d61210064e598 | ["MIT"] | 7 | 2015-10-25T16:38:45.000Z | 2020-01-12T19:06:57.000Z | test/sap/combinators/server_error_test.exs | slogsdon/sap | 766f06cfac8a04772affd977a88d61210064e598 | ["MIT"] | null | null | null | test/sap/combinators/server_error_test.exs | slogsdon/sap | 766f06cfac8a04772affd977a88d61210064e598 | ["MIT"] | null | null | null |
defmodule Sap.Combinators.ServerErrorTest do
use ExUnit.Case, async: true
use Plug.Test
import Sap.Combinators.ServerError
test "internal_server_error without body" do
conn = conn(:get, "/")
resp = internal_server_error().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 500
assert resp.conn.resp_body == ""
end
test "internal_server_error with body" do
conn = conn(:get, "/")
resp = internal_server_error("internal_server_error").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 500
assert resp.conn.resp_body == "internal_server_error"
end
test "not_implemented without body" do
conn = conn(:get, "/")
resp = not_implemented().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 501
assert resp.conn.resp_body == ""
end
test "not_implemented with body" do
conn = conn(:get, "/")
resp = not_implemented("not_implemented").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 501
assert resp.conn.resp_body == "not_implemented"
end
test "bad_gateway without body" do
conn = conn(:get, "/")
resp = bad_gateway().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 502
assert resp.conn.resp_body == ""
end
test "bad_gateway with body" do
conn = conn(:get, "/")
resp = bad_gateway("bad_gateway").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 502
assert resp.conn.resp_body == "bad_gateway"
end
test "service_unavailable without body" do
conn = conn(:get, "/")
resp = service_unavailable().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 503
assert resp.conn.resp_body == ""
end
test "service_unavailable with body" do
conn = conn(:get, "/")
resp = service_unavailable("service_unavailable").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 503
assert resp.conn.resp_body == "service_unavailable"
end
test "gateway_timeout without body" do
conn = conn(:get, "/")
resp = gateway_timeout().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 504
assert resp.conn.resp_body == ""
end
test "gateway_timeout with body" do
conn = conn(:get, "/")
resp = gateway_timeout("gateway_timeout").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 504
assert resp.conn.resp_body == "gateway_timeout"
end
test "http_version_not_supported without body" do
conn = conn(:get, "/")
resp = http_version_not_supported().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 505
assert resp.conn.resp_body == ""
end
test "http_version_not_supported with body" do
conn = conn(:get, "/")
resp = http_version_not_supported("http_version_not_supported").(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 505
assert resp.conn.resp_body == "http_version_not_supported"
end
test "network_authentication_required without body" do
conn = conn(:get, "/")
resp = network_authentication_required().(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 511
assert resp.conn.resp_body == ""
end
test "network_authentication_required with body" do
conn = conn(:get, "/")
resp = network_authentication_required(
"network_authentication_required"
).(conn)
assert resp.status == :ok
refute resp.conn == conn
assert resp.conn.status == 511
assert resp.conn.resp_body == "network_authentication_required"
end
end
| 26.675676 | 74 | 0.660588 |
1cbc45980cc90a1d0ad79005b4b18cf66473bef4 | 1,225 | ex | Elixir | lib/bitpal/exchange_rate/kraken.ex | bitpal/bitpal | 0e10eeaacf7a65b23945cfb95e4dbda8bffd4590 | ["BSD-3-Clause-Clear"] | 5 | 2021-05-04T21:28:00.000Z | 2021-12-01T11:19:48.000Z | lib/bitpal/exchange_rate/kraken.ex | bitpal/bitpal | 0e10eeaacf7a65b23945cfb95e4dbda8bffd4590 | ["BSD-3-Clause-Clear"] | 71 | 2021-04-21T05:48:49.000Z | 2022-03-23T06:30:37.000Z | lib/bitpal/exchange_rate/kraken.ex | bitpal/bitpal | 0e10eeaacf7a65b23945cfb95e4dbda8bffd4590 | ["BSD-3-Clause-Clear"] | 1 | 2021-04-25T10:35:41.000Z | 2021-04-25T10:35:41.000Z |
defmodule BitPal.ExchangeRate.Kraken do
@behaviour BitPal.ExchangeRate.Backend
alias BitPal.ExchangeRate
alias BitPal.ExchangeRateSupervisor.Result
require Logger
@base "https://api.kraken.com/0/public/Ticker?pair="
@impl true
def name, do: "kraken"
@impl true
def supported, do: %{BCH: [:USD, :EUR]}
@spec compute(ExchangeRate.pair(), keyword) :: {:ok, Result.t()}
@impl true
def compute(pair, _opts) do
Logger.debug("Computing Kraken exchange rate #{inspect(pair)}")
{:ok,
%Result{
score: 100,
backend: __MODULE__,
rate: ExchangeRate.new!(get_rate(pair), pair)
}}
end
defp get_rate(pair) do
pair = pair2str(pair)
http = BitPalSettings.http_client()
{:ok, body} = http.request_body(url(pair))
body
|> decode
|> transform_rate(pair)
end
defp pair2str({from, to}) do
Atom.to_string(from) <> Atom.to_string(to)
end
defp url(pair) do
@base <> String.downcase(pair)
end
defp decode(body) do
Poison.decode!(body)
end
defp transform_rate(%{"result" => res}, pair) do
{f, ""} =
res[String.upcase(pair)]["c"]
|> List.first()
|> Float.parse()
Decimal.from_float(f)
end
end
| 20.081967 | 67 | 0.633469 |
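`transform_rate/2` digs the last-trade price (the `"c"` field) out of Kraken's decoded ticker payload and parses it. A standalone sketch against a hand-written sample of that payload shape — the sample values are made up, and the final `Decimal.from_float/1` step is omitted so the snippet runs without the dependency:

```elixir
# Abbreviated shape of the decoded ticker response for pair "BCHUSD";
# "c" holds [last trade price, lot volume] as strings
res = %{"result" => %{"BCHUSD" => %{"c" => ["245.70000", "0.20000000"]}}}

# Same extraction as transform_rate/2: take the price string and parse it,
# asserting via the {_, ""} match that the whole string was consumed
{rate, ""} =
  res["result"]["BCHUSD"]["c"]
  |> List.first()
  |> Float.parse()
```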
1cbc483a4a949eab711b97ecff86fe30af971ae0 | 247 | ex | Elixir | lib/mqttsn/state_sup.ex | knyl/mqtt-sn | fd8a44918180830b8e6f00246503336945368fb6 | ["MIT"] | 1 | 2018-07-26T18:55:49.000Z | 2018-07-26T18:55:49.000Z | lib/mqttsn/state_sup.ex | knyl/mqtt-sn | fd8a44918180830b8e6f00246503336945368fb6 | ["MIT"] | null | null | null | lib/mqttsn/state_sup.ex | knyl/mqtt-sn | fd8a44918180830b8e6f00246503336945368fb6 | ["MIT"] | null | null | null |
defmodule Mqttsn.StateSup do
use Supervisor
def start_link() do
Supervisor.start_link(__MODULE__, [])
end
def init(_args) do
children = [
worker(Mqttsn.State, [])]
supervise(children, strategy: :one_for_one)
end
end
| 16.466667 | 47 | 0.676113 |
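`worker/2` and `supervise/2` above come from `Supervisor.Spec`, which has been deprecated since Elixir 1.5. A standalone sketch of the modern child-spec equivalent, with a dummy `Agent` standing in for `Mqttsn.State` so the snippet runs on its own:

```elixir
defmodule Demo.State do
  use Agent

  # Stand-in for Mqttsn.State; `use Agent` provides child_spec/1
  def start_link(_opts), do: Agent.start_link(fn -> %{} end, name: __MODULE__)
end

defmodule Demo.StateSup do
  use Supervisor

  def start_link(), do: Supervisor.start_link(__MODULE__, [])

  @impl true
  def init(_args) do
    # Child specs replace worker/2; Supervisor.init/2 replaces supervise/2
    Supervisor.init([Demo.State], strategy: :one_for_one)
  end
end

{:ok, sup} = Demo.StateSup.start_link()
```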
1cbc52e41af8618235e880e42cdb950f5ae097aa | 505 | ex | Elixir | lib/geonames/endpoints/country_subdivision.ex | vheathen/geonames-elixir | b47ef0e38462c3bdbfb1a5b710f53993d9d3056d | ["MIT"] | null | null | null | lib/geonames/endpoints/country_subdivision.ex | vheathen/geonames-elixir | b47ef0e38462c3bdbfb1a5b710f53993d9d3056d | ["MIT"] | null | null | null | lib/geonames/endpoints/country_subdivision.ex | vheathen/geonames-elixir | b47ef0e38462c3bdbfb1a5b710f53993d9d3056d | ["MIT"] | null | null | null |
defmodule Geonames.Endpoints.CountrySubdivision do
@moduledoc false
@behaviour Geonames.Endpoint
@default_arguments %{
lat: nil,
lng: nil,
radius: nil,
level: nil
}
def endpoint, do: "countrySubdivisionJSON"
def available_url_parameters, do: [:lat, :lng, :radius, :level]
def required_url_parameters, do: [:lat, :lng]
def function_name, do: :country_subdivision
def url_arguments(provided_arguments) do
Map.merge(@default_arguments, provided_arguments)
end
end
| 22.954545 | 65 | 0.728713 |
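`url_arguments/1` merges the caller's parameters over nil defaults, so every available key is present and unused ones stay nil. A standalone sketch of that merge plus one way the result could be reduced to a query string — the query-building step is illustrative, not taken from the library:

```elixir
defaults = %{lat: nil, lng: nil, radius: nil, level: nil}

# Caller supplies only the required parameters; the rest stay nil
args = Map.merge(defaults, %{lat: 47.03, lng: 10.2})

# Drop the unused nil parameters before encoding
query =
  args
  |> Enum.reject(fn {_key, value} -> is_nil(value) end)
  |> URI.encode_query()
```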
1cbc89b2d9069dd1574818f15b10bfed2c56ac0f | 478 | ex | Elixir | lib/live_doom_fire_web/router.ex | allmonty/elixir-live-doom-fire | 4a3525eb1f66983e501d6a01dc0af10f5c02d896 | ["MIT"] | 12 | 2019-03-26T18:34:01.000Z | 2021-04-23T05:17:39.000Z | lib/live_doom_fire_web/router.ex | allmonty/elixir-live-doom-fire | 4a3525eb1f66983e501d6a01dc0af10f5c02d896 | ["MIT"] | 5 | 2020-07-16T21:02:50.000Z | 2020-09-19T19:40:24.000Z | lib/live_doom_fire_web/router.ex | allmonty/elixir-live-doom-fire | 4a3525eb1f66983e501d6a01dc0af10f5c02d896 | ["MIT"] | 1 | 2019-07-23T12:59:02.000Z | 2019-07-23T12:59:02.000Z |
defmodule LiveDoomFireWeb.Router do
use LiveDoomFireWeb, :router
pipeline :browser do
plug(:accepts, ["html"])
plug(:fetch_session)
plug(:fetch_flash)
plug(:fetch_live_flash)
plug(:protect_from_forgery)
plug(:put_secure_browser_headers)
end
pipeline :api do
plug(:accepts, ["json"])
end
scope "/", LiveDoomFireWeb do
# Use the default browser stack
pipe_through(:browser)
get("/", LiveDoomFireController, :live)
end
end
| 19.916667 | 43 | 0.688285 |
1cbcc0386ebd136284e19507f668b628f41247fc | 958 | ex | Elixir | lib/soccer_table/application.ex | thomasbrus/phoenix-live-view-soccer-table | 60bff384a71fb9c08eb22600409e71fe25e54712 | [
"MIT"
] | 1 | 2020-05-09T02:09:21.000Z | 2020-05-09T02:09:21.000Z | lib/soccer_table/application.ex | thomasbrus/phoenix-live-view-soccer-table | 60bff384a71fb9c08eb22600409e71fe25e54712 | [
"MIT"
] | null | null | null | lib/soccer_table/application.ex | thomasbrus/phoenix-live-view-soccer-table | 60bff384a71fb9c08eb22600409e71fe25e54712 | [
"MIT"
] | 1 | 2020-07-07T21:38:33.000Z | 2020-07-07T21:38:33.000Z | defmodule SoccerTable.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
# List all child processes to be supervised
children = [
# Start the Ecto repository
SoccerTable.Repo,
# Start the endpoint when the application starts
SoccerTableWeb.Endpoint
# Starts a worker by calling: SoccerTable.Worker.start_link(arg)
# {SoccerTable.Worker, arg},
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: SoccerTable.Supervisor]
Supervisor.start_link(children, opts)
end
# Tell Phoenix to update the endpoint configuration
# whenever the application is updated.
def config_change(changed, _new, removed) do
SoccerTableWeb.Endpoint.config_change(changed, removed)
:ok
end
end
| 29.9375 | 70 | 0.722338 |
1cbcf2087ffb2013b2abae1e351a78dccdca67c2 | 3,870 | ex | Elixir | lib/console_web/channels/graphql_channel.ex | helium/console | c6912b521926455ee0d4172dbd8e8183bdbae186 | ["Apache-2.0"] | 83 | 2018-05-31T14:49:10.000Z | 2022-03-27T16:49:49.000Z | lib/console_web/channels/graphql_channel.ex | helium/console | c6912b521926455ee0d4172dbd8e8183bdbae186 | ["Apache-2.0"] | 267 | 2018-05-22T23:19:02.000Z | 2022-03-31T04:31:06.000Z | lib/console_web/channels/graphql_channel.ex | helium/console | c6912b521926455ee0d4172dbd8e8183bdbae186 | ["Apache-2.0"] | 18 | 2018-11-20T05:15:54.000Z | 2022-03-28T08:20:13.000Z |
defmodule ConsoleWeb.GraphqlChannel do
use Phoenix.Channel
def join("graphql:topbar_orgs", _message, socket) do
{:ok, socket}
end
def join("graphql:orgs_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:invitations_table", _message, socket) do
{:ok, socket}
end
def join("graphql:members_table", _message, socket) do
{:ok, socket}
end
def join("graphql:api_keys", _message, socket) do
{:ok, socket}
end
def join("graphql:dc_index", _message, socket) do
{:ok, socket}
end
def join("graphql:dc_purchases_table", _message, socket) do
{:ok, socket}
end
def join("graphql:function_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:function_index_bar", _message, socket) do
{:ok, socket}
end
def join("graphql:function_show", _message, socket) do
{:ok, socket}
end
def join("graphql:channels_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:channel_index_bar", _message, socket) do
{:ok, socket}
end
def join("graphql:channel_show", _message, socket) do
{:ok, socket}
end
def join("graphql:device_index_labels_bar", _message, socket) do
{:ok, socket}
end
def join("graphql:label_show", _message, socket) do
{:ok, socket}
end
def join("graphql:label_show_table", _message, socket) do
{:ok, socket}
end
def join("graphql:label_show_debug", _message, socket) do
{:ok, socket}
end
def join("graphql:devices_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:devices_header_count", _message, socket) do
{:ok, socket}
end
def join("graphql:device_show", _message, socket) do
{:ok, socket}
end
def join("graphql:device_show_labels_table", _message, socket) do
{:ok, socket}
end
def join("graphql:events_dashboard", _message, socket) do
{:ok, socket}
end
def join("graphql:device_show_debug", _message, socket) do
{:ok, socket}
end
def join("graphql:device_import_update", _message, socket) do
{:ok, socket}
end
def join("graphql:device_show_downlink", _message, socket) do
{:ok, socket}
end
def join("graphql:label_show_downlink", _message, socket) do
{:ok, socket}
end
def join("graphql:flows_update", _message, socket) do
{:ok, socket}
end
def join("graphql:devices_in_labels_update", _message, socket) do
{:ok, socket}
end
def join("graphql:flows_nodes_menu", _message, socket) do
{:ok, socket}
end
def join("graphql:resources_update", _message, socket) do
{:ok, socket}
end
def join("graphql:alerts_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:alert_show", _message, socket) do
{:ok, socket}
end
def join("graphql:alert_settings_table", _message, socket) do
{:ok, socket}
end
def join("graphql:multi_buys_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:multi_buy_show", _message, socket) do
{:ok, socket}
end
def join("graphql:xor_filter_update", _message, socket) do
{:ok, socket}
end
def join("graphql:coverage_index_org_hotspots", _message, socket) do
{:ok, socket}
end
def join("graphql:coverage_hotspot_show_debug", _message, socket) do
{:ok, socket}
end
def join("graphql:config_profiles_index_table", _message, socket) do
{:ok, socket}
end
def join("graphql:config_profile_show", _message, socket) do
{:ok, socket}
end
def join("graphql:followed_hotspot_table", _message, socket) do
{:ok, socket}
end
def join("graphql:groups_index", _message, socket) do
{:ok, socket}
end
def join("graphql:followed_hotspots_groups", _message, socket) do
{:ok, socket}
end
def join("graphql:mobile_device_labels", _message, socket) do
{:ok, socket}
end
end
| 21.5 | 70 | 0.67907 |
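The long run of identical `join/3` clauses above differs only in the topic string; a hypothetical condensed form (the module name and topic list are illustrative samples, not part of the original repo):

```elixir
defmodule Example.GraphqlJoins do
  # Illustrative only: the topic list is a sample of the channels above.
  # Each clause returns {:ok, socket} unchanged, so the clauses can be
  # generated at compile time with the "unquote fragments" pattern.
  @topics ~w(label_show label_show_table devices_index_table device_show
             events_dashboard flows_update alerts_index_table groups_index)

  for topic <- @topics do
    def join("graphql:" <> unquote(topic), _message, socket), do: {:ok, socket}
  end
end
```

If every `graphql:` topic should be accepted, a single catch-all clause such as `def join("graphql:" <> _topic, _message, socket), do: {:ok, socket}` removes the repetition with no metaprogramming at all.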
1cbcf560887a23ac1274e144b3a161e3e4ec6f90 | 1,349 | ex | Elixir | lib/Radpath/directory.ex | lowks/Radpath | 512d10a8f27c887b47f378b078e5abc1f3b9ea0e | [
"MIT"
] | 20 | 2015-01-17T14:52:54.000Z | 2022-03-01T19:12:37.000Z | lib/Radpath/directory.ex | lowks/Radpath | 512d10a8f27c887b47f378b078e5abc1f3b9ea0e | [
"MIT"
] | 25 | 2015-01-28T02:02:40.000Z | 2020-02-12T01:43:30.000Z | lib/Radpath/directory.ex | lowks/Radpath | 512d10a8f27c887b47f378b078e5abc1f3b9ea0e | [
"MIT"
] | 7 | 2015-03-05T04:12:39.000Z | 2022-03-01T19:12:43.000Z | defmodule Radpath.Dirs do
defmacro __using__([]) do
quote do
# defp dirs_list(path) when is_list(path) do
# []
# end
@moduledoc """
Directory functions of Radpath library
"""
@doc """
Returns all of the directories in the given path
## Arguments
- `path` a path (bitstring) or a list of paths whose directories are listed.
- `regex_dir` string regex used to filter which directories appear in the output.
## Usage
iex> Radpath.dirs("/home/lowks/src/elixir/radpath/lib", "lib")
iex> Radpath.dirs(["/home/lowks/src/elixir/radpath/lib", "/home/lowks/src/elixir/radpath/_build"], "build")
"""
@spec dirs(bitstring, bitstring) :: list
def dirs(path, regex_dir \\ ".+") when (is_bitstring(path) or is_list(path)) do
path
|> normalize_path
|> do_dirs([], regex_dir)
end
defp do_dirs([], result, _regex_dir) do
result
end
defp do_dirs(paths, result, regex_dir) do
[h | t] = paths
do_dirs(t, result ++ dirs_list(h, regex_dir), regex_dir)
end
defp dirs_list(path, regex_dir) when is_bitstring(path) do
Finder.new()
|> Finder.with_directory_regex(Regex.compile!(regex_dir))
|> Finder.only_directories()
|> Finder.find(Path.expand(path))
|> Enum.to_list
|> Enum.sort
end
end
end
end
| 25.45283 | 118 | 0.625649 |
1cbcfa6266b3196615a773fe3e7381d769e1e04d | 1,267 | exs | Elixir | test/ecto/query/builder/order_by_test.exs | yrashk/ecto | 1462d5ad4cbb7bf74c292ec405852bc196808daf | [
"Apache-2.0"
] | 1 | 2016-08-15T21:23:28.000Z | 2016-08-15T21:23:28.000Z | test/ecto/query/builder/order_by_test.exs | yrashk/ecto | 1462d5ad4cbb7bf74c292ec405852bc196808daf | [
"Apache-2.0"
] | null | null | null | test/ecto/query/builder/order_by_test.exs | yrashk/ecto | 1462d5ad4cbb7bf74c292ec405852bc196808daf | [
"Apache-2.0"
] | null | null | null | defmodule Ecto.Query.Builder.OrderByTest do
use ExUnit.Case, async: true
import Ecto.Query.Builder.OrderBy
doctest Ecto.Query.Builder.OrderBy
test "escape" do
assert {Macro.escape(quote do [asc: &0.y] end), %{}} ==
escape(quote do x.y end, [x: 0])
assert {Macro.escape(quote do [asc: &0.x, asc: &1.y] end), %{}} ==
escape(quote do [x.x, y.y] end, [x: 0, y: 1])
assert {Macro.escape(quote do [asc: &0.x, desc: &1.y] end), %{}} ==
escape(quote do [asc: x.x, desc: y.y] end, [x: 0, y: 1])
assert {Macro.escape(quote do [asc: 1 == 2] end), %{}} ==
escape(quote do 1 == 2 end, [])
end
test "escape raise" do
assert_raise Ecto.Query.CompileError, "unbound variable `x` in query", fn ->
escape(quote do x.y end, [])
end
message = "expected :asc, :desc or interpolated value in order by, got: `:test`"
assert_raise Ecto.Query.CompileError, message, fn ->
escape(quote do [test: x.y] end, [x: 0])
end
message = "expected :asc or :desc in order by, got: `:temp`"
assert_raise Ecto.Query.CompileError, message, fn ->
escape(quote do [{^var!(temp), x.y}] end, [x: 0])
|> elem(0)
|> Code.eval_quoted([temp: :temp], __ENV__)
end
end
end
| 32.487179 | 84 | 0.584846 |
1cbd0e3a278d91dd5fa1baaa90f8d8a72d43e047 | 2,647 | ex | Elixir | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_original_detect_intent_request.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_original_detect_intent_request.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | clients/dialogflow/lib/google_api/dialogflow/v2/model/google_cloud_dialogflow_v2beta1_original_detect_intent_request.ex | medikent/elixir-google-api | 98a83d4f7bfaeac15b67b04548711bb7e49f9490 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1OriginalDetectIntentRequest do
@moduledoc """
Represents the contents of the original request that was passed to
the `[Streaming]DetectIntent` call.
## Attributes
* `payload` (*type:* `map()`, *default:* `nil`) - Optional. This field is set to the value of the `QueryParameters.payload`
field passed in the request. Some integrations that query a Dialogflow
agent may provide additional information in the payload.
In particular for the Telephony Gateway this field has the form:
<pre>{
"telephony": {
"caller_id": "+18558363987"
}
}</pre>
Note: The caller ID field (`caller_id`) will be redacted for Standard
Edition agents and populated with the caller ID in [E.164
format](https://en.wikipedia.org/wiki/E.164) for Enterprise Edition agents.
* `source` (*type:* `String.t`, *default:* `nil`) - The source of this request, e.g., `google`, `facebook`, `slack`. It is set
by Dialogflow-owned servers.
* `version` (*type:* `String.t`, *default:* `nil`) - Optional. The version of the protocol used for this request.
This field is AoG-specific.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:payload => map(),
:source => String.t(),
:version => String.t()
}
field(:payload, type: :map)
field(:source)
field(:version)
end
defimpl Poison.Decoder,
for: GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1OriginalDetectIntentRequest do
def decode(value, options) do
GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1OriginalDetectIntentRequest.decode(
value,
options
)
end
end
defimpl Poison.Encoder,
for: GoogleApi.Dialogflow.V2.Model.GoogleCloudDialogflowV2beta1OriginalDetectIntentRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 36.260274 | 130 | 0.710994 |
1cbd0ea7a25c390a12876dd15f2fc21770fd9980 | 2,261 | ex | Elixir | lib/policr_mini_web/controllers/api/third_party_controller.ex | skyplaying/policr-mini | ac265daa251fd76b770d0ce08c67075a6a57f796 | [
"MIT"
] | null | null | null | lib/policr_mini_web/controllers/api/third_party_controller.ex | skyplaying/policr-mini | ac265daa251fd76b770d0ce08c67075a6a57f796 | [
"MIT"
] | 2 | 2022-02-25T06:15:30.000Z | 2022-02-25T06:15:33.000Z | lib/policr_mini_web/controllers/api/third_party_controller.ex | skyplaying/policr-mini | ac265daa251fd76b770d0ce08c67075a6a57f796 | [
"MIT"
] | null | null | null | defmodule PolicrMiniWeb.API.ThirdPartyController do
@moduledoc """
第三方实例的前台 API 控制器。
"""
use PolicrMiniWeb, :controller
alias PolicrMini.Instances
action_fallback PolicrMiniWeb.API.FallbackController
plug CORSPlug, origin: &__MODULE__.load_origins/0
@project_start_date ~D[2020-06-01]
def offical_bot do
%PolicrMini.Instances.ThirdParty{
name: "Policr Mini",
bot_username: "policr_mini_bot",
homepage: "https://mini.telestd.me",
description: "由 Telestd 项目组维护、POLICR 社区运营",
running_days: Date.diff(Date.utc_today(), @project_start_date),
is_forked: false
}
end
def index(conn, _params) do
third_parties = Instances.find_third_parties()
{current_index, official_index, third_parties} =
case get_req_header(conn, "referer") do
[referer] ->
r =
third_parties
|> Enum.with_index()
|> Enum.find(fn {third_party, _} ->
homepage = PolicrMiniWeb.handle_url(third_party.homepage, has_end_slash: true)
String.starts_with?(referer, homepage)
end)
if r do
current_index = elem(r, 1)
{current_bot, third_parties} = List.pop_at(third_parties, current_index)
third_parties =
third_parties
|> List.insert_at(0, current_bot)
|> List.insert_at(1, offical_bot())
{0, 1, third_parties}
else
# This instance was not found in the list.
if String.starts_with?(referer, PolicrMiniWeb.root_url(has_end_slash: true)) do
# This is the official instance.
{0, 0, [offical_bot()] ++ third_parties}
else
# An unofficial instance.
{-1, 0, [offical_bot()] ++ third_parties}
end
end
[] ->
# No referer header; treat the instance as unknown.
{-1, 0, [offical_bot()] ++ third_parties}
end
render(conn, "index.json", %{
third_parties: third_parties,
official_index: official_index,
current_index: current_index
})
end
# TODO: cache the `third_parties` data with a dedicated Plug to avoid duplicate queries in the API implementations.
def load_origins do
third_parties = Instances.find_third_parties()
Enum.map(third_parties, fn third_party -> third_party.homepage end)
end
end
| 27.240964 | 92 | 0.605042 |
1cbd3e756a17ddb1582d61bfdf03ee7b98745afd | 200 | exs | Elixir | gearbox_impl/priv/repo/migrations/20211130212220_create_client.exs | enelesmai/gearbox-example | f30e0f9828a87d96c14333cb39928aa1a4c56787 | [
"MIT"
] | null | null | null | gearbox_impl/priv/repo/migrations/20211130212220_create_client.exs | enelesmai/gearbox-example | f30e0f9828a87d96c14333cb39928aa1a4c56787 | [
"MIT"
] | null | null | null | gearbox_impl/priv/repo/migrations/20211130212220_create_client.exs | enelesmai/gearbox-example | f30e0f9828a87d96c14333cb39928aa1a4c56787 | [
"MIT"
] | null | null | null | defmodule Clients.Repo.Migrations.CreateClient do
use Ecto.Migration
def change do
create table(:client) do
add :name, :string, null: false
add :status, :string
end
end
end
| 18.181818 | 49 | 0.68 |
1cbd3ef46523ed0dd46a19219c35147b19d1a774 | 926 | exs | Elixir | config/releases.exs | isabella232/elixir_cluster | 1e81d191e50b8959200770d47fd7c0b609b658e4 | [
"MIT"
] | 18 | 2019-06-06T19:30:48.000Z | 2021-08-24T19:17:37.000Z | config/releases.exs | isabella232/elixir_cluster | 1e81d191e50b8959200770d47fd7c0b609b658e4 | [
"MIT"
] | 2 | 2020-03-29T21:28:56.000Z | 2020-11-19T17:13:56.000Z | config/releases.exs | isabella232/elixir_cluster | 1e81d191e50b8959200770d47fd7c0b609b658e4 | [
"MIT"
] | 7 | 2019-09-11T11:29:37.000Z | 2022-03-26T07:45:37.000Z | # In this file, we load production configuration and secrets
# from environment variables. You can also hardcode secrets,
# although such is generally not recommended and you have to
# remember to add this file to your .gitignore.
import Config
secret_key_base =
System.get_env("SECRET_KEY_BASE") ||
raise """
environment variable SECRET_KEY_BASE is missing.
You can generate one by calling: mix phx.gen.secret
"""
config :elixir_cluster_demo, ElixirClusterDemoWeb.Endpoint,
http: [:inet6, port: String.to_integer(System.get_env("PORT") || "4000")],
secret_key_base: secret_key_base
# ## Using releases (Elixir v1.9+)
#
# If you are doing OTP releases, you need to instruct Phoenix
# to start each relevant endpoint:
#
config :elixir_cluster_demo, ElixirClusterDemoWeb.Endpoint, server: true
#
# Then you can assemble a release by calling `mix release`.
# See `mix help release` for more information.
| 34.296296 | 76 | 0.75594 |
1cbd569875cefad9230ca98f2b113ef62823bcaf | 6,686 | ex | Elixir | lib/pexels.ex | factor18/pexels | c6655f7dd0e58a9a06b6b7a02e74b3f009cd7b01 | [
"MIT"
] | 5 | 2021-05-10T12:10:02.000Z | 2021-05-24T08:15:42.000Z | lib/pexels.ex | factor18/pexels | c6655f7dd0e58a9a06b6b7a02e74b3f009cd7b01 | [
"MIT"
] | null | null | null | lib/pexels.ex | factor18/pexels | c6655f7dd0e58a9a06b6b7a02e74b3f009cd7b01 | [
"MIT"
] | 1 | 2021-05-13T16:49:17.000Z | 2021-05-13T16:49:17.000Z | defmodule Pexels do
@moduledoc ~S"""
A thin client for the Pexels HTTP API (photos, videos, and collections).
## Examples
iex> client = Pexels.client()
iex> {:ok, %{liked: _liked, photo: photo}} = Pexels.photo(client, %{id: 156934})
iex> photo.url
"https://www.pexels.com/photo/white-and-black-cat-156934/"
"""
alias Pexels.Video
alias Pexels.Client.Video.{VideoSearchRequest, PopularVideosRequest, VideoRequest, VideosResponse}
alias Pexels.Client.Photo.{PhotoSearchRequest, CuratedPhotosRequest, PhotoRequest, PhotoResponse, PhotosResponse}
alias Pexels.Client.Collection.{CollectionsRequest, CollectionMediaRequest, CollectionMediaResponse, CollectionsResponse}
# Photos
@doc ~S"""
This endpoint enables you to search Pexels for any topic that you would like.
For example your query could be something broad like Nature, Tigers, People.
Or it could be something specific like Group of people working.
## Examples
iex> client = Pexels.client()
iex> {:ok, response = %Pexels.Client.Photo.PhotosResponse{}} = Pexels.search_photos(client, %{query: "cat", per_page: 10, page: 1})
iex> response.photos |> Enum.count
10
"""
def search_photos(client, request) do
with {:ok, request} <- PhotoSearchRequest.make(request),
{:ok, env} <- Tesla.get(client, "/v1/search", query: to_kwlist(request)) do
PhotosResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
@doc ~S"""
This endpoint enables you to receive real-time photos curated by the Pexels team.
## Examples
iex> client = Pexels.client()
iex> {:ok, response = %Pexels.Client.Photo.PhotosResponse{}} = Pexels.curated_photos(client, %{per_page: 2, page: 1})
iex> response.photos |> Enum.count
2
"""
def curated_photos(client, request) do
with {:ok, request} <- CuratedPhotosRequest.make(request),
{:ok, env} <- Tesla.get(client, "/v1/curated", query: to_kwlist(request)) do
PhotosResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
@doc ~S"""
Retrieve a specific Photo from its id.
## Examples
iex> client = Pexels.client()
iex> {:ok, %{liked: _liked, photo: photo} = %Pexels.Client.Photo.PhotoResponse{}} = Pexels.photo(client, %{id: 156934})
iex> photo.url
"https://www.pexels.com/photo/white-and-black-cat-156934/"
"""
def photo(client, request) do
with {:ok, request} <- PhotoRequest.make(request),
{:ok, env} <- Tesla.get(client, "/v1/photos/#{request.id}", query: to_kwlist(request)) do
PhotoResponse.make(%{photo: env.body, liked: env.body["liked"]})
else
{:error, error} -> {:error, error}
end
end
# Videos
@doc ~S"""
This endpoint enables you to search Pexels for any topic that you would like.
For example your query could be something broad like Nature, Tigers, People.
Or it could be something specific like Group of people working.
## Examples
iex> client = Pexels.client()
iex> {:ok, response = %Pexels.Client.Video.VideosResponse{}} = Pexels.search_videos(client, %{query: "cat", per_page: 10, page: 1})
iex> response.videos |> Enum.count
10
"""
def search_videos(client, request) do
with {:ok, request} <- VideoSearchRequest.make(request),
{:ok, env} <- Tesla.get(client, "/videos/search", query: to_kwlist(request)) do
VideosResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
@doc ~S"""
This endpoint enables you to receive the current popular Pexels videos.
## Examples
iex> client = Pexels.client()
iex> {:ok, response = %Pexels.Client.Video.VideosResponse{}} = Pexels.popular_videos(client, %{per_page: 2, page: 1})
iex> response.videos |> Enum.count
2
"""
def popular_videos(client, request) do
with {:ok, request} <- PopularVideosRequest.make(request),
{:ok, env} <- Tesla.get(client, "/videos/popular", query: to_kwlist(request)) do
VideosResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
@doc ~S"""
Retrieve a specific Video from its id.
## Examples
iex> client = Pexels.client()
iex> {:ok, video = %Pexels.Video{}} = Pexels.video(client, %{id: 6982949})
iex> video.url
"https://www.pexels.com/video/woman-dancing-in-the-spotlight-6982949/"
"""
def video(client, request) do
with {:ok, request} <- VideoRequest.make(request),
{:ok, env} <- Tesla.get(client, "/videos/videos/#{request.id}", query: to_kwlist(request)) do
Video.make(env.body)
else
{:error, error} -> {:error, error}
end
end
# Collections
@doc ~S"""
This endpoint returns all of your collections.
## Examples
iex> client = Pexels.client()
iex> {:ok, %{collections: collections} = %Pexels.Client.Collection.CollectionsResponse{}} = Pexels.collections(client, %{per_page: 1})
iex> collections |> Enum.count
1
"""
def collections(client, request) do
with {:ok, request} <- CollectionsRequest.make(request),
{:ok, env} <- Tesla.get(client, "/v1/collections", query: to_kwlist(request)) do
CollectionsResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
@doc ~S"""
This endpoint returns all of your collections.
## Examples
iex> client = Pexels.client()
iex> {:ok, %{media: media} = %Pexels.Client.Collection.CollectionMediaResponse{}} = Pexels.collection_media(client, %{id: "z40vgi2", per_page: 1})
iex> media |> Enum.count
1
"""
def collection_media(client, request) do
with {:ok, request} <- CollectionMediaRequest.make(request),
{:ok, env} <- Tesla.get(client, "/v1/collections/#{request.id}", query: to_kwlist(request)) do
CollectionMediaResponse.make(env.body)
else
{:error, error} -> {:error, error}
end
end
# Client
@doc ~S"""
Creates the `tesla` client for pexels
## Examples
iex> _client = Pexels.client()
"""
def client(token \\ Application.get_env(:pexels, :token),
base \\ Application.get_env(:pexels, :base),
timeout \\ Application.get_env(:pexels, :timeout)) do
middleware = [
Tesla.Middleware.JSON,
{Tesla.Middleware.BaseUrl, base},
{Tesla.Middleware.Timeout, timeout: timeout},
{Tesla.Middleware.Headers, [{"Authorization", token}]}
]
Tesla.client(middleware)
end
# Helpers
defp to_kwlist(map) do
map
|> Map.from_struct
|> Enum.map(fn({key, value}) -> {key, value} end)
end
end
| 31.389671 | 152 | 0.638498 |
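Beyond the doctests above, a minimal usage sketch of the Pexels client; this is only a sketch, since it assumes a configured `:pexels` API token and live network access:

```elixir
# Sketch only: assumes `config :pexels, token: "...", base: "https://api.pexels.com",
# timeout: 5_000` (or explicit arguments to Pexels.client/3) and network access.
client = Pexels.client()

case Pexels.search_photos(client, %{query: "cat", per_page: 5, page: 1}) do
  {:ok, %{photos: photos}} ->
    # Print the page URL of each matching photo.
    Enum.each(photos, &IO.puts(&1.url))

  {:error, reason} ->
    IO.inspect(reason, label: "pexels error")
end
```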
1cbd6de621ec73b8c049583e77f8ec113c42d32f | 717 | ex | Elixir | lib/fika.ex | v0idpwn/fika | 4453d1c9222f7c2ab09f03d436f2222c7e325344 | [
"Apache-2.0"
] | null | null | null | lib/fika.ex | v0idpwn/fika | 4453d1c9222f7c2ab09f03d436f2222c7e325344 | [
"Apache-2.0"
] | null | null | null | lib/fika.ex | v0idpwn/fika | 4453d1c9222f7c2ab09f03d436f2222c7e325344 | [
"Apache-2.0"
] | null | null | null | defmodule Fika do
def start(path \\ nil) do
if path do
File.cd!(path)
end
if File.exists?("router.fi") do
start_route_store()
start_webserver()
:ok
else
IO.puts("Cannot start webserver: file router.fi not found.")
:error
end
end
def start_route_store do
{:ok, _pid} = Supervisor.start_child(Fika.Supervisor, Fika.RouteStore)
end
def start_webserver(port \\ 6060) do
{:ok, _pid} =
Supervisor.start_child(
Fika.Supervisor,
{Plug.Cowboy, scheme: :http, plug: Fika.Router, options: [port: port, ip: {127, 0, 0, 1}]}
)
IO.puts("Web server is running on http://localhost:#{port}\nPress Ctrl+C to exit")
end
end
| 23.129032 | 98 | 0.616457 |
1cbd6fae8305da0c44e6a46a7bd6e78a582926dc | 7,960 | ex | Elixir | lib/numato_gpio.ex | kmiecikt/numato_gpio | 7d6775317dd52a04aa8b1fa0ecdce02e8c77e90b | [
"Apache-2.0"
] | null | null | null | lib/numato_gpio.ex | kmiecikt/numato_gpio | 7d6775317dd52a04aa8b1fa0ecdce02e8c77e90b | [
"Apache-2.0"
] | null | null | null | lib/numato_gpio.ex | kmiecikt/numato_gpio | 7d6775317dd52a04aa8b1fa0ecdce02e8c77e90b | [
"Apache-2.0"
] | null | null | null | defmodule Numato.Gpio do
use GenStage
defmodule State do
defstruct [
uart_pid: nil,
last_command: nil
]
end
def start_link(uart_pid) do
GenStage.start_link(__MODULE__, uart_pid)
end
def init(uart_pid) when is_pid(uart_pid) do
{:producer, %State{uart_pid: uart_pid}}
end
def init(com_port) when is_bitstring(com_port) do
{:ok, uart_pid} = Circuits.UART.start_link()
:ok = Circuits.UART.open(uart_pid, com_port,
speed: 115200,
active: true,
framing: {Numato.UART.Framing, separator: "\r\n"})
{:producer, %State{uart_pid: uart_pid}}
end
def handle_demand(_demand, state) do
{:noreply, [], state}
end
@doc """
Returns Numato firmware version.
"""
def ver(pid) when is_pid(pid) do
GenStage.call(pid, :ver)
end
@doc """
Reads ID of the Numato module.
"""
def id_get(pid) when is_pid(pid) do
GenStage.call(pid, :id_get)
end
@doc """
Writes ID of the Numato module. The ID must be a string with exactly 8 characters.
"""
def id_set(pid, id) when is_pid(pid) and is_bitstring(id) and byte_size(id) == 8 do
GenStage.call(pid, {:id_set, id})
end
@doc """
Reads the digital input status for the given GPIO. Returns 0 for the low and 1 for the high state.
"""
def gpio_read(pid, gpio) when is_pid(pid) and is_integer(gpio) do
GenStage.call(pid, {:gpio_read, gpio})
end
@doc """
Sets the GPIO output status to either low (value `0`) or high (value `1`).
"""
def gpio_write(pid, gpio, value) when is_pid(pid) and is_integer(gpio) and (value == 0 or value == 1) do
GenStage.call(pid, {:gpio_write, gpio, value})
end
@doc """
Sets the mask for subsequent `Numato.Gpio.gpio_writeall` and `Numato.Gpio.gpio_iodir` commands.
A 0 in a bit position masks the corresponding GPIO, and any update to that GPIO is ignored
during `Numato.Gpio.gpio_iodir` and `Numato.Gpio.gpio_writeall` operations.
"""
def gpio_iomask(pid, iomask) when is_bitstring(iomask) do
GenStage.call(pid, {:gpio_iomask, iomask})
end
@doc """
Sets the direction of all GPIOs in a single operation.
A 0 in a bit position configures that GPIO as output and 1 configures it as input.
This operation respects the `iomask`, set using `Numato.Gpio.gpio_iomask()` function.
"""
def gpio_iodir(pid, iodir) when is_bitstring(iodir) do
GenStage.call(pid, {:gpio_iodir, iodir})
end
@doc """
Reads the status of all GPIOs in a single operation.
"""
def gpio_readall(pid) when is_pid(pid) do
GenStage.call(pid, :gpio_readall)
end
@doc """
Enables GPIO input change notifications. When notifications are enabled, this `GenStage` producer
emits `{:notification, changes}` events, where `changes` describes the input GPIOs whose value
changed (derived from the current value, the previous value, and the I/O direction).
"""
def gpio_notify_on(pid) when is_pid(pid) do
GenStage.call(pid, :gpio_notify_on)
end
@doc """
Disables GPIOs input change notifications.
"""
def gpio_notify_off(pid) when is_pid(pid) do
GenStage.call(pid, :gpio_notify_off)
end
@doc """
Controls all GPIOs in a single operation.
This operation respects the `iomask`, set using `Numato.Gpio.gpio_iomask()` function.
"""
def gpio_writeall(pid, value) when is_pid(pid) do
GenStage.call(pid, {:gpio_writeall, value})
end
@doc """
Returns whether notifications are enabled (`true`) or disabled (`false`).
"""
def gpio_notify_get(pid) when is_pid(pid) do
GenStage.call(pid, :gpio_notify_get)
end
@doc """
Reads the analog voltage present at the given ADC input. Responses are
integers in the range 0..1023.
"""
def adc_read(pid, input) when is_pid(pid) and is_integer(input) do
GenStage.call(pid, {:adc_read, input})
end
def handle_call(:ver, from, state) do
command_text = Numato.Commands.ver()
send_call(command_text, {:ver, from}, state)
end
def handle_call(:id_get, from, state) do
command_text = Numato.Commands.id_get()
send_call(command_text, {:id_get, from}, state)
end
def handle_call({:id_set, id}, _from, state) do
command_text = Numato.Commands.id_set(id)
send_info(command_text, state)
end
def handle_call({:gpio_read, gpio}, from, state) do
command_text = Numato.Commands.gpio_read(gpio)
send_call(command_text, {:gpio_read, from}, state)
end
def handle_call({:gpio_write, gpio, value}, _from, state) do
command_text = Numato.Commands.gpio_write(gpio, value)
send_info(command_text, state)
end
def handle_call({:gpio_iomask, iomask}, _from, state) do
command_text = Numato.Commands.gpio_iomask(iomask)
send_info(command_text, state)
end
def handle_call({:gpio_iodir, iodir}, _from, state) do
command_text = Numato.Commands.gpio_iodir(iodir)
send_info(command_text, state)
end
def handle_call(:gpio_readall, from, state) do
command_text = Numato.Commands.gpio_readall()
send_call(command_text, {:gpio_readall, from}, state)
end
def handle_call(:gpio_notify_on, from, state) do
command_text = Numato.Commands.gpio_notify_on()
send_call(command_text, {:gpio_notify_on, from}, state)
end
def handle_call(:gpio_notify_off, from, state) do
command_text = Numato.Commands.gpio_notify_off()
send_call(command_text, {:gpio_notify_off, from}, state)
end
def handle_call(:gpio_notify_get, from, state) do
command_text = Numato.Commands.gpio_notify_get()
send_call(command_text, {:gpio_notify_get, from}, state)
end
def handle_call({:gpio_writeall, values}, _from, state) do
command_text = Numato.Commands.gpio_writeall(values)
send_info(command_text, state)
end
def handle_call({:adc_read, input}, from, state) do
command_text = Numato.Commands.adc_read(input)
send_call(command_text, {:adc_read, from}, state)
end
def handle_info({:circuits_uart, _, line}, state) do
response = Numato.Responses.parse(line)
case response do
:echo ->
{:noreply, [], state}
{:notification, previous, current, iodir} ->
{events, new_state} = process_notification(previous, current, iodir, state)
{:noreply, events, new_state}
_ ->
:ok = reply_to_command(state.last_command, response)
{:noreply, [], %State{state | last_command: nil}}
end
end
defp send_info(command_text, state) do
response = Circuits.UART.write(state.uart_pid, command_text)
{:reply, response, [], %State{state | last_command: nil}}
end
defp send_call(command_text, command_tuple, state) do
case Circuits.UART.write(state.uart_pid, command_text) do
:ok -> {:noreply, [], %State{state | last_command: command_tuple}}
error -> {:reply, error, [], %State{state | last_command: nil}}
end
end
defp process_notification(previous, current, iodir, state) do
changes = Numato.Utils.get_changed_ports(previous, current, iodir)
{[{:notification, changes}], state}
end
defp reply_to_command({:gpio_read, from}, {:int, value}) when value == 0 or value == 1 do
GenStage.reply(from, value)
end
defp reply_to_command({:id_get, from}, {:id, value}) do
GenStage.reply(from, value)
end
defp reply_to_command({:id_get, from}, {:bits, value}) do
GenStage.reply(from, Base.encode16(value))
end
defp reply_to_command({:ver, from}, {:bits, value}) do
GenStage.reply(from, Base.encode16(value))
end
defp reply_to_command({:gpio_readall, from}, {:bits, value}) do
GenStage.reply(from, value)
end
defp reply_to_command({:adc_read, from}, {:int, value}) do
GenStage.reply(from, value)
end
defp reply_to_command({:gpio_notify_get, from}, {:notify, value}) do
GenStage.reply(from, value)
end
defp reply_to_command({:gpio_notify_on, from}, {:notify, value}) do
GenStage.reply(from, value)
end
defp reply_to_command({:gpio_notify_off, from}, {:notify, value}) do
GenStage.reply(from, value)
end
defp reply_to_command(nil, _) do
:ok
end
end
| 29.701493 | 106 | 0.69196 |
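A hedged usage sketch of the module above; the serial device path is an assumption and real Numato hardware is required, so this will not run on its own:

```elixir
# Sketch only: "/dev/ttyACM0" is a guess at the board's serial port.
{:ok, gpio} = Numato.Gpio.start_link("/dev/ttyACM0")

:ok = Numato.Gpio.gpio_write(gpio, 0, 1)  # drive GPIO 0 high
_value = Numato.Gpio.gpio_read(gpio, 1)   # read GPIO 1 as 0 or 1

# The module is a GenStage producer, so input-change notifications can be
# consumed as a stream once they are switched on.
Numato.Gpio.gpio_notify_on(gpio)

GenStage.stream([gpio])
|> Stream.each(fn {:notification, changes} -> IO.inspect(changes) end)
|> Stream.run()
```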
1cbd7082a803fa16ccdce3e4d59eff7572a53f61 | 7,834 | exs | Elixir | test/esme/sync_test.exs | dwhelan/smppex | 548b1827828b571434aa390f8abae787df1f8b60 | [
"MIT"
] | null | null | null | test/esme/sync_test.exs | dwhelan/smppex | 548b1827828b571434aa390f8abae787df1f8b60 | [
"MIT"
] | null | null | null | test/esme/sync_test.exs | dwhelan/smppex | 548b1827828b571434aa390f8abae787df1f8b60 | [
"MIT"
] | null | null | null | defmodule SMPPEX.ESME.SyncTest do
use ExUnit.Case
alias :timer, as: Timer
alias SMPPEX.Session
alias SMPPEX.MC
alias SMPPEX.Pdu
alias SMPPEX.ESME.Sync, as: ESMESync
setup do
{:ok, callback_agent} = Agent.start_link(fn -> [] end)
callbacks = fn ->
Agent.get(
callback_agent,
&Enum.reverse(&1)
)
end
mc_opts = [
enquire_link_limit: 1000,
session_init_limit: :infinity,
enquire_link_resp_limit: 1000,
inactivity_limit: 10_000,
response_limit: 2000,
timer_resolution: 100_000
]
port = Support.TCP.Helpers.find_free_port()
mc_with_opts = fn handler, opts ->
case MC.start(
{Support.Session, {callback_agent, handler}},
transport_opts: [port: port],
mc_opts: opts
) do
{:ok, ref} -> ref
other -> other
end
end
mc = &mc_with_opts.(&1, mc_opts)
esme_with_opts = fn opts ->
{:ok, pid} = SMPPEX.ESME.Sync.start_link("127.0.0.1", port, opts)
pid
end
esme = fn -> esme_with_opts.([]) end
{
:ok,
mc: mc,
mc_with_opts: mc_with_opts,
callbacks: callbacks,
port: port,
esme: esme,
esme_with_opts: esme_with_opts
}
end
test "request and response", ctx do
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, pdu}, st ->
{:ok, [SMPPEX.Pdu.Factory.bind_transmitter_resp(0, "sid") |> Pdu.as_reply_to(pdu)], st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
1..100
|> Enum.map(fn _ ->
Task.async(fn ->
pdu = SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password")
assert {:ok, resp} = ESMESync.request(esme, pdu)
assert :bind_transmitter_resp == Pdu.command_name(resp)
end)
end)
|> Enum.each(fn task ->
Task.await(task)
end)
end
test "request and stop", ctx do
Process.flag(:trap_exit, true)
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:stop, :oops, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
pdu = SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password")
assert :stop = ESMESync.request(esme, pdu)
end
test "request and timeout", ctx do
Process.flag(:trap_exit, true)
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:ok, st}
end)
esme = ctx[:esme].()
pdu = SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password")
assert :timeout = ESMESync.request(esme, pdu, 50)
end
test "request and session timeout", ctx do
esme_opts = [
response_limit: 50,
timer_resolution: 5
]
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:ok, st}
end)
esme = ctx[:esme_with_opts].(esme_opts: esme_opts)
pdu = SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password")
assert :timeout = ESMESync.request(esme, pdu, 100)
end
test "request and error", ctx do
Process.flag(:trap_exit, true)
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:ok, st}
end)
esme = ctx[:esme].()
pdu = SMPPEX.Pdu.Factory.bind_transmitter("system_id", "too_long_password")
assert {:error, _} = ESMESync.request(esme, pdu)
end
test "wait_for_pdus, blocking", ctx do
pid = self()
ctx[:mc].(fn
{:init, _, _}, st ->
send(pid, self())
{:ok, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
mc_session =
receive do
pid -> pid
end
spawn_link(fn ->
Timer.sleep(30)
Session.send_pdu(mc_session, SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password"))
end)
{time, _} =
Timer.tc(fn ->
assert [pdu: _] = ESMESync.wait_for_pdus(esme, 60)
end)
assert time > 30_000
end
test "wait_for_pdus, nonblocking", ctx do
pid = self()
ctx[:mc].(fn
{:init, _, _}, st ->
send(pid, self())
{:ok, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
mc_session =
receive do
pid -> pid
end
Session.send_pdu(mc_session, SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password"))
Timer.sleep(30)
assert [pdu: _] = ESMESync.wait_for_pdus(esme)
end
test "wait_for_pdus, send_pdu result", ctx do
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:ok, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
spawn_link(fn ->
Timer.sleep(30)
Session.send_pdu(esme, SMPPEX.Pdu.Factory.enquire_link())
end)
assert [ok: _] = ESMESync.wait_for_pdus(esme)
end
test "wait_for_pdus, send_pdu error", ctx do
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, pdu}, st ->
{:ok, [SMPPEX.Pdu.Factory.enquire_link_resp() |> Pdu.as_reply_to(pdu)], st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
spawn_link(fn ->
Timer.sleep(30)
Session.send_pdu(
esme,
SMPPEX.Pdu.Factory.bind_transmitter("system_id", "too_long_password")
)
end)
assert [{:error, _pdu, _error}] = ESMESync.wait_for_pdus(esme)
end
test "wait_for_pdus, resp", ctx do
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, pdu}, st ->
pid = self()
spawn(fn ->
Timer.sleep(30)
Session.send_pdu(
pid,
SMPPEX.Pdu.Factory.bind_transmitter_resp(0) |> Pdu.as_reply_to(pdu)
)
end)
{:ok, [], st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
Session.send_pdu(esme, SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password"))
assert [ok: _] = ESMESync.wait_for_pdus(esme)
{time, _} =
Timer.tc(fn ->
assert [{:resp, _, _}] = ESMESync.wait_for_pdus(esme, 60)
end)
assert time > 30_000
end
test "wait_for_pdus, resp timeout", ctx do
esme_opts = [
response_limit: 50,
timer_resolution: 5
]
ctx[:mc].(fn
{:init, _, _}, st ->
{:ok, st}
{:handle_pdu, _pdu}, st ->
{:ok, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme_with_opts].(esme_opts: esme_opts)
Session.send_pdu(esme, SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password"))
assert [ok: _] = ESMESync.wait_for_pdus(esme)
assert [{:timeout, _}] = ESMESync.wait_for_pdus(esme, 100)
end
test "wait_for_pdus, timeout", ctx do
ctx[:mc].(fn {:init, _, _}, st ->
{:ok, st}
end)
esme = ctx[:esme].()
{time, _} =
Timer.tc(fn ->
assert :timeout = ESMESync.wait_for_pdus(esme, 30)
end)
assert time > 30_000
end
test "stop", ctx do
ctx[:mc].(fn {:init, _, _}, st ->
{:ok, st}
end)
esme = ctx[:esme].()
assert :ok = ESMESync.stop(esme)
Timer.sleep(30)
refute Process.alive?(esme)
end
test "pdus", ctx do
pid = self()
ctx[:mc].(fn
{:init, _, _}, st ->
send(pid, self())
{:ok, st}
{:handle_send_pdu_result, _, _}, st ->
st
end)
esme = ctx[:esme].()
mc_session =
receive do
pid -> pid
end
Session.send_pdu(mc_session, SMPPEX.Pdu.Factory.bind_transmitter("system_id", "password"))
Timer.sleep(30)
assert [pdu: _] = ESMESync.pdus(esme)
end
end
| 20.615789 | 96 | 0.547358 |
1cbdceb8d4c097dddb160f96c22befe64409db27 | 1,463 | ex | Elixir | lib/atlas/field_converter.ex | lowks/atlas | e01cefa088fe1174d9162f5834156611f1caa594 | ["MIT"] | 94 | 2015-01-01T16:17:40.000Z | 2021-12-16T02:53:01.000Z | lib/atlas/field_converter.ex | IdoBn/atlas | e01cefa088fe1174d9162f5834156611f1caa594 | ["MIT"] | 2 | 2015-09-22T14:45:26.000Z | 2017-08-29T18:16:04.000Z | lib/atlas/field_converter.ex | IdoBn/atlas | e01cefa088fe1174d9162f5834156611f1caa594 | ["MIT"] | 11 | 2015-01-06T02:25:00.000Z | 2021-12-16T03:10:11.000Z | defmodule Atlas.FieldConverter do
@moduledoc """
  FieldConverter provides transformations from raw values returned from the database or
  from user input into valid Elixir types, based on the schema's field types.

  ## Examples

      iex> value_to_field_type("1", :float)
      1.0
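  A couple more conversions, shown as illustrative doctest-style examples that
  match the clauses below:

  ```elixir
  iex> value_to_field_type("42", :integer)
  42

  iex> value_to_field_type("true", :boolean)
  true
  ```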
"""
def value_to_field_type(value, :string) when is_binary(value), do: value
def value_to_field_type(nil, :string), do: nil
def value_to_field_type(value, :string), do: to_string(value)
def value_to_field_type(value, :integer) when is_integer(value), do: value
def value_to_field_type(nil, :integer), do: nil
def value_to_field_type(value, :integer), do: elem(Integer.parse(value), 0)
def value_to_field_type(value, :float) when is_float(value), do: value
def value_to_field_type(value, :float) when is_integer(value), do: value + 0.0
def value_to_field_type(nil, :float), do: nil
def value_to_field_type(value, :float) do
case Float.parse(to_string(value)) do
{value, _} -> value
:error -> nil
end
end
def value_to_field_type(value, :boolean) when is_boolean(value), do: value
def value_to_field_type(value, :boolean), do: String.to_atom(to_string(value)) == true
def value_to_field_type(value, :datetime) when is_binary(value), do: value
def value_to_field_type(nil, :datetime), do: nil
def value_to_field_type(value, :datetime), do: value
# handle undefined field type
def value_to_field_type(value, nil), do: value
end
| 35.682927 | 88 | 0.731374 |
1cbdde2f5b0d4f28edd951a4b9e4ba273d042ae6 | 182 | exs | Elixir | priv/repo/migrations/20190402145007_remove_device_type_from_pageviews.exs | wvffle/analytics | 2c0fd55bc67f74af1fe1e2641678d44e9fee61d5 | ["MIT"] | 984 | 2019-09-02T11:36:41.000Z | 2020-06-08T06:25:48.000Z | priv/repo/migrations/20190402145007_remove_device_type_from_pageviews.exs | wvffle/analytics | 2c0fd55bc67f74af1fe1e2641678d44e9fee61d5 | ["MIT"] | 24 | 2019-09-10T09:53:17.000Z | 2020-06-08T07:35:26.000Z | priv/repo/migrations/20190402145007_remove_device_type_from_pageviews.exs | wvffle/analytics | 2c0fd55bc67f74af1fe1e2641678d44e9fee61d5 | ["MIT"] | 51 | 2019-09-03T10:48:10.000Z | 2020-06-07T00:23:34.000Z | defmodule Plausible.Repo.Migrations.RemoveDeviceTypeFromPageviews do
use Ecto.Migration
def change do
alter table(:pageviews) do
remove :device_type
end
end
end
| 18.2 | 68 | 0.752747 |
1cbe0cad05c48323bc208b58b01f521a107c3794 | 6,290 | ex | Elixir | lib/off_broadway_mqtt/producer.ex | tymoor/off_broadway_mqtt | 2066a62a400ea40a86fc8fdd1d588d8406bd95be | ["Apache-2.0"] | 11 | 2019-07-03T00:54:04.000Z | 2021-12-13T22:24:09.000Z | lib/off_broadway_mqtt/producer.ex | tymoor/off_broadway_mqtt | 2066a62a400ea40a86fc8fdd1d588d8406bd95be | ["Apache-2.0"] | 1 | 2020-10-17T02:25:56.000Z | 2020-11-14T01:51:15.000Z | lib/off_broadway_mqtt/producer.ex | tymoor/off_broadway_mqtt | 2066a62a400ea40a86fc8fdd1d588d8406bd95be | ["Apache-2.0"] | 5 | 2019-08-09T03:14:29.000Z | 2022-03-22T21:39:02.000Z | defmodule OffBroadway.MQTT.Producer do
@moduledoc """
  Acts as Producer for messages from an MQTT topic subscription.
## Features
  * Retries messages based on the failure reason in the message status.
  * Telemetry events.
  * Gently handles connection outages thanks to `Tortoise`.
  * Customizable behaviour via dependency injection.
## Options
  The producer requires a single argument on start - a list containing:

  * as the first element, a `t:OffBroadway.MQTT.Config.t/0` struct. Refer to
    the `OffBroadway.MQTT.Config` module for more info.
  * as the second element, a tuple with the subscription:
    `{:subscription, {"some_topic", 0}}`.

  Any further keywords are passed as options to
  `OffBroadway.MQTT.Client.start/2`.
## Notes
* The buffer queues are started and registered based on topics. If you are
using shared subscriptions you will have a single queue for buffering
the incoming messages.
* The default queue keeps buffered messages only in memory. If the queue
supervisor terminates all unprocessed messages are lost.
* The buffer queues are supervised independently and don't shut down with
the producer. That way a restarted producer on the same topic can pick up
where the faulty one stopped. You might need to stop queues manually if
stopping a producer on purpose.
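
  ## Example

  A minimal Broadway pipeline sketch using this producer. The pipeline name and
  the topic are hypothetical placeholders, and `config` is assumed to be a
  `t:OffBroadway.MQTT.Config.t/0` struct built for your broker:

  ```elixir
  Broadway.start_link(MyApp.Pipeline,
    name: MyApp.Pipeline,
    producer: [
      # args: the config struct, then the subscription tuple
      module: {OffBroadway.MQTT.Producer, [config, subscription: {"some_topic", 0}]},
      concurrency: 1
    ],
    processors: [default: [concurrency: 10]]
  )
  ```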
"""
use GenStage
require Logger
alias OffBroadway.MQTT
alias OffBroadway.MQTT.Client
alias OffBroadway.MQTT.Config
@behaviour Broadway.Producer
@typedoc "Internal state of the producer."
@type state :: %{
client_id: String.t(),
config: Config.t(),
demand: non_neg_integer,
dequeue_timer: nil | reference,
queue: GenServer.name()
}
@typedoc """
  Type for options that can be passed to the producer.
* `subscription` - A tuple with the topic and QOS to subscribe to.
Any other option is passed to `OffBroadway.MQTT.Client.start/4` as
  options. Refer there for further options.
"""
@type opt ::
{:subscription, MQTT.subscription()}
| Client.option()
@typedoc "Collection type for options to start the producer with."
@type opts :: [opt, ...]
@typedoc "Type for the argument passed to the producer on start."
@type args :: nonempty_improper_list(Config.t(), opts)
@impl true
@spec init(args) ::
{:producer, state}
| {:stop, {:client, :already_started}}
| {:stop, {:client, :ignore}}
| {:stop, {:client, term}}
| {:stop, {:queue, :ignore}}
| {:stop, {:queue, term}}
when args: nonempty_improper_list(Config.t(), opts),
opt: {:subscription, MQTT.subscription()} | Client.option(),
opts: [opt, ...]
def init([%Config{} = config, {:subscription, {topic, qos} = sub} | opts]) do
queue_name = MQTT.queue_name(config, topic)
:ok =
config
|> config_to_metadata()
|> Keyword.put(:qos, qos)
|> Keyword.put(:topic, topic)
|> Logger.metadata()
client_opts =
opts
|> Keyword.put_new_lazy(:client_id, fn ->
MQTT.unique_client_id(config)
end)
with :ok <- start_queue(config, queue_name),
:ok <- start_client(config, sub, queue_name, client_opts) do
{:producer,
%{
client_id: client_opts[:client_id],
config: config,
demand: 0,
dequeue_timer: nil,
queue: queue_name
}}
else
{:error, reason} -> {:stop, reason}
end
end
defp start_client(
%{client: client} = config,
{topic, qos},
queue_name,
client_opts
) do
case client.start(config, {topic, qos}, queue_name, client_opts) do
{:ok, _pid} ->
:ok
{:error, {:already_started, _}} ->
{:error, {:client, :already_started}}
:ignore ->
{:error, {:client, :ignore}}
{:error, reason} ->
{:error, {:client, reason}}
end
end
defp start_queue(config, queue_name) do
Process.flag(:trap_exit, true)
child_spec = config.queue.child_spec([config, queue_name])
case DynamicSupervisor.start_child(config.queue_supervisor, child_spec) do
{:ok, _} ->
:ok
{:error, {:already_started, _}} ->
topic = MQTT.topic_from_queue_name(queue_name)
Logger.warn("queue for topic #{inspect(topic)} is already started")
:ok
:ignore ->
{:error, {:queue, :ignore}}
{:error, reason} ->
{:error, {:queue, reason}}
end
end
@impl true
def handle_demand(incoming_demand, %{demand: demand} = state) do
handle_dequeue_messages(%{state | demand: incoming_demand + demand})
end
@impl true
def handle_info(:dequeue_messages, state) do
handle_dequeue_messages(%{state | dequeue_timer: nil})
end
@impl true
def handle_info(_, state) do
{:noreply, [], state}
end
@impl true
def terminate(reason, state) do
:ok = Tortoise.Connection.disconnect(state.client_id)
reason
end
defp handle_dequeue_messages(
%{dequeue_timer: nil, demand: demand, config: config} = state
)
when demand > 0 do
messages = dequeue_messages_from_queue(state, demand)
new_demand = demand - length(messages)
dequeue_timer =
case {messages, new_demand} do
{[], _} -> schedule_dequeue_messages(config.dequeue_interval)
{_, 0} -> nil
_ -> schedule_dequeue_messages(0)
end
{:noreply, messages,
%{state | demand: new_demand, dequeue_timer: dequeue_timer}}
end
defp handle_dequeue_messages(state) do
{:noreply, [], state}
end
defp dequeue_messages_from_queue(
%{queue: queue_name, config: config},
total_demand
) do
config.queue.dequeue(queue_name, total_demand)
end
defp schedule_dequeue_messages(interval) do
Process.send_after(self(), :dequeue_messages, interval)
end
defp config_to_metadata(config) do
{transport, opts} = config.server
opts
|> Keyword.put(:transport, transport)
|> hide_password
end
defp hide_password(meta) do
if Keyword.has_key?(meta, :password),
do: Keyword.update!(meta, :password, fn _ -> "******" end),
else: meta
end
end
| 27.831858 | 79 | 0.633704 |
1cbe129a38cb68e38f6e6cc86f0f37a71880dab6 | 406 | ex | Elixir | lib/elixir_plug_examples.ex | jtrost/elixir_plug_examples | 38d27296b29f29703644149e02e5c614cc815813 | ["Apache-2.0"] | 1 | 2015-06-30T02:50:19.000Z | 2015-06-30T02:50:19.000Z | lib/elixir_plug_examples.ex | jtrost/elixir_plug_examples | 38d27296b29f29703644149e02e5c614cc815813 | ["Apache-2.0"] | null | null | null | lib/elixir_plug_examples.ex | jtrost/elixir_plug_examples | 38d27296b29f29703644149e02e5c614cc815813 | ["Apache-2.0"] | null | null | null | defmodule ElixirPlugExamples do
use Application
def start(_type, _args) do
import Supervisor.Spec, warn: false
children = [
worker(__MODULE__, [], function: :run)
]
opts = [strategy: :one_for_one, name: ElixirPlugExamples.Supervisor]
Supervisor.start_link(children, opts)
end
def run do
{:ok, _} = Plug.Adapters.Cowboy.http ElixirPlugExamples.Router, []
end
end
| 21.368421 | 72 | 0.689655 |
1cbe3ace48763ded6c9509939789da4368f90b4e | 996 | exs | Elixir | mix.exs | itu-devops2022/logstash-json | f7febcd453126861e56f7b6c231b858d339fa4b0 | ["MIT"] | null | null | null | mix.exs | itu-devops2022/logstash-json | f7febcd453126861e56f7b6c231b858d339fa4b0 | ["MIT"] | null | null | null | mix.exs | itu-devops2022/logstash-json | f7febcd453126861e56f7b6c231b858d339fa4b0 | ["MIT"] | null | null | null | defmodule LogstashJson.Mixfile do
use Mix.Project
def project do
[
app: :logstash_json,
version: "0.7.5",
elixir: "~> 1.4",
description: description(),
package: package(),
build_embedded: Mix.env() == :prod,
start_permanent: Mix.env() == :prod,
consolidate_protocols: Mix.env() != :test,
deps: deps()
]
end
def application do
[applications: [:logger]]
end
defp deps do
[
{:connection, "~> 1.1"},
{:jason, "~> 1.2", optional: true},
{:blocking_queue, "~> 1.3", optional: true},
{:ex_doc, ">= 0.0.0", only: :dev},
{:credo, ">= 0.0.0", only: :dev}
]
end
defp description do
"""
Formats logs as JSON, forwards to Logstash via TCP, or to console.
"""
end
defp package do
[
name: :logstash_json,
maintainers: ["Tobias Ara Svensson"],
licenses: ["MIT"],
links: %{"Github" => "https://github.com/svetob/logstash-json"}
]
end
end
| 21.191489 | 70 | 0.548193 |
1cbe42bc1eed2e5f1de09ff792dc1c82e5e85304 | 1,035 | ex | Elixir | lib/credo/cli/output/formatter/oneline.ex | isaacsanders/credo | 5623570bb2e3944345f1bf11819ca613533b5e10 | ["MIT"] | null | null | null | lib/credo/cli/output/formatter/oneline.ex | isaacsanders/credo | 5623570bb2e3944345f1bf11819ca613533b5e10 | ["MIT"] | null | null | null | lib/credo/cli/output/formatter/oneline.ex | isaacsanders/credo | 5623570bb2e3944345f1bf11819ca613533b5e10 | ["MIT"] | 1 | 2020-06-30T16:32:44.000Z | 2020-06-30T16:32:44.000Z | defmodule Credo.CLI.Output.Formatter.Oneline do
@moduledoc false
alias Credo.CLI.Filename
alias Credo.CLI.Output
alias Credo.CLI.Output.UI
alias Credo.Issue
def print_issues(issues) do
issues
|> Enum.sort_by(fn issue -> {issue.filename, issue.line_no, issue.column} end)
|> Enum.each(fn issue ->
UI.puts(to_oneline(issue))
end)
end
def to_oneline(
%Issue{
check: check,
message: message,
filename: filename,
priority: priority
} = issue
) do
inner_color = Output.check_color(issue)
message_color = inner_color
filename_color = :default_color
[
inner_color,
Output.check_tag(check.category),
" ",
priority |> Output.priority_arrow(),
" ",
:reset,
filename_color,
:faint,
filename |> to_string,
:default_color,
:faint,
Filename.pos_suffix(issue.line_no, issue.column),
:reset,
message_color,
" ",
message
]
end
end
| 21.122449 | 82 | 0.600966 |
1cbe8be389051bf10ff5cfe85633d4337a447c52 | 1,543 | ex | Elixir | lib/lexin_web/views/error_helpers.ex | cr0t/lexin | bff2997db52a00bf770614630b8684821ab72abc | ["MIT"] | null | null | null | lib/lexin_web/views/error_helpers.ex | cr0t/lexin | bff2997db52a00bf770614630b8684821ab72abc | ["MIT"] | 6 | 2022-01-05T12:51:37.000Z | 2022-01-13T09:52:36.000Z | lib/lexin_web/views/error_helpers.ex | cr0t/lexin | bff2997db52a00bf770614630b8684821ab72abc | ["MIT"] | null | null | null | defmodule LexinWeb.ErrorHelpers do
@moduledoc """
Conveniences for translating and building error messages.
"""
use Phoenix.HTML
@doc """
Generates tag for inlined form input errors.
"""
def error_tag(form, field) do
Enum.map(Keyword.get_values(form.errors, field), fn error ->
content_tag(:span, translate_error(error),
class: "invalid-feedback",
phx_feedback_for: input_name(form, field)
)
end)
end
@doc """
Translates an error message using gettext.
"""
def translate_error({msg, opts}) do
# When using gettext, we typically pass the strings we want
# to translate as a static argument:
#
# # Translate "is invalid" in the "errors" domain
# dgettext("errors", "is invalid")
#
# # Translate the number of files with plural rules
# dngettext("errors", "1 file", "%{count} files", count)
#
# Because the error messages we show in our forms and APIs
# are defined inside Ecto, we need to translate them dynamically.
# This requires us to call the Gettext module passing our gettext
# backend as first argument.
#
# Note we use the "errors" domain, which means translations
# should be written to the errors.po file. The :count option is
# set by Ecto and indicates we should also apply plural rules.
if count = opts[:count] do
Gettext.dngettext(LexinWeb.Gettext, "errors", msg, msg, count, opts)
else
Gettext.dgettext(LexinWeb.Gettext, "errors", msg, opts)
end
end
end
| 32.145833 | 74 | 0.66429 |
1cbea9bd576dab01a86744422a0941f71bccaa07 | 591 | ex | Elixir | lib/splitwise/application.ex | nathanbegbie/ex_splitwise | 6de8b9f59db9834b342b86dfcd5c41354f349e5d | ["MIT"] | 3 | 2019-09-29T04:15:29.000Z | 2021-04-02T14:52:04.000Z | lib/splitwise/application.ex | nathanbegbie/ex_splitwise | 6de8b9f59db9834b342b86dfcd5c41354f349e5d | ["MIT"] | null | null | null | lib/splitwise/application.ex | nathanbegbie/ex_splitwise | 6de8b9f59db9834b342b86dfcd5c41354f349e5d | ["MIT"] | 1 | 2022-02-22T15:32:16.000Z | 2022-02-22T15:32:16.000Z | defmodule ExSplitwise.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
# List all child processes to be supervised
children = [
# Starts a worker by calling: Splitwise.Worker.start_link(arg)
# {Splitwise.Worker, arg}
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Splitwise.Supervisor]
Supervisor.start_link(children, opts)
end
end
| 28.142857 | 68 | 0.715736 |
1cbeae626ff344874e28a36062569ff49da2a3e7 | 63 | ex | Elixir | lib/mappers_web/views/layout_view.ex | evandiewald/mappers | 7359cfb39a4d9d26c42f5917ee04a7e41d3291bc | ["Apache-2.0"] | 32 | 2021-04-22T01:55:31.000Z | 2022-02-25T13:17:21.000Z | lib/mappers_web/views/layout_view.ex | evandiewald/mappers | 7359cfb39a4d9d26c42f5917ee04a7e41d3291bc | ["Apache-2.0"] | 58 | 2021-06-04T18:42:59.000Z | 2022-03-31T07:17:01.000Z | lib/mappers_web/views/layout_view.ex | evandiewald/mappers | 7359cfb39a4d9d26c42f5917ee04a7e41d3291bc | ["Apache-2.0"] | 13 | 2021-04-10T06:09:15.000Z | 2022-03-23T13:07:37.000Z | defmodule MappersWeb.LayoutView do
use MappersWeb, :view
end
| 15.75 | 34 | 0.809524 |
1cbeb87a9fa63599a16b9a91a1503d997ae13156 | 4,795 | ex | Elixir | apps/aecore/lib/aecore/sync/chain.ex | SingularityMatrix/elixir-node | ad126aa97931165185cf35454718ed2eee40ceed | ["ISC"] | null | null | null | apps/aecore/lib/aecore/sync/chain.ex | SingularityMatrix/elixir-node | ad126aa97931165185cf35454718ed2eee40ceed | ["ISC"] | 2 | 2018-10-01T16:46:26.000Z | 2018-10-01T19:45:42.000Z | apps/aecore/lib/aecore/sync/chain.ex | gspasov/dogs-blockchain | 884c14cfc98de2c3793a204da069630d090bbc90 | ["0BSD"] | null | null | null | defmodule Aecore.Sync.Chain do
@moduledoc """
Implements all the functions regarding the Chain structure of the SyncTask
"""
alias __MODULE__
alias Aecore.Chain.Header
alias Aecore.Sync.Task
@type peer_id :: pid()
@type chain_id :: reference()
@type height :: non_neg_integer()
@type header_hash :: binary()
@typedoc "Holds data for header height and hash"
@type chain :: %{height: height(), hash: header_hash()}
@type t :: %Chain{chain_id: chain_id(), peers: list(peer_id()), chain: list(chain())}
defstruct chain_id: nil,
peers: [],
chain: []
@spec init_chain(peer_id(), Header.t()) :: Chain.t()
def init_chain(peer_id, header) do
init_chain(Kernel.make_ref(), [peer_id], header)
end
@spec init_chain(chain_id(), peer_id(), Header.t()) :: Chain.t()
def init_chain(chain_id, peers, %Header{height: height, prev_hash: prev_hash} = header) do
header_hash = Header.hash(header)
prev_header_data =
if height > 1 do
[%{height: height - 1, hash: prev_hash}]
else
[]
end
%Chain{
chain_id: chain_id,
peers: peers,
chain: [%{height: height, hash: header_hash}] ++ prev_header_data
}
end
@spec merge_chains(Chain.t(), Chain.t()) :: Chain.t()
def merge_chains(%Chain{chain_id: chain_id, peers: peers_1, chain: chain_1}, %Chain{
chain_id: chain_id,
peers: peers_2,
chain: chain_2
}) do
peers =
(peers_1 ++ peers_2)
|> Enum.sort()
|> Enum.uniq()
%Chain{
chain_id: chain_id,
peers: peers,
chain: merge_chain_list_descending(chain_1, chain_2)
}
end
@spec try_match_chains(list(chain()), list(chain())) ::
:equal | :different | {:first | :second, height()}
def try_match_chains([%{height: height_1} | chain_1], [
%{height: height_2, hash: header_hash} | _
])
when height_1 > height_2 do
case find_hash_at_height(height_2, chain_1) do
{:ok, ^header_hash} -> :equal
{:ok, _} -> :different
:not_found -> {:first, height_2}
end
end
def try_match_chains([%{height: height_1, hash: header_hash} | _], chain_2) do
case find_hash_at_height(height_1, chain_2) do
{:ok, ^header_hash} -> :equal
{:ok, _} -> :different
:not_found -> {:second, height_1}
end
end
@spec find_hash_at_height(height(), list(chain())) :: {:ok, header_hash()} | :not_found
def find_hash_at_height(height, [%{height: height, hash: header_hash} | _]),
do: {:ok, header_hash}
def find_hash_at_height(_, []), do: :not_found
def find_hash_at_height(height, [%{height: height_1} | _]) when height_1 < height,
do: :not_found
def find_hash_at_height(height, [_ | chain]), do: find_hash_at_height(height, chain)
@doc """
  If there is a task with a `chain_id` equal to that of the given chain,
  merges the data of the chain in that task with the given chain.
"""
@spec add_chain_info(Chain.t(), Sync.t()) :: Sync.t()
def add_chain_info(%Chain{chain_id: chain_id} = incoming_chain, sync) do
case Task.get_sync_task(chain_id, sync) do
{:ok, %Task{chain: current_chain} = task} ->
merged_chain = merge_chains(incoming_chain, current_chain)
task_with_merged_chain = %Task{task | chain: merged_chain}
Task.set_sync_task(task_with_merged_chain, sync)
{:error, :not_found} ->
sync
end
end
@doc """
  Get the next known `header_hash` at a height bigger than the given one; or,
  if no such hash exists, the `header_hash` at the highest known height.
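
  An illustrative example (hypothetical hashes):

  ```elixir
  chain = [
    %{height: 3, hash: "h3"},
    %{height: 2, hash: "h2"},
    %{height: 1, hash: "h1"}
  ]

  next_known_header_hash(chain, 1)
  # => "h2" - the lowest known height above 1

  next_known_header_hash(chain, 5)
  # => "h3" - no known height above 5, so the highest known hash
  ```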
"""
@spec next_known_header_hash(Chain.t(), height()) :: header_hash()
def next_known_header_hash(chains, height) do
%{hash: header_hash} =
case Enum.take_while(chains, fn %{height: h} -> h > height end) do
[] ->
[chain | _] = chains
chain
chains_1 ->
List.last(chains_1)
end
header_hash
end
  ## Merges two lists of chains that are already sorted in descending order
  ## (based on the height), without keeping duplicates,
  ## where each element is a map with a height and a header hash
defp merge_chain_list_descending(list1, list2) do
merge(list1, list2, [])
end
defp merge([], [], acc), do: Enum.reverse(acc)
defp merge([], [head2 | rest2], acc) do
merge([], rest2, [head2 | acc])
end
defp merge([head1 | rest1], [], acc) do
merge(rest1, [], [head1 | acc])
end
defp merge(
[%{height: height1} = hd1 | rest1] = list1,
[%{height: height2} = hd2 | rest2] = list2,
acc
) do
cond do
height1 > height2 ->
merge(rest1, list2, [hd1 | acc])
height1 < height2 ->
merge(list1, rest2, [hd2 | acc])
true ->
merge(rest1, rest2, [hd1 | acc])
end
end
end
| 28.885542 | 92 | 0.620855 |
1cbecefb20db25ad7ee31ecbbf8691d29ef9661a | 1,773 | ex | Elixir | clients/content/lib/google_api/content/v2/model/orders_get_by_merchant_order_id_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | ["Apache-2.0"] | 1 | 2018-12-03T23:43:10.000Z | 2018-12-03T23:43:10.000Z | clients/content/lib/google_api/content/v2/model/orders_get_by_merchant_order_id_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | ["Apache-2.0"] | null | null | null | clients/content/lib/google_api/content/v2/model/orders_get_by_merchant_order_id_response.ex | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | ["Apache-2.0"] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the elixir code generator program.
# Do not edit the class manually.
defmodule GoogleApi.Content.V2.Model.OrdersGetByMerchantOrderIdResponse do
@moduledoc """
## Attributes
* `kind` (*type:* `String.t`, *default:* `content#ordersGetByMerchantOrderIdResponse`) - Identifies what kind of resource this is. Value: the fixed string "content#ordersGetByMerchantOrderIdResponse".
* `order` (*type:* `GoogleApi.Content.V2.Model.Order.t`, *default:* `nil`) - The requested order.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:kind => String.t(),
:order => GoogleApi.Content.V2.Model.Order.t()
}
field(:kind)
field(:order, as: GoogleApi.Content.V2.Model.Order)
end
defimpl Poison.Decoder, for: GoogleApi.Content.V2.Model.OrdersGetByMerchantOrderIdResponse do
def decode(value, options) do
GoogleApi.Content.V2.Model.OrdersGetByMerchantOrderIdResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Content.V2.Model.OrdersGetByMerchantOrderIdResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 35.46 | 204 | 0.743937 |
1cbed0419bcceb7ee30a11a62ffd37382b78fd34 | 552 | ex | Elixir | lib/level/posts.ex | cas27/level | 70f4c7ab696e426c4be5cdc0b71bca1dcc0fe21a | ["Apache-2.0"] | null | null | null | lib/level/posts.ex | cas27/level | 70f4c7ab696e426c4be5cdc0b71bca1dcc0fe21a | ["Apache-2.0"] | null | null | null | lib/level/posts.ex | cas27/level | 70f4c7ab696e426c4be5cdc0b71bca1dcc0fe21a | ["Apache-2.0"] | null | null | null | defmodule Level.Posts do
@moduledoc """
The Posts context.
"""
alias Level.Posts.Post
alias Level.Repo
alias Level.Spaces.SpaceUser
@doc """
Creates a new post.
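
  A sketch of a typical call. The `space_user` and the params map here are
  hypothetical; `Post.create_changeset/2` determines which params are valid:

  ```elixir
  {:ok, post} = Level.Posts.create_post(space_user, %{body: "Hello, space!"})
  ```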
"""
@spec create_post(SpaceUser.t(), map()) :: {:ok, Post.t()} | {:error, Ecto.Changeset.t()}
def create_post(space_user, params) do
params_with_relations =
params
|> Map.put(:space_id, space_user.space_id)
|> Map.put(:space_user_id, space_user.id)
%Post{}
|> Post.create_changeset(params_with_relations)
|> Repo.insert()
end
end
| 22.08 | 91 | 0.648551 |
1cbed557201f3ea0fb5ec32373474ed328eb2cbe | 2,050 | ex | Elixir | lib/dde_iotserver_liveview_web.ex | aslakjohansen/dde-iotserver-liveview | eaf063c366105da7ca30b55c6a7a7dd4505b0916 | ["BSD-3-Clause"] | null | null | null | lib/dde_iotserver_liveview_web.ex | aslakjohansen/dde-iotserver-liveview | eaf063c366105da7ca30b55c6a7a7dd4505b0916 | ["BSD-3-Clause"] | null | null | null | lib/dde_iotserver_liveview_web.ex | aslakjohansen/dde-iotserver-liveview | eaf063c366105da7ca30b55c6a7a7dd4505b0916 | ["BSD-3-Clause"] | null | null | null | defmodule DdeIotserverLiveviewWeb do
@moduledoc """
  The entrypoint for defining your web interface, such
  as controllers, views, channels and so on.

  This can be used in your application as:

      use DdeIotserverLiveviewWeb, :controller
      use DdeIotserverLiveviewWeb, :view

  The definitions below will be executed for every view,
  controller, etc, so keep them short and clean, focused
  on imports, uses and aliases.

  Do NOT define functions inside the quoted expressions
  below. Instead, define any helper function in modules
  and import those modules here.
"""
def controller do
quote do
use Phoenix.Controller, namespace: DdeIotserverLiveviewWeb
import Plug.Conn
import DdeIotserverLiveviewWeb.Gettext
alias DdeIotserverLiveviewWeb.Router.Helpers, as: Routes
end
end
def view do
quote do
use Phoenix.View,
root: "lib/dde_iotserver_liveview_web/templates",
namespace: DdeIotserverLiveviewWeb
# Import convenience functions from controllers
import Phoenix.Controller,
only: [get_flash: 1, get_flash: 2, view_module: 1, view_template: 1]
# Include shared imports and aliases for views
unquote(view_helpers())
end
end
def router do
quote do
use Phoenix.Router
import Plug.Conn
import Phoenix.Controller
end
end
def channel do
quote do
use Phoenix.Channel
import DdeIotserverLiveviewWeb.Gettext
end
end
defp view_helpers do
quote do
# Use all HTML functionality (forms, tags, etc)
use Phoenix.HTML
# Import basic rendering functionality (render, render_layout, etc)
import Phoenix.View
import DdeIotserverLiveviewWeb.ErrorHelpers
import DdeIotserverLiveviewWeb.Gettext
alias DdeIotserverLiveviewWeb.Router.Helpers, as: Routes
end
end
@doc """
When used, dispatch to the appropriate controller/view/etc.
"""
defmacro __using__(which) when is_atom(which) do
apply(__MODULE__, which, [])
end
end
| 25 | 76 | 0.709756 |
1cbedfdb8703da071802844a1b8c9df108c9a3bd | 1,895 | exs | Elixir | config/dev.exs | madvoidhq/gaga | 1a539edc327135c7910a51bffd6824bddcba5f7d | ["MIT"] | 13 | 2020-11-22T18:43:21.000Z | 2022-02-12T00:57:45.000Z | config/dev.exs | madvoidhq/gaga | 1a539edc327135c7910a51bffd6824bddcba5f7d | ["MIT"] | 2 | 2020-11-25T16:58:15.000Z | 2021-06-21T12:02:41.000Z | config/dev.exs | madvoidhq/gaga | 1a539edc327135c7910a51bffd6824bddcba5f7d | ["MIT"] | 4 | 2020-11-23T08:14:03.000Z | 2022-01-25T08:18:41.000Z | use Mix.Config
# For development, we disable any cache and enable
# debugging and code reloading.
#
# The watchers configuration can be used to run external
# watchers to your application. For example, we use it
# with webpack to recompile .js and .css sources.
config :gaga, GagaWeb.Endpoint,
http: [port: 4000],
debug_errors: true,
code_reloader: true,
check_origin: false,
watchers: [
node: [
"node_modules/webpack/bin/webpack.js",
"--mode",
"development",
"--watch-stdin",
cd: Path.expand("../assets", __DIR__)
]
]
# ## SSL Support
#
# In order to use HTTPS in development, a self-signed
# certificate can be generated by running the following
# Mix task:
#
# mix phx.gen.cert
#
# Note that this task requires Erlang/OTP 20 or later.
# Run `mix help phx.gen.cert` for more information.
#
# The `http:` config above can be replaced with:
#
# https: [
# port: 4001,
# cipher_suite: :strong,
# keyfile: "priv/cert/selfsigned_key.pem",
# certfile: "priv/cert/selfsigned.pem"
# ],
#
# If desired, both `http:` and `https:` keys can be
# configured to run both http and https servers on
# different ports.
# Watch static and templates for browser reloading.
config :gaga, GagaWeb.Endpoint,
live_reload: [
patterns: [
~r"priv/static/.*(js|css|png|jpeg|jpg|gif|svg)$",
~r"priv/gettext/.*(po)$",
~r"lib/gaga_web/(live|views)/.*(ex)$",
~r"lib/gaga_web/templates/.*(eex)$"
]
]
# Do not include metadata nor timestamps in development logs
config :logger, :console, format: "[$level] $message\n"
# Set a higher stacktrace during development. Avoid configuring such
# in production as building large stacktraces may be expensive.
config :phoenix, :stacktrace_depth, 20
# Initialize plugs at runtime for faster development compilation
config :phoenix, :plug_init_mode, :runtime
| 27.867647 | 68 | 0.680739 |
1cbeecf1d57e0ef65e912c8e7367cd3513e7d325 | 186 | ex | Elixir | lib/ex_alipay.ex | leozhang37/ex_alipay | 64ef0d376df36ff52e0828754c07f23249ab6914 | ["MIT"] | null | null | null | lib/ex_alipay.ex | leozhang37/ex_alipay | 64ef0d376df36ff52e0828754c07f23249ab6914 | ["MIT"] | null | null | null | lib/ex_alipay.ex | leozhang37/ex_alipay | 64ef0d376df36ff52e0828754c07f23249ab6914 | ["MIT"] | null | null | null | defmodule ExAlipay do
@moduledoc """
  An Alipay client that is extendable.

  ### Setup

  Add `:ex_alipay` to the deps in `mix.exs`:
```elixir
{:ex_alipay, "~> 0.1"}
```
"""
end
| 13.285714 | 42 | 0.591398 |
1cbef56f4e36d0c6b9eca88e752cd681bbe2f494 | 225 | ex | Elixir | lib/requester_web/router.ex | jbrb/lyrix-api | c532d1a9f42f9d086a35f8aeb59dc363143d4242 | ["MIT"] | null | null | null | lib/requester_web/router.ex | jbrb/lyrix-api | c532d1a9f42f9d086a35f8aeb59dc363143d4242 | ["MIT"] | null | null | null | lib/requester_web/router.ex | jbrb/lyrix-api | c532d1a9f42f9d086a35f8aeb59dc363143d4242 | ["MIT"] | null | null | null | defmodule RequesterWeb.Router do
use RequesterWeb, :router
pipeline :api do
plug :accepts, ["json"]
end
scope "/api", RequesterWeb do
pipe_through :api
post("/send", RequestController, :post)
end
end
| 16.071429 | 43 | 0.68 |
1cbf265aafd1820afd20cfcc538ca3d181704c62 | 283 | exs | Elixir | test/grizzly/zwave/commands/thermostat_fan_state_get_test.exs | jellybob/grizzly | 290bee04cb16acbb9dc996925f5c501697b7ac94 | ["Apache-2.0"] | 76 | 2019-09-04T16:56:58.000Z | 2022-03-29T06:54:36.000Z | test/grizzly/zwave/commands/thermostat_fan_state_get_test.exs | jellybob/grizzly | 290bee04cb16acbb9dc996925f5c501697b7ac94 | ["Apache-2.0"] | 124 | 2019-09-05T14:01:24.000Z | 2022-02-28T22:58:14.000Z | test/grizzly/zwave/commands/thermostat_fan_state_get_test.exs | jellybob/grizzly | 290bee04cb16acbb9dc996925f5c501697b7ac94 | ["Apache-2.0"] | 10 | 2019-10-23T19:25:45.000Z | 2021-11-17T13:21:20.000Z | defmodule Grizzly.ZWave.Commands.ThermostatFanStateGetTest do
use ExUnit.Case, async: true
alias Grizzly.ZWave.Commands.ThermostatFanStateGet
test "creates the command and validates params" do
params = []
{:ok, _command} = ThermostatFanStateGet.new(params)
end
end
| 25.727273 | 61 | 0.766784 |
1cbf303a3c408bb3609c46233a68c4b5ae8e8eab | 1,113 | ex | Elixir | lib/commodity_api/iam/access_control/passphrase/invalidation/invalidation_view.ex | akdilsiz/commodity-cloud | 08c366c9fc95fbb3565131672db4cc52f8b870c9 | [
"Apache-2.0"
] | 7 | 2019-04-11T21:12:49.000Z | 2021-04-14T12:56:42.000Z | lib/commodity_api/iam/access_control/passphrase/invalidation/invalidation_view.ex | akdilsiz/commodity-cloud | 08c366c9fc95fbb3565131672db4cc52f8b870c9 | [
"Apache-2.0"
] | null | null | null | lib/commodity_api/iam/access_control/passphrase/invalidation/invalidation_view.ex | akdilsiz/commodity-cloud | 08c366c9fc95fbb3565131672db4cc52f8b870c9 | [
"Apache-2.0"
] | 2 | 2019-06-06T18:05:33.000Z | 2019-07-16T08:49:45.000Z | ##
# Copyright 2018 Abdulkadir DILSIZ
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
##
defmodule Commodity.Api.Iam.AccessControl.Passphrase.InvalidationView do
use Commodity.Api, :view
def render("show.json", %{invalidation: invalidation}) do
%{data: render_one(invalidation.one, __MODULE__, "invalidation.json"),
time_information: render_one(invalidation.time_information,
Commodity.Api.Util.TimeInformationView,
"time_information.json")}
end
def render("invalidation.json", %{invalidation: invalidation}) do
%{passphrase_ids: invalidation.passphrase_ids}
end
end | 38.37931 | 77 | 0.751123 |
1cbf90dc1f820c93b48d66da8670ec78e95f8932 | 519 | exs | Elixir | config/dev.exs | 0nkery/psychic-computing-machine | d90e8792c49d170e166b4d6029e4686104109b24 | [
"MIT"
] | null | null | null | config/dev.exs | 0nkery/psychic-computing-machine | d90e8792c49d170e166b4d6029e4686104109b24 | [
"MIT"
] | null | null | null | config/dev.exs | 0nkery/psychic-computing-machine | d90e8792c49d170e166b4d6029e4686104109b24 | [
"MIT"
] | null | null | null | import Config
config :logger, level: :debug
config :logger, :console,
format: "\n$time $metadata[$level] $levelpad$message\n",
metadata: [:application, :query_time, :response_status]
config :deribit, Deribit.InfluxDBConnection,
database: "stock",
host: "localhost",
http_opts: [insecure: true],
pool: [max_overflow: 10, size: 50],
port: 8086,
scheme: "http",
writer: Instream.Writer.Line
if File.exists?(__ENV__.file |> Path.dirname() |> Path.join("local.exs")) do
import_config "local.exs"
end
| 24.714286 | 76 | 0.701349 |
1cc002e82f52de99eddc935af07ad567f7bcb91b | 770 | ex | Elixir | web/controllers/auth_controller.ex | wsmoak/my_app_808732 | f4f92dd5cb2ffa2d6854a6573ba092ef433ab85d | [
"MIT"
] | null | null | null | web/controllers/auth_controller.ex | wsmoak/my_app_808732 | f4f92dd5cb2ffa2d6854a6573ba092ef433ab85d | [
"MIT"
] | null | null | null | web/controllers/auth_controller.ex | wsmoak/my_app_808732 | f4f92dd5cb2ffa2d6854a6573ba092ef433ab85d | [
"MIT"
] | null | null | null | defmodule MyApp_808732.AuthController do
use MyApp_808732.Web, :controller
alias MyApp_808732.User
def index(conn, _params) do
redirect conn, external: Fitbit.authorize_url!(scope: "settings sleep")
end
def callback(conn, %{"code" => code}) do
token = Fitbit.get_token!(code: code)
data = OAuth2.AccessToken.get!(token, "/1/user/-/profile.json")
IO.inspect data
user_name = data["user"]["displayName"]
changeset = User.changeset(%User{},
%{user_id: token.other_params["user_id"],
access_token: token.access_token,
refresh_token: token.refresh_token,
name: user_name
})
Repo.insert!(changeset)
conn
|> put_flash(:info, "Hello #{user_name}!")
|> redirect(to: "/")
end
end
| 24.0625 | 75 | 0.651948 |
1cc0105fd1b41dfa77508f9d6388c2d8dbeaca6a | 276 | exs | Elixir | test/nat_vis_test.exs | BradLyman/elixir_native_visualization | e651e16254d752eeefc4e9946759e8d8c0b538b5 | [
"MIT"
] | null | null | null | test/nat_vis_test.exs | BradLyman/elixir_native_visualization | e651e16254d752eeefc4e9946759e8d8c0b538b5 | [
"MIT"
] | 4 | 2017-10-06T06:26:09.000Z | 2017-10-06T06:36:11.000Z | test/nat_vis_test.exs | BradLyman/elixir_native_visualization | e651e16254d752eeefc4e9946759e8d8c0b538b5 | [
"MIT"
] | null | null | null | defmodule ElixirNativeVisualizationTest do
use ExUnit.Case
doctest NatVis
test "open_window and close_window should succeed" do
window = NatVis.open_window
:timer.sleep(1000)
result =
window |> NatVis.close_window
assert result == :ok
end
end
| 18.4 | 55 | 0.717391 |
1cc02358e8e8a35e2acc875aede4f92142ca5371 | 286 | exs | Elixir | test/forum/publishable_test.exs | myskoach/events | 84e270b9f4546a32a033cc2cba3e3961b9a86a75 | [
"MIT"
] | null | null | null | test/forum/publishable_test.exs | myskoach/events | 84e270b9f4546a32a033cc2cba3e3961b9a86a75 | [
"MIT"
] | null | null | null | test/forum/publishable_test.exs | myskoach/events | 84e270b9f4546a32a033cc2cba3e3961b9a86a75 | [
"MIT"
] | null | null | null | defmodule Forum.PublishableTest do
use ExUnit.Case, async: true
alias Forum.Publishable
describe "topic/1" do
test "returns the topic for the event by default" do
assert Publishable.topic(%Forum.TestEventOne{life: 42}) == "test-event-one"
end
end
end
| 23.833333 | 82 | 0.688811 |
1cc08c4e351f4e9c15bb0b3e3d1a0d4624814e59 | 20,029 | ex | Elixir | lib/chunkr/pagination_planner.ex | sjqtentacles/chunkr | eaac18d704c46889f47ead35b67a82ec94c18a6b | [
"MIT"
] | null | null | null | lib/chunkr/pagination_planner.ex | sjqtentacles/chunkr | eaac18d704c46889f47ead35b67a82ec94c18a6b | [
"MIT"
] | null | null | null | lib/chunkr/pagination_planner.ex | sjqtentacles/chunkr | eaac18d704c46889f47ead35b67a82ec94c18a6b | [
"MIT"
] | null | null | null | defmodule Chunkr.PaginationPlanner do
@moduledoc """
Provides a set of macros for generating functions to assist with paginating queries. For example:
defmodule MyApp.PaginationPlanner do
use Chunkr.PaginationPlanner
paginate_by :user_created_at do
sort :desc, as(:user).inserted_at
sort :desc, as(:user).id, type: :binary_id
end
paginate_by :user_name do
          sort :asc, fragment("lower(coalesce(?, 'zzz'))", as(:user).name)
sort :desc, as(:user).id, type: :binary_id
end
end
The `paginate_by/1` macro above takes a query name and sets up the necessary `beyond_cursor/4`,
`apply_order/4`, and `apply_select/2` functions based on the number of sort options passed in the
block as well as the sort directions specified.
Each call to `sort/3` must include the sort direction, the field to be sorted, and an optional
`:type` keyword. If `:type` is provided, the cursor value will be cast as that type for the
sake of comparisons. See Ecto.Query.API.type/2.
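
  The generated module exposes `beyond_cursor/5`, `apply_order/4`, `apply_select/2`,
  and `apply_limit/2`. As an illustrative sketch (Chunkr normally invokes these
  internally, and `cursor_values` is assumed to be an already-decoded list of
  cursor values), a paginated query is assembled roughly like this:

  ```elixir
  query
  |> MyApp.PaginationPlanner.beyond_cursor(:user_created_at, :desc, :forward, cursor_values)
  |> MyApp.PaginationPlanner.apply_order(:user_created_at, :desc, :forward)
  |> MyApp.PaginationPlanner.apply_select(:user_created_at)
  |> MyApp.PaginationPlanner.apply_limit(26)
  ```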
## Ordering
In keyset-based pagination, it is essential that results are deterministically ordered, otherwise
you may see unexpected results. Therefore, the final column used for sorting must _always_ be
unique and non-NULL.
Ordering of paginated results can be based on columns from the primary table, any joined table,
any subquery, or any dynamically computed value based on other fields. Regardless of where the
column resides, named bindings are always required…
## Named bindings
Because these `sort/3` clauses must reference bindings that have not yet been established, each
sort clause must use `:as` to take advantage of late binding. A parallel `:as` must then be used
within the query that gets passed to `Chunkr.paginate/4` or the query will fail. See
[Ecto Named bindings](https://hexdocs.pm/ecto/Ecto.Query.html#module-named-bindings) for more.
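
  For example, a query paginated with the `:user_name` planner above must itself
  establish the `:user` binding (the schema name `MyApp.User` is an assumption):

  ```elixir
  import Ecto.Query

  # `as: :user` matches the `as(:user)` references in the planner's
  # sort clauses; without it the paginated query fails.
  query = from(u in MyApp.User, as: :user)
  ```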
## NULL values in sort fields
When using comparison operators in SQL, records involving comparisons against `NULL` get dropped.
This is generally undesirable for pagination, as the goal is usually to work your way through an
entire result set in chunks—not just through the part of the result set that doesn't have NULL
values in the important fields. For example, when sorting users by [last name, first name,
middle name], you most likely don't want to exclude users without a known middle name.
To work around this awkwardness, you'll need to pick a value that is almost sure to come before
or after the rest of your results (depending on whether you want `NULL` values to sort to the
beginning or the end of your results). It's not good enough to think you can simply use a strategy
like ordering by `NULLS LAST` because the filtering of values up to the cursor values will use
comparison operators—which will cause records with relevant NULL values to be dropped entirely.
The following `fragment` example sets up names to be compared in a case-insensitive fashion
and places records with a `NULL` name at the end of the list (assuming no names will sort beyond
"zzz"!).
      sort :asc, fragment("lower(coalesce(?, 'zzz'))", as(:user).name)
## Limitations
_Note that Chunkr limits the number of `sort` clauses to 4._
"""
@doc false
defmacro __using__(_) do
quote do
import unquote(__MODULE__)
require Ecto.Query
def apply_limit(query, limit) do
Ecto.Query.limit(query, ^limit)
end
end
end
@doc """
Implements the functions necessary for pagination.
paginate_by :user_id do
sort :asc, as(:user).id
end
"""
defmacro paginate_by(query_name, do: {:sort, _, args}) do
sorts = [parse_sorts(args)]
implement(query_name, sorts)
end
defmacro paginate_by(query_name, do: {:__block__, _, sorts}) do
sorts = Enum.map(sorts, fn {:sort, _, args} -> parse_sorts(args) end)
implement(query_name, sorts)
end
@doc false
def parse_sorts([dir, field]), do: {dir, field, nil}
def parse_sorts([dir, field, [type: type]]), do: {dir, field, type}
@doc false
def with_cursor_fields_func(query_name, fields) do
quote do
def apply_select(query, unquote(query_name)) do
Ecto.Query.select(query, [record], {unquote(fields), record})
end
end
end
@doc false
def with_order_func(query_name, primary_sort_dir, order_bys) do
inverted_sort_dir = invert(primary_sort_dir)
quote do
def apply_order(query, unquote(query_name), unquote(primary_sort_dir), :forward) do
Ecto.Query.order_by(query, unquote(order_bys))
end
def apply_order(query, unquote(query_name), unquote(primary_sort_dir), :backward) do
Ecto.Query.order_by(query, unquote(order_bys))
|> Ecto.Query.reverse_order()
end
def apply_order(query, unquote(query_name), unquote(inverted_sort_dir), :forward) do
Ecto.Query.order_by(query, unquote(order_bys))
|> Ecto.Query.reverse_order()
end
def apply_order(query, unquote(query_name), unquote(inverted_sort_dir), :backward) do
Ecto.Query.order_by(query, unquote(order_bys))
end
end
end
@doc false
def implement(query_name, sorts) when length(sorts) == 1 do
[{dir1, f1, t1}] = sorts
rdir1 = invert(dir1)
operators = derive_operators([dir1])
[op1] = operators
[rop1] = operators |> Enum.map(&invert/1)
order_bys = Enum.map(sorts, fn {dir, field, _type} -> {dir, field} end)
fields = Enum.map(sorts, fn {_dir, field, _type} -> field end)
quote do
def beyond_cursor(query, unquote(query_name), unquote(dir1), :forward, cursor_values) do
[cv1] = cursor_values
Ecto.Query.where(query, compare(unquote(f1), unquote(op1), cv1, unquote(t1)))
end
def beyond_cursor(query, unquote(query_name), unquote(dir1), :backward, cursor_values) do
[cv1] = cursor_values
Ecto.Query.where(query, compare(unquote(f1), unquote(rop1), cv1, unquote(t1)))
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :forward, cursor_values) do
[cv1] = cursor_values
Ecto.Query.where(query, compare(unquote(f1), unquote(rop1), cv1, unquote(t1)))
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :backward, cursor_values) do
[cv1] = cursor_values
Ecto.Query.where(query, compare(unquote(f1), unquote(op1), cv1, unquote(t1)))
end
unquote(with_order_func(query_name, dir1, order_bys))
unquote(with_cursor_fields_func(query_name, fields))
end
end
def implement(query_name, sorts) when length(sorts) == 2 do
[{dir1, f1, t1}, {dir2, f2, t2}] = sorts
rdir1 = invert(dir1)
operators = derive_operators([dir1, dir2])
[op1, op2, op3, op4] = operators
[rop1, rop2, rop3, rop4] = Enum.map(operators, &invert/1)
order_bys = Enum.map(sorts, fn {dir, field, _type} -> {dir, field} end)
fields = Enum.map(sorts, fn {_dir, field, _type} -> field end)
quote do
def beyond_cursor(query, unquote(query_name), unquote(dir1), :forward, cursor_values) do
[cv1, cv2] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
(compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(dir1), :backward, cursor_values) do
[cv1, cv2] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
(compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :forward, cursor_values) do
[cv1, cv2] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
(compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :backward, cursor_values) do
[cv1, cv2] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
(compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))))
)
end
unquote(with_order_func(query_name, dir1, order_bys))
unquote(with_cursor_fields_func(query_name, fields))
end
end
@doc false
def implement(query_name, sorts) when length(sorts) == 3 do
[{dir1, f1, t1}, {dir2, f2, t2}, {dir3, f3, t3}] = sorts
rdir1 = invert(dir1)
operators = derive_operators([dir1, dir2, dir3])
[op1, op2, op3, op4, op5, op6, op7] = operators
[rop1, rop2, rop3, rop4, rop5, rop6, rop7] = Enum.map(operators, &invert/1)
order_bys = Enum.map(sorts, fn {dir, field, _type} -> {dir, field} end)
fields = Enum.map(sorts, fn {_dir, field, _type} -> field end)
quote do
def beyond_cursor(query, unquote(query_name), unquote(dir1), :forward, cursor_values) do
[cv1, cv2, cv3] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))) or
(compare(unquote(f1), unquote(op5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op7), cv3, unquote(t3)))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(dir1), :backward, cursor_values) do
[cv1, cv2, cv3] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))) or
(compare(unquote(f1), unquote(rop5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop7), cv3, unquote(t3)))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :forward, cursor_values) do
[cv1, cv2, cv3] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))) or
(compare(unquote(f1), unquote(rop5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop7), cv3, unquote(t3)))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :backward, cursor_values) do
[cv1, cv2, cv3] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))) or
(compare(unquote(f1), unquote(op5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op7), cv3, unquote(t3)))))
)
end
unquote(with_order_func(query_name, dir1, order_bys))
unquote(with_cursor_fields_func(query_name, fields))
end
end
def implement(query_name, sorts) when length(sorts) == 4 do
[{dir1, f1, t1}, {dir2, f2, t2}, {dir3, f3, t3}, {dir4, f4, t4}] = sorts
rdir1 = invert(dir1)
order_bys = Enum.map(sorts, fn {dir, field, _type} -> {dir, field} end)
fields = Enum.map(sorts, fn {_dir, field, _type} -> field end)
operators = derive_operators([dir1, dir2, dir3, dir4])
[op1, op2, op3, op4, op5, op6, op7, op8, op9, op10, op11] = operators
[rop1, rop2, rop3, rop4, rop5, rop6, rop7, rop8, rop9, rop10, rop11] =
Enum.map(operators, &invert/1)
quote do
def beyond_cursor(query, unquote(query_name), unquote(dir1), :forward, cursor_values) do
[cv1, cv2, cv3, cv4] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))) or
((compare(unquote(f1), unquote(op5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op7), cv3, unquote(t3))) or
(compare(unquote(f1), unquote(op8), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op9), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op10), cv3, unquote(t3)) and
compare(unquote(f4), unquote(op11), cv4, unquote(t4))))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(dir1), :backward, cursor_values) do
[cv1, cv2, cv3, cv4] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))) or
((compare(unquote(f1), unquote(rop5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop7), cv3, unquote(t3))) or
(compare(unquote(f1), unquote(rop8), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop9), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop10), cv3, unquote(t3)) and
compare(unquote(f4), unquote(rop11), cv4, unquote(t4))))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :forward, cursor_values) do
[cv1, cv2, cv3, cv4] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(rop1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(rop2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(rop3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop4), cv2, unquote(t2))) or
((compare(unquote(f1), unquote(rop5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop7), cv3, unquote(t3))) or
(compare(unquote(f1), unquote(rop8), cv1, unquote(t1)) and
compare(unquote(f2), unquote(rop9), cv2, unquote(t2)) and
compare(unquote(f3), unquote(rop10), cv3, unquote(t3)) and
compare(unquote(f4), unquote(rop11), cv4, unquote(t4))))))
)
end
def beyond_cursor(query, unquote(query_name), unquote(rdir1), :backward, cursor_values) do
[cv1, cv2, cv3, cv4] = cursor_values
query
|> Ecto.Query.where(
compare(unquote(f1), unquote(op1), cv1, unquote(t1)) and
(compare(unquote(f1), unquote(op2), cv1, unquote(t1)) or
((compare(unquote(f1), unquote(op3), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op4), cv2, unquote(t2))) or
((compare(unquote(f1), unquote(op5), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op6), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op7), cv3, unquote(t3))) or
(compare(unquote(f1), unquote(op8), cv1, unquote(t1)) and
compare(unquote(f2), unquote(op9), cv2, unquote(t2)) and
compare(unquote(f3), unquote(op10), cv3, unquote(t3)) and
compare(unquote(f4), unquote(op11), cv4, unquote(t4))))))
)
end
unquote(with_order_func(query_name, dir1, order_bys))
unquote(with_cursor_fields_func(query_name, fields))
end
end
@doc false
def derive_operators([dir1]) do
[
comparison_operator(dir1)
]
end
def derive_operators([dir1, dir2]) do
[
index_friendly_comparison_operator(dir1),
comparison_operator(dir1),
:eq,
comparison_operator(dir2)
]
end
def derive_operators([dir1, dir2, dir3]) do
[
index_friendly_comparison_operator(dir1),
comparison_operator(dir1),
:eq,
comparison_operator(dir2),
:eq,
:eq,
comparison_operator(dir3)
]
end
def derive_operators([dir1, dir2, dir3, dir4]) do
[
index_friendly_comparison_operator(dir1),
comparison_operator(dir1),
:eq,
comparison_operator(dir2),
:eq,
:eq,
comparison_operator(dir3),
:eq,
:eq,
:eq,
comparison_operator(dir4)
]
end
@doc false
def invert(:asc), do: :desc
def invert(:desc), do: :asc
def invert(:eq), do: :eq
def invert(:gt), do: :lt
def invert(:gte), do: :lte
def invert(:lt), do: :gt
def invert(:lte), do: :gte
@doc false
def index_friendly_comparison_operator(:asc), do: :gte
def index_friendly_comparison_operator(:desc), do: :lte
@doc false
def comparison_operator(:asc), do: :gt
def comparison_operator(:desc), do: :lt
@doc false
defmacro compare(field, :gte, value, nil) do
quote do: unquote(field) >= ^unquote(value)
end
defmacro compare(field, :gte, value, type) do
quote do: unquote(field) >= type(^unquote(value), unquote(type))
end
defmacro compare(field, :gt, value, nil) do
quote do: unquote(field) > ^unquote(value)
end
defmacro compare(field, :gt, value, type) do
quote do: unquote(field) > type(^unquote(value), unquote(type))
end
defmacro compare(field, :eq, value, nil) do
quote do: unquote(field) == ^unquote(value)
end
defmacro compare(field, :eq, value, type) do
quote do: unquote(field) == type(^unquote(value), unquote(type))
end
defmacro compare(field, :lt, value, nil) do
quote do: unquote(field) < ^unquote(value)
end
defmacro compare(field, :lt, value, type) do
quote do: unquote(field) < type(^unquote(value), unquote(type))
end
defmacro compare(field, :lte, value, nil) do
quote do: unquote(field) <= ^unquote(value)
end
defmacro compare(field, :lte, value, type) do
quote do: unquote(field) <= type(^unquote(value), unquote(type))
end
end
| 38.815891 | 100 | 0.621049 |
1cc0aa124c7ef16235e447a503ccfcdc64cd2965 | 2,229 | ex | Elixir | lib/hui/query/facet.ex | niccolox/hui | d0eecd0176d1bf3c9ff4075ab1c6048e3d1c25b3 | [
"Apache-2.0"
] | null | null | null | lib/hui/query/facet.ex | niccolox/hui | d0eecd0176d1bf3c9ff4075ab1c6048e3d1c25b3 | [
"Apache-2.0"
] | null | null | null | lib/hui/query/facet.ex | niccolox/hui | d0eecd0176d1bf3c9ff4075ab1c6048e3d1c25b3 | [
"Apache-2.0"
] | null | null | null | defmodule Hui.Query.Facet do
@moduledoc """
Struct related to [faceting](http://lucene.apache.org/solr/guide/faceting.html).
### Example
iex> x = %Hui.Query.Facet{field: ["type", "year"], query: "year:[2000 TO NOW]"}
%Hui.Query.Facet{
contains: nil,
"contains.ignoreCase": nil,
"enum.cache.minDf": nil,
excludeTerms: nil,
exists: nil,
facet: true,
field: ["type", "year"],
interval: nil,
limit: nil,
matches: nil,
method: nil,
mincount: nil,
missing: nil,
offset: nil,
"overrequest.count": nil,
"overrequest.ratio": nil,
pivot: [],
"pivot.mincount": nil,
prefix: nil,
query: "year:[2000 TO NOW]",
range: nil,
sort: nil,
threads: nil
}
iex> x |> Hui.Encoder.encode
"facet=true&facet.field=type&facet.field=year&facet.query=year%3A%5B2000+TO+NOW%5D"
"""
defstruct [facet: true, field: [], query: []]
++ [:"pivot.mincount", pivot: []]
++ [:prefix, :contains, :"contains.ignoreCase", :matches]
++ [:sort, :limit, :offset, :mincount,
:missing, :method, :"enum.cache.minDf", :exists]
++ [:excludeTerms, :"overrequest.count", :"overrequest.ratio",
:threads]
++ [:interval, :range]
@typedoc """
Struct for faceting.
"""
@type t :: %__MODULE__{facet: boolean, field: binary | list(binary), query: binary | list(binary),
"pivot.mincount": number, pivot: binary | list(binary),
prefix: binary, contains: binary, "contains.ignoreCase": binary, matches: binary,
sort: binary, limit: number, offset: number, mincount: number,
missing: boolean, method: binary, "enum.cache.minDf": number, exists: boolean,
excludeTerms: binary, "overrequest.count": number, "overrequest.ratio": number,
threads: binary,
interval: Hui.Query.FacetInterval.t | list(Hui.Query.FacetInterval.t),
range: Hui.Query.FacetRange.t | list(Hui.Query.FacetRange.t)}
end | 38.431034 | 106 | 0.543742 |
1cc0ea21cf2e605ed1fbeb0611d0b525b524f511 | 1,712 | ex | Elixir | clients/tool_results/lib/google_api/tool_results/v1beta3/model/project_settings.ex | GoNZooo/elixir-google-api | cf3ad7392921177f68091f3d9001f1b01b92f1cc | [
"Apache-2.0"
] | null | null | null | clients/tool_results/lib/google_api/tool_results/v1beta3/model/project_settings.ex | GoNZooo/elixir-google-api | cf3ad7392921177f68091f3d9001f1b01b92f1cc | [
"Apache-2.0"
] | null | null | null | clients/tool_results/lib/google_api/tool_results/v1beta3/model/project_settings.ex | GoNZooo/elixir-google-api | cf3ad7392921177f68091f3d9001f1b01b92f1cc | [
"Apache-2.0"
] | 1 | 2018-07-28T20:50:50.000Z | 2018-07-28T20:50:50.000Z | # Copyright 2017 Google Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This class is auto generated by the swagger code generator program.
# https://github.com/swagger-api/swagger-codegen.git
# Do not edit the class manually.
defmodule GoogleApi.ToolResults.V1beta3.Model.ProjectSettings do
@moduledoc """
Per-project settings for the Tool Results service.
## Attributes
- defaultBucket (String): The name of the Google Cloud Storage bucket to which results are written. By default, this is unset. In update request: optional In response: optional Defaults to: `null`.
- name (String): The name of the project's settings. Always of the form: projects/{project-id}/settings In update request: never set In response: always set Defaults to: `null`.
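
  A minimal illustration of the struct (field values are made up):

  ```elixir
  %GoogleApi.ToolResults.V1beta3.Model.ProjectSettings{
    defaultBucket: "my-results-bucket",
    name: "projects/my-project/settings"
  }
  ```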
"""
defstruct [
:"defaultBucket",
:"name"
]
end
defimpl Poison.Decoder, for: GoogleApi.ToolResults.V1beta3.Model.ProjectSettings do
def decode(value, _options) do
value
end
end
defimpl Poison.Encoder, for: GoogleApi.ToolResults.V1beta3.Model.ProjectSettings do
def encode(value, options) do
GoogleApi.ToolResults.V1beta3.Deserializer.serialize_non_nil(value, options)
end
end
| 35.666667 | 201 | 0.755841 |
1cc0f44154f96a515380f8d2158041cbb38a2309 | 13,989 | ex | Elixir | lib/merkle/sparse_merkle.ex | rkstarnerd/bargad | 82886c4934042622498401302024abeab3f0df85 | [
"Apache-2.0"
] | 50 | 2018-10-09T17:21:08.000Z | 2022-01-31T19:48:44.000Z | lib/merkle/sparse_merkle.ex | rkstarnerd/bargad | 82886c4934042622498401302024abeab3f0df85 | [
"Apache-2.0"
] | 8 | 2018-10-09T16:18:10.000Z | 2020-05-18T00:25:08.000Z | lib/merkle/sparse_merkle.ex | rkstarnerd/bargad | 82886c4934042622498401302024abeab3f0df85 | [
"Apache-2.0"
] | 8 | 2018-10-07T13:38:22.000Z | 2020-05-06T19:59:59.000Z | # Copyright 2018 Faraz Haider. All Rights Reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule Bargad.SparseMerkle do
use Bitwise
# insertion in empty tree
@spec insert(Bargad.Types.tree, binary, binary) :: Bargad.Types.tree
def insert(tree = %Bargad.Trees.Tree{size: 0}, k, v) do
root = Bargad.Utils.make_map_node(tree, k, v)
Bargad.Utils.set_node(tree, root.hash, root)
Map.put(tree, :root, root.hash) |> Map.put(:size, 1)
end
# insertion in non empty tree
@spec insert(Bargad.Types.tree, binary, binary) :: Bargad.Types.tree
def insert(tree = %Bargad.Trees.Tree{root: root, size: size}, k, v) do
root = Bargad.Utils.get_node(tree, root)
new_root = do_insert(tree, root, k, v)
    # don't delete the root when the tree contains only one node (the root itself is a leaf)
if tree.size > 1 do
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
end
Map.put(tree, :root, new_root.hash) |> Map.put(:size, size + 1)
end
defp do_insert(tree, root = %Bargad.Nodes.Node{children: [left, right]}, k, v) do
left = Bargad.Utils.get_node(tree, left)
right = Bargad.Utils.get_node(tree, right)
l_dist = distance(k, left.key)
r_dist = distance(k, right.key)
# checks if the key to be inserted falls in the left subtree or the right subtree
cond do
l_dist == r_dist ->
        # equal distances mean the key belongs at a new level above this subtree,
        # so we must decide whether the new leaf becomes a left or right child:
        # if the new key is smaller than the minimum key at this level it goes
        # to the left, otherwise to the right
        # make a new leaf at the new level
new_leaf = Bargad.Utils.make_map_node(tree, k, v)
Bargad.Utils.set_node(tree, new_leaf.hash, new_leaf)
min_key = min(left.key, right.key)
if k < min_key do
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
# make new leaf as left child at the new level
new_root = Bargad.Utils.make_map_node(tree, new_leaf, root)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
else
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
# make new leaf as right child at the new level
new_root = Bargad.Utils.make_map_node(tree, root, new_leaf)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
end
l_dist < r_dist ->
# Going towards left child
left = do_insert(tree, left, k, v)
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
new_root = Bargad.Utils.make_map_node(tree, left, right)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
l_dist > r_dist ->
# Going towards right child
right = do_insert(tree, right, k, v)
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
new_root = Bargad.Utils.make_map_node(tree, left, right)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
end
end
defp do_insert(tree, leaf = %Bargad.Nodes.Node{children: [], metadata: _, key: key}, k, v) do
new_leaf = Bargad.Utils.make_map_node(tree, k, v)
Bargad.Utils.set_node(tree, new_leaf.hash, new_leaf)
# reached leaf node level
# after reaching the level where the new key is to be inserted,
# make a new node comprising the existing key and the new one
# if the new key is bigger than the existing one, the new leaf becomes the right child of the resulting node
cond do
k == key -> raise "key exists"
k > key ->
# new key will be right child
new_root = Bargad.Utils.make_map_node(tree, leaf, new_leaf)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
k < key ->
# new key will be left child
new_root = Bargad.Utils.make_map_node(tree, new_leaf, leaf)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
end
end
@spec get_with_inclusion_proof!(Bargad.Types.tree, binary) :: Bargad.Types.audit_proof
def get_with_inclusion_proof!(tree = %Bargad.Trees.Tree{root: root}, k) do
root = Bargad.Utils.get_node(tree, root)
result = do_get_with_inclusion_proof(tree, nil, nil, root, k)
case result do
# membership proof case
[{_, _} | _] ->
[{value, hash} | proof] = Enum.reverse(result)
%{key: k, value: value, hash: hash, proof: proof}
# Edge Case 1 for non-membership proof
[key, :MINRS] -> [get_with_inclusion_proof!(tree, key), nil]
# Edge Case 2 for non-membership proof
[:MAXLS, key] -> [nil, get_with_inclusion_proof!(tree, key)]
# When a key is bounded by two keys in case of non-membership proof
[key1, key2] ->
[get_with_inclusion_proof!(tree, key1),
get_with_inclusion_proof!(tree, key2)]
end
end
# this clause matches only once, at the root, when the traversal starts
defp do_get_with_inclusion_proof(tree, nil, nil, root = %Bargad.Nodes.Node{children: [left, right]}, k) do
left = Bargad.Utils.get_node(tree, left)
right = Bargad.Utils.get_node(tree, right)
l_dist = distance(k, left.key)
r_dist = distance(k, right.key)
cond do
l_dist == r_dist ->
case k > root.key do
true -> [right.key, :MINRS]
_ -> [:MAXLS, left.key]
end
l_dist < r_dist ->
# Going towards left child
do_get_with_inclusion_proof(tree, right, "L", left, k)
l_dist > r_dist ->
# Going towards right child
do_get_with_inclusion_proof(tree, left, "R", right, k)
end
end
defp do_get_with_inclusion_proof(tree, sibling, direction, %Bargad.Nodes.Node{hash: salted_hash, children: [], metadata: value, key: key}, k) do
if key == k do
[{sibling.hash, rev_dir(direction)}, {value, salted_hash}]
else
# Find the non membership proof otherwise
get_non_inclusion_proof({tree, k, key, direction, sibling})
# raise "key does not exist"
end
end
defp do_get_with_inclusion_proof(tree, sibling, direction, root = %Bargad.Nodes.Node{children: [left, right]}, k) do
left = Bargad.Utils.get_node(tree, left)
right = Bargad.Utils.get_node(tree, right)
l_dist = distance(k, left.key)
r_dist = distance(k, right.key)
cond do
l_dist == r_dist ->
# Find the non membership proof otherwise
get_non_inclusion_proof({tree, k, root.key, direction, sibling})
# raise "key does not exist"
l_dist < r_dist ->
# Going towards left child
result = do_get_with_inclusion_proof(tree, right, "L", left, k)
case { result, direction } do
# membership proof case
{[{_, _} | _], _} -> [{sibling.hash, rev_dir(direction)} | result]
{[key, :MINRS], "L"} -> [key, min_in_subtree(tree, sibling)]
{[:MAXLS, key], "R"} -> [max_in_subtree(sibling), key]
_ -> result
end
l_dist > r_dist ->
# Going towards right child
result = do_get_with_inclusion_proof(tree, left, "R", right, k)
case { result, direction } do
# membership proof case
{[{_, _} | _], _} -> [{sibling.hash, rev_dir(direction)} | result]
{[key, :MINRS], "L"} -> [key, min_in_subtree(tree, sibling)]
{[:MAXLS, key], "R"} -> [max_in_subtree(sibling), key]
_ -> result
end
end
end
defp get_non_inclusion_proof({tree, k, key, direction, sibling}) do
case [k > key, direction] do
[true, "L"] -> [key, min_in_subtree(tree, sibling)]
[true, "R"] -> [key, :MINRS]
[false, "L"] -> [:MAXLS, key]
[false, "R"] -> [max_in_subtree(sibling), key]
end
end
defp min_in_subtree(tree, %Bargad.Nodes.Node{children: [left, _]}) do
min_in_subtree(tree, Bargad.Utils.get_node(tree, left))
end
defp min_in_subtree(_, %Bargad.Nodes.Node{children: [], key: key}) do
key
end
defp max_in_subtree(root) do
root.key
end
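# Note (added annotation, not in the original source): this clause appears to
# rely on the invariant that an internal node's key is the maximum key in its
# subtree, which is why no descent is needed here, unlike min_in_subtree/2.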
@spec delete!(Bargad.Types.tree, binary) :: Bargad.Types.tree
def delete!(tree = %Bargad.Trees.Tree{root: root, size: size}, k) do
root = Bargad.Utils.get_node(tree, root)
new_root = do_delete(tree, root, k)
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
Map.put(tree, :root, new_root.hash) |> Map.put(:size, size - 1)
end
defp do_delete(tree, root = %Bargad.Nodes.Node{children: [left, right]}, k) do
left = Bargad.Utils.get_node(tree, left)
right = Bargad.Utils.get_node(tree, right)
if check_for_leaf(left, right, k) do
if left.key == k do
# deletes the target key
Bargad.Utils.delete_node(tree, left.hash)
right
else
# deletes the target key
Bargad.Utils.delete_node(tree, right.hash)
left
end
else
l_dist = distance(k, left.key)
r_dist = distance(k, right.key)
cond do
l_dist == r_dist ->
raise "key does not exist"
l_dist < r_dist ->
# Going towards left child
left = do_delete(tree, left, k)
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
new_root = Bargad.Utils.make_map_node(tree, left, right)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
l_dist > r_dist ->
# Going towards right child
right = do_delete(tree, right, k)
# deletes the existing root from the storage as there would be a new root
Bargad.Utils.delete_node(tree, root.hash)
new_root = Bargad.Utils.make_map_node(tree, left, right)
Bargad.Utils.set_node(tree, new_root.hash, new_root)
new_root
end
end
end
## Check if this would ever be called, if not then remove it.
defp do_delete(_, %Bargad.Nodes.Node{children: [], key: key}, k) do
if key == k do
IO.puts "found a key here"
else
raise "key does not exist"
end
end
defp check_for_leaf(left, right, k) do
(left.size == 1 && left.key == k) || (right.size == 1 && right.key == k)
end
def audit_tree(tree) do
root = Bargad.Utils.get_node(tree, tree.root)
do_audit_tree(tree, root, []) |> List.flatten
end
defp do_audit_tree(tree, %Bargad.Nodes.Node{children: [left, right]}, acc) do
left = Bargad.Utils.get_node(tree, left)
right = Bargad.Utils.get_node(tree, right)
[do_audit_tree(tree, left, ["L" | acc])] ++
[do_audit_tree(tree, right, ["R" | acc])]
end
defp do_audit_tree(_, %Bargad.Nodes.Node{children: [], metadata: m}, acc) do
[m | acc] |> Enum.reverse |> List.to_tuple
end
defp distance(x, y) do
x = x |> Base.encode16 |> Integer.parse(16) |> elem(0)
y = y |> Base.encode16 |> Integer.parse(16) |> elem(0)
result = bxor(x, y)
# XOR of a key with itself is zero, which happens when someone tries to insert an existing key
# log2 is undefined for 0, so we return a negative value to indicate the minimum possible distance
if result == 0 do
-1
else
result = result |> :math.log2 |> trunc
# trunc(log2(result)) is zero-based, so add 1 to get the 1-based bit position
result + 1
end
end
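# Illustrative walk-through of the metric above (hypothetical keys; a sketch
# added for clarity rather than part of the original module):
#
#     distance(<<4>>, <<6>>)
#     # bxor(4, 6)                #=> 2 (binary 10)
#     # 2 |> :math.log2() |> trunc() #=> 1
#     # 1 + 1                     #=> 2
#
# i.e. the distance is the 1-based position of the most significant differing
# bit, and identical keys map to -1.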
defp rev_dir(dir) do
case dir do
"L" -> "R"
_ -> "L"
end
end
end | 40.314121 | 155 | 0.563586 |
1cc10910c43055b376d59c1024a4837d10ab2ab2 | 1,313 | ex | Elixir | lib/ido_keido/geolocation.ex | isabella232/ido_keido | 871fd5c8b294fdc89e31e8dba173f2a5464de399 | [
"Apache-2.0"
] | 6 | 2019-11-15T20:34:47.000Z | 2021-04-05T03:15:23.000Z | lib/ido_keido/geolocation.ex | FindHotel/ido_keido | 871fd5c8b294fdc89e31e8dba173f2a5464de399 | [
"Apache-2.0"
] | 1 | 2022-03-20T09:34:42.000Z | 2022-03-20T09:34:42.000Z | lib/ido_keido/geolocation.ex | isabella232/ido_keido | 871fd5c8b294fdc89e31e8dba173f2a5464de399 | [
"Apache-2.0"
] | 3 | 2021-06-09T05:58:09.000Z | 2022-03-20T07:05:13.000Z | defmodule IdoKeido.GeolocationContext do
@moduledoc """
Geolocation context behaviour.
"""
@doc """
Find city geolocation data by IP.
"""
@callback city(ip :: String.t()) :: map() | nil
@doc """
Find country geolocation data by IP.
"""
@callback country(ip :: String.t()) :: map() | nil
end
defmodule IdoKeido.Geolocation do
@moduledoc """
Implementation of `IdoKeido.GeolocationContext` behaviour.
"""
@behaviour IdoKeido.GeolocationContext
alias IdoKeido.{Cache, CityResult, CountryResult, Parser}
@default_locale :en
@repo Application.get_env(:ido_keido, :injections)[:repo]
@impl true
def city(ip) when is_binary(ip), do: get(ip, :city)
@impl true
def country(ip) when is_binary(ip), do: get(ip, :country)
@spec cache_key(String.t(), atom()) :: String.t()
defp cache_key(ip, type) do
"#{type}_#{ip}"
end
@spec get(String.t(), atom()) :: map() | nil
defp get(ip, type) do
ip
|> cache_key(type)
|> Cache.fetch(fn -> get_from_db(ip, type) end)
end
@spec get_from_db(String.t(), atom()) :: CityResult.t() | CountryResult.t() | nil
defp get_from_db(ip, type) do
ip
|> @repo.lookup(locale: @default_locale, where: type)
|> case do
nil -> nil
result -> apply(Parser, type, [result, ip])
end
end
end
| 23.035088 | 83 | 0.639756 |
1cc10feb7d9d0cd8f0f0153758d8248a7ecf279d | 3,252 | ex | Elixir | clients/container/lib/google_api/container/v1/model/set_labels_request.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | clients/container/lib/google_api/container/v1/model/set_labels_request.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | clients/container/lib/google_api/container/v1/model/set_labels_request.ex | kolorahl/elixir-google-api | 46bec1e092eb84c6a79d06c72016cb1a13777fa6 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Container.V1.Model.SetLabelsRequest do
@moduledoc """
SetLabelsRequest sets the Google Cloud Platform labels on a Google Container
Engine cluster, which will in turn set them for Google Compute Engine
resources used by that cluster
## Attributes
* `clusterId` (*type:* `String.t`, *default:* `nil`) - Deprecated. The name of the cluster.
This field has been deprecated and replaced by the name field.
* `labelFingerprint` (*type:* `String.t`, *default:* `nil`) - Required. The fingerprint of the previous set of labels for this resource,
used to detect conflicts. The fingerprint is initially generated by
Kubernetes Engine and changes after every request to modify or update
labels. You must always provide an up-to-date fingerprint hash when
updating or changing labels. Make a <code>get()</code> request to the
resource to get the latest fingerprint.
* `name` (*type:* `String.t`, *default:* `nil`) - The name (project, location, cluster id) of the cluster to set labels.
Specified in the format `projects/*/locations/*/clusters/*`.
* `projectId` (*type:* `String.t`, *default:* `nil`) - Deprecated. The Google Developers Console [project ID or project
number](https://developers.google.com/console/help/new/#projectnumber).
This field has been deprecated and replaced by the name field.
* `resourceLabels` (*type:* `map()`, *default:* `nil`) - Required. The labels to set for that cluster.
* `zone` (*type:* `String.t`, *default:* `nil`) - Deprecated. The name of the Google Compute Engine
[zone](https://cloud.google.com/compute/docs/zones#available) in which the
cluster resides. This field has been deprecated and replaced by the name
field.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:clusterId => String.t(),
:labelFingerprint => String.t(),
:name => String.t(),
:projectId => String.t(),
:resourceLabels => map(),
:zone => String.t()
}
field(:clusterId)
field(:labelFingerprint)
field(:name)
field(:projectId)
field(:resourceLabels, type: :map)
field(:zone)
end
defimpl Poison.Decoder, for: GoogleApi.Container.V1.Model.SetLabelsRequest do
def decode(value, options) do
GoogleApi.Container.V1.Model.SetLabelsRequest.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Container.V1.Model.SetLabelsRequest do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 42.789474 | 140 | 0.705412 |
1cc11a8ad2d751e2bf09e91a10be7823f56510b2 | 8,975 | ex | Elixir | lib/prom_ex/plugins/ecto.ex | treble37/prom_ex | f5bb45279cf9576b1c4533de0ad555d3b2987f76 | [
"MIT"
] | null | null | null | lib/prom_ex/plugins/ecto.ex | treble37/prom_ex | f5bb45279cf9576b1c4533de0ad555d3b2987f76 | [
"MIT"
] | null | null | null | lib/prom_ex/plugins/ecto.ex | treble37/prom_ex | f5bb45279cf9576b1c4533de0ad555d3b2987f76 | [
"MIT"
] | null | null | null | if Code.ensure_loaded?(Ecto) do
defmodule PromEx.Plugins.Ecto do
@moduledoc """
This plugin captures metrics emitted by Ecto. Be sure that your PromEx module is listed before your Repo module
in your supervision tree so that the Ecto init events are not missed. If you miss those events the dashboard
variable dropdowns for the repo value will be broken.
This plugin supports the following options:
- `otp_app`: This is a REQUIRED option and is the name of you application in snake case (e.g. :my_cool_app).
- `repos`: This is an OPTIONAL option and is a list with the full module name of your Ecto Repos (e.g [MyApp.Repo]).
If you do not provide this value, PromEx will attempt to resolve your Repo modules via the
`:ecto_repos` configuration on your OTP app.
This plugin exposes the following metric groups:
- `:ecto_init_event_metrics`
- `:ecto_query_event_metrics`
To use plugin in your application, add the following to your PromEx module `plugins/0` function:
```
def plugins do
[
...
{PromEx.Plugins.Ecto, otp_app: :web_app, repos: [WebApp.Repo]}
]
end
```
"""
use PromEx.Plugin
require Logger
@init_event [:ecto, :repo, :init]
@query_event [:prom_ex, :plugin, :ecto, :query]
@impl true
def event_metrics(opts) do
otp_app = Keyword.fetch!(opts, :otp_app)
metric_prefix = PromEx.metric_prefix(otp_app, :ecto)
repo_event_prefixes =
opts
|> Keyword.get_lazy(:repos, fn ->
Application.get_env(otp_app, :ecto_repos)
end)
|> Enum.map(fn repo ->
otp_app
|> Application.get_env(repo)
|> Keyword.get_lazy(:telemetry_prefix, fn ->
telemetry_prefix(repo)
end)
end)
# Telemetry metrics will emit warnings if multiple handlers with the same names are defined.
# As a result, this plugin supports gathering metrics on multiple repos, but needs to proxy
# them as not to create multiple definitions of the same metrics. The final data point will
# have a label for the Repo associated with the event though so you'll be able to separate one
# repos measurements from another.
set_up_telemetry_proxy(repo_event_prefixes)
# Event metrics definitions
[
init_metrics(metric_prefix),
query_metrics(metric_prefix)
]
end
@doc false
def handle_proxy_query_event(_event_name, event_measurement, event_metadata, _config) do
:telemetry.execute(@query_event, event_measurement, event_metadata)
end
# Generate the default telemetry prefix
defp telemetry_prefix(repo) do
repo
|> Module.split()
|> Enum.map(&(&1 |> Macro.underscore() |> String.to_atom()))
end
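# Illustrative example of the default prefix (the repo module name here is
# hypothetical): telemetry_prefix(MyApp.Repo) #=> [:my_app, :repo]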
defp init_metrics(metric_prefix) do
Event.build(
:ecto_init_event_metrics,
[
last_value(
metric_prefix ++ [:repo, :init, :status, :info],
event_name: @init_event,
description: "Information regarding the initialized repo.",
measurement: fn _measurements -> 1 end,
tags: [:repo, :database_name, :database_host],
tag_values: &ecto_init_tag_values/1
),
last_value(
metric_prefix ++ [:repo, :init, :pool, :size],
event_name: @init_event,
description: "The configured pool size value for the repo.",
measurement: fn _measurements, %{opts: opts} ->
Keyword.get(opts, :pool_size)
end,
tags: [:repo],
tag_values: &ecto_init_tag_values/1
),
last_value(
metric_prefix ++ [:repo, :init, :timeout, :duration],
event_name: @init_event,
description: "The configured timeout value for the repo.",
measurement: fn _measurements, %{opts: opts} ->
Keyword.get(opts, :timeout)
end,
tags: [:repo],
tag_values: &ecto_init_tag_values/1
)
]
)
end
defp query_metrics(metric_prefix) do
Event.build(
:ecto_query_event_metrics,
[
# Capture the db connection idle time
distribution(
metric_prefix ++ [:repo, :query, :idle, :time, :milliseconds],
event_name: @query_event,
measurement: :idle_time,
description: "The time the connection spent waiting before being checked out for the query.",
tags: [:repo],
tag_values: &ecto_query_tag_values/1,
reporter_options: [
buckets: [1, 10, 50, 100, 500, 1_000, 5_000, 10_000]
],
unit: {:native, :millisecond}
),
# Capture the db connection queue time
distribution(
metric_prefix ++ [:repo, :query, :queue, :time, :milliseconds],
event_name: @query_event,
measurement: :queue_time,
description: "The time spent waiting to check out a database connection.",
tags: [:repo],
tag_values: &ecto_query_tag_values/1,
reporter_options: [
buckets: [1, 10, 50, 100, 500, 1_000, 5_000, 10_000]
],
unit: {:native, :millisecond}
),
# Capture the db query decode time
distribution(
metric_prefix ++ [:repo, :query, :decode, :time, :milliseconds],
event_name: @query_event,
measurement: :decode_time,
description: "The time spent decoding the data received from the database.",
tags: [:repo],
tag_values: &ecto_query_tag_values/1,
reporter_options: [
buckets: [1, 10, 50, 100, 500, 1_000, 5_000, 10_000]
],
unit: {:native, :millisecond}
),
# Capture the query execution time
distribution(
metric_prefix ++ [:repo, :query, :execution, :time, :milliseconds],
event_name: @query_event,
measurement: :query_time,
description: "The time spent executing the query.",
tags: [:repo, :source, :command],
tag_values: &ecto_query_tag_values/1,
reporter_options: [
buckets: [1, 10, 50, 100, 500, 1_000, 5_000, 10_000]
],
unit: {:native, :millisecond}
),
# Capture the number of results returned
distribution(
metric_prefix ++ [:repo, :query, :results, :returned],
event_name: @query_event,
measurement: fn _measurement, %{result: result} ->
normalize_results_returned(result)
end,
description: "The time spent executing the query.",
tags: [:repo, :source, :command],
tag_values: &ecto_query_tag_values/1,
reporter_options: [
buckets: [1, 10, 50, 100, 250, 500, 1_000, 5_000]
],
drop: fn %{result: result} ->
normalize_results_returned(result) == :drop_data_point
end
)
]
)
end
defp set_up_telemetry_proxy(repo_event_prefixes) do
repo_event_prefixes
|> Enum.each(fn telemetry_prefix ->
query_event = telemetry_prefix ++ [:query]
:telemetry.attach(
[:prom_ex, :ecto, :proxy] ++ telemetry_prefix,
query_event,
&__MODULE__.handle_proxy_query_event/4,
%{}
)
end)
end
defp ecto_init_tag_values(%{repo: repo, opts: opts}) do
%{
repo: repo |> Atom.to_string() |> String.trim_leading("Elixir."),
database_name: Keyword.get(opts, :database),
database_host: Keyword.get(opts, :hostname)
}
end
defp ecto_query_tag_values(%{repo: repo, source: source, result: result}) do
%{
repo: repo |> Atom.to_string() |> String.trim_leading("Elixir."),
source: normalize_source(source),
command: normalize_command(result)
}
end
defp normalize_source(nil), do: "source_unavailable"
defp normalize_source(source) when is_binary(source), do: source
defp normalize_source(source) when is_atom(source), do: Atom.to_string(source)
defp normalize_source(_), do: "source_unavailable"
defp normalize_command({:ok, %_{command: command}}) when is_atom(command) do
Atom.to_string(command)
end
defp normalize_command(_) do
"unavailable"
end
defp normalize_results_returned({:ok, %_{num_rows: num_row}}) when is_integer(num_row) do
num_row
end
defp normalize_results_returned(_) do
:drop_data_point
end
end
else
defmodule PromEx.Plugins.Ecto do
@moduledoc false
use PromEx.Plugin
@impl true
def event_metrics(_opts) do
PromEx.Plugin.no_dep_raise(__MODULE__, "Ecto")
end
end
end
| 34.255725 | 120 | 0.598663 |
1cc162613be57efc41c458234aa3c726fd06b8a2 | 247 | ex | Elixir | lib/rumbl.ex | brunorafa/rumbl | 910e6ecfaae8da8e54da9e67871a02885c2f383f | [
"MIT"
] | 1 | 2021-05-30T20:57:51.000Z | 2021-05-30T20:57:51.000Z | lib/rumbl.ex | brunorafa/rumbl | 910e6ecfaae8da8e54da9e67871a02885c2f383f | [
"MIT"
] | 2 | 2021-03-09T19:04:16.000Z | 2021-05-10T16:20:10.000Z | lib/rumbl.ex | brunorafa/rumbl | 910e6ecfaae8da8e54da9e67871a02885c2f383f | [
"MIT"
] | 1 | 2020-07-17T14:48:52.000Z | 2020-07-17T14:48:52.000Z | defmodule Rumbl do
@moduledoc """
Rumbl keeps the contexts that define your domain
and business logic.
Contexts are also responsible for managing your data, regardless
if it comes from the database, an external API or others.
"""
end
| 24.7 | 66 | 0.748988 |
1cc1658fb6ad0aa11afa450be4be8faea7d80297 | 10,518 | ex | Elixir | lib/game/command/skills.ex | deep-spaced/ex_venture | 45848abe509620d6d2643b2a780dab01c1ac523b | [
"MIT"
] | null | null | null | lib/game/command/skills.ex | deep-spaced/ex_venture | 45848abe509620d6d2643b2a780dab01c1ac523b | [
"MIT"
] | null | null | null | lib/game/command/skills.ex | deep-spaced/ex_venture | 45848abe509620d6d2643b2a780dab01c1ac523b | [
"MIT"
] | null | null | null | defmodule Game.Command.Skills do
@moduledoc """
Parse out class skills
"""
use Game.Command
alias Game.Character
alias Game.Command
alias Game.Command.Target
alias Game.Effect
alias Game.Experience
alias Game.Hint
alias Game.Item
alias Game.Session.GMCP
alias Game.Skill
alias Game.Skills
@must_be_alive true
commands(["skills"], parse: false)
@impl Game.Command
def help(:topic), do: "Skills"
def help(:short), do: "List out your known skills"
def help(:full) do
"""
List out the skills you know.
To use a skill you must also be targeting something. Optionally pass in
a target after your skill to switch or set a target before using a skill.
List out your known skills:
[ ] > {command}skills{/command}
Some skills will automatically target yourself instead of your real target,
for instance a heal skill will target you before your opponent. You can get
around this by providing a target after the skill command.
Healing yourself:
[ ] > {command}heal{/command}
Healing your target:
[ ] > {command}heal guard{/command}
"""
end
@impl true
def parse(command, _context), do: parse(command)
@impl Game.Command
@doc """
Parse the command into arguments
iex> Game.Command.Skills.parse("skills")
{}
iex> Game.Command.Skills.parse("skills all")
{:all}
iex> Game.Command.Skills.parse("skills hi")
{:error, :bad_parse, "skills hi"}
iex> Game.Command.Skills.parse("unknown")
{:error, :bad_parse, "unknown"}
"""
def parse(command)
def parse("skills"), do: {}
def parse("skills all"), do: {:all}
@doc """
Parse skill specific commands
"""
@spec parse_skill(String.t(), ParseContext.t()) :: Command.t() | {:error, :bad_parse, String.t()}
def parse_skill(command, context)
def parse_skill(command, context) do
%{save: save} = context.player
with {:ok, skill} <- parse_find_skill(command),
{:ok, skill} <- parse_check_skill_enabled(skill),
{:ok, skill} <- parse_check_skill_known(skill, save),
{:ok, skill} <- parse_check_skill_level(skill, save) do
%Command{text: command, module: __MODULE__, args: {skill, command}}
else
{:error, :not_found} ->
{:error, :bad_parse, command}
{:error, :not_enabled, _skill} ->
{:error, :bad_parse, command}
{:error, :not_known, skill} ->
%Command{text: command, module: __MODULE__, args: {skill, :not_known}}
{:error, :level_too_low, skill} ->
%Command{text: command, module: __MODULE__, args: {skill, :level_too_low}}
end
end
defp parse_find_skill(command) do
skill =
Skills.all()
|> Enum.find(fn skill ->
Regex.match?(~r(^#{skill.command}), command)
end)
case skill do
nil ->
{:error, :not_found}
skill ->
{:ok, skill}
end
end
defp parse_check_skill_enabled(skill) do
case skill.is_enabled do
true ->
{:ok, skill}
false ->
{:error, :not_enabled, skill}
end
end
defp parse_check_skill_known(skill, save) do
case skill.id in save.skill_ids do
true ->
{:ok, skill}
false ->
{:error, :not_known, skill}
end
end
defp parse_check_skill_level(skill, save) do
case skill.level <= save.level do
true ->
{:ok, skill}
false ->
{:error, :level_too_low, skill}
end
end
@impl Game.Command
@doc """
Look at your info sheet
"""
def run(command, state)
def run({}, %{socket: socket, save: save}) do
skills =
save.skill_ids
|> Skills.skills()
|> Enum.filter(&(&1.level <= save.level))
|> Enum.filter(&(&1.is_enabled))
|> Enum.sort_by(& &1.level)
socket |> @socket.echo(Format.skills(skills))
end
def run({:all}, %{socket: socket, save: save}) do
skills =
save.skill_ids
|> Skills.skills()
|> Enum.sort_by(& &1.level)
socket |> @socket.echo(Format.skills(skills))
end
def run({%{command: command}, command}, %{socket: socket, target: target})
when is_nil(target) do
socket |> @socket.echo(gettext("You don't have a target."))
end
def run({skill, :level_too_low}, state) do
message = gettext("You are too low of a level to use %{skill}.", skill: Format.skill_name(skill))
state.socket |> @socket.echo(message)
end
def run({skill, :not_known}, state) do
message = gettext("You do not know %{skill}.", skill: Format.skill_name(skill))
state.socket |> @socket.echo(message)
end
def run({skill, command}, state = %{save: %{room_id: room_id}, target: target}) do
new_target =
command
|> String.replace(skill.command, "")
|> String.trim()
{:ok, room} = @environment.look(room_id)
with {:ok, target} <- maybe_replace_target_with_self(state, skill, target),
{:ok, target} <- find_target(state, room, target, new_target),
{:ok, skill} <- check_skill_level(state, skill),
{:ok, skill} <- check_cooldown(state, skill) do
use_skill(skill, target, state)
else
{:error, :not_found} ->
state.socket |> @socket.echo(gettext("Your target could not be found."))
{:error, :skill, :level_too_low} ->
state.socket |> @socket.echo(gettext("You are not high enough level to use this skill."))
{:error, :skill_not_ready, remaining_seconds} ->
message = gettext("%{skill} is not ready yet.", skill: Format.skill_name(skill))
state.socket |> @socket.echo(message)
Hint.gate(state, "skills.cooldown_time", %{remaining_seconds: remaining_seconds})
:ok
end
end
defp maybe_replace_target_with_self(state, skill, target) do
case skill.require_target do
true ->
{:ok, {:player, state.user.id}}
false ->
{:ok, target}
end
end
defp check_skill_level(%{save: save}, skill) do
case skill.level > save.level do
true ->
{:error, :skill, :level_too_low}
false ->
{:ok, skill}
end
end
defp check_cooldown(%{skills: skills}, skill) do
case Map.get(skills, skill.id) do
nil ->
{:ok, skill}
last_used_at ->
difference = Timex.diff(Timex.now(), last_used_at, :milliseconds)
case difference > skill.cooldown_time do
true ->
{:ok, skill}
false ->
remaining_seconds = round((skill.cooldown_time - difference) / 1000)
{:error, :skill_not_ready, remaining_seconds}
end
end
end
defp use_skill(skill, target, state) do
%{socket: socket, user: user, save: save = %{stats: stats}} = state
{state, target} = maybe_change_target(state, skill, target)
case stats |> Skill.pay(skill) do
{:ok, stats} ->
save = %{save | stats: stats}
player_effects = save |> Item.effects_on_player()
effects = Skill.filter_effects(player_effects ++ skill.effects, skill)
effects =
stats
|> Effect.calculate_stats_from_continuous_effects(state)
|> Effect.calculate(effects)
Character.apply_effects(
target,
effects,
{:player, user},
Format.skill_usee(skill, user: {:player, user}, target: target)
)
socket |> @socket.echo(Format.skill_user(skill, {:player, user}, target))
state |> GMCP.skill_state(skill, active: false)
state =
state
|> set_timeout(skill)
|> Map.put(:save, save)
|> track_stat_usage(effects)
{:skip, :prompt, state}
{:error, _} ->
message = gettext(~s(You don't have enough skill points to use "%{skill}".), skill: skill.command)
socket |> @socket.echo(message)
{:update, state}
end
end
defp track_stat_usage(state = %{save: save}, effects) do
save = Experience.track_stat_usage(save, effects)
%{state | save: save}
end
defp maybe_change_target(state, skill, target) do
case skill.require_target do
true ->
{state, target}
false ->
_maybe_change_target(state, target)
end
end
defp _maybe_change_target(state, target) do
case Character.who(target) == state.target do
true ->
{state, target}
false ->
case target do
{:npc, npc} ->
{:update, state} = Target.target_npc(npc, state.socket, state)
{state, target}
{:player, user} ->
{:update, state} = Target.target_player(user, state.socket, state)
{state, target}
end
end
end
@doc """
Find a target from state target
iex> Game.Command.Skills.find_target(%{npcs: []}, {:npc, 1})
{:error, :not_found}
iex> Game.Command.Skills.find_target(%{npcs: [%{id: 1, name: "Bandit"}]}, {:npc, 1})
{:ok, {:npc, %{id: 1, name: "Bandit"}}}
iex> Game.Command.Skills.find_target(%{players: []}, {:player, 1})
{:error, :not_found}
iex> Game.Command.Skills.find_target(%{players: [%{id: 1, name: "Bandit"}]}, {:player, 1})
{:ok, {:player, %{id: 1, name: "Bandit"}}}
iex> Game.Command.Skills.find_target(%{}, %{players: [%{id: 1, name: "Bandit"}], npcs: []}, {:player, 2}, "bandit")
{:ok, {:player, %{id: 1, name: "Bandit"}}}
"""
@spec find_target(map(), Room.t(), Character.t(), String.t()) :: {:ok, Character.t()} | {:error, :not_found}
def find_target(state, room, target, new_target \\ "")
def find_target(state, room, character, new_target) when new_target != "" do
case Target.find_target(state, new_target, room.players, room.npcs) do
nil ->
find_target(room, character)
target ->
{:ok, target}
end
end
def find_target(_state, room, character, _), do: find_target(room, character)
def find_target(%{npcs: npcs}, {:npc, id}) do
case Enum.find(npcs, &(&1.id == id)) do
nil ->
{:error, :not_found}
npc ->
{:ok, {:npc, npc}}
end
end
def find_target(%{players: players}, {:player, id}) do
case Enum.find(players, &(&1.id == id)) do
nil ->
{:error, :not_found}
player ->
{:ok, {:player, player}}
end
end
def find_target(_, _), do: {:error, :not_found}
defp set_timeout(state, skill) do
Process.send_after(self(), {:skill, :ready, skill}, skill.cooldown_time)
skills = Map.put(state.skills, skill.id, Timex.now())
%{state | skills: skills}
end
end
| 26.627848 | 121 | 0.599544 |
1cc18e8cce15fcf1b6ad7997297e1f29216f9456 | 311 | ex | Elixir | exercises/concept/lasagna/lib/lasagna.ex | jaimeiniesta/elixir-1 | e8ddafeb313822645e0cd76743955a5c728a84c5 | [
"MIT"
] | 343 | 2017-06-22T16:28:28.000Z | 2022-03-25T21:33:32.000Z | exercises/concept/lasagna/lib/lasagna.ex | jaimeiniesta/elixir-1 | e8ddafeb313822645e0cd76743955a5c728a84c5 | [
"MIT"
] | 583 | 2017-06-19T10:48:40.000Z | 2022-03-28T21:43:12.000Z | exercises/concept/lasagna/lib/lasagna.ex | jaimeiniesta/elixir-1 | e8ddafeb313822645e0cd76743955a5c728a84c5 | [
"MIT"
] | 228 | 2017-07-05T07:09:32.000Z | 2022-03-27T08:59:08.000Z | defmodule Lasagna do
# Please define the 'expected_minutes_in_oven/0' function
# Please define the 'remaining_minutes_in_oven/1' function
# Please define the 'preparation_time_in_minutes/1' function
# Please define the 'total_time_in_minutes/2' function
# Please define the 'alarm/0' function
end
| 25.916667 | 62 | 0.78135 |
1cc19ee35363b04432627aad63ffdba39ad4d061 | 1,814 | ex | Elixir | test/support/model_case.ex | bitriot/phoenix_base | 15ec83a9acf46202102f2b006d577972f5564b2f | [
"MIT"
] | null | null | null | test/support/model_case.ex | bitriot/phoenix_base | 15ec83a9acf46202102f2b006d577972f5564b2f | [
"MIT"
] | null | null | null | test/support/model_case.ex | bitriot/phoenix_base | 15ec83a9acf46202102f2b006d577972f5564b2f | [
"MIT"
] | null | null | null | defmodule PhoenixBase.ModelCase do
@moduledoc """
This module defines the test case to be used by
model tests.
You may define functions here to be used as helpers in
your model tests. See `errors_on/2`'s definition as reference.
Finally, if the test case interacts with the database,
it cannot be async. For this reason, every test runs
inside a transaction which is reset at the beginning
of the test unless the test case is marked as async.
"""
use ExUnit.CaseTemplate
using do
quote do
alias PhoenixBase.Repo
import Ecto
import Ecto.Changeset
import Ecto.Query
import PhoenixBase.ModelCase
end
end
setup tags do
:ok = Ecto.Adapters.SQL.Sandbox.checkout(PhoenixBase.Repo)
unless tags[:async] do
Ecto.Adapters.SQL.Sandbox.mode(PhoenixBase.Repo, {:shared, self()})
end
:ok
end
@doc """
Helper for returning list of errors in a struct when given certain data.
## Examples
Given a User schema that lists `:name` as a required field and validates
`:password` to be safe, it would return:
iex> errors_on(%User{}, %{password: "password"})
[password: "is unsafe", name: "is blank"]
You could then write your assertion like:
assert {:password, "is unsafe"} in errors_on(%User{}, %{password: "password"})
You can also create the changeset manually and retrieve the errors
field directly:
iex> changeset = User.changeset(%User{}, password: "password")
iex> {:password, "is unsafe"} in changeset.errors
true
"""
def errors_on(struct, data) do
struct.__struct__.changeset(struct, data)
|> Ecto.Changeset.traverse_errors(&PhoenixBase.ErrorHelpers.translate_error/1)
|> Enum.flat_map(fn {key, errors} -> for msg <- errors, do: {key, msg} end)
end
end
| 27.484848 | 84 | 0.689085 |
1cc1a5eb835dcfdc70c7dc1d5bf95f70cf20144f | 3,119 | ex | Elixir | lib/mix/tasks/nerves/precompile.ex | TORIFUKUKaiou/nerves_bootstrap | 8d4142779fa5f2a3719118490c8cd3401c6eb374 | [
"Apache-2.0"
] | 33 | 2017-10-20T04:04:00.000Z | 2021-04-27T11:15:08.000Z | lib/mix/tasks/nerves/precompile.ex | TORIFUKUKaiou/nerves_bootstrap | 8d4142779fa5f2a3719118490c8cd3401c6eb374 | [
"Apache-2.0"
] | 67 | 2018-01-10T15:41:43.000Z | 2022-02-23T22:11:55.000Z | lib/mix/tasks/nerves/precompile.ex | TORIFUKUKaiou/nerves_bootstrap | 8d4142779fa5f2a3719118490c8cd3401c6eb374 | [
"Apache-2.0"
] | 14 | 2018-02-04T16:31:28.000Z | 2022-01-21T11:12:46.000Z | defmodule Mix.Tasks.Nerves.Precompile do
use Mix.Task
import Mix.Nerves.IO
@moduledoc false
@switches [loadpaths: :boolean]
def run(args) do
debug_info("Precompile Start")
# Note: We have to directly use the environment variable here instead of
# calling Nerves.Env.enabled?/0 because the nerves.precompile step happens
# before the nerves dependency is compiled, which is where Nerves.Env
# currently lives. This would be improved by moving Nerves.Env to
# nerves_bootstrap.
unless System.get_env("NERVES_ENV_DISABLED") do
System.put_env("NERVES_PRECOMPILE", "1")
{opts, _, _} = OptionParser.parse(args, switches: @switches)
Mix.Tasks.Nerves.Env.run([])
{app, deps} =
Nerves.Env.packages()
|> Enum.split_with(&(&1.app == Mix.Project.config()[:app]))
(deps ++ app)
|> compile_check()
|> Enum.each(&compile/1)
Mix.Task.reenable("deps.compile")
Mix.Task.reenable("compile")
System.put_env("NERVES_PRECOMPILE", "0")
if opts[:loadpaths] != false, do: Mix.Task.rerun("nerves.loadpaths")
end
debug_info("Precompile End")
end
defp compile(%{app: app}) do
cond do
Mix.Project.config()[:app] == app ->
Mix.Tasks.Compile.run([app, "--include-children"])
true ->
Mix.Tasks.Deps.Compile.run([app, "--no-deps-check", "--include-children"])
end
end
defp compile_check(packages) do
stale =
packages
|> Enum.reject(&(Mix.Project.config()[:app] == &1.app))
|> Enum.filter(&(:nerves_package in Map.get(&1, :compilers, Mix.compilers())))
|> Enum.filter(&(Nerves.Artifact.expand_sites(&1) != []))
|> Enum.filter(&Nerves.Artifact.stale?/1)
|> Enum.reject(&Keyword.get(Map.get(&1, :dep_opts, []), :compile, false))
case stale do
[] ->
packages
packages ->
stale_packages =
packages
|> Enum.map(&Map.get(&1, :app))
|> Enum.map(&Atom.to_string/1)
|> Enum.join("\n ")
example = List.first(packages)
Mix.raise("""
The following Nerves packages need to be built:
#{stale_packages}
The build process for each of these can take a significant amount of
time so the maintainers have listed URLs for downloading pre-built packages.
If you have not modified these packages, please try running `mix deps.get`
or `mix deps.update` to download the precompiled versions.
If you have limited network access and are able to obtain the files via
other means, copy them to `~/.nerves/dl` or the location specified by
`$NERVES_DL_DIR`.
If you are making modifications to one of the packages or want to force
local compilation, add `nerves: [compile: true]` to the dependency. For
example:
{:#{example.app}, "~> #{example.version}", nerves: [compile: true]}
If the package is a dependency of a dependency, you will need to
override it on the parent project with `override: true`.
""")
end
end
end
| 30.578431 | 84 | 0.624239 |
1cc1b560dcfc796c269de26a028641b558bf9a81 | 541 | ex | Elixir | web/router.ex | tsara27/collab-x-phoenix | 828f8fbdcf853a43e096a42dc2f003cf443eb792 | [
"MIT"
] | null | null | null | web/router.ex | tsara27/collab-x-phoenix | 828f8fbdcf853a43e096a42dc2f003cf443eb792 | [
"MIT"
] | null | null | null | web/router.ex | tsara27/collab-x-phoenix | 828f8fbdcf853a43e096a42dc2f003cf443eb792 | [
"MIT"
] | null | null | null | defmodule CollabXPhoenix.Router do
use CollabXPhoenix.Web, :router
pipeline :browser do
plug :accepts, ["html"]
plug :fetch_session
plug :fetch_flash
plug :protect_from_forgery
plug :put_secure_browser_headers
end
pipeline :api do
plug :accepts, ["json"]
end
scope "/", CollabXPhoenix do
pipe_through :browser # Use the default browser stack
get "/", PageController, :index
end
# Other scopes may use custom stacks.
# scope "/api", CollabXPhoenix do
# pipe_through :api
# end
end
| 20.037037 | 57 | 0.689464 |
1cc1bee4379b0c1837f3fbd66f6115798c66bf28 | 1,751 | ex | Elixir | lib/absinthe/blueprint/schema/input_object_type_definition.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 4,101 | 2016-03-02T03:49:20.000Z | 2022-03-31T05:46:01.000Z | lib/absinthe/blueprint/schema/input_object_type_definition.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 889 | 2016-03-02T16:06:59.000Z | 2022-03-31T20:24:12.000Z | lib/absinthe/blueprint/schema/input_object_type_definition.ex | TheRealReal/absinthe | 6eae5bc36283e58f42d032b8afd90de3ad64f97b | [
"MIT"
] | 564 | 2016-03-02T07:49:59.000Z | 2022-03-06T14:40:59.000Z | defmodule Absinthe.Blueprint.Schema.InputObjectTypeDefinition do
@moduledoc false
alias Absinthe.{Blueprint, Type}
@enforce_keys [:name]
defstruct [
:identifier,
:name,
:module,
description: nil,
fields: [],
imports: [],
directives: [],
source_location: nil,
# Added by phases,
flags: %{},
errors: [],
__reference__: nil,
__private__: []
]
@type t :: %__MODULE__{
name: String.t(),
description: nil | String.t(),
fields: [Blueprint.Schema.InputValueDefinition.t()],
directives: [Blueprint.Directive.t()],
source_location: nil | Blueprint.SourceLocation.t(),
# Added by phases
flags: Blueprint.flags_t(),
errors: [Absinthe.Phase.Error.t()]
}
def build(type_def, schema) do
%Type.InputObject{
identifier: type_def.identifier,
name: type_def.name,
fields: build_fields(type_def, schema),
description: type_def.description,
definition: type_def.module
}
end
def build_fields(type_def, schema) do
for field_def <- type_def.fields, into: %{} do
field = %Type.Field{
identifier: field_def.identifier,
deprecation: field_def.deprecation,
description: field_def.description,
name: field_def.name,
type: Blueprint.TypeReference.to_type(field_def.type, schema),
definition: type_def.module,
__reference__: field_def.__reference__,
__private__: field_def.__private__,
default_value: field_def.default_value
}
{field.identifier, field}
end
end
defimpl Inspect do
defdelegate inspect(term, options),
to: Absinthe.Schema.Notation.SDL.Render
end
end
| 26.134328 | 70 | 0.635637 |
1cc1f74e5f45750e6fa343e924251e1cd8f1b6f0 | 1,782 | ex | Elixir | clients/dfa_reporting/lib/google_api/dfa_reporting/v35/model/site_transcode_setting.ex | renovate-bot/elixir-google-api | 1da34cd39b670c99f067011e05ab90af93fef1f6 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/dfa_reporting/lib/google_api/dfa_reporting/v35/model/site_transcode_setting.ex | swansoffiee/elixir-google-api | 9ea6d39f273fb430634788c258b3189d3613dde0 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/dfa_reporting/lib/google_api/dfa_reporting/v35/model/site_transcode_setting.ex | dazuma/elixir-google-api | 6a9897168008efe07a6081d2326735fe332e522c | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DFAReporting.V35.Model.SiteTranscodeSetting do
@moduledoc """
Transcode Settings
## Attributes
* `enabledVideoFormats` (*type:* `list(integer())`, *default:* `nil`) - Allowlist of video formats to be served to this site template. Set this list to null or empty to serve all video formats.
* `kind` (*type:* `String.t`, *default:* `nil`) - Identifies what kind of resource this is. Value: the fixed string "dfareporting#siteTranscodeSetting".
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:enabledVideoFormats => list(integer()) | nil,
:kind => String.t() | nil
}
field(:enabledVideoFormats, type: :list)
field(:kind)
end
defimpl Poison.Decoder, for: GoogleApi.DFAReporting.V35.Model.SiteTranscodeSetting do
def decode(value, options) do
GoogleApi.DFAReporting.V35.Model.SiteTranscodeSetting.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DFAReporting.V35.Model.SiteTranscodeSetting do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 35.64 | 197 | 0.735129 |
1cc20df550cd85cd55dc0f0d74f5b47cb44a4961 | 529 | exs | Elixir | test/apps/migrator/config/config.exs | mori5321/yacto | 85d6f216fd7ae16825c03e9772cb75a06b495245 | [
"Apache-2.0"
] | null | null | null | test/apps/migrator/config/config.exs | mori5321/yacto | 85d6f216fd7ae16825c03e9772cb75a06b495245 | [
"Apache-2.0"
] | null | null | null | test/apps/migrator/config/config.exs | mori5321/yacto | 85d6f216fd7ae16825c03e9772cb75a06b495245 | [
"Apache-2.0"
] | null | null | null | use Mix.Config
config :migrator, :ecto_repos, [Migrator.Repo0, Migrator.Repo1]
config :migrator, Migrator.Repo0,
database: "migrator_repo0",
username: "root",
password: "",
hostname: "localhost",
port: 3306
config :migrator, Migrator.Repo1,
database: "migrator_repo1",
username: "root",
password: "",
hostname: "localhost",
port: 3306
config :yacto, :databases, %{
default: %{module: Yacto.DB.Single, repo: Migrator.Repo1},
player: %{module: Yacto.DB.Shard, repos: [Migrator.Repo0, Migrator.Repo1]}
}
| 23 | 76 | 0.695652 |