hexsha stringlengths 40 40 | size int64 2 991k | ext stringclasses 2 values | lang stringclasses 1 value | max_stars_repo_path stringlengths 4 208 | max_stars_repo_name stringlengths 6 106 | max_stars_repo_head_hexsha stringlengths 40 40 | max_stars_repo_licenses list | max_stars_count int64 1 33.5k ⌀ | max_stars_repo_stars_event_min_datetime stringlengths 24 24 ⌀ | max_stars_repo_stars_event_max_datetime stringlengths 24 24 ⌀ | max_issues_repo_path stringlengths 4 208 | max_issues_repo_name stringlengths 6 106 | max_issues_repo_head_hexsha stringlengths 40 40 | max_issues_repo_licenses list | max_issues_count int64 1 16.3k ⌀ | max_issues_repo_issues_event_min_datetime stringlengths 24 24 ⌀ | max_issues_repo_issues_event_max_datetime stringlengths 24 24 ⌀ | max_forks_repo_path stringlengths 4 208 | max_forks_repo_name stringlengths 6 106 | max_forks_repo_head_hexsha stringlengths 40 40 | max_forks_repo_licenses list | max_forks_count int64 1 6.91k ⌀ | max_forks_repo_forks_event_min_datetime stringlengths 24 24 ⌀ | max_forks_repo_forks_event_max_datetime stringlengths 24 24 ⌀ | content stringlengths 2 991k | avg_line_length float64 1 36k | max_line_length int64 1 977k | alphanum_fraction float64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
ffa23c93773b6edb4e9c0fb895fd9f2ccd06f2d7 | 1,996 | ex | Elixir | clients/display_video/lib/google_api/display_video/v1/model/parent_entity_filter.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/display_video/lib/google_api/display_video/v1/model/parent_entity_filter.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/display_video/lib/google_api/display_video/v1/model/parent_entity_filter.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DisplayVideo.V1.Model.ParentEntityFilter do
@moduledoc """
A filtering option that filters on selected file types belonging to a chosen set of filter entities.
## Attributes
* `fileType` (*type:* `list(String.t)`, *default:* `nil`) - Required. File types that will be returned.
* `filterIds` (*type:* `list(String.t)`, *default:* `nil`) - The IDs of the specified filter type. This is used to filter entities to fetch. If filter type is not `FILTER_TYPE_NONE`, at least one ID must be specified.
* `filterType` (*type:* `String.t`, *default:* `nil`) - Required. Filter type used to filter fetched entities.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:fileType => list(String.t()),
:filterIds => list(String.t()),
:filterType => String.t()
}
field(:fileType, type: :list)
field(:filterIds, type: :list)
field(:filterType)
end
defimpl Poison.Decoder, for: GoogleApi.DisplayVideo.V1.Model.ParentEntityFilter do
def decode(value, options) do
GoogleApi.DisplayVideo.V1.Model.ParentEntityFilter.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DisplayVideo.V1.Model.ParentEntityFilter do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 37.660377 | 221 | 0.721443 |
ffa245dcd4de9f8c6d7ed3f506b8c25f64f3d4c8 | 13,947 | ex | Elixir | lib/event/event.ex | aeturnum/twitch_discord_connector | b965ba1164540c92a925d2bd08e3fa299dfd457a | [
"MIT"
] | null | null | null | lib/event/event.ex | aeturnum/twitch_discord_connector | b965ba1164540c92a925d2bd08e3fa299dfd457a | [
"MIT"
] | null | null | null | lib/event/event.ex | aeturnum/twitch_discord_connector | b965ba1164540c92a925d2bd08e3fa299dfd457a | [
"MIT"
] | null | null | null | defmodule TwitchDiscordConnector.Event do
@moduledoc """
Event System
Simple messaging system used to allow different elements of the server to communicate with each other.
Required functions to be a listener:
init({your_address, args}) -> state
channel() -> channel atom
handle_event(from, event, state)
valid responses:
:ignore
{:ok, new_state}
{action, new_state}
{[actions], new_state}
actions:
{:brod, name, data \\ nil}
{:send, to, name, data \\ nil} # to can be :me
{:job, to, name, {function, args}} # to can be :me, :brod or addr
{:in, ms, what} # what can be any action
{:cancel, delay_id} # cancel a delay task
callback format:
handle_event(ctx, event, state)
where ctx is:
{:brod, channel} # broadcast
{:send, channel} # sent to your address
:me # sent to your address by yourself
where event is:
{name, data}
state is state
delay tasks, once started, will generate a callback equivilent to the job starter returning:
{:send, :me, :started, delay_id} on the :event channel.
This can be used to cancel the delay
internal state: {next_id, listeners, tasks}
listener: {id, elixir_module, tag, state}
"""
use GenServer
alias TwitchDiscordConnector.Event
alias TwitchDiscordConnector.Util.L
@name Event
@type delay_atom :: :in | :delay
@type f_call() :: {fun(), list()}
@type addr() :: integer() | :me
@type action() ::
{:brod, atom(), any()}
| {:send, addr(), atom(), any()}
| {:job, addr(), atom(), f_call()}
| {delay_atom(), integer(), action()}
| {:cancel, integer()}
@type response() :: :ignore | {action(), map()} | {[action()], map()}
@doc """
Broadcast a message to every listener from `channel` with name `message_name`.
Optionally include an arbitraty `data` argument (will be `nil` is excluded)
Returns `:ok`
"""
@spec broadcast({atom(), atom()}, any()) :: :ok
def broadcast({channel, message_name}, data \\ nil) do
GenServer.cast(@name, {{:brod, data}, {channel, message_name, :brod}})
end
@doc """
Send a message to one listener from `channel` with name `message_name`.
Must specify `from` and `to`. `from` may be a special case atom `:me` which deisgnates it is a message sent by
the target.
Optionally include an arbitraty `data` argument (will be `nil` is excluded)
Returns `:ok`
"""
@spec send({atom(), atom()}, {addr(), addr()}, any()) :: :ok
def send({channel, message_name}, {from, to}, data \\ %{}) do
GenServer.cast(@name, {{:send, to, data}, {channel, message_name, from}})
end
@doc """
Execute an action (as defined by an internal format) in `ms` milliseconds.
This function isn't really designed to be called by outsiders and so a later
update may add a pathway to start a delay that doesn't generate an internal
callback with the handle.
Will cause a callback to the listener in the `from` field with the delay handle
in case you want to cancel the delay.
Returns `:ok`
"""
@spec delay(atom(), integer(), action(), {addr(), atom()}) :: :ok
defp delay(name, ms, action, {from, channel}) do
GenServer.cast(@name, {:delay, name, ms, action, {from, channel}})
end
@doc """
Execute a job and return the result to a listener. Should also probably be changed
as all of the jobs I've written so far are responding to the listener that started
them.
Returns `:ok`
"""
@spec job(atom(), {addr(), addr()}, fun(), list()) :: :ok
def job(name, {from, to}, func, args \\ []) do
GenServer.cast(@name, {{:job, to, func, args}, {:job, name, from}})
end
@doc """
Cancel a delay using the handle
Returns `:ok`
"""
defp cancel_delay(delay_id) do
GenServer.cast(@name, {:cancel_delay, delay_id})
end
@doc """
Add a listener using the specified module and argument.
`module` is an Elixer module
`init_arg` is anything, but will default to an empty list
Can be used to load listeners after start
Returns `{:ok, listener_id}`
"""
@spec add_listener(module(), list()) :: {:ok, integer()}
def add_listener(module, init_arg \\ []) do
GenServer.call(@name, {:add_lis, module, init_arg})
end
@doc """
Get the current state of a listener as specified by the id.`
Mostly used for testing.
Returns `{:ok, listener_id}`
"""
@spec get_state(integer()) :: {:ok, map()} | {:error, binary()}
def get_state(address) do
GenServer.call(@name, {:get_state, address})
end
########################
### Init ###############
########################
def start_link(modules) do
GenServer.start_link(__MODULE__, modules, name: @name)
end
def init(modules) do
{
:ok,
{
0,
# listeners
[],
# tasks
[]
}
|> make_start_state(modules)
}
end
defp make_start_state({id, mods, tks}, modules) do
with mod_and_args <- Enum.map(modules, &to_mod_and_args/1),
{id, mods} <- Enum.reduce(mod_and_args, {id, mods}, &add_mod_to_list/2) do
{
id,
Enum.reverse(mods),
tks
}
end
end
defp to_mod_and_args({mod, arg}), do: {mod, arg}
defp to_mod_and_args(mod), do: {mod, []}
# defp send_event(lis = {m_id, mod, m_ch, last_state}, {channel, name, from}, {type, data}) do
defp add_mod_to_list({mod, arg}, {next_id, list}) do
with channel <- mod.channel(),
mod_state <- mod.init(arg),
init_listener <- {next_id, mod, channel, mod_state},
state_tuple <- send_event(init_listener, {:event, :added, :event}, {:send, %{}}) do
L.d("Added #{lis_s(state_tuple)} with Module #{inspect(mod)}")
{next_id + 1, [state_tuple | list]}
end
end
########################
### Handles ############
########################
def handle_call({:add_lis, module, init_arg}, _from, {nid, mod_list, tsks}) do
with {id, new_list} <- add_mod_to_list({module, init_arg}, {nid, mod_list}) do
{:reply, {:ok, nid}, {id, new_list, tsks}}
end
end
def handle_call({:get_state, addr}, _from, s) do
with {:ok, listener} <- find_addr(s, addr) do
{:reply, {:ok, elem(listener, 3)}, s}
end
end
def handle_call(info, _from, state) do
L.e("Manager: unexpected call: #{inspect(info)}")
{:reply, nil, state}
end
def handle_cast({{:brod, data}, src}, state) do
{:noreply, send_broadcast(state, src, data)}
end
def handle_cast({{:send, to, data}, src}, state) do
{:noreply, send_send(state, src, to, data)}
end
def handle_cast({:delay, name, ms, action, context}, state) do
{:noreply, delay_action(state, name, ms, action, context)}
end
def handle_cast({{:job, to, func, args}, src}, state) do
{:noreply, start_job(state, to, {func, args}, src)}
end
def handle_cast({:cancel_delay, delay_id}, state) do
{:noreply, cancel_delay(state, delay_id)}
end
# todo: add shutdown event
#######################
### Doers #############
#######################
defp send_broadcast({nid, mod_info, tsks}, src, data) do
{
nid,
mod_info
|> Enum.map(fn listener ->
send_event(listener, src, {:brod, data})
end),
tsks
}
end
defp send_send({nid, mod_info, tsks}, src, to, data) do
{
nid,
mod_info
|> Enum.map(fn lis = {addr, _, _, _} ->
case addr == to do
false -> lis
true -> send_event(lis, src, {:send, data})
end
end),
tsks
}
end
defp delay_action({id, mod_info, tsks}, name, ms, action, context) do
with {from, _} <- context,
{:ok, tsk} <- Task.start(fn -> do_delay_action(name, ms, {action, context}) end),
# send notification that task can be cancelled
:ok <- Event.send({:event, :delay_started}, {from, from}, {name, tsk}) do
{
id,
mod_info,
[tsk | tsks]
}
end
end
defp start_job(state, to, {func, args}, src) do
Task.start(fn -> do_job({func, args}, to, src) end)
state
end
defp cancel_delay({id, mod_info, tsks}, delay_id) do
# this is possible because the "task id" we provide is just the task handle
Task.shutdown(delay_id, 0)
{id, mod_info, tsks |> Enum.filter(fn tsk -> tsk != delay_id end)}
end
defp do_delay_action(name, ms, {action, {from_id, channel}}) do
L.d("#{inspect(name)}: delaying #{inspect(action)} by #{ms}ms")
:timer.sleep(ms)
L.d("#{inspect(name)}: executing #{inspect(action)}")
norm_action(action, from_id)
|> handle_response({from_id, channel})
|> L.ins()
end
defp do_job({f, args}, to, {job_channel, name, from}) do
# L.d("Running job[#{inspect(name)}]...")
r = apply(f, args)
L.d("Job[#{inspect(name)}] #{inspect(f)}(#{inspect(args)}) -> #{inspect(r, pretty: true)}")
case to do
:brod -> Event.broadcast({job_channel, name}, r)
_ -> Event.send({job_channel, name}, {from, to}, r)
end
end
# handle_event(ctx, event, state)
# where ctx is:
# {:brod, channel} # broadcast
# {:send, channel} # sent to your address
# :me # sent to your address by yourself
# where event is:
# {name, data}
# state is state
defp send_event(lis = {mod_id, mod, mod_ch, last_state}, {ev_ch, name, from}, {type, data}) do
# L.d("send_event(#{lis_s(lis)}, #{inspect({ev_ch, name, from})}, #{inspect({type, data})})")
{
mod_id,
mod,
mod_ch,
with ctx <- make_context(type, ev_ch, from, mod_id),
event <- {name, data} do
try do
mod.handle_event(ctx, event, last_state)
rescue
FunctionClauseError ->
# last_state
# |> L.ins(
# label: "#{lis_s(lis)} does not handle (#{inspect(ctx)}, #{inspect(event)}, state)"
# )
:ignore
_ ->
L.e(
"Unknown error when calling (#{inspect(ctx)}, #{inspect(event)}, state) on #{
lis_s(lis)
}"
)
:ignore
end
|> normalize_response({mod_id, last_state})
|> do_responses({lis, ctx, event})
end
}
end
defp make_context(:send, channel, from, to) do
case from == to do
true -> {:send, :me}
false -> {:send, channel}
end
end
defp make_context(:brod, channel, _, _), do: {:brod, channel}
defp normalize_response(:ignore, {_, last_state}), do: {:ignore, last_state}
defp normalize_response({:ok, s}, _), do: {:ok, s}
defp normalize_response({a, s}, info) when not is_list(a),
do: normalize_response({[a], s}, info)
defp normalize_response({a, s}, {my_id, _}) do
# L.i("Final normalize_response: {#{inspect(a)}, s}")
{
Enum.map(
a,
fn a -> norm_action(a, my_id) end
),
s
}
end
defp norm_action(a = {:brod, _}, _), do: Tuple.append(a, nil)
defp norm_action({:send, to, name}, my_id), do: {:send, to_addr(to, my_id), name, nil}
defp norm_action({:send, to, name, data}, my_id), do: {:send, to_addr(to, my_id), name, data}
defp norm_action({:in, name, ms, what}, _), do: {:delay, name, ms, what}
defp norm_action({:job, to, name, f}, my_id), do: {:job, to_addr(to, my_id), name, f}
defp to_addr(:brod, _), do: :brod
defp to_addr(:me, my_id), do: my_id
defp to_addr(other, _), do: other
defp do_responses({:ignore, state}, _) do
# L.d("#{lis_s(lis)}(#{inspect(ctx)},#{inspect(event)}) -> :ignore")
state
end
defp do_responses({:ok, state}, {lis, ctx, event}) do
L.ins(state, label: "#{lis_s(lis)}(#{inspect(ctx)},#{inspect(event)}) -> {:ok, ")
end
defp do_responses({action_list, state}, {lis, ctx, event}) do
with {from, _mod, channel, _state} <- lis do
logs = Enum.map(action_list, fn action -> handle_response(action, {from, channel}) end)
L.d("#{lis_s(lis)}(#{inspect(ctx)},#{inspect(event)}) ->
{
#{Enum.join(logs, "\n")},
#{inspect(state, pretty: true)},
}")
state
end
end
# {:brod, name, data \\ nil}
# {:send, to, name, data \\ nil} # to can be :me
# {:job, to, name, {function, args}} # to can be :me, :brod or addr
# {:in, ms, what} # what can be any action
# {:cancel, delay_id} # cancel a delay task
defp handle_response({:brod, name, data}, {_, channel}) do
# queue broadcast
Event.broadcast({channel, name}, data)
"broadcast] |#{inspect({channel, name})}|: #{inspect(data)})"
end
defp handle_response({:send, to, name, data}, {from, channel}) do
# queue broadcast
Event.send({channel, name}, {from, to}, data)
"send] |#{inspect({channel, name})}| #{inspect(from)} -> #{inspect(to)}: #{inspect(data)})"
end
defp handle_response({:job, to, name, {func, args}}, {from, _}) do
Event.job(name, {from, to}, func, args)
"starting job] #{inspect({name})} for #{inspect(to)}: #{inspect(func)}(#{inspect(args)}))"
end
defp handle_response({:delay, name, ms, action}, ctx) do
delay(name, ms, action, ctx)
"delay] #{inspect(name)} in #{ms}ms: #{inspect(action)})"
end
defp handle_response({:cancel, delay_id}, _) do
cancel_delay(delay_id)
"cancelling] delay #{delay_id}"
end
defp handle_response({unknown, state}, channel) do
L.ins(unknown, label: "handle_response unknown argument from #{inspect(channel)}")
state
end
########################
### Helpers ############
########################
defp find_addr(s, addr) do
# {next_id, listeners, tasks}
s
|> elem(1)
|> Enum.reduce_while(
{:error, "Could not find id #{inspect(addr)}"},
# {id, elixir_module, channel, state}
fn lis = {id, _, _, _}, def_result ->
if id == addr do
{:halt, {:ok, lis}}
else
{:cont, def_result}
end
end
)
end
defp lis_s({m_id, _mod, m_ch, _last_state}), do: "#{m_ch}[#{m_id}]"
end
| 29.486258 | 112 | 0.584068 |
ffa249e55107ab7244f96b5361c8129b9563ee1a | 719 | ex | Elixir | lib/cat_show/auth/error_handler.ex | kemm/cat_show | 42f47f93ecec48700a4d5373be27138cf907d6cb | [
"BSD-3-Clause"
] | null | null | null | lib/cat_show/auth/error_handler.ex | kemm/cat_show | 42f47f93ecec48700a4d5373be27138cf907d6cb | [
"BSD-3-Clause"
] | 1 | 2021-05-11T15:49:58.000Z | 2021-05-11T15:49:58.000Z | lib/cat_show/auth/error_handler.ex | kemm/cat_show | 42f47f93ecec48700a4d5373be27138cf907d6cb | [
"BSD-3-Clause"
] | null | null | null | defmodule CatShow.Auth.ErrorHandler do
import Plug.Conn
def auth_error(conn, {:invalid_token, _reason}, _opts) do
response(conn, :unauthorized, "Invalid Token")
end
def auth_error(conn, {:unauthenticated, _reason}, _opts) do
response(conn, :unauthorized, "Not Authenticated")
end
def auth_error(conn, {:no_resource_found, _reason}, _opts) do
response(conn, :unauthorized, "No Resource Found")
end
def auth_error(conn, {type, _reason}, _opts) do
response(conn, :forbidden, to_string(type))
end
defp response(conn, status, message) do
body = Poison.encode!(%{error: message})
conn
|> put_resp_content_type("application/json")
|> send_resp(status, body)
end
end
| 25.678571 | 63 | 0.703755 |
ffa254ff9f016933fef11059585d51207bc2a154 | 65 | ex | Elixir | lib/colorify_web/views/layout_view.ex | ashkan18/colorify | e4da2bc54a60199c411afc696086689d825b7a13 | [
"MIT"
] | 1 | 2020-03-12T18:44:10.000Z | 2020-03-12T18:44:10.000Z | lib/colorify_web/views/layout_view.ex | ashkan18/colorify | e4da2bc54a60199c411afc696086689d825b7a13 | [
"MIT"
] | 1 | 2021-03-10T10:22:38.000Z | 2021-03-10T10:22:38.000Z | lib/colorify_web/views/layout_view.ex | ashkan18/colorify | e4da2bc54a60199c411afc696086689d825b7a13 | [
"MIT"
] | 1 | 2020-03-12T18:41:42.000Z | 2020-03-12T18:41:42.000Z | defmodule ColorifyWeb.LayoutView do
use ColorifyWeb, :view
end
| 16.25 | 35 | 0.815385 |
ffa2631365876aa5671993b5938714939bd58977 | 493 | ex | Elixir | zombie-driver/test/support/fixtures.ex | akrisanov/microservices_in_elixir | 55829d5e53c8e77f3fb468419a29f1f1f9fa8db0 | [
"MIT"
] | null | null | null | zombie-driver/test/support/fixtures.ex | akrisanov/microservices_in_elixir | 55829d5e53c8e77f3fb468419a29f1f1f9fa8db0 | [
"MIT"
] | null | null | null | zombie-driver/test/support/fixtures.ex | akrisanov/microservices_in_elixir | 55829d5e53c8e77f3fb468419a29f1f1f9fa8db0 | [
"MIT"
] | null | null | null | defmodule Fixtures do
@moduledoc false
def empty_driver_locations_fixture() do
[]
end
def driver_locations_fixture() do
[
%{
latitude: "48.86419237673293736",
longitude: "2.35050016641616821",
updated_at: DateTime.utc_now() |> Timex.shift(minutes: -3)
},
%{
latitude: "58.86419237673293736",
longitude: "3.35050016641616821",
updated_at: DateTime.utc_now() |> Timex.shift(minutes: -2)
}
]
end
end
| 21.434783 | 66 | 0.608519 |
ffa2668d17248bd982c26a027caee1a30ed76e79 | 662 | exs | Elixir | test/onehundredsixtyeight_hours/csv_test.exs | Sgoettschkes/168-hours | fd9895b2b3b8fd1dcce0283f03fe46cd0a5c454b | [
"MIT"
] | null | null | null | test/onehundredsixtyeight_hours/csv_test.exs | Sgoettschkes/168-hours | fd9895b2b3b8fd1dcce0283f03fe46cd0a5c454b | [
"MIT"
] | null | null | null | test/onehundredsixtyeight_hours/csv_test.exs | Sgoettschkes/168-hours | fd9895b2b3b8fd1dcce0283f03fe46cd0a5c454b | [
"MIT"
] | null | null | null | defmodule OnehundredsixtyeightHours.CsvTest do
use ExUnit.Case
describe "parse/1" do
test "csv parses file" do
result =
File.stream!("priv/test/data.csv")
|> OnehundredsixtyeightHours.Csv.parse()
assert List.flatten(result) == result
assert List.first(result) == "Task 1"
end
end
describe "calculate_minutes/1" do
test "csv calculates correct minutes" do
result =
["Task 1", "Task 2", "Task 1", "Task 3", "Task 1", "Task 2"]
|> OnehundredsixtyeightHours.Csv.calculate_minutes()
assert length(result) == 3
assert List.first(result) == ["Task 1", "45m"]
end
end
end
| 25.461538 | 68 | 0.626888 |
ffa28202fa8fd9562a9660c86ccfc9dc17f75b84 | 1,041 | exs | Elixir | apps/tai/test/tai/venue_adapters/binance/stream/route_order_books_test.exs | yurikoval/tai | 94254b45d22fa0307b01577ff7c629c7280c0295 | [
"MIT"
] | null | null | null | apps/tai/test/tai/venue_adapters/binance/stream/route_order_books_test.exs | yurikoval/tai | 94254b45d22fa0307b01577ff7c629c7280c0295 | [
"MIT"
] | 78 | 2020-10-12T06:21:43.000Z | 2022-03-28T09:02:00.000Z | apps/tai/test/tai/venue_adapters/binance/stream/route_order_books_test.exs | yurikoval/tai | 94254b45d22fa0307b01577ff7c629c7280c0295 | [
"MIT"
] | null | null | null | defmodule Tai.VenueAdapters.Binance.Stream.RouteOrderBooksTest do
use ExUnit.Case, async: false
alias Tai.VenueAdapters.Binance.Stream.{ProcessOrderBook, RouteOrderBooks}
@venue :venue_a
@venue_symbol "BTC_USD"
@product struct(Tai.Venues.Product, venue_id: @venue, venue_symbol: @venue_symbol)
@received_at Timex.now()
setup do
name = ProcessOrderBook.to_name(@venue, @venue_symbol)
Process.register(self(), name)
{:ok, pid} = start_supervised({RouteOrderBooks, [venue_id: @venue, products: [@product]]})
{:ok, %{pid: pid}}
end
test "forwards an update message to the order book store for the product", %{pid: pid} do
msg = %{
"e" => "depthUpdate",
"E" => 1_569_051_459_755,
"s" => @venue_symbol,
"U" => "ignore",
"u" => "ignore",
"b" => [],
"a" => []
}
GenServer.cast(pid, {msg, @received_at})
assert_receive {:"$gen_cast", {:update, received_msg, received_at}}
assert received_msg == msg
assert received_at == @received_at
end
end
| 28.916667 | 94 | 0.650336 |
ffa28276738cb139b570f7ee926add475f75f53a | 3,408 | ex | Elixir | deps/postgrex/lib/postgrex/stream.ex | rpillar/Top5_Elixir | 9c450d2e9b291108ff1465dc066dfe442dbca822 | [
"MIT"
] | null | null | null | deps/postgrex/lib/postgrex/stream.ex | rpillar/Top5_Elixir | 9c450d2e9b291108ff1465dc066dfe442dbca822 | [
"MIT"
] | null | null | null | deps/postgrex/lib/postgrex/stream.ex | rpillar/Top5_Elixir | 9c450d2e9b291108ff1465dc066dfe442dbca822 | [
"MIT"
] | null | null | null | defmodule Postgrex.Stream do
@moduledoc false
defstruct [:conn, :query, :params, :options]
@type t :: %Postgrex.Stream{}
end
defmodule Postgrex.Cursor do
@moduledoc false
defstruct [:portal, :ref, :connection_id, :mode]
@type t :: %Postgrex.Cursor{}
end
defmodule Postgrex.Copy do
@moduledoc false
defstruct [:portal, :ref, :connection_id, :query]
@type t :: %Postgrex.Copy{}
end
defimpl Enumerable, for: Postgrex.Stream do
alias Postgrex.Query
def reduce(%Postgrex.Stream{query: %Query{} = query} = stream, acc, fun) do
%Postgrex.Stream{conn: conn, params: params, options: opts} = stream
stream = %DBConnection.Stream{conn: conn, query: query, params: params, opts: opts}
DBConnection.reduce(stream, acc, fun)
end
def reduce(%Postgrex.Stream{query: statement} = stream, acc, fun) do
%Postgrex.Stream{conn: conn, params: params, options: opts} = stream
query = %Query{name: "" , statement: statement}
opts = Keyword.put(opts, :function, :prepare_open)
stream = %DBConnection.PrepareStream{conn: conn, query: query, params: params, opts: opts}
DBConnection.reduce(stream, acc, fun)
end
def member?(_, _) do
{:error, __MODULE__}
end
def count(_) do
{:error, __MODULE__}
end
def slice(_) do
{:error, __MODULE__}
end
end
defimpl Collectable, for: Postgrex.Stream do
alias Postgrex.Stream
alias Postgrex.Query
def into(%Stream{conn: %DBConnection{}} = stream) do
%Stream{conn: conn, query: query, params: params, options: opts} = stream
opts = Keyword.put(opts, :postgrex_copy, true)
case query do
%Query{} ->
copy = DBConnection.execute!(conn, query, params, opts)
{:ok, make_into(conn, stream, copy, opts)}
query ->
query = %Query{name: "", statement: query}
{_, copy} = DBConnection.prepare_execute!(conn, query, params, opts)
{:ok, make_into(conn, stream, copy, opts)}
end
end
def into(_) do
raise ArgumentError, "data can only be copied to database inside a transaction"
end
defp make_into(conn, stream, %Postgrex.Copy{ref: ref} = copy, opts) do
fn
:ok, {:cont, data} ->
_ = DBConnection.execute!(conn, copy, {:copy_data, ref, data}, opts)
:ok
:ok, close when close in [:done, :halt] ->
_ = DBConnection.execute!(conn, copy, {:copy_done, ref}, opts)
stream
end
end
end
defimpl DBConnection.Query, for: Postgrex.Copy do
alias Postgrex.Copy
import Postgrex.Messages
def parse(copy, _) do
raise "can not prepare #{inspect copy}"
end
def describe(copy, _) do
raise "can not describe #{inspect copy}"
end
def encode(%Copy{ref: ref}, {:copy_data, ref, data}, _) do
try do
encode_msg(msg_copy_data(data: data))
rescue
ArgumentError ->
reraise ArgumentError,
"expected iodata to copy to database, got: " <> inspect(data)
else
iodata ->
{:copy_data, iodata}
end
end
def encode(%Copy{ref: ref}, {:copy_done, ref}, _) do
:copy_done
end
def decode(copy, _result, _opts) do
raise "can not describe #{inspect copy}"
end
end
defimpl String.Chars, for: Postgrex.Copy do
def to_string(%Postgrex.Copy{query: query}) do
String.Chars.to_string(query)
end
end
| 28.4 | 95 | 0.632923 |
ffa2c6232323ff2e8d1b179e7fbe2ae6c2153c98 | 61 | ex | Elixir | web/views/html_server_type_view.ex | rob05c/tox | f54847ca058ad24b909341ad65d595a4069d2471 | [
"Apache-2.0"
] | 2 | 2016-11-16T17:24:21.000Z | 2019-02-15T05:38:27.000Z | web/views/html_server_type_view.ex | rob05c/tox | f54847ca058ad24b909341ad65d595a4069d2471 | [
"Apache-2.0"
] | null | null | null | web/views/html_server_type_view.ex | rob05c/tox | f54847ca058ad24b909341ad65d595a4069d2471 | [
"Apache-2.0"
] | null | null | null | defmodule Tox.HtmlServerTypeView do
use Tox.Web, :view
end
| 15.25 | 35 | 0.786885 |
ffa2cf01e3ae87d3655f1bfef542452225db955b | 636 | ex | Elixir | lib/whistler_news_reader/feed_url_extractor.ex | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | 8 | 2016-06-12T20:11:26.000Z | 2017-05-02T04:36:41.000Z | lib/whistler_news_reader/feed_url_extractor.ex | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | 2 | 2016-06-12T15:49:06.000Z | 2016-06-12T20:00:02.000Z | lib/whistler_news_reader/feed_url_extractor.ex | fdietz/whistler_news_reader | 501f3f95e1ba3a684da8b34b60e426da85e7852d | [
"MIT"
] | null | null | null | defmodule WhistlerNewsReader.FeedUrlExtractor do
# extract the following
# <link rel="alternate" type="application/rss+xml" href="http://example.com"/>
# <link rel="alternate" type="application/atom+xml" href="http://example.com"/>
def extract(html) do
rss_match = Floki.find(html, "link[type='application/rss+xml']")
atom_match = Floki.find(html, "link[type='application/atom+xml']")
case href(rss_match) || href(atom_match) do
nil ->
{:error, :feed_url_not_found}
match ->
{:ok, match}
end
end
defp href(element) do
Enum.at(Floki.attribute(element, "href"), 0)
end
end
| 28.909091 | 81 | 0.658805 |
ffa2d1ef1c8d69c252b1826ab0f9fe7a08a7a801 | 209 | ex | Elixir | lib/home_display/reporters/temperature_series.ex | mattiaslundberg/home_display | ce571e37090398cfb15a508cf9758c174efa3d5e | [
"MIT"
] | null | null | null | lib/home_display/reporters/temperature_series.ex | mattiaslundberg/home_display | ce571e37090398cfb15a508cf9758c174efa3d5e | [
"MIT"
] | null | null | null | lib/home_display/reporters/temperature_series.ex | mattiaslundberg/home_display | ce571e37090398cfb15a508cf9758c174efa3d5e | [
"MIT"
] | null | null | null | defmodule HomeDisplay.Reporters.TemperatureSeries do
use Instream.Series
series do
database("home")
measurement("temperature")
tag(:location)
tag(:sensor_id)
field(:value)
end
end
| 14.928571 | 52 | 0.698565 |
ffa2d4e4199cda140f8e26ae82d71befaae13632 | 1,807 | exs | Elixir | mix.exs | MattIII/CastBug | aad9eabce5af4a80dd0f4383683746a1f518e377 | [
"MIT"
] | 1 | 2019-03-14T03:48:29.000Z | 2019-03-14T03:48:29.000Z | mix.exs | MattIII/CastBug | aad9eabce5af4a80dd0f4383683746a1f518e377 | [
"MIT"
] | null | null | null | mix.exs | MattIII/CastBug | aad9eabce5af4a80dd0f4383683746a1f518e377 | [
"MIT"
] | null | null | null | defmodule CastBug.MixProject do
use Mix.Project
def project do
[
app: :castbug,
version: "0.1.0",
elixir: "~> 1.5",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: [:phoenix, :gettext] ++ Mix.compilers(),
start_permanent: Mix.env() == :prod,
aliases: aliases(),
deps: deps()
]
end
# Configuration for the OTP application.
#
# Type `mix help compile.app` for more information.
def application do
[
mod: {CastBug.Application, []},
extra_applications: [:logger, :runtime_tools]
]
end
# Specifies which paths to compile per environment.
defp elixirc_paths(:test), do: ["lib", "test/support"]
defp elixirc_paths(_), do: ["lib"]
# Specifies your project dependencies.
#
# Type `mix help deps` for examples and options.
defp deps do
[
{:phoenix, "~> 1.4.1"},
{:phoenix_pubsub, "~> 1.1"},
{:phoenix_ecto, "~> 4.0"},
{:ecto_sql, "~> 3.0"},
{:postgrex, ">= 0.0.0"},
{:phoenix_html, "~> 2.11"},
{:phoenix_live_reload, "~> 1.2", only: :dev},
{:gettext, "~> 0.11"},
{:jason, "~> 1.0"},
{:poison, "~> 3.0"},
{:plug_cowboy, "~> 2.0"},
{:absinthe, "~> 1.4.6"},
{:absinthe_ecto, "~> 0.1.3"},
{:absinthe_plug, "~> 1.4.6"},
{:timex, "~> 3.5.0"}
]
end
# Aliases are shortcuts or tasks specific to the current project.
# For example, to create, migrate and run the seeds file at once:
#
# $ mix ecto.setup
#
# See the documentation for `Mix` for more info on aliases.
defp aliases do
[
"ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"],
"ecto.reset": ["ecto.drop", "ecto.setup"],
test: ["ecto.create --quiet", "ecto.migrate", "test"]
]
end
end
| 26.573529 | 79 | 0.558937 |
ffa2d5ccad0d143d57ac416cb6c4c54bf5c98c3d | 563 | ex | Elixir | web/controllers/user_controller.ex | AbdullahDahmash/peep-stack-api-tutorial | 3e6d98be33fbee0ae79022e6a74940e0ea70cc96 | [
"MIT"
] | 1 | 2020-12-23T18:28:54.000Z | 2020-12-23T18:28:54.000Z | web/controllers/user_controller.ex | AbdullahDahmash/peep-stack-api-tutorial | 3e6d98be33fbee0ae79022e6a74940e0ea70cc96 | [
"MIT"
] | null | null | null | web/controllers/user_controller.ex | AbdullahDahmash/peep-stack-api-tutorial | 3e6d98be33fbee0ae79022e6a74940e0ea70cc96 | [
"MIT"
] | null | null | null | defmodule Peepchat.UserController do
use Peepchat.Web, :controller
alias Peepchat.User
plug Guardian.Plug.EnsureAuthenticated, handler: Peepchat.AuthErrorHandler
def index(conn, _params) do
users = Repo.all(User)
render(conn, "index.json-api", data: users)
end
def show(conn, %{"id" => id}) do
user = Repo.get!(User, id)
render(conn, "show.json-api", data: user)
end
def current(conn, _) do
user = conn
|> Guardian.Plug.current_resource
conn
|> render(Peepchat.UserView, "show.json-api", data: user)
end
end
| 22.52 | 76 | 0.676732 |
ffa2df28cd0ad8d8e9f705711f2de5817dc43721 | 37 | ex | Elixir | apps/gingerbread_shop_api/lib/gingerbread_shop.api.ex | ScrimpyCat/gingerbread_shop | ce01f5599fd1d39c79df461373c8f4f73bc144f2 | [
"BSD-2-Clause"
] | null | null | null | apps/gingerbread_shop_api/lib/gingerbread_shop.api.ex | ScrimpyCat/gingerbread_shop | ce01f5599fd1d39c79df461373c8f4f73bc144f2 | [
"BSD-2-Clause"
] | null | null | null | apps/gingerbread_shop_api/lib/gingerbread_shop.api.ex | ScrimpyCat/gingerbread_shop | ce01f5599fd1d39c79df461373c8f4f73bc144f2 | [
"BSD-2-Clause"
] | null | null | null | defmodule GingerbreadShop.API do
end
| 12.333333 | 32 | 0.864865 |
ffa2e108a70968167977747b5fad9c0fdca5c38c | 819 | exs | Elixir | mix.exs | shouya/natural-time | b5da363106c14bf5a674c8d2013e8134677a7719 | [
"MIT"
] | 3 | 2021-07-10T09:25:55.000Z | 2021-07-10T12:38:27.000Z | mix.exs | shouya/natural-time | b5da363106c14bf5a674c8d2013e8134677a7719 | [
"MIT"
] | null | null | null | mix.exs | shouya/natural-time | b5da363106c14bf5a674c8d2013e8134677a7719 | [
"MIT"
] | null | null | null | defmodule NaturalTime.MixProject do
use Mix.Project
def project do
[
app: :natural_time,
version: "0.1.1",
elixir: "~> 1.8",
description: "A simple parser for datetime in natural language",
start_permanent: Mix.env() == :prod,
deps: deps(),
package: package()
]
end
def application do
[
extra_applications: [:logger]
]
end
defp deps do
[
{:timex, "~> 3.5"},
{:nimble_parsec, "~> 1.1"},
# neeeded for publishing to hex
{:ex_doc, ">= 0.0.0", only: :dev, runtime: false}
]
end
defp package do
[
name: "natural_time",
files: ~w(lib .formatter.exs mix.exs README*),
licenses: ["MIT"],
links: %{
"GitHub" => "https://github.com/shouya/natural-time"
}
]
end
end
| 19.5 | 70 | 0.542125 |
ffa2f896302bb7ce4f45400cbab901d8e4ffa4eb | 3,735 | exs | Elixir | test/liblink/cluster/database_test.exs | Xerpa/liblink | 7b983431c5b391bb8cf182edd9ca4937601eea35 | [
"Apache-2.0"
] | 3 | 2018-10-26T12:55:15.000Z | 2019-05-03T22:41:34.000Z | test/liblink/cluster/database_test.exs | Xerpa/liblink | 7b983431c5b391bb8cf182edd9ca4937601eea35 | [
"Apache-2.0"
] | 4 | 2018-08-26T14:43:57.000Z | 2020-09-23T21:14:56.000Z | test/liblink/cluster/database_test.exs | Xerpa/liblink | 7b983431c5b391bb8cf182edd9ca4937601eea35 | [
"Apache-2.0"
] | null | null | null | # Copyright (C) 2018 Xerpa
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
# http://www.apache.org/licenses/LICENSE-2.0
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule Liblink.Cluster.DatabaseTest do
use ExUnit.Case, async: true
alias Liblink.Cluster.Database
setup do
{:ok, pid} = Database.start_link([], [])
{:ok, tid} = Database.get_tid(pid)
{:ok, [pid: pid, tid: tid]}
end
test "get on empty database", %{tid: tid} do
assert is_nil(Database.get(tid, :key))
assert :default == Database.get(tid, :key, :default)
end
test "fetch on empty database", %{tid: tid} do
assert :error == Database.fetch(tid, :key)
end
test "get after put", %{pid: pid, tid: tid} do
assert :ok == Database.put(pid, :key, :value)
assert :value == Database.get(tid, :key)
end
test "fetch after put", %{pid: pid, tid: tid} do
assert :ok == Database.put(pid, :key, :value)
assert {:ok, :value} == Database.fetch(tid, :key)
end
test "put after put", %{pid: pid, tid: tid} do
assert :ok == Database.put(pid, :key, :value0)
assert :ok == Database.put(pid, :key, :value1)
assert :value1 == Database.get(tid, :key)
end
test "get after put_new", %{pid: pid, tid: tid} do
assert :ok == Database.put_new(pid, :key, :value)
assert :value == Database.get(tid, :key)
end
test "put_new after put", %{pid: pid, tid: tid} do
assert :ok == Database.put(pid, :key, :value0)
assert :error == Database.put_new(pid, :key, :value1)
assert :value0 == Database.get(tid, :key)
end
test "put_new after delete", %{pid: pid} do
assert :ok == Database.put(pid, :key, :value)
assert :ok == Database.del(pid, :key)
assert :ok == Database.put_new(pid, :key, :value)
end
test "fetch_sync", %{pid: pid} do
assert :error = Database.fetch_sync(pid, :key)
end
test "fetch_sync after put_async", %{pid: pid} do
assert :ok == Database.put_async(pid, :key, :value)
assert {:ok, :value} = Database.fetch_sync(pid, :key)
end
test "fetch_sync after del_async", %{pid: pid} do
assert :ok == Database.put_async(pid, :key, :value)
assert :ok == Database.del_async(pid, :key)
assert :error = Database.fetch_sync(pid, :key)
end
describe "database subscriptions" do
test "receive put events", %{pid: pid} do
assert {:ok, tid} = Database.get_tid(pid)
assert :ok == Database.subscribe(pid, self())
assert :ok == Database.put(pid, :key, :value)
assert_receive {Database, ^pid, ^tid, {:put, :key, nil, :value}}
end
test "receive del events", %{pid: pid} do
assert {:ok, tid} = Database.get_tid(pid)
assert :ok == Database.put(pid, :key, :value)
assert :ok == Database.subscribe(pid, self())
assert :ok == Database.del(pid, :key)
assert_receive {Database, ^pid, ^tid, {:del, :key, :value}}
end
test "stop receiving events after unsubscribe", %{pid: pid} do
assert {:ok, tid} = Database.get_tid(pid)
assert :ok == Database.subscribe(pid, self())
assert :ok == Database.put(pid, :key, :value)
assert :ok == Database.unsubscribe(pid, self())
assert :ok == Database.del(pid, :key)
assert_receive {Database, ^pid, ^tid, {:put, :key, nil, :value}}
refute_receive {Database, ^pid, ^tid, _event}
end
end
end
| 33.648649 | 74 | 0.640964 |
ffa3210d301c433a782fe6fc5e06e4a99ec13f6e | 11,925 | ex | Elixir | lib/nerves_grove/oled_display.ex | Manolets/nerves_grove | 7a67a34ab9d897214ce78814aebb56b346edc726 | [
"Unlicense"
] | null | null | null | lib/nerves_grove/oled_display.ex | Manolets/nerves_grove | 7a67a34ab9d897214ce78814aebb56b346edc726 | [
"Unlicense"
] | null | null | null | lib/nerves_grove/oled_display.ex | Manolets/nerves_grove | 7a67a34ab9d897214ce78814aebb56b346edc726 | [
"Unlicense"
] | null | null | null | # This is free and unencumbered software released into the public domain.
defmodule Nerves.Grove.OLED.Display do
@moduledoc """
Seeed Studio [Grove OLED Display 96×96](http://wiki.seeedstudio.com/wiki/Grove_-_OLED_Display_1.12%22)
## Datasheet
http://garden.seeedstudio.com/images/8/82/SSD1327_datasheet.pdf
# Example
alias Nerves.Grove.OLED
{:ok, pid} = OLED.Display.start_link(address)
OLED.Display.reset(pid)
OLED.Display.clear(pid)
OLED.Display.set_text_position(pid, 0, 0)
OLED.Display.put_string(pid, "Hello, world")
"""
alias ElixirALE.I2C
@default_address 0x3C
@command_mode 0x80
@data_mode 0x40
# 8x8 monochrome bitmap font for ASCII code points 32-128.
@default_font {{0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x00, 0x5F, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x00, 0x07, 0x00, 0x07, 0x00, 0x00, 0x00},
{0x00, 0x14, 0x7F, 0x14, 0x7F, 0x14, 0x00, 0x00},
{0x00, 0x24, 0x2A, 0x7F, 0x2A, 0x12, 0x00, 0x00},
{0x00, 0x23, 0x13, 0x08, 0x64, 0x62, 0x00, 0x00},
{0x00, 0x36, 0x49, 0x55, 0x22, 0x50, 0x00, 0x00},
{0x00, 0x00, 0x05, 0x03, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x1C, 0x22, 0x41, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x41, 0x22, 0x1C, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x08, 0x2A, 0x1C, 0x2A, 0x08, 0x00, 0x00},
{0x00, 0x08, 0x08, 0x3E, 0x08, 0x08, 0x00, 0x00},
{0x00, 0xA0, 0x60, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x08, 0x08, 0x08, 0x08, 0x08, 0x00, 0x00},
{0x00, 0x60, 0x60, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x20, 0x10, 0x08, 0x04, 0x02, 0x00, 0x00},
{0x00, 0x3E, 0x51, 0x49, 0x45, 0x3E, 0x00, 0x00},
{0x00, 0x00, 0x42, 0x7F, 0x40, 0x00, 0x00, 0x00},
{0x00, 0x62, 0x51, 0x49, 0x49, 0x46, 0x00, 0x00},
{0x00, 0x22, 0x41, 0x49, 0x49, 0x36, 0x00, 0x00},
{0x00, 0x18, 0x14, 0x12, 0x7F, 0x10, 0x00, 0x00},
{0x00, 0x27, 0x45, 0x45, 0x45, 0x39, 0x00, 0x00},
{0x00, 0x3C, 0x4A, 0x49, 0x49, 0x30, 0x00, 0x00},
{0x00, 0x01, 0x71, 0x09, 0x05, 0x03, 0x00, 0x00},
{0x00, 0x36, 0x49, 0x49, 0x49, 0x36, 0x00, 0x00},
{0x00, 0x06, 0x49, 0x49, 0x29, 0x1E, 0x00, 0x00},
{0x00, 0x00, 0x36, 0x36, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x00, 0xAC, 0x6C, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x08, 0x14, 0x22, 0x41, 0x00, 0x00, 0x00},
{0x00, 0x14, 0x14, 0x14, 0x14, 0x14, 0x00, 0x00},
{0x00, 0x41, 0x22, 0x14, 0x08, 0x00, 0x00, 0x00},
{0x00, 0x02, 0x01, 0x51, 0x09, 0x06, 0x00, 0x00},
{0x00, 0x32, 0x49, 0x79, 0x41, 0x3E, 0x00, 0x00},
{0x00, 0x7E, 0x09, 0x09, 0x09, 0x7E, 0x00, 0x00},
{0x00, 0x7F, 0x49, 0x49, 0x49, 0x36, 0x00, 0x00},
{0x00, 0x3E, 0x41, 0x41, 0x41, 0x22, 0x00, 0x00},
{0x00, 0x7F, 0x41, 0x41, 0x22, 0x1C, 0x00, 0x00},
{0x00, 0x7F, 0x49, 0x49, 0x49, 0x41, 0x00, 0x00},
{0x00, 0x7F, 0x09, 0x09, 0x09, 0x01, 0x00, 0x00},
{0x00, 0x3E, 0x41, 0x41, 0x51, 0x72, 0x00, 0x00},
{0x00, 0x7F, 0x08, 0x08, 0x08, 0x7F, 0x00, 0x00},
{0x00, 0x41, 0x7F, 0x41, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x20, 0x40, 0x41, 0x3F, 0x01, 0x00, 0x00},
{0x00, 0x7F, 0x08, 0x14, 0x22, 0x41, 0x00, 0x00},
{0x00, 0x7F, 0x40, 0x40, 0x40, 0x40, 0x00, 0x00},
{0x00, 0x7F, 0x02, 0x0C, 0x02, 0x7F, 0x00, 0x00},
{0x00, 0x7F, 0x04, 0x08, 0x10, 0x7F, 0x00, 0x00},
{0x00, 0x3E, 0x41, 0x41, 0x41, 0x3E, 0x00, 0x00},
{0x00, 0x7F, 0x09, 0x09, 0x09, 0x06, 0x00, 0x00},
{0x00, 0x3E, 0x41, 0x51, 0x21, 0x5E, 0x00, 0x00},
{0x00, 0x7F, 0x09, 0x19, 0x29, 0x46, 0x00, 0x00},
{0x00, 0x26, 0x49, 0x49, 0x49, 0x32, 0x00, 0x00},
{0x00, 0x01, 0x01, 0x7F, 0x01, 0x01, 0x00, 0x00},
{0x00, 0x3F, 0x40, 0x40, 0x40, 0x3F, 0x00, 0x00},
{0x00, 0x1F, 0x20, 0x40, 0x20, 0x1F, 0x00, 0x00},
{0x00, 0x3F, 0x40, 0x38, 0x40, 0x3F, 0x00, 0x00},
{0x00, 0x63, 0x14, 0x08, 0x14, 0x63, 0x00, 0x00},
{0x00, 0x03, 0x04, 0x78, 0x04, 0x03, 0x00, 0x00},
{0x00, 0x61, 0x51, 0x49, 0x45, 0x43, 0x00, 0x00},
{0x00, 0x7F, 0x41, 0x41, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x02, 0x04, 0x08, 0x10, 0x20, 0x00, 0x00},
{0x00, 0x41, 0x41, 0x7F, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x04, 0x02, 0x01, 0x02, 0x04, 0x00, 0x00},
{0x00, 0x80, 0x80, 0x80, 0x80, 0x80, 0x00, 0x00},
{0x00, 0x01, 0x02, 0x04, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x20, 0x54, 0x54, 0x54, 0x78, 0x00, 0x00},
{0x00, 0x7F, 0x48, 0x44, 0x44, 0x38, 0x00, 0x00},
{0x00, 0x38, 0x44, 0x44, 0x28, 0x00, 0x00, 0x00},
{0x00, 0x38, 0x44, 0x44, 0x48, 0x7F, 0x00, 0x00},
{0x00, 0x38, 0x54, 0x54, 0x54, 0x18, 0x00, 0x00},
{0x00, 0x08, 0x7E, 0x09, 0x02, 0x00, 0x00, 0x00},
{0x00, 0x18, 0xA4, 0xA4, 0xA4, 0x7C, 0x00, 0x00},
{0x00, 0x7F, 0x08, 0x04, 0x04, 0x78, 0x00, 0x00},
{0x00, 0x00, 0x7D, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x80, 0x84, 0x7D, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x7F, 0x10, 0x28, 0x44, 0x00, 0x00, 0x00},
{0x00, 0x41, 0x7F, 0x40, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x7C, 0x04, 0x18, 0x04, 0x78, 0x00, 0x00},
{0x00, 0x7C, 0x08, 0x04, 0x7C, 0x00, 0x00, 0x00},
{0x00, 0x38, 0x44, 0x44, 0x38, 0x00, 0x00, 0x00},
{0x00, 0xFC, 0x24, 0x24, 0x18, 0x00, 0x00, 0x00},
{0x00, 0x18, 0x24, 0x24, 0xFC, 0x00, 0x00, 0x00},
{0x00, 0x00, 0x7C, 0x08, 0x04, 0x00, 0x00, 0x00},
{0x00, 0x48, 0x54, 0x54, 0x24, 0x00, 0x00, 0x00},
{0x00, 0x04, 0x7F, 0x44, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x3C, 0x40, 0x40, 0x7C, 0x00, 0x00, 0x00},
{0x00, 0x1C, 0x20, 0x40, 0x20, 0x1C, 0x00, 0x00},
{0x00, 0x3C, 0x40, 0x30, 0x40, 0x3C, 0x00, 0x00},
{0x00, 0x44, 0x28, 0x10, 0x28, 0x44, 0x00, 0x00},
{0x00, 0x1C, 0xA0, 0xA0, 0x7C, 0x00, 0x00, 0x00},
{0x00, 0x44, 0x64, 0x54, 0x4C, 0x44, 0x00, 0x00},
{0x00, 0x08, 0x36, 0x41, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x00, 0x7F, 0x00, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x41, 0x36, 0x08, 0x00, 0x00, 0x00, 0x00},
{0x00, 0x02, 0x01, 0x01, 0x02, 0x01, 0x00, 0x00},
{0x00, 0x02, 0x05, 0x05, 0x02, 0x00, 0x00, 0x00}}
use Bitwise
@spec start_link(byte) :: {:ok, pid} | {:error, any}
def start_link(address \\ @default_address) do
I2C.start_link("i2c-2", address)
end
@spec reset(pid) :: :ok
def reset(pid) do
send_commands(pid, <<0xFD, 0x12>>)
off(pid)
# 96
set_multiplex_ratio(pid, 95)
set_start_line(pid, 0)
set_display_offset(pid, 96)
set_vertical_mode(pid)
send_commands(pid, <<0xAB, 0x01>>)
# 100 nit
set_contrast_level(pid, 0x53)
send_commands(pid, <<0xB1, 0x51>>)
send_commands(pid, <<0xB3, 0x01>>)
send_commands(pid, <<0xB9>>)
send_commands(pid, <<0xBC, 0x08>>)
send_commands(pid, <<0xBE, 0x07>>)
send_commands(pid, <<0xB6, 0x01>>)
send_commands(pid, <<0xD5, 0x62>>)
set_normal_mode(pid)
set_activate_scroll(pid, false)
on(pid)
# ms
Process.sleep(100)
set_row_address(pid, 0, 95)
set_column_address(pid, 8, 8 + 47)
:ok
end
@spec on(pid) :: :ok
def on(pid) do
send_command(pid, 0xAF)
end
@spec off(pid) :: :ok
def off(pid) do
send_command(pid, 0xAE)
end
@spec clear(pid) :: :ok
def clear(pid) do
# TODO: optimize more once https://github.com/fhunleth/elixir_ale/issues/20 is fixed.
block = :erlang.list_to_binary([@data_mode, String.duplicate("\x00", 16)])
Enum.each(1..48, fn _ ->
Enum.each(1..div(96, 16), fn _ -> I2C.write(pid, block) end)
end)
end
@spec set_text_position(pid, byte, byte) :: any
def set_text_position(pid, row, column) do
set_column_address(pid, 0x08 + column * 4, 0x08 + 47)
set_row_address(pid, 0x00 + row * 8, 0x07 + row * 8)
end
# @spec put_string(pid, <<>>) :: :ok
def put_string(_pid, <<>>), do: nil
@spec put_string(pid, binary) :: any
def put_string(pid, <<head, rest::binary>>) do
put_char(pid, head)
put_string(pid, rest)
end
# @spec put_char(pid, byte) :: any
def put_char(pid, char) when is_integer(char) and char in 32..127 do
c = char - 32
Enum.each([0, 2, 4, 6], fn i ->
Enum.each(0..7, fn j ->
glyph = elem(@default_font, c)
bit1 = band(bsr(elem(glyph, i), j), 0x01)
bit2 = band(bsr(elem(glyph, i + 1), j), 0x01)
send_data(
pid,
bor(
if bit1 != 0 do
0xF0
else
0
end,
if bit2 != 0 do
0x0F
else
0
end
)
)
end)
end)
end
@spec put_char(pid, byte) :: any
def put_char(pid, char) when is_integer(char) and char in 0..31 do
# replace with a space
put_char(pid, ?\s)
end
@spec set_column_address(pid, byte, byte) :: :ok
def set_column_address(pid, start, end_) do
send_commands(pid, <<0x15, start, end_>>)
:ok
end
@spec set_row_address(pid, byte, byte) :: :ok
def set_row_address(pid, start, end_) do
send_commands(pid, <<0x75, start, end_>>)
:ok
end
@spec set_contrast_level(pid, byte) :: :ok
def set_contrast_level(pid, level) do
send_commands(pid, <<0x81, level>>)
:ok
end
@spec set_horizontal_mode(pid) :: :ok
def set_horizontal_mode(pid) do
send_commands(pid, <<0xA0, 0x42>>)
set_row_address(pid, 0, 95)
set_column_address(pid, 8, 8 + 47)
end
@spec set_vertical_mode(pid) :: :ok
def set_vertical_mode(pid) do
send_commands(pid, <<0xA0, 0x46>>)
:ok
end
@spec set_start_line(pid, 0..127) :: :ok
def set_start_line(pid, row) do
send_commands(pid, <<0xA1, row>>)
:ok
end
@spec set_display_offset(pid, 0..127) :: :ok
def set_display_offset(pid, row) do
send_commands(pid, <<0xA2, row>>)
:ok
end
@spec set_normal_mode(pid) :: :ok
def set_normal_mode(pid) do
send_command(pid, 0xA4)
end
@spec set_inverse_mode(pid) :: :ok
def set_inverse_mode(pid) do
send_command(pid, 0xA7)
end
@spec set_multiplex_ratio(pid, 16..128) :: :ok
def set_multiplex_ratio(pid, ratio) do
send_commands(pid, <<0xA8, ratio>>)
:ok
end
@spec set_activate_scroll(pid, false) :: :ok
def set_activate_scroll(pid, false) do
send_command(pid, 0x2E)
end
@spec set_activate_scroll(pid, true) :: :ok
def set_activate_scroll(pid, true) do
send_command(pid, 0x2F)
end
# @spec send_commands(pid, <<>>) :: :ok
defp send_commands(_pid, <<>>), do: :ok
@spec send_commands(pid, binary) :: :ok
defp send_commands(pid, <<head, rest::binary>>) do
send_command(pid, head)
send_commands(pid, rest)
end
@spec send_command(pid, byte) :: :ok
defp send_command(pid, command) do
I2C.write(pid, <<@command_mode, command>>)
end
@spec send_data(pid, byte) :: :ok
defp send_data(pid, data) do
I2C.write(pid, <<@data_mode, data>>)
end
end
| 37.149533 | 104 | 0.545744 |
ffa333cfaa81e30054e76679b5327ae5ec968c64 | 6,874 | exs | Elixir | test/station_test.exs | Quanttek/noaa_cli | c9ba3bac9c9fc552cf85e9674875062585ab9601 | [
"Apache-2.0"
] | null | null | null | test/station_test.exs | Quanttek/noaa_cli | c9ba3bac9c9fc552cf85e9674875062585ab9601 | [
"Apache-2.0"
] | null | null | null | test/station_test.exs | Quanttek/noaa_cli | c9ba3bac9c9fc552cf85e9674875062585ab9601 | [
"Apache-2.0"
] | null | null | null | defmodule StationTest do
use ExUnit.Case
#doctest Noaa
import Noaa.Station, only: [parse_keywords: 2,
get_key_matching: 2,
get_fully_matching: 2]
alias Noaa.Station
def station_list do
[%Station{id: "CWAV", state: "AB", name: "Sundre",
latitude: 51.76667, longitude: -114.68333},
%Station{id: "CWDZ", state: "AB", name: "Drumheller East",
latitude: 51.44504, longitude: -112.69654}, #longitude value is modified
%Station{id: "KFME", state: "MD", name: "Fort Meade / Tipton",
latitude: 39.08333, longitude: -76.76667},
%Station{id: "KMTN", state: "MD", name: "Baltimore / Martin",
latitude: 39.33333, longitude: -76.41667},
%Station{id: "KMSO", state: "MT", name: "Missoula, Missoula International Airport",
latitude: 46.92083, longitude: -114.0925},
%Station{id: "KDOV", state: "DE", name: "Dover Air Force Base",
latitude: 39.13333, longitude: -75.46667}]
end
########################
#parse_keywords/2 tests#
########################
test "Get correcty partially filled %Station{} from user input (1/2)" do
user_input = [id: "ABCD", state: "BC"]
expected_result = %Station{id: "ABCD", state: "BC", name: nil,
latitude: nil, longitude: nil}
assert(parse_keywords(user_input, %Station{}) == expected_result)
end
test "Get correcty partially filled %Station{} from user input (2/2)" do
user_input = [latitude: 75.1, longitude: -100]
expected_result = %Station{id: nil, state: nil, name: nil,
latitude: 75.1, longitude: -100}
assert(parse_keywords(user_input, %Station{}) == expected_result)
end
test "Get empty %Station{} from no user input" do
user_input = []
expected_result = %Station{id: nil, state: nil, name: nil,
latitude: nil, longitude: nil}
assert(parse_keywords(user_input, %Station{}) == expected_result)
end
#####################################
#get_fully_matching/2 tests#
#####################################
test "Get correct fully matching stations with state and longitude from station list" do
key_map = %{longitude: -114, state: "AB"}
expected_result = [%Station{id: "CWAV", state: "AB", name: "Sundre",
latitude: 51.76667, longitude: -114.68333}]
assert(get_fully_matching(station_list(), key_map) == expected_result)
end
test "Get correct fully matching stations with state and name from station list" do
key_map = %{state: "MD", name: "Baltimore / Martin"}
expected_result = [%Station{id: "KMTN", state: "MD", name: "Baltimore / Martin",
latitude: 39.33333, longitude: -76.41667}]
assert(get_fully_matching(station_list(), key_map) == expected_result)
end
test "Get correct fully matching stations with latitude and longitude from station list" do
key_map = %{longitude: -76, latitude: 39}
expected_result = [%Station{id: "KFME", state: "MD", name: "Fort Meade / Tipton",
latitude: 39.08333, longitude: -76.76667},
%Station{id: "KMTN", state: "MD", name: "Baltimore / Martin",
latitude: 39.33333, longitude: -76.41667},
%Station{id: "KDOV", state: "DE", name: "Dover Air Force Base",
latitude: 39.13333, longitude: -75.46667}]
assert(get_fully_matching(station_list(), key_map) == expected_result)
end
test "Get correct fully matching stations with latitude, longitude and state from station list" do
key_map = %{longitude: -76, latitude: 39, state: "DE"}
expected_result = [%Station{id: "KDOV", state: "DE", name: "Dover Air Force Base",
latitude: 39.13333, longitude: -75.46667}]
assert(get_fully_matching(station_list(), key_map) == expected_result)
end
###################################
#get_key_matching/2 tests#
###################################
test "Get correct id matching station from station list" do
expected_result = [%Station{id: "KFME", state: "MD", name: "Fort Meade / Tipton",
latitude: 39.08333, longitude: -76.76667}]
assert(get_key_matching(station_list(), [id: "KFME"]) == expected_result)
end
test "Get correct state matching stations from station list" do
expected_result = [%Station{id: "KFME", state: "MD", name: "Fort Meade / Tipton",
latitude: 39.08333, longitude: -76.76667},
%Station{id: "KMTN", state: "MD", name: "Baltimore / Martin",
latitude: 39.33333, longitude: -76.41667}]
assert(get_key_matching(station_list(), [state: "MD"]) == expected_result)
end
test "Get correct name matching station from station list" do
expected_result = [%Station{id: "CWAV", state: "AB", name: "Sundre",
latitude: 51.76667, longitude: -114.68333}]
assert(get_key_matching(station_list(), [name: "Sundre"]) == expected_result)
end
test "Get correct latitude and longitude matching stations from station list" do
expected_result = [%Station{id: "KFME", state: "MD", name: "Fort Meade / Tipton",
latitude: 39.08333, longitude: -76.76667},
%Station{id: "KMTN", state: "MD", name: "Baltimore / Martin",
latitude: 39.33333, longitude: -76.41667},
%Station{id: "KDOV", state: "DE", name: "Dover Air Force Base",
latitude: 39.13333, longitude: -75.46667}]
assert(get_key_matching(station_list(), [latitude: 39, longitude: -76])
== expected_result)
end
test "Get correct latitude matching stations from station list" do
expected_result = [%Station{id: "CWAV", state: "AB", name: "Sundre",
latitude: 51.76667, longitude: -114.68333},
%Station{id: "CWDZ", state: "AB", name: "Drumheller East",
latitude: 51.44504, longitude: -112.69654}]
assert(get_key_matching(station_list(), [latitude: 51])
== expected_result)
end
test "Get correct longitude matching stations from station list" do
expected_result = [%Station{id: "CWAV", state: "AB", name: "Sundre",
latitude: 51.76667, longitude: -114.68333},
%Station{id: "CWDZ", state: "AB", name: "Drumheller East",
latitude: 51.44504, longitude: -112.69654},
%Station{id: "KMSO", state: "MT", name: "Missoula, Missoula International Airport",
latitude: 46.92083, longitude: -114.0925}]
assert(get_key_matching(station_list(), [longitude: -113.69])
== expected_result)
end
end
| 42.695652 | 105 | 0.586122 |
ffa3495c4e456c6850df97a9baa59e6719c0e80c | 1,100 | ex | Elixir | repository_summary/lib/repository_summary/application.ex | joabehenrique/repository-summary | 6a7c28098a9faf11386b43996bf5ad33b50cddf2 | [
"MIT"
] | null | null | null | repository_summary/lib/repository_summary/application.ex | joabehenrique/repository-summary | 6a7c28098a9faf11386b43996bf5ad33b50cddf2 | [
"MIT"
] | null | null | null | repository_summary/lib/repository_summary/application.ex | joabehenrique/repository-summary | 6a7c28098a9faf11386b43996bf5ad33b50cddf2 | [
"MIT"
] | null | null | null | defmodule RepositorySummary.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
children = [
# Start the Ecto repository
RepositorySummary.Repo,
# Start the Telemetry supervisor
RepositorySummaryWeb.Telemetry,
# Start the PubSub system
{Phoenix.PubSub, name: RepositorySummary.PubSub},
# Start the Endpoint (http/https)
RepositorySummaryWeb.Endpoint
# Start a worker by calling: RepositorySummary.Worker.start_link(arg)
# {RepositorySummary.Worker, arg}
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: RepositorySummary.Supervisor]
Supervisor.start_link(children, opts)
end
# Tell Phoenix to update the endpoint configuration
# whenever the application is updated.
def config_change(changed, _new, removed) do
RepositorySummaryWeb.Endpoint.config_change(changed, removed)
:ok
end
end
| 31.428571 | 75 | 0.726364 |
ffa36c621099686a473dd052d8a13d854f57cb6a | 1,761 | exs | Elixir | clients/apigee/mix.exs | MShaffar19/elixir-google-api | aac92cd85f423e6d08c9571ee97cf21545df163f | [
"Apache-2.0"
] | null | null | null | clients/apigee/mix.exs | MShaffar19/elixir-google-api | aac92cd85f423e6d08c9571ee97cf21545df163f | [
"Apache-2.0"
] | null | null | null | clients/apigee/mix.exs | MShaffar19/elixir-google-api | aac92cd85f423e6d08c9571ee97cf21545df163f | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Apigee.Mixfile do
use Mix.Project
@version "0.19.1"
def project() do
[
app: :google_api_apigee,
version: @version,
elixir: "~> 1.6",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
description: description(),
package: package(),
deps: deps(),
source_url: "https://github.com/googleapis/elixir-google-api/tree/master/clients/apigee"
]
end
def application() do
[extra_applications: [:logger]]
end
defp deps() do
[
{:google_gax, "~> 0.4"},
{:ex_doc, "~> 0.16", only: :dev}
]
end
defp description() do
"""
Apigee API client library.
"""
end
defp package() do
[
files: ["lib", "mix.exs", "README*", "LICENSE"],
maintainers: ["Jeff Ching", "Daniel Azuma"],
licenses: ["Apache 2.0"],
links: %{
"GitHub" => "https://github.com/googleapis/elixir-google-api/tree/master/clients/apigee",
"Homepage" => "https://cloud.google.com/apigee-api-management/"
}
]
end
end
| 26.283582 | 97 | 0.646224 |
ffa3a257b167aecb58044eac518b34bd0edd9838 | 1,649 | exs | Elixir | elixir/auction_umbrella/config/config.exs | kendru/darwin | 67096eb900bc36d30bf5ce36d38aaa6db86a29a2 | [
"MIT"
] | 5 | 2021-11-17T04:37:39.000Z | 2022-01-02T06:43:23.000Z | elixir/auction_umbrella/config/config.exs | kendru/darwin | 67096eb900bc36d30bf5ce36d38aaa6db86a29a2 | [
"MIT"
] | 3 | 2021-05-21T21:50:11.000Z | 2021-11-21T14:34:53.000Z | elixir/auction_umbrella/config/config.exs | kendru/darwin | 67096eb900bc36d30bf5ce36d38aaa6db86a29a2 | [
"MIT"
] | 2 | 2021-11-16T14:14:05.000Z | 2021-12-31T02:01:06.000Z | # This file is responsible for configuring your umbrella
# and **all applications** and their dependencies with the
# help of the Config module.
#
# Note that all applications in your umbrella share the
# same configuration and dependencies, which is why they
# all use the same configuration file. If you want different
# configurations or dependencies per app, it is best to
# move said applications out of the umbrella.
import Config
config :auction,
ecto_repos: [Auction.Repo]
config :auction, Auction.Repo,
username: "auction",
password: "s3cr3t",
database: "auction_dev",
hostname: "localhost",
port: "15432"
# show_sensitive_data_on_connection_error: true,
# pool_size: 10
config :auction_web,
generators: [context_app: false]
# Configures the endpoint
config :auction_web, AuctionWeb.Endpoint,
url: [host: "localhost"],
secret_key_base: "z62EWOOMo4RvEnVP5dKxoJb6Xq7RDvGQmYyKqgpxPD2tTSMz3hlFGDu1zpb8mE8W",
render_errors: [view: AuctionWeb.ErrorView, accepts: ~w(html json), layout: false],
pubsub_server: AuctionWeb.PubSub,
live_view: [signing_salt: "+dlhqlUO"]
# Sample configuration:
#
# config :logger, :console,
# level: :info,
# format: "$date $time [$level] $metadata$message\n",
# metadata: [:user_id]
#
# Configures Elixir's Logger
config :logger, :console,
format: "$time $metadata[$level] $message\n",
metadata: [:request_id]
# Use Jason for JSON parsing in Phoenix
config :phoenix, :json_library, Jason
# Import environment specific config. This must remain at the bottom
# of this file so it overrides the configuration defined above.
import_config "#{Mix.env()}.exs"
| 30.537037 | 86 | 0.741055 |
ffa3bc0d93e817f50ccba77eede495916e347bed | 2,454 | exs | Elixir | config/prod.exs | jeyemwey/radiator-spark | 2fba0a84eb43ab1d58e8ec58c6a07f64adf9cb9d | [
"MIT"
] | 1 | 2021-03-02T16:59:40.000Z | 2021-03-02T16:59:40.000Z | config/prod.exs | optikfluffel/radiator | b1a1b966296fa6bf123e3a2455009ff52099ace6 | [
"MIT"
] | null | null | null | config/prod.exs | optikfluffel/radiator | b1a1b966296fa6bf123e3a2455009ff52099ace6 | [
"MIT"
] | null | null | null | use Mix.Config
# For production, don't forget to configure the url host
# to something meaningful, Phoenix uses this information
# when generating URLs.
#
# Note we also include the path to a cache manifest
# containing the digested version of static files. This
# manifest is generated by the `mix phx.digest` task,
# which you should run after static files are built and
# before starting your production server.
config :radiator, RadiatorWeb.Endpoint,
http: [:inet6, port: System.get_env("PORT") || 4000],
url: [host: "example.com", port: 80],
cache_static_manifest: "priv/static/cache_manifest.json"
# Do not print debug messages in production
config :logger, level: :info
# ## SSL Support
#
# To get SSL working, you will need to add the `https` key
# to the previous section and set your `:url` port to 443:
#
# config :radiator, RadiatorWeb.Endpoint,
# ...
# url: [host: "example.com", port: 443],
# https: [
# :inet6,
# port: 443,
# cipher_suite: :strong,
# keyfile: System.get_env("SOME_APP_SSL_KEY_PATH"),
# certfile: System.get_env("SOME_APP_SSL_CERT_PATH")
# ]
#
# The `cipher_suite` is set to `:strong` to support only the
# latest and more secure SSL ciphers. This means old browsers
# and clients may not be supported. You can set it to
# `:compatible` for wider support.
#
# `:keyfile` and `:certfile` expect an absolute path to the key
# and cert in disk or a relative path inside priv, for example
# "priv/ssl/server.key". For all supported SSL configuration
# options, see https://hexdocs.pm/plug/Plug.SSL.html#configure/1
#
# We also recommend setting `force_ssl` in your endpoint, ensuring
# no data is ever sent via http, always redirecting to https:
#
# config :radiator, RadiatorWeb.Endpoint,
# force_ssl: [hsts: true]
#
# Check `Plug.SSL` for all available options in `force_ssl`.
# ## Using releases (distillery)
#
# If you are doing OTP releases, you need to instruct Phoenix
# to start the server for all endpoints:
#
# config :phoenix, :serve_endpoints, true
#
# Alternatively, you can configure exactly which server to
# start per endpoint:
#
# config :radiator, RadiatorWeb.Endpoint, server: true
#
# Note you can't rely on `System.get_env/1` when using releases.
# See the releases documentation accordingly.
# Finally import the config/prod.secret.exs which should be versioned
# separately.
import_config "prod.secret.exs"
| 34.083333 | 69 | 0.713936 |
ffa4174b555c56415ef05fbea444c707183cc06a | 6,000 | ex | Elixir | lib/astarte_flow/blocks/sorter.ex | matt-mazzucato/astarte_flow | e8644b5a27edf325977f5bced9a919f20e289ee2 | [
"Apache-2.0"
] | null | null | null | lib/astarte_flow/blocks/sorter.ex | matt-mazzucato/astarte_flow | e8644b5a27edf325977f5bced9a919f20e289ee2 | [
"Apache-2.0"
] | null | null | null | lib/astarte_flow/blocks/sorter.ex | matt-mazzucato/astarte_flow | e8644b5a27edf325977f5bced9a919f20e289ee2 | [
"Apache-2.0"
] | null | null | null | #
# This file is part of Astarte.
#
# Copyright 2020 Ispirata Srl
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
#
defmodule Astarte.Flow.Blocks.Sorter do
@moduledoc """
This block is a stateful realtime block, which reorders an out of order sequence of messages,
and optionally removes any duplicate message.
"""
use GenStage
alias Astarte.Flow.Message
defmodule Config do
defstruct policy: :duplicates,
delay_us: 1_000_000
@type t() :: %__MODULE__{}
@type option() :: {:deduplicate, boolean()} | {:delay_ms, pos_integer()}
@doc """
Initialize config from a keyword list.
## Options
* `:deduplicate` - true when messages deduplication is enabled, otherwise false.
Two messages are duplicate when they have same timestamp and value (any other field is ignored).
* `:delay_ms` - the amount of time the message is kept for reorder and deduplicate operations.
"""
@spec from_keyword(list(option())) :: {:ok, t()}
def from_keyword(kl) do
deduplicate = Keyword.get(kl, :deduplicate, false)
delay_ms = Keyword.get(kl, :delay_ms, 1000)
policy =
if deduplicate do
:unique
else
:duplicates
end
{:ok,
%Config{
policy: policy,
delay_us: delay_ms * 1000
}}
end
end
defmodule State do
defstruct last_timestamp: 0,
queues: %{}
@type t() :: %__MODULE__{}
end
def start_link(opts) do
GenStage.start_link(__MODULE__, opts)
end
# GenStage callbacks
@impl true
def init(opts) do
with {:ok, config} <- Config.from_keyword(opts) do
{:ok, _tref} = :timer.send_interval(1, :timer_timeout)
{:producer_consumer, {config, %State{}}, dispatcher: GenStage.BroadcastDispatcher}
else
{:error, reason} ->
{:stop, reason}
end
end
@impl true
def handle_info(:timer_timeout, state) do
ts =
DateTime.utc_now()
|> DateTime.to_unix(:microsecond)
{messages, state} = take_ready(ts, state)
{:noreply, messages, state}
end
@doc """
Take any ready message.
"""
@spec take_ready(pos_integer(), {Config.t(), State.t()}) ::
{list(Message.t()), {Config.t(), State.t()}}
def take_ready(ts, {config, %{queues: queues} = _state}) do
%Config{
policy: policy,
delay_us: delay
} = config
{queues, messages} =
Enum.reduce(queues, {queues, []}, fn {key, queue}, {queues_acc, msg_acc} ->
{queue, backwards_msg_acc} = dequeue_all_ready(policy, queue, ts - delay, msg_acc)
msg_acc = Enum.reverse(backwards_msg_acc)
queues_acc =
if Prioqueue.empty?(queue) do
Map.delete(queues_acc, key)
else
Map.put(queues_acc, key, queue)
end
{queues_acc, msg_acc}
end)
new_state = %State{last_timestamp: ts, queues: queues}
{messages, {config, new_state}}
end
defp dequeue_all_ready(:duplicates, queue, ts, msg_acc) do
case Prioqueue.peek_min(queue) do
{:ok, %Message{timestamp: peek_ts} = _peek_msg} when peek_ts <= ts ->
{:ok, {peek, queue}} = Prioqueue.extract_min(queue)
dequeue_all_ready(:duplicates, queue, ts, [peek | msg_acc])
{:ok, %Message{timestamp: peek_ts} = _peek_msg} when peek_ts > ts ->
{queue, msg_acc}
{:error, :empty} ->
{queue, msg_acc}
end
end
defp dequeue_all_ready(:unique, queue, ts, msg_acc) do
{last_ts, last_data} =
case msg_acc do
[] -> {nil, nil}
[%Message{timestamp: last_ts, data: last_data} | _tail] -> {last_ts, last_data}
end
case Prioqueue.peek_min(queue) do
{:ok, %Message{timestamp: ^last_ts, data: ^last_data}} ->
{:ok, {_peek, queue}} = Prioqueue.extract_min(queue)
dequeue_all_ready(:unique, queue, ts, msg_acc)
{:ok, %Message{timestamp: peek_ts} = _peek_msg} when peek_ts <= ts ->
{:ok, {peek, queue}} = Prioqueue.extract_min(queue)
dequeue_all_ready(:unique, queue, ts, [peek | msg_acc])
{:ok, %Message{timestamp: peek_ts} = _peek_msg} when peek_ts > ts ->
{queue, msg_acc}
{:error, :empty} ->
{queue, msg_acc}
end
end
@impl true
def handle_events(events, _from, {config, state}) do
new_state =
Enum.reduce(events, state, fn message, acc ->
process_message(message, config, acc)
end)
{:noreply, [], {config, new_state}}
end
@doc """
Process a message and stores it into the block state.
"""
@spec process_message(Message.t(), Config.t(), State.t()) :: State.t()
def process_message(
%Message{} = msg,
config,
%{last_timestamp: last_ts, queues: queues} = state
) do
queues = maybe_insert(queues, msg, last_ts - config.delay_us)
%State{state | queues: queues}
end
defp maybe_insert(queues, %Message{} = msg, min_ts) do
if msg.timestamp > min_ts do
store(queues, msg)
else
queues
end
end
defp store(queues, %Message{key: key} = msg) do
stream_pq =
with {:ok, pq} <- Map.fetch(queues, key) do
pq
else
:error -> Prioqueue.new([], cmp_fun: &cmp_fun/2)
end
stream_pq = Prioqueue.insert(stream_pq, msg)
Map.put(queues, key, stream_pq)
end
defp cmp_fun(%Message{timestamp: ts1}, %Message{timestamp: ts2}) do
cond do
ts1 < ts2 -> :lt
ts1 > ts2 -> :gt
true -> :eq
end
end
end
| 26.785714 | 104 | 0.620833 |
ffa41a25a7c4b3c11c4bfcdbdbb9ff4d124935be | 1,723 | ex | Elixir | test/support/conn_case.ex | imsoulfly/local_hex_repo | 18fca51c44b3dd01d27684877b3c7bc13471f548 | [
"Apache-2.0"
] | 5 | 2021-11-13T13:58:06.000Z | 2022-03-26T03:47:57.000Z | test/support/conn_case.ex | imsoulfly/local_hex_repo | 18fca51c44b3dd01d27684877b3c7bc13471f548 | [
"Apache-2.0"
] | 3 | 2021-11-16T18:45:45.000Z | 2021-12-05T13:58:25.000Z | test/support/conn_case.ex | imsoulfly/local_hex_repo | 18fca51c44b3dd01d27684877b3c7bc13471f548 | [
"Apache-2.0"
] | null | null | null | defmodule LocalHexWeb.ConnCase do
@moduledoc """
This module defines the test case to be used by
tests that require setting up a connection.
Such tests rely on `Phoenix.ConnTest` and also
import other functionality to make it easier
to build common data structures and query the data layer.
Finally, if the test case interacts with the database,
we enable the SQL sandbox, so changes done to the database
are reverted at the end of every test. If you are using
PostgreSQL, you can even run database tests asynchronously
by setting `use LocalHexWeb.ConnCase, async: true`, although
this option is not recommended for other databases.
"""
use ExUnit.CaseTemplate
# alias Ecto.Adapters.SQL.Sandbox
using do
quote do
# Import conveniences for testing with connections
import Plug.Conn
import Phoenix.ConnTest
import LocalHexWeb.ConnCase
alias LocalHexWeb.Router.Helpers, as: Routes
# The default endpoint for testing
@endpoint LocalHexWeb.Endpoint
end
end
setup _tags do
# pid = Sandbox.start_owner!(LocalHex.Repo, shared: not tags[:async])
on_exit(fn ->
root_path(repository_config().store)
|> File.rm_rf()
# Sandbox.stop_owner(pid)
end)
{:ok, conn: Phoenix.ConnTest.build_conn(), repository: repository_config()}
end
def repository_config do
Application.fetch_env!(:local_hex, :repositories)
|> Keyword.fetch!(:main)
|> LocalHex.Repository.init()
end
def path(repository, path) do
Path.join([root_path(repository.store), repository.name | List.wrap(path)])
end
def root_path({_module, root: path}) do
Path.join(Application.app_dir(:local_hex), path)
end
end
| 27.790323 | 79 | 0.714452 |
ffa41a38b8695208d5967a8876069492269058a4 | 878 | exs | Elixir | mix.exs | smeevil/phoenix-jsroutes | b5351527302acff6258acd321a4f055e6fc71371 | [
"MIT"
] | null | null | null | mix.exs | smeevil/phoenix-jsroutes | b5351527302acff6258acd321a4f055e6fc71371 | [
"MIT"
] | null | null | null | mix.exs | smeevil/phoenix-jsroutes | b5351527302acff6258acd321a4f055e6fc71371 | [
"MIT"
] | null | null | null | defmodule PhoenixJsroutes.Mixfile do
use Mix.Project
def project do
[
app: :phoenix_jsroutes,
version: "0.0.4",
elixir: "~> 1.2",
build_embedded: Mix.env() == :prod,
start_permanent: Mix.env() == :prod,
description: description(),
package: package(),
deps: deps()
]
end
def application do
[applications: [:logger]]
end
defp deps do
[{:phoenix, ">= 1.0.0", only: :test}, {:execjs, "~> 1.1.3", only: :test}]
end
defp description do
"""
Brings phoenix router helpers to your javascript code.
"""
end
defp package do
[
name: :phoenix_jsroutes,
files: ["lib", "priv", "mix.exs", "README*", "LICENSE*"],
licenses: ["MIT"],
maintainers: ["Tiago Henrique Engel"],
links: %{"GitHub" => "https://github.com/tiagoengel/phoenix-jsroutes"}
]
end
end
| 21.414634 | 77 | 0.571754 |
ffa41b4997acba06f15d18e5c663cd9e5ed9947f | 1,546 | exs | Elixir | test/blue_jet_web/controllers/storefront/order_controller_test.exs | freshcom/freshcom-api | 4f2083277943cf4e4e8fd4c4d443c7309f285ad7 | [
"BSD-3-Clause"
] | 44 | 2018-05-09T01:08:57.000Z | 2021-01-19T07:25:26.000Z | test/blue_jet_web/controllers/storefront/order_controller_test.exs | freshcom/freshcom-api | 4f2083277943cf4e4e8fd4c4d443c7309f285ad7 | [
"BSD-3-Clause"
] | 36 | 2018-05-08T23:59:54.000Z | 2018-09-28T13:50:30.000Z | test/blue_jet_web/controllers/storefront/order_controller_test.exs | freshcom/freshcom-api | 4f2083277943cf4e4e8fd4c4d443c7309f285ad7 | [
"BSD-3-Clause"
] | 9 | 2018-05-09T14:09:19.000Z | 2021-03-21T21:04:04.000Z | # defmodule BlueJetWeb.OrderControllerTest do
# use BlueJetWeb.ConnCase
# import BlueJet.Identity.TestHelper
# setup do
# conn =
# build_conn()
# |> put_req_header("accept", "application/vnd.api+json")
# |> put_req_header("content-type", "application/vnd.api+json")
# %{conn: conn}
# end
# # Create an order
# describe "POST /v1/orders" do
# test "without PAT", %{conn: conn} do
# conn = get(conn, "/v1/users")
# assert conn.status == 401
# end
# test "with no attributes", %{conn: conn} do
# conn = post(conn, "/v1/orders", %{
# "data" => %{
# "type" => "Order"
# }
# })
# response = json_response(conn, 422)
# assert length(response["errors"]) == 3
# end
# end
# # Retrieve an order
# describe "GET /v1/orders/:id" do
# test "without PAT", %{conn: conn} do
# conn = get(conn, "/v1/orders/#{Ecto.UUID.generate()}")
# assert conn.status == 401
# end
# end
# # Update an order
# describe "PATCH /v1/orders/:id" do
# test "without PAT", %{conn: conn} do
# conn = get(conn, "/v1/orders/#{Ecto.UUID.generate()}")
# assert conn.status == 401
# end
# end
# # Delete an order
# describe "DELETE /v1/orders/:id" do
# test "without PAT", %{conn: conn} do
# end
# end
# # List order
# describe "GET /v1/orders" do
# test "without UAT", %{conn: conn} do
# conn = get(conn, "/v1/orders")
# assert conn.status == 401
# end
# end
# end
| 22.735294 | 69 | 0.548512 |
ffa4264f487e1bc1791799dcd18ebedc9f0343e1 | 592 | exs | Elixir | exercises/practice/rotational-cipher/mix.exs | devtayls/elixir | 67824de8209ff1b6ed2f736deedfb5bd815130ca | [
"MIT"
] | 343 | 2017-06-22T16:28:28.000Z | 2022-03-25T21:33:32.000Z | exercises/practice/rotational-cipher/mix.exs | devtayls/elixir | 67824de8209ff1b6ed2f736deedfb5bd815130ca | [
"MIT"
] | 583 | 2017-06-19T10:48:40.000Z | 2022-03-28T21:43:12.000Z | exercises/practice/rotational-cipher/mix.exs | devtayls/elixir | 67824de8209ff1b6ed2f736deedfb5bd815130ca | [
"MIT"
] | 228 | 2017-07-05T07:09:32.000Z | 2022-03-27T08:59:08.000Z | defmodule RotationalCipher.MixProject do
use Mix.Project
def project do
[
app: :rotational_cipher,
version: "0.1.0",
# elixir: "~> 1.8",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
# Run "mix help deps" to learn about dependencies.
defp deps do
[
# {:dep_from_hexpm, "~> 0.3.0"},
# {:dep_from_git, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"}
]
end
end
| 20.413793 | 87 | 0.587838 |
ffa45bd5242da95f13ee2299e08c5d29c20c3671 | 13,477 | ex | Elixir | lib/sobelow.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null | lib/sobelow.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null | lib/sobelow.ex | tmecklem/sobelow | 76b441da408b0156a05fa208a8426c63f3536fe5 | [
"Apache-2.0"
] | null | null | null | defmodule Sobelow do
@moduledoc """
Sobelow is a static analysis tool for discovering
vulnerabilities in Phoenix applications.
"""
@v Mix.Project.config()[:version]
@submodules [
Sobelow.XSS,
Sobelow.SQL,
Sobelow.Traversal,
Sobelow.RCE,
Sobelow.Misc,
Sobelow.Config,
Sobelow.CI,
Sobelow.DOS,
Sobelow.Vuln
]
alias Sobelow.Utils
alias Sobelow.Config
alias Sobelow.Vuln
alias Sobelow.FindingLog
alias Sobelow.MetaLog
alias Mix.Shell.IO, as: MixIO
# Remove directory structure check for release candidate
# prior to 1.0
def run() do
project_root = get_env(:root) <> "/"
if !get_env(:private), do: version_check(project_root)
app_name = Utils.get_app_name(project_root <> "mix.exs")
if !is_binary(app_name), do: file_error()
{web_root, lib_root} = get_root(app_name, project_root)
root =
if String.ends_with?(web_root, "./") do
web_root <> "web/"
else
lib_root
end
ignored = get_ignored()
allowed = @submodules -- ignored
# Pulling out function definitions before kicking
# off the test pipeline to avoid dumping warning
# messages into the findings output.
root_meta_files = get_meta_files(root)
template_meta_files = get_meta_templates(root)
# If web_root ends with the app_name, then it is the
# more recent version of Phoenix. Meaning, all files are
# in the lib directory, so we don't need to re-scan
# lib_root separately.
phx_post_1_2? =
String.ends_with?(web_root, "/#{app_name}/") ||
String.ends_with?(web_root, "/#{app_name}_web/")
libroot_meta_files = if !phx_post_1_2?, do: get_meta_files(lib_root), else: []
default_router = get_router(app_name, web_root)
routers = get_routers(root_meta_files ++ libroot_meta_files, default_router)
if Enum.empty?(routers), do: no_router()
FindingLog.start_link()
MetaLog.start_link()
MetaLog.add_templates(template_meta_files)
# This is where the core testing-pipeline starts.
#
# - Print banner
# - Check configuration
# - Remove config check from "allowed" modules
# - Scan funcs from the root
# - Scan funcs from the libroot
if not (format() in ["quiet", "compact", "json"]), do: IO.puts(:stderr, print_banner())
Application.put_env(:sobelow, :app_name, app_name)
if Enum.member?(allowed, Config), do: Config.fetch(project_root, routers)
if Enum.member?(allowed, Vuln), do: Vuln.get_vulns(project_root)
allowed = allowed -- [Config, Vuln]
Enum.each(root_meta_files, fn meta_file ->
meta_file.def_funs
|> combine_skips()
|> Enum.each(&get_fun_vulns(&1, meta_file, root, allowed))
end)
Enum.each(libroot_meta_files, fn meta_file ->
meta_file.def_funs
|> combine_skips()
|> Enum.each(&get_fun_vulns(&1, meta_file, "", allowed))
end)
# Future template handling will look something like this.
# XSS checks should be fully handled earlier, and excluded from
# the second template pass.
template_meta_files = MetaLog.get_templates()
Enum.each(template_meta_files, fn {_, meta_file} ->
if Sobelow.XSS in allowed, do: Sobelow.XSS.get_template_vulns(meta_file)
end)
# Enum.each(template_meta_files, fn {_, meta_file} ->
# get_fun_vulns(meta_file.ast, meta_file, root, allowed)
# end)
if format() != "txt" do
print_output()
else
IO.puts(:stderr, "... SCAN COMPLETE ...\n")
end
exit_with_status()
end
defp print_output() do
details =
case format() do
"json" ->
FindingLog.json(@v)
"quiet" ->
FindingLog.quiet()
_ ->
nil
end
if !is_nil(details), do: IO.puts(details)
end
defp exit_with_status() do
exit_on = get_env(:exit_on)
finding_logs = FindingLog.log()
high_count = length(finding_logs[:high])
medium_count = length(finding_logs[:medium])
low_count = length(finding_logs[:low])
status =
case exit_on do
:high ->
if high_count > 0, do: 1
:medium ->
if high_count + medium_count > 0, do: 1
:low ->
if high_count + medium_count + low_count > 0, do: 1
_ ->
0
end
exit_status = if is_nil(status), do: 0, else: status
System.halt(exit_status)
end
def details() do
mod =
get_env(:details)
|> get_mod
if is_nil(mod) do
MixIO.error("A valid module was not selected.")
else
apply(mod, :details, [])
end
end
def log_finding(finding, severity) do
FindingLog.add(finding, severity)
end
def all_details() do
@submodules
|> Enum.each(&apply(&1, :details, []))
end
def save_config(conf_file) do
conf = """
[
verbose: #{get_env(:verbose)},
private: #{get_env(:private)},
skip: #{get_env(:skip)},
router: "#{get_env(:router)}",
exit: "#{get_env(:exit_on)}",
format: "#{get_env(:format)}",
ignore: ["#{Enum.join(get_env(:ignored), "\", \"")}"],
ignore_files: ["#{Enum.join(get_env(:ignored_files), "\", \"")}"]
]
"""
yes? =
if File.exists?(conf_file) do
MixIO.yes?("The file .sobelow-conf already exists. Are you sure you want to overwrite?")
else
true
end
if yes? do
File.write!(conf_file, conf)
MixIO.info("Updated .sobelow-conf")
end
end
def format() do
get_env(:format)
end
def get_env(key) do
Application.get_env(:sobelow, key)
end
defp print_banner() do
"""
##############################################
# #
# Running Sobelow - v#{@v} #
# Created by Griffin Byatt - @griffinbyatt #
# NCC Group - https://nccgroup.trust #
# #
##############################################
"""
end
defp get_root(app_name, project_root) do
lib_root = project_root <> "lib/"
cond do
File.dir?(project_root <> "lib/#{app_name}_web") ->
# New phoenix RC structure
{lib_root <> "#{app_name}_web/", lib_root}
File.dir?(project_root <> "lib/#{app_name}/web/") ->
# RC 1 phx dir structure
{lib_root <> "#{app_name}/", lib_root}
true ->
# Original dir structure
{project_root <> "./", lib_root}
end
end
defp get_router(app_name, web_root) do
router_path =
if Path.basename(web_root) == "#{app_name}_web" do
"router.ex"
else
"web/router.ex"
end
case get_env(:router) do
nil -> web_root <> router_path
"" -> web_root <> router_path
router -> router
end
|> Path.expand()
end
defp get_routers(meta_files, router) do
routers =
Enum.flat_map(meta_files, fn meta_file ->
case meta_file.is_router? do
true -> [meta_file.file_path]
_ -> []
end
end)
if File.exists?(router) do
Enum.uniq(routers ++ [router])
else
routers
end
end
defp get_meta_templates(root) do
ignored_files = get_env(:ignored_files)
Utils.template_files(root)
|> Enum.reject(&is_ignored_file(&1, ignored_files))
|> Enum.map(&get_template_meta/1)
|> Map.new()
end
defp get_template_meta(filename) do
meta_funs = Utils.get_meta_template_funs(filename)
raw = meta_funs.raw
ast = meta_funs.ast
filename = Utils.normalize_path(filename)
{
filename,
%{
filename: filename,
raw: raw,
ast: [ast],
is_controller?: false
}
}
end
defp get_meta_files(root) do
ignored_files = get_env(:ignored_files)
Utils.all_files(root)
|> Enum.reject(&is_ignored_file(&1, ignored_files))
|> Enum.map(&get_file_meta/1)
end
defp get_file_meta(filename) do
ast = Utils.ast(filename)
meta_funs = Utils.get_meta_funs(ast)
def_funs = meta_funs.def_funs
use_funs = meta_funs.use_funs
%{
filename: Utils.normalize_path(filename),
file_path: Path.expand(filename),
def_funs: def_funs,
is_controller?: Utils.is_controller?(use_funs),
is_router?: Utils.is_router?(use_funs)
}
end
defp get_fun_vulns({fun, skips}, meta_file, web_root, mods) do
skip_mods =
skips
|> Enum.map(&get_mod/1)
Enum.each(mods -- skip_mods, fn mod ->
params = [fun, meta_file, web_root, skip_mods]
apply(mod, :get_vulns, params)
end)
end
defp get_fun_vulns(fun, meta_file, web_root, mods) do
get_fun_vulns({fun, []}, meta_file, web_root, mods)
end
defp combine_skips([]), do: []
defp combine_skips([head | tail] = funs) do
if get_env(:skip), do: combine_skips(head, tail), else: funs
end
defp combine_skips(prev, []), do: [prev]
defp combine_skips(prev, [{:@, _, [{:sobelow_skip, _, [skips]}]} | []]), do: [{prev, skips}]
defp combine_skips(prev, [{:@, _, [{:sobelow_skip, _, [skips]}]} | tail]) do
[h | t] = tail
[{prev, skips} | combine_skips(h, t)]
end
defp combine_skips(prev, rest) do
[h | t] = rest
[prev | combine_skips(h, t)]
end
defp no_router() do
message = """
WARNING: Sobelow cannot find the router. If this is a Phoenix application
please use the `--router` flag to specify the router's location.
"""
IO.puts(:stderr, message)
ignored = get_env(:ignored)
Application.put_env(
:sobelow,
:ignored,
ignored ++ ["Config.CSRF", "Config.Headers", "Config.CSP"]
)
end
defp file_error() do
MixIO.error("This does not appear to be a Phoenix application.")
System.halt(0)
end
defp version_check(project_root) do
cfile = project_root <> ".sobelow"
time = DateTime.utc_now() |> DateTime.to_unix()
if File.exists?(cfile) do
{timestamp, _} =
case File.read!(cfile) do
"sobelow-" <> timestamp -> Integer.parse(timestamp)
_ -> file_error()
end
if time - 12 * 60 * 60 > timestamp do
maybe_prompt_update(time, cfile)
end
else
maybe_prompt_update(time, cfile)
end
end
defp get_sobelow_version() do
# Modeled after old Mix.Utils.read_path
{:ok, _} = Application.ensure_all_started(:ssl)
{:ok, _} = Application.ensure_all_started(:inets)
{:ok, _} = :inets.start(:httpc, [{:profile, :sobelow}])
# update to sobelow.io for future versions
url = 'https://griffinbyatt.com/static/sobelow-version'
IO.puts(:stderr, "Checking Sobelow version...\n")
case :httpc.request(:get, {url, []}, [{:timeout, 10000}], []) do
{:ok, {{_, 200, _}, _, vsn}} ->
Version.parse!(String.trim(to_string(vsn)))
_ ->
MixIO.error("Error fetching version number.\n")
@v
end
after
:inets.stop(:httpc, :sobelow)
end
defp maybe_prompt_update(time, cfile) do
installed_vsn = Version.parse!(@v)
cmp =
get_sobelow_version()
|> Version.compare(installed_vsn)
case cmp do
:gt ->
MixIO.error("""
A new version of Sobelow is available:
mix archive.install hex sobelow
""")
_ ->
nil
end
timestamp = "sobelow-" <> to_string(time)
File.write(cfile, timestamp)
end
def get_mod(mod_string) do
case mod_string do
"XSS" -> Sobelow.XSS
"XSS.Raw" -> Sobelow.XSS.Raw
"XSS.SendResp" -> Sobelow.XSS.SendResp
"XSS.ContentType" -> Sobelow.XSS.ContentType
"XSS.HTML" -> Sobelow.XSS.HTML
"SQL" -> Sobelow.SQL
"SQL.Query" -> Sobelow.SQL.Query
"SQL.Stream" -> Sobelow.SQL.Stream
"Misc" -> Sobelow.Misc
"Misc.BinToTerm" -> Sobelow.Misc.BinToTerm
"Misc.FilePath" -> Sobelow.Misc.FilePath
"RCE" -> Sobelow.RCE
"RCE.EEx" -> Sobelow.RCE.EEx
"RCE.CodeModule" -> Sobelow.RCE.CodeModule
"Config" -> Sobelow.Config
"Config.CSRF" -> Sobelow.Config.CSRF
"Config.Headers" -> Sobelow.Config.Headers
"Config.CSP" -> Sobelow.Config.CSP
"Config.Secrets" -> Sobelow.Config.Secrets
"Config.HTTPS" -> Sobelow.Config.HTTPS
"Config.HSTS" -> Sobelow.Config.HSTS
"Vuln" -> Sobelow.Vuln
"Vuln.CookieRCE" -> Sobelow.Vuln.CookieRCE
"Vuln.HeaderInject" -> Sobelow.Vuln.HeaderInject
"Vuln.PlugNull" -> Sobelow.Vuln.PlugNull
"Vuln.Redirect" -> Sobelow.Vuln.Redirect
"Vuln.Coherence" -> Sobelow.Vuln.Coherence
"Vuln.Ecto" -> Sobelow.Vuln.Ecto
"Traversal" -> Sobelow.Traversal
"Traversal.SendFile" -> Sobelow.Traversal.SendFile
"Traversal.FileModule" -> Sobelow.Traversal.FileModule
"Traversal.SendDownload" -> Sobelow.Traversal.SendDownload
"CI" -> Sobelow.CI
"CI.System" -> Sobelow.CI.System
"CI.OS" -> Sobelow.CI.OS
"DOS" -> Sobelow.DOS
"DOS.StringToAtom" -> Sobelow.DOS.StringToAtom
"DOS.ListToAtom" -> Sobelow.DOS.ListToAtom
"DOS.BinToAtom" -> Sobelow.DOS.BinToAtom
_ -> nil
end
end
def get_ignored() do
get_env(:ignored)
|> Enum.map(&get_mod/1)
end
def is_vuln?({vars, _, _}) do
cond do
length(vars) == 0 ->
false
true ->
true
end
end
defp is_ignored_file(filename, ignored_files) do
Enum.any?(ignored_files, fn ignored_file ->
String.ends_with?(ignored_file, filename)
end)
end
end
| 26.168932 | 96 | 0.607183 |
ffa4714f051685f5b581d64b7678d50b75d41746 | 282 | ex | Elixir | lib/json.ex | sergiotapia/magnex | 37f5024fa6eabf4b790f1e7aa508bfda9e7cb5a2 | [
"Apache-2.0"
] | 8 | 2019-01-08T15:34:49.000Z | 2021-05-25T08:25:26.000Z | lib/json.ex | sergiotapia/magnex | 37f5024fa6eabf4b790f1e7aa508bfda9e7cb5a2 | [
"Apache-2.0"
] | null | null | null | lib/json.ex | sergiotapia/magnex | 37f5024fa6eabf4b790f1e7aa508bfda9e7cb5a2 | [
"Apache-2.0"
] | 5 | 2019-02-01T03:44:54.000Z | 2021-05-25T08:25:32.000Z | defmodule JSON do
@json Application.get_env(:magnex, :json_library) ||
raise("You must set your json library config for magnex.json_library")
def decode(json) do
@json.decode(json)
end
def encode_to_iodata(json) do
@json.encode_to_iodata(json)
end
end
| 21.692308 | 80 | 0.705674 |
ffa4bae6fbc536c01d24d5c5689d995855ff696e | 83 | exs | Elixir | test/akd/helpers/deploy_helper_test.exs | corroded/akd | ed15b8929b6d110552a19522f8a17edf75452e87 | [
"MIT"
] | 51 | 2018-01-07T03:41:05.000Z | 2021-06-17T21:39:27.000Z | test/akd/helpers/deploy_helper_test.exs | corroded/akd | ed15b8929b6d110552a19522f8a17edf75452e87 | [
"MIT"
] | 42 | 2017-12-24T04:36:39.000Z | 2019-05-21T01:32:00.000Z | test/akd/helpers/deploy_helper_test.exs | corroded/akd | ed15b8929b6d110552a19522f8a17edf75452e87 | [
"MIT"
] | 2 | 2018-06-15T11:46:28.000Z | 2018-08-22T16:03:56.000Z | defmodule Akd.DeployHelperTest do
use ExUnit.Case
doctest Akd.DeployHelper
end
| 16.6 | 33 | 0.819277 |
ffa4c0731d4b0b9f53cc51251a0a42b59b54d464 | 674 | ex | Elixir | daniel/almeida/ch6/dungeon_crawl/lib/dungeon_crawl/room.ex | jdashton/glowing-succotash | 44580c2d4cb300e33156d42e358e8a055948a079 | [
"MIT"
] | null | null | null | daniel/almeida/ch6/dungeon_crawl/lib/dungeon_crawl/room.ex | jdashton/glowing-succotash | 44580c2d4cb300e33156d42e358e8a055948a079 | [
"MIT"
] | 1 | 2020-02-26T14:55:23.000Z | 2020-02-26T14:55:23.000Z | daniel/almeida/ch6/dungeon_crawl/lib/dungeon_crawl/room.ex | jdashton/glowing-succotash | 44580c2d4cb300e33156d42e358e8a055948a079 | [
"MIT"
] | null | null | null | defmodule DungeonCrawl.Room do
alias DungeonCrawl.Room
alias DungeonCrawl.Room.Triggers
import DungeonCrawl.Room.Action
defstruct description: nil, actions: [], trigger: nil
def all,
do: [
# %Room{
# description: "You find a quiet place. Looks safe for a little nap.",
# actions: [forward(), rest()]
# }
# %Room{
# description: "You can see the light of day. You found the exit!",
# actions: [forward()],
# trigger: Triggers.Exit
# },
%Room{
description: "You can see an enemy blocking your path.",
actions: [forward()],
trigger: Triggers.Enemy
}
]
end
| 24.962963 | 78 | 0.586053 |
ffa4cbc53f30661c5786d8527674014ce20b2682 | 6,367 | ex | Elixir | lib/storage/client.ex | kianmeng/elixir_wechat | cc838f0930fd6d26d903e4c3535bdfc263d628d6 | [
"MIT"
] | null | null | null | lib/storage/client.ex | kianmeng/elixir_wechat | cc838f0930fd6d26d903e4c3535bdfc263d628d6 | [
"MIT"
] | null | null | null | lib/storage/client.ex | kianmeng/elixir_wechat | cc838f0930fd6d26d903e4c3535bdfc263d628d6 | [
"MIT"
] | null | null | null | defmodule WeChat.Storage.Client do
@moduledoc """
The storage adapter specification for WeChat common application.
Since we need to temporarily storage some key data(e.g. `access_token`) for invoking WeChat APIs, this module
is used for customizing the persistence when use `elixir_wechat` in a client side of WeChat `common` application.
Notice: the scenario as a client, we only need to implement the minimum functions to automatically append the
required parameters from the persistence.
## Writing custom storage adapter
#### Example for WeChat Official Account Platform application
defmodule MyAppStorageClient do
@behaviour WeChat.Storage.Client
@impl true
def fetch_access_token(appid, args) do
access_token = "Get access_token by appid from your persistence..."
{:ok, %WeChat.Token{access_token: access_token}}
end
@impl true
def fetch_ticket(appid, type, args) do
ticket = "Get jsapi-ticket/card-ticket from your persistence..."
{:ok, ticket}
end
@impl true
def refresh_access_token(appid, access_token, args) do
access_token = "Refresh access_token by appid from your persistence..."
{:ok, %WeChat.Token{access_token: access_token}}
end
end
#### Use `MyAppStorageClient`
Global configure `MyAppStorageClient`
defmodule Client do
use WeChat,
adapter_storage: {MyAppStorageClient, args}
end
defmodule Client do
use WeChat,
adapter_storage: MyAppStorageClient
end
Dynamically set `MyAppStorageClient` when call `WeChat.request/2`
WeChat.request(:post, url: ..., adapter_storage: {MyAppStorageClient, args}, ...)
WeChat.request(:post, url: ..., adapter_storage: MyAppStorageClient, ...)
Notice: The above `args` will be returned back into each implement of callback function, if not input it, `args` will be
as an empty list in callback.
"""
@doc """
Fetch access_token of WeChat common application.
"""
@callback fetch_access_token(appid :: String.t(), args :: term()) :: {:ok, %WeChat.Token{}}
@doc """
Refresh access_token of WeChat common application.
"""
@callback refresh_access_token(appid :: String.t(), access_token :: String.t(), args :: term()) ::
{:ok, %WeChat.Token{}} | {:error, %WeChat.Error{}}
@doc """
Fetch ticket of WeChat common application, the option of `type` parameter is "wx_card" or "jsapi"(refer WeChat Official document).
"""
@callback fetch_ticket(appid :: String.t(), type :: String.t(), args :: term()) ::
{:ok, String.t()} | {:error, %WeChat.Error{}}
end
defmodule WeChat.Storage.ComponentClient do
@moduledoc """
The storage adapter specification for WeChat component application.
Since we need to temporarily storage some key data(e.g. `access_token`/`component_access_token`) for invoking WeChat APIs, this module
is used for customizing the persistence when use `elixir_wechat` in a client side of WeChat `component` application.
Notice: as a client, we only need to implement the minimum functions to automatically append the
required parameters from the persistence.
## Writing custom storage adapter
#### Example for WeChat third-party platform application
defmodule MyComponentAppStorageClient do
@behaviour WeChat.Storage.ComponentClient
@impl true
def fetch_access_token(appid, authorizer_appid, args) do
access_token = "Get authorizer's access_token by appid and authorizer appid from your persistence..."
{:ok, %WeChat.Token{access_token: access_token}}
end
@impl ture
def fetch_component_access_token(appid, args) do
access_token = "Get component access_token by component appid from your persistence..."
{:ok, %WeChat.Token{access_token: access_token}}
end
@impl true
def refresh_access_token(appid, authorizer_appid, access_token, args) do
access_token = "Refresh access_token by appid from your persistence..."
{:ok, %WeChat.Token{access_token: access_token}}
end
@impl true
def fetch_ticket(appid, type, args) do
ticket = "Get jsapi-ticket/card-ticket from your persistence..."
{:ok, ticket}
end
end
#### Use `MyComponentAppStorageClient`
Global configure `MyComponentAppStorageClient`
defmodule Client do
use WeChat,
adapter_storage: {MyComponentAppStorageClient, args}
end
defmodule Client do
use WeChat,
adapter_storage: MyComponentAppStorageClient
end
Dynamically set `MyComponentAppStorageClient` when call `WeChat.request/2`
WeChat.request(:post, url: ..., adapter_storage: {MyComponentAppStorageClient, args}, ...)
WeChat.request(:post, url: ..., adapter_storage: MyComponentAppStorageClient, ...)
Notice: The above `args` will be returned back into each implement of callback function, if not input it, `args` will be
as an empty list in callback.
"""
@doc """
Fetch authorizer's access_token in WeChat component application.
"""
@callback fetch_access_token(
appid :: String.t(),
authorizer_appid :: String.t(),
args :: term()
) :: {:ok, %WeChat.Token{}}
@doc """
Fetch access_token of WeChat component application.
"""
@callback fetch_component_access_token(appid :: String.t(), args :: term()) ::
{:ok, %WeChat.Token{}}
@doc """
Refresh authorizer's access_token in WeChat component application.
"""
@callback refresh_access_token(
appid :: String.t(),
authorizer_appid :: String.t(),
access_token :: String.t(),
args :: term()
) :: {:ok, %WeChat.Token{}} | {:error, %WeChat.Error{}}
@doc """
Fetch authorizer's ticket of WeChat component application, the option of `type` parameter is "wx_card" or "jsapi"(refer WeChat Official document).
"""
@callback fetch_ticket(
appid :: String.t(),
authorizer_appid :: String.t(),
type :: String.t(),
args :: term()
) ::
{:ok, String.t()} | {:error, %WeChat.Error{}}
end
| 35.372222 | 148 | 0.656196 |
ffa4e26c71dc7d6ecfe828983923ef713c19eaac | 1,651 | ex | Elixir | apps/mishka_html/lib/mishka_html_web/views/layout_view.ex | mojtaba-naserei/mishka-cms | 1f31f61347bab1aae6ba0d47c5515a61815db6c9 | [
"Apache-2.0"
] | 35 | 2021-06-26T09:05:50.000Z | 2022-03-30T15:41:22.000Z | apps/mishka_html/lib/mishka_html_web/views/layout_view.ex | iArazar/mishka-cms | 8b579101d607d91e80834527c1508fe5f4ceefef | [
"Apache-2.0"
] | 101 | 2021-01-01T09:54:07.000Z | 2022-03-28T10:02:24.000Z | apps/mishka_html/lib/mishka_html_web/views/layout_view.ex | iArazar/mishka-cms | 8b579101d607d91e80834527c1508fe5f4ceefef | [
"Apache-2.0"
] | 8 | 2021-01-17T17:08:07.000Z | 2022-03-11T16:12:06.000Z | defmodule MishkaHtmlWeb.LayoutView do
use MishkaHtmlWeb, :view
# SEO tags function should have all required fields
def seo_tags(%{image: _image, title: _title, description: _description, type: _type, keywords: _keywords, link: _link} = tags) do
raw ("#{seo_tags_converter(tags, :basic_tag)} #{seo_tags_converter(tags, :social_tag)} ")
end
def seo_tags(_seo_tag), do: ""
defp seo_tags_converter(seo_tags, :basic_tag) do
"""
\n
\t <meta name="description" content="#{seo_tags.description}" /> \n
\t <meta name="keywords" content="#{seo_tags.keywords}" /> \n
\t <base href="#{seo_tags.link}" /> \n
\t <link href="#{seo_tags.link}" rel="canonical" /> \n
"""
end
defp seo_tags_converter(seo_tags, :social_tag) do
"""
\n
\t <meta property="og:image" content="#{seo_tags.image}" /> \n
\t <meta property="og:image:width" content="482" /> \n
\t <meta property="og:image:height" content="451" /> \n
\t <meta property="og:title" content="#{seo_tags.title}" /> \n
\t <meta property="og:description" content="#{seo_tags.description}" /> \n
\t <meta property="og:type" content="#{seo_tags.type}" /> \n
\t <meta property="og:url" content="#{seo_tags.link}" /> \n
\n
\t <meta name="twitter:image" content="#{seo_tags.image}" /> \n
\t <meta name="twitter:card" content="summary_large_image" /> \n
\t <meta name="twitter:url" content="#{seo_tags.link}" /> \n
\t <meta name="twitter:title" content="#{seo_tags.title}" /> \n
\t <meta name="twitter:description" content="#{seo_tags.description}" /> \n
"""
end
end
| 41.275 | 131 | 0.62447 |
ffa522f2aea1c7ca01a6edf44a1f55eca1258743 | 1,598 | ex | Elixir | clients/speech/lib/google_api/speech/v1/model/speech_adaptation.ex | read-with-simbi/elixir-google-api | 2f4cefa653dff06c59391189fc7783ca096c51e5 | [
"Apache-2.0"
] | null | null | null | clients/speech/lib/google_api/speech/v1/model/speech_adaptation.ex | read-with-simbi/elixir-google-api | 2f4cefa653dff06c59391189fc7783ca096c51e5 | [
"Apache-2.0"
] | null | null | null | clients/speech/lib/google_api/speech/v1/model/speech_adaptation.ex | read-with-simbi/elixir-google-api | 2f4cefa653dff06c59391189fc7783ca096c51e5 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Speech.V1.Model.SpeechAdaptation do
@moduledoc """
Provides information to the recognizer that specifies how to process the request.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:phraseSets => list(GoogleApi.Speech.V1.Model.PhraseSet.t()) | nil,
:phraseSetReferences => list(String.t()) | nil,
:customClasses => any() | nil
}
field(:phraseSets, as: GoogleApi.Speech.V1.Model.PhraseSet, type: :list)
field(:phraseSetReferences, type: :list)
field(:customClasses)
end
defimpl Poison.Decoder, for: GoogleApi.Speech.V1.Model.RecognitionConfig do
def decode(value, options) do
GoogleApi.Speech.V1.Model.RecognitionConfig.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Speech.V1.Model.RecognitionConfig do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 34 | 83 | 0.737797 |
ffa5244d16416fb218cd5d24c2f63cea3f5d8afa | 471 | exs | Elixir | tooling/pbt/mix.exs | sfat/programming-elixir-exercises | 19e62e3f3344ec044e1eb1b39b195f4dad3dff1c | [
"Apache-2.0"
] | 1 | 2019-02-17T11:54:17.000Z | 2019-02-17T11:54:17.000Z | tooling/pbt/mix.exs | sfat/programming-elixir-exercises | 19e62e3f3344ec044e1eb1b39b195f4dad3dff1c | [
"Apache-2.0"
] | null | null | null | tooling/pbt/mix.exs | sfat/programming-elixir-exercises | 19e62e3f3344ec044e1eb1b39b195f4dad3dff1c | [
"Apache-2.0"
] | null | null | null | defmodule Pbt.MixProject do
use Mix.Project
def project do
[
app: :pbt,
version: "0.1.0",
elixir: "~> 1.7",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
# Run "mix help compile.app" to learn about applications.
def application do
[
extra_applications: [:logger]
]
end
# Run "mix help deps" to learn about dependencies.
defp deps do
[
{ :stream_data, ">= 0.0.0"},
]
end
end
| 16.821429 | 59 | 0.56051 |
ffa536313e422ee409cc0e23862566a47cf757b4 | 1,706 | ex | Elixir | api/lib/router_utils.ex | lucas-angermann/idai-field-web | 788c9c9505b6fd12d591345b23053e934f1022d1 | [
"Apache-2.0"
] | null | null | null | api/lib/router_utils.ex | lucas-angermann/idai-field-web | 788c9c9505b6fd12d591345b23053e934f1022d1 | [
"Apache-2.0"
] | null | null | null | api/lib/router_utils.ex | lucas-angermann/idai-field-web | 788c9c9505b6fd12d591345b23053e934f1022d1 | [
"Apache-2.0"
] | null | null | null | defmodule Api.RouterUtils do
import Plug.Conn, only: [put_resp_content_type: 2, send_resp: 3, get_req_header: 2]
alias Api.Core.Config
alias Api.Auth.Rights
def send_json(conn, %{error: "bad_request"} = error) do
conn
|> put_resp_content_type("application/json")
|> send_resp(400, Poison.encode!(error))
end
def send_json(conn, %{error: "not_found"} = error) do
conn
|> put_resp_content_type("application/json")
|> send_resp(404, Poison.encode!(error))
end
def send_json(conn, %{error: "unknown"} = error) do
conn
|> put_resp_content_type("application/json")
|> send_resp(500, Poison.encode!(error))
end
def send_json(conn, body) do
conn
|> put_resp_content_type("application/json")
|> send_resp(200, Poison.encode!(body))
end
def send_unauthorized(conn) do
conn
|> put_resp_content_type("application/json")
|> send_resp(401, Poison.encode!(%{error: :unauthorized}))
end
def send_error(conn, message) do
conn
|> put_resp_content_type("application/json")
|> send_resp(500, Poison.encode!(%{error: message}))
end
def send_not_found(conn) do
conn
|> put_resp_content_type("application/json")
|> send_resp(404, Poison.encode!(%{error: :not_found}))
end
def access_for_project_allowed readable_projects, project do
if project in readable_projects, do: :ok, else: :unauthorized_access
end
def get_user conn do
conn
|> get_req_header("authorization")
|> List.first
|> Rights.authenticate(Config.get(:rights), Config.get(:projects))
end
def get_user_from_token token do
Rights.authenticate(token, Config.get(:rights), Config.get(:projects))
end
end
| 26.65625 | 85 | 0.690504 |
ffa53ff8243ac388f34830e2e0241c9c2ffc3c32 | 6,227 | exs | Elixir | lib/elixir/test/elixir/string_io_test.exs | pap/elixir | c803afe90c766663823c74397fb23ed40ec52c5b | [
"Apache-2.0"
] | 1 | 2019-06-11T20:22:20.000Z | 2019-06-11T20:22:20.000Z | lib/elixir/test/elixir/string_io_test.exs | pap/elixir | c803afe90c766663823c74397fb23ed40ec52c5b | [
"Apache-2.0"
] | null | null | null | lib/elixir/test/elixir/string_io_test.exs | pap/elixir | c803afe90c766663823c74397fb23ed40ec52c5b | [
"Apache-2.0"
] | null | null | null | Code.require_file "test_helper.exs", __DIR__
defmodule StringIOTest do
use ExUnit.Case, async: true
test "open and close" do
{:ok, pid} = StringIO.open("")
assert StringIO.close(pid) == {:ok, {"", ""}}
end
test "contents" do
{:ok, pid} = StringIO.open("abc")
IO.write(pid, "edf")
assert StringIO.contents(pid) == {"abc", "edf"}
end
test "flush" do
{:ok, pid} = StringIO.open("")
IO.write(pid, "edf")
assert StringIO.flush(pid) == "edf"
assert StringIO.contents(pid) == {"", ""}
end
## IO module
def start(string, opts \\ []) do
StringIO.open(string, opts) |> elem(1)
end
def contents(pid) do
StringIO.contents(pid)
end
test "IO.read :line with \\n" do
pid = start("abc\n")
assert IO.read(pid, :line) == "abc\n"
assert IO.read(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.read :line with \\rn" do
pid = start("abc\r\n")
assert IO.read(pid, :line) == "abc\n"
assert IO.read(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.read :line without line break" do
pid = start("abc")
assert IO.read(pid, :line) == "abc"
assert IO.read(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.read :line with invalid utf8" do
pid = start(<< 130, 227, 129, 132, 227, 129, 134 >>)
assert IO.read(pid, :line) == {:error, :collect_line}
assert contents(pid) == {<< 130, 227, 129, 132, 227, 129, 134 >>, ""}
end
test "IO.read count" do
pid = start("abc")
assert IO.read(pid, 2) == "ab"
assert IO.read(pid, 8) == "c"
assert IO.read(pid, 1) == :eof
assert contents(pid) == {"", ""}
end
test "IO.read count with utf8" do
pid = start("あいう")
assert IO.read(pid, 2) == "あい"
assert IO.read(pid, 8) == "う"
assert IO.read(pid, 1) == :eof
assert contents(pid) == {"", ""}
end
test "IO.read count with invalid utf8" do
pid = start(<< 130, 227, 129, 132, 227, 129, 134 >>)
assert IO.read(pid, 2) == {:error, :invalid_unicode}
assert contents(pid) == {<< 130, 227, 129, 132, 227, 129, 134 >>, ""}
end
test "IO.binread :line with \\n" do
pid = start("abc\n")
assert IO.binread(pid, :line) == "abc\n"
assert IO.binread(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.binread :line with \\r\\n" do
pid = start("abc\r\n")
assert IO.binread(pid, :line) == "abc\n"
assert IO.binread(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.binread :line without line break" do
pid = start("abc")
assert IO.binread(pid, :line) == "abc"
assert IO.binread(pid, :line) == :eof
assert contents(pid) == {"", ""}
end
test "IO.binread count" do
pid = start("abc")
assert IO.binread(pid, 2) == "ab"
assert IO.binread(pid, 8) == "c"
assert IO.binread(pid, 1) == :eof
assert contents(pid) == {"", ""}
end
test "IO.binread count with utf8" do
pid = start("あいう")
assert IO.binread(pid, 2) == << 227, 129 >>
assert IO.binread(pid, 8) == << 130, 227, 129, 132, 227, 129, 134 >>
assert IO.binread(pid, 1) == :eof
assert contents(pid) == {"", ""}
end
test "IO.write" do
pid = start("")
assert IO.write(pid, "foo") == :ok
assert contents(pid) == {"", "foo"}
end
test "IO.write with utf8" do
pid = start("")
assert IO.write(pid, "あいう") == :ok
assert contents(pid) == {"", "あいう"}
end
test "IO.binwrite" do
pid = start("")
assert IO.binwrite(pid, "foo") == :ok
assert contents(pid) == {"", "foo"}
end
test "IO.binwrite with utf8" do
pid = start("")
assert IO.binwrite(pid, "あいう") == :ok
assert contents(pid) == {"", "あいう"}
end
test "IO.puts" do
pid = start("")
assert IO.puts(pid, "abc") == :ok
assert contents(pid) == {"", "abc\n"}
end
test "IO.inspect" do
pid = start("")
assert IO.inspect(pid, {}, []) == {}
assert contents(pid) == {"", "{}\n"}
end
test "IO.getn" do
pid = start("abc")
assert IO.getn(pid, ">", 2) == "ab"
assert contents(pid) == {"c", ""}
end
test "IO.getn with utf8" do
pid = start("あいう")
assert IO.getn(pid, ">", 2) == "あい"
assert contents(pid) == {"う", ""}
end
test "IO.getn with invalid utf8" do
pid = start(<< 130, 227, 129, 132, 227, 129, 134 >>)
assert IO.getn(pid, ">", 2) == {:error, :invalid_unicode}
assert contents(pid) == {<< 130, 227, 129, 132, 227, 129, 134 >>, ""}
end
test "IO.getn with capture_prompt" do
pid = start("abc", capture_prompt: true)
assert IO.getn(pid, ">", 2) == "ab"
assert contents(pid) == {"c", ">"}
end
test "IO.gets with \\n" do
pid = start("abc\nd")
assert IO.gets(pid, ">") == "abc\n"
assert contents(pid) == {"d", ""}
end
test "IO.gets with \\r\\n" do
pid = start("abc\r\nd")
assert IO.gets(pid, ">") == "abc\n"
assert contents(pid) == {"d", ""}
end
test "IO.gets without line breaks" do
pid = start("abc")
assert IO.gets(pid, ">") == "abc"
assert contents(pid) == {"", ""}
end
test "IO.gets with invalid utf8" do
pid = start(<< 130, 227, 129, 132, 227, 129, 134 >>)
assert IO.gets(pid, ">") == {:error, :collect_line}
assert contents(pid) == {<< 130, 227, 129, 132, 227, 129, 134 >>, ""}
end
test "IO.gets with capture_prompt" do
pid = start("abc\n", capture_prompt: true)
assert IO.gets(pid, ">") == "abc\n"
assert contents(pid) == {"", ">"}
end
test ":io.get_password" do
pid = start("abc\n")
assert :io.get_password(pid) == "abc\n"
assert contents(pid) == {"", ""}
end
test "IO.stream" do
pid = start("abc")
assert IO.stream(pid, 2) |> Enum.to_list == ["ab", "c"]
assert contents(pid) == {"", ""}
end
test "IO.stream with invalid utf8" do
pid = start(<< 130, 227, 129, 132, 227, 129, 134 >>)
assert_raise IO.StreamError, fn->
IO.stream(pid, 2) |> Enum.to_list
end
assert contents(pid) == {<< 130, 227, 129, 132, 227, 129, 134 >>, ""}
end
test "IO.binstream" do
pid = start("abc")
assert IO.stream(pid, 2) |> Enum.to_list == ["ab", "c"]
assert contents(pid) == {"", ""}
end
end
| 26.385593 | 73 | 0.548418 |
ffa54453b277d4e987b5bccda03c2c00afcd2499 | 256 | ex | Elixir | chapter_7/todo_worker_pool/lib/todo/database/worker/client.ex | librity/elixir_in_action | d2df441ceb7e6a0d3f18bc3ab3c59570125fcdec | [
"MIT"
] | 3 | 2021-04-22T11:55:58.000Z | 2021-08-22T13:19:56.000Z | chapter_7/todo_worker_pool/lib/todo/database/worker/client.ex | librity/elixir_in_action | d2df441ceb7e6a0d3f18bc3ab3c59570125fcdec | [
"MIT"
] | null | null | null | chapter_7/todo_worker_pool/lib/todo/database/worker/client.ex | librity/elixir_in_action | d2df441ceb7e6a0d3f18bc3ab3c59570125fcdec | [
"MIT"
] | 3 | 2021-04-22T21:19:45.000Z | 2021-08-22T13:20:03.000Z | defmodule Todo.Database.Worker.Client do
def start(db_folder), do: GenServer.start(Todo.Database.Worker, db_folder)
def store(pid, key, data), do: GenServer.cast(pid, {:store, key, data})
def get(pid, key), do: GenServer.call(pid, {:get, key})
end
| 32 | 76 | 0.710938 |
ffa546e3ac939b0a143d75162135b8d484062ebf | 356 | exs | Elixir | config/cypress.exs | thelastinuit/akkad | 08df3f51daeada737c53d07663c166a5e6cc297e | [
"MIT"
] | 1 | 2022-03-05T00:05:26.000Z | 2022-03-05T00:05:26.000Z | config/cypress.exs | thelastinuit/akkad | 08df3f51daeada737c53d07663c166a5e6cc297e | [
"MIT"
] | null | null | null | config/cypress.exs | thelastinuit/akkad | 08df3f51daeada737c53d07663c166a5e6cc297e | [
"MIT"
] | null | null | null | import Config
config :akkad, Akkad.Repo,
username: "postgres",
password: "postgres",
hostname: "localhost",
database: "akkad_test#{System.get_env("MIX_TEST_PARTITION")}",
pool: Ecto.Adapters.SQL.Sandbox,
pool_size: 10
config :akkad, AkkadWeb.Endpoint,
check_origin: false,
http: [port: 4000],
server: true
config :logger, level: :warn
| 20.941176 | 64 | 0.713483 |
ffa54f5d2d8b4b1cb30dee2e047ce92c2c1cfc4c | 1,709 | ex | Elixir | lib/entice/logic/maps/map.ex | hoodaly/logic | 6737f644129c16e83479bc76286624d5d54b197b | [
"WTFPL"
] | null | null | null | lib/entice/logic/maps/map.ex | hoodaly/logic | 6737f644129c16e83479bc76286624d5d54b197b | [
"WTFPL"
] | null | null | null | lib/entice/logic/maps/map.ex | hoodaly/logic | 6737f644129c16e83479bc76286624d5d54b197b | [
"WTFPL"
] | null | null | null | defmodule Entice.Logic.Map do
@moduledoc """
Top-level map macros for convenient access to all defined maps.
Is mainly used in area.ex where all the maps are defined.
"""
import Inflex
alias Geom.Shape.Vector2D
defmacro __using__(_) do
quote do
import Entice.Logic.Map
@maps []
@before_compile Entice.Logic.Map
end
end
defmacro defmap(mapname, opts \\ []) do
spawn = Keyword.get(opts, :spawn, quote do %Vector2D{} end)
outpost = Keyword.get(opts, :outpost, quote do true end)
nav_mesh = Keyword.get(opts, :nav_mesh, quote do nil end)
map_content = content(Macro.to_string(mapname))
quote do
defmodule unquote(mapname) do
alias Geom.Shape.Vector2D
unquote(map_content)
def spawn, do: unquote(spawn)
def is_outpost?, do: unquote(outpost)
def nav_mesh, do: unquote(nav_mesh)
end
@maps [ unquote(mapname) | @maps ]
end
end
defmacro __before_compile__(_) do
quote do
@doc """
Simplistic map getter, tries to convert a PascalCase map name to the module atom.
"""
def get_map(name) do
try do
{:ok, ((__MODULE__ |> Atom.to_string) <> "." <> name) |> String.to_existing_atom}
rescue
ArgumentError -> {:error, :map_not_found}
end
end
def get_maps, do: @maps
end
end
defp content(name) do
uname = underscore(name)
quote do
def name, do: unquote(name)
def underscore_name, do: unquote(uname)
def spawn, do: %Vector2D{}
def is_outpost?, do: true
def nav_mesh, do: nil
defoverridable [spawn: 0, is_outpost?: 0, nav_mesh: 0]
end
end
end
| 24.070423 | 91 | 0.623171 |
ffa5599573956ce5bfe9e6c91a44dc37cb542a8e | 888 | exs | Elixir | mix.exs | sylv3r/katalyst | 562b53f5a403a9fe1978ab3c3210937aac9b7ffd | [
"Apache-2.0"
] | null | null | null | mix.exs | sylv3r/katalyst | 562b53f5a403a9fe1978ab3c3210937aac9b7ffd | [
"Apache-2.0"
] | null | null | null | mix.exs | sylv3r/katalyst | 562b53f5a403a9fe1978ab3c3210937aac9b7ffd | [
"Apache-2.0"
] | null | null | null | defmodule KatalystFramework.Mixfile do
use Mix.Project
def project, do: [
build_path: System.get_env("BUILD_PATH") || "_build",
deps_path: System.get_env("DEPS_PATH") || "_deps",
config_path: "config/config.exs",
apps_path: "apps",
elixirc_paths: elixirc_paths(Mix.env),
lockfile: "mix.lock",
elixir: "~> 1.5",
start_permanent: Mix.env == :prod,
deps: deps()
]
# Run "mix help compile.app" to learn about applications.
def application, do: [
extra_applications: [:logger]
]
# Common paths to use in all elixirc_paths method definitions
defp common_paths, do: ["apps"]
defp elixirc_paths(_), do: common_paths()
# Dependencies listed here are available only for this
# project and cannot be accessed from applications inside
# the apps folder.
#
# Run "mix help deps" for examples and options.
defp deps, do: []
end
| 27.75 | 63 | 0.681306 |
ffa55b85912313774bfe7d6c1f9fb2562e49ee05 | 1,123 | exs | Elixir | mix.exs | squaretwo/amplitude_ex | 846dfa41fee5702e812ae6c4ce1f35a2bba049d0 | [
"BSD-3-Clause"
] | null | null | null | mix.exs | squaretwo/amplitude_ex | 846dfa41fee5702e812ae6c4ce1f35a2bba049d0 | [
"BSD-3-Clause"
] | null | null | null | mix.exs | squaretwo/amplitude_ex | 846dfa41fee5702e812ae6c4ce1f35a2bba049d0 | [
"BSD-3-Clause"
] | 2 | 2020-04-26T22:17:11.000Z | 2021-08-16T14:38:44.000Z | defmodule Amplitude.Mixfile do
use Mix.Project
def project do
[
app: :amplitude,
version: "0.3.0",
elixir: "~> 1.3",
build_embedded: Mix.env() == :prod,
start_permanent: Mix.env() == :prod,
deps: deps(),
description: "Supports the Track and Identiy Amplitude API",
package: package()
]
end
# Configuration for the OTP application
#
# Type "mix help compile.app" for more information
def application do
[applications: [:logger]]
end
# Dependencies can be Hex packages:
#
# {:mydep, "~> 0.3.0"}
#
# Or git/path repositories:
#
# {:mydep, git: "https://github.com/elixir-lang/mydep.git", tag: "0.1.0"}
#
# Type "mix help deps" for more examples and options
defp deps do
[
{:elixir_uuid, ">= 1.2.0"},
{:httpoison, ">= 1.5.0"},
{:poison, ">= 3.1.0"},
{:ex_doc, ">= 0.0.0", only: :dev}
]
end
defp package do
[
maintainers: ["Ben Yee", "Donald Plummer"],
licenses: ["MIT"],
links: %{
GitHub: "https://github.com/squaretwo/amplitude_ex"
}
]
end
end
| 21.596154 | 77 | 0.558326 |
ffa56b58643eb3cbdc99fbc6125d2e7bbc1aed90 | 596 | ex | Elixir | lib/todo_app_web/controllers/fallback_controller.ex | pghildiy/phoenix-todoapp | 4ba8e75e2a88a9c7f5756bc579e6763423bfaa42 | [
"MIT"
] | 20 | 2017-07-18T14:51:28.000Z | 2021-11-15T11:40:24.000Z | lib/todo_app_web/controllers/fallback_controller.ex | pghildiy/phoenix-todoapp | 4ba8e75e2a88a9c7f5756bc579e6763423bfaa42 | [
"MIT"
] | 4 | 2016-08-12T07:00:15.000Z | 2017-11-09T20:35:13.000Z | lib/todo_app_web/controllers/fallback_controller.ex | pghildiy/phoenix-todoapp | 4ba8e75e2a88a9c7f5756bc579e6763423bfaa42 | [
"MIT"
] | 5 | 2017-11-09T13:33:22.000Z | 2020-05-24T16:36:27.000Z | defmodule TodoAppWeb.FallbackController do
@moduledoc """
Translates controller action results into valid `Plug.Conn` responses.
See `Phoenix.Controller.action_fallback/1` for more details.
"""
use TodoAppWeb, :controller
def call(conn, {:error, %Ecto.Changeset{} = changeset}) do
conn
|> put_status(:unprocessable_entity)
|> put_view(TodoAppWeb.ChangesetView)
|> render("error.json", changeset: changeset)
end
def call(conn, {:error, :not_found}) do
conn
|> put_status(:not_found)
|> put_view(TodoAppWeb.ErrorView)
|> render(:"404")
end
end
| 24.833333 | 72 | 0.692953 |
ffa5ac62ceab3574e33cdb1d659da19fc6771f3a | 34,231 | ex | Elixir | lib/elixir/lib/kernel/typespec.ex | oskarkook/elixir | 2ddd291c533cdc2b1b1f02153d94c0b248cb9836 | [
"Apache-2.0"
] | 1 | 2019-06-27T08:47:13.000Z | 2019-06-27T08:47:13.000Z | lib/elixir/lib/kernel/typespec.ex | oskarkook/elixir | 2ddd291c533cdc2b1b1f02153d94c0b248cb9836 | [
"Apache-2.0"
] | null | null | null | lib/elixir/lib/kernel/typespec.ex | oskarkook/elixir | 2ddd291c533cdc2b1b1f02153d94c0b248cb9836 | [
"Apache-2.0"
] | 1 | 2021-12-09T11:36:07.000Z | 2021-12-09T11:36:07.000Z | defmodule Kernel.Typespec do
@moduledoc false
## Deprecated API moved to Code.Typespec
@doc false
@deprecated "Use Code.Typespec.spec_to_quoted/2 instead"
def spec_to_ast(name, spec) do
Code.Typespec.spec_to_quoted(name, spec)
end
@doc false
@deprecated "Use Code.Typespec.type_to_quoted/1 instead"
def type_to_ast(type) do
Code.Typespec.type_to_quoted(type)
end
@doc false
@deprecated "Use Code.fetch_docs/1 instead"
def beam_typedocs(module) when is_atom(module) or is_binary(module) do
case Code.fetch_docs(module) do
{:docs_v1, _, _, _, _, _, docs} ->
for {{:type, name, arity}, _, _, doc, _} <- docs do
case doc do
%{"en" => doc_string} -> {{name, arity}, doc_string}
:hidden -> {{name, arity}, false}
_ -> {{name, arity}, nil}
end
end
{:error, _} ->
nil
end
end
@doc false
@deprecated "Use Code.Typespec.fetch_types/1 instead"
def beam_types(module) when is_atom(module) or is_binary(module) do
case Code.Typespec.fetch_types(module) do
{:ok, types} -> types
:error -> nil
end
end
@doc false
@deprecated "Use Code.Typespec.fetch_specs/1 instead"
def beam_specs(module) when is_atom(module) or is_binary(module) do
case Code.Typespec.fetch_specs(module) do
{:ok, specs} -> specs
:error -> nil
end
end
@doc false
@deprecated "Use Code.Typespec.fetch_callbacks/1 instead"
def beam_callbacks(module) when is_atom(module) or is_binary(module) do
case Code.Typespec.fetch_callbacks(module) do
{:ok, callbacks} -> callbacks
:error -> nil
end
end
## Hooks for Module functions
def defines_type?(module, {name, arity} = signature)
when is_atom(module) and is_atom(name) and arity in 0..255 do
{_set, bag} = :elixir_module.data_tables(module)
finder = fn {_kind, expr, _caller} ->
type_to_signature(expr) == signature
end
:lists.any(finder, get_typespecs(bag, [:type, :opaque, :typep]))
end
def spec_to_callback(module, {name, arity} = signature)
when is_atom(module) and is_atom(name) and arity in 0..255 do
{set, bag} = :elixir_module.data_tables(module)
filter = fn {:spec, expr, pos} ->
if spec_to_signature(expr) == signature do
kind = :callback
store_typespec(bag, kind, expr, pos)
case :ets.lookup(set, {:function, name, arity}) do
[{{:function, ^name, ^arity}, _, line, _, doc, doc_meta}] ->
store_doc(set, kind, name, arity, line, :doc, doc, doc_meta)
_ ->
nil
end
true
else
false
end
end
:lists.filter(filter, get_typespecs(bag, :spec)) != []
end
## Typespec definition and storage
@doc """
Defines a typespec.
Invoked by `Kernel.@/1` expansion.
"""
def deftypespec(:spec, expr, _line, _file, module, pos) do
{_set, bag} = :elixir_module.data_tables(module)
store_typespec(bag, :spec, expr, pos)
end
def deftypespec(kind, expr, line, _file, module, pos)
when kind in [:callback, :macrocallback] do
{set, bag} = :elixir_module.data_tables(module)
case spec_to_signature(expr) do
{name, arity} ->
# store doc only once in case callback has multiple clauses
unless :ets.member(set, {kind, name, arity}) do
{line, doc} = get_doc_info(set, :doc, line)
store_doc(set, kind, name, arity, line, :doc, doc, %{})
end
:error ->
:error
end
store_typespec(bag, kind, expr, pos)
end
@reserved_signatures [required: 1, optional: 1]
def deftypespec(kind, expr, line, file, module, pos)
when kind in [:type, :typep, :opaque] do
{set, bag} = :elixir_module.data_tables(module)
case type_to_signature(expr) do
{name, arity} = signature when signature in @reserved_signatures ->
compile_error(
:elixir_locals.get_cached_env(pos),
"type #{name}/#{arity} is a reserved type and it cannot be defined"
)
{name, arity} when kind == :typep ->
{line, doc} = get_doc_info(set, :typedoc, line)
if doc do
warning =
"type #{name}/#{arity} is private, @typedoc's are always discarded for private types"
:elixir_errors.erl_warn(line, file, warning)
end
{name, arity} ->
{line, doc} = get_doc_info(set, :typedoc, line)
spec_meta = if kind == :opaque, do: %{opaque: true}, else: %{}
store_doc(set, :type, name, arity, line, :typedoc, doc, spec_meta)
:error ->
:error
end
store_typespec(bag, kind, expr, pos)
end
defp get_typespecs(bag, keys) when is_list(keys) do
:lists.flatmap(&get_typespecs(bag, &1), keys)
end
defp get_typespecs(bag, key) do
:ets.lookup_element(bag, {:accumulate, key}, 2)
catch
:error, :badarg -> []
end
defp take_typespecs(bag, keys) when is_list(keys) do
:lists.flatmap(&take_typespecs(bag, &1), keys)
end
defp take_typespecs(bag, key) do
:lists.map(&elem(&1, 1), :ets.take(bag, {:accumulate, key}))
end
defp store_typespec(bag, key, expr, pos) do
:ets.insert(bag, {{:accumulate, key}, {key, expr, pos}})
:ok
end
defp store_doc(set, kind, name, arity, line, doc_kind, doc, spec_meta) do
doc_meta = get_doc_meta(spec_meta, doc_kind, set)
:ets.insert(set, {{kind, name, arity}, line, doc, doc_meta})
end
defp get_doc_info(set, attr, line) do
case :ets.take(set, attr) do
[{^attr, {line, doc}, _}] -> {line, doc}
[] -> {line, nil}
end
end
defp get_doc_meta(spec_meta, doc_kind, set) do
case :ets.take(set, {doc_kind, :meta}) do
[{{^doc_kind, :meta}, metadata, _}] -> Map.merge(metadata, spec_meta)
[] -> spec_meta
end
end
defp spec_to_signature({:when, _, [spec, _]}), do: type_to_signature(spec)
defp spec_to_signature(other), do: type_to_signature(other)
defp type_to_signature({:"::", _, [{name, _, context}, _]})
when is_atom(name) and name != :"::" and is_atom(context),
do: {name, 0}
defp type_to_signature({:"::", _, [{name, _, args}, _]}) when is_atom(name) and name != :"::",
do: {name, length(args)}
defp type_to_signature(_), do: :error
## Translation from Elixir AST to typespec AST
@doc false
def translate_typespecs_for_module(_set, bag) do
type_typespecs = take_typespecs(bag, [:type, :opaque, :typep])
defined_type_pairs = collect_defined_type_pairs(type_typespecs)
state = %{
defined_type_pairs: defined_type_pairs,
used_type_pairs: [],
local_vars: %{},
undefined_type_error_enabled?: true
}
{types, state} = :lists.mapfoldl(&translate_type/2, state, type_typespecs)
{specs, state} = :lists.mapfoldl(&translate_spec/2, state, take_typespecs(bag, :spec))
{callbacks, state} = :lists.mapfoldl(&translate_spec/2, state, take_typespecs(bag, :callback))
{macrocallbacks, state} =
:lists.mapfoldl(&translate_spec/2, state, take_typespecs(bag, :macrocallback))
optional_callbacks = :lists.flatten(get_typespecs(bag, :optional_callbacks))
used_types = filter_used_types(types, state)
{used_types, specs, callbacks, macrocallbacks, optional_callbacks}
end
defp collect_defined_type_pairs(type_typespecs) do
fun = fn {_kind, expr, pos}, type_pairs ->
%{file: file, line: line} = env = :elixir_locals.get_cached_env(pos)
case type_to_signature(expr) do
{name, arity} = type_pair ->
if built_in_type?(name, arity) do
message = "type #{name}/#{arity} is a built-in type and it cannot be redefined"
compile_error(env, message)
end
if Map.has_key?(type_pairs, type_pair) do
{error_full_path, error_line} = type_pairs[type_pair]
error_relative_path = Path.relative_to_cwd(error_full_path)
compile_error(
env,
"type #{name}/#{arity} is already defined in #{error_relative_path}:#{error_line}"
)
end
Map.put(type_pairs, type_pair, {file, line})
:error ->
compile_error(env, "invalid type specification: #{Macro.to_string(expr)}")
end
end
:lists.foldl(fun, %{}, type_typespecs)
end
defp filter_used_types(types, state) do
fun = fn {_kind, {name, arity} = type_pair, _line, _type, export} ->
if not export and not :lists.member(type_pair, state.used_type_pairs) do
%{^type_pair => {file, line}} = state.defined_type_pairs
:elixir_errors.erl_warn(line, file, "type #{name}/#{arity} is unused")
false
else
true
end
end
:lists.filter(fun, types)
end
defp translate_type({kind, {:"::", _, [{name, _, args}, definition]}, pos}, state) do
caller = :elixir_locals.get_cached_env(pos)
state = clean_local_state(state)
args =
if is_atom(args) do
[]
else
:lists.map(&variable/1, args)
end
vars = :lists.filter(&match?({:var, _, _}, &1), args)
var_names = :lists.map(&elem(&1, 2), vars)
state = :lists.foldl(&update_local_vars(&2, &1), state, var_names)
{spec, state} = typespec(definition, var_names, caller, state)
type = {name, spec, vars}
arity = length(args)
ensure_no_underscore_local_vars!(caller, var_names)
ensure_no_unused_local_vars!(caller, state.local_vars)
{kind, export} =
case kind do
:type -> {:type, true}
:typep -> {:type, false}
:opaque -> {:opaque, true}
end
invalid_args = :lists.filter(&(not valid_variable_ast?(&1)), args)
unless invalid_args == [] do
invalid_args = :lists.join(", ", :lists.map(&Macro.to_string/1, invalid_args))
message =
"@type definitions expect all arguments to be variables. The type " <>
"#{name}/#{arity} has an invalid argument(s): #{invalid_args}"
compile_error(caller, message)
end
if underspecified?(kind, arity, spec) do
message = "@#{kind} type #{name}/#{arity} is underspecified and therefore meaningless"
:elixir_errors.erl_warn(caller.line, caller.file, message)
end
{{kind, {name, arity}, caller.line, type, export}, state}
end
defp valid_variable_ast?({variable_name, _, context})
when is_atom(variable_name) and is_atom(context),
do: true
defp valid_variable_ast?(_), do: false
defp underspecified?(:opaque, 0, {:type, _, type, []}) when type in [:any, :term], do: true
defp underspecified?(_kind, _arity, _spec), do: false
defp translate_spec({kind, {:when, _meta, [spec, guard]}, pos}, state) do
caller = :elixir_locals.get_cached_env(pos)
translate_spec(kind, spec, guard, caller, state)
end
defp translate_spec({kind, spec, pos}, state) do
caller = :elixir_locals.get_cached_env(pos)
translate_spec(kind, spec, [], caller, state)
end
defp translate_spec(kind, {:"::", meta, [{name, _, args}, return]}, guard, caller, state)
when is_atom(name) and name != :"::" do
translate_spec(kind, meta, name, args, return, guard, caller, state)
end
defp translate_spec(_kind, {name, _meta, _args} = spec, _guard, caller, _state)
when is_atom(name) and name != :"::" do
spec = Macro.to_string(spec)
compile_error(caller, "type specification missing return type: #{spec}")
end
defp translate_spec(_kind, spec, _guard, caller, _state) do
spec = Macro.to_string(spec)
compile_error(caller, "invalid type specification: #{spec}")
end
defp translate_spec(kind, meta, name, args, return, guard, caller, state) when is_atom(args),
do: translate_spec(kind, meta, name, [], return, guard, caller, state)
defp translate_spec(kind, meta, name, args, return, guard, caller, state) do
ensure_no_defaults!(args)
state = clean_local_state(state)
unless Keyword.keyword?(guard) do
error = "expected keywords as guard in type specification, got: #{Macro.to_string(guard)}"
compile_error(caller, error)
end
line = line(meta)
vars = Keyword.keys(guard)
{fun_args, state} = fn_args(meta, args, return, vars, caller, state)
spec = {:type, line, :fun, fun_args}
{spec, state} =
case guard_to_constraints(guard, vars, meta, caller, state) do
{[], state} -> {spec, state}
{constraints, state} -> {{:type, line, :bounded_fun, [spec, constraints]}, state}
end
ensure_no_unused_local_vars!(caller, state.local_vars)
arity = length(args)
{{kind, {name, arity}, caller.line, spec}, state}
end
# TODO: Remove char_list type by v2.0
defp built_in_type?(:char_list, 0), do: true
defp built_in_type?(:charlist, 0), do: true
defp built_in_type?(:as_boolean, 1), do: true
defp built_in_type?(:struct, 0), do: true
defp built_in_type?(:nonempty_charlist, 0), do: true
defp built_in_type?(:keyword, 0), do: true
defp built_in_type?(:keyword, 1), do: true
defp built_in_type?(:var, 0), do: true
defp built_in_type?(name, arity), do: :erl_internal.is_type(name, arity)
defp ensure_no_defaults!(args) do
fun = fn
{:"::", _, [left, right]} ->
ensure_not_default(left)
ensure_not_default(right)
left
other ->
ensure_not_default(other)
other
end
:lists.foreach(fun, args)
end
defp ensure_not_default({:\\, _, [_, _]}) do
raise ArgumentError, "default arguments \\\\ not supported in typespecs"
end
defp ensure_not_default(_), do: :ok
defp guard_to_constraints(guard, vars, meta, caller, state) do
line = line(meta)
fun = fn
{_name, {:var, _, context}}, {constraints, state} when is_atom(context) ->
{constraints, state}
{name, type}, {constraints, state} ->
{spec, state} = typespec(type, vars, caller, state)
constraint = [{:atom, line, :is_subtype}, [{:var, line, name}, spec]]
state = update_local_vars(state, name)
{[{:type, line, :constraint, constraint} | constraints], state}
end
{constraints, state} = :lists.foldl(fun, {[], state}, guard)
{:lists.reverse(constraints), state}
end
## To typespec conversion
defp line(meta) do
Keyword.get(meta, :line, 0)
end
# Handle unions
defp typespec({:|, meta, [_, _]} = exprs, vars, caller, state) do
exprs = collect_union(exprs)
{union, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, exprs)
{{:type, line(meta), :union, union}, state}
end
# Handle binaries
defp typespec({:<<>>, meta, []}, _, _, state) do
line = line(meta)
{{:type, line, :binary, [{:integer, line, 0}, {:integer, line, 0}]}, state}
end
defp typespec(
{:<<>>, meta, [{:"::", unit_meta, [{:_, _, ctx1}, {:*, _, [{:_, _, ctx2}, unit]}]}]},
_,
_,
state
)
when is_atom(ctx1) and is_atom(ctx2) and unit in 1..256 do
line = line(meta)
{{:type, line, :binary, [{:integer, line, 0}, {:integer, line(unit_meta), unit}]}, state}
end
defp typespec({:<<>>, meta, [{:"::", size_meta, [{:_, _, ctx}, size]}]}, _, _, state)
when is_atom(ctx) and is_integer(size) and size >= 0 do
line = line(meta)
{{:type, line, :binary, [{:integer, line(size_meta), size}, {:integer, line, 0}]}, state}
end
defp typespec(
{
:<<>>,
meta,
[
{:"::", size_meta, [{:_, _, ctx1}, size]},
{:"::", unit_meta, [{:_, _, ctx2}, {:*, _, [{:_, _, ctx3}, unit]}]}
]
},
_,
_,
state
)
when is_atom(ctx1) and is_atom(ctx2) and is_atom(ctx3) and is_integer(size) and
size >= 0 and unit in 1..256 do
args = [{:integer, line(size_meta), size}, {:integer, line(unit_meta), unit}]
{{:type, line(meta), :binary, args}, state}
end
defp typespec({:<<>>, _meta, _args}, _vars, caller, _state) do
message =
"invalid binary specification, expected <<_::size>>, <<_::_*unit>>, " <>
"or <<_::size, _::_*unit>> with size being non-negative integers, and unit being an integer between 1 and 256"
compile_error(caller, message)
end
## Handle maps and structs
defp typespec({:map, meta, args}, _vars, _caller, state) when args == [] or is_atom(args) do
{{:type, line(meta), :map, :any}, state}
end
defp typespec({:%{}, meta, fields} = map, vars, caller, state) do
fun = fn
{{:required, meta2, [k]}, v}, state ->
{arg1, state} = typespec(k, vars, caller, state)
{arg2, state} = typespec(v, vars, caller, state)
{{:type, line(meta2), :map_field_exact, [arg1, arg2]}, state}
{{:optional, meta2, [k]}, v}, state ->
{arg1, state} = typespec(k, vars, caller, state)
{arg2, state} = typespec(v, vars, caller, state)
{{:type, line(meta2), :map_field_assoc, [arg1, arg2]}, state}
{k, v}, state ->
{arg1, state} = typespec(k, vars, caller, state)
{arg2, state} = typespec(v, vars, caller, state)
{{:type, line(meta), :map_field_exact, [arg1, arg2]}, state}
{:|, _, [_, _]}, _state ->
error =
"invalid map specification. When using the | operator in the map key, " <>
"make sure to wrap the key type in parentheses: #{Macro.to_string(map)}"
compile_error(caller, error)
_, _state ->
compile_error(caller, "invalid map specification: #{Macro.to_string(map)}")
end
{fields, state} = :lists.mapfoldl(fun, state, fields)
{{:type, line(meta), :map, fields}, state}
end
defp typespec({:%, _, [name, {:%{}, meta, fields}]}, vars, caller, state) do
module = Macro.expand(name, %{caller | function: {:__info__, 1}})
struct =
module
|> Macro.struct!(caller)
|> Map.delete(:__struct__)
|> Map.to_list()
unless Keyword.keyword?(fields) do
compile_error(caller, "expected key-value pairs in struct #{Macro.to_string(name)}")
end
types =
:lists.map(
fn
{:__exception__ = field, true} -> {field, Keyword.get(fields, field, true)}
{field, _} -> {field, Keyword.get(fields, field, quote(do: term()))}
end,
:lists.sort(struct)
)
fun = fn {field, _} ->
unless Keyword.has_key?(struct, field) do
compile_error(
caller,
"undefined field #{inspect(field)} on struct #{inspect(module)}"
)
end
end
:lists.foreach(fun, fields)
typespec({:%{}, meta, [__struct__: module] ++ types}, vars, caller, state)
end
# Handle records
defp typespec({:record, meta, [atom]}, vars, caller, state) do
typespec({:record, meta, [atom, []]}, vars, caller, state)
end
defp typespec({:record, meta, [tag, field_specs]}, vars, caller, state)
when is_atom(tag) and is_list(field_specs) do
# We cannot set a function name to avoid tracking
# as a compile time dependency because for records it actually is one.
case Macro.expand({tag, [], [{:{}, [], []}]}, caller) do
{_, _, [name, fields | _]} when is_list(fields) ->
types =
:lists.map(
fn {field, _} ->
{:"::", [],
[
{field, [], nil},
Keyword.get(field_specs, field, quote(do: term()))
]}
end,
fields
)
fun = fn {field, _} ->
unless Keyword.has_key?(fields, field) do
compile_error(caller, "undefined field #{field} on record #{inspect(tag)}")
end
end
:lists.foreach(fun, field_specs)
typespec({:{}, meta, [name | types]}, vars, caller, state)
_ ->
compile_error(caller, "unknown record #{inspect(tag)}")
end
end
defp typespec({:record, _meta, [_tag, _field_specs]}, _vars, caller, _state) do
message = "invalid record specification, expected the record name to be an atom literal"
compile_error(caller, message)
end
# Handle ranges
defp typespec({:.., meta, [left, right]}, vars, caller, state) do
{left, state} = typespec(left, vars, caller, state)
{right, state} = typespec(right, vars, caller, state)
:ok = validate_range(left, right, caller)
{{:type, line(meta), :range, [left, right]}, state}
end
# Handle special forms
defp typespec({:__MODULE__, _, atom}, vars, caller, state) when is_atom(atom) do
typespec(caller.module, vars, caller, state)
end
defp typespec({:__aliases__, _, _} = alias, vars, caller, state) do
typespec(expand_remote(alias, caller), vars, caller, state)
end
# Handle funs
defp typespec([{:->, meta, [args, return]}], vars, caller, state)
when is_list(args) do
{args, state} = fn_args(meta, args, return, vars, caller, state)
{{:type, line(meta), :fun, args}, state}
end
# Handle type operator
defp typespec(
{:"::", meta, [{var_name, var_meta, context}, expr]} = ann_type,
vars,
caller,
state
)
when is_atom(var_name) and is_atom(context) do
case typespec(expr, vars, caller, state) do
{{:ann_type, _, _}, _state} ->
message =
"invalid type annotation. Type annotations cannot be nested: " <>
"#{Macro.to_string(ann_type)}"
# TODO: Make this an error on v2.0 and remove the code below
:elixir_errors.erl_warn(caller.line, caller.file, message)
# This may be generating an invalid typespec but we need to generate it
# to avoid breaking existing code that was valid but only broke Dialyzer
{right, state} = typespec(expr, vars, caller, state)
{{:ann_type, line(meta), [{:var, line(var_meta), var_name}, right]}, state}
{right, state} ->
{{:ann_type, line(meta), [{:var, line(var_meta), var_name}, right]}, state}
end
end
defp typespec({:"::", meta, [left, right]}, vars, caller, state) do
message =
"invalid type annotation. The left side of :: must be a variable, got: #{Macro.to_string(left)}"
message =
case left do
{:|, _, _} ->
message <>
". Note \"left | right :: ann\" is the same as \"(left | right) :: ann\". " <>
"To solve this, use parentheses around the union operands: \"left | (right :: ann)\""
_ ->
message
end
# TODO: Make this an error on v2.0, and remove the code below and
# the :undefined_type_error_enabled? key from the state
:elixir_errors.erl_warn(caller.line, caller.file, message)
# This may be generating an invalid typespec but we need to generate it
# to avoid breaking existing code that was valid but only broke Dialyzer
state = %{state | undefined_type_error_enabled?: false}
{left, state} = typespec(left, vars, caller, state)
state = %{state | undefined_type_error_enabled?: true}
{right, state} = typespec(right, vars, caller, state)
{{:ann_type, line(meta), [left, right]}, state}
end
# Handle unary ops
defp typespec({op, meta, [integer]}, _, _, state) when op in [:+, :-] and is_integer(integer) do
line = line(meta)
{{:op, line, op, {:integer, line, integer}}, state}
end
# Handle remote calls in the form of @module_attribute.type.
# These are not handled by the general remote type clause as calling
# Macro.expand/2 on the remote does not expand module attributes (but expands
# things like __MODULE__).
defp typespec(
{{:., meta, [{:@, _, [{attr, _, _}]}, name]}, _, args} = orig,
vars,
caller,
state
) do
remote = Module.get_attribute(caller.module, attr)
unless is_atom(remote) and remote != nil do
message =
"invalid remote in typespec: #{Macro.to_string(orig)} (@#{attr} is #{inspect(remote)})"
compile_error(caller, message)
end
{remote_spec, state} = typespec(remote, vars, caller, state)
{name_spec, state} = typespec(name, vars, caller, state)
type = {remote_spec, meta, name_spec, args}
remote_type(type, vars, caller, state)
end
# Handle remote calls
defp typespec({{:., meta, [remote, name]}, _, args} = orig, vars, caller, state) do
remote = expand_remote(remote, caller)
cond do
not is_atom(remote) ->
compile_error(caller, "invalid remote in typespec: #{Macro.to_string(orig)}")
remote == caller.module ->
typespec({name, meta, args}, vars, caller, state)
true ->
{remote_spec, state} = typespec(remote, vars, caller, state)
{name_spec, state} = typespec(name, vars, caller, state)
type = {remote_spec, meta, name_spec, args}
remote_type(type, vars, caller, state)
end
end
# Handle tuples
defp typespec({:tuple, meta, []}, _vars, _caller, state) do
{{:type, line(meta), :tuple, :any}, state}
end
defp typespec({:{}, meta, t}, vars, caller, state) when is_list(t) do
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, t)
{{:type, line(meta), :tuple, args}, state}
end
defp typespec({left, right}, vars, caller, state) do
typespec({:{}, [], [left, right]}, vars, caller, state)
end
# Handle blocks
defp typespec({:__block__, _meta, [arg]}, vars, caller, state) do
typespec(arg, vars, caller, state)
end
# Handle variables or local calls
defp typespec({name, meta, atom}, vars, caller, state) when is_atom(atom) do
if :lists.member(name, vars) do
state = update_local_vars(state, name)
{{:var, line(meta), name}, state}
else
typespec({name, meta, []}, vars, caller, state)
end
end
# Handle local calls
defp typespec({:string, meta, args}, vars, caller, state) do
warning =
"string() type use is discouraged. " <>
"For character lists, use charlist() type, for strings, String.t()\n" <>
Exception.format_stacktrace(Macro.Env.stacktrace(caller))
:elixir_errors.erl_warn(caller.line, caller.file, warning)
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
{{:type, line(meta), :string, args}, state}
end
defp typespec({:nonempty_string, meta, args}, vars, caller, state) do
warning =
"nonempty_string() type use is discouraged. " <>
"For non-empty character lists, use nonempty_charlist() type, for strings, String.t()\n" <>
Exception.format_stacktrace(Macro.Env.stacktrace(caller))
:elixir_errors.erl_warn(caller.line, caller.file, warning)
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
{{:type, line(meta), :nonempty_string, args}, state}
end
defp typespec({type, _meta, []}, vars, caller, state) when type in [:charlist, :char_list] do
if type == :char_list do
warning = "the char_list() type is deprecated, use charlist()"
:elixir_errors.erl_warn(caller.line, caller.file, warning)
end
typespec(quote(do: :elixir.charlist()), vars, caller, state)
end
defp typespec({:nonempty_charlist, _meta, []}, vars, caller, state) do
typespec(quote(do: :elixir.nonempty_charlist()), vars, caller, state)
end
defp typespec({:struct, _meta, []}, vars, caller, state) do
typespec(quote(do: :elixir.struct()), vars, caller, state)
end
defp typespec({:as_boolean, _meta, [arg]}, vars, caller, state) do
typespec(quote(do: :elixir.as_boolean(unquote(arg))), vars, caller, state)
end
defp typespec({:keyword, _meta, args}, vars, caller, state) when length(args) <= 1 do
typespec(quote(do: :elixir.keyword(unquote_splicing(args))), vars, caller, state)
end
defp typespec({:fun, meta, args}, vars, caller, state) do
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
{{:type, line(meta), :fun, args}, state}
end
defp typespec({name, meta, args}, vars, caller, state) do
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
arity = length(args)
case :erl_internal.is_type(name, arity) do
true ->
{{:type, line(meta), name, args}, state}
false ->
if state.undefined_type_error_enabled? and
not Map.has_key?(state.defined_type_pairs, {name, arity}) do
compile_error(
caller,
"type #{name}/#{arity} undefined (no such type in #{inspect(caller.module)})"
)
end
state =
if :lists.member({name, arity}, state.used_type_pairs) do
state
else
%{state | used_type_pairs: [{name, arity} | state.used_type_pairs]}
end
{{:user_type, line(meta), name, args}, state}
end
end
# Handle literals
defp typespec(atom, _, _, state) when is_atom(atom) do
{{:atom, 0, atom}, state}
end
defp typespec(integer, _, _, state) when is_integer(integer) do
{{:integer, 0, integer}, state}
end
defp typespec([], vars, caller, state) do
typespec({nil, [], []}, vars, caller, state)
end
defp typespec([{:..., _, atom}], vars, caller, state) when is_atom(atom) do
typespec({:nonempty_list, [], []}, vars, caller, state)
end
defp typespec([spec, {:..., _, atom}], vars, caller, state) when is_atom(atom) do
typespec({:nonempty_list, [], [spec]}, vars, caller, state)
end
defp typespec([spec], vars, caller, state) do
typespec({:list, [], [spec]}, vars, caller, state)
end
defp typespec(list, vars, caller, state) when is_list(list) do
[head | tail] = :lists.reverse(list)
union =
:lists.foldl(
fn elem, acc -> {:|, [], [validate_kw(elem, list, caller), acc]} end,
validate_kw(head, list, caller),
tail
)
typespec({:list, [], [union]}, vars, caller, state)
end
defp typespec(other, _vars, caller, _state) do
compile_error(caller, "unexpected expression in typespec: #{Macro.to_string(other)}")
end
## Helpers
# This is a backport of Macro.expand/2 because we want to expand
# aliases but we don't them to become compile-time references.
defp expand_remote({:__aliases__, _, _} = alias, env) do
case :elixir_aliases.expand(alias, env) do
receiver when is_atom(receiver) ->
receiver
aliases ->
aliases = :lists.map(&Macro.expand_once(&1, env), aliases)
case :lists.all(&is_atom/1, aliases) do
true -> :elixir_aliases.concat(aliases)
false -> alias
end
end
end
defp expand_remote(other, env), do: Macro.expand(other, env)
defp compile_error(caller, desc) do
raise CompileError, file: caller.file, line: caller.line, description: desc
end
defp remote_type({remote, meta, name, args}, vars, caller, state) do
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
{{:remote_type, line(meta), [remote, name, args]}, state}
end
defp collect_union({:|, _, [a, b]}), do: [a | collect_union(b)]
defp collect_union(v), do: [v]
defp validate_kw({key, _} = t, _, _caller) when is_atom(key), do: t
defp validate_kw(_, original, caller) do
compile_error(caller, "unexpected list in typespec: #{Macro.to_string(original)}")
end
defp validate_range({:op, _, :-, {:integer, meta, first}}, last, caller) do
validate_range({:integer, meta, -first}, last, caller)
end
defp validate_range(first, {:op, _, :-, {:integer, meta, last}}, caller) do
validate_range(first, {:integer, meta, -last}, caller)
end
defp validate_range({:integer, _, first}, {:integer, _, last}, _caller) when first < last do
:ok
end
defp validate_range(_, _, caller) do
message =
"invalid range specification, expected both sides to be integers, " <>
"with the left side lower than the right side"
compile_error(caller, message)
end
defp fn_args(meta, args, return, vars, caller, state) do
{fun_args, state} = fn_args(meta, args, vars, caller, state)
{spec, state} = typespec(return, vars, caller, state)
case [fun_args, spec] do
[{:type, _, :any}, {:type, _, :any, []}] -> {[], state}
x -> {x, state}
end
end
defp fn_args(meta, [{:..., _, _}], _vars, _caller, state) do
{{:type, line(meta), :any}, state}
end
defp fn_args(meta, args, vars, caller, state) do
{args, state} = :lists.mapfoldl(&typespec(&1, vars, caller, &2), state, args)
{{:type, line(meta), :product, args}, state}
end
defp variable({name, meta, args}) when is_atom(name) and is_atom(args) do
{:var, line(meta), name}
end
defp variable(expr), do: expr
defp clean_local_state(state) do
%{state | local_vars: %{}}
end
defp update_local_vars(%{local_vars: local_vars} = state, var_name) do
case Map.fetch(local_vars, var_name) do
{:ok, :used_once} -> %{state | local_vars: Map.put(local_vars, var_name, :used_multiple)}
{:ok, :used_multiple} -> state
:error -> %{state | local_vars: Map.put(local_vars, var_name, :used_once)}
end
end
defp ensure_no_underscore_local_vars!(caller, var_names) do
case :lists.member(:_, var_names) do
true ->
compile_error(caller, "type variable '_' is invalid")
false ->
:ok
end
end
defp ensure_no_unused_local_vars!(caller, local_vars) do
fun = fn {name, used_times} ->
case {:erlang.atom_to_list(name), used_times} do
{[?_ | _], :used_once} ->
:ok
{[?_ | _], :used_multiple} ->
warning =
"the underscored type variable \"#{name}\" is used more than once in the " <>
"type specification. A leading underscore indicates that the value of the " <>
"variable should be ignored. If this is intended please rename the variable to " <>
"remove the underscore"
:elixir_errors.erl_warn(caller.line, caller.file, warning)
{_, :used_once} ->
compile_error(
caller,
"type variable #{name} is used only once. Type variables in typespecs " <>
"must be referenced at least twice, otherwise it is equivalent to term()"
)
_ ->
:ok
end
end
:lists.foreach(fun, :maps.to_list(local_vars))
end
end
| 32.600952 | 118 | 0.614969 |
ffa5c2e875765b20fc69c1c3cb4fc1321db10714 | 4,253 | exs | Elixir | test/attempt_test.exs | kipcole9/attempt | 4f085702982b591cc258f703b345aa038d35db8d | [
"Apache-2.0"
] | 2 | 2018-01-04T07:48:49.000Z | 2018-01-06T10:20:46.000Z | test/attempt_test.exs | kipcole9/attempt | 4f085702982b591cc258f703b345aa038d35db8d | [
"Apache-2.0"
] | 1 | 2018-01-12T22:28:00.000Z | 2018-01-12T22:28:00.000Z | test/attempt_test.exs | kipcole9/attempt | 4f085702982b591cc258f703b345aa038d35db8d | [
"Apache-2.0"
] | null | null | null | defmodule AttemptTest do
use ExUnit.Case
import ExUnit.CaptureIO
alias Attempt.Bucket
setup do
on_exit(fn ->
Bucket.Token.stop(:test)
end)
end
test "create a token bucket" do
assert Attempt.Bucket.Token.new(:test) ==
{:ok, %Attempt.Bucket.Token{
burst_size: 10,
fill_rate: 3,
max_queue_length: 100,
name: :test,
queue: nil,
tokens: 0
}}
end
test "creating an existing bucket returns an error" do
{:ok, _bucket} = Attempt.Bucket.Token.new(:test)
assert Attempt.Bucket.Token.new(:test) ==
{
:error,
{Attempt.TokenBucket.AlreadyStartedError, "Bucket :test is already started"},
%Attempt.Bucket.Token{
burst_size: 10,
fill_rate: 3,
max_queue_length: 100,
name: :test,
queue: nil,
tokens: 0
}
}
end
test "that we can return the bucket state" do
{:ok, bucket} = Attempt.Bucket.Token.new(:test)
assert Bucket.state(bucket) ==
{:ok, %Attempt.Bucket.Token{
burst_size: 10,
fill_rate: 3,
max_queue_length: 100,
name: :test,
queue: {[], []},
tokens: 10
}}
end
test "a simple excution with all default options" do
assert Attempt.run(fn -> :ok end) == :ok
end
test "that we execute the right number of tries" do
for tries <- 1..10 do
output =
capture_io(fn ->
assert Attempt.run(
fn ->
IO.puts("try")
:error
end,
tries: tries
) == :error
end)
assert tries == String.split(output, "\n", trim: true) |> Enum.count()
end
end
test "that we short circuit the tries if we get a ok return" do
exit_on_count = 3
Process.put(:count, 1)
output =
capture_io(fn ->
assert Attempt.run(
fn ->
IO.puts("Try")
if Process.get(:count) == exit_on_count do
:ok
else
Process.put(:count, Process.get(:count) + 1)
:error
end
end,
tries: 5
) == :ok
end)
assert exit_on_count == Process.get(:count)
assert exit_on_count == String.split(output, "\n", trim: true) |> Enum.count()
end
test "that we timeout on the bucket if we exceed the burst" do
tries = 5
timeout = 500
bucket_name = :test
{:ok, bucket} = Attempt.Bucket.Token.new(bucket_name, fill_rate: 1_000, burst_size: tries - 1)
output =
capture_io(fn ->
assert Attempt.run(
fn ->
IO.puts("Try")
:error
end,
tries: tries,
token_bucket: bucket,
timeout: 500
) ==
{
:error,
{
Attempt.TokenBucket.TimeoutError,
"Token request for bucket #{inspect(bucket_name)} timed out after #{timeout} milliseconds"
}
}
end)
assert tries - 1 == String.split(output, "\n", trim: true) |> Enum.count()
end
test "that exponential backoff works" do
tries = 10
strategy = Attempt.Retry.Backoff.Exponential
bucket_name = :test
{:ok, bucket} = Attempt.Bucket.Token.new(bucket_name, fill_rate: 1_000, burst_size: tries - 1)
{time, _} =
:timer.tc(fn ->
Attempt.run(
fn ->
:error
end,
tries: tries,
token_bucket: bucket,
backoff_strategy: strategy
) == :error
end)
time = div(time, 1000)
estimated_time =
Enum.reduce(1..tries, 0, fn i, acc ->
acc + strategy.delay(%Attempt.Retry.Budget{current_try: i})
end)
assert_in_delta estimated_time, time, estimated_time * 0.03
end
end
| 26.416149 | 111 | 0.484129 |
ffa5c6e3f32f08db591c32745a3b6657006253c4 | 2,632 | ex | Elixir | lib/xdr/types/variable_array.ex | revelrylabs/exdr | 4a3a5d65e2d0ba2daba32c5f648cbe401176e2e1 | [
"MIT"
] | 3 | 2020-01-12T03:16:38.000Z | 2020-09-06T03:37:03.000Z | lib/xdr/types/variable_array.ex | revelrylabs/exdr | 4a3a5d65e2d0ba2daba32c5f648cbe401176e2e1 | [
"MIT"
] | 7 | 2019-10-14T14:27:31.000Z | 2020-05-01T22:27:52.000Z | lib/xdr/types/variable_array.ex | revelrylabs/exdr | 4a3a5d65e2d0ba2daba32c5f648cbe401176e2e1 | [
"MIT"
] | 1 | 2020-02-06T23:19:24.000Z | 2020-02-06T23:19:24.000Z | defmodule XDR.Type.VariableArray do
@moduledoc """
A variable-length array of some other type
"""
alias XDR.Size
defstruct type_name: "VariableArray", data_type: nil, max_length: Size.max(), values: []
@type t() :: %__MODULE__{
type_name: String.t(),
data_type: XDR.Type.t(),
max_length: XDR.Size.t(),
values: list(XDR.Type.t())
}
@type options() :: [type: XDR.Type.key(), max_length: XDR.Size.t()]
defimpl XDR.Type do
alias XDR.Error
def build_type(type, type: data_type, max_length: max_length) when is_integer(max_length) do
%{type | data_type: data_type, max_length: max_length}
end
def build_type(type, options) do
build_type(type, Keyword.merge(options, max_length: Size.max()))
end
def resolve_type!(%{data_type: data_type} = type, %{} = custom_types) do
%{type | data_type: XDR.Type.resolve_type!(data_type, custom_types)}
end
def build_value!(
%{data_type: data_type, max_length: max_length, type_name: name} = type,
raw_values
)
when is_list(raw_values) do
if length(raw_values) > max_length do
raise Error,
message: "Input values too long, expected a max of #{max_length} values",
type: name,
data: raw_values
end
values =
raw_values
|> Enum.with_index()
|> Enum.map(fn {value, index} ->
Error.wrap_call(:build_value!, [data_type, value], index)
end)
%{type | values: values}
end
def extract_value!(%{values: values}) do
values
|> Enum.with_index()
|> Enum.map(fn {type_with_value, index} ->
Error.wrap_call(:extract_value!, [type_with_value], index)
end)
end
def encode!(%{values: values}) do
encoded_length = Size.encode(length(values))
encoded_values =
values
|> Enum.with_index()
|> Enum.map(fn {type_with_value, index} ->
Error.wrap_call(:encode!, [type_with_value], index)
end)
|> Enum.join()
encoded_length <> encoded_values
end
def decode!(%{data_type: data_type} = type, full_encoding) do
{length, encoding} = Size.decode!(full_encoding)
{reversed_values, rest} =
Enum.reduce(0..(length - 1), {[], encoding}, fn index, {vals, prev_rest} ->
{current_value, next_rest} =
Error.wrap_call(XDR.Type, :decode!, [data_type, prev_rest], index)
{[current_value | vals], next_rest}
end)
{%{type | values: Enum.reverse(reversed_values)}, rest}
end
end
end
| 28.923077 | 96 | 0.602964 |
ffa5dd4f89d9b7f2251197652e762150ec982d72 | 6,872 | ex | Elixir | clients/double_click_bid_manager/lib/google_api/double_click_bid_manager/v11/api/lineitems.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/double_click_bid_manager/lib/google_api/double_click_bid_manager/v11/api/lineitems.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/double_click_bid_manager/lib/google_api/double_click_bid_manager/v11/api/lineitems.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DoubleClickBidManager.V11.Api.Lineitems do
@moduledoc """
API calls for all endpoints tagged `Lineitems`.
"""
alias GoogleApi.DoubleClickBidManager.V11.Connection
alias GoogleApi.Gax.{Request, Response}
@library_version Mix.Project.config() |> Keyword.get(:version, "")
@doc """
Retrieves line items in CSV format. YouTube & partners line items are not supported.
## Parameters
* `connection` (*type:* `GoogleApi.DoubleClickBidManager.V11.Connection.t`) - Connection to server
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:body` (*type:* `GoogleApi.DoubleClickBidManager.V11.Model.DownloadLineItemsRequest.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.DoubleClickBidManager.V11.Model.DownloadLineItemsResponse{}}` on success
* `{:error, info}` on failure
"""
@spec doubleclickbidmanager_lineitems_downloadlineitems(
Tesla.Env.client(),
keyword(),
keyword()
) ::
{:ok, GoogleApi.DoubleClickBidManager.V11.Model.DownloadLineItemsResponse.t()}
| {:ok, Tesla.Env.t()}
| {:error, any()}
def doubleclickbidmanager_lineitems_downloadlineitems(
connection,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url("/lineitems/downloadlineitems", %{})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.DoubleClickBidManager.V11.Model.DownloadLineItemsResponse{}]
)
end
@doc """
Uploads line items in CSV format. YouTube & partners line items are not supported.
## Parameters
* `connection` (*type:* `GoogleApi.DoubleClickBidManager.V11.Connection.t`) - Connection to server
* `optional_params` (*type:* `keyword()`) - Optional parameters
* `:"$.xgafv"` (*type:* `String.t`) - V1 error format.
* `:access_token` (*type:* `String.t`) - OAuth access token.
* `:alt` (*type:* `String.t`) - Data format for response.
* `:callback` (*type:* `String.t`) - JSONP
* `:fields` (*type:* `String.t`) - Selector specifying which fields to include in a partial response.
* `:key` (*type:* `String.t`) - API key. Your API key identifies your project and provides you with API access, quota, and reports. Required unless you provide an OAuth 2.0 token.
* `:oauth_token` (*type:* `String.t`) - OAuth 2.0 token for the current user.
* `:prettyPrint` (*type:* `boolean()`) - Returns response with indentations and line breaks.
* `:quotaUser` (*type:* `String.t`) - Available to use for quota purposes for server-side applications. Can be any arbitrary string assigned to a user, but should not exceed 40 characters.
* `:uploadType` (*type:* `String.t`) - Legacy upload protocol for media (e.g. "media", "multipart").
* `:upload_protocol` (*type:* `String.t`) - Upload protocol for media (e.g. "raw", "multipart").
* `:body` (*type:* `GoogleApi.DoubleClickBidManager.V11.Model.UploadLineItemsRequest.t`) -
* `opts` (*type:* `keyword()`) - Call options
## Returns
* `{:ok, %GoogleApi.DoubleClickBidManager.V11.Model.UploadLineItemsResponse{}}` on success
* `{:error, info}` on failure
"""
@spec doubleclickbidmanager_lineitems_uploadlineitems(Tesla.Env.client(), keyword(), keyword()) ::
{:ok, GoogleApi.DoubleClickBidManager.V11.Model.UploadLineItemsResponse.t()}
| {:ok, Tesla.Env.t()}
| {:error, any()}
def doubleclickbidmanager_lineitems_uploadlineitems(
connection,
optional_params \\ [],
opts \\ []
) do
optional_params_config = %{
:"$.xgafv" => :query,
:access_token => :query,
:alt => :query,
:callback => :query,
:fields => :query,
:key => :query,
:oauth_token => :query,
:prettyPrint => :query,
:quotaUser => :query,
:uploadType => :query,
:upload_protocol => :query,
:body => :body
}
request =
Request.new()
|> Request.method(:post)
|> Request.url("/lineitems/uploadlineitems", %{})
|> Request.add_optional_params(optional_params_config, optional_params)
|> Request.library_version(@library_version)
connection
|> Connection.execute(request)
|> Response.decode(
opts ++ [struct: %GoogleApi.DoubleClickBidManager.V11.Model.UploadLineItemsResponse{}]
)
end
end
| 42.95 | 196 | 0.64188 |
ffa62614f02e583213fec771f8a067e91a9f4c30 | 1,853 | ex | Elixir | lib/telegramex/http_client.ex | thiamsantos/telegramex | 475595aca8df02122579236b3a021b5349a77f7a | [
"Apache-2.0"
] | 1 | 2021-02-17T13:30:06.000Z | 2021-02-17T13:30:06.000Z | lib/telegramex/http_client.ex | thiamsantos/telegramex | 475595aca8df02122579236b3a021b5349a77f7a | [
"Apache-2.0"
] | 1 | 2021-03-14T21:39:33.000Z | 2021-03-14T21:39:33.000Z | lib/telegramex/http_client.ex | thiamsantos/telegramex | 475595aca8df02122579236b3a021b5349a77f7a | [
"Apache-2.0"
] | null | null | null | defmodule Telegramex.HTTPClient do
@moduledoc """
Specifies the behaviour of a HTTP Client.
You can switch the default HTTP client which uses [finch](https://github.com/keathley/finch) underneath
by defining a different implementation by setting the `:http_client`
configuration in `Telegramex.Client`:
client = %Telegramex.Client{http_client: {MyCustomHTTPClient, []}}
Telegramex.get_updates(client)
"""
@doc """
Sends an HTTP request.
"""
@callback request(
method :: atom(),
url :: binary(),
headers :: list(),
body :: iodata(),
opts :: Keyword.t()
) ::
{:ok, %{status: integer(), headers: [{binary(), binary()}], body: binary()}}
| {:error, term()}
if Code.ensure_loaded?(Finch) do
def request(method, url, headers, body, opts) do
{name, opts} = pop!(opts, :name)
method
|> Finch.build(url, headers, body)
|> Finch.request(name, opts)
|> to_response()
end
defp to_response({:ok, response}) do
{:ok, %{status: response.status, headers: response.headers, body: response.body}}
end
defp to_response({:error, reason}), do: {:error, reason}
def pop!(keywords, key) when is_list(keywords) and is_atom(key) do
case Keyword.fetch(keywords, key) do
{:ok, value} -> {value, Keyword.delete(keywords, key)}
:error -> raise KeyError, key: key, term: keywords
end
end
else
def request(_, _, _, _, _) do
raise ArgumentError, """
Could not find finch dependency.
Please add :finch to your dependencies:
{:finch, "~> 0.5"}
Or provide your own #{inspect(__MODULE__)} implementation:
%Telegramex.Client{http_client: {MyCustomHTTPClient, []}}
"""
end
end
end
| 28.953125 | 105 | 0.594711 |
ffa66762c1ac78308904ece8db698dab08afe41c | 6,022 | ex | Elixir | lib/timezone/timezone_info.ex | shahryarjb/timex | d7cc4728c4275fba5ff23cf2fb50b8a46cfd9b0a | [
"MIT"
] | 1,623 | 2015-01-03T16:53:19.000Z | 2022-03-27T01:25:50.000Z | lib/timezone/timezone_info.ex | shahryarjb/timex | d7cc4728c4275fba5ff23cf2fb50b8a46cfd9b0a | [
"MIT"
] | 654 | 2015-01-04T23:59:47.000Z | 2022-03-08T01:02:01.000Z | lib/timezone/timezone_info.ex | shahryarjb/timex | d7cc4728c4275fba5ff23cf2fb50b8a46cfd9b0a | [
"MIT"
] | 428 | 2015-01-04T19:37:37.000Z | 2022-03-31T10:48:44.000Z | defmodule Timex.TimezoneInfo do
@moduledoc """
All relevant timezone information for a given period, i.e. Europe/Moscow on March 3rd, 2013
Notes:
- `full_name` is the name of the zone, but does not indicate anything about the current period (i.e. CST vs CDT)
- `abbreviation` is the abbreviated name for the zone in the current period, i.e. "America/Chicago" on 3/30/15 is "CDT"
- `offset_std` is the offset in seconds from standard time for this period
- `offset_utc` is the offset in seconds from UTC for this period
Spec:
- `day_of_week`: :sunday, :monday, :tuesday, etc
- `datetime`: {{year, month, day}, {hour, minute, second}}
- `from`: :min | {day_of_week, datetime}, when this zone starts
- `until`: :max | {day_of_week, datetime}, when this zone ends
"""
defstruct full_name: "Etc/UTC",
abbreviation: "UTC",
offset_std: 0,
offset_utc: 0,
from: :min,
until: :max
@valid_day_names [:sunday, :monday, :tuesday, :wednesday, :thursday, :friday, :saturday]
@max_seconds_in_day 60 * 60 * 24
@type day_of_week :: :sunday | :monday | :tuesday | :wednesday | :thursday | :friday | :saturday
@type datetime :: {{non_neg_integer, 1..12, 1..31}, {0..24, 0..59, 0..60}}
@type offset :: -85399..85399
@type from_constraint :: :min | {day_of_week, datetime}
@type until_constraint :: :max | {day_of_week, datetime}
@type t :: %__MODULE__{
full_name: String.t(),
abbreviation: String.t(),
offset_std: offset,
offset_utc: offset,
from: from_constraint,
until: until_constraint
}
@doc """
Create a custom timezone if a built-in one does not meet your needs.
You must provide the name, abbreviation, offset from UTC, daylight savings time offset,
and the from/until reference points for when the zone takes effect and ends.
To clarify the two offsets, `offset_utc` is the absolute offset relative to UTC,
`offset_std` is the offset to apply to `offset_utc` which gives us the offset from UTC
during daylight savings time for this timezone. If DST does not apply for this zone, simply
set it to 0.
The from/until reference points must meet the following criteria:
- Be set to `:min` for from, or `:max` for until, which represent
"infinity" for the start/end of the zone period.
- OR, be a tuple of {day_of_week, datetime}, where:
- `day_of_week` is an atom like `:sunday`
- `datetime` is an Erlang datetime tuple, e.g. `{{2016,10,8},{2,0,0}}`
*IMPORTANT*: Offsets are in seconds, not minutes, if you do not ensure they
are in the correct unit, runtime errors or incorrect results are probable.
## Examples
iex> #{__MODULE__}.create("Etc/Test", "TST", 120*60, 0, :min, :max)
%TimezoneInfo{full_name: "Etc/Test", abbreviation: "TST", offset_std: 7200, offset_utc: 0, from: :min, until: :max}
...> #{__MODULE__}.create("Etc/Test", "TST", 24*60*60, 0, :min, :max)
{:error, "invalid timezone offset '86400'"}
"""
@spec create(String.t(), String.t(), offset, offset, from_constraint, until_constraint) ::
__MODULE__.t() | {:error, String.t()}
def create(name, abbr, offset_utc, offset_std, from, until) do
%__MODULE__{
full_name: name,
abbreviation: abbr,
offset_std: offset_std,
offset_utc: offset_utc,
from: from || :min,
until: until || :max
}
|> validate_and_return()
end
def from_datetime(%DateTime{
time_zone: name,
zone_abbr: abbr,
std_offset: std_offset,
utc_offset: utc_offset
}) do
%__MODULE__{
full_name: name,
abbreviation: abbr,
offset_std: std_offset,
offset_utc: utc_offset,
from: :min,
until: :max
}
end
@doc false
def to_period(%__MODULE__{offset_utc: utc, offset_std: std, abbreviation: abbr}) do
%{std_offset: std, utc_offset: utc, zone_abbr: abbr}
end
defp validate_and_return(%__MODULE__{} = tz) do
with true <- is_valid_name(tz.full_name),
true <- is_valid_name(tz.abbreviation),
true <- is_valid_offset(tz.offset_std),
true <- is_valid_offset(tz.offset_utc),
true <- is_valid_from_constraint(tz.from),
true <- is_valid_until_constraint(tz.until),
do: tz
end
defp is_valid_name(name) when is_binary(name), do: true
defp is_valid_name(name), do: {:error, "invalid timezone name '#{inspect(name)}'!"}
defp is_valid_offset(offset)
when is_integer(offset) and
(offset < @max_seconds_in_day and offset > -@max_seconds_in_day),
do: true
defp is_valid_offset(offset), do: {:error, "invalid timezone offset '#{inspect(offset)}'"}
defp is_valid_from_constraint(:min), do: true
defp is_valid_from_constraint(:max),
do: {:error, ":max is not a valid from constraint for timezones"}
defp is_valid_from_constraint(c), do: is_valid_constraint(c)
defp is_valid_until_constraint(:min),
do: {:error, ":min is not a valid until constraint for timezones"}
defp is_valid_until_constraint(:max), do: true
defp is_valid_until_constraint(c), do: is_valid_constraint(c)
defp is_valid_constraint({day_of_week, {{y, m, d}, {h, mm, s}}} = datetime)
when day_of_week in @valid_day_names do
cond do
:calendar.valid_date({y, m, d}) ->
valid_hour = h >= 1 and h <= 24
valid_min = mm >= 0 and mm <= 59
valid_sec = s >= 0 and s <= 59
cond do
valid_hour && valid_min && valid_sec ->
true
:else ->
{:error,
"invalid datetime constraint for timezone: #{inspect(datetime)} (invalid time)"}
end
:else ->
{:error, "invalid datetime constraint for timezone: #{inspect(datetime)} (invalid date)"}
end
end
defp is_valid_constraint(c),
do: {:error, "'#{inspect(c)}' is not a valid constraint for timezones"}
end
| 36.719512 | 123 | 0.641979 |
ffa673ec07ae17f750a2ec335d54702d309320ba | 2,296 | exs | Elixir | packages/elixir/test/interceptor_test.exs | venomnert/riptide | 39a50047bb51cd30a93ff7022bb01385c28a4f03 | [
"MIT"
] | 11 | 2020-01-20T16:42:00.000Z | 2021-09-18T11:36:45.000Z | packages/elixir/test/interceptor_test.exs | venomnert/riptide | 39a50047bb51cd30a93ff7022bb01385c28a4f03 | [
"MIT"
] | 16 | 2020-04-05T19:26:14.000Z | 2020-05-21T14:50:59.000Z | packages/elixir/test/interceptor_test.exs | venomnert/riptide | 39a50047bb51cd30a93ff7022bb01385c28a4f03 | [
"MIT"
] | 2 | 2020-10-16T11:01:57.000Z | 2021-01-20T21:11:52.000Z | defmodule Riptide.Test.Interceptor do
defmodule Example do
use Riptide.Interceptor
def mutation_before(["creatures", key], %{merge: %{"key" => _}}, _mut, _state) do
{:combine,
Riptide.Mutation.put_merge(["creatures", key, "created"], :os.system_time(:millisecond))}
end
def mutation_after(["creatures", key], %{merge: %{"key" => _}}, _mut, _state) do
Process.put(:after, true)
:ok
end
def mutation_effect(["creatures", key], %{merge: %{"key" => _}}, _mut, _state),
do: {:trigger, []}
def trigger() do
end
def query_before(["denied" | _rest], _opts, _state) do
{:error, :denied}
end
def query_resolve(["resolved" | path], _opts, _state) do
{_, creature} = Riptide.Test.Data.pet_hammerhead()
Dynamic.get(creature, path)
end
end
use ExUnit.Case
test "mutation_before" do
{key, creature} = Riptide.Test.Data.clean_tank()
{
:ok,
%{
merge: %{
"creatures" => %{
^key => %{
"created" => _
}
}
}
}
} =
Riptide.Interceptor.mutation_before(
Riptide.Mutation.put_merge(["creatures", key], creature),
%{},
[Example]
)
end
test "mutation_after" do
{key, creature} = Riptide.Test.Data.clean_tank()
:ok =
Riptide.Interceptor.mutation_after(
Riptide.Mutation.put_merge(["creatures", key], creature),
%{},
[Example]
)
true = Process.get(:after)
end
test "mutation_effect" do
{key, creature} = Riptide.Test.Data.clean_tank()
%{
merge: %{
"riptide:scheduler" => %{}
}
} =
Riptide.Interceptor.mutation_effect(
Riptide.Mutation.put_merge(["creatures", key], creature),
%{},
[Example]
)
end
test "query_resolve" do
{_key, creature} = Riptide.Test.Data.pet_hammerhead()
[{["resolved"], ^creature}] =
Riptide.Interceptor.query_resolve(
%{
"resolved" => %{}
},
%{},
[Example]
)
end
test "query_before" do
{:error, :denied} =
Riptide.Interceptor.query_before(
%{
"denied" => %{}
},
%{},
[Example]
)
end
end
| 21.457944 | 96 | 0.53223 |
ffa68587516db8a730c231b612e2498d5d0d765f | 406 | exs | Elixir | clients/text_to_speech/test/test_helper.exs | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | 1 | 2018-12-03T23:43:10.000Z | 2018-12-03T23:43:10.000Z | clients/text_to_speech/test/test_helper.exs | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | clients/text_to_speech/test/test_helper.exs | matehat/elixir-google-api | c1b2523c2c4cdc9e6ca4653ac078c94796b393c3 | [
"Apache-2.0"
] | null | null | null | ExUnit.start()
defmodule GoogleApi.TextToSpeech.V1beta1.TestHelper do
defmacro __using__(opts) do
quote do
use ExUnit.Case, unquote(opts)
import GoogleApi.TextToSpeech.V1beta1.TestHelper
end
end
def for_scope(scopes) when is_list(scopes), do: for_scope(Enum.join(scopes, " "))
def for_scope(scope) do
{:ok, token} = Goth.Token.for_scope(scope)
token.token
end
end
| 21.368421 | 83 | 0.711823 |
ffa688427ba6fbd8872e1b0dbbf155b330fed0f0 | 166 | ex | Elixir | lib/ex_keccak/impl.ex | alanwilhelm/ex_keccak | 49e722f4ddc6f7723d9612f51cd2dd6b97771886 | [
"Apache-2.0"
] | null | null | null | lib/ex_keccak/impl.ex | alanwilhelm/ex_keccak | 49e722f4ddc6f7723d9612f51cd2dd6b97771886 | [
"Apache-2.0"
] | null | null | null | lib/ex_keccak/impl.ex | alanwilhelm/ex_keccak | 49e722f4ddc6f7723d9612f51cd2dd6b97771886 | [
"Apache-2.0"
] | null | null | null | defmodule ExKeccak.Impl do
@moduledoc false
use Rustler, otp_app: :ex_keccak, crate: :exkeccak
def hash_256(_data), do: :erlang.nif_error(:nif_not_loaded)
end
| 23.714286 | 61 | 0.759036 |
ffa6abd34d301e6044e76e82faa38ed56257efc5 | 675 | ex | Elixir | lib/graphql/type/float.ex | seanabrahams/graphql-elixir | 568a07da317e19621cc0c931d738b4eda75f0a35 | [
"BSD-3-Clause"
] | null | null | null | lib/graphql/type/float.ex | seanabrahams/graphql-elixir | 568a07da317e19621cc0c931d738b4eda75f0a35 | [
"BSD-3-Clause"
] | null | null | null | lib/graphql/type/float.ex | seanabrahams/graphql-elixir | 568a07da317e19621cc0c931d738b4eda75f0a35 | [
"BSD-3-Clause"
] | null | null | null | defmodule GraphQL.Type.Float do
defstruct name: "Float", description:
"""
The `Float` scalar type represents signed double-precision fractional
values as specified by [IEEE 754](http://en.wikipedia.org/wiki/IEEE_floating_point).
"""
def coerce(false), do: 0
def coerce(true), do: 1
def coerce(value) when is_binary(value) do
case Float.parse(value) do
:error -> nil
{v, _} -> coerce(v)
end
end
def coerce(value) do
value * 1.0
end
end
defimpl GraphQL.Types, for: GraphQL.Type.Float do
def parse_value(_, value), do: GraphQL.Type.Float.coerce(value)
def serialize(_, value), do: GraphQL.Type.Float.coerce(value)
end
| 27 | 88 | 0.682963 |
ffa6bf72505a3945dcf2afa185880ee6481b161a | 7,293 | ex | Elixir | clients/document_ai/lib/google_api/document_ai/v1beta3/model/google_type_postal_address.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | clients/document_ai/lib/google_api/document_ai/v1beta3/model/google_type_postal_address.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | clients/document_ai/lib/google_api/document_ai/v1beta3/model/google_type_postal_address.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DocumentAI.V1beta3.Model.GoogleTypePostalAddress do
@moduledoc """
Represents a postal address, e.g. for postal delivery or payments addresses. Given a postal address, a postal service can deliver items to a premise, P.O. Box or similar. It is not intended to model geographical locations (roads, towns, mountains). In typical usage an address would be created via user input or from importing existing data, depending on the type of process. Advice on address input / editing: - Use an i18n-ready address widget such as https://github.com/google/libaddressinput) - Users should not be presented with UI elements for input or editing of fields outside countries where that field is used. For more guidance on how to use this schema, please see: https://support.google.com/business/answer/6397478
## Attributes
* `addressLines` (*type:* `list(String.t)`, *default:* `nil`) - Unstructured address lines describing the lower levels of an address. Because values in address_lines do not have type information and may sometimes contain multiple values in a single field (e.g. "Austin, TX"), it is important that the line order is clear. The order of address lines should be "envelope order" for the country/region of the address. In places where this can vary (e.g. Japan), address_language is used to make it explicit (e.g. "ja" for large-to-small ordering and "ja-Latn" or "en" for small-to-large). This way, the most specific line of an address can be selected based on the language. The minimum permitted structural representation of an address consists of a region_code with all remaining information placed in the address_lines. It would be possible to format such an address very approximately without geocoding, but no semantic reasoning could be made about any of the address components until it was at least partially resolved. Creating an address only containing a region_code and address_lines, and then geocoding is the recommended way to handle completely unstructured addresses (as opposed to guessing which parts of the address should be localities or administrative areas).
* `administrativeArea` (*type:* `String.t`, *default:* `nil`) - Optional. Highest administrative subdivision which is used for postal addresses of a country or region. For example, this can be a state, a province, an oblast, or a prefecture. Specifically, for Spain this is the province and not the autonomous community (e.g. "Barcelona" and not "Catalonia"). Many countries don't use an administrative area in postal addresses. E.g. in Switzerland this should be left unpopulated.
* `languageCode` (*type:* `String.t`, *default:* `nil`) - Optional. BCP-47 language code of the contents of this address (if known). This is often the UI language of the input form or is expected to match one of the languages used in the address' country/region, or their transliterated equivalents. This can affect formatting in certain countries, but is not critical to the correctness of the data and will never affect any validation or other non-formatting related operations. If this value is not known, it should be omitted (rather than specifying a possibly incorrect default). Examples: "zh-Hant", "ja", "ja-Latn", "en".
* `locality` (*type:* `String.t`, *default:* `nil`) - Optional. Generally refers to the city/town portion of the address. Examples: US city, IT comune, UK post town. In regions of the world where localities are not well defined or do not fit into this structure well, leave locality empty and use address_lines.
* `organization` (*type:* `String.t`, *default:* `nil`) - Optional. The name of the organization at the address.
* `postalCode` (*type:* `String.t`, *default:* `nil`) - Optional. Postal code of the address. Not all countries use or require postal codes to be present, but where they are used, they may trigger additional validation with other parts of the address (e.g. state/zip validation in the U.S.A.).
* `recipients` (*type:* `list(String.t)`, *default:* `nil`) - Optional. The recipient at the address. This field may, under certain circumstances, contain multiline information. For example, it might contain "care of" information.
* `regionCode` (*type:* `String.t`, *default:* `nil`) - Required. CLDR region code of the country/region of the address. This is never inferred and it is up to the user to ensure the value is correct. See https://cldr.unicode.org/ and https://www.unicode.org/cldr/charts/30/supplemental/territory_information.html for details. Example: "CH" for Switzerland.
* `revision` (*type:* `integer()`, *default:* `nil`) - The schema revision of the `PostalAddress`. This must be set to 0, which is the latest revision. All new revisions **must** be backward compatible with old revisions.
* `sortingCode` (*type:* `String.t`, *default:* `nil`) - Optional. Additional, country-specific, sorting code. This is not used in most regions. Where it is used, the value is either a string like "CEDEX", optionally followed by a number (e.g. "CEDEX 7"), or just a number alone, representing the "sector code" (Jamaica), "delivery area indicator" (Malawi) or "post office indicator" (e.g. Côte d'Ivoire).
* `sublocality` (*type:* `String.t`, *default:* `nil`) - Optional. Sublocality of the address. For example, this can be neighborhoods, boroughs, districts.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:addressLines => list(String.t()) | nil,
:administrativeArea => String.t() | nil,
:languageCode => String.t() | nil,
:locality => String.t() | nil,
:organization => String.t() | nil,
:postalCode => String.t() | nil,
:recipients => list(String.t()) | nil,
:regionCode => String.t() | nil,
:revision => integer() | nil,
:sortingCode => String.t() | nil,
:sublocality => String.t() | nil
}
field(:addressLines, type: :list)
field(:administrativeArea)
field(:languageCode)
field(:locality)
field(:organization)
field(:postalCode)
field(:recipients, type: :list)
field(:regionCode)
field(:revision)
field(:sortingCode)
field(:sublocality)
end
defimpl Poison.Decoder, for: GoogleApi.DocumentAI.V1beta3.Model.GoogleTypePostalAddress do
def decode(value, options) do
GoogleApi.DocumentAI.V1beta3.Model.GoogleTypePostalAddress.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DocumentAI.V1beta3.Model.GoogleTypePostalAddress do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 94.714286 | 1,279 | 0.739613 |
ffa6e42eccf3c77da476cff91a85a4676e403b3b | 2,421 | ex | Elixir | lib/bsv/crypto/rsa/public_key.ex | afomi/bsv-ex | a31db1e9d223aa4ac9cc00e86b1e6344a0037805 | [
"Apache-2.0"
] | null | null | null | lib/bsv/crypto/rsa/public_key.ex | afomi/bsv-ex | a31db1e9d223aa4ac9cc00e86b1e6344a0037805 | [
"Apache-2.0"
] | null | null | null | lib/bsv/crypto/rsa/public_key.ex | afomi/bsv-ex | a31db1e9d223aa4ac9cc00e86b1e6344a0037805 | [
"Apache-2.0"
] | null | null | null | defmodule BSV.Crypto.RSA.PublicKey do
@moduledoc """
RSA Public Key module.
"""
defstruct [:modulus, :public_exponent]
@typedoc "RSA Public Key"
@type t :: %__MODULE__{
modulus: integer,
public_exponent: integer
}
@typedoc "Erlang RSA Public Key sequence"
@type sequence :: {
:RSAPublicKey,
integer,
integer
}
@typedoc "Erlang RSA Public Key raw binary list"
@type raw_key :: [binary]
@doc """
Converts the given Erlang public key sequence to a RSA public key struct.
## Examples
iex> public_key_sequence = BSV.Crypto.RSA.PrivateKey.from_sequence(BSV.Test.rsa_key)
...> |> BSV.Crypto.RSA.PrivateKey.get_public_key
...> |> BSV.Crypto.RSA.PublicKey.as_sequence
...>
...> public_key = BSV.Crypto.RSA.PublicKey.from_sequence(public_key_sequence)
...> public_key.__struct__ == BSV.Crypto.RSA.PublicKey
true
"""
@spec from_sequence(BSV.Crypto.RSA.PublicKey.sequence) :: BSV.Crypto.RSA.PublicKey.t
def from_sequence(rsa_key_sequence) do
struct(__MODULE__, [
modulus: elem(rsa_key_sequence, 1) |> :binary.encode_unsigned,
public_exponent: elem(rsa_key_sequence, 2) |> :binary.encode_unsigned
])
end
@doc """
Converts the given RSA public key struct to an Erlang public key sequence.=
## Examples
iex> public_key = BSV.Crypto.RSA.PrivateKey.from_sequence(BSV.Test.rsa_key)
...> |> BSV.Crypto.RSA.PrivateKey.get_public_key
...>
...> BSV.Crypto.RSA.PublicKey.as_sequence(public_key)
...> |> is_tuple
true
"""
@spec as_sequence(BSV.Crypto.RSA.PublicKey.t) :: BSV.Crypto.RSA.PublicKey.sequence
def as_sequence(public_key) do
{
:RSAPublicKey,
:binary.decode_unsigned(public_key.modulus),
:binary.decode_unsigned(public_key.public_exponent)
}
end
@doc """
Converts the given Erlang public key raw list to a RSA public key struct.
"""
@spec from_raw(BSV.Crypto.RSA.PublicKey.raw_key) :: BSV.Crypto.RSA.PublicKey.t
def from_raw([e, n]) do
struct(__MODULE__, [
modulus: n,
public_exponent: e
])
end
@doc """
Converts the given RSA private key struct to an Erlang private key raw list.
"""
@spec as_raw(BSV.Crypto.RSA.PrivateKey.t) :: BSV.Crypto.RSA.PrivateKey.raw_key
def as_raw(private_key) do
[
private_key.public_exponent,
private_key.modulus
]
end
end
| 26.032258 | 90 | 0.672036 |
ffa6f2a5524b64d479d9770b42e96cb652eee929 | 435 | ex | Elixir | lib/recipient_4_genserver.ex | ianrumford/siariwyd_examples | 73eecc80b3f21f175a767bbf03c18acbc32a84f3 | [
"MIT"
] | null | null | null | lib/recipient_4_genserver.ex | ianrumford/siariwyd_examples | 73eecc80b3f21f175a767bbf03c18acbc32a84f3 | [
"MIT"
] | null | null | null | lib/recipient_4_genserver.ex | ianrumford/siariwyd_examples | 73eecc80b3f21f175a767bbf03c18acbc32a84f3 | [
"MIT"
] | null | null | null | defmodule Recipient4GenServer do
use GenServer
# client API functions
defdelegate donor_b_genserver_state_get(pid), to: DonorBGenServer
defdelegate donor_c_genserver_state_put(pid, new_state), to: DonorCGenServer
# handle_call functions
defdelegate handle_call(pid, fromref, state), to: DonorBGenServer
# this will be ignored but compiler will warn
defdelegate handle_call(pid, fromref, state), to: DonorCGenServer
end
| 39.545455 | 78 | 0.809195 |
ffa6f91c340f4ba489d563c73f1ec8f174d4c74c | 1,729 | ex | Elixir | clients/android_publisher/lib/google_api/android_publisher/v3/model/bundles_list_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/android_publisher/lib/google_api/android_publisher/v3/model/bundles_list_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/android_publisher/lib/google_api/android_publisher/v3/model/bundles_list_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.AndroidPublisher.V3.Model.BundlesListResponse do
@moduledoc """
Response listing all bundles.
## Attributes
* `bundles` (*type:* `list(GoogleApi.AndroidPublisher.V3.Model.Bundle.t)`, *default:* `nil`) - All bundles.
* `kind` (*type:* `String.t`, *default:* `nil`) - The kind of this response ("androidpublisher#bundlesListResponse").
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:bundles => list(GoogleApi.AndroidPublisher.V3.Model.Bundle.t()),
:kind => String.t()
}
field(:bundles, as: GoogleApi.AndroidPublisher.V3.Model.Bundle, type: :list)
field(:kind)
end
defimpl Poison.Decoder, for: GoogleApi.AndroidPublisher.V3.Model.BundlesListResponse do
def decode(value, options) do
GoogleApi.AndroidPublisher.V3.Model.BundlesListResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.AndroidPublisher.V3.Model.BundlesListResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 34.58 | 121 | 0.735685 |
ffa70dbe9d0ba7975f8839e097f09f7cd8d4731b | 1,206 | exs | Elixir | bench/witchcraft/ord/integer_bench.exs | doma-engineering/witchcraft | c84fa6b2146e7de745105e21f672ed413df93ad3 | [
"MIT"
] | 454 | 2019-06-05T22:56:45.000Z | 2022-03-27T23:03:02.000Z | bench/witchcraft/ord/integer_bench.exs | doma-engineering/witchcraft | c84fa6b2146e7de745105e21f672ed413df93ad3 | [
"MIT"
] | 26 | 2019-07-08T09:29:08.000Z | 2022-02-04T02:40:48.000Z | bench/witchcraft/ord/integer_bench.exs | doma-engineering/witchcraft | c84fa6b2146e7de745105e21f672ed413df93ad3 | [
"MIT"
] | 36 | 2019-06-25T17:45:27.000Z | 2022-03-21T01:53:42.000Z | defmodule Witchcraft.Ord.IntegerBench do
use Benchfella
use Witchcraft.Ord
#########
# Setup #
#########
# ---------- #
# Data Types #
# ---------- #
@int_a 10
@int_b -45
##########
# Kernel #
##########
bench "Kernel.>/2", do: Kernel.>(@int_a, @int_b)
bench "Kernel.</2", do: Kernel.<(@int_a, @int_b)
bench "Kernel.>=/2", do: Kernel.>=(@int_a, @int_b)
bench "Kernel.<=/2", do: Kernel.<=(@int_a, @int_b)
#######
# Ord #
#######
bench "compare/2", do: compare(@int_a, @int_b)
bench "equal?/2", do: equal?(@int_a, @int_b)
bench "greater?/2", do: greater?(@int_a, @int_b)
bench "lesser/2", do: lesser?(@int_a, @int_b)
# --------- #
# Operators #
# --------- #
bench ">/2", do: @int_a > @int_b
bench "</2", do: @int_a < @int_b
bench ">=/2", do: @int_a >= @int_b
bench "<=/2", do: @int_a <= @int_b
# ---------- #
# Large Data #
# ---------- #
@big_int_a 1_234_567_890
@big_int_b 9_876_543_210
bench "$$$ Kernel.>/2", do: Kernel.>(@big_int_a, @big_int_b)
bench "$$$ Kernel.</2", do: Kernel.<(@big_int_a, @big_int_b)
bench "$$$ </2", do: @big_int_a < @big_int_b
bench "$$$ >/2", do: @big_int_a > @big_int_b
end
| 20.440678 | 62 | 0.504975 |
ffa71172414afdfbdf4d653c4bff68cd3e5af23a | 12,298 | ex | Elixir | lib/saucexages.ex | nocursor/saucexages | 33e986a652306b2c54ad4891db7a27d78ed0d7cf | [
"MIT"
] | 7 | 2018-11-01T15:47:05.000Z | 2021-05-19T10:07:23.000Z | lib/saucexages.ex | nocursor/saucexages | 33e986a652306b2c54ad4891db7a27d78ed0d7cf | [
"MIT"
] | null | null | null | lib/saucexages.ex | nocursor/saucexages | 33e986a652306b2c54ad4891db7a27d78ed0d7cf | [
"MIT"
] | null | null | null | defmodule Saucexages do
@moduledoc """
Saucexages is a library that provides functionality for reading, writing, interrogating, and fixing [SAUCE](http://www.acid.org/info/sauce/sauce.htm) – Standard Architecture for Universal Comment Extensions.
The primary use of SAUCE is to add or augment metadata for files. This metadata has historically been focused on supporting art scene formats such as ANSi art (ANSI), but also supports many other formats. SAUCE support includes popular formats often found in various underground and computing communities, including, but not limited to:
* `Character` - ANSI, ASCII, RIP, etc.
* `Bitmap` - GIF, JPEG, PNG, etc
* `Audio` - S3M, IT, MOD, XM, etc.
* `Binary Text` - BIN
* `Extended Binary Text` - XBIN
* `Archive` - ZIP, LZH, TAR, RAR, etc.
* `Executable` - EXE, COM, BAT, etc.
Saucexages was created to help make handling SAUCE data contained in such file easier and available in the BEAM directly. As such, a number of features are included, such as:
* Reading and writing all SAUCE fields, including comments
* Cleaning SAUCE entries, including removal of both SAUCE records and comments
* Encoding/Decoding of all SAUCE fields, including type dependent fields
* Encoding/Decoding fallbacks based on real-world data
* Proper padding and truncation handling per the SAUCE spec
* Full support of all file and data types in the SAUCE spec
* Compiled and optimized binary handling that respects output from `ERL_COMPILER_OPTIONS=bin_opt_info`
* Choice of pure binary or positional file-based interface
* Complete metadata related to all types, footnotes, and fields in the SAUCE spec
* Sugar functions for dealing with interface building, dynamic behaviors, and media-specific handling
* Macros for compile time guarantees, efficiency, and ease-of-use and to allow further use of metadata in guards, matches, and binaries as consumers see fit.
## Overview
The Saucexages codebase is divided into 3 major namespaces:
* `Saucexages` - Core namespace with a wrapper API and various types and functions useful for working with SAUCE. This includes facilities for decoding font information, ANSi flags, data type, file information, and general media info.
* `Saucexages.Codec` - Contains functionality for encoding and decoding SAUCE data on a per-field and per-binary basis. An effort has been made to ensure that these functions attempt to work with binary as efficiently as possible.
* `Saucexages.IO` - Modules for handling IO tasks including reading and writing both binaries and files.
## SAUCE Structure
SAUCE is constructed via 2 major binary components, a SAUCE record and a comment block. The sum of these two parts can be thought of as a SAUCE block, or simply SAUCE. The SAUCE record is primarily what other libraries are referring to when they mention SAUCE or use the term. Saucexages attempts to make an effort when possible for clarity purposes to differentiate between the SAUCE block, SAUCE record, and SAUCE comment block.
## Location
The SAUCE block itself is always written after an EOF character, and 128 bytes from the actual end of file. It is important to note there are possibly two EOF markers used in practice. The first is the modern notion of an EOF which means the end of the file data, as commonly recognized by modern OSs. The second marker is an EOF character,represented in hex by `0x1a`. The key to SAUCE co-existing with formats is the EOF character as its presence often signaled to viewers, readers, and other programs to stop reading data past the character in the file, or to stop interpreting this data as part of the format.
Writing a SAUCE without an EOF character or before an EOF character will interfere with reading many formats. It is important to note that this requirement means that the EOF character must be *before* a SAUCE record, however in practice this does *not* always mean it will be *adjacent* to the EOF character. The reasons for this can be many, but common ones are flawed SAUCE writers, co-existing with other EOF-based formats, and misc. data or even garbage that may have been written in the file by other programs.
## Sauce Record
The SAUCE record is 128-bytes used to hold the majority of metadata to describe files using SAUCE. The SAUCE layout is extensively described in the [SAUCE specification](http://www.acid.org/info/sauce/sauce.htm).
Saucexages follows the same layout and types, but in Elixir-driven way. Please see the specification for more detailed descriptions including size, encoding/decoding requirements, and limitations.
All fields are fixed-size and are padded if shorter when written. As such, any integer fields are subject to wrap when encoding, and decoding. Likewise, any string fields are subject to truncation and padding when encoding/decoding accordingly.
The fields contained within the SAUCE record are as follows:
* `ID` - SAUCE identifier. Should always be "SAUCE". This field is defining for the format, used extensively to pattern match, and to find SAUCE records within binaries and files.
* `Version` - The SAUCE version. Should generally be "00". Programs should set this value to "00" and avoid changing it arbitrarily.
* `Title` - Title of the file.
* `Author` - Author of the file.
* `Group` - The group associated with the file, for example the name of an art scene group.
* `Date` - File creation date in "CCYYMMDD" format.
* `File Size` - The original file size, excluding the SAUCE data. Generally, this is the size of all the data before the SAUCE block, and typically this equates to all data before an EOF character.
* `Data Type` - An integer corresponding to a data type identifier.
* `File Type` - An integer corresponding to a file type identifier. This identifier is dependent on the data type as well for interpretation.
* `TInfo 1` - File type identifier dependent field 1.
* `TInfo 2` - File type identifier dependent field 2.
* `TInfo 3` - File type identifier dependent field 3.
* `TInfo 4` - File type identifier dependent field 4.
* `Comments` - Comment lines in the SAUCE block. This field serves as a crucial pointer and is required to properly read a comments block.
* `TFlags` - File type identifier dependent flags.
* `TInfoS` - File type identifier dependent string.
## Comment Block
The SAUCE comment block is an optional, variable sized binary structure that holds up to 255 lines of 64 bytes of information.
The SAUCE comment block fields are as follows:
* ID - Identifier for the comment block that should always be equal to "COMNT". This field is defining for the format, used extensively to pattern match, and to find SAUCE comments.
* Comment Line - Fixed size (64 bytes) field of text. Each comment block consists of 1 or more lines.
It is vital to note that the SAUCE comment block is often broken in practice in many files. Saucexages provides many functions for identifying, repairing, and dealing with such cases. When reading and writing SAUCE, however, by default the approach described in the SAUCE specification is used. That is, the comment block location and size is always read and written according to the SAUCE record comments (comment lines) field.
## Format
The general format of a SAUCE binary is as follows:
```
[contents][eof character][sauce block [comment block][sauce record]]
```
Conceptually, the parts of a file with SAUCE data are as follows:
* `Contents` - The file contents. Generally anything but the SAUCE data.
* `EOF Character` - The EOF character, or `0x1a` in hex. Occupies a single byte.
* `SAUCE Block` - The SAUCE record + optional comment block.
* `SAUCE Record` - The 128-byte collection of fields outlined in this module, as defined by the SAUCE spec.
* `Comment Block` - Optional comment block of variable size, consisting of a minimum of 69 characters, and a maximum of `5 + 64 * 255` bytes. Dependent on the comment field in the SAUCE record which determines the size and location of this block.
In Elixir binary format, this takes the pseudo-form of:
```elixir
<<contents::binary, eof_character::binary-size(1), comment_block::binary-size(comment_lines * 64 + 5), sauce_record::binary-size(128)>>
```
The SAUCE comment block is optional and as such, the following format is also valid:
```elixir
<<contents::binary, eof_character::binary-size(1), sauce_record::binary-size(128)>>
```
Additionally, since the SAUCE block itself is defined from the EOF, and after the EOF character, the following form is also valid and includes combinations of the above forms:
```elixir
<<contents::binary, eof_character::binary-size(1), other_content::binary, comment_block::binary-size(comment_lines * 64 + 5), sauce_record::binary-size(128)>>
```
"""
require Saucexages.IO.BinaryReader
alias Saucexages.IO.{BinaryWriter, BinaryReader}
alias Saucexages.SauceBlock
#TODO: in the future we my decide to make file or binary readers/writers pluggable via using/protocols
@doc """
Reads a binary containing a SAUCE record and returns decoded SAUCE information as `{:ok, sauce_block}`.
If the binary does not contain a SAUCE record, `{:error, :no_sauce}` is returned.
"""
@spec sauce(binary()) :: {:ok, SauceBlock.t} | {:error, :no_sauce} | {:error, :invalid_sauce} | {:error, term()}
def sauce(bin) do
BinaryReader.sauce(bin)
end
@doc """
Reads a binary containing a SAUCE record and returns the raw binary in the form `{:ok, {sauce_bin, comments_bin}}`.
If the binary does not contain a SAUCE record, `{:error, :no_sauce}` is returned.
"""
@spec raw(binary()) :: {:ok, {binary(), binary()}} | {:error, :no_sauce} | {:error, term()}
def raw(bin) do
BinaryReader.raw(bin)
end
@doc """
Reads a binary containing a SAUCE record and returns the decoded SAUCE comments.
"""
@spec comments(binary()) :: {:ok, [String.t]} | {:error, :no_sauce} | {:error, :no_comments} | {:error, term()}
def comments(bin) do
BinaryReader.comments(bin)
end
@doc """
Reads a binary and returns the contents without the SAUCE block.
"""
@spec contents(binary()) :: {:ok, binary()} | {:error, term()}
def contents(bin) do
BinaryReader.contents(bin)
end
@doc """
Reads a binary and returns whether or not a SAUCE record exists.
Will match both binary that is a SAUCE record and binary that contains a SAUCE record.
"""
@spec sauce?(binary()) :: boolean()
def sauce?(bin) do
BinaryReader.sauce?(bin)
end
@doc """
Reads a binary and returns whether or not a SAUCE comments block exists within the SAUCE block.
Will match a comments block only if it a SAUCE record exists. Comment fragments are not considered to be valid without the presence of a SAUCE record.
"""
@spec comments?(binary()) :: boolean()
def comments?(bin) when is_binary(bin) do
BinaryReader.comments?(bin)
end
@doc """
Writes the given SAUCE block to the provided binary.
"""
@spec write(binary(), SauceBlock.t) :: {:ok, binary()} | {:error, term()}
def write(bin, sauce_block) do
BinaryWriter.write(bin, sauce_block)
end
@doc """
Removes a SAUCE block from a binary.
Both the SAUCE record and comments block will be removed.
"""
@spec remove_sauce(binary()) :: {:ok, binary()} | {:error, term()}
def remove_sauce(bin) when is_binary(bin) do
BinaryWriter.remove_sauce(bin)
end
@doc """
Removes any comments, if present from a SAUCE and rewrites the SAUCE accordingly.
Can be used to remove a SAUCE comments block or to clean erroneous comment information such as mismatched comment lines or double comment blocks.
"""
@spec remove_comments(binary()) :: {:ok, binary()} | {:error, :no_sauce} | {:error, term()}
def remove_comments(bin) when is_binary(bin) do
BinaryWriter.remove_comments(bin)
end
@doc """
Returns a detailed map of all SAUCE block data.
"""
@spec details(binary()) :: {:ok, map()} | {:error, term()}
def details(bin) when is_binary(bin) do
with {:ok, sauce_block} <- sauce(bin) do
{:ok, SauceBlock.details(sauce_block)}
else
err -> err
end
end
end
| 54.657778 | 615 | 0.734347 |
ffa755003524bf447a50653a8f5302f62d1cf227 | 2,588 | ex | Elixir | clients/binary_authorization/lib/google_api/binary_authorization/v1/model/admission_rule.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/binary_authorization/lib/google_api/binary_authorization/v1/model/admission_rule.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/binary_authorization/lib/google_api/binary_authorization/v1/model/admission_rule.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.BinaryAuthorization.V1.Model.AdmissionRule do
@moduledoc """
An admission rule specifies either that all container images used in a pod creation request must be attested to by one or more attestors, that all pod creations will be allowed, or that all pod creations will be denied. Images matching an admission whitelist pattern are exempted from admission rules and will never block a pod creation.
## Attributes
* `enforcementMode` (*type:* `String.t`, *default:* `nil`) - Required. The action when a pod creation is denied by the admission rule.
* `evaluationMode` (*type:* `String.t`, *default:* `nil`) - Required. How this admission rule will be evaluated.
* `requireAttestationsBy` (*type:* `list(String.t)`, *default:* `nil`) - Optional. The resource names of the attestors that must attest to a container image, in the format `projects/*/attestors/*`. Each attestor must exist before a policy can reference it. To add an attestor to a policy the principal issuing the policy change request must be able to read the attestor resource. Note: this field must be non-empty when the evaluation_mode field specifies REQUIRE_ATTESTATION, otherwise it must be empty.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:enforcementMode => String.t(),
:evaluationMode => String.t(),
:requireAttestationsBy => list(String.t())
}
field(:enforcementMode)
field(:evaluationMode)
field(:requireAttestationsBy, type: :list)
end
defimpl Poison.Decoder, for: GoogleApi.BinaryAuthorization.V1.Model.AdmissionRule do
def decode(value, options) do
GoogleApi.BinaryAuthorization.V1.Model.AdmissionRule.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.BinaryAuthorization.V1.Model.AdmissionRule do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 48.830189 | 508 | 0.750773 |
ffa75ff6fc5db67393849fc239bfacce7beb5006 | 333 | ex | Elixir | lib/ex_nrel/utils.ex | techgaun/ex_nrel | 6a6509903f9340e0d48b88b985ef2aa0eaae4594 | [
"Apache-2.0"
] | null | null | null | lib/ex_nrel/utils.ex | techgaun/ex_nrel | 6a6509903f9340e0d48b88b985ef2aa0eaae4594 | [
"Apache-2.0"
] | null | null | null | lib/ex_nrel/utils.ex | techgaun/ex_nrel | 6a6509903f9340e0d48b88b985ef2aa0eaae4594 | [
"Apache-2.0"
] | null | null | null | defmodule ExNrel.Utils do
@moduledoc false
def api_key, do: Application.get_env(:ex_nrel, :api_key)
def output_format, do: Application.get_env(:ex_nrel, :format)
def user_agent, do: [{"User-agent", "ExNrel"}]
def compression, do: [{"Content-Encoding", "gzip"}]
def request_headers, do: user_agent() ++ compression()
end
| 33.3 | 63 | 0.714715 |
ffa77232482f76ef39d12662ff801b291d289bfa | 67 | ex | Elixir | web/views/layout_view.ex | Juanchote/elixir-phoenix | d00411c8b30ea5fafca00f95d58f258a025c9338 | [
"MIT"
] | 8 | 2016-09-16T15:03:51.000Z | 2019-06-19T02:25:57.000Z | web/views/layout_view.ex | Juanchote/elixir-phoenix | d00411c8b30ea5fafca00f95d58f258a025c9338 | [
"MIT"
] | 13 | 2016-08-09T01:32:04.000Z | 2020-01-27T14:11:45.000Z | web/views/layout_view.ex | Juanchote/elixir-phoenix | d00411c8b30ea5fafca00f95d58f258a025c9338 | [
"MIT"
] | 13 | 2016-08-03T21:11:32.000Z | 2019-07-04T15:33:57.000Z | defmodule HelloWorld.LayoutView do
use HelloWorld.Web, :view
end
| 16.75 | 34 | 0.80597 |
ffa7d474d333d074b763c84030ca56acb306aaae | 1,132 | ex | Elixir | lib/spotify/models/pagings/paging_cursor.ex | chippers/spotify_web_api | 221a197dbac4971f87e9917d02cb335e6a42b726 | [
"MIT"
] | null | null | null | lib/spotify/models/pagings/paging_cursor.ex | chippers/spotify_web_api | 221a197dbac4971f87e9917d02cb335e6a42b726 | [
"MIT"
] | null | null | null | lib/spotify/models/pagings/paging_cursor.ex | chippers/spotify_web_api | 221a197dbac4971f87e9917d02cb335e6a42b726 | [
"MIT"
] | null | null | null | defmodule Spotify.Pagings.PagingCursor do
@moduledoc """
The cursor-based paging object is a container for a set of objects.
It contains a key called `items` (whose value is an array of the requested objects)
along with other keys like `next` and `cursors` that can be useful in future calls.
[Spotify Docs](https://beta.developer.spotify.com/documentation/web-api/reference/object-model/#cursor-based-paging-object)
"""
alias Spotify.Pagings
alias Spotify.Cursor
defstruct [
:href,
:items,
:limit,
:next,
:cursors,
:total,
]
@typedoc "The cursor-based paging object."
@type t(item_type) :: %__MODULE__{
href: Pagings.href | nil,
items: Pagings.items(item_type) | nil,
limit: Pagings.limit | nil,
next: Pagings.next | nil,
cursors: Pagings.cursors | nil,
total: Pagings.total | nil,
}
def wrap(struct) do
%__MODULE__{
cursors: Cursor.as(),
items: [struct]
}
end
end
| 28.3 | 127 | 0.573322 |
ffa8332984b083f2786519175ccb2652099524b5 | 22 | ex | Elixir | lib/yaps/yaps.ex | keichan34/YAPS | 908ab1c1b8913b5acef5c4bf634f8879709fb0b9 | [
"MIT"
] | 1 | 2015-06-17T03:47:45.000Z | 2015-06-17T03:47:45.000Z | lib/yaps/yaps.ex | keichan34/YAPS | 908ab1c1b8913b5acef5c4bf634f8879709fb0b9 | [
"MIT"
] | null | null | null | lib/yaps/yaps.ex | keichan34/YAPS | 908ab1c1b8913b5acef5c4bf634f8879709fb0b9 | [
"MIT"
] | null | null | null | defmodule Yaps do
end
| 7.333333 | 17 | 0.818182 |
ffa8481f9b209be8653e19e94bd2ff3e5e290caa | 63 | exs | Elixir | test/sammal_test.exs | epiphone/sammal | 0058359f62fcdf048743fdd62bd84913b98b7773 | [
"MIT"
] | 2 | 2017-05-02T14:15:11.000Z | 2021-12-11T11:13:15.000Z | test/sammal_test.exs | epiphone/sammal | 0058359f62fcdf048743fdd62bd84913b98b7773 | [
"MIT"
] | null | null | null | test/sammal_test.exs | epiphone/sammal | 0058359f62fcdf048743fdd62bd84913b98b7773 | [
"MIT"
] | null | null | null | defmodule SammalTest do
use ExUnit.Case
doctest Sammal
end
| 12.6 | 23 | 0.793651 |
ffa85d9265a64a0b81108a746fb5c755a4909949 | 67 | exs | Elixir | test/upstream_test.exs | artellectual/blazay | 0c5f58514cc2417931786b5cd482978fa5bfd9c0 | [
"MIT"
] | 5 | 2018-02-16T15:32:41.000Z | 2020-03-31T10:55:24.000Z | test/upstream_test.exs | artellectual/blazay | 0c5f58514cc2417931786b5cd482978fa5bfd9c0 | [
"MIT"
] | 8 | 2017-07-12T04:27:31.000Z | 2017-12-26T06:30:47.000Z | test/upstream_test.exs | upmaru/blazay | 0c5f58514cc2417931786b5cd482978fa5bfd9c0 | [
"MIT"
] | 2 | 2018-02-16T15:32:45.000Z | 2018-03-04T12:11:57.000Z | defmodule UpstreamTest do
use ExUnit.Case
doctest Upstream
end
| 13.4 | 25 | 0.80597 |
ffa887f6b1ed9397c7ba48ba3e8bfd781de9276c | 3,803 | exs | Elixir | lib/elixir/test/elixir/kernel/lexical_tracker_test.exs | diogovk/elixir | 7b8213affaad38b50afaa3dfc3a43717f35ba4e7 | [
"Apache-2.0"
] | null | null | null | lib/elixir/test/elixir/kernel/lexical_tracker_test.exs | diogovk/elixir | 7b8213affaad38b50afaa3dfc3a43717f35ba4e7 | [
"Apache-2.0"
] | null | null | null | lib/elixir/test/elixir/kernel/lexical_tracker_test.exs | diogovk/elixir | 7b8213affaad38b50afaa3dfc3a43717f35ba4e7 | [
"Apache-2.0"
] | null | null | null | Code.require_file "../test_helper.exs", __DIR__
defmodule Kernel.LexicalTrackerTest do
use ExUnit.Case, async: true
alias Kernel.LexicalTracker, as: D
setup do
{:ok, pid} = D.start_link("dest")
{:ok, [pid: pid]}
end
test "gets the destination", config do
assert D.dest(config[:pid]) == "dest"
end
test "can add remote dispatches", config do
D.remote_dispatch(config[:pid], String, :runtime)
assert D.remotes(config[:pid]) == {[], [String]}
D.remote_dispatch(config[:pid], String, :compile)
assert D.remotes(config[:pid]) == {[String], []}
D.remote_dispatch(config[:pid], String, :runtime)
assert D.remotes(config[:pid]) == {[String], []}
end
test "can add module imports", config do
D.add_import(config[:pid], String, 1, true)
D.import_dispatch(config[:pid], {String, :upcase, 1})
assert D.remotes(config[:pid]) == {[String], []}
end
test "can add {module, function, arity} imports", config do
D.add_import(config[:pid], {String, :upcase, 1}, 1, true)
D.import_dispatch(config[:pid], {String, :upcase, 1})
assert D.remotes(config[:pid]) == {[String], []}
end
test "can add aliases", config do
D.add_alias(config[:pid], String, 1, true)
D.alias_dispatch(config[:pid], String)
assert D.remotes(config[:pid]) == {[], []}
end
test "unused module imports", config do
D.add_import(config[:pid], String, 1, true)
assert D.collect_unused_imports(config[:pid]) == [{String, 1}]
end
test "used module imports are not unused", config do
D.add_import(config[:pid], String, 1, true)
D.import_dispatch(config[:pid], {String, :upcase, 1})
assert D.collect_unused_imports(config[:pid]) == []
end
test "unused {module, function, arity} imports", config do
D.add_import(config[:pid], {String, :upcase, 1}, 1, true)
assert D.collect_unused_imports(config[:pid]) == [{String, 1}, {{String, :upcase, 1}, 1}]
end
test "used {module, function, arity} imports are not unused", config do
D.add_import(config[:pid], {String, :upcase, 1}, 1, true)
D.add_import(config[:pid], {String, :downcase, 1}, 1, true)
D.import_dispatch(config[:pid], {String, :upcase, 1})
assert D.collect_unused_imports(config[:pid]) == [{{String, :downcase, 1}, 1}]
end
test "overwriting {module, function, arity} import with module import", config do
D.add_import(config[:pid], {String, :upcase, 1}, 1, true)
D.add_import(config[:pid], String, 1, true)
D.import_dispatch(config[:pid], {String, :downcase, 1})
assert D.collect_unused_imports(config[:pid]) == []
end
test "imports with no warn are not unused", config do
D.add_import(config[:pid], String, 1, false)
assert D.collect_unused_imports(config[:pid]) == []
end
test "unused aliases", config do
D.add_alias(config[:pid], String, 1, true)
assert D.collect_unused_aliases(config[:pid]) == [{String, 1}]
end
test "used aliases are not unused", config do
D.add_alias(config[:pid], String, 1, true)
D.alias_dispatch(config[:pid], String)
assert D.collect_unused_aliases(config[:pid]) == []
end
test "aliases with no warn are not unused", config do
D.add_alias(config[:pid], String, 1, false)
assert D.collect_unused_aliases(config[:pid]) == []
end
test "does not tag aliases nor types as compile time" do
{{compile, runtime}, _binding} =
Code.eval_string("""
defmodule Kernel.LexicalTrackerTest.Sample do
alias Foo.Bar, as: Bar, warn: false
@spec foo :: Foo.Bar.t
def foo, do: Bar.t
Kernel.LexicalTracker.remotes(__ENV__.module)
end |> elem(3)
""")
refute Elixir.Bar in runtime
refute Elixir.Bar in compile
assert Foo.Bar in runtime
refute Foo.Bar in compile
end
end
| 33.069565 | 93 | 0.651854 |
ffa88a2759272ab1dce43b5e963793657ba129dd | 498 | exs | Elixir | config/test.exs | deanbrophy/fantastic-garbanzo | 2649c3ddb915a8e3e589be21cbfa1c79c59ed9b4 | [
"MIT"
] | 1 | 2017-03-08T18:46:17.000Z | 2017-03-08T18:46:17.000Z | config/test.exs | deanbrophy/fantastic-garbanzo | 2649c3ddb915a8e3e589be21cbfa1c79c59ed9b4 | [
"MIT"
] | null | null | null | config/test.exs | deanbrophy/fantastic-garbanzo | 2649c3ddb915a8e3e589be21cbfa1c79c59ed9b4 | [
"MIT"
] | null | null | null | use Mix.Config
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :elixircast, Elixircast.Endpoint,
http: [port: 4001],
server: false
# Print only warnings and errors during test
config :logger, level: :warn
# Configure your database
config :elixircast, Elixircast.Repo,
adapter: Ecto.Adapters.Postgres,
username: "postgres",
password: "",
database: "elixircast_test",
hostname: "localhost",
pool: Ecto.Adapters.SQL.Sandbox
| 24.9 | 56 | 0.736948 |
ffa88af3b45b6192c832d3882724591fe8378bba | 397 | ex | Elixir | lib/stripi/ephemeral_keys.ex | goalves/stripi | 1d9865103efb69cd69d7a418bcb53cfd355df37c | [
"MIT"
] | 7 | 2018-03-20T14:18:01.000Z | 2019-10-02T22:40:02.000Z | lib/stripi/ephemeral_keys.ex | goalves/stripi | 1d9865103efb69cd69d7a418bcb53cfd355df37c | [
"MIT"
] | 2 | 2019-03-12T16:37:39.000Z | 2019-03-12T18:28:03.000Z | lib/stripi/ephemeral_keys.ex | goalves/stripi | 1d9865103efb69cd69d7a418bcb53cfd355df37c | [
"MIT"
] | 2 | 2019-10-04T17:35:44.000Z | 2019-12-04T20:27:52.000Z | defmodule Stripi.EphemeralKeys do
use Stripi.Api
@url Application.get_env(:stripi, :base_url, "https://api.stripe.com/v1") <> "/ephemeral_keys"
def create(customer_id, opts \\ [headers: [{"Stripe-Version", api_version()}]]),
do: Stripi.request(&post/3, [@url, %{customer: customer_id}, opts])
def remove(key, opts \\ []), do: Stripi.request(&delete/2, [@url <> "/#{key}", opts])
end
| 39.7 | 96 | 0.65995 |
ffa899123843e5b938dc10caf7a657a1789bd6e3 | 621 | ex | Elixir | lib/ex_component_schema/schema/root.ex | lenra-io/ex_component_schema | a051ba94057cdd3218c4625e24f56a834a586aa6 | [
"MIT"
] | 2 | 2022-03-18T08:52:29.000Z | 2022-03-18T08:52:33.000Z | lib/ex_component_schema/schema/root.ex | lenra-io/ex_component_schema | a051ba94057cdd3218c4625e24f56a834a586aa6 | [
"MIT"
] | 8 | 2021-09-15T11:52:45.000Z | 2022-01-10T13:13:53.000Z | lib/ex_component_schema/schema/root.ex | lenra-io/ex_component_schema | a051ba94057cdd3218c4625e24f56a834a586aa6 | [
"MIT"
] | null | null | null | defmodule ExComponentSchema.Schema.Root do
defstruct schema: %{},
refs: %{},
definitions: %{},
location: :root,
version: nil,
custom_format_validator: nil
@type t :: %ExComponentSchema.Schema.Root{
schema: ExComponentSchema.Schema.resolved(),
refs: %{String.t() => ExComponentSchema.Schema.resolved()},
location: :root | String.t(),
definitions: %{String.t() => ExComponentSchema.Schema.resolved()},
version: non_neg_integer | nil,
custom_format_validator: {module(), atom()} | nil
}
end
| 34.5 | 76 | 0.5781 |
ffa8d2fddd6b83daf556a70dc16d9ad7af7c832e | 1,069 | exs | Elixir | mix.exs | thommay/bakeoff | 4c6cca860c45371f04021c3afceb4fe2e0b8d380 | [
"MIT"
] | null | null | null | mix.exs | thommay/bakeoff | 4c6cca860c45371f04021c3afceb4fe2e0b8d380 | [
"MIT"
] | null | null | null | mix.exs | thommay/bakeoff | 4c6cca860c45371f04021c3afceb4fe2e0b8d380 | [
"MIT"
] | null | null | null | defmodule Bakeoff.Mixfile do
use Mix.Project
def project do
[app: :bakeoff,
version: "0.0.1",
elixir: "~> 1.0",
elixirc_paths: elixirc_paths(Mix.env),
compilers: [:phoenix] ++ Mix.compilers,
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps]
end
# Configuration for the OTP application
#
# Type `mix help compile.app` for more information
def application do
[mod: {Bakeoff, []},
applications: [:phoenix, :phoenix_html, :cowboy, :logger,
:phoenix_ecto, :postgrex]]
end
# Specifies which paths to compile per environment
defp elixirc_paths(:test), do: ["lib", "web", "test/support"]
defp elixirc_paths(_), do: ["lib", "web"]
# Specifies your project dependencies
#
# Type `mix help deps` for examples and options
defp deps do
[{:phoenix, "~> 0.14"},
{:phoenix_ecto, "~> 0.9"},
{:postgrex, ">= 0.0.0"},
{:phoenix_html, "~> 2.1"},
{:phoenix_live_reload, "~> 0.4", only: :dev},
{:cowboy, "~> 1.0"}]
end
end
| 26.725 | 63 | 0.597755 |
ffa90007645dcff684a14d9f395e1f61d1647d02 | 778 | ex | Elixir | lib/virgo/thrift/client.ex | GinShio/AstraeaVirgo | 92804cbae01f67e21b8f421009fa37fddc9054e1 | [
"BSD-2-Clause"
] | null | null | null | lib/virgo/thrift/client.ex | GinShio/AstraeaVirgo | 92804cbae01f67e21b8f421009fa37fddc9054e1 | [
"BSD-2-Clause"
] | null | null | null | lib/virgo/thrift/client.ex | GinShio/AstraeaVirgo | 92804cbae01f67e21b8f421009fa37fddc9054e1 | [
"BSD-2-Clause"
] | null | null | null | defmodule AstraeaVirgo.Thrift.Client do
use GenServer
@host Application.get_env(:virgo, :rpc)[:host]
@port Application.get_env(:virgo, :rpc)[:port]
def start_link(opts \\ []) do
GenServer.start_link(__MODULE__, opts, name: __MODULE__)
end
def init(_opts) do
:thrift_client_util.new(
@host |> String.to_charlist, @port,
:data_service_thrift, [protocol: :compact, framed: true]
)
end
def run(func, params) do
GenServer.call(__MODULE__, %{func: func, params: params})
end
def handle_call(%{func: func, params: params}, _from, client) do
case :thrift_client.call(client, func, params) do
{client, {:ok, result}} -> {:reply, result, client}
{client, {:error, reason}} -> {:stop, reason, client}
end
end
end
| 25.933333 | 66 | 0.660668 |
ffa90758971656f9f08056182aad30a5626e8339 | 2,349 | exs | Elixir | test/hex/repo_test.exs | hrzndhrn/hex | f74e2ed979e74130bdc4a6974660aa986333f33f | [
"Apache-2.0"
] | 824 | 2015-01-05T09:12:36.000Z | 2022-03-28T12:02:29.000Z | test/hex/repo_test.exs | hrzndhrn/hex | f74e2ed979e74130bdc4a6974660aa986333f33f | [
"Apache-2.0"
] | 737 | 2015-01-01T05:48:46.000Z | 2022-03-29T12:56:12.000Z | test/hex/repo_test.exs | hrzndhrn/hex | f74e2ed979e74130bdc4a6974660aa986333f33f | [
"Apache-2.0"
] | 220 | 2015-03-14T17:55:11.000Z | 2022-03-23T22:17:07.000Z | defmodule Hex.RepoTest do
use HexTest.IntegrationCase
@private_key File.read!(Path.join(__DIR__, "../fixtures/test_priv.pem"))
test "get_package" do
assert {:ok, {200, _, _}} = Hex.Repo.get_package("hexpm", "postgrex", "")
assert_raise Mix.Error, ~r"Unknown repository \"bad\"", fn ->
Hex.Repo.get_package("bad", "postgrex", "")
end
end
test "verify signature" do
message = :mix_hex_registry.sign_protobuf("payload", @private_key)
assert Hex.Repo.verify(message, "hexpm") == "payload"
assert_raise(Mix.Error, fn ->
message = :mix_hex_pb_signed.encode_msg(%{payload: "payload", signature: "foobar"}, :Signed)
Hex.Repo.verify(message, "hexpm")
end)
end
test "decode package" do
package = %{releases: [], repository: "hexpm", name: "ecto"}
message = :mix_hex_pb_package.encode_msg(package, :Package)
assert Hex.Repo.decode_package(message, "hexpm", "ecto") == []
end
test "decode package verify origin" do
package = %{releases: [], repository: "hexpm", name: "ecto"}
message = :mix_hex_pb_package.encode_msg(package, :Package)
assert_raise(Mix.Error, fn ->
Hex.Repo.decode_package(message, "other repo", "ecto")
end)
assert_raise(Mix.Error, fn ->
Hex.Repo.decode_package(message, "hexpm", "other package")
end)
Hex.State.put(:no_verify_repo_origin, true)
assert Hex.Repo.decode_package(message, "other repo", "ecto") == []
assert Hex.Repo.decode_package(message, "hexpm", "other package") == []
end
test "get public key" do
bypass = Bypass.open()
repos = Hex.State.fetch!(:repos)
hexpm = Hex.Repo.default_hexpm_repo()
repos = put_in(repos["hexpm"].url, "http://localhost:#{bypass.port}")
Hex.State.put(:repos, repos)
Bypass.expect(bypass, fn %Plug.Conn{request_path: path} = conn ->
case path do
"/public_key" ->
Plug.Conn.resp(conn, 200, hexpm.public_key)
"/not_found/public_key" ->
Plug.Conn.resp(conn, 404, "not found")
end
end)
assert {:ok, {200, public_key, _}} =
Hex.Repo.get_public_key("http://localhost:#{bypass.port}", nil)
assert public_key == hexpm.public_key
assert {:ok, {404, "not found", _}} =
Hex.Repo.get_public_key("http://localhost:#{bypass.port}/not_found", nil)
end
end
| 31.743243 | 98 | 0.642401 |
ffa9b1cafc1cbe612f6740f2d5958cea82397fdf | 1,599 | ex | Elixir | clients/dlp/lib/google_api/dlp/v2/model/google_privacy_dlp_v2_quote_info.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/dlp/lib/google_api/dlp/v2/model/google_privacy_dlp_v2_quote_info.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/dlp/lib/google_api/dlp/v2/model/google_privacy_dlp_v2_quote_info.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2QuoteInfo do
@moduledoc """
Message for infoType-dependent details parsed from quote.
## Attributes
* `dateTime` (*type:* `GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2DateTime.t`, *default:* `nil`) - The date time indicated by the quote.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:dateTime => GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2DateTime.t() | nil
}
field(:dateTime, as: GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2DateTime)
end
defimpl Poison.Decoder, for: GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2QuoteInfo do
def decode(value, options) do
GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2QuoteInfo.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DLP.V2.Model.GooglePrivacyDlpV2QuoteInfo do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 34.021277 | 138 | 0.754221 |
ffa9ff444518fbe8544a0fc8bd1248d89ec1eeb6 | 861 | ex | Elixir | lib/trento/application/read_models/database_read_model.ex | trento-project/web | 3260b30c781bffbbb0e5205cd650966c4026b9ac | [
"Apache-2.0"
] | 1 | 2022-03-22T16:59:34.000Z | 2022-03-22T16:59:34.000Z | lib/trento/application/read_models/database_read_model.ex | trento-project/web | 3260b30c781bffbbb0e5205cd650966c4026b9ac | [
"Apache-2.0"
] | 24 | 2022-03-22T16:45:25.000Z | 2022-03-31T13:00:02.000Z | lib/trento/application/read_models/database_read_model.ex | trento-project/web | 3260b30c781bffbbb0e5205cd650966c4026b9ac | [
"Apache-2.0"
] | 1 | 2022-03-30T14:16:16.000Z | 2022-03-30T14:16:16.000Z | defmodule Trento.DatabaseReadModel do
@moduledoc """
Database read model
"""
use Ecto.Schema
import Ecto.Changeset
alias Trento.DatabaseInstanceReadModel
@type t :: %__MODULE__{}
@derive {Jason.Encoder, except: [:__meta__, :__struct__]}
@primary_key {:id, :binary_id, autogenerate: false}
schema "databases" do
field :sid, :string
field :health, Ecto.Enum, values: [:passing, :warning, :critical, :unknown]
has_many :tags, Trento.Tag, foreign_key: :resource_id
has_many :database_instances, DatabaseInstanceReadModel,
references: :id,
foreign_key: :sap_system_id,
preload_order: [asc: :instance_number, asc: :host_id]
end
@spec changeset(t() | Ecto.Changeset.t(), map) :: Ecto.Changeset.t()
def changeset(database, attrs) do
cast(database, attrs, __MODULE__.__schema__(:fields))
end
end
| 26.090909 | 79 | 0.70151 |
ffaa1fc258daae7a6876a1a17f3e1dc95ea45992 | 7,240 | ex | Elixir | farmbot_core/lib/firmware/uart_core.ex | gdwb/farmbot_os | 0ef2697c580c9fbf37a22daa063a64addfcb778d | [
"MIT"
] | 1 | 2021-08-23T13:36:14.000Z | 2021-08-23T13:36:14.000Z | farmbot_core/lib/firmware/uart_core.ex | gdwb/farmbot_os | 0ef2697c580c9fbf37a22daa063a64addfcb778d | [
"MIT"
] | null | null | null | farmbot_core/lib/firmware/uart_core.ex | gdwb/farmbot_os | 0ef2697c580c9fbf37a22daa063a64addfcb778d | [
"MIT"
] | null | null | null | defmodule FarmbotCore.Firmware.UARTCore do
@moduledoc """
UARTCore is the central logic and processing module for all
inbound and outbound UART data (GCode).
Guiding Principles:
* No cached state - Delay data fetching. Never duplicate.
* No timeouts - Push data, don't pull.
* No polling - Push data, don't pull.
* No retries - Fail fast / hard. Restarting the module
is the only recovery option.
"""
alias __MODULE__, as: State
alias FarmbotCore.Firmware.UARTCoreSupport, as: Support
alias FarmbotCore.BotState
alias FarmbotCore.Firmware.{
RxBuffer,
TxBuffer,
GCodeDecoder,
InboundSideEffects
}
require Logger
require FarmbotCore.Logger
defstruct uart_pid: nil,
logs_enabled: false,
uart_path: nil,
needs_config: true,
fw_type: nil,
rx_count: 0,
rx_buffer: RxBuffer.new(),
tx_buffer: TxBuffer.new()
# The Firmware has a 120 second default timeout.
# Queuing up 10 messages that take one minute each == 10 minutes.
# This is a reasonable (but not perfect) assumption. RC
@minutes 10
@fw_timeout 1000 * 60 * @minutes
# ==== HISTORICAL NOTE ABOUT FBExpress 1.0 ===============
# Unlike USB serial ports, FBExpress serial uses GPIO.
# This means the GPIO is always running with no definitive
# start/stop signal. This means the parser gets "stuck" on
# the wrong GCode block. The end result is a firmware handler
# that sits there and does nothing. To get around this,
# we do a "health check" after a certain amount of time to
# ensure the farmduion is actually running.
@bugfix_timeout 60_000
# ===== END HISTORICAL CODE ==============================
# This is a helper method that I use for inspecting GCode
# over SSH. It is not used by production systems except for
# debugging.
def toggle_logging(server \\ __MODULE__) do
send(server, :toggle_logging)
end
def refresh_config(server, new_keys) do
send(server, {:refresh_config, new_keys})
end
def flash_firmware(server \\ __MODULE__, package) do
GenServer.call(server, {:flash_firmware, package}, @fw_timeout)
end
def start_job(server \\ __MODULE__, gcode) do
GenServer.call(server, {:start_job, gcode}, @fw_timeout)
end
# Sends GCode directly to the MCU without any checks or
# queues. Don't use outside of the `/firmware` directory.
def send_raw(server \\ __MODULE__, gcode) do
send(server, {:send_raw, gcode})
end
def restart_firmware(server \\ __MODULE__) do
send(server, :reset_state)
:ok
end
# ================= BEGIN GENSERVER CODE =================
def start_link(args, opts \\ [name: __MODULE__]) do
GenServer.start_link(__MODULE__, args, opts)
end
def init(opts) do
BotState.firmware_offline()
path = Keyword.fetch!(opts, :path)
{:ok, uart_pid} = Support.connect(path)
fw_type = Keyword.get(opts, :fw_package)
state = %State{uart_pid: uart_pid, uart_path: path, fw_type: fw_type}
Process.send_after(self(), :best_effort_bug_fix, @bugfix_timeout)
{:ok, state}
end
def handle_info(:reset_state, %State{uart_path: old_path} = state1) do
# Teardown existing connection.
Support.disconnect(state1, "Rebooting firmware")
# Reset state tree
{:ok, next_state} = init(path: old_path, fw_type: state1.fw_type)
Logger.debug("Firmware restart initiated")
{:noreply, next_state}
end
# === SCENARIO: EMERGENCY LOCK - this one gets special
# treatment. It skips all queing mechanisms and dumps
# any tasks that were already queued.
def handle_info({:send_raw, "E"}, %State{} = state) do
Support.uart_send(state.uart_pid, "E\r\n")
msg = "Emergency locked"
txb = TxBuffer.error_all(state.tx_buffer, msg)
Support.lock!()
{:noreply, %{state | tx_buffer: txb}}
end
# === SCENARIO: Direct GCode transmission without queueing
def handle_info({:send_raw, text}, %State{} = state) do
Support.uart_send(state.uart_pid, "#{text}\r\n")
{:noreply, state}
end
# === SCENARIO: Serial cable is unplugged.
def handle_info({:circuits_uart, _, {:error, :eio}}, _) do
{:stop, :cable_unplugged, %State{}}
end
# === SCENARIO: Serial sent us some chars to consume.
def handle_info({:circuits_uart, _, msg}, %State{} = state1)
when is_binary(msg) do
# First, push all messages into a buffer. The result is a
# list of stringly-typed Gcode blocks to be
# processed (if any).
{next_rx_buffer, txt_lines} = process_incoming_text(state1.rx_buffer, msg)
state2 = %{state1 | rx_buffer: next_rx_buffer}
# Then, format GCode strings into Elixir-readable tuples.
gcodes = GCodeDecoder.run(txt_lines)
# Lastly, trigger any relevant side effect(s).
# Example: send userl logs when firmware is locked.
state3 = InboundSideEffects.process(state2, gcodes)
if state3.needs_config && state3.rx_buffer.ready do
Logger.debug("=== Uploading configuration")
{:noreply, FarmbotCore.Firmware.ConfigUploader.upload(state3)}
else
{:noreply, state3}
end
end
def handle_info({:refresh_config, new_keys}, state) do
{:noreply, FarmbotCore.Firmware.ConfigUploader.refresh(state, new_keys)}
end
def handle_info(:toggle_logging, state) do
next_state = %{state | logs_enabled: !state.logs_enabled}
{:noreply, next_state}
end
def handle_info(:best_effort_bug_fix, state) do
silent = state.rx_count < 1
borked = BotState.fetch().informational_settings.firmware_version == nil
if silent || borked do
msg = "Rebooting inactive Farmduino. #{Support.uptime_ms()}"
FarmbotCore.Logger.debug(3, msg)
package =
state.fw_type ||
FarmbotCore.Asset.fbos_config().firmware_hardware
spawn(__MODULE__, :flash_firmware, [self(), package])
else
FarmbotCore.Logger.debug(3, "Farmduino OK")
end
{:noreply, state}
end
# === SCENARIO: Unexpected message from a library or FBOS.
def handle_info(message, %State{} = state) do
Logger.error("UNEXPECTED FIRMWARE MESSAGE: #{inspect(message)}")
{:noreply, state}
end
def handle_call({:start_job, gcode}, caller, %State{} = state) do
if Support.locked?() do
{:reply, {:error, "Device is locked."}, state}
else
next_buffer =
state.tx_buffer
|> TxBuffer.push(caller, gcode)
|> TxBuffer.process_next_message(state.uart_pid)
next_state = %{state | tx_buffer: next_buffer}
{:noreply, next_state}
end
end
def handle_call({:flash_firmware, nil}, _, %State{} = state) do
msg = "Can't flash firmware yet because hardware is unknown."
FarmbotCore.Logger.info(1, msg)
{:reply, :ok, state}
end
def handle_call({:flash_firmware, package}, _, %State{} = state) do
next_state = FarmbotCore.Firmware.Flash.run(state, package)
Process.send_after(self(), :reset_state, 1)
{:reply, :ok, next_state}
end
def terminate(_, _) do
Logger.debug("Firmware terminated.")
BotState.firmware_offline()
end
defp process_incoming_text(rx_buffer, text) do
rx_buffer
|> RxBuffer.puts(text)
|> RxBuffer.gets()
end
end
| 32.035398 | 78 | 0.671961 |
ffaa25455e2c0886c35bcd0ac68e13f49f96d888 | 7,545 | ex | Elixir | lib/sanbase_web/graphql/resolvers/user/user_resolver.ex | rmoorman/sanbase2 | 226784ab43a24219e7332c49156b198d09a6dd85 | [
"MIT"
] | 1 | 2022-01-30T19:51:39.000Z | 2022-01-30T19:51:39.000Z | lib/sanbase_web/graphql/resolvers/user/user_resolver.ex | rmoorman/sanbase2 | 226784ab43a24219e7332c49156b198d09a6dd85 | [
"MIT"
] | null | null | null | lib/sanbase_web/graphql/resolvers/user/user_resolver.ex | rmoorman/sanbase2 | 226784ab43a24219e7332c49156b198d09a6dd85 | [
"MIT"
] | null | null | null | defmodule SanbaseWeb.Graphql.Resolvers.UserResolver do
require Logger
import Sanbase.Utils.ErrorHandling, only: [changeset_errors: 1]
import Absinthe.Resolution.Helpers, except: [async: 1]
alias Sanbase.InternalServices.Ethauth
alias Sanbase.Accounts.{User, UserFollower}
alias SanbaseWeb.Graphql.SanbaseDataloader
def email(%User{email: nil}, _args, _resolution), do: {:ok, nil}
def email(%User{id: id, email: email}, _args, %{
context: %{auth: %{current_user: %User{id: id}}}
}) do
{:ok, email}
end
def email(%User{} = user, _args, _resolution) do
{:ok, User.Public.hide_private_data(user).email}
end
def permissions(
%User{} = user,
_args,
_resolution
) do
{:ok, User.Permissions.permissions(user)}
end
@spec san_balance(%User{}, map(), Absinthe.Resolution.t()) :: {:ok, float()}
def san_balance(
%User{} = user,
_args,
_res
) do
case User.san_balance(user) do
{:ok, san_balance} ->
{:ok, san_balance || 0}
{:error, error} ->
Logger.warn("Error getting a user's san balance. Reason: #{inspect(error)}")
{:nocache, {:ok, 0.0}}
end
end
def api_calls_history(
%User{} = user,
%{from: from, to: to, interval: interval} = args,
_resolution
) do
auth_method = Map.get(args, :auth_method, :all)
Sanbase.Clickhouse.ApiCallData.api_call_history(user.id, from, to, interval, auth_method)
end
def api_calls_count(
%User{} = user,
%{from: from, to: to} = args,
_resolution
) do
auth_method = Map.get(args, :auth_method, :all)
Sanbase.Clickhouse.ApiCallData.api_call_count(user.id, from, to, auth_method)
end
def relays_quota(_root, _args, %{context: %{auth: %{current_user: user}}}) do
{:ok, Sanbase.WalletHunters.RelayQuota.relays_quota(user.id)}
end
def current_user(_root, _args, %{
context: %{auth: %{current_user: user}}
}) do
{:ok, user}
end
def current_user(_root, _args, _context), do: {:ok, nil}
def get_user(_root, %{selector: selector}, _resolution) when map_size(selector) != 1 do
{:error, "Provide exactly one field in the user selector object"}
end
def get_user(_root, %{selector: selector}, _resolution) do
User.by_selector(selector)
end
def following(%User{id: user_id}, _args, _resolution) do
following = UserFollower.followed_by(user_id)
{:ok, %{count: length(following), users: following}}
end
def following2(%User{id: user_id}, _args, _resolution) do
following = UserFollower.followed_by2(user_id)
{:ok, %{count: length(following), users: following}}
end
def followers(%User{id: user_id}, _args, _resolution) do
followers = UserFollower.followers_of(user_id)
{:ok, %{count: length(followers), users: followers}}
end
def followers2(%User{id: user_id}, _args, _resolution) do
followers = UserFollower.followers_of2(user_id)
{:ok, %{count: length(followers), users: followers}}
end
def change_email(_root, %{email: email_candidate}, %{
context: %{auth: %{auth_method: :user_token, current_user: user}}
}) do
with {:ok, user} <- User.update_email_candidate(user, email_candidate),
{:ok, _user} <- User.send_verify_email(user) do
{:ok, %{success: true}}
else
{:error, changeset} ->
message = "Can't change current user's email to #{email_candidate}"
Logger.warn(message)
{:error, message: message, details: changeset_errors(changeset)}
end
end
def email_change_verify(
%{token: email_candidate_token, email_candidate: email_candidate},
%{context: %{device_data: device_data}}
) do
with {:ok, user} <- User.find_by_email_candidate(email_candidate, email_candidate_token),
true <- User.email_candidate_token_valid?(user, email_candidate_token),
{:ok, jwt_tokens} <-
SanbaseWeb.Guardian.get_jwt_tokens(user,
platform: device_data.platform,
client: device_data.client
),
{:ok, user} <- User.update_email_from_email_candidate(user) do
{:ok,
%{
user: user,
token: jwt_tokens.access_token,
access_token: jwt_tokens.access_token,
refresh_token: jwt_tokens.refresh_token
}}
else
_ -> {:error, message: "Login failed"}
end
end
def change_name(_root, %{name: new_name}, %{context: %{auth: %{current_user: user}}}) do
case User.change_name(user, new_name) do
{:ok, user} ->
{:ok, user}
{:error, changeset} ->
{
:error,
message: "Cannot update current user's name to #{new_name}",
details: changeset_errors(changeset)
}
end
end
def change_username(_root, %{username: new_username}, %{context: %{auth: %{current_user: user}}}) do
case User.change_username(user, new_username) do
{:ok, user} ->
{:ok, user}
{:error, changeset} ->
{
:error,
message: "Cannot update current user's username to #{new_username}",
details: changeset_errors(changeset)
}
end
end
def change_avatar(_root, %{avatar_url: avatar_url}, %{
context: %{auth: %{auth_method: :user_token, current_user: user}}
}) do
User.update_avatar_url(user, avatar_url)
|> case do
{:ok, user} ->
{:ok, user}
{:error, changeset} ->
{
:error,
message: "Cannot change the avatar", details: changeset_errors(changeset)
}
end
end
def add_user_eth_address(
_root,
%{signature: signature, address: address, message_hash: message_hash},
%{context: %{auth: %{auth_method: :user_token, current_user: user}}}
) do
with true <- Ethauth.verify_signature(signature, address, message_hash),
{:ok, _} <- User.add_eth_account(user, address) do
{:ok, user}
else
{:error, reason} ->
Logger.warn(
"Could not add an ethereum address for user #{user.id}. Reason: #{inspect(reason)}"
)
{:error, "Could not add an ethereum address."}
end
end
def remove_user_eth_address(_root, %{address: address}, %{
context: %{auth: %{auth_method: :user_token, current_user: user}}
}) do
case User.remove_eth_account(user, address) do
true ->
{:ok, user}
{:error, reason} ->
error_msg =
"Could not remove an ethereum address for user #{user.id}. Reason: #{inspect(reason)}"
{:error, error_msg}
end
end
def update_terms_and_conditions(_root, args, %{
context: %{auth: %{auth_method: :user_token, current_user: user}}
}) do
# Update only the provided arguments
args =
args
|> Enum.reject(fn {_key, value} -> value == nil end)
|> Enum.into(%{})
user
|> User.changeset(args)
|> Sanbase.Repo.update()
|> case do
{:ok, user} ->
{:ok, user}
{:error, changeset} ->
{
:error,
message: "Cannot update current user's terms and conditions",
details: changeset_errors(changeset)
}
end
end
def user_no_preloads(%{user_id: user_id}, _args, %{context: %{loader: loader}}) do
loader
|> Dataloader.load(SanbaseDataloader, :users_by_id, user_id)
|> on_load(fn loader ->
{:ok, Dataloader.get(loader, SanbaseDataloader, :users_by_id, user_id)}
end)
end
end
| 29.357977 | 102 | 0.616037 |
ffaa4aa33d1c8a7314fd7f2b53bea820272ba3ba | 10,318 | ex | Elixir | lib/exfsm.ex | tableturn/exfsm | 38083f1524a8a66e05c99ab6d024c369b9c83726 | [
"MIT"
] | 6 | 2015-01-23T17:44:28.000Z | 2016-09-04T04:00:01.000Z | lib/exfsm.ex | tableturn/exfsm | 38083f1524a8a66e05c99ab6d024c369b9c83726 | [
"MIT"
] | 1 | 2021-08-25T13:28:11.000Z | 2021-08-25T13:28:11.000Z | lib/exfsm.ex | tableturn/exfsm | 38083f1524a8a66e05c99ab6d024c369b9c83726 | [
"MIT"
] | 3 | 2019-02-27T16:54:08.000Z | 2021-03-03T16:28:41.000Z | defmodule ExFSM do
@type fsm_spec :: %{{state_name :: atom, event_name :: atom} => {exfsm_module :: atom,[dest_statename :: atom]}}
@moduledoc """
After `use ExFSM` : define FSM transition handler with `deftrans fromstate({action_name,params},state)`.
A function `fsm` will be created returning a map of the `fsm_spec` describing the fsm.
Destination states are found with AST introspection, if the `{:next_state,xxx,xxx}` is defined
outside the `deftrans/2` function, you have to define them manually defining a `@to` attribute.
For instance :
iex> defmodule Elixir.Door do
...> use ExFSM
...>
...> @doc "Close to open"
...> @to [:opened]
...> deftrans closed({:open, _}, s) do
...> {:next_state, :opened, s}
...> end
...>
...> @doc "Close to close"
...> deftrans closed({:close, _}, s) do
...> {:next_state, :closed, s}
...> end
...>
...> deftrans closed({:else, _}, s) do
...> {:next_state, :closed, s}
...> end
...>
...> @doc "Open to open"
...> deftrans opened({:open, _}, s) do
...> {:next_state, :opened, s}
...> end
...>
...> @doc "Open to close"
...> @to [:closed]
...> deftrans opened({:close, _}, s) do
...> {:next_state, :closed, s}
...> end
...>
...> deftrans opened({:else, _}, s) do
...> {:next_state, :opened, s}
...> end
...> end
...> Door.fsm
%{{:closed, :close} => {Door, [:closed]}, {:closed, :else} => {Door, [:closed]},
{:closed, :open} => {Door, [:opened]}, {:opened, :close} => {Door, [:closed]},
{:opened, :else} => {Door, [:opened]}, {:opened, :open} => {Door, [:opened]}}
iex> Door.docs
%{{:transition_doc, :closed, :close} => "Close to close",
{:transition_doc, :closed, :else} => nil,
{:transition_doc, :closed, :open} => "Close to open",
{:transition_doc, :opened, :close} => "Open to close",
{:transition_doc, :opened, :else} => nil,
{:transition_doc, :opened, :open} => "Open to open"}
"""
defmacro __using__(_opts) do
quote do
import ExFSM
@fsm %{}
@bypasses %{}
@docs %{}
@to nil
@before_compile ExFSM
end
end
defmacro __before_compile__(_env) do
quote do
def fsm, do: @fsm
def event_bypasses, do: @bypasses
def docs, do: @docs
end
end
@doc """
Define a function of type `transition` describing a state and its
transition. The function name is the state name, the transition is the
first argument. A state object can be modified and is the second argument.
deftrans opened({:close_door,_params},state) do
{:next_state,:closed,state}
end
"""
@type transition :: (({event_name :: atom, event_param :: any},state :: any) -> {:next_state,event_name :: atom,state :: any})
defmacro deftrans({state,_meta,[{trans,_param}|_rest]}=signature, body_block) do
quote do
@fsm Map.put(@fsm,{unquote(state),unquote(trans)},{__MODULE__,@to || unquote(Enum.uniq(find_nextstates(body_block[:do])))})
doc = Module.get_attribute(__MODULE__, :doc)
@docs Map.put(@docs,{:transition_doc,unquote(state),unquote(trans)},doc)
def unquote(signature), do: unquote(body_block[:do])
@to nil
end
end
defp find_nextstates({:{},_,[:next_state,state|_]}) when is_atom(state), do: [state]
defp find_nextstates({_,_,asts}), do: find_nextstates(asts)
defp find_nextstates({_,asts}), do: find_nextstates(asts)
defp find_nextstates(asts) when is_list(asts), do: Enum.flat_map(asts,&find_nextstates/1)
defp find_nextstates(_), do: []
defmacro defbypass({event,_meta,_args}=signature,body_block) do
quote do
@bypasses Map.put(@bypasses,unquote(event),__MODULE__)
doc = Module.get_attribute(__MODULE__, :doc)
@docs Map.put(@docs,{:event_doc,unquote(event)},doc)
def unquote(signature), do: unquote(body_block[:do])
end
end
end
defmodule ExFSM.Machine do
@moduledoc """
Module to simply use FSMs defined with ExFSM :
- `ExFSM.Machine.fsm/1` merge fsm from multiple handlers (see `ExFSM` to see how to define one).
- `ExFSM.Machine.event_bypasses/1` merge bypasses from multiple handlers (see `ExFSM` to see how to define one).
- `ExFSM.Machine.event/2` allows you to execute the correct handler from a state and action
Define a structure implementing `ExFSM.Machine.State` in order to
define how to extract handlers and state_name from state, and how
to apply state_name change. Then use `ExFSM.Machine.event/2` in order
to execute transition.
iex> defmodule Elixir.Door1 do
...> use ExFSM
...> deftrans closed({:open_door,_},s) do {:next_state,:opened,s} end
...> end
...> defmodule Elixir.Door2 do
...> use ExFSM
...> @doc "allow multiple closes"
...> defbypass close_door(_,s), do: {:keep_state,Map.put(s,:doubleclosed,true)}
...> @doc "standard door open"
...> deftrans opened({:close_door,_},s) do {:next_state,:closed,s} end
...> end
...> ExFSM.Machine.fsm([Door1,Door2])
%{
{:closed,:open_door}=>{Door1,[:opened]},
{:opened,:close_door}=>{Door2,[:closed]}
}
iex> ExFSM.Machine.event_bypasses([Door1,Door2])
%{close_door: Door2}
iex> defmodule Elixir.DoorState do defstruct(handlers: [Door1,Door2], state: nil, doubleclosed: false) end
...> defimpl ExFSM.Machine.State, for: DoorState do
...> def handlers(d) do d.handlers end
...> def state_name(d) do d.state end
...> def set_state_name(d,name) do %{d|state: name} end
...> end
...> struct(DoorState, state: :closed) |> ExFSM.Machine.event({:open_door,nil})
{:next_state,%{__struct__: DoorState, handlers: [Door1,Door2],state: :opened, doubleclosed: false}}
...> struct(DoorState, state: :closed) |> ExFSM.Machine.event({:close_door,nil})
{:next_state,%{__struct__: DoorState, handlers: [Door1,Door2],state: :closed, doubleclosed: true}}
iex> ExFSM.Machine.find_info(struct(DoorState, state: :opened),:close_door)
{:known_transition,"standard door open"}
iex> ExFSM.Machine.find_info(struct(DoorState, state: :closed),:close_door)
{:bypass,"allow multiple closes"}
iex> ExFSM.Machine.available_actions(struct(DoorState, state: :closed))
[:open_door,:close_door]
"""
defprotocol State do
@doc "retrieve current state handlers from state object, return [Handler1,Handler2]"
def handlers(state)
@doc "retrieve current state name from state object"
def state_name(state)
@doc "set new state name"
def set_state_name(state,state_name)
end
@doc "return the FSM as a map of transitions %{{state,action}=>{handler,[dest_states]}} based on handlers"
@spec fsm([exfsm_module :: atom]) :: ExFSM.fsm_spec
def fsm(handlers) when is_list(handlers), do:
(handlers |> Enum.map(&(&1.fsm)) |> Enum.concat |> Enum.into(%{}))
def fsm(state), do:
fsm(State.handlers(state))
def event_bypasses(handlers) when is_list(handlers), do:
(handlers |> Enum.map(&(&1.event_bypasses)) |> Enum.concat |> Enum.into(%{}))
def event_bypasses(state), do:
event_bypasses(State.handlers(state))
@doc "find the ExFSM Module from the list `handlers` implementing the event `action` from `state_name`"
@spec find_handler({state_name::atom,event_name::atom},[exfsm_module :: atom]) :: exfsm_module :: atom
def find_handler({state_name,action},handlers) when is_list(handlers) do
case Map.get(fsm(handlers),{state_name,action}) do
{handler,_}-> handler
_ -> nil
end
end
@doc "same as `find_handler/2` but using a 'meta' state implementing `ExFSM.Machine.State`"
def find_handler({state,action}), do:
find_handler({State.state_name(state),action},State.handlers(state))
def find_bypass(handlers_or_state,action) do
event_bypasses(handlers_or_state)[action]
end
def infos(handlers,_action) when is_list(handlers), do:
(handlers |> Enum.map(&(&1.docs)) |> Enum.concat |> Enum.into(%{}))
def infos(state,action), do:
infos(State.handlers(state),action)
def find_info(state,action) do
docs = infos(state,action)
if doc = docs[{:transition_doc,State.state_name(state),action}] do
{:known_transition,doc}
else
{:bypass,docs[{:event_doc,action}]}
end
end
@doc "Meta application of the transition function, using `find_handler/2` to find the module implementing it."
@type meta_event_reply :: {:next_state,ExFSM.Machine.State.t} | {:next_state,ExFSM.Machine.State.t,timeout :: integer} | {:error,:illegal_action}
@spec event(ExFSM.Machine.State.t,{event_name :: atom, event_params :: any}) :: meta_event_reply
def event(state,{action,params}) do
case find_handler({state,action}) do
nil ->
case find_bypass(state,action) do
nil-> {:error,:illegal_action}
handler-> case apply(handler,action,[params,state]) do
{:keep_state,state}->{:next_state,state}
{:next_state,state_name,state,timeout} -> {:next_state,State.set_state_name(state,state_name),timeout}
{:next_state,state_name,state} -> {:next_state,State.set_state_name(state,state_name)}
other -> other
end
end
handler ->
case apply(handler,State.state_name(state),[{action,params},state]) do
{:next_state,state_name,state,timeout} -> {:next_state,State.set_state_name(state,state_name),timeout}
{:next_state,state_name,state} -> {:next_state,State.set_state_name(state,state_name)}
other -> other
end
end
end
@spec available_actions(ExFSM.Machine.State.t) :: [action_name :: atom]
def available_actions(state) do
fsm_actions = ExFSM.Machine.fsm(state)
|> Enum.filter(fn {{from,_},_}->from==State.state_name(state) end)
|> Enum.map(fn {{_,action},_}->action end)
bypasses_actions = ExFSM.Machine.event_bypasses(state) |> Map.keys
Enum.uniq(fsm_actions ++ bypasses_actions)
end
@spec action_available?(ExFSM.Machine.State.t,action_name :: atom) :: boolean
def action_available?(state,action) do
action in available_actions(state)
end
end
| 41.272 | 147 | 0.640628 |
ffaa53a3553ae68f30501c0d54313cd9b3831c76 | 8,979 | ex | Elixir | lib/ecto/adapters/postgres.ex | IcyEagle/ecto | 26237057d4ffff2daf5258e181eccc3238b71490 | [
"Apache-2.0"
] | null | null | null | lib/ecto/adapters/postgres.ex | IcyEagle/ecto | 26237057d4ffff2daf5258e181eccc3238b71490 | [
"Apache-2.0"
] | null | null | null | lib/ecto/adapters/postgres.ex | IcyEagle/ecto | 26237057d4ffff2daf5258e181eccc3238b71490 | [
"Apache-2.0"
] | null | null | null | defmodule Ecto.Adapters.Postgres do
@moduledoc """
Adapter module for PostgreSQL.
It uses `postgrex` for communicating to the database
and a connection pool, such as `poolboy`.
## Features
* Full query support (including joins, preloads and associations)
* Support for transactions
* Support for data migrations
* Support for ecto.create and ecto.drop operations
* Support for transactional tests via `Ecto.Adapters.SQL`
## Options
Postgres options split in different categories described
below. All options can be given via the repository
configuration:
config :your_app, YourApp.Repo,
adapter: Ecto.Adapters.Postgres,
...
Non-compile time options can also be returned from the
repository `init/2` callback.
### Compile time options
Those options should be set in the config file and require
recompilation in order to make an effect.
* `:adapter` - The adapter name, in this case, `Ecto.Adapters.Postgres`
### Connection options
* `:pool` - The connection pool module, defaults to `DBConnection.Poolboy`
* `:pool_timeout` - The default timeout to use on pool calls, defaults to `5000`
* `:timeout` - The default timeout to use on queries, defaults to `15000`
* `:hostname` - Server hostname
* `:port` - Server port (default: 5432)
* `:username` - Username
* `:password` - User password
* `:ssl` - Set to true if ssl should be used (default: false)
* `:ssl_opts` - A list of ssl options, see Erlang's `ssl` docs
* `:parameters` - Keyword list of connection parameters
* `:connect_timeout` - The timeout for establishing new connections (default: 5000)
* `:prepare` - How to prepare queries, either `:named` to use named queries
or `:unnamed` to force unnamed queries (default: `:named`)
* `:socket_options` - Specifies socket configuration
The `:socket_options` are particularly useful when configuring the size
of both send and receive buffers. For example, when Ecto starts with a
pool of 20 connections, the memory usage may quickly grow from 20MB to
50MB based on the operating system default values for TCP buffers. It is
advised to stick with the operating system defaults but they can be
tweaked if desired:
socket_options: [recbuf: 8192, sndbuf: 8192]
We also recommend developers to consult the
[Postgrex documentation](https://hexdocs.pm/postgrex/Postgrex.html#start_link/1)
for a complete listing of all supported options.
### Storage options
* `:encoding` - the database encoding (default: "UTF8")
* `:template` - the template to create the database from
* `:lc_collate` - the collation order
* `:lc_ctype` - the character classification
* `:dump_path` - where to place dumped structures
## Extensions
Both PostgreSQL and its adapter for Elixir, Postgrex, support an
extension system. If you want to use custom extensions for Postgrex
alongside Ecto, you must define a type module with your extensions.
Create a new file anywhere in your application with the following:
Postgrex.Types.define(MyApp.PostgresTypes,
[MyExtension.Foo, MyExtensionBar] ++ Ecto.Adapters.Postgres.extensions(),
json: Poison)
Once your type module is defined, you can configure the repository to use it:
config :my_app, MyApp.Repo, types: MyApp.PostgresTypes
"""
# Inherit all behaviour from Ecto.Adapters.SQL
use Ecto.Adapters.SQL, :postgrex
# And provide a custom storage implementation
@behaviour Ecto.Adapter.Storage
@behaviour Ecto.Adapter.Structure
@doc """
All Ecto extensions for Postgrex.
"""
def extensions do
[]
end
# Support arrays in place of IN
@doc false
def dumpers({:embed, _} = type, _), do: [&Ecto.Adapters.SQL.dump_embed(type, &1)]
def dumpers({:in, sub}, {:in, sub}), do: [{:array, sub}]
def dumpers(:binary_id, type), do: [type, Ecto.UUID]
def dumpers(_, type), do: [type]
## Storage API
@doc false
def storage_up(opts) do
database = Keyword.fetch!(opts, :database) || raise ":database is nil in repository configuration"
encoding = opts[:encoding] || "UTF8"
opts = Keyword.put(opts, :database, "postgres")
command =
~s(CREATE DATABASE "#{database}" ENCODING '#{encoding}')
|> concat_if(opts[:template], &"TEMPLATE=#{&1}")
|> concat_if(opts[:lc_ctype], &"LC_CTYPE='#{&1}'")
|> concat_if(opts[:lc_collate], &"LC_COLLATE='#{&1}'")
case run_query(command, opts) do
{:ok, _} ->
:ok
{:error, %{postgres: %{code: :duplicate_database}}} ->
{:error, :already_up}
{:error, error} ->
{:error, Exception.message(error)}
end
end
defp concat_if(content, nil, _fun), do: content
defp concat_if(content, value, fun), do: content <> " " <> fun.(value)
@doc false
def storage_down(opts) do
database = Keyword.fetch!(opts, :database) || raise ":database is nil in repository configuration"
command = "DROP DATABASE \"#{database}\""
opts = Keyword.put(opts, :database, "postgres")
case run_query(command, opts) do
{:ok, _} ->
:ok
{:error, %{postgres: %{code: :invalid_catalog_name}}} ->
{:error, :already_down}
{:error, error} ->
{:error, Exception.message(error)}
end
end
@doc false
def supports_ddl_transaction? do
true
end
@doc false
def structure_dump(default, config) do
table = config[:migration_source] || "schema_migrations"
with {:ok, versions} <- select_versions(table, config),
{:ok, path} <- pg_dump(default, config),
do: append_versions(table, versions, path)
end
defp select_versions(table, config) do
case run_query(~s[SELECT version FROM "#{table}" ORDER BY version], config) do
{:ok, %{rows: rows}} -> {:ok, Enum.map(rows, &hd/1)}
{:error, %{postgres: %{code: :undefined_table}}} -> {:ok, []}
{:error, _} = error -> error
end
end
defp pg_dump(default, config) do
path = config[:dump_path] || Path.join(default, "structure.sql")
File.mkdir_p!(Path.dirname(path))
case run_with_cmd("pg_dump", config, ["--file", path, "--schema-only", "--no-acl",
"--no-owner", config[:database]]) do
{_output, 0} ->
{:ok, path}
{output, _} ->
{:error, output}
end
end
defp append_versions(_table, [], path) do
{:ok, path}
end
defp append_versions(table, versions, path) do
sql =
~s[INSERT INTO "#{table}" (version) VALUES ] <>
Enum.map_join(versions, ", ", &"(#{&1})") <>
~s[;\n\n]
File.open!(path, [:append], fn file ->
IO.write(file, sql)
end)
{:ok, path}
end
@doc false
def structure_load(default, config) do
path = config[:dump_path] || Path.join(default, "structure.sql")
case run_with_cmd("psql", config, ["--quiet", "--file", path, config[:database]]) do
{_output, 0} -> {:ok, path}
{output, _} -> {:error, output}
end
end
## Helpers
defp run_query(sql, opts) do
{:ok, _} = Application.ensure_all_started(:postgrex)
opts =
opts
|> Keyword.drop([:name, :log])
|> Keyword.put(:pool, DBConnection.Connection)
|> Keyword.put(:backoff_type, :stop)
{:ok, pid} = Task.Supervisor.start_link
task = Task.Supervisor.async_nolink(pid, fn ->
{:ok, conn} = Postgrex.start_link(opts)
value = Ecto.Adapters.Postgres.Connection.execute(conn, sql, [], opts)
GenServer.stop(conn)
value
end)
timeout = Keyword.get(opts, :timeout, 15_000)
case Task.yield(task, timeout) || Task.shutdown(task) do
{:ok, {:ok, result}} ->
{:ok, result}
{:ok, {:error, error}} ->
{:error, error}
{:exit, {%{__struct__: struct} = error, _}}
when struct in [Postgrex.Error, DBConnection.Error] ->
{:error, error}
{:exit, reason} ->
{:error, RuntimeError.exception(Exception.format_exit(reason))}
nil ->
{:error, RuntimeError.exception("command timed out")}
end
end
defp run_with_cmd(cmd, opts, opt_args) do
unless System.find_executable(cmd) do
raise "could not find executable `#{cmd}` in path, " <>
"please guarantee it is available before running ecto commands"
end
env =
[{"PGCONNECT_TIMEOUT", "10"}]
env =
if password = opts[:password] do
[{"PGPASSWORD", password}|env]
else
env
end
args =
[]
args =
if username = opts[:username], do: ["-U", username|args], else: args
args =
if port = opts[:port], do: ["-p", to_string(port)|args], else: args
host = opts[:hostname] || System.get_env("PGHOST") || "localhost"
args = ["--host", host|args]
args = args ++ opt_args
System.cmd(cmd, args, env: env, stderr_to_stdout: true)
end
end
| 32.067857 | 102 | 0.63281 |
ffaa6ad5af8a38bfd1984bb9aef34b480a13bd89 | 161 | ex | Elixir | apps/wechat_mp/lib/wechat_mp/api/model/media_id.ex | secretworry/exwechat | 2d3a8bf03135eebd58452122c2f7b3718b5f5b3d | [
"Apache-2.0"
] | null | null | null | apps/wechat_mp/lib/wechat_mp/api/model/media_id.ex | secretworry/exwechat | 2d3a8bf03135eebd58452122c2f7b3718b5f5b3d | [
"Apache-2.0"
] | null | null | null | apps/wechat_mp/lib/wechat_mp/api/model/media_id.ex | secretworry/exwechat | 2d3a8bf03135eebd58452122c2f7b3718b5f5b3d | [
"Apache-2.0"
] | null | null | null | defmodule WechatMP.Api.Model.MediaId do
use WechatBase.Api.Model.JsonResponse
model do
field :type
field :media_id
field :created_at
end
end
| 14.636364 | 39 | 0.726708 |
ffaa947b276a9459ee8f2636da418489053c03c4 | 189 | exs | Elixir | priv/repo/migrations/20201111234045_criar_tarefas.exs | tarcisiooliveira/todo_app | 2a5291390c64cc6a0a593b8d0f671ad899dea034 | [
"MIT"
] | null | null | null | priv/repo/migrations/20201111234045_criar_tarefas.exs | tarcisiooliveira/todo_app | 2a5291390c64cc6a0a593b8d0f671ad899dea034 | [
"MIT"
] | null | null | null | priv/repo/migrations/20201111234045_criar_tarefas.exs | tarcisiooliveira/todo_app | 2a5291390c64cc6a0a593b8d0f671ad899dea034 | [
"MIT"
] | null | null | null | defmodule TodoApp.Repo.Migrations.CriarTarefas do
use Ecto.Migration
def change do
create table(:tarefas) do
add :titulo, :text
add :pronta, :boolean
end
end
end
| 17.181818 | 49 | 0.68254 |
ffaaa9539e05ad91b3986aefa93ab94984b434de | 894 | ex | Elixir | src/schnapps-ex/apps/schnapps_web/lib/schnapps_web/application.ex | korczis/dj-schnapps | 7128988695855161f22d0ef2dd9ef61b46ba6f9a | [
"MIT"
] | null | null | null | src/schnapps-ex/apps/schnapps_web/lib/schnapps_web/application.ex | korczis/dj-schnapps | 7128988695855161f22d0ef2dd9ef61b46ba6f9a | [
"MIT"
] | null | null | null | src/schnapps-ex/apps/schnapps_web/lib/schnapps_web/application.ex | korczis/dj-schnapps | 7128988695855161f22d0ef2dd9ef61b46ba6f9a | [
"MIT"
] | null | null | null | defmodule SchnappsWeb.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
def start(_type, _args) do
# List all child processes to be supervised
children = [
# Start the endpoint when the application starts
SchnappsWeb.Endpoint
# Starts a worker by calling: SchnappsWeb.Worker.start_link(arg)
# {SchnappsWeb.Worker, arg},
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: SchnappsWeb.Supervisor]
Supervisor.start_link(children, opts)
end
# Tell Phoenix to update the endpoint configuration
# whenever the application is updated.
def config_change(changed, _new, removed) do
SchnappsWeb.Endpoint.config_change(changed, removed)
:ok
end
end
| 29.8 | 70 | 0.725951 |
ffaab1211f0bf0d4ab2231355b47055cc8f34d8b | 6,938 | ex | Elixir | lib/codes/codes_q64.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | lib/codes/codes_q64.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | lib/codes/codes_q64.ex | badubizzle/icd_code | 4c625733f92b7b1d616e272abc3009bb8b916c0c | [
"Apache-2.0"
] | null | null | null | defmodule IcdCode.ICDCode.Codes_Q64 do
alias IcdCode.ICDCode
def _Q640 do
%ICDCode{full_code: "Q640",
category_code: "Q64",
short_code: "0",
full_name: "Epispadias",
short_name: "Epispadias",
category_name: "Epispadias"
}
end
def _Q6410 do
%ICDCode{full_code: "Q6410",
category_code: "Q64",
short_code: "10",
full_name: "Exstrophy of urinary bladder, unspecified",
short_name: "Exstrophy of urinary bladder, unspecified",
category_name: "Exstrophy of urinary bladder, unspecified"
}
end
def _Q6411 do
%ICDCode{full_code: "Q6411",
category_code: "Q64",
short_code: "11",
full_name: "Supravesical fissure of urinary bladder",
short_name: "Supravesical fissure of urinary bladder",
category_name: "Supravesical fissure of urinary bladder"
}
end
def _Q6412 do
%ICDCode{full_code: "Q6412",
category_code: "Q64",
short_code: "12",
full_name: "Cloacal extrophy of urinary bladder",
short_name: "Cloacal extrophy of urinary bladder",
category_name: "Cloacal extrophy of urinary bladder"
}
end
def _Q6419 do
%ICDCode{full_code: "Q6419",
category_code: "Q64",
short_code: "19",
full_name: "Other exstrophy of urinary bladder",
short_name: "Other exstrophy of urinary bladder",
category_name: "Other exstrophy of urinary bladder"
}
end
def _Q642 do
%ICDCode{full_code: "Q642",
category_code: "Q64",
short_code: "2",
full_name: "Congenital posterior urethral valves",
short_name: "Congenital posterior urethral valves",
category_name: "Congenital posterior urethral valves"
}
end
def _Q6431 do
%ICDCode{full_code: "Q6431",
category_code: "Q64",
short_code: "31",
full_name: "Congenital bladder neck obstruction",
short_name: "Congenital bladder neck obstruction",
category_name: "Congenital bladder neck obstruction"
}
end
def _Q6432 do
%ICDCode{full_code: "Q6432",
category_code: "Q64",
short_code: "32",
full_name: "Congenital stricture of urethra",
short_name: "Congenital stricture of urethra",
category_name: "Congenital stricture of urethra"
}
end
def _Q6433 do
%ICDCode{full_code: "Q6433",
category_code: "Q64",
short_code: "33",
full_name: "Congenital stricture of urinary meatus",
short_name: "Congenital stricture of urinary meatus",
category_name: "Congenital stricture of urinary meatus"
}
end
def _Q6439 do
%ICDCode{full_code: "Q6439",
category_code: "Q64",
short_code: "39",
full_name: "Other atresia and stenosis of urethra and bladder neck",
short_name: "Other atresia and stenosis of urethra and bladder neck",
category_name: "Other atresia and stenosis of urethra and bladder neck"
}
end
def _Q644 do
%ICDCode{full_code: "Q644",
category_code: "Q64",
short_code: "4",
full_name: "Malformation of urachus",
short_name: "Malformation of urachus",
category_name: "Malformation of urachus"
}
end
def _Q645 do
%ICDCode{full_code: "Q645",
category_code: "Q64",
short_code: "5",
full_name: "Congenital absence of bladder and urethra",
short_name: "Congenital absence of bladder and urethra",
category_name: "Congenital absence of bladder and urethra"
}
end
def _Q646 do
%ICDCode{full_code: "Q646",
category_code: "Q64",
short_code: "6",
full_name: "Congenital diverticulum of bladder",
short_name: "Congenital diverticulum of bladder",
category_name: "Congenital diverticulum of bladder"
}
end
def _Q6470 do
%ICDCode{full_code: "Q6470",
category_code: "Q64",
short_code: "70",
full_name: "Unspecified congenital malformation of bladder and urethra",
short_name: "Unspecified congenital malformation of bladder and urethra",
category_name: "Unspecified congenital malformation of bladder and urethra"
}
end
def _Q6471 do
%ICDCode{full_code: "Q6471",
category_code: "Q64",
short_code: "71",
full_name: "Congenital prolapse of urethra",
short_name: "Congenital prolapse of urethra",
category_name: "Congenital prolapse of urethra"
}
end
def _Q6472 do
%ICDCode{full_code: "Q6472",
category_code: "Q64",
short_code: "72",
full_name: "Congenital prolapse of urinary meatus",
short_name: "Congenital prolapse of urinary meatus",
category_name: "Congenital prolapse of urinary meatus"
}
end
def _Q6473 do
%ICDCode{full_code: "Q6473",
category_code: "Q64",
short_code: "73",
full_name: "Congenital urethrorectal fistula",
short_name: "Congenital urethrorectal fistula",
category_name: "Congenital urethrorectal fistula"
}
end
def _Q6474 do
%ICDCode{full_code: "Q6474",
category_code: "Q64",
short_code: "74",
full_name: "Double urethra",
short_name: "Double urethra",
category_name: "Double urethra"
}
end
def _Q6475 do
%ICDCode{full_code: "Q6475",
category_code: "Q64",
short_code: "75",
full_name: "Double urinary meatus",
short_name: "Double urinary meatus",
category_name: "Double urinary meatus"
}
end
def _Q6479 do
%ICDCode{full_code: "Q6479",
category_code: "Q64",
short_code: "79",
full_name: "Other congenital malformations of bladder and urethra",
short_name: "Other congenital malformations of bladder and urethra",
category_name: "Other congenital malformations of bladder and urethra"
}
end
def _Q648 do
%ICDCode{full_code: "Q648",
category_code: "Q64",
short_code: "8",
full_name: "Other specified congenital malformations of urinary system",
short_name: "Other specified congenital malformations of urinary system",
category_name: "Other specified congenital malformations of urinary system"
}
end
def _Q649 do
%ICDCode{full_code: "Q649",
category_code: "Q64",
short_code: "9",
full_name: "Congenital malformation of urinary system, unspecified",
short_name: "Congenital malformation of urinary system, unspecified",
category_name: "Congenital malformation of urinary system, unspecified"
}
end
end
| 33.843902 | 85 | 0.619343 |
ffaad5e5653cf69587d40882dfa4f872ea7a9380 | 4,259 | ex | Elixir | lib/changelog/data/news/news_queue.ex | theafricanengineer/changelog.com | 4954d0df516c0a325667ec6c1fbbf02d68c9b953 | [
"MIT"
] | null | null | null | lib/changelog/data/news/news_queue.ex | theafricanengineer/changelog.com | 4954d0df516c0a325667ec6c1fbbf02d68c9b953 | [
"MIT"
] | null | null | null | lib/changelog/data/news/news_queue.ex | theafricanengineer/changelog.com | 4954d0df516c0a325667ec6c1fbbf02d68c9b953 | [
"MIT"
] | null | null | null | defmodule Changelog.NewsQueue do
use Changelog.Data
require Logger
alias Changelog.{Buffer, NewsItem, NewsQueue, Notifier, Search}
schema "news_queue" do
field :position, :float
belongs_to :item, NewsItem
end
def queued(query \\ NewsQueue) do
from(q in query,
left_join: i in assoc(q, :item),
where: is_nil(i.published_at),
order_by: q.position)
end
def scheduled(query \\ NewsQueue) do
from(q in query,
left_join: i in assoc(q, :item),
where: not(is_nil(i.published_at)),
order_by: i.published_at)
end
def past(query), do: from([_q, i] in query, where: i.published_at <= ^Timex.now)
def append(item) do
entry = change(%NewsQueue{}, %{item_id: item.id})
entries = Repo.all(NewsQueue.queued)
entry = if length(entries) > 0 do
last_position = List.last(entries).position
change(entry, %{position: last_position + 1.0})
else
change(entry, %{position: 1.0})
end
Repo.insert(entry)
NewsItem.queue!(item)
end
def move(item = %NewsItem{}, to_index) do
entry = Repo.get_by(NewsQueue, item_id: item.id)
move(entry, to_index)
end
def move(entry = %NewsQueue{}, to_index) do
entries = Repo.all(NewsQueue.queued)
current_index = Enum.find_index(entries, &(&1 == entry))
entry = cond do
to_index <= 0 -> # move to top
first = List.first(entries)
change(entry, %{position: first.position / 2})
to_index >= (length(entries) - 1) -> # move to bottom
last = List.last(entries)
change(entry, %{position: last.position + 1})
to_index < current_index -> # move up
pre = Enum.at(entries, to_index - 1)
post = Enum.at(entries, to_index)
change(entry, %{position: (pre.position + post.position) / 2})
to_index > current_index -> # move down
pre = Enum.at(entries, to_index)
post = Enum.at(entries, to_index + 1)
change(entry, %{position: (pre.position + post.position) / 2})
true -> change(entry, %{}) # no move-y
end
Repo.update(entry)
end
def preload_all(query = %Ecto.Query{}), do: Ecto.Query.preload(query, [item: [:author, :logger, :topics]])
def preload_all(entry), do: Repo.preload(entry, [item: [:author, :logger, :topics]])
def prepend(item) do
entry = change(%NewsQueue{}, %{item_id: item.id})
entries = NewsQueue.queued |> Repo.all
entry = if length(entries) > 0 do
first_position = List.first(entries).position
change(entry, %{position: first_position / 2})
else
change(entry, %{position: 1.0})
end
Repo.insert(entry)
NewsItem.queue!(item)
end
def publish_next do
NewsQueue.queued()
|> NewsQueue.limit(1)
|> Repo.one()
|> publish()
end
def publish_scheduled do
NewsQueue.scheduled()
|> NewsQueue.past()
|> NewsQueue.limit(1)
|> Repo.one()
|> publish()
end
def publish do
publish_scheduled() || publish_next_maybe(10 , 60)
end
def publish(item = %NewsItem{}) do
case Repo.get_by(NewsQueue, item_id: item.id) do
entry = %NewsQueue{} -> publish(entry)
nil -> publish_item(item)
end
end
def publish(entry = %NewsQueue{}) do
entry = Repo.preload(entry, :item)
publish_item(entry.item)
Repo.delete!(entry)
true
end
def publish(nil) do
Logger.info("News: Published bupkis")
false
end
defp publish_item(item = %NewsItem{}) do
item = NewsItem.publish!(item)
Task.start_link(fn -> Search.save_item(item) end)
Task.start_link(fn -> Buffer.queue(item) end)
Task.start_link(fn -> Notifier.notify(item) end)
Logger.info("News: Published ##{item.id}")
true
end
defp publish_next_maybe(per_day, interval) do
if one_chance_in(5) && nothing_recent(interval) && no_max(per_day) do
publish_next()
end
end
defp one_chance_in(n), do: Enum.random(1..n) == 1
defp nothing_recent(interval) do
Timex.now()
|> Timex.shift(minutes: -interval)
|> NewsItem.published_since()
|> Repo.count()
|> Kernel.==(0)
end
defp no_max(per_day) do
Timex.now()
|> Timex.shift(days: -1)
|> NewsItem.published_since()
|> Repo.count()
|> Kernel.<(per_day)
end
end
| 26.290123 | 108 | 0.629021 |
ffaaddba1ed93a6c2b29251bd94c5a198fee2a48 | 2,833 | exs | Elixir | test/crit/users/user_api_test.exs | brownt23/crit19 | c45c7b3ae580c193168d83144da0eeb9bc91c8a9 | [
"MIT"
] | 6 | 2019-07-16T19:31:23.000Z | 2021-06-05T19:01:05.000Z | test/crit/users/user_api_test.exs | brownt23/crit19 | c45c7b3ae580c193168d83144da0eeb9bc91c8a9 | [
"MIT"
] | null | null | null | test/crit/users/user_api_test.exs | brownt23/crit19 | c45c7b3ae580c193168d83144da0eeb9bc91c8a9 | [
"MIT"
] | 3 | 2020-02-24T23:38:27.000Z | 2020-08-01T23:50:17.000Z | defmodule Crit.Users.UserApiTest do
use Crit.DataCase
alias Crit.Users.UserApi
alias Crit.Users.UserHavingToken
alias Crit.Users.Schemas.PermissionList
alias Crit.Exemplars.Minimal
# Factor out verbosity. Is also a handy list of what's tested here
def t_fresh_user_changeset(),
do: UserApi.fresh_user_changeset()
def t_permissioned_user(id),
do: UserApi.permissioned_user_from_id(id, @institution)
def t_active_users(),
do: UserApi.active_users(@institution)
def t_create_unactivated_user(params),
do: UserApi.create_unactivated_user(params, @institution)
# ------------------------------------------------------------------------
test "the fresh/default user changeset contains permissions" do
t_fresh_user_changeset()
|> assert_no_changes(:permission_list)
|> assert_data_shape(:permission_list, %PermissionList{})
end
# ------------------------------------------------------------------------
describe "getting a fully permissioned user" do
test "does not exist" do
refute t_permissioned_user(3838)
end
test "brings along permission list" do
original = Minimal.user()
t_permissioned_user(original.id)
|> assert_field(permission_list: original.permission_list)
end
end
# ------------------------------------------------------------------------
test "fetching all *active* users" do
visible = Minimal.user()
_invisible = Minimal.user(active: false)
assert [retrieved] = t_active_users()
assert_field(retrieved, auth_id: visible.auth_id)
end
# ------------------------------------------------------------------------
describe "create_unactivated_user" do
setup do
params = Factory.string_params_for(:user, auth_id: "unique")
%UserHavingToken{user: user, token: token} =
t_create_unactivated_user(params) |> ok_content
[user: user, token: token]
end
test "user has fields", %{user: user} do
assert_field(user, auth_id: "unique")
end
test "token matches user", %{user: user, token: token} do
assert_field(token, user_id: user.id)
end
test "user can be retrieved", %{user: user} do
assert [retrieved] = t_active_users()
assert_fields(retrieved,
auth_id: user.auth_id,
active: true)
end
test "trying to reuse an auth id", %{user: original} do
params = Factory.string_params_for(:user, auth_id: original.auth_id)
t_create_unactivated_user(params)
|> error_content
|> assert_error(auth_id: "has already been taken")
end
test "creating with a bad param" do
params = Factory.string_params_for(:user, auth_id: "")
t_create_unactivated_user(params)
|> error_content
|> assert_error(auth_id: "can't be blank")
end
end
end
| 29.821053 | 76 | 0.619485 |
ffaae86db636510e7dc8fedc014c4fcf60a4382d | 20,556 | exs | Elixir | test/ex_doc/language/elixir_test.exs | cristineguadelupe/ex_doc | d152fe92739dbc5372eef936f4a24fbacdf7cc1e | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | test/ex_doc/language/elixir_test.exs | cristineguadelupe/ex_doc | d152fe92739dbc5372eef936f4a24fbacdf7cc1e | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | test/ex_doc/language/elixir_test.exs | cristineguadelupe/ex_doc | d152fe92739dbc5372eef936f4a24fbacdf7cc1e | [
"Apache-2.0",
"CC-BY-4.0"
] | null | null | null | defmodule ExDoc.Language.ElixirTest do
use ExUnit.Case, async: true
doctest ExDoc.Autolink
import ExUnit.CaptureIO
defp sigil_m(text, []) do
[{:p, _, [ast], _}] = ExDoc.Markdown.to_ast(text, [])
ast
end
setup do
ExDoc.Refs.clear()
:ok
end
describe "autolink_doc/2" do
test "elixir stdlib module" do
assert autolink_doc("String") == ~m"[`String`](https://hexdocs.pm/elixir/String.html)"
assert autolink_doc("Elixir.String") ==
~m"[`Elixir.String`](https://hexdocs.pm/elixir/String.html)"
end
test "other elixir core module" do
assert autolink_doc("IEx.Helpers") ==
~m"[`IEx.Helpers`](https://hexdocs.pm/iex/IEx.Helpers.html)"
end
test "erlang module" do
assert_unchanged(":array")
end
test "unknown module" do
assert_unchanged("Unknown")
assert_unchanged(":unknown")
assert_unchanged("A.b.C")
end
test "project-local module" do
ExDoc.Refs.insert([
{{:module, AutolinkTest.Foo}, :public}
])
assert autolink_doc("AutolinkTest.Foo") == ~m"[`AutolinkTest.Foo`](AutolinkTest.Foo.html)"
assert autolink_doc("String", apps: [:elixir]) == ~m"[`String`](String.html)"
assert autolink_doc("AutolinkTest.Foo", current_module: AutolinkTest.Foo) ==
~m"[`AutolinkTest.Foo`](AutolinkTest.Foo.html#content)"
end
test "remote function" do
ExDoc.Refs.insert([
{{:module, AutolinkTest.Foo}, :public},
{{:function, AutolinkTest.Foo, :foo, 1}, :public},
{{:function, AutolinkTest.Foo, :., 2}, :public},
{{:function, AutolinkTest.Foo, :.., 2}, :public}
])
assert autolink_doc("AutolinkTest.Foo.foo/1") ==
~m"[`AutolinkTest.Foo.foo/1`](AutolinkTest.Foo.html#foo/1)"
assert autolink_doc("AutolinkTest.Foo../2") ==
~m"[`AutolinkTest.Foo../2`](AutolinkTest.Foo.html#./2)"
assert autolink_doc("AutolinkTest.Foo.../2") ==
~m"[`AutolinkTest.Foo.../2`](AutolinkTest.Foo.html#../2)"
assert_unchanged("AutolinkTest.Bad.bar/1")
end
test "elixir stdlib function" do
assert autolink_doc("String.upcase/2") ==
~m"[`String.upcase/2`](https://hexdocs.pm/elixir/String.html#upcase/2)"
end
test "elixir function with default argument" do
assert autolink_doc("Enum.join/1") ==
~m"[`Enum.join/1`](https://hexdocs.pm/elixir/Enum.html#join/1)"
end
test "erlang stdlib function" do
assert autolink_doc(":lists.all/2") ==
~m"[`:lists.all/2`](https://www.erlang.org/doc/man/lists.html#all-2)"
end
test "local function" do
ExDoc.Refs.insert([
{{:module, AutolinkTest.Foo}, :public},
{{:function, AutolinkTest.Foo, :foo, 1}, :public},
{{:function, AutolinkTest.Foo, :., 2}, :public},
{{:function, AutolinkTest.Foo, :.., 2}, :public}
])
assert autolink_doc("foo/1", current_module: AutolinkTest.Foo) == ~m"[`foo/1`](#foo/1)"
assert autolink_doc("./2", current_module: AutolinkTest.Foo) == ~m"[`./2`](#./2)"
assert autolink_doc("../2", current_module: AutolinkTest.Foo) == ~m"[`../2`](#../2)"
assert_unchanged("bar/1", current_module: AutolinkTest.Foo)
end
test "auto-imported function" do
assert autolink_doc("+/2") ==
~m"[`+/2`](https://hexdocs.pm/elixir/Kernel.html#+/2)"
assert autolink_doc("&/1") ==
~m"[`&/1`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#&/1)"
assert autolink_doc("for/1") ==
~m"[`for/1`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#for/1)"
assert autolink_doc("for/1", apps: [:elixir]) ==
~m"[`for/1`](Kernel.SpecialForms.html#for/1)"
end
@tag skip: not Version.match?(System.version(), "~> 1.13")
test "stepped range" do
assert autolink_doc("..///3") ==
~m"[`..///3`](https://hexdocs.pm/elixir/Kernel.html#..///3)"
end
test "elixir callback" do
assert autolink_doc("c:GenServer.handle_call/3") ==
~m"[`GenServer.handle_call/3`](https://hexdocs.pm/elixir/GenServer.html#c:handle_call/3)"
end
test "erlang callback" do
assert autolink_doc("c::gen_server.handle_call/3") ==
~m"[`:gen_server.handle_call/3`](https://www.erlang.org/doc/man/gen_server.html#Module:handle_call-3)"
end
test "elixir type" do
assert autolink_doc("t:Calendar.date/0") ==
~m"[`Calendar.date/0`](https://hexdocs.pm/elixir/Calendar.html#t:date/0)"
end
test "elixir basic & built-in types" do
assert autolink_doc("t:atom/0") ==
~m"[`atom/0`](https://hexdocs.pm/elixir/typespecs.html#basic-types)"
assert autolink_doc("t:keyword/0") ==
~m"[`keyword/0`](https://hexdocs.pm/elixir/typespecs.html#built-in-types)"
assert autolink_doc("t:keyword/0", apps: [:elixir]) ==
~m"[`keyword/0`](typespecs.html#built-in-types)"
end
test "erlang type" do
assert autolink_doc("t::array.array/0") ==
~m"[`:array.array/0`](https://www.erlang.org/doc/man/array.html#type-array)"
end
test "special forms" do
assert autolink_doc("__block__/1", current_module: Kernel.SpecialForms) ==
~m"[`__block__/1`](#__block__/1)"
assert autolink_doc("__aliases__/1", current_module: Kernel.SpecialForms) ==
~m"[`__aliases__/1`](#__aliases__/1)"
end
test "escaping" do
assert autolink_doc("Kernel.SpecialForms.%{}/1") ==
~m"[`Kernel.SpecialForms.%{}/1`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#%25%7B%7D/1)"
assert autolink_doc("Kernel.SpecialForms.%/2") ==
~m"[`Kernel.SpecialForms.%/2`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#%25/2)"
assert autolink_doc("Kernel.SpecialForms.{}/1") ==
~m"[`Kernel.SpecialForms.{}/1`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#%7B%7D/1)"
assert autolink_doc("Kernel.SpecialForms.<<>>/1") ==
~m"[`Kernel.SpecialForms.<<>>/1`](https://hexdocs.pm/elixir/Kernel.SpecialForms.html#%3C%3C%3E%3E/1)"
end
test "custom link" do
assert autolink_doc(~m"[custom text](`String`)") ==
~m"[custom text](https://hexdocs.pm/elixir/String.html)"
assert autolink_doc(~m"[custom text](`String.at/2`)") ==
~m"[custom text](https://hexdocs.pm/elixir/String.html#at/2)"
assert autolink_doc(~m"[custom text](`:lists`)") ==
~m"[custom text](https://www.erlang.org/doc/man/lists.html)"
assert autolink_doc(~m"[custom text](`:lists.all/2`)") ==
~m"[custom text](https://www.erlang.org/doc/man/lists.html#all-2)"
end
test "mix task" do
assert autolink_doc("mix compile.elixir") ==
~m"[`mix compile.elixir`](https://hexdocs.pm/mix/Mix.Tasks.Compile.Elixir.html)"
assert autolink_doc("mix help compile.elixir") ==
~m"[`mix help compile.elixir`](https://hexdocs.pm/mix/Mix.Tasks.Compile.Elixir.html)"
assert autolink_doc("mix help help") ==
~m"[`mix help help`](https://hexdocs.pm/mix/Mix.Tasks.Help.html)"
assert autolink_doc("mix compile.elixir", apps: [:mix]) ==
~m"[`mix compile.elixir`](Mix.Tasks.Compile.Elixir.html)"
assert_unchanged("mix compile.elixir --verbose")
assert_unchanged("mix unknown.task")
end
test "3rd party links" do
assert autolink_doc("EarmarkParser.as_ast/2") ==
~m"[`EarmarkParser.as_ast/2`](https://hexdocs.pm/earmark_parser/EarmarkParser.html#as_ast/2)"
assert autolink_doc("EarmarkParser.as_ast/2", deps: [earmark_parser: "https://example.com/"]) ==
~m"[`EarmarkParser.as_ast/2`](https://example.com/EarmarkParser.html#as_ast/2)"
assert autolink_doc("EarmarkParser.as_ast/2", deps: [earmark_parser: "https://example.com"]) ==
~m"[`EarmarkParser.as_ast/2`](https://example.com/EarmarkParser.html#as_ast/2)"
# extensions are ignored for external links
assert autolink_doc("EarmarkParser.as_ast/2", ext: ".xhtml") ==
~m"[`EarmarkParser.as_ast/2`](https://hexdocs.pm/earmark_parser/EarmarkParser.html#as_ast/2)"
end
test "extras" do
opts = [extras: %{"Foo Bar.md" => "foo-bar", "Bar Baz.livemd" => "bar-baz"}]
assert autolink_doc(~m"[Foo](Foo Bar.md)", opts) == ~m"[Foo](foo-bar.html)"
assert autolink_doc(~m"[Bar](Bar Baz.livemd)", opts) == ~m"[Bar](bar-baz.html)"
assert autolink_doc(~m"[Foo](Foo Bar.md)", [ext: ".xhtml"] ++ opts) ==
~m"[Foo](foo-bar.xhtml)"
assert autolink_doc(~m"[Foo](Foo Bar.md#baz)", opts) == ~m"[Foo](foo-bar.html#baz)"
assert autolink_doc(~m"[Foo](../guide/Foo Bar.md)", opts) == ~m"[Foo](foo-bar.html)"
assert_unchanged(~m"[Foo](http://example.com/foo.md)", opts)
assert_unchanged(~m"[Foo](#baz)", opts)
end
test "special case links" do
assert autolink_doc(~m"`//2`") ==
{:a, [href: "https://hexdocs.pm/elixir/Kernel.html#//2"], [ast("//2")], %{}}
assert autolink_doc(~m"[division](`//2`)") ==
{:a, [href: "https://hexdocs.pm/elixir/Kernel.html#//2"], ["division"], %{}}
assert autolink_doc(~m"`Kernel.//2`") ==
{:a, [href: "https://hexdocs.pm/elixir/Kernel.html#//2"], [ast("Kernel.//2")], %{}}
assert autolink_doc(~m"[division](`Kernel.//2`)") ==
{:a, [href: "https://hexdocs.pm/elixir/Kernel.html#//2"], ["division"], %{}}
end
test "other link" do
assert_unchanged(~m"[`String`](foo.html)")
assert_unchanged(~m"[custom text](foo.html)")
end
test "other" do
assert_unchanged("String.upcase() / 2")
assert_unchanged("String.upcase()/2 ")
assert_unchanged(" String.upcase()/2")
assert_unchanged(":\"atom\"")
assert_unchanged("1 + 2")
assert_unchanged({:p, [], ["hello"], %{}})
end
end
describe "autolink_spec/3" do
test "operators" do
ExDoc.Refs.insert([
{{:module, MyModule}, :public},
{{:type, MyModule, :foo, 0}, :public}
])
assert autolink_spec(quote(do: +foo() :: foo())) ==
~s[+<a href="#t:foo/0">foo</a>() :: <a href="#t:foo/0">foo</a>()]
assert autolink_spec(quote(do: foo() + foo() :: foo())) ==
~s[<a href="#t:foo/0">foo</a>() + <a href="#t:foo/0">foo</a>() :: <a href="#t:foo/0">foo</a>()]
assert autolink_spec(quote(do: -0 :: 0)) == ~s[-0 :: 0]
end
test "locals" do
ExDoc.Refs.insert([
{{:module, MyModule}, :public},
{{:type, MyModule, :foo, 1}, :public},
{{:type, MyModule, :foo, 2}, :public},
{{:type, MyModule, :foo?, 1}, :public},
{{:type, MyModule, :foo!, 1}, :public},
{{:type, MyModule, :bar, 0}, :public},
{{:type, MyModule, :bar, 1}, :public},
{{:type, MyModule, :baz, 1}, :public}
])
assert autolink_spec(quote(do: unquote(:"/=")() :: :ok)) ==
~s[/=() :: :ok]
assert autolink_spec(quote(do: t() :: foo(1))) ==
~s[t() :: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: t() :: bar(foo(1)))) ==
~s[t() :: <a href="#t:bar/1">bar</a>(<a href="#t:foo/1">foo</a>(1))]
assert autolink_spec(quote(do: (t() :: bar(foo(1)) when bat: foo(1)))) ==
~s[t() :: <a href="#t:bar/1">bar</a>(<a href="#t:foo/1">foo</a>(1)) when bat: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: t() :: bar(baz(1)))) ==
~s[t() :: <a href="#t:bar/1">bar</a>(<a href="#t:baz/1">baz</a>(1))]
assert autolink_spec(quote(do: t() :: foo(bar(), bar()))) ==
~s[t() :: <a href="#t:foo/2">foo</a>(<a href="#t:bar/0">bar</a>(), <a href="#t:bar/0">bar</a>())]
assert autolink_spec(quote(do: t() :: foo!(bar()))) ==
~s[t() :: <a href="#t:foo!/1">foo!</a>(<a href="#t:bar/0">bar</a>())]
assert autolink_spec(quote(do: t() :: foo?(bar()))) ==
~s[t() :: <a href="#t:foo?/1">foo?</a>(<a href="#t:bar/0">bar</a>())]
assert autolink_spec(
quote do
t() :: %{
required(bar()) => bar(),
optional(bar()) => bar()
}
end
) ==
~s[t() :: %{required(<a href="#t:bar/0">bar</a>()) => <a href="#t:bar/0">bar</a>(), optional(<a href="#t:bar/0">bar</a>()) => <a href="#t:bar/0">bar</a>()}]
end
test "remotes" do
ExDoc.Refs.insert([
{{:module, AutolinkTest.Foo}, :public},
{{:type, AutolinkTest.Foo, :t, 0}, :public}
])
assert autolink_spec(quote(do: t() :: AutolinkTest.Foo.t())) ==
~s[t() :: <a href="AutolinkTest.Foo.html#t:t/0">AutolinkTest.Foo.t</a>()]
end
test "skip typespec name" do
ExDoc.Refs.insert([
{{:module, MyModule}, :public},
{{:type, MyModule, :foo, 0}, :public},
{{:type, MyModule, :foo, 1}, :public}
])
assert autolink_spec(quote(do: foo() :: foo()))
~s[foo() :: <a href="#t:foo/0">foo</a>()]
assert autolink_spec(quote(do: foo(1) :: foo(1))) ==
~s[foo(1) :: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: (foo(1) :: foo(1) when bat: foo(1)))) ==
~s[foo(1) :: <a href="#t:foo/1">foo</a>(1) when bat: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: bar(foo(1)) :: foo(1))) ==
~s[bar(<a href="#t:foo/1">foo</a>(1)) :: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: (bar(foo(1)) :: foo(1) when bat: foo(1)))) ==
~s[bar(<a href="#t:foo/1">foo</a>(1)) :: <a href="#t:foo/1">foo</a>(1) when bat: <a href="#t:foo/1">foo</a>(1)]
assert autolink_spec(quote(do: bar(foo :: foo(1)) :: foo(1))) ==
~s[bar(foo :: <a href="#t:foo/1">foo</a>(1)) :: <a href="#t:foo/1">foo</a>(1)]
end
test "Elixir stdlib types" do
assert autolink_spec(quote(do: t() :: String.t())) ==
~s[t() :: <a href="https://hexdocs.pm/elixir/String.html#t:t/0">String.t</a>()]
end
test "Elixir basic types" do
assert autolink_spec(quote(do: t() :: atom())) ==
~s[t() :: <a href="https://hexdocs.pm/elixir/typespecs.html#basic-types">atom</a>()]
end
test "Elixir built-in types" do
assert autolink_spec(quote(do: t() :: keyword())) ==
~s[t() :: <a href="https://hexdocs.pm/elixir/typespecs.html#built-in-types">keyword</a>()]
end
test "Erlang stdlib types" do
assert autolink_spec(quote(do: t() :: :sets.set())) ==
~s[t() :: <a href="https://www.erlang.org/doc/man/sets.html#type-set">:sets.set</a>()]
end
test "escape special HTML characters" do
assert autolink_spec(quote(do: term() < term() :: boolean())) ==
~s[<a href="https://hexdocs.pm/elixir/typespecs.html#built-in-types">term</a>() < <a href="https://hexdocs.pm/elixir/typespecs.html#built-in-types">term</a>() :: <a href="https://hexdocs.pm/elixir/typespecs.html#built-in-types">boolean</a>()]
end
test "extensions are ignored for external links" do
assert autolink_spec(quote(do: t() :: String.t()), ext: ".xhtml") ==
~s[t() :: <a href="https://hexdocs.pm/elixir/String.html#t:t/0">String.t</a>()]
end
end
defmodule InMemory do
@callback hello() :: :world
def hello(), do: :world
end
test "in memory" do
assert {:a, _, _, _} = autolink_doc("ExDoc.Language.ElixirTest.InMemory.hello/0")
assert {:a, _, _, _} = autolink_doc("c:ExDoc.Language.ElixirTest.InMemory.hello/0")
# Types are not checked for in memory
assert_unchanged("t:ExDoc.Language.ElixirTest.InMemory.unknown/0")
warn("ExDoc.Language.ElixirTest.InMemory.unknown/0")
warn("c:ExDoc.Language.ElixirTest.InMemory.unknown/0")
end
test "warnings" do
ExDoc.Refs.insert([
{{:module, AutolinkTest.Foo}, :public},
{{:function, AutolinkTest.Foo, :bar, 1}, :hidden},
{{:type, AutolinkTest.Foo, :bad, 0}, :hidden}
])
captured = warn("AutolinkTest.Foo.bar/1", file: "lib/foo.ex", line: 1, id: nil)
assert captured =~
~s[documentation references function "AutolinkTest.Foo.bar/1" but it is hidden\n]
assert captured =~ ~r{lib/foo.ex:1\n$}
assert warn("t:AutolinkTest.Foo.bad/0", file: "lib/foo.ex", id: "AutolinkTest.Foo.foo/0") =~
~s[documentation references type "t:AutolinkTest.Foo.bad/0" but it is hidden or private]
assert warn("t:Elixir.AutolinkTest.Foo.bad/0",
file: "lib/foo.ex",
id: "AutolinkTest.Foo.foo/0"
) =~
~s[documentation references type "t:Elixir.AutolinkTest.Foo.bad/0" but it is hidden or private]
assert warn("t:AutolinkTest.Foo.bad/0", file: "lib/foo.ex", id: "AutolinkTest.Foo.foo/0") =~
~s[documentation references type "t:AutolinkTest.Foo.bad/0" but it is hidden or private]
assert warn("Code.Typespec") =~
~s[documentation references module "Code.Typespec" but it is hidden]
warn("String.upcase/9")
warn("c:GenServer.handle_call/9")
warn("t:Calendar.date/9")
assert warn(fn ->
autolink_spec(quote(do: t() :: String.bad()))
end) =~ ~s[documentation references type "String.bad()"]
assert warn(fn ->
autolink_spec(
quote do
t() :: %{
name: String.bad()
}
end
)
end) =~ ~s[documentation references type "String.bad()"]
assert warn(~m"[Foo](Foo Bar.md)", extras: %{}) =~
~s[documentation references file "Foo Bar.md" but it does not exist]
options = [skip_undefined_reference_warnings_on: ["MyModule"], module_id: "MyModule"]
assert_unchanged("String.upcase/9", options)
assert warn(fn ->
assert autolink_doc(~m"[Bar A](`Bar.A`)") ==
["Bar A"]
end) =~ ~s[module "Bar.A" but it is undefined]
assert_unchanged(~m"`Bar.A`")
assert warn(fn ->
assert autolink_doc(~m"[custom text](`Elixir.Unknown`)") ==
["custom text"]
end) =~ ~s[documentation references module "Elixir.Unknown" but it is undefined]
assert warn(fn ->
assert autolink_doc(~m"[It is Unknown](`Unknown`)") ==
["It is Unknown"]
end) =~ ~s[documentation references module "Unknown" but it is undefined]
assert warn(~m"[Foo task](`mix foo`)", []) =~
~s[documentation references "mix foo" but it is undefined]
assert_unchanged(~m"`mix foo`")
assert warn(~m"[bad](`String.upcase/9`)", extras: []) =~
~s[documentation references function "String.upcase/9" but it is undefined or private]
assert_unchanged(~m"`Unknown`")
assert_unchanged(~m"[Blank](about:blank)")
end
## Helpers
@default_options [
apps: [:myapp],
current_module: MyModule,
module_id: "MyModule",
file: "nofile"
]
defp autolink_doc(ast_or_text, options \\ []) do
ExDoc.Language.Elixir.autolink_doc(ast(ast_or_text), Keyword.merge(@default_options, options))
end
defp assert_unchanged(ast_or_text, options \\ []) do
captured =
capture_io(:stderr, fn ->
do_assert_unchanged(ast_or_text, options)
end)
unless captured == "" do
IO.puts(captured)
raise "failing due to warnings"
end
end
defp do_assert_unchanged(ast_or_text, options) do
assert autolink_doc(ast_or_text, options) == ast(ast_or_text)
end
defp ast(text) when is_binary(text), do: {:code, [class: "inline"], [text], %{}}
defp ast({_, _, _, _} = ast), do: ast
defp warn(fun) when is_function(fun, 0) do
captured = capture_io(:stderr, fun)
case Regex.scan(~r/documentation references/, captured) do
[_] -> :ok
items -> raise "got #{length(items)} warnings in:\n\n#{captured}"
end
captured
end
defp warn(ast_or_text, options \\ []) do
warn(fn ->
do_assert_unchanged(ast_or_text, options)
end)
end
defp autolink_spec(ast, options \\ []) do
ExDoc.Language.Elixir.autolink_spec(ast, Keyword.merge(@default_options, options))
end
end
| 37.23913 | 260 | 0.569663 |
ffab15d9653ab5899e95ecf37a04597b16f25f28 | 803 | exs | Elixir | config/test.exs | RubyFireStudio/phoenix-api-scaffold | 231c11fe6008a47d51e24b8e26fb6f5dc7a24ca6 | [
"MIT"
] | null | null | null | config/test.exs | RubyFireStudio/phoenix-api-scaffold | 231c11fe6008a47d51e24b8e26fb6f5dc7a24ca6 | [
"MIT"
] | null | null | null | config/test.exs | RubyFireStudio/phoenix-api-scaffold | 231c11fe6008a47d51e24b8e26fb6f5dc7a24ca6 | [
"MIT"
] | null | null | null | use Mix.Config
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :reanix, ReanixWeb.Endpoint,
http: [port: 4001],
server: false
# Print only warnings and errors during test
config :logger, level: :warn
# Reduce number of rounds
config :bcrypt_elixir, log_rounds: 4
# Configure your database
config :reanix, Reanix.Repo,
adapter: Ecto.Adapters.Postgres,
username: System.get_env("DATABASE_POSTGRESQL_USERNAME") || "postgres",
password: System.get_env("DATABASE_POSTGRESQL_PASSWORD") || "postgres",
database: "reanix_test",
hostname: "localhost",
pool: Ecto.Adapters.SQL.Sandbox
# Configure Guardian secret key
config :reanix, ReanixWeb.Guardian,
secret_key: "OSMuzr1uWJthItzsyXItnRoM3MLNaZXUwkamEHTwxUBYPPDuQTLPJnMBMiMATRjF"
| 29.740741 | 80 | 0.770859 |
ffab1b5176f3a2a5d362d073474c2ff97737df8f | 913 | ex | Elixir | apps/management/lib/management/registry.ex | rucker/hindsight | 876a5d344c5d8eebbea37684ee07e0a91e4430f0 | [
"Apache-2.0"
] | 26 | 2019-09-20T23:54:45.000Z | 2020-08-20T14:23:32.000Z | apps/management/lib/management/registry.ex | rucker/hindsight | 876a5d344c5d8eebbea37684ee07e0a91e4430f0 | [
"Apache-2.0"
] | 757 | 2019-08-15T18:15:07.000Z | 2020-09-18T20:55:31.000Z | apps/management/lib/management/registry.ex | rucker/hindsight | 876a5d344c5d8eebbea37684ee07e0a91e4430f0 | [
"Apache-2.0"
] | 10 | 2020-02-13T21:24:09.000Z | 2020-05-21T18:39:35.000Z | defmodule Management.Registry do
@moduledoc """
Folds helper functions around `Registry` usage into a
custom `Registry` module.
"""
defmacro __using__(opts) do
name = Keyword.fetch!(opts, :name)
quote location: :keep do
def child_spec(_init_arg) do
Supervisor.child_spec({Registry, keys: :unique, name: unquote(name)}, [])
end
@spec via(key) :: {:via, Registry, {unquote(name), key}} when key: term
def via(key) do
{:via, Registry, {unquote(name), key}}
end
@spec registered_processes() :: list
def registered_processes() do
Registry.select(unquote(name), [{{:"$1", :_, :_}, [], [:"$1"]}])
end
@spec whereis(key :: term) :: pid | :undefined
def whereis(key) do
case Registry.lookup(unquote(name), key) do
[{pid, _}] -> pid
_ -> :undefined
end
end
end
end
end
| 26.852941 | 81 | 0.579409 |
ffab3091c3def38e4eb422fd1a6ccbabe9ecb651 | 892 | exs | Elixir | apps/robby_web/test/controllers/settings_profile_controller_test.exs | puppetlabs/openrobby | a4b70939ee1b878d44cb09d757b7f72e7109ac5d | [
"Apache-2.0"
] | 3 | 2021-04-16T21:54:55.000Z | 2021-04-30T22:15:41.000Z | apps/robby_web/test/controllers/settings_profile_controller_test.exs | puppetlabs/openrobby | a4b70939ee1b878d44cb09d757b7f72e7109ac5d | [
"Apache-2.0"
] | 1 | 2021-06-29T15:54:19.000Z | 2021-06-29T15:54:19.000Z | apps/robby_web/test/controllers/settings_profile_controller_test.exs | puppetlabs/openrobby | a4b70939ee1b878d44cb09d757b7f72e7109ac5d | [
"Apache-2.0"
] | 2 | 2021-04-16T22:23:16.000Z | 2021-05-26T15:52:55.000Z | defmodule RobbyWeb.SettingsProfileControllerTest do
use RobbyWeb.ConnCase
alias RobbyWeb.User
setup do
{:ok, conn: build_conn()}
end
test "should not render profile page if not logged in", %{conn: conn} do
conn = get(conn, settings_profile_path(conn, :show))
assert html_response(conn, 302) =~ "redirected"
assert get_flash(conn, :error) =~ "You must be logged in"
assert redirected_to(conn) == page_path(conn, :index)
end
@tag ldap: true
test "should render profile page if valid id", %{conn: conn} do
Repo.insert!(%User{username: "tom@example.com"})
result =
conn
|> post(session_path(conn, :create), %{
"session" => %{"email" => "tom@example.com", "password" => "cattbutt"}
})
|> fetch_session
|> get(settings_profile_path(conn, :show))
|> html_response(200)
assert result =~ "tom"
end
end
| 27.875 | 78 | 0.644619 |
ffab4c9e031f0d841e570fbba8eedd8873453473 | 904 | exs | Elixir | messages/elixir/mix.exs | plavcik/cucumber | 8559082eaa668c0d5ddb3594666a8966efddfdc3 | [
"MIT"
] | null | null | null | messages/elixir/mix.exs | plavcik/cucumber | 8559082eaa668c0d5ddb3594666a8966efddfdc3 | [
"MIT"
] | null | null | null | messages/elixir/mix.exs | plavcik/cucumber | 8559082eaa668c0d5ddb3594666a8966efddfdc3 | [
"MIT"
] | null | null | null | defmodule CucumberMessages.MixProject do
@moduledoc false
use Mix.Project
@vsn "15.0.0"
@github "https://github.com/cucumber/cucumber/tree/master/messages/elixir"
@name "CucumberMessages"
def project do
[
app: :cucumber_messages,
version: @vsn,
name: @name,
description: description(),
package: package(),
elixir: "~> 1.10",
start_permanent: Mix.env() == :prod,
deps: deps()
]
end
def application do
[
extra_applications: [:logger]
]
end
defp deps do
[
{:protox, "~> 1.3.0"},
{:jason, "~> 1.2"},
{:ex_doc, "~> 0.23", only: :dev, runtime: false}
]
end
defp description() do
"Elixir implementation of the cucumber messages protobuf schema"
end
defp package() do
[
licenses: ["MIT"],
source_url: @github,
links: %{"GitHub" => @github}
]
end
end
| 18.833333 | 76 | 0.575221 |
ffab752c04bbb8c6d5a179b78a88c01040b9163a | 916 | ex | Elixir | lib/sitemap/alternates.ex | isavita/sitemap | 0fa66869646e1bace12f2e60bb08aeaa0f713f38 | [
"MIT",
"Unlicense"
] | 3 | 2019-05-17T00:57:02.000Z | 2020-04-03T12:38:53.000Z | lib/sitemap/alternates.ex | isavita/sitemap | 0fa66869646e1bace12f2e60bb08aeaa0f713f38 | [
"MIT",
"Unlicense"
] | 1 | 2020-02-18T14:15:48.000Z | 2020-02-18T14:15:48.000Z | lib/sitemap/alternates.ex | isavita/sitemap | 0fa66869646e1bace12f2e60bb08aeaa0f713f38 | [
"MIT",
"Unlicense"
] | 1 | 2020-02-18T11:29:25.000Z | 2020-02-18T11:29:25.000Z | defmodule Sitemap.Alternate do
@moduledoc false
@doc false
@spec new(keyword()) :: iodata()
def new(alternates), do: create(Map.new(alternates), 4, ["\n\s\s<xhtml:link"])
defp create(_alt, 0, ["\n\s\s<xhtml:link"]), do: []
defp create(_alt, 0, acc), do: [acc | "/>"]
defp create(%{href: value} = alt, 4, acc) do
create(alt, 3, [acc | "\shref=\"#{URI.to_string(URI.parse(value))}\""])
end
defp create(%{lang: value} = alt, 3, acc) do
create(alt, 2, [acc | "\shreflang=\"#{value}\""])
end
defp create(%{media: value} = alt, 2, acc) do
create(alt, 1, [acc | "\smedia=\"#{value}\""])
end
defp create(%{nofollow: true} = alt, 1, acc) do
create(alt, 0, [acc | "\srel=\"alternate nofollow\""])
end
defp create(%{nofollow: false} = alt, 1, acc) do
create(alt, 0, [acc | "\srel=\"alternate\""])
end
defp create(alt, pos, acc) do
create(alt, pos - 1, acc)
end
end
| 31.586207 | 80 | 0.582969 |
ffab96b3937c267341ccaa89c8929b94a4598ed5 | 1,660 | ex | Elixir | web/controllers/session_controller.ex | olsv/phoenix_app_template | f55b1faddd32b09ce7c11e667bbc9e73e2c80fbf | [
"MIT"
] | null | null | null | web/controllers/session_controller.ex | olsv/phoenix_app_template | f55b1faddd32b09ce7c11e667bbc9e73e2c80fbf | [
"MIT"
] | null | null | null | web/controllers/session_controller.ex | olsv/phoenix_app_template | f55b1faddd32b09ce7c11e667bbc9e73e2c80fbf | [
"MIT"
] | null | null | null | defmodule PhoenixAppTemplate.SessionController do
use PhoenixAppTemplate.Web, :controller
plug Ueberauth
plug :scrub_params, "user" when action in [:create]
alias PhoenixAppTemplate.User
def callback(%{assigns: %{ueberauth_failure: _fails}} = conn, _params) do
conn
|> handle_authentication_error("Failed to authenticate")
end
def callback(%{assigns: %{ueberauth_auth: auth}} = conn, _params) do
case User.get_or_create_by_oauth(auth) do
{:ok, user} ->
conn
|> handle_authenticated(user)
{:error, changeset} ->
conn
|> render(PhoenixAppTemplate.UserView, "new.html", changeset: changeset)
end
end
def new(conn, _params) do
render(conn, "new.html", changeset: User.changeset(%User{}))
end
def create(conn, %{"user" => user_params}) do
case User.authenticate(user_params["email"], user_params["password"]) do
{:ok, user} ->
conn
|> handle_authenticated(user)
{:error, _} ->
conn
|> handle_authentication_error("Invalid Email/Password combination")
end
end
def delete(conn, _params) do
conn
|> Guardian.Plug.sign_out
|> put_flash(:info, "See you later")
|> redirect(to: root_path(conn, :index))
end
defp handle_authenticated(conn, user) do
conn
|> Guardian.Plug.sign_in(user)
|> put_flash(:info, "Welcome back #{user.name}")
|> redirect(to: root_path(conn, :index))
end
defp handle_authentication_error(conn, error_message) do
conn
|> put_flash(:error, error_message)
|> redirect(to: session_path(conn, :new))
|> halt() # prevent double rendering
end
end
| 27.666667 | 80 | 0.658434 |
ffabe2edfa271e43b17e09907e581aec068f8cf6 | 2,843 | ex | Elixir | lib/absinthe/phase/schema/populate_persistent_term.ex | derekbrown/absinthe | 20b6d6970173dcf8d262cad952563e3b0e856289 | [
"MIT"
] | null | null | null | lib/absinthe/phase/schema/populate_persistent_term.ex | derekbrown/absinthe | 20b6d6970173dcf8d262cad952563e3b0e856289 | [
"MIT"
] | null | null | null | lib/absinthe/phase/schema/populate_persistent_term.ex | derekbrown/absinthe | 20b6d6970173dcf8d262cad952563e3b0e856289 | [
"MIT"
] | null | null | null | case Code.ensure_loaded(:persistent_term) do
{:module, _mod} ->
defmodule Absinthe.Phase.Schema.PopulatePersistentTerm do
@moduledoc false
alias Absinthe.Blueprint.Schema
def run(blueprint, opts) do
%{schema_definitions: [schema]} = blueprint
type_list =
Map.new(schema.type_definitions, fn type_def ->
{type_def.identifier, type_def.name}
end)
types_map =
schema.type_artifacts
|> Enum.flat_map(fn type -> [{type.identifier, type}, {type.name, type}] end)
|> Map.new()
referenced_types =
for type_def <- schema.type_definitions,
type_def.__private__[:__absinthe_referenced__],
into: %{},
do: {type_def.identifier, type_def.name}
directive_list =
Map.new(schema.directive_definitions, fn type_def ->
{type_def.identifier, type_def.name}
end)
directives_map =
schema.directive_artifacts
|> Enum.flat_map(fn type -> [{type.identifier, type}, {type.name, type}] end)
|> Map.new()
prototype_schema = Keyword.fetch!(opts, :prototype_schema)
metadata = build_metadata(schema)
implementors = build_implementors(schema)
schema_content = %{
__absinthe_types__: %{
referenced: referenced_types,
all: type_list
},
__absinthe_directives__: directive_list,
__absinthe_interface_implementors__: implementors,
__absinthe_prototype_schema__: prototype_schema,
__absinthe_type__: types_map,
__absinthe_directive__: directives_map,
__absinthe_reference__: metadata
}
schema_name = opts[:schema] || raise "no schema name provided"
put_schema(schema_name, schema_content)
{:ok, blueprint}
end
@dialyzer {:nowarn_function, [put_schema: 2]}
defp put_schema(schema_name, content) do
:persistent_term.put(
{Absinthe.Schema.PersistentTerm, schema_name},
content
)
end
def build_metadata(schema) do
for type <- schema.type_definitions do
{type.identifier, type.__reference__}
end
end
defp build_implementors(schema) do
schema.type_definitions
|> Enum.filter(&match?(%Schema.InterfaceTypeDefinition{}, &1))
|> Map.new(fn iface ->
implementors =
Schema.InterfaceTypeDefinition.find_implementors(iface, schema.type_definitions)
{iface.identifier, Enum.sort(implementors)}
end)
end
end
{:error, _err} ->
defmodule Absinthe.Phase.Schema.PopulatePersistentTerm do
def run(_, _) do
raise "This phase requires OTP >= 21.2"
end
end
end
| 29.926316 | 92 | 0.614843 |
ffabe72562a15e232b6155476cd83a1e04a31a3b | 3,335 | ex | Elixir | lib/chat_api_web/controllers/widget_settings_controller.ex | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 4,942 | 2020-07-20T22:35:28.000Z | 2022-03-31T15:38:51.000Z | lib/chat_api_web/controllers/widget_settings_controller.ex | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 552 | 2020-07-22T01:39:04.000Z | 2022-02-01T00:26:35.000Z | lib/chat_api_web/controllers/widget_settings_controller.ex | ZmagoD/papercups | dff9a5822b809edc4fd8ecf198566f9b14ab613f | [
"MIT"
] | 396 | 2020-07-22T19:27:48.000Z | 2022-03-31T05:25:24.000Z | defmodule ChatApiWeb.WidgetSettingsController do
use ChatApiWeb, :controller
alias ChatApi.{Accounts, WidgetSettings}
alias ChatApi.WidgetSettings.WidgetSetting
action_fallback(ChatApiWeb.FallbackController)
@spec show(Plug.Conn.t(), map()) :: Plug.Conn.t()
def show(conn, %{"account_id" => account_id} = params) do
# TODO: should we create a Phoenix Plug for this pattern?
# Or just improve error handling with `foreign_key_constraint(:account_id)`
if Accounts.exists?(account_id) do
filters = ensure_inbox_filter_included(account_id, params)
widget_settings = WidgetSettings.get_settings_by_account!(account_id, filters)
render(conn, "show.json", widget_settings: widget_settings)
else
send_account_not_found_error(conn, account_id)
end
end
def show(conn, params) do
conn
|> put_status(422)
|> json(%{
error: %{
status: 422,
message: "The account_id is a required parameter",
received: Map.keys(params)
}
})
end
@spec update_metadata(Plug.Conn.t(), map()) :: Plug.Conn.t()
def update_metadata(conn, %{"account_id" => account_id, "metadata" => metadata} = params) do
if Accounts.exists?(account_id) do
with filters <- ensure_inbox_filter_included(account_id, params),
{:ok, widget_settings} <-
WidgetSettings.update_widget_metadata(account_id, metadata, filters) do
render(conn, "update.json", widget_settings: widget_settings)
end
else
send_account_not_found_error(conn, account_id)
end
end
def update_metadata(conn, params) do
conn
|> put_status(422)
|> json(%{
error: %{
status: 422,
message: "The following parameters are required: account_id, metadata",
received: Map.keys(params)
}
})
end
@spec update(Plug.Conn.t(), map()) :: Plug.Conn.t()
def update(conn, %{"widget_settings" => params}) do
with %{account_id: account_id} <- conn.assigns.current_user do
filters = ensure_inbox_filter_included(account_id, params)
{:ok, widget_settings} =
account_id
|> WidgetSettings.get_settings_by_account!(filters)
|> WidgetSettings.update_widget_setting(params)
render(conn, "update.json", widget_settings: widget_settings)
end
end
@spec delete(Plug.Conn.t(), map()) :: Plug.Conn.t()
def delete(conn, %{"id" => id}) do
widget_setting = WidgetSettings.get_widget_setting!(id)
with {:ok, %WidgetSetting{}} <- WidgetSettings.delete_widget_setting(widget_setting) do
send_resp(conn, :no_content, "")
end
end
@spec send_account_not_found_error(Plug.Conn.t(), binary()) :: Plug.Conn.t()
defp send_account_not_found_error(conn, account_id) do
conn
|> put_status(404)
|> json(%{
error: %{
status: 404,
message: "No account found with ID: #{account_id}. Are you pointing at the correct host?",
host: System.get_env("BACKEND_URL") || "localhost"
}
})
end
defp ensure_inbox_filter_included(account_id, params) do
case params do
%{"inbox_id" => inbox_id} when is_binary(inbox_id) ->
params
_ ->
Map.merge(params, %{
"inbox_id" => ChatApi.Inboxes.get_account_primary_inbox_id(account_id)
})
end
end
end
| 30.87963 | 98 | 0.663268 |
ffabf95720b8f5931c2e243b686ba9204f6e4225 | 1,425 | ex | Elixir | clients/apigee/lib/google_api/apigee/v1/model/google_cloud_apigee_v1_announcement_data.ex | EVLedger/elixir-google-api | 61edef19a5e2c7c63848f7030c6d8d651e4593d4 | [
"Apache-2.0"
] | null | null | null | clients/apigee/lib/google_api/apigee/v1/model/google_cloud_apigee_v1_announcement_data.ex | EVLedger/elixir-google-api | 61edef19a5e2c7c63848f7030c6d8d651e4593d4 | [
"Apache-2.0"
] | null | null | null | clients/apigee/lib/google_api/apigee/v1/model/google_cloud_apigee_v1_announcement_data.ex | EVLedger/elixir-google-api | 61edef19a5e2c7c63848f7030c6d8d651e4593d4 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1AnnouncementData do
@moduledoc """
## Attributes
* `message` (*type:* `String.t`, *default:* `nil`) - Details of the announcement.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:message => String.t()
}
field(:message)
end
defimpl Poison.Decoder, for: GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1AnnouncementData do
def decode(value, options) do
GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1AnnouncementData.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Apigee.V1.Model.GoogleCloudApigeeV1AnnouncementData do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 30.319149 | 93 | 0.746667 |
ffac17d77084eb67cbf0ed62aeb9db0405d6397b | 598 | exs | Elixir | nerves.exs | evokly/evokly_things_system | d951f4f2d8e0f5b4f29896ca3c460b4c4f58cd13 | [
"Apache-2.0"
] | null | null | null | nerves.exs | evokly/evokly_things_system | d951f4f2d8e0f5b4f29896ca3c460b4c4f58cd13 | [
"Apache-2.0"
] | null | null | null | nerves.exs | evokly/evokly_things_system | d951f4f2d8e0f5b4f29896ca3c460b4c4f58cd13 | [
"Apache-2.0"
] | null | null | null | use Mix.Config
version =
Path.join(__DIR__, "VERSION")
|> File.read!
|> String.strip
config :nerves_system_evokly_thing, :nerves_env,
type: :system,
version: version,
mirrors: [
"https://github.com/evokly/evokly_things_system/releases/download/v#{version}/nerves_system_evokly_thing-v#{version}.tar.gz"
],
build_platform: Nerves.System.Platforms.BR,
build_config: [
defconfig: "nerves_defconfig",
package_files: [
"rootfs-additions",
"linux-4.4.defconfig",
"fwup.conf",
"cmdline.txt",
"config.txt",
"post-createfs.sh"
]
]
| 23 | 128 | 0.660535 |
ffac68a315ea91380ba9133a369c9d9992e092e9 | 286 | exs | Elixir | calc/calc.exs | ryuichi1208/rand_w_exs | e3d2264ed8065e2b6e63cbc02e479bb795ea2bc6 | [
"MIT"
] | null | null | null | calc/calc.exs | ryuichi1208/rand_w_exs | e3d2264ed8065e2b6e63cbc02e479bb795ea2bc6 | [
"MIT"
] | null | null | null | calc/calc.exs | ryuichi1208/rand_w_exs | e3d2264ed8065e2b6e63cbc02e479bb795ea2bc6 | [
"MIT"
] | null | null | null | defmodule IntegerSamples do
require Integer
def exec do
print_exec(Integer.is_odd(1))
print_exec(Integer.is_odd(2))
print_exec(Integer.is_even(1))
print_exec(Integer.is_even(2))
end
defp print_exec(value) do
IO.inspect value
end
end
IntegerSamples.exec
| 17.875 | 34 | 0.727273 |
ffac6b6a05d860bd83f8eb5a0bfa9125bc213de8 | 257 | exs | Elixir | apps/andi/priv/repo/migrations/20201022192734_change_extract_column_to_action.exs | calebcarroll1/smartcitiesdata | b0f03496f6c592c82ba14aebf6c5996311cf3cd0 | [
"Apache-2.0"
] | 26 | 2019-09-20T23:54:45.000Z | 2020-08-20T14:23:32.000Z | apps/andi/priv/repo/migrations/20201022192734_change_extract_column_to_action.exs | calebcarroll1/smartcitiesdata | b0f03496f6c592c82ba14aebf6c5996311cf3cd0 | [
"Apache-2.0"
] | 757 | 2019-08-15T18:15:07.000Z | 2020-09-18T20:55:31.000Z | apps/andi/priv/repo/migrations/20201022192734_change_extract_column_to_action.exs | calebcarroll1/smartcitiesdata | b0f03496f6c592c82ba14aebf6c5996311cf3cd0 | [
"Apache-2.0"
] | 9 | 2019-11-12T16:43:46.000Z | 2020-03-25T16:23:16.000Z | defmodule Andi.Repo.Migrations.ChangeExtractColumnToAction do
use Ecto.Migration
def change do
rename table("extract_http_step"), :method, to: :action
alter table(:extract_http_step) do
add :protocol, {:array, :string}
end
end
end
| 21.416667 | 61 | 0.719844 |
ffac880c85b4161995174e8f6b8f337cf80dce5a | 516 | ex | Elixir | lib/mastani_server/cms/repo_contributor.ex | DavidAlphaFox/coderplanets_server | 3fd47bf3bba6cc04c9a34698201a60ad2f3e8254 | [
"Apache-2.0"
] | 1 | 2019-05-07T15:03:54.000Z | 2019-05-07T15:03:54.000Z | lib/mastani_server/cms/repo_contributor.ex | DavidAlphaFox/coderplanets_server | 3fd47bf3bba6cc04c9a34698201a60ad2f3e8254 | [
"Apache-2.0"
] | null | null | null | lib/mastani_server/cms/repo_contributor.ex | DavidAlphaFox/coderplanets_server | 3fd47bf3bba6cc04c9a34698201a60ad2f3e8254 | [
"Apache-2.0"
] | null | null | null | defmodule MastaniServer.CMS.RepoContributor do
@moduledoc false
alias __MODULE__
use Ecto.Schema
import Ecto.Changeset
@required_fields ~w(avatar nickname html_url)a
@type t :: %RepoContributor{}
embedded_schema do
field(:avatar, :string)
field(:nickname, :string)
field(:html_url, :string)
end
@doc false
def changeset(%RepoContributor{} = repo_contributor, attrs) do
repo_contributor
|> cast(attrs, @required_fields)
|> validate_required(@required_fields)
end
end
| 21.5 | 64 | 0.722868 |
ffacbb5664fc9ec93aa11ea06481da24f7ee9df4 | 1,916 | ex | Elixir | clients/display_video/lib/google_api/display_video/v1/model/list_location_lists_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | null | null | null | clients/display_video/lib/google_api/display_video/v1/model/list_location_lists_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-12-18T09:25:12.000Z | 2020-12-18T09:25:12.000Z | clients/display_video/lib/google_api/display_video/v1/model/list_location_lists_response.ex | MasashiYokota/elixir-google-api | 975dccbff395c16afcb62e7a8e411fbb58e9ab01 | [
"Apache-2.0"
] | 1 | 2020-10-04T10:12:44.000Z | 2020-10-04T10:12:44.000Z | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.DisplayVideo.V1.Model.ListLocationListsResponse do
@moduledoc """
## Attributes
* `locationLists` (*type:* `list(GoogleApi.DisplayVideo.V1.Model.LocationList.t)`, *default:* `nil`) - The list of location lists. This list will be absent if empty.
* `nextPageToken` (*type:* `String.t`, *default:* `nil`) - A token to retrieve the next page of results. Pass this value in the page_token field in the subsequent call to `ListLocationLists` method to retrieve the next page of results.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:locationLists => list(GoogleApi.DisplayVideo.V1.Model.LocationList.t()),
:nextPageToken => String.t()
}
field(:locationLists, as: GoogleApi.DisplayVideo.V1.Model.LocationList, type: :list)
field(:nextPageToken)
end
defimpl Poison.Decoder, for: GoogleApi.DisplayVideo.V1.Model.ListLocationListsResponse do
def decode(value, options) do
GoogleApi.DisplayVideo.V1.Model.ListLocationListsResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.DisplayVideo.V1.Model.ListLocationListsResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 38.32 | 239 | 0.74739 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.