# test/rp_middleware_test.exs (DriesDeBackker/rp-middleware, MIT)
defmodule RpMiddlewareTest do
use ExUnit.Case
doctest RpMiddleware
test "greets the world" do
assert RpMiddleware.hello() == :world
end
end
# config/config.exs (mdoza/elixir_math, MIT)
# This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure your application as:
#
# config :elixir_math, key: :value
#
# and access this configuration in your application as:
#
# Application.get_env(:elixir_math, :key)
#
# You can also configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env()}.exs"
# test/views/error_view_test.exs (Redorb/SpeakEasy, MIT)
defmodule Speakeasy.ErrorViewTest do
use Speakeasy.ConnCase, async: true
# Bring render/3 and render_to_string/3 for testing custom views
import Phoenix.View
test "renders 404.html" do
assert render_to_string(Speakeasy.ErrorView, "404.html", []) ==
"Page not found"
end
test "render 500.html" do
assert render_to_string(Speakeasy.ErrorView, "500.html", []) ==
"Internal server error"
end
test "render any other" do
assert render_to_string(Speakeasy.ErrorView, "505.html", []) ==
"Internal server error"
end
end
# test/assertions/be_type_test.exs (MeneDev/espec, Apache-2.0)
defmodule BeTypeTest do
use ExUnit.Case, async: true
defmodule SomeSpec do
use ESpec
context "Success" do
it do: :atom |> should(be_atom())
it do: "binary" |> should(be_binary())
it do: <<102>> |> should(be_bitstring())
it do: true |> should(be_boolean())
it do: 1.2 |> should(be_float())
it do: fn -> :ok end |> should(be_function())
it do: 1 |> should(be_integer())
it do: [1, 2, 3] |> should(be_list())
it do: %{a: :b} |> should(be_map())
it do: nil |> should(be_nil())
it do: 1.5 |> should(be_number())
it do: spawn(fn -> :ok end) |> should(be_pid())
it do: hd(Port.list()) |> should(be_port())
it do: make_ref() |> should(be_reference())
it do: {:a, :b} |> should(be_tuple())
it do: fn _a, _b -> :ok end |> should(be_function 2)
end
context "Error" do
it do: 1 |> should(be_atom())
it do: :atom |> should_not(be_atom())
it do: 5 |> should(be_nil())
it do: nil |> should_not(be_nil())
it do: fn _a -> :ok end |> should(be_function 2)
it do: fn _a, _b -> :ok end |> should_not(be_function 2)
end
end
setup_all do
examples = ESpec.SuiteRunner.run_examples(SomeSpec.examples(), true)
{:ok, success: Enum.slice(examples, 0, 15), errors: Enum.slice(examples, 16, 21)}
end
test "Success", context do
Enum.each(context[:success], &assert(&1.status == :success))
end
test "Errors", context do
Enum.each(context[:errors], &assert(&1.status == :failure))
end
end
# lib/captain_hook/clients/response.ex (annatel/captain_hook, MIT)
defmodule CaptainHook.Clients.Response do
@moduledoc false
defstruct client_error_message: nil,
request_body: nil,
request_headers: nil,
request_method: nil,
request_url: nil,
requested_at: nil,
response_body: nil,
response_http_status: nil,
responded_at: nil,
success?: nil
@type t :: %__MODULE__{
client_error_message: binary | nil,
request_body: binary | nil,
request_headers: map | nil,
request_method: binary | nil,
request_url: binary | nil,
requested_at: DateTime.t() | nil,
response_body: binary | nil,
response_http_status: binary | nil,
responded_at: binary | nil,
success?: boolean | nil
}
end
# apps/snitch_core/test/data/model/line_item_test.exs (saurabharch/avia, MIT)
defmodule Snitch.Data.Model.LineItemTest do
use ExUnit.Case, async: true
use Snitch.DataCase
import Mox, only: [expect: 3, verify_on_exit!: 1]
import Snitch.Factory
alias Snitch.Data.Model.{LineItem, Order}
alias Snitch.Data.Model.GeneralConfiguration, as: GCModel
describe "with valid params" do
setup :variants
setup :good_line_items
test "update_unit_price/1", context do
%{line_items: line_items} = context
priced_items = LineItem.update_unit_price(line_items)
assert Enum.all?(priced_items, fn %{unit_price: price} -> not is_nil(price) end)
end
test "compute_total/1", context do
%{line_items: line_items} = context
priced_items = LineItem.update_unit_price(line_items)
assert %Money{} = LineItem.compute_total(priced_items)
end
end
describe "with invalid params" do
setup :variants
setup :bad_line_items
test "update_unit_price/1", %{line_items: line_items} do
priced_items = LineItem.update_unit_price(line_items)
assert [
%{quantity: 2, product_id: -1},
%{quantity: nil, unit_price: %Money{}, product_id: _},
%{quantity: 2, product_id: nil}
] = priced_items
end
end
describe "compute_total/1 with empty list" do
setup :verify_on_exit!
test "when default currency is set in the store" do
config = insert(:general_config)
assert GCModel.fetch_currency() == config.currency
assert Money.zero(config.currency) == LineItem.compute_total([])
end
test "when default currency is not set" do
assert GCModel.fetch_currency() == "USD"
assert Money.zero("USD") == LineItem.compute_total([])
end
end
describe "create/1" do
setup :variants
setup :stock_items_setup
@tag stock_item_count: 1
test "fails without an existing order or variant", %{stock_items: [si]} do
product = si.product
assert {:error, changeset} = LineItem.create(%{line_item_params(product) | order_id: -1})
assert %{order: ["does not exist"]} == errors_on(changeset)
assert {:error, changeset} = LineItem.create(line_item_params(product))
assert %{order_id: ["can't be blank"]} == errors_on(changeset)
order = insert(:order)
assert {:error, changeset} =
LineItem.create(%{line_item_params(product) | product_id: nil, order_id: order.id})
assert %{product_id: ["can't be blank"]} == errors_on(changeset)
end
test "for an empty order" do
stock_item = insert(:stock_item)
variant = stock_item.product
order = insert(:order, line_items: [])
assert {:ok, lineitem} = LineItem.create(%{line_item_params(variant) | order_id: order.id})
end
@tag variant_count: 1
test "fails if stock insufficient", %{variants: [variant]} do
order = insert(:order, line_items: [])
assert {:error, changeset} =
LineItem.create(%{line_item_params(variant) | order_id: order.id})
assert %{stock: ["Stock Insufficient"]} = errors_on(changeset)
end
end
describe "update/1" do
setup :orders
test "with valid params", %{orders: [order]} do
stock_item = insert(:stock_item, count_on_hand: 5)
product = stock_item.product
order = struct(order, line_items(%{order: order, variants: [product]}))
[li] = order.line_items
params = %{quantity: li.quantity + 1}
assert {:ok, _} = LineItem.update(li, params)
end
end
describe "delete/1 for order in `cart` state" do
setup :variants
setup :orders
@tag variant_count: 1
test "with valid params", %{variants: [v], orders: [order]} do
order = struct(order, line_items(%{order: order, variants: [v]}))
[line_item] = order.line_items
{:ok, _} = LineItem.delete(line_item)
assert [] =
order.id
|> Order.get()
|> Repo.preload(:line_items)
|> Map.fetch!(:line_items)
end
end
defp good_line_items(context) do
%{variants: vs} = context
quantities = Stream.cycle([2])
line_items =
vs
|> Stream.zip(quantities)
|> Enum.reduce([], fn {variant, quantity}, acc ->
[%{product_id: variant.id, quantity: quantity} | acc]
end)
[line_items: line_items]
end
defp bad_line_items(context) do
%{variants: [one, two, three]} = context
variants = [%{one | id: -1}, two, %{three | id: nil}]
quantities = [2, nil, 2]
line_items =
variants
|> Stream.zip(quantities)
|> Enum.map(fn {variant, quantity} ->
%{product_id: variant.id, quantity: quantity}
end)
[line_items: line_items]
end
defp line_item_params(variant) do
%{
quantity: 1,
unit_price: variant.selling_price,
total: variant.selling_price,
order_id: nil,
product_id: variant.id
}
end
end
defmodule Snitch.Data.Model.LineItemDocTest do
use ExUnit.Case, async: true
use Snitch.DataCase
import Snitch.Factory
alias Snitch.Data.Model
setup do
insert(:variant)
:ok
end
doctest Snitch.Data.Model.LineItem
end
# lib/error_response.ex (karlosmid/exkeycdn, MIT)
defmodule ExKeyCDN.ErrorResponse do
@moduledoc """
A general purpose response wrapper that is built for any failed API
response.
See the following pages for details about the various responses:
* https://www.keycdn.com/api#errors
"""
use ExKeyCDN.Construction
@type t :: %__MODULE__{
errors: map,
message: String.t(),
params: map,
transaction: map
}
defstruct errors: %{},
message: "",
params: %{},
transaction: %{}
end
# lib/ptr_web/controllers/session_controller.ex (francocatena/ptr, MIT)
defmodule PtrWeb.SessionController do
use PtrWeb, :controller
plug(:authenticate when action in [:delete])
def new(%{assigns: %{current_session: session}} = conn, _params)
when is_map(session) do
redirect(conn, to: Routes.root_path(conn, :index))
end
def new(conn, _params) do
render(conn, "new.html")
end
def create(conn, %{"session" => %{"email" => email, "password" => password}}) do
case Ptr.Accounts.authenticate_by_email_and_password(email, password) do
{:ok, user} ->
conn
|> put_flash(:info, gettext("Welcome!"))
|> put_session(:user_id, user.id)
|> put_session(:account_id, user.account_id)
|> configure_session(renew: true)
|> redirect_to_next_path()
{:error, :unauthorized} ->
conn
|> put_flash(:error, dgettext("sessions", "Invalid email/password combination"))
|> render("new.html")
end
end
def delete(conn, _) do
conn
|> clear_session()
|> put_flash(:info, gettext("See you soon!"))
|> redirect(to: Routes.root_path(conn, :index))
end
defp redirect_to_next_path(conn) do
path = get_session(conn, :previous_url) || Routes.root_path(conn, :index)
conn
|> delete_session(:previous_url)
|> redirect(to: path)
end
end
# lib/yum/util.ex (ZURASTA/yum, BSD-2-Clause)
defmodule Yum.Util do
@moduledoc """
Common utilities for interacting with the data.
"""
defp create_parent_ref([_|groups]), do: create_parent_ref(groups, "")
defp create_parent_ref([_], ""), do: nil
defp create_parent_ref([_], ref), do: ref
defp create_parent_ref([current|groups], ref), do: create_parent_ref(groups, "#{ref}/#{current}")
@doc """
Get the parent ref.
This can be used to either refer to the parent or find the parent.
"""
@spec group_ref(String.t) :: String.t | nil
def group_ref(ref) do
String.split(ref, "/")
|> create_parent_ref
end
@doc """
Get the reference name.
"""
@spec name(String.t) :: String.t
def name(ref) do
String.split(ref, "/")
|> List.last
end
@doc """
Convert a list of names into a list of refs.
"""
@spec ref_list([String.t], [String.t]) :: [String.t]
def ref_list(groups, list \\ [])
def ref_list([], [_|list]), do: list
def ref_list([group|groups], []), do: ref_list(groups, [group, group])
def ref_list([group|groups], [prev|list]) do
ref = prev <> "/" <> group
ref_list(groups, [ref, ref|list])
end
@doc """
Convert a ref or list of refs into a list of refs for each individual ref in the group.
"""
@spec match_ref(String.t | [String.t], [String.t]) :: [String.t]
def match_ref(ref, matches \\ [])
def match_ref([], matches), do: matches
def match_ref([ref|list], matches), do: match_ref(list, match_ref(ref, matches))
def match_ref(ref, matches), do: String.split(ref, "/", trim: true) |> ref_list([""|matches])
end
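# A usage sketch of Yum.Util. The refs below are made-up examples; the
# return values are traced from the function clauses as written, which
# expect slash-prefixed refs (as produced by match_ref/2) for group_ref/1.

```elixir
alias Yum.Util

Util.name("/vegetable/root/carrot")
#=> "carrot"

# The parent of a top-level group is nil; otherwise the last segment is dropped:
Util.group_ref("/vegetable")
#=> nil
Util.group_ref("/vegetable/root/carrot")
#=> "/vegetable/root"

# Expand a list of group names into cumulative refs, deepest first:
Util.ref_list(["vegetable", "root", "carrot"])
#=> ["vegetable/root/carrot", "vegetable/root", "vegetable"]

# match_ref/2 accepts a ref (or a list of refs) and yields a slash-prefixed
# ref for every level of the group:
Util.match_ref("vegetable/root/carrot")
#=> ["/vegetable/root/carrot", "/vegetable/root", "/vegetable"]
```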
# test/wunderground/conditions_test.exs (optikfluffel/wunderground, Unlicense)
defmodule Wunderground.ConditionsTest do
@moduledoc false
use ExUnit.Case, async: false
use ExVCR.Mock, adapter: ExVCR.Adapter.Hackney
alias Wunderground.Conditions
@not_found {:not_found, "No cities match your search query"}
@station_offline {:station_offline, "The station you're looking for either doesn't exist or is simply offline right now."}
describe "get/1" do
test "us" do
use_cassette "conditions/us" do
assert {:ok, %Conditions{}} = Conditions.get({:us, "CA", "San_Francisco"})
end
end
test "us not_found" do
use_cassette "conditions/us_not_found" do
assert {:error, @not_found} = Conditions.get({:us, "CA", "USDUBFOURZEGBNUIZDSNGIUZFV"})
end
end
test "us_zip" do
use_cassette "conditions/us_zip" do
assert {:ok, %Conditions{}} = Conditions.get({:us_zip, 60290})
end
end
test "us_zip not_found" do
use_cassette "conditions/us_zip_not_found" do
assert {:error, @not_found} = Conditions.get({:us_zip, -1})
end
end
test "international" do
use_cassette "conditions/international" do
assert {:ok, %Conditions{}} = Conditions.get({:international, "Australia", "Sydney"})
end
end
test "international not_found" do
use_cassette "conditions/international_not_found" do
assert {:error, @not_found} = Conditions.get({:international, "Australia", "AUDUBFOURZEGBNUIZDSNGIUZFV"})
end
end
test "geo" do
use_cassette "conditions/geo" do
assert {:ok, %Conditions{}} = Conditions.get({:geo, 37.8, -122.4})
end
end
test "geo not_found" do
use_cassette "conditions/geo_not_found" do
assert {:error, @not_found} = Conditions.get({:geo, 2500.0, -5000.0})
end
end
test "airport" do
use_cassette "conditions/airport" do
assert {:ok, %Conditions{}} = Conditions.get({:airport, "KJFK"})
end
end
test "airport not_found" do
use_cassette "conditions/airport_not_found" do
assert {:error, @not_found} = Conditions.get({:airport, "AIRUBFOURZEGBNUIZDSNGIUZFV"})
end
end
test "pws" do
use_cassette "conditions/pws" do
assert {:ok, %Conditions{}} = Conditions.get({:pws, "KCASANFR70"})
end
end
test "pws not_found" do
use_cassette "conditions/pws_not_found" do
assert {:error, @station_offline} = Conditions.get({:pws, "NOT_A_PWS_ID"})
end
end
test "auto_ip" do
use_cassette "conditions/auto_ip" do
assert {:ok, %Conditions{}} = Conditions.get({:auto_ip})
end
end
test "auto_ip with given ip address" do
use_cassette "conditions/auto_ip_custom" do
assert {:ok, %Conditions{}} = Conditions.get({:auto_ip, {185, 1, 74, 1}})
end
end
test "auto_ip with 'wrong' ip address tuple" do
assert_raise ArgumentError, fn ->
Conditions.get({:auto_ip, {"185", "1", "74", "1"}})
end
end
test "auto_ip ArgumentError when no 4 element tuple is given" do
assert_raise ArgumentError, fn ->
Conditions.get({:auto_ip, "185.1.74.1"})
end
end
test "ArgumentError" do
assert_raise ArgumentError, fn ->
Conditions.get(:not_an_argument)
end
end
end
end
# lib/codes/codes_a55.ex (badubizzle/icd_code, Apache-2.0)
defmodule IcdCode.ICDCode.Codes_A55 do
alias IcdCode.ICDCode
def _A55 do
%ICDCode{full_code: "A55",
category_code: "A55",
short_code: "",
full_name: "Chlamydial lymphogranuloma (venereum)",
short_name: "Chlamydial lymphogranuloma (venereum)",
category_name: "Chlamydial lymphogranuloma (venereum)"
}
end
end
# lib/tune/spotify/schema/playlist.ex (pedromtavares/tune, MIT)
defmodule Tune.Spotify.Schema.Playlist do
@moduledoc """
Represents a playlist.
"""
alias Tune.{Duration, Spotify.Schema}
alias Schema.{Album, Track}
@enforce_keys [
:id,
:uri,
:name,
:description,
:thumbnails,
:tracks
]
defstruct [
:id,
:uri,
:name,
:description,
:thumbnails,
:tracks
]
@type id :: Schema.id()
@type t :: %__MODULE__{
id: id(),
uri: Schema.uri(),
name: String.t(),
description: String.t(),
thumbnails: Schema.thumbnails(),
tracks: [Track.t()] | :not_fetched
}
@spec total_duration_ms(t()) :: Duration.milliseconds() | :not_available
def total_duration_ms(playlist) do
case playlist.tracks do
:not_fetched ->
:not_available
tracks ->
Enum.reduce(tracks, 0, fn track, total_duration_ms ->
total_duration_ms + track.duration_ms
end)
end
end
@spec tracks_count(t()) :: pos_integer() | :not_available
def tracks_count(playlist) do
case playlist.tracks do
:not_fetched ->
:not_available
tracks ->
Enum.count(tracks)
end
end
@spec albums_grouped_by_type(t()) :: %{Album.album_type() => [Album.t()]} | :not_available
def albums_grouped_by_type(playlist) do
case playlist.tracks do
:not_fetched ->
:not_available
tracks ->
Enum.group_by(tracks, & &1.album.album_type, & &1.album)
end
end
end
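# A sketch of how the helpers above behave. Plain maps stand in for the
# %Track{}/%Album{} structs (the functions only read duration_ms and album),
# and the id/uri/name values are invented for the example.

```elixir
alias Tune.Spotify.Schema.Playlist

tracks = [
  %{duration_ms: 210_000, album: %{album_type: "album", name: "A"}},
  %{duration_ms: 180_000, album: %{album_type: "single", name: "B"}}
]

playlist = %Playlist{
  id: "playlist-1",
  uri: "spotify:playlist:playlist-1",
  name: "Morning Mix",
  description: "Example playlist",
  thumbnails: %{},
  tracks: tracks
}

Playlist.total_duration_ms(playlist)
#=> 390_000
Playlist.tracks_count(playlist)
#=> 2
Playlist.albums_grouped_by_type(playlist)
#=> %{"album" => [%{album_type: "album", name: "A"}],
#     "single" => [%{album_type: "single", name: "B"}]}

# When tracks have not been fetched, every helper degrades gracefully:
Playlist.tracks_count(%{playlist | tracks: :not_fetched})
#=> :not_available
```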
# mix.exs (milangoda/tesseract-elixir, MIT)
defmodule Tesseract.Mixfile do
use Mix.Project
def project do
[app: :tesseract,
version: "0.0.1",
elixir: "~> 1.2",
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
deps: deps]
end
# Configuration for the OTP application
#
# Type "mix help compile.app" for more information
def application do
[applications: [:logger]]
end
# Dependencies can be Hex packages:
#
# {:mydep, "~> 0.3.0"}
#
# Or git/path repositories:
#
# {:mydep, git: "https://github.com/elixir-lang/mydep.git", tag: "0.1.0"}
#
# Type "mix help deps" for more examples and options
defp deps do
[]
end
end
# test/support/conn_case.ex (korczis/tosh, MIT)
defmodule ToshWeb.ConnCase do
@moduledoc """
This module defines the test case to be used by
tests that require setting up a connection.
Such tests rely on `Phoenix.ConnTest` and also
import other functionality to make it easier
to build common data structures and query the data layer.
Finally, if the test case interacts with the database,
it cannot be async. For this reason, every test runs
inside a transaction which is reset at the beginning
of the test unless the test case is marked as async.
"""
use ExUnit.CaseTemplate
using do
quote do
# Import conveniences for testing with connections
use Phoenix.ConnTest
alias ToshWeb.Router.Helpers, as: Routes
# The default endpoint for testing
@endpoint ToshWeb.Endpoint
end
end
setup tags do
:ok = Ecto.Adapters.SQL.Sandbox.checkout(Tosh.Repo)
unless tags[:async] do
Ecto.Adapters.SQL.Sandbox.mode(Tosh.Repo, {:shared, self()})
end
{:ok, conn: Phoenix.ConnTest.build_conn()}
end
end
# lib/ex_oneroster/academic_sessions/academic_sessions.ex (jrissler/ex_oneroster, Apache-2.0)
defmodule ExOneroster.AcademicSessions do
@moduledoc """
The boundary for the AcademicSessions system.
"""
import Ecto.Query, warn: false
alias ExOneroster.Repo
alias ExOneroster.AcademicSessions.AcademicSession
@doc """
Returns the list of academic_sessions.
## Examples
iex> list_academic_sessions()
[%AcademicSession{}, ...]
"""
def list_academic_sessions do
Repo.all(AcademicSession) |> Repo.preload([:parent, :children])
end
@doc """
Gets a single academic_session.
Raises `Ecto.NoResultsError` if the Academic session does not exist.
## Examples
iex> get_academic_session!(123)
%AcademicSession{}
iex> get_academic_session!(456)
** (Ecto.NoResultsError)
"""
def get_academic_session!(id), do: Repo.get!(AcademicSession, id) |> Repo.preload([:parent, :children])
@doc """
Creates a academic_session.
## Examples
iex> create_academic_session(%{field: value})
{:ok, %AcademicSession{}}
iex> create_academic_session(%{field: bad_value})
{:error, %Ecto.Changeset{}}
"""
def create_academic_session(attrs \\ %{}) do
%AcademicSession{}
|> Repo.preload([:parent, :children])
|> AcademicSession.changeset(attrs)
|> Repo.insert()
end
@doc """
Updates a academic_session.
## Examples
iex> update_academic_session(academic_session, %{field: new_value})
{:ok, %AcademicSession{}}
iex> update_academic_session(academic_session, %{field: bad_value})
{:error, %Ecto.Changeset{}}
"""
def update_academic_session(%AcademicSession{} = academic_session, attrs) do
academic_session
|> AcademicSession.changeset(attrs)
|> Repo.update()
end
@doc """
Deletes a AcademicSession.
## Examples
iex> delete_academic_session(academic_session)
{:ok, %AcademicSession{}}
iex> delete_academic_session(academic_session)
{:error, %Ecto.Changeset{}}
"""
def delete_academic_session(%AcademicSession{} = academic_session) do
Repo.delete(academic_session)
end
@doc """
Returns an `%Ecto.Changeset{}` for tracking academic_session changes.
## Examples
iex> change_academic_session(academic_session)
%Ecto.Changeset{source: %AcademicSession{}}
"""
def change_academic_session(%AcademicSession{} = academic_session) do
AcademicSession.changeset(academic_session, %{})
end
end
# web/views/admin_view.ex (allen-garvey/artour, MIT)
defmodule Artour.AdminView do
use Artour.Web, :view
end
# lib/msgpax/bin.ex (gskynet/msgpax, 0BSD)
defmodule Msgpax.Bin do
@moduledoc """
A struct to represent the MessagePack [Binary
type](https://github.com/msgpack/msgpack/blob/master/spec.md#formats-bin).
Elixir binaries are serialized and de-serialized as [MessagePack
strings](https://github.com/msgpack/msgpack/blob/master/spec.md#formats-str):
`Msgpax.Bin` is used when you want to enforce the serialization of a binary
into the Binary MessagePack type. De-serialization functions (such as
`Msgpax.unpack/2`) provide an option to deserialize Binary terms (which are
de-serialized to Elixir binaries by default) to `Msgpax.Bin` structs.
"""
@type t :: %__MODULE__{
data: binary
}
defstruct [:data]
@doc """
Creates a new `Msgpax.Bin` struct from the given binary.
## Examples
iex> Msgpax.Bin.new("foo")
#Msgpax.Bin<"foo">
"""
def new(data) when is_binary(data) do
%__MODULE__{data: data}
end
defimpl Inspect do
import Inspect.Algebra
def inspect(%{data: data}, opts) do
concat(["#Msgpax.Bin<", Inspect.BitString.inspect(data, opts), ">"])
end
end
end
| 27.073171 | 79 | 0.686486 |
08f735a9e800f591212a9a1189be16dfda4607aa | 1,078 | ex | Elixir | test/support/conn_case.ex | deadmp/wiki | 29d98ed0517e26d2e3f1c75f70852bc7512d87fc | [
"MIT"
] | null | null | null | test/support/conn_case.ex | deadmp/wiki | 29d98ed0517e26d2e3f1c75f70852bc7512d87fc | [
"MIT"
] | null | null | null | test/support/conn_case.ex | deadmp/wiki | 29d98ed0517e26d2e3f1c75f70852bc7512d87fc | [
"MIT"
] | null | null | null | defmodule Wiki.ConnCase do
@moduledoc """
This module defines the test case to be used by
tests that require setting up a connection.
Such tests rely on `Phoenix.ConnTest` and also
import other functionality to make it easier
to build and query models.
Finally, if the test case interacts with the database,
it cannot be async. For this reason, every test runs
inside a transaction which is reset at the beginning
of the test unless the test case is marked as async.
"""
use ExUnit.CaseTemplate
using do
quote do
# Import conveniences for testing with connections
use Phoenix.ConnTest
alias Wiki.Repo
import Ecto
import Ecto.Changeset
import Ecto.Query
import Wiki.Router.Helpers
# The default endpoint for testing
@endpoint Wiki.Endpoint
end
end
setup tags do
:ok = Ecto.Adapters.SQL.Sandbox.checkout(Wiki.Repo)
unless tags[:async] do
Ecto.Adapters.SQL.Sandbox.mode(Wiki.Repo, {:shared, self()})
end
{:ok, conn: Phoenix.ConnTest.build_conn()}
end
end
| 23.955556 | 66 | 0.699443 |
08f749f690cda7f03f904919f4619c9e22825988 | 4,081 | ex | Elixir | clients/cloud_kms/lib/google_api/cloud_kms/v1/model/encrypt_response.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2021-12-20T03:40:53.000Z | 2021-12-20T03:40:53.000Z | clients/cloud_kms/lib/google_api/cloud_kms/v1/model/encrypt_response.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | 1 | 2020-08-18T00:11:23.000Z | 2020-08-18T00:44:16.000Z | clients/cloud_kms/lib/google_api/cloud_kms/v1/model/encrypt_response.ex | pojiro/elixir-google-api | 928496a017d3875a1929c6809d9221d79404b910 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.CloudKMS.V1.Model.EncryptResponse do
@moduledoc """
Response message for KeyManagementService.Encrypt.
## Attributes
* `ciphertext` (*type:* `String.t`, *default:* `nil`) - The encrypted data.
* `ciphertextCrc32c` (*type:* `String.t`, *default:* `nil`) - Integrity verification field. A CRC32C checksum of the returned EncryptResponse.ciphertext. An integrity check of EncryptResponse.ciphertext can be performed by computing the CRC32C checksum of EncryptResponse.ciphertext and comparing your results to this field. Discard the response in case of non-matching checksum values, and perform a limited number of retries. A persistent mismatch may indicate an issue in your computation of the CRC32C checksum. Note: This field is defined as int64 for reasons of compatibility across different languages. However, it is a non-negative integer, which will never exceed 2^32-1, and can be safely downconverted to uint32 in languages that support this type.
* `name` (*type:* `String.t`, *default:* `nil`) - The resource name of the CryptoKeyVersion used in encryption. Check this field to verify that the intended resource was used for encryption.
* `protectionLevel` (*type:* `String.t`, *default:* `nil`) - The ProtectionLevel of the CryptoKeyVersion used in encryption.
* `verifiedAdditionalAuthenticatedDataCrc32c` (*type:* `boolean()`, *default:* `nil`) - Integrity verification field. A flag indicating whether EncryptRequest.additional_authenticated_data_crc32c was received by KeyManagementService and used for the integrity verification of the AAD. A false value of this field indicates either that EncryptRequest.additional_authenticated_data_crc32c was left unset or that it was not delivered to KeyManagementService. If you've set EncryptRequest.additional_authenticated_data_crc32c but this field is still false, discard the response and perform a limited number of retries.
* `verifiedPlaintextCrc32c` (*type:* `boolean()`, *default:* `nil`) - Integrity verification field. A flag indicating whether EncryptRequest.plaintext_crc32c was received by KeyManagementService and used for the integrity verification of the plaintext. A false value of this field indicates either that EncryptRequest.plaintext_crc32c was left unset or that it was not delivered to KeyManagementService. If you've set EncryptRequest.plaintext_crc32c but this field is still false, discard the response and perform a limited number of retries.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:ciphertext => String.t() | nil,
:ciphertextCrc32c => String.t() | nil,
:name => String.t() | nil,
:protectionLevel => String.t() | nil,
:verifiedAdditionalAuthenticatedDataCrc32c => boolean() | nil,
:verifiedPlaintextCrc32c => boolean() | nil
}
field(:ciphertext)
field(:ciphertextCrc32c)
field(:name)
field(:protectionLevel)
field(:verifiedAdditionalAuthenticatedDataCrc32c)
field(:verifiedPlaintextCrc32c)
end
defimpl Poison.Decoder, for: GoogleApi.CloudKMS.V1.Model.EncryptResponse do
def decode(value, options) do
GoogleApi.CloudKMS.V1.Model.EncryptResponse.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.CloudKMS.V1.Model.EncryptResponse do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 65.822581 | 763 | 0.766234 |
08f7580419bca9b92b950d5b1d44589c9afd1c3c | 1,025 | exs | Elixir | config/config.exs | tinfoil/ex_json_schema | 6836d531c9adcd6abfa3a5e296d3da977f7b7fb1 | [
"MIT"
] | 301 | 2015-07-17T22:22:56.000Z | 2022-03-20T13:42:22.000Z | config/config.exs | tinfoil/ex_json_schema | 6836d531c9adcd6abfa3a5e296d3da977f7b7fb1 | [
"MIT"
] | 70 | 2015-09-30T21:19:43.000Z | 2022-02-03T10:23:07.000Z | config/config.exs | tinfoil/ex_json_schema | 6836d531c9adcd6abfa3a5e296d3da977f7b7fb1 | [
"MIT"
] | 100 | 2015-09-16T11:58:15.000Z | 2022-01-31T19:09:32.000Z | # This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for third-
# party users, it should be done in your mix.exs file.
# Sample configuration:
#
# config :logger, :console,
# level: :info,
# format: "$date $time [$level] $metadata$message\n",
# metadata: [:user_id]
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
import_config "#{Mix.env()}.exs"
| 41 | 73 | 0.746341 |
08f768a9b695aae045f2b8a07cb77fbcc65542f3 | 1,475 | exs | Elixir | episode04/my_app/mix.exs | paulfioravanti/learn_phoenix | 3767f28b09bb5e740231dd261a0bfa8b3eea98d3 | [
"MIT"
] | null | null | null | episode04/my_app/mix.exs | paulfioravanti/learn_phoenix | 3767f28b09bb5e740231dd261a0bfa8b3eea98d3 | [
"MIT"
] | null | null | null | episode04/my_app/mix.exs | paulfioravanti/learn_phoenix | 3767f28b09bb5e740231dd261a0bfa8b3eea98d3 | [
"MIT"
] | null | null | null | defmodule MyApp.Mixfile do
use Mix.Project
def project do
[app: :my_app,
version: "0.0.1",
elixir: "~> 1.0",
elixirc_paths: elixirc_paths(Mix.env),
compilers: [:phoenix] ++ Mix.compilers,
build_embedded: Mix.env == :prod,
start_permanent: Mix.env == :prod,
aliases: aliases,
deps: deps]
end
# Configuration for the OTP application.
#
# Type `mix help compile.app` for more information.
def application do
[mod: {MyApp, []},
applications: [:phoenix, :phoenix_html, :cowboy, :logger,
:phoenix_ecto, :postgrex]]
end
# Specifies which paths to compile per environment.
defp elixirc_paths(:test), do: ["lib", "web", "test/support"]
defp elixirc_paths(_), do: ["lib", "web"]
# Specifies your project dependencies.
#
# Type `mix help deps` for examples and options.
defp deps do
[{:phoenix, "~> 1.0.3"},
{:phoenix_ecto, "~> 1.1"},
{:postgrex, ">= 0.0.0"},
{:phoenix_html, "~> 2.1"},
{:phoenix_live_reload, "~> 1.0", only: :dev},
{:cowboy, "~> 1.0"}]
end
# Aliases are shortcut or tasks specific to the current project.
# For example, to create, migrate and run the seeds file at once:
#
# $ mix ecto.setup
#
# See the documentation for `Mix` for more info on aliases.
defp aliases do
["ecto.setup": ["ecto.create", "ecto.migrate", "run priv/repo/seeds.exs"],
"ecto.reset": ["ecto.drop", "ecto.setup"]]
end
end
| 28.365385 | 78 | 0.608136 |
08f782f7098f396e70f8409cd94d17857ffc4bd5 | 1,033 | ex | Elixir | elixir/lib/day06.ex | dimspith/aoc2021 | c26cf4b07e85ab140d294e9f2816ea84396c8887 | [
"Unlicense"
] | 1 | 2021-12-08T18:23:55.000Z | 2021-12-08T18:23:55.000Z | elixir/lib/day06.ex | dimspith/aoc2021 | c26cf4b07e85ab140d294e9f2816ea84396c8887 | [
"Unlicense"
] | null | null | null | elixir/lib/day06.ex | dimspith/aoc2021 | c26cf4b07e85ab140d294e9f2816ea84396c8887 | [
"Unlicense"
] | null | null | null | defmodule Day06 do
def get_input(file) do
File.read!(file)
|> String.split(",", trim: true)
|> Enum.map(&String.to_integer/1)
end
def insert_missing_keys(map) do
Enum.reduce(0..8, map, fn x, acc ->
acc |> Map.put_new(x, 0)
end)
end
  # Simulates `days` days of lanternfish reproduction. Instead of tracking
  # individual fish, we keep a map of timer value (0..8) => fish count and
  # shift the counts down one slot per day.
  def simulate(input, days) do
    input
    |> Enum.frequencies()
    |> insert_missing_keys
    |> (fn x ->
          Enum.reduce(1..days, x, fn _x, acc ->
            with days <- Map.keys(acc),
                 fish <- Map.values(acc),
                 new_fish <- hd(fish) do
              # Fish with timer 0 each spawn a new fish (timer 8) and
              # restart their own timer at 6.
              fish
              |> tl
              |> Kernel.++([new_fish])
              |> List.update_at(6, &(&1 + new_fish))
              |> (fn x -> Enum.zip(days, x) end).()
              |> Map.new()
            end
          end)
        end).()
  end
def part1 do
get_input("priv/day06")
|> simulate(80)
|> Map.values()
|> Enum.sum()
end
def part2 do
get_input("priv/day06")
|> simulate(256)
|> Map.values()
|> Enum.sum()
end
end
| 21.520833 | 52 | 0.479187 |
08f7cad0943f1428e4fc9c3331822629af45093e | 158 | exs | Elixir | .formatter.exs | parikshitgupta1/k8 | 1df84e3f0c37c59c27061c75408b2909e7e44ff9 | [
"MIT"
] | null | null | null | .formatter.exs | parikshitgupta1/k8 | 1df84e3f0c37c59c27061c75408b2909e7e44ff9 | [
"MIT"
] | null | null | null | .formatter.exs | parikshitgupta1/k8 | 1df84e3f0c37c59c27061c75408b2909e7e44ff9 | [
"MIT"
] | null | null | null | [
import_deps: [:ecto, :phoenix],
inputs: ["*.{ex,exs}", "priv/*/seeds.exs", "{config,lib,test}/**/*.{ex,exs}"],
subdirectories: ["priv/*/migrations"]
] | 31.6 | 80 | 0.575949 |
08f7cce9ee413e21b8c4b5837cdfb165f390a888 | 655 | ex | Elixir | lib/monet/application.ex | neuralvis/monet | 500cc257b39f8aafd7fae9a3025fd8c1b20b76ef | [
"MIT"
] | null | null | null | lib/monet/application.ex | neuralvis/monet | 500cc257b39f8aafd7fae9a3025fd8c1b20b76ef | [
"MIT"
] | null | null | null | lib/monet/application.ex | neuralvis/monet | 500cc257b39f8aafd7fae9a3025fd8c1b20b76ef | [
"MIT"
] | null | null | null | defmodule Monet.Application do
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
@moduledoc false
use Application
@impl true
def start(_type, _args) do
children = [
# Starts a worker by calling: Monet.Worker.start_link(arg)
# {Monet.Worker, arg}
# {Monet.SimpleQueue, [1, 2, 3]},
{Monet.Ping, 10},
{Task.Supervisor, name: Monet.TaskSupervisor }
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Monet.Supervisor]
Supervisor.start_link(children, opts)
end
end
| 27.291667 | 64 | 0.679389 |
08f7e1371eabc12d7d54dd5c6109673d8d393a66 | 198 | ex | Elixir | lib/ash_pow.ex | ash-project/ash_pow | 95bc41d6ba0dd8d6cde62769f3a3811c434f9b0a | [
"MIT"
] | 1 | 2021-12-10T20:29:16.000Z | 2021-12-10T20:29:16.000Z | lib/ash_pow.ex | ash-project/ash_pow | 95bc41d6ba0dd8d6cde62769f3a3811c434f9b0a | [
"MIT"
] | null | null | null | lib/ash_pow.ex | ash-project/ash_pow | 95bc41d6ba0dd8d6cde62769f3a3811c434f9b0a | [
"MIT"
] | null | null | null | defmodule AshPow do
@moduledoc """
Documentation for `AshPow`.
"""
@doc """
Hello world.
## Examples
iex> AshPow.hello()
:world
"""
def hello do
:world
end
end
| 10.421053 | 29 | 0.550505 |
08f7e98679dbfe299f99e9b29b30388095f14c4a | 2,176 | exs | Elixir | apps/api/test/api_web/controllers/user_controller_test.exs | michaeljguarino/forge | 50ee583ecb4aad5dee4ef08fce29a8eaed1a0824 | [
"Apache-2.0"
] | 59 | 2021-09-16T19:29:39.000Z | 2022-03-31T20:44:24.000Z | apps/api/test/api_web/controllers/user_controller_test.exs | svilenkov/plural | ac6c6cc15ac4b66a3b5e32ed4a7bee4d46d1f026 | [
"Apache-2.0"
] | 111 | 2021-08-15T09:56:37.000Z | 2022-03-31T23:59:32.000Z | apps/api/test/api_web/controllers/user_controller_test.exs | svilenkov/plural | ac6c6cc15ac4b66a3b5e32ed4a7bee4d46d1f026 | [
"Apache-2.0"
] | 4 | 2021-12-13T09:43:01.000Z | 2022-03-29T18:08:44.000Z | defmodule ApiWeb.UserControllerTest do
use ApiWeb.ConnCase, async: true
alias Core.Services.Users
describe "#signup" do
test "It will create a new user", %{conn: conn} do
path = Routes.user_path(conn, :create)
result =
conn
|> post(path, %{email: "someone@example.com", name: "Some User", password: "verystrongpwd"})
|> assert_header("authorization", &valid_bearer_token/1)
|> json_response(200)
assert result["email"] == "someone@example.com"
refute result["password"]
end
end
describe "#login" do
test "A user can log in", %{conn: conn} do
path = Routes.user_path(conn, :login)
{:ok, _} = Core.Services.Users.create_user(%{
email: "someone@example.com",
name: "Some User",
password: "verystrongpwd"
})
result =
conn
|> post(path, %{email: "someone@example.com", password: "verystrongpwd"})
|> assert_header("authorization", &valid_bearer_token/1)
|> json_response(200)
assert result["email"] == "someone@example.com"
refute result["password"]
end
end
describe "#create_publisher" do
test "Users can create publishers", %{conn: conn} do
user = insert(:user)
path = Routes.user_path(conn, :create_publisher)
result =
conn
|> authorized(user)
|> post(path, %{"name" => "Publisher"})
|> json_response(200)
assert result["name"] == "Publisher"
assert result["owner_id"] == user.id
end
end
describe "#me" do
test "It can authorize with persisted tokens", %{conn: conn} do
user = insert(:user)
{:ok, token} = Users.create_persisted_token(user)
path = Routes.user_path(conn, :me)
conn
|> Plug.Conn.put_req_header("authorization", "Bearer #{token.token}")
|> get(path)
|> json_response(200)
assert_receive {:event, %Core.PubSub.AccessTokenUsage{item: found}}
assert found.id == token.id
end
end
defp valid_bearer_token("Bearer " <> token) do
case Core.Guardian.decode_and_verify(token) do
{:ok, _} -> true
_ -> false
end
end
end
| 27.2 | 100 | 0.606618 |
08f7f3fa5a3ad085add259a11c00a52edecc8f43 | 61 | exs | Elixir | test/csvex_test.exs | martin-torhage/csvex | ddf01ca2a8d016654ecd8f0c12e995ec300ce21b | [
"MIT"
] | null | null | null | test/csvex_test.exs | martin-torhage/csvex | ddf01ca2a8d016654ecd8f0c12e995ec300ce21b | [
"MIT"
] | null | null | null | test/csvex_test.exs | martin-torhage/csvex | ddf01ca2a8d016654ecd8f0c12e995ec300ce21b | [
"MIT"
] | null | null | null | defmodule CsvexTest do
use ExUnit.Case
doctest Csvex
end
| 12.2 | 22 | 0.786885 |
08f7f45f5e49da7aa6de1494a4f83e9e53645331 | 238 | ex | Elixir | lib/sass_ex/response.ex | montebrown/sass_ex | 43aae4a8134a8ec418023d7a309036ba7e1debfe | [
"MIT"
] | null | null | null | lib/sass_ex/response.ex | montebrown/sass_ex | 43aae4a8134a8ec418023d7a309036ba7e1debfe | [
"MIT"
] | 1 | 2021-12-15T22:56:26.000Z | 2021-12-29T16:55:46.000Z | lib/sass_ex/response.ex | montebrown/sass_ex | 43aae4a8134a8ec418023d7a309036ba7e1debfe | [
"MIT"
] | 1 | 2021-12-11T22:47:02.000Z | 2021-12-11T22:47:02.000Z | defmodule SassEx.Response do
@moduledoc """
Contains the response from compiling the Sass file
"""
defstruct [:css, :source_map]
@type t :: %__MODULE__{
css: String.t(),
source_map: String.t()
}
end
| 19.833333 | 52 | 0.605042 |
08f7f5eb14ebfb48dadacbfa655b71cdeb07f1c6 | 1,667 | exs | Elixir | mix.exs | evanob/ExQuickBooks | 8c0f64dd658b1a6edfaa338e0cb62b95b9a853e2 | [
"0BSD"
] | null | null | null | mix.exs | evanob/ExQuickBooks | 8c0f64dd658b1a6edfaa338e0cb62b95b9a853e2 | [
"0BSD"
] | null | null | null | mix.exs | evanob/ExQuickBooks | 8c0f64dd658b1a6edfaa338e0cb62b95b9a853e2 | [
"0BSD"
] | null | null | null | defmodule ExQuickBooks.Mixfile do
use Mix.Project
def project do
[
app: :exquickbooks,
version: "0.9.0",
elixir: "~> 1.12",
# Compilation
elixirc_paths: elixirc_paths(Mix.env()),
build_embedded: Mix.env() == :prod,
start_permanent: Mix.env() == :prod,
# Packaging
name: "ExQuickBooks",
description: description(),
package: package(),
deps: deps(),
docs: docs(),
# Test coverage
test_coverage: [tool: ExCoveralls],
preferred_cli_env: [coveralls: :test]
]
end
# Configuration for the OTP application
#
# Type "mix help compile.app" for more information
def application do
[env: env(), extra_applications: [:logger, :httpoison]]
end
defp env do
[backend: ExQuickBooks.Backend.HTTPoison]
end
defp description do
"""
QuickBooks Online API client for Elixir.
"""
end
defp package do
[
licenses: ["ISC"],
maintainers: ["Evan O'Brien"],
links: %{"GitHub" => "https://github.com/evanob/ExQuickBooks"}
]
end
defp docs do
[main: ExQuickBooks]
end
# Dependencies can be Hex packages:
#
# {:my_dep, "~> 0.3.0"}
#
# Or git/path repositories:
#
# {:my_dep, git: "https://github.com/elixir-lang/my_dep.git", tag: "0.1.0"}
#
# Type "mix help deps" for more examples and options
defp deps do
[
{:httpoison, "~> 1.8"},
{:jason, "~> 1.0"},
{:ex_doc, "~> 0.15", only: :dev, runtime: false},
{:excoveralls, "~> 0.6", only: :test}
]
end
defp elixirc_paths(:test), do: ["lib", "test/support"]
defp elixirc_paths(_), do: ["lib"]
end
| 21.371795 | 79 | 0.582484 |
08f800179a9cfb65dad33ce8ca79885bf443c7ae | 1,691 | ex | Elixir | lib/ex_chat_web/controllers/user_controller.ex | embik/ex_chat | 08d83fe2076a96f9dad647fe509daec301b4965a | [
"Apache-2.0"
] | 1 | 2017-12-28T12:49:19.000Z | 2017-12-28T12:49:19.000Z | lib/ex_chat_web/controllers/user_controller.ex | embik/ex_chat | 08d83fe2076a96f9dad647fe509daec301b4965a | [
"Apache-2.0"
] | null | null | null | lib/ex_chat_web/controllers/user_controller.ex | embik/ex_chat | 08d83fe2076a96f9dad647fe509daec301b4965a | [
"Apache-2.0"
] | null | null | null | defmodule ExChatWeb.UserController do
use ExChatWeb, :controller
alias ExChat.Accounts
alias ExChat.Accounts.User
def index(conn, _params) do
users = Accounts.list_users()
render(conn, "index.html", users: users)
end
def new(conn, _params) do
changeset = Accounts.change_user(%User{})
render(conn, "new.html", changeset: changeset)
end
def create(conn, %{"user" => user_params}) do
case Accounts.create_user(user_params) do
{:ok, user} ->
conn
|> put_flash(:info, "User created successfully.")
|> redirect(to: user_path(conn, :show, user))
{:error, %Ecto.Changeset{} = changeset} ->
render(conn, "new.html", changeset: changeset)
end
end
def show(conn, %{"id" => id}) do
user = Accounts.get_user!(id)
render(conn, "show.html", user: user)
end
def edit(conn, %{"id" => id}) do
user = Accounts.get_user!(id)
changeset = Accounts.change_user(user)
render(conn, "edit.html", user: user, changeset: changeset)
end
def update(conn, %{"id" => id, "user" => user_params}) do
user = Accounts.get_user!(id)
case Accounts.update_user(user, user_params) do
{:ok, user} ->
conn
|> put_flash(:info, "User updated successfully.")
|> redirect(to: user_path(conn, :show, user))
{:error, %Ecto.Changeset{} = changeset} ->
render(conn, "edit.html", user: user, changeset: changeset)
end
end
def delete(conn, %{"id" => id}) do
user = Accounts.get_user!(id)
{:ok, _user} = Accounts.delete_user(user)
conn
|> put_flash(:info, "User deleted successfully.")
|> redirect(to: user_path(conn, :index))
end
end
| 27.721311 | 67 | 0.623891 |
08f805240c9955ea39e43b5bfd341ce7cac8a722 | 1,239 | exs | Elixir | postgres/sample/config/prod.secret.exs | penqen/phenix_db_samples | 4ae7211abb3cf66a120a78727641b3963a9aafe3 | [
"MIT"
] | null | null | null | postgres/sample/config/prod.secret.exs | penqen/phenix_db_samples | 4ae7211abb3cf66a120a78727641b3963a9aafe3 | [
"MIT"
] | null | null | null | postgres/sample/config/prod.secret.exs | penqen/phenix_db_samples | 4ae7211abb3cf66a120a78727641b3963a9aafe3 | [
"MIT"
] | null | null | null | # In this file, we load production configuration and secrets
# from environment variables. You can also hardcode secrets,
# although such is generally not recommended and you have to
# remember to add this file to your .gitignore.
use Mix.Config
database_url =
System.get_env("DATABASE_URL") ||
raise """
environment variable DATABASE_URL is missing.
For example: ecto://USER:PASS@HOST/DATABASE
"""
config :sample, Sample.Repo,
# ssl: true,
url: database_url,
pool_size: String.to_integer(System.get_env("POOL_SIZE") || "10")
secret_key_base =
System.get_env("SECRET_KEY_BASE") ||
raise """
environment variable SECRET_KEY_BASE is missing.
You can generate one by calling: mix phx.gen.secret
"""
config :sample, SampleWeb.Endpoint,
http: [
port: String.to_integer(System.get_env("PORT") || "4000"),
transport_options: [socket_opts: [:inet6]]
],
secret_key_base: secret_key_base
# ## Using releases (Elixir v1.9+)
#
# If you are doing OTP releases, you need to instruct Phoenix
# to start each relevant endpoint:
#
# config :sample, SampleWeb.Endpoint, server: true
#
# Then you can assemble a release by calling `mix release`.
# See `mix help release` for more information.
| 29.5 | 67 | 0.717514 |
08f80d635648ec2ab761edff9c6956d83813fb8c | 724 | exs | Elixir | apps/forklift/config/config.exs | AWHServiceAccount/smartcitiesdata | 6957afac12809288640b6ba6b576c3016e6033d7 | [
"Apache-2.0"
] | 1 | 2020-03-18T21:14:39.000Z | 2020-03-18T21:14:39.000Z | apps/forklift/config/config.exs | AWHServiceAccount/smartcitiesdata | 6957afac12809288640b6ba6b576c3016e6033d7 | [
"Apache-2.0"
] | null | null | null | apps/forklift/config/config.exs | AWHServiceAccount/smartcitiesdata | 6957afac12809288640b6ba6b576c3016e6033d7 | [
"Apache-2.0"
] | null | null | null | use Mix.Config
input_topic_prefix = "transformed"
config :forklift,
data_reader: Pipeline.Reader.DatasetTopicReader,
topic_writer: Pipeline.Writer.TopicWriter,
table_writer: Pipeline.Writer.S3Writer,
retry_count: 100,
retry_initial_delay: 100,
retry_max_wait: 1_000 * 60 * 60,
cache_processing_batch_size: 1_000,
message_processing_cadence: 10_000,
number_of_empty_reads_to_delete: 50,
input_topic_prefix: input_topic_prefix,
collector: StreamingMetrics.PrometheusMetricCollector
config :logger,
backends: [:console],
level: :info,
compile_time_purge_level: :debug
import_config "#{Mix.env()}.exs"
config :prestige, :session_opts,
catalog: "hive",
schema: "default",
user: "carpenter"
| 24.965517 | 55 | 0.774862 |
08f82dfd12918b0b7f16eb8463d3f555a8897b88 | 354 | exs | Elixir | priv/repo/migrations/20190130185547_create_tags_users.exs | jmhthethird/assoc | d73c556f398d01258cbc17b6de58a621a6bc4397 | [
"MIT"
] | 6 | 2019-01-31T23:31:42.000Z | 2020-10-06T20:05:34.000Z | priv/repo/migrations/20190130185547_create_tags_users.exs | jmhthethird/assoc | d73c556f398d01258cbc17b6de58a621a6bc4397 | [
"MIT"
] | 2 | 2021-11-09T14:35:51.000Z | 2021-11-09T18:04:15.000Z | priv/repo/migrations/20190130185547_create_tags_users.exs | jmhthethird/assoc | d73c556f398d01258cbc17b6de58a621a6bc4397 | [
"MIT"
] | 2 | 2019-01-31T23:53:33.000Z | 2021-11-05T21:47:26.000Z | defmodule Assoc.Test.Repo.Migrations.CreateTagsUsers do
use Ecto.Migration
def change do
create table(:tags_users) do
add :tag_id, references(:tags, on_delete: :delete_all), null: false
add :user_id, references(:users, on_delete: :delete_all), null: false
end
create unique_index(:tags_users, [:tag_id, :user_id])
end
end
| 27.230769 | 75 | 0.714689 |
08f83adb4b87772bf9e42fa28ecf25ab804d7d46 | 1,276 | exs | Elixir | mix.exs | moogle19/bandit | 610aebaac2ddf76d53faac109b07e519727affb6 | [
"MIT"
] | null | null | null | mix.exs | moogle19/bandit | 610aebaac2ddf76d53faac109b07e519727affb6 | [
"MIT"
] | null | null | null | mix.exs | moogle19/bandit | 610aebaac2ddf76d53faac109b07e519727affb6 | [
"MIT"
] | null | null | null | defmodule Bandit.MixProject do
use Mix.Project
def project do
[
app: :bandit,
version: "0.5.0",
elixir: "~> 1.9",
start_permanent: Mix.env() == :prod,
deps: deps(),
dialyzer: dialyzer(),
name: "Bandit",
description: "A pure-Elixir HTTP server built for Plug apps",
source_url: "https://github.com/mtrudel/bandit",
package: [
files: ["lib", "test", "mix.exs", "README*", "LICENSE*"],
maintainers: ["Mat Trudel"],
licenses: ["MIT"],
links: %{"GitHub" => "https://github.com/mtrudel/bandit"}
],
docs: docs()
]
end
def application do
[extra_applications: [:logger]]
end
defp deps do
[
{:thousand_island, "~> 0.5.7"},
{:plug, "~> 1.12"},
{:hpax, "~> 0.1.1"},
{:finch, "~> 0.8", only: [:dev, :test]},
{:ex_doc, "~> 0.24", only: [:dev, :test], runtime: false},
{:dialyxir, "~> 1.0", only: [:dev, :test], runtime: false},
{:credo, "~> 1.0", only: [:dev, :test], runtime: false},
{:mix_test_watch, "~> 1.0", only: :dev, runtime: false}
]
end
defp dialyzer do
[plt_core_path: "priv/plts", plt_file: {:no_warn, "priv/plts/dialyzer.plt"}]
end
defp docs do
[main: "Bandit"]
end
end
| 25.52 | 80 | 0.528997 |
08f83da12fe74e673d6900074c09f79792dd4044 | 484 | exs | Elixir | config/test.exs | alex2chan/Phoenix-JWT-Auth-API | 5313b41bb590db4c12bdc16f624c11a035d4d692 | [
"MIT"
] | 40 | 2018-05-20T21:30:30.000Z | 2021-09-10T07:25:44.000Z | config/test.exs | alex2chan/Phoenix-JWT-Auth-API | 5313b41bb590db4c12bdc16f624c11a035d4d692 | [
"MIT"
] | 3 | 2020-09-07T10:53:53.000Z | 2022-02-12T21:45:34.000Z | config/test.exs | alex2chan/Phoenix-JWT-Auth-API | 5313b41bb590db4c12bdc16f624c11a035d4d692 | [
"MIT"
] | 15 | 2018-06-07T08:16:20.000Z | 2020-04-12T07:38:05.000Z | use Mix.Config
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :myApi, MyApiWeb.Endpoint,
http: [port: 4001],
server: false
# Print only warnings and errors during test
config :logger, level: :warn
# Configure your database
config :myApi, MyApi.Repo,
adapter: Ecto.Adapters.Postgres,
username: "postgres",
password: "postgres",
database: "myapi_test",
hostname: "localhost",
pool: Ecto.Adapters.SQL.Sandbox
| 24.2 | 56 | 0.729339 |
08f84d81c113f4f016c447acf28cb9b490367bfc | 116 | exs | Elixir | config/config.exs | fimassuda/web_driver_client | 09d373c9a8a923c5e2860f107f84b16565e338f7 | [
"MIT"
] | 8 | 2019-11-24T18:33:12.000Z | 2020-12-09T10:20:09.000Z | config/config.exs | fimassuda/web_driver_client | 09d373c9a8a923c5e2860f107f84b16565e338f7 | [
"MIT"
] | 67 | 2019-12-20T16:33:30.000Z | 2021-09-14T03:50:10.000Z | config/config.exs | fimassuda/web_driver_client | 09d373c9a8a923c5e2860f107f84b16565e338f7 | [
"MIT"
] | 10 | 2020-06-19T16:15:03.000Z | 2021-09-13T17:56:25.000Z | use Mix.Config
if File.exists?(Path.join([__DIR__, "#{Mix.env()}.exs"])) do
import_config "#{Mix.env()}.exs"
end
| 19.333333 | 60 | 0.646552 |
08f85216530fa001576e30a13ffee5df06fafbd4 | 12,561 | ex | Elixir | lib/commanded/process_managers/process_manager_instance.ex | SimpleBet/commanded | dc89737bd22daf4f6c5b3333b5d8d8de47fea5b8 | [
"MIT"
] | 1 | 2020-03-09T11:50:38.000Z | 2020-03-09T11:50:38.000Z | lib/commanded/process_managers/process_manager_instance.ex | perzanko/commanded | fd18ee3981cd237cbb874d1ccd8155e98d35d178 | [
"MIT"
] | null | null | null | lib/commanded/process_managers/process_manager_instance.ex | perzanko/commanded | fd18ee3981cd237cbb874d1ccd8155e98d35d178 | [
"MIT"
] | null | null | null | defmodule Commanded.ProcessManagers.ProcessManagerInstance do
@moduledoc false
use GenServer, restart: :temporary
require Logger
alias Commanded.ProcessManagers.{ProcessRouter, FailureContext}
alias Commanded.EventStore
alias Commanded.EventStore.{RecordedEvent, SnapshotData}
defmodule State do
@moduledoc false
defstruct [
:application,
:idle_timeout,
:process_router,
:process_manager_name,
:process_manager_module,
:process_uuid,
:process_state,
:last_seen_event
]
end
def start_link(opts) do
process_manager_module = Keyword.fetch!(opts, :process_manager_module)
state = %State{
application: Keyword.fetch!(opts, :application),
idle_timeout: Keyword.fetch!(opts, :idle_timeout),
process_router: Keyword.fetch!(opts, :process_router),
process_manager_name: Keyword.fetch!(opts, :process_manager_name),
process_manager_module: process_manager_module,
process_uuid: Keyword.fetch!(opts, :process_uuid),
process_state: struct(process_manager_module)
}
GenServer.start_link(__MODULE__, state)
end
@doc """
  Checks whether or not the process manager has already processed events.
"""
def new?(process_manager) do
GenServer.call(process_manager, :new?)
end
@doc """
  Handle the given event by delegating to the process manager module.
"""
def process_event(process_manager, %RecordedEvent{} = event) do
GenServer.cast(process_manager, {:process_event, event})
end
@doc """
Stop the given process manager and delete its persisted state.
Typically called when it has reached its final state.
"""
def stop(process_manager) do
GenServer.call(process_manager, :stop)
end
@doc """
Fetch the process state of this instance
"""
def process_state(process_manager) do
GenServer.call(process_manager, :process_state)
end
@doc false
@impl GenServer
def init(%State{} = state) do
{:ok, state, {:continue, :fetch_state}}
end
@doc """
  Attempt to fetch initial process state from snapshot storage.
"""
@impl GenServer
def handle_continue(:fetch_state, %State{} = state) do
%State{application: application} = state
state =
case EventStore.read_snapshot(application, process_state_uuid(state)) do
{:ok, snapshot} ->
%State{
state
| process_state: snapshot.data,
last_seen_event: snapshot.source_version
}
{:error, :snapshot_not_found} ->
state
end
{:noreply, state}
end
@doc false
@impl GenServer
def handle_call(:stop, _from, %State{} = state) do
:ok = delete_state(state)
# Stop the process with a normal reason
{:stop, :normal, :ok, state}
end
@doc false
@impl GenServer
def handle_call(:process_state, _from, %State{} = state) do
%State{idle_timeout: idle_timeout, process_state: process_state} = state
{:reply, process_state, state, idle_timeout}
end
@doc false
@impl GenServer
def handle_call(:new?, _from, %State{} = state) do
%State{idle_timeout: idle_timeout, last_seen_event: last_seen_event} = state
{:reply, is_nil(last_seen_event), state, idle_timeout}
end
@doc """
  Handle the given event, using the process manager module, against the current process state.
"""
@impl GenServer
def handle_cast({:process_event, event}, %State{} = state) do
case event_already_seen?(event, state) do
true -> process_seen_event(event, state)
false -> process_unseen_event(event, state)
end
end
@doc false
@impl GenServer
def handle_info(:timeout, %State{} = state) do
Logger.debug(fn -> describe(state) <> " stopping due to inactivity timeout" end)
{:stop, :normal, state}
end
defp event_already_seen?(%RecordedEvent{}, %State{last_seen_event: nil}),
do: false
defp event_already_seen?(%RecordedEvent{} = event, %State{} = state) do
%RecordedEvent{event_number: event_number} = event
%State{last_seen_event: last_seen_event} = state
event_number <= last_seen_event
end
# Already seen event, so just ack
defp process_seen_event(%RecordedEvent{} = event, %State{} = state) do
%State{idle_timeout: idle_timeout} = state
:ok = ack_event(event, state)
{:noreply, state, idle_timeout}
end
defp process_unseen_event(event, %State{} = state, context \\ %{}) do
%RecordedEvent{correlation_id: correlation_id, event_id: event_id, event_number: event_number} =
event
%State{application: application, idle_timeout: idle_timeout} = state
case handle_event(event, state) do
{:error, error} ->
Logger.error(fn ->
describe(state) <>
" failed to handle event #{inspect(event_number)} due to: #{inspect(error)}"
end)
handle_event_error({:error, error}, event, state, context)
{:stop, _error, _state} = reply ->
reply
commands ->
# Copy event id, as causation id, and correlation id from handled event.
opts = [application: application, causation_id: event_id, correlation_id: correlation_id]
with :ok <- commands |> List.wrap() |> dispatch_commands(opts, state, event) do
process_state = mutate_state(event, state)
state = %State{
state
| process_state: process_state,
last_seen_event: event_number
}
:ok = persist_state(event_number, state)
:ok = ack_event(event, state)
{:noreply, state, idle_timeout}
else
{:stop, reason} ->
{:stop, reason, state}
end
end
end
# Process instance is given the event and returns applicable commands
# (may be none, one or many).
defp handle_event(%RecordedEvent{} = event, %State{} = state) do
%RecordedEvent{data: data} = event
%State{
process_manager_module: process_manager_module,
process_state: process_state
} = state
try do
process_manager_module.handle(process_state, data)
rescue
e -> {:error, e}
end
end
defp handle_event_error(error, failed_event, state, context) do
%RecordedEvent{data: data} = failed_event
%State{
idle_timeout: idle_timeout,
process_manager_module: process_manager_module
} = state
failure_context = %FailureContext{
pending_commands: [],
process_manager_state: state,
last_event: failed_event,
context: context
}
case process_manager_module.error(error, data, failure_context) do
{:retry, context} when is_map(context) ->
# Retry the failed event
Logger.info(fn -> describe(state) <> " is retrying failed event" end)
process_unseen_event(failed_event, state, context)
{:retry, delay, context} when is_map(context) and is_integer(delay) and delay >= 0 ->
# Retry the failed event after waiting for the given delay, in milliseconds
Logger.info(fn ->
describe(state) <> " is retrying failed event after #{inspect(delay)}ms"
end)
:timer.sleep(delay)
process_unseen_event(failed_event, state, context)
:skip ->
# Skip the failed event by confirming receipt
Logger.info(fn -> describe(state) <> " is skipping event" end)
:ok = ack_event(failed_event, state)
{:noreply, state, idle_timeout}
{:stop, error} ->
# Stop the process manager instance
Logger.warn(fn -> describe(state) <> " has requested to stop: #{inspect(error)}" end)
{:stop, error, state}
invalid ->
Logger.warn(fn ->
describe(state) <> " returned an invalid error response: #{inspect(invalid)}"
end)
# Stop process manager with original error
{:stop, error, state}
end
end
  # Update the process instance's state by applying the event.
defp mutate_state(%RecordedEvent{} = event, %State{} = state) do
%RecordedEvent{data: data} = event
%State{
process_manager_module: process_manager_module,
process_state: process_state
} = state
process_manager_module.apply(process_state, data)
end
defp dispatch_commands(commands, opts, state, last_event, context \\ %{})
defp dispatch_commands([], _opts, _state, _last_event, _context), do: :ok
defp dispatch_commands([command | pending_commands], opts, state, last_event, context) do
%State{application: application} = state
Logger.debug(fn ->
describe(state) <> " attempting to dispatch command: #{inspect(command)}"
end)
case application.dispatch(command, opts) do
:ok ->
dispatch_commands(pending_commands, opts, state, last_event)
# when include_execution_result is set to true, the dispatcher returns an :ok tuple
{:ok, _} ->
dispatch_commands(pending_commands, opts, state, last_event)
error ->
Logger.warn(fn ->
describe(state) <>
" failed to dispatch command #{inspect(command)} due to: #{inspect(error)}"
end)
failure_context = %FailureContext{
pending_commands: pending_commands,
process_manager_state: mutate_state(last_event, state),
last_event: last_event,
context: context
}
dispatch_failure(error, command, opts, state, failure_context)
end
end
defp dispatch_failure(error, failed_command, opts, state, failure_context) do
%State{process_manager_module: process_manager_module} = state
%FailureContext{pending_commands: pending_commands, last_event: last_event} = failure_context
case process_manager_module.error(error, failed_command, failure_context) do
{:continue, commands, context} when is_list(commands) ->
# continue dispatching the given commands
Logger.info(fn -> describe(state) <> " is continuing with modified command(s)" end)
dispatch_commands(commands, opts, state, last_event, context)
{:retry, context} ->
# retry the failed command immediately
Logger.info(fn -> describe(state) <> " is retrying failed command" end)
dispatch_commands([failed_command | pending_commands], opts, state, last_event, context)
{:retry, delay, context} when is_integer(delay) ->
# retry the failed command after waiting for the given delay, in milliseconds
Logger.info(fn ->
describe(state) <> " is retrying failed command after #{inspect(delay)}ms"
end)
:timer.sleep(delay)
dispatch_commands([failed_command | pending_commands], opts, state, last_event, context)
{:skip, :discard_pending} ->
# skip the failed command and discard any pending commands
Logger.info(fn ->
describe(state) <>
" is skipping event and #{length(pending_commands)} pending command(s)"
end)
:ok
{:skip, :continue_pending} ->
# skip the failed command, but continue dispatching any pending commands
Logger.info(fn -> describe(state) <> " is ignoring error dispatching command" end)
dispatch_commands(pending_commands, opts, state, last_event)
{:stop, reason} = reply ->
# stop process manager
Logger.warn(fn -> describe(state) <> " has requested to stop: #{inspect(reason)}" end)
reply
end
end
defp describe(%State{process_manager_module: process_manager_module}),
do: inspect(process_manager_module)
defp persist_state(source_version, %State{} = state) do
%State{
application: application,
process_manager_module: process_manager_module,
process_state: process_state
} = state
snapshot = %SnapshotData{
source_uuid: process_state_uuid(state),
source_version: source_version,
source_type: Atom.to_string(process_manager_module),
data: process_state
}
EventStore.record_snapshot(application, snapshot)
end
defp delete_state(%State{} = state) do
%State{application: application} = state
EventStore.delete_snapshot(application, process_state_uuid(state))
end
defp ack_event(%RecordedEvent{} = event, %State{} = state) do
%State{process_router: process_router} = state
ProcessRouter.ack_event(process_router, event, self())
end
defp process_state_uuid(%State{} = state) do
%State{
process_manager_name: process_manager_name,
process_uuid: process_uuid
} = state
inspect(process_manager_name) <> "-" <> inspect(process_uuid)
end
end
| 30.050239 | 100 | 0.6681 |
08f85d238b56d3fe39d93a07516b5166875d9843 | 1,129 | exs | Elixir | config/config.exs | jdollar/video_theater_elixir | 4f5899ed2987565613c4a6092bc8cf7eb0d18c06 | [
"MIT"
] | null | null | null | config/config.exs | jdollar/video_theater_elixir | 4f5899ed2987565613c4a6092bc8cf7eb0d18c06 | [
"MIT"
] | null | null | null | config/config.exs | jdollar/video_theater_elixir | 4f5899ed2987565613c4a6092bc8cf7eb0d18c06 | [
"MIT"
] | null | null | null | # This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure for your application as:
#
# config :video_theater, key: :value
#
# And access this configuration in your application as:
#
# Application.get_env(:video_theater, :key)
#
# Or configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env}.exs"
| 36.419355 | 73 | 0.753764 |
08f863d1533ed854fe27149c07ebfd2aea7fedbd | 1,989 | exs | Elixir | test/nebulex/cache/supervisor_test.exs | filipeherculano/nebulex | 8c16848dc0bef28267c1f0e941ce05005765652b | [
"MIT"
] | null | null | null | test/nebulex/cache/supervisor_test.exs | filipeherculano/nebulex | 8c16848dc0bef28267c1f0e941ce05005765652b | [
"MIT"
] | null | null | null | test/nebulex/cache/supervisor_test.exs | filipeherculano/nebulex | 8c16848dc0bef28267c1f0e941ce05005765652b | [
"MIT"
] | null | null | null | defmodule Nebulex.Cache.SupervisorTest do
use ExUnit.Case, async: true
defmodule MyCache do
use Nebulex.Cache,
otp_app: :nebulex,
adapter: Nebulex.Adapters.Local
end
setup do
:ok = Application.put_env(:nebulex, MyCache, n_shards: 2)
{:ok, pid} = MyCache.start_link()
:ok
on_exit(fn ->
:ok = Process.sleep(10)
if Process.alive?(pid), do: MyCache.stop(pid)
end)
end
test "fail on compile_config because missing otp_app" do
opts = [:nebulex, n_shards: 2, adapter: TestAdapter]
:ok = Application.put_env(:nebulex, MyCache, opts)
assert_raise ArgumentError, "expected otp_app: to be given as argument", fn ->
Nebulex.Cache.Supervisor.compile_config(MyCache, opts)
end
end
test "fail on compile_config because missing adapter" do
opts = [otp_app: :nebulex, n_shards: 2]
:ok = Application.put_env(:nebulex, MyCache, opts)
assert_raise ArgumentError, "expected adapter: to be given as argument", fn ->
Nebulex.Cache.Supervisor.compile_config(MyCache, opts)
end
end
test "fail on compile_config because adapter was not compiled" do
opts = [otp_app: :nebulex, n_shards: 2, adapter: TestAdapter]
:ok = Application.put_env(:nebulex, MyCache, opts)
msg = ~r"adapter TestAdapter was not compiled, ensure"
assert_raise ArgumentError, msg, fn ->
Nebulex.Cache.Supervisor.compile_config(MyCache, opts)
end
end
test "fail on compile_config because adapter error" do
opts = [otp_app: :nebulex, n_shards: 2]
:ok = Application.put_env(:nebulex, MyCache2, opts)
msg = "expected :adapter option given to Nebulex.Cache to list Nebulex.Adapter as a behaviour"
assert_raise ArgumentError, msg, fn ->
defmodule MyAdapter do
end
defmodule MyCache2 do
use Nebulex.Cache,
otp_app: :nebulex,
adapter: MyAdapter
end
Nebulex.Cache.Supervisor.compile_config(MyCache2, opts)
end
end
end
| 28.414286 | 98 | 0.684766 |
08f86e08ccdc5b69e07a21d627f751707a15302e | 166 | ex | Elixir | lib/bitcoin_simulator_web/controllers/transaction_controller.ex | sidharth-shridhar/Bitcoin-Miner-Simulation | 2789dc8fe5f65269789540f675fac682e431e518 | [
"MIT"
] | null | null | null | lib/bitcoin_simulator_web/controllers/transaction_controller.ex | sidharth-shridhar/Bitcoin-Miner-Simulation | 2789dc8fe5f65269789540f675fac682e431e518 | [
"MIT"
] | null | null | null | lib/bitcoin_simulator_web/controllers/transaction_controller.ex | sidharth-shridhar/Bitcoin-Miner-Simulation | 2789dc8fe5f65269789540f675fac682e431e518 | [
"MIT"
] | null | null | null | defmodule BitcoinSimulatorWeb.TransactionController do
use BitcoinSimulatorWeb, :controller
def index(conn, _params) do
render(conn, "index.html")
end
end
| 20.75 | 54 | 0.777108 |
08f8942b5eacfbabbd59cf5fb92bddbd0b31ea37 | 1,655 | exs | Elixir | test/lib/code_corps_web/controllers/password_controller_test.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 275 | 2015-06-23T00:20:51.000Z | 2021-08-19T16:17:37.000Z | test/lib/code_corps_web/controllers/password_controller_test.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 1,304 | 2015-06-26T02:11:54.000Z | 2019-12-12T21:08:00.000Z | test/lib/code_corps_web/controllers/password_controller_test.exs | fikape/code-corps-api | c21674b0b2a19fa26945c94268db8894420ca181 | [
"MIT"
] | 140 | 2016-01-01T18:19:47.000Z | 2020-11-22T06:24:47.000Z | defmodule CodeCorpsWeb.PasswordControllerTest do
@moduledoc false
use CodeCorpsWeb.ApiCase, resource_name: :password
use Bamboo.Test
alias CodeCorps.AuthToken
test "Unauthenticated - creates and renders resource when email is valid", %{conn: conn} do
user = insert(:user)
attrs = %{"email" => user.email}
conn = post conn, password_path(conn, :forgot_password), attrs
response = json_response(conn, 200)
assert response == %{ "email" => user.email }
%AuthToken{value: token} = Repo.get_by(AuthToken, user_id: user.id)
assert_delivered_email CodeCorps.Emails.ForgotPasswordEmail.create(user, token)
end
@tag :authenticated
test "Authenticated - creates and renders resource when email is valid and removes session", %{conn: conn} do
user = insert(:user)
attrs = %{"email" => user.email}
conn = post conn, password_path(conn, :forgot_password), attrs
response = json_response(conn, 200)
assert response == %{ "email" => user.email }
%AuthToken{value: token} = Repo.get_by(AuthToken, user_id: user.id)
assert_delivered_email CodeCorps.Emails.ForgotPasswordEmail.create(user, token)
refute CodeCorps.Guardian.Plug.authenticated?(conn)
end
test "does not create resource and renders 200 when email is invalid", %{conn: conn} do
user = insert(:user)
attrs = %{"email" => "random_email@gmail.com"}
conn = post conn, password_path(conn, :forgot_password), attrs
response = json_response(conn, 200)
assert response == %{ "email" => "random_email@gmail.com" }
refute_delivered_email CodeCorps.Emails.ForgotPasswordEmail.create(user, nil)
end
end
| 34.479167 | 111 | 0.712991 |
08f8991da09bdacf813045ce152378957ee4e662 | 406 | ex | Elixir | elixir/elixir-sips/samples/mailman_playground/lib/mailman_playground/emails.ex | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | [
"MIT"
] | 2 | 2015-12-09T02:16:51.000Z | 2021-07-26T22:53:43.000Z | elixir/elixir-sips/samples/mailman_playground/lib/mailman_playground/emails.ex | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | [
"MIT"
] | null | null | null | elixir/elixir-sips/samples/mailman_playground/lib/mailman_playground/emails.ex | afronski/playground-erlang | 6ac4b58b2fd717260c22a33284547d44a9b5038e | [
"MIT"
] | 1 | 2016-05-08T18:40:31.000Z | 2016-05-08T18:40:31.000Z | defmodule MailmanPlayground.Emails do
def testing_email do
%Mailman.Email{
subject: "Hey!",
from: "test@mailman.ex",
to: [ "afronski@gmail.com" ],
cc: [ "afronski@op.pl" ],
bcc: [],
data: [
name: "TestUser"
],
text: "Hello <%= name %>!",
html: """
<html>
<body>
<b>Hello <%= name %>!</b>
</body>
</html>
"""
}
end
end
| 17.652174 | 37 | 0.472906 |
08f8bba87c1462b47c51d3f33ce0837ac5d63543 | 555 | ex | Elixir | web/models/reward.ex | itmaster921/crowdfunding-api | 994d75d6c28356b45531b71251eff39a309d414b | [
"MIT"
] | 1 | 2020-05-24T15:18:36.000Z | 2020-05-24T15:18:36.000Z | web/models/reward.ex | itmaster921/crowdfunding-api | 994d75d6c28356b45531b71251eff39a309d414b | [
"MIT"
] | null | null | null | web/models/reward.ex | itmaster921/crowdfunding-api | 994d75d6c28356b45531b71251eff39a309d414b | [
"MIT"
] | null | null | null | defmodule CrowdfundingApi.Reward do
use CrowdfundingApi.Web, :model
schema "rewards" do
field :title, :string
field :description, :string
field :image_url, :string
field :amount, :integer
belongs_to :project, CrowdfundingApi.Project
timestamps()
end
@doc """
Builds a changeset based on the `struct` and `params`.
"""
def changeset(struct, params \\ %{}) do
struct
|> cast(params, [:title, :description, :image_url, :amount])
|> validate_required([:title, :description, :image_url, :amount])
end
end
| 24.130435 | 69 | 0.67027 |
08f8c75f328d2fc9d87ce1b753e96202f410a0ff | 1,121 | exs | Elixir | config/test.exs | praekeltfoundation/nurseconnect-companion | 9afaabaf3ae3e0123abcbd12e0a2073b681e9052 | [
"BSD-3-Clause"
] | 1 | 2018-10-10T18:20:22.000Z | 2018-10-10T18:20:22.000Z | config/test.exs | praekeltfoundation/nurseconnect-companion | 9afaabaf3ae3e0123abcbd12e0a2073b681e9052 | [
"BSD-3-Clause"
] | 23 | 2018-06-07T15:13:15.000Z | 2019-07-30T09:06:03.000Z | config/test.exs | praekeltfoundation/nurseconnect-companion | 9afaabaf3ae3e0123abcbd12e0a2073b681e9052 | [
"BSD-3-Clause"
] | null | null | null | use Mix.Config
# We don't run a server during test. If one is required,
# you can enable the server option below.
config :companion, CompanionWeb.Endpoint,
http: [port: 4001],
server: false
# Print only warnings and errors during test
config :logger, level: :warn
# Configure your database
config :companion, Companion.Repo,
adapter: Ecto.Adapters.Postgres,
username: "postgres",
password: "postgres",
database: "companion_test",
hostname: "localhost",
pool: Ecto.Adapters.SQL.Sandbox
config :ueberauth, Ueberauth,
providers: [
google: {Ueberauth.Strategy.Google, [hd: "example.org"]}
]
# Use Tesla mock adapter
config :tesla, adapter: Tesla.Mock
config :tesla, Tesla.Mock, json_engine: Poison
# Rapidpro config
config :companion, :rapidpro,
url: "http://rapidpro",
token: "rapidprotoken"
# Jembi config
config :companion, :openhim,
url: "http://openhim",
username: "user",
password: "pass"
# Whatsapp config
config :companion, :whatsapp,
url: "https://whatsapp",
username: "user",
password: "pass",
hsm_namespace: "hsm_namespace",
hsm_element_name: "hsm_element_name"
| 23.354167 | 60 | 0.719001 |
08f8c7842b6b692ca6ef6ac3bb6777155dcb075e | 3,489 | ex | Elixir | lib/yggdrasil/redis/application.ex | Robugroup/yggdrasil_redis | 6f4ea4f4efb6c77c66f078152b7e705fb6810971 | [
"MIT"
] | null | null | null | lib/yggdrasil/redis/application.ex | Robugroup/yggdrasil_redis | 6f4ea4f4efb6c77c66f078152b7e705fb6810971 | [
"MIT"
] | null | null | null | lib/yggdrasil/redis/application.ex | Robugroup/yggdrasil_redis | 6f4ea4f4efb6c77c66f078152b7e705fb6810971 | [
"MIT"
] | 1 | 2021-12-24T02:30:46.000Z | 2021-12-24T02:30:46.000Z | defmodule Yggdrasil.Redis.Application do
@moduledoc """
[](https://travis-ci.org/gmtprime/yggdrasil_redis) [](https://hex.pm/packages/yggdrasil_redis) [](https://hex.pm/packages/yggdrasil_redis)
This project is a Redis adapter for `Yggdrasil` publisher/subscriber.
## Small example
  The following example uses the Redis adapter to distribute messages:
```elixir
iex(1)> channel = %Yggdrasil.Channel{name: "some_channel", adapter: :redis}
iex(2)> Yggdrasil.subscribe(channel)
iex(3)> flush()
{:Y_CONNECTED, %Yggdrasil.Channel{(...)}}
```
and to publish a message for the subscribers:
```elixir
iex(4)> Yggdrasil.publish(channel, "message")
iex(5)> flush()
{:Y_EVENT, %Yggdrasil.Channel{(...)}, "message"}
```
  When the subscriber wants to stop receiving messages, it can unsubscribe
from the channel:
```elixir
iex(6)> Yggdrasil.unsubscribe(channel)
iex(7)> flush()
{:Y_DISCONNECTED, %Yggdrasil.Channel{(...)}}
```
## Redis adapter
The Redis adapter has the following rules:
* The `adapter` name is identified by the atom `:redis`.
* The channel `name` must be a string.
  * The `transformer` must encode to a string. Of the `transformer`s provided,
  it defaults to `:default`, but `:json` can also be used.
  * Any `backend` can be used (the default is `:default`).
The following is an example of a valid channel for both publishers and
subscribers:
```elixir
%Yggdrasil.Channel{
name: "redis_channel_name",
adapter: :redis,
transformer: :json
}
```
It will expect valid JSONs from Redis and it will write valid JSONs in Redis.
## Redis configuration
  It uses the list of options for `Redix`, but the most relevant options are
  shown below:
* `hostname` - Redis hostname (defaults to `"localhost"`).
* `port` - Redis port (defaults to `6379`).
* `password` - Redis password (defaults to `""`).
The following shows a configuration with and without namespace:
```elixir
# Without namespace
config :yggdrasil,
redis: [hostname: "redis.zero"]
# With namespace
config :yggdrasil, RedisOne,
redis: [
hostname: "redis.one",
port: 1234
]
```
  The options can also be provided as OS environment variables. The available
  variables are:
* `YGGDRASIL_REDIS_HOSTNAME` or `<NAMESPACE>_YGGDRASIL_REDIS_HOSTNAME`.
* `YGGDRASIL_REDIS_PORT` or `<NAMESPACE>_YGGDRASIL_REDIS_PORT`.
* `YGGDRASIL_REDIS_PASSWORD` or `<NAMESPACE>_YGGDRASIL_REDIS_PASSWORD`.
* `YGGDRASIL_REDIS_DATABASE` or `<NAMESPACE>_YGGDRASIL_REDIS_DATABASE`.
  where `<NAMESPACE>` is the upper-case snake case of the chosen namespace
  e.g. for the namespace `RedisTwo`, you would use `REDIS_TWO` as the
  namespace in the OS environment variable.
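  For example, using the namespaces from the configuration above (the
  hostname values are placeholders):

  ```shell
  # Hostname for the default (no namespace) Redis connection.
  export YGGDRASIL_REDIS_HOSTNAME="redis.zero"
  # Hostname for the RedisOne namespace.
  export REDIS_ONE_YGGDRASIL_REDIS_HOSTNAME="redis.one"
  ```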
## Installation
Using this Redis adapter with `Yggdrasil` is a matter of adding the available
hex package to your `mix.exs` file e.g:
```elixir
def deps do
[{:yggdrasil_redis, "~> 4.1"}]
end
```
"""
use Application
@impl true
def start(_type, _args) do
children = [
Supervisor.child_spec({Yggdrasil.Adapter.Redis, []}, [])
]
opts = [strategy: :one_for_one, name: Yggdrasil.Redis.Supervisor]
Supervisor.start_link(children, opts)
end
end
| 30.33913 | 375 | 0.694182 |
08f8ce41dd2c6a9420d131673e67f9d6c81c2a39 | 4,646 | exs | Elixir | apps/admin_api/test/admin_api/v1/views/user_view_test.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | apps/admin_api/test/admin_api/v1/views/user_view_test.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | apps/admin_api/test/admin_api/v1/views/user_view_test.exs | jimpeebles/ewallet | ad4a9750ec8dc5adc4c0dfe6c22f0ef760825405 | [
"Apache-2.0"
] | null | null | null | # Copyright 2018 OmiseGO Pte Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule AdminAPI.V1.UserViewTest do
use AdminAPI.ViewCase, :v1
alias AdminAPI.V1.UserView
alias EWallet.Web.Paginator
alias EWalletDB.User
alias Utils.Helpers.DateFormatter
describe "AdminAPI.V1.UserView.render/2" do
test "renders user.json with correct response structure" do
{:ok, user} = :user |> params_for() |> User.insert()
      # I prefer to keep this test code duplicated with the `UserView.render/2` test,
# because in practice they are separate responses.
expected = %{
version: @expected_version,
success: true,
data: %{
object: "user",
id: user.id,
socket_topic: "user:#{user.id}",
username: user.username,
full_name: user.full_name,
calling_name: user.calling_name,
provider_user_id: user.provider_user_id,
email: user.email,
enabled: user.enabled,
avatar: %{
original: nil,
large: nil,
small: nil,
thumb: nil
},
metadata: %{
"first_name" => user.metadata["first_name"],
"last_name" => user.metadata["last_name"]
},
encrypted_metadata: %{},
created_at: DateFormatter.to_iso8601(user.inserted_at),
updated_at: DateFormatter.to_iso8601(user.updated_at)
}
}
assert UserView.render("user.json", %{user: user}) == expected
end
test "renders users.json with correct response structure" do
{:ok, user1} = :user |> params_for() |> User.insert()
{:ok, user2} = :user |> params_for() |> User.insert()
paginator = %Paginator{
data: [user1, user2],
pagination: %{
per_page: 10,
current_page: 1,
is_first_page: true,
is_last_page: false
}
}
expected = %{
version: @expected_version,
success: true,
data: %{
object: "list",
data: [
%{
object: "user",
id: user1.id,
socket_topic: "user:#{user1.id}",
username: user1.username,
full_name: user1.full_name,
calling_name: user1.calling_name,
provider_user_id: user1.provider_user_id,
email: user1.email,
enabled: user1.enabled,
avatar: %{
original: nil,
large: nil,
small: nil,
thumb: nil
},
metadata: %{
"first_name" => user1.metadata["first_name"],
"last_name" => user1.metadata["last_name"]
},
encrypted_metadata: %{},
created_at: DateFormatter.to_iso8601(user1.inserted_at),
updated_at: DateFormatter.to_iso8601(user1.updated_at)
},
%{
object: "user",
id: user2.id,
socket_topic: "user:#{user2.id}",
username: user2.username,
full_name: user2.full_name,
calling_name: user2.calling_name,
provider_user_id: user2.provider_user_id,
email: user2.email,
enabled: user2.enabled,
avatar: %{
original: nil,
large: nil,
small: nil,
thumb: nil
},
metadata: %{
"first_name" => user2.metadata["first_name"],
"last_name" => user2.metadata["last_name"]
},
encrypted_metadata: %{},
created_at: DateFormatter.to_iso8601(user2.inserted_at),
updated_at: DateFormatter.to_iso8601(user2.updated_at)
}
],
pagination: %{
per_page: 10,
current_page: 1,
is_first_page: true,
is_last_page: false
}
}
}
assert UserView.render("users.json", %{users: paginator}) == expected
end
end
end
| 32.48951 | 84 | 0.543909 |
08f8e1efb6f9b3c8c174e96c08d0c2b81f63b51d | 622 | ex | Elixir | fade/lib/snapshot/types/channel_snapshot.ex | ahives/Fade | 7094b6703933e41a1400b1053764335e32928b0a | [
"Apache-2.0"
] | null | null | null | fade/lib/snapshot/types/channel_snapshot.ex | ahives/Fade | 7094b6703933e41a1400b1053764335e32928b0a | [
"Apache-2.0"
] | null | null | null | fade/lib/snapshot/types/channel_snapshot.ex | ahives/Fade | 7094b6703933e41a1400b1053764335e32928b0a | [
"Apache-2.0"
] | null | null | null | defmodule Fade.Snapshot.Types.ChannelSnapshot do
use TypedStruct
alias Fade.Snapshot.Types.QueueOperationMetrics
typedstruct do
field(:prefetch_count, integer())
field(:uncommitted_acknowledgements, integer())
field(:uncommitted_messages, integer())
field(:unconfirmed_messages, integer())
field(:unacknowledged_messages, integer())
field(:consumers, integer())
field(:identifier, String.t())
field(:connection_identifier, integer())
field(:node, integer())
field(:queue_operations, QueueOperationMetrics.t())
def new(fields), do: struct!(__MODULE__, fields)
end
end
| 29.619048 | 55 | 0.731511 |
08f8f7de4b7db88d781e401040a2fd1f927e12cb | 3,977 | ex | Elixir | lib/cadet_web/controllers/courses_controller.ex | source-academy/cadet | c447552453f78799755de73f66999e4c9d20383c | [
"Apache-2.0"
] | 27 | 2018-01-20T05:56:24.000Z | 2021-05-24T03:21:55.000Z | lib/cadet_web/controllers/courses_controller.ex | source-academy/cadet | c447552453f78799755de73f66999e4c9d20383c | [
"Apache-2.0"
] | 731 | 2018-04-16T13:25:49.000Z | 2021-06-22T07:16:12.000Z | lib/cadet_web/controllers/courses_controller.ex | source-academy/cadet | c447552453f78799755de73f66999e4c9d20383c | [
"Apache-2.0"
] | 43 | 2018-01-20T06:35:46.000Z | 2021-05-05T03:22:35.000Z | defmodule CadetWeb.CoursesController do
use CadetWeb, :controller
use PhoenixSwagger
alias Cadet.Courses
alias Cadet.Accounts.CourseRegistrations
def index(conn, %{"course_id" => course_id}) when is_ecto_id(course_id) do
case Courses.get_course_config(course_id) do
{:ok, config} ->
render(conn, "config.json", config: config)
# coveralls-ignore-start
# no course error will not happen here
{:error, {status, message}} ->
send_resp(conn, status, message)
# coveralls-ignore-stop
end
end
def create(conn, params) do
user = conn.assigns.current_user
params = params |> to_snake_case_atom_keys()
if CourseRegistrations.get_admin_courses_count(user) < 5 do
case Courses.create_course_config(params, user) do
{:ok, _} ->
text(conn, "OK")
{:error, _, _, _} ->
conn
|> put_status(:bad_request)
|> text("Invalid parameter(s)")
end
else
conn
|> put_status(:forbidden)
|> text("User not allowed to be admin of more than 5 courses.")
end
end
swagger_path :create do
post("/v2/config/create")
summary("Creates a new course")
security([%{JWT: []}])
consumes("application/json")
parameters do
course_name(:body, :string, "Course name", required: true)
course_short_name(:body, :string, "Course module code", required: true)
viewable(:body, :boolean, "Course viewability", required: true)
enable_game(:body, :boolean, "Enable game", required: true)
enable_achievements(:body, :boolean, "Enable achievements", required: true)
enable_sourcecast(:body, :boolean, "Enable sourcecast", required: true)
source_chapter(:body, :number, "Default source chapter", required: true)
source_variant(:body, Schema.ref(:SourceVariant), "Default source variant name",
required: true
)
module_help_text(:body, :string, "Module help text", required: true)
end
end
swagger_path :get_course_config do
get("/v2/courses/{course_id}/config")
summary("Retrieves the course configuration of the specified course")
security([%{JWT: []}])
produces("application/json")
parameters do
course_id(:path, :integer, "Course ID", required: true)
end
response(200, "OK", Schema.ref(:Config))
response(400, "Invalid course_id")
end
def swagger_definitions do
%{
Config:
swagger_schema do
title("Course Configuration")
properties do
course_name(:string, "Course name", required: true)
course_short_name(:string, "Course module code", required: true)
viewable(:boolean, "Course viewability", required: true)
enable_game(:boolean, "Enable game", required: true)
enable_achievements(:boolean, "Enable achievements", required: true)
enable_sourcecast(:boolean, "Enable sourcecast", required: true)
source_chapter(:integer, "Source Chapter number from 1 to 4", required: true)
source_variant(Schema.ref(:SourceVariant), "Source Variant name", required: true)
module_help_text(:string, "Module help text", required: true)
assessment_types(:list, "Assessment Types", required: true)
end
example(%{
course_name: "Programming Methodology",
course_short_name: "CS1101S",
viewable: true,
enable_game: true,
enable_achievements: true,
enable_sourcecast: true,
source_chapter: 1,
source_variant: "default",
module_help_text: "Help text",
assessment_types: ["Missions", "Quests", "Paths", "Contests", "Others"]
})
end,
SourceVariant:
swagger_schema do
type(:string)
enum([:default, :concurrent, :gpu, :lazy, "non-det", :wasm])
end
}
end
end
| 31.816 | 93 | 0.626854 |
08f8feefa048546f97e5150bfcbe939ad6a96c3a | 5,060 | ex | Elixir | lib/phoenix/controller/pipeline.ex | lancehalvorsen/phoenix | f366b7a399db04badb816f74851040fa141c51ee | [
"MIT"
] | null | null | null | lib/phoenix/controller/pipeline.ex | lancehalvorsen/phoenix | f366b7a399db04badb816f74851040fa141c51ee | [
"MIT"
] | null | null | null | lib/phoenix/controller/pipeline.ex | lancehalvorsen/phoenix | f366b7a399db04badb816f74851040fa141c51ee | [
"MIT"
] | null | null | null | defmodule Phoenix.Controller.Pipeline do
@moduledoc """
This module implements the controller pipeline responsible for handling requests.
## The pipeline
The goal of a controller is to receive a request and invoke the desired
action. The whole flow of the controller is managed by a single pipeline:
defmodule UserController do
use Phoenix.Controller
require Logger
plug :log_message, "before action"
def show(conn, _params) do
Logger.debug "show/2"
send_resp(conn, 200, "OK")
end
defp log_message(conn, msg) do
Logger.debug msg
conn
end
end
When invoked, this pipeline will print:
before action
show/2
As any other Plug pipeline, we can halt at any step by calling
`Plug.Conn.halt/1` (which is by default imported into controllers).
If we change `log_message/2` to:
def log_message(conn, msg) do
Logger.debug msg
halt(conn)
end
it will print only:
before action
As the rest of the pipeline (the action and the after action plug)
will never be invoked.
## Guards
`plug/2` supports guards, allowing a developer to configure a plug to only
run in some particular action:
plug :log_message, "before show and edit" when action in [:show, :edit]
plug :log_message, "before all but index" when not action in [:index]
The first plug will run only when action is show or edit.
The second plug will always run, except for the index action.
Those guards work like regular Elixir guards and the only variables accessible
in the guard are `conn`, the `action` as an atom and the `controller` as an
alias.
## Controllers are plugs
Like routers, controllers are plugs, but they are wired to dispatch
to a particular function which is called an action.
For example, the route:
get "/users/:id", UserController, :show
will invoke `UserController` as a plug:
UserController.call(conn, :show)
which will trigger the plug pipeline and which will eventually
invoke the inner action plug that dispatches to the `show/2`
function in the `UserController`.
As controllers are plugs, they implement both `init/1` and
  `call/2`, and they also provide a function named `action/2`
which is responsible for dispatching the appropriate action
after the plug stack (and is also overridable).
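  For example, a controller could override `action/2` to pass an extra
  argument to every action. This is a sketch; the `:current_user` assign is
  an assumption about an earlier authentication plug:

  ```elixir
  def action(conn, _opts) do
    # `action_name/1` is imported from `Phoenix.Controller`.
    args = [conn, conn.params, conn.assigns[:current_user]]
    apply(__MODULE__, action_name(conn), args)
  end

  def show(conn, _params, current_user) do
    send_resp(conn, 200, "Hello " <> current_user.name)
  end
  ```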
"""
@doc false
defmacro __using__(_) do
quote do
@behaviour Plug
import Phoenix.Controller.Pipeline
Module.register_attribute(__MODULE__, :plugs, accumulate: true)
@before_compile Phoenix.Controller.Pipeline
def init(action) when is_atom(action) do
action
end
def call(conn, action) do
conn = update_in conn.private,
&(&1 |> Map.put(:phoenix_controller, __MODULE__)
|> Map.put(:phoenix_action, action))
phoenix_controller_pipeline(conn, action)
end
def action(%{private: %{phoenix_action: action}} = conn, _options) do
apply(__MODULE__, action, [conn, conn.params])
end
defoverridable [init: 1, call: 2, action: 2]
end
end
@doc false
defmacro __before_compile__(env) do
action = {:action, [], true}
plugs = [action|Module.get_attribute(env.module, :plugs)]
{conn, body} = Plug.Builder.compile(env, plugs, log_on_halt: :debug)
quote do
defoverridable [action: 2]
def action(conn, opts) do
try do
super(conn, opts)
catch
kind, reason ->
Phoenix.Controller.Pipeline.__catch__(
kind, reason, __MODULE__, conn.private.phoenix_action, System.stacktrace
)
end
end
defp phoenix_controller_pipeline(unquote(conn), var!(action)) do
var!(conn) = unquote(conn)
var!(controller) = __MODULE__
_ = var!(conn)
_ = var!(controller)
_ = var!(action)
unquote(body)
end
end
end
@doc false
def __catch__(:error, :function_clause, controller, action,
[{controller, action, [%Plug.Conn{} | _], _loc} | _] = stack) do
args = [controller: controller, action: action]
reraise Phoenix.ActionClauseError, args, stack
end
def __catch__(kind, reason, _controller, _action, stack) do
:erlang.raise(kind, reason, stack)
end
@doc """
Stores a plug to be executed as part of the plug pipeline.
"""
defmacro plug(plug)
defmacro plug({:when, _, [plug, guards]}), do:
plug(plug, [], guards)
defmacro plug(plug), do:
plug(plug, [], true)
@doc """
Stores a plug with the given options to be executed as part of
the plug pipeline.
"""
defmacro plug(plug, opts)
defmacro plug(plug, {:when, _, [opts, guards]}), do:
plug(plug, opts, guards)
defmacro plug(plug, opts), do:
plug(plug, opts, true)
defp plug(plug, opts, guards) do
quote do
@plugs {unquote(plug), unquote(opts), unquote(Macro.escape(guards))}
end
end
end
| 27.351351 | 86 | 0.652767 |
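The halt behaviour documented in `Phoenix.Controller.Pipeline` above can be reproduced without Phoenix. The sketch below is a hypothetical, minimal plug-style dispatcher — the `MiniPipeline` name and the `:halted`/`:log` keys are illustrative assumptions, not Phoenix's API — showing how later plugs are skipped once one of them halts:

```elixir
defmodule MiniPipeline do
  # Minimal sketch of plug-style dispatch: each plug takes a conn-like map
  # and returns it, possibly with halted: true; remaining plugs are skipped.
  def run(conn, plugs) do
    Enum.reduce_while(plugs, conn, fn plug, acc ->
      case plug.(acc) do
        %{halted: true} = halted -> {:halt, halted}
        next -> {:cont, next}
      end
    end)
  end
end

# A "plug" that records a message, and one that halts the pipeline.
log = fn msg -> fn conn -> update_in(conn.log, &[msg | &1]) end end
halt = fn conn -> %{conn | halted: true} end

conn = %{halted: false, log: []}
result = MiniPipeline.run(conn, [log.("before action"), halt, log.("action")])
# result.log contains only "before action" - the "action" plug never ran
```

This mirrors how `Plug.Conn.halt/1` short-circuits the compiled pipeline in `phoenix_controller_pipeline/2`.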
08f92293e79136331eaf5c0d3f7788de0848aa39 | 4,423 | exs | Elixir | test/elixir/test/recreate_doc_test.exs | mtenrero/couchdb-vetcontrol | b7ede3ededdf0072c73f08d8f1217cb723b03f7a | [
"Apache-2.0"
] | null | null | null | test/elixir/test/recreate_doc_test.exs | mtenrero/couchdb-vetcontrol | b7ede3ededdf0072c73f08d8f1217cb723b03f7a | [
"Apache-2.0"
] | null | null | null | test/elixir/test/recreate_doc_test.exs | mtenrero/couchdb-vetcontrol | b7ede3ededdf0072c73f08d8f1217cb723b03f7a | [
"Apache-2.0"
] | null | null | null | defmodule RecreateDocTest do
use CouchTestCase
@moduletag :recreate_doc
@moduledoc """
Test CouchDB document recreation
This is a port of the recreate_doc.js suite
"""
@tag :with_db
test "recreate document", context do
db_name = context[:db_name]
# First create a new document with the ID "foo", and delete it again
doc = %{_id: "foo", a: "bar", b: 42}
{:ok, resp} = create_doc(db_name, doc)
first_rev = resp.body["rev"]
resp = Couch.delete("/#{db_name}/foo?rev=#{first_rev}")
assert resp.status_code == 200
# Now create a new document with the same ID, save it, and then modify it
doc = %{_id: "foo"}
for _i <- 0..9 do
{:ok, _} = create_doc(db_name, doc)
resp = Couch.get("/#{db_name}/foo")
updated_doc =
resp.body
|> Map.put("a", "baz")
resp = Couch.put("/#{db_name}/foo", body: updated_doc)
assert resp.status_code == 201
rev = resp.body["rev"]
resp = Couch.delete("/#{db_name}/foo?rev=#{rev}")
assert resp.status_code == 200
end
end
@tag :with_db
test "COUCHDB-292 - recreate a deleted document", context do
db_name = context[:db_name]
# First create a new document with the ID "foo", and delete it again
doc = %{_id: "foo", a: "bar", b: 42}
{:ok, resp} = create_doc(db_name, doc)
first_rev = resp.body["rev"]
resp = Couch.delete("/#{db_name}/foo?rev=#{first_rev}")
assert resp.status_code == 200
# COUCHDB-292 now attempt to save the document with a prev that's since
# been deleted and this should generate a conflict exception
updated_doc =
doc
|> Map.put(:_rev, first_rev)
resp = Couch.put("/#{db_name}/foo", body: updated_doc)
assert resp.status_code == 409
# same as before, but with binary
bin_att_doc = %{
_id: "foo",
_rev: first_rev,
_attachments: %{
"foo.txt": %{
content_type: "text/plain",
data: "VGhpcyBpcyBhIGJhc2U2NCBlbmNvZGVkIHRleHQ="
}
}
}
resp = Couch.put("/#{db_name}/foo", body: bin_att_doc)
assert resp.status_code == 409
end
@tag :with_db
test "Recreate a deleted document with non-exsistant rev", context do
db_name = context[:db_name]
doc = %{_id: "foo", a: "bar", b: 42}
{:ok, resp} = create_doc(db_name, doc)
first_rev = resp.body["rev"]
resp = Couch.delete("/#{db_name}/foo?rev=#{first_rev}")
assert resp.status_code == 200
# random non-existant prev rev
updated_doc =
doc
|> Map.put(:_rev, "1-asfafasdf")
resp = Couch.put("/#{db_name}/foo", body: updated_doc)
assert resp.status_code == 409
# random non-existant prev rev with bin
bin_att_doc = %{
_id: "foo",
_rev: "1-aasasfasdf",
_attachments: %{
"foo.txt": %{
content_type: "text/plain",
data: "VGhpcyBpcyBhIGJhc2U2NCBlbmNvZGVkIHRleHQ="
}
}
}
resp = Couch.put("/#{db_name}/foo", body: bin_att_doc)
assert resp.status_code == 409
end
@tag :with_db
test "COUCHDB-1265 - changes feed after we try and break the update_seq tree",
context do
db_name = context[:db_name]
# Test COUCHDB-1265 - Reinserting an old revision into the revision tree causes
# duplicates in the update_seq tree.
revs = create_rev_doc(db_name, "a", 3)
resp =
Couch.put("/#{db_name}/a",
body: Enum.at(revs, 0),
query: [new_edits: false]
)
assert resp.status_code == 201
resp =
Couch.put("/#{db_name}/a",
body: Enum.at(revs, -1)
)
assert resp.status_code == 201
resp = Couch.get("/#{db_name}/_changes")
assert resp.status_code == 200
assert length(resp.body["results"]) == 1
end
# function to create a doc with multiple revisions
defp create_rev_doc(db_name, id, num_revs) do
doc = %{_id: id, count: 0}
{:ok, resp} = create_doc(db_name, doc)
create_rev_doc(db_name, id, num_revs, [Map.put(doc, :_rev, resp.body["rev"])])
end
defp create_rev_doc(db_name, id, num_revs, revs) do
if length(revs) < num_revs do
doc = %{_id: id, _rev: Enum.at(revs, -1)[:_rev], count: length(revs)}
{:ok, resp} = create_doc(db_name, doc)
create_rev_doc(
db_name,
id,
num_revs,
revs ++ [Map.put(doc, :_rev, resp.body["rev"])]
)
else
revs
end
end
end
| 26.644578 | 83 | 0.602984 |
08f92a90d6fc1d0596be1020f9a011d5da4e1144 | 2,878 | exs | Elixir | test/ptr_web/views/cellar_view_test.exs | francocatena/ptr | 4c8a960cdcb1c8523334fcc0cddba6b7fb3b3e60 | [
"MIT"
] | null | null | null | test/ptr_web/views/cellar_view_test.exs | francocatena/ptr | 4c8a960cdcb1c8523334fcc0cddba6b7fb3b3e60 | [
"MIT"
] | 2 | 2021-03-09T01:59:47.000Z | 2022-02-10T17:08:54.000Z | test/ptr_web/views/cellar_view_test.exs | francocatena/ptr | 4c8a960cdcb1c8523334fcc0cddba6b7fb3b3e60 | [
"MIT"
] | null | null | null | defmodule PtrWeb.CellarViewTest do
use PtrWeb.ConnCase, async: true
alias PtrWeb.CellarView
alias Ptr.Cellars
alias Ptr.Cellars.Cellar
import Phoenix.View
import Phoenix.HTML, only: [safe_to_string: 1]
test "renders empty.html", %{conn: conn} do
content = render_to_string(CellarView, "empty.html", conn: conn)
assert String.contains?(content, "you have no cellars")
end
test "renders index.html", %{conn: conn} do
page = %Scrivener.Page{total_pages: 1, page_number: 1}
cellars = [
%Cellar{id: "1", identifier: "gallo", name: "Gallo"},
%Cellar{id: "2", identifier: "conti", name: "Romanée-Conti"}
]
content = render_to_string(CellarView, "index.html", conn: conn, cellars: cellars, page: page)
for cellar <- cellars do
assert String.contains?(content, cellar.identifier)
end
end
test "renders new.html", %{conn: conn} do
changeset = test_account() |> Cellars.change_cellar(%Cellar{})
content = render_to_string(CellarView, "new.html", conn: conn, changeset: changeset)
assert String.contains?(content, "New cellar")
end
test "renders edit.html", %{conn: conn} do
cellar = %Cellar{id: "1", identifier: "gallo", name: "Gallo"}
changeset = test_account() |> Cellars.change_cellar(cellar)
content =
render_to_string(CellarView, "edit.html", conn: conn, cellar: cellar, changeset: changeset)
assert String.contains?(content, cellar.identifier)
end
test "renders show.html with no vessels", %{conn: conn} do
cellar = %Cellar{id: "1", identifier: "gallo", name: "Gallo"}
content =
render_to_string(
CellarView,
"show.html",
conn: conn,
cellar: cellar,
account: test_account()
)
assert String.contains?(content, cellar.identifier)
end
test "renders show.html with vessels", %{conn: conn} do
cellar = %Cellar{id: "1", identifier: "gallo", name: "Gallo", vessels_count: 3}
content =
render_to_string(
CellarView,
"show.html",
conn: conn,
cellar: cellar,
account: test_account()
)
assert String.contains?(content, cellar.identifier)
assert String.contains?(content, "#{cellar.vessels_count}")
end
test "link to delete cellar is empty when has lots", %{conn: conn} do
cellar = %Cellar{id: "1", lots_count: 3}
content =
conn
|> Plug.Conn.assign(:current_session, %{cellar: cellar})
|> CellarView.link_to_delete(cellar)
assert content == nil
end
test "link to delete cellar is not empty when has no lots", %{conn: conn} do
cellar = %Cellar{id: "1", lots_count: 0}
content =
conn
|> CellarView.link_to_delete(cellar)
|> safe_to_string
refute content == nil
end
defp test_account do
%Ptr.Accounts.Account{db_prefix: "test_account"}
end
end
| 27.150943 | 98 | 0.649757 |
08f93f4086d588194aa6d0ee361f9e2b95a042cd | 2,256 | ex | Elixir | lib/crit/users/password.ex | jesseshieh/crit19 | 0bba407fea09afed72cbb90ca579ba34c537edef | [
"MIT"
] | null | null | null | lib/crit/users/password.ex | jesseshieh/crit19 | 0bba407fea09afed72cbb90ca579ba34c537edef | [
"MIT"
] | null | null | null | lib/crit/users/password.ex | jesseshieh/crit19 | 0bba407fea09afed72cbb90ca579ba34c537edef | [
"MIT"
] | null | null | null | defmodule Crit.Users.Password do
use Ecto.Schema
alias Crit.Users.User
import Ecto.Changeset
import Ecto.Query
alias Crit.Sql
import Ecto.ChangesetX
schema "passwords" do
field :hash, :string
belongs_to :user, User, foreign_key: :auth_id, type: :string
field :new_password, :string, virtual: true
field :new_password_confirmation, :string, virtual: true
timestamps()
end
# A changeset with only default or empty fields. For `new` actions.
def default_changeset(),
do: change(%__MODULE__{}) |> hide([:hash, :auth_id])
def create_changeset(password, attrs \\ %{}) do
password
|> cast(attrs, [:new_password, :new_password_confirmation])
|> validate_required([:new_password, :new_password_confirmation])
|> validate_password_length(:new_password)
|> validate_password_confirmation()
|> put_password_hash()
end
# Util
defp validate_password_length(changeset, field),
do: validate_length(changeset, field, min: 8, max: 128)
defp validate_password_confirmation(changeset) do
password = changeset.changes[:new_password]
confirmation = changeset.changes[:new_password_confirmation]
if password == confirmation do
changeset
else
add_error(changeset, :new_password_confirmation,
"should be the same as the new password")
end
end
defp put_password_hash(changeset) do
case changeset do
%Ecto.Changeset{valid?: true, changes: %{new_password: pass}} ->
put_change(changeset, :hash, Pbkdf2.hash_pwd_salt(pass))
_ ->
changeset
end
end
# Utilities
def count_for(auth_id, institution) do
query =
from p in __MODULE__,
where: p.auth_id == ^auth_id,
select: count(p.id)
Sql.one(query, institution)
end
defmodule Query do
import Ecto.Query
alias Crit.Users.Password
def by_auth_id(auth_id) do
from p in Password,
where: p.auth_id == ^auth_id
end
# This seems ridiculously overcomplicated, but I can't make
# a plain preload work without the join.
def preloading_user(query) do
query
|> join(:inner, [p], u in User, on: p.auth_id == u.auth_id)
|> preload([p, u], [user: u])
end
end
end
| 25.348315 | 70 | 0.675089 |
08f95b4d204bffd238ccfff48cfbe2a9b5ff848a | 1,584 | ex | Elixir | lib/problem010.ex | lewapkon/eulixir | 990017cdccee7cd508269b7036e290ec777aea3d | [
"MIT"
] | null | null | null | lib/problem010.ex | lewapkon/eulixir | 990017cdccee7cd508269b7036e290ec777aea3d | [
"MIT"
] | null | null | null | lib/problem010.ex | lewapkon/eulixir | 990017cdccee7cd508269b7036e290ec777aea3d | [
"MIT"
] | null | null | null | defmodule Eulixir.Problem010 do
@moduledoc """
http://projecteuler.net/problem=10
The sum of the primes below 10 is 2 + 3 + 5 + 7 = 17.
Find the sum of all the primes below two million.
"""
@doc """
iex> Eulixir.Problem010.solve(10)
17
"""
def solve(limit) do
limit
|> primes_to
|> Enum.sum
end
@doc """
Returns primes upto limit in reversed order.
iex> Eulixir.Problem010.primes_to(10)
[7, 5, 3, 2]
"""
def primes_to(limit) do
primes_to(limit, 2, [])
end
defp primes_to(limit, n, acc) when n > limit, do: acc
defp primes_to(limit, n, acc) do
if prime?(n) do
primes_to(limit, n + 1, [n | acc])
else
primes_to(limit, n + 1, acc)
end
end
@doc """
Returns primes upto limit in reversed order.
iex> Eulixir.Problem010.prime?(982_451_653)
true
iex> Eulixir.Problem010.prime?(982_451_654)
false
"""
def prime?(2), do: true
def prime?(3), do: true
def prime?(n) when rem(n, 2) == 0, do: false
def prime?(prime_candidate) do
max_divisor = max(2, trunc(:math.sqrt(prime_candidate)))
prime?(prime_candidate, 3, max_divisor)
end
defp prime?(_prime_candidate, current_divisor, max_divisor)
when current_divisor > max_divisor, do: true
defp prime?(prime_candidate, current_divisor, _max_divisor)
when rem(prime_candidate, current_divisor) == 0, do: false
defp prime?(prime_candidate, current_divisor, max_divisor) do
prime?(prime_candidate, current_divisor + 2, max_divisor)
end
def solution do
solve(2_000_000)
end
end
| 24 | 63 | 0.654672 |
08f9a880b853bdd2163259dab85a7a370f4260fb | 998 | ex | Elixir | lib/trello.ex | kensupermen/trello | 00eba2165ac32663319679271dcc56ac6cfe4cad | [
"MIT"
] | null | null | null | lib/trello.ex | kensupermen/trello | 00eba2165ac32663319679271dcc56ac6cfe4cad | [
"MIT"
] | 3 | 2018-10-03T16:59:21.000Z | 2018-10-06T09:53:51.000Z | lib/trello.ex | kensupermen/trello | 00eba2165ac32663319679271dcc56ac6cfe4cad | [
"MIT"
] | 1 | 2018-10-03T17:06:47.000Z | 2018-10-03T17:06:47.000Z | defmodule Trello do
use Application
# See https://hexdocs.pm/elixir/Application.html
# for more information on OTP Applications
def start(_type, _args) do
import Supervisor.Spec
# Define workers and child supervisors to be supervised
children = [
# Start the Ecto repository
supervisor(Trello.Repo, []),
# Start the endpoint when the application starts
supervisor(Trello.Endpoint, []),
# Start your own worker by calling: Trello.Worker.start_link(arg1, arg2, arg3)
# worker(Trello.Worker, [arg1, arg2, arg3]),
]
# See https://hexdocs.pm/elixir/Supervisor.html
# for other strategies and supported options
opts = [strategy: :one_for_one, name: Trello.Supervisor]
Supervisor.start_link(children, opts)
end
# Tell Phoenix to update the endpoint configuration
# whenever the application is updated.
def config_change(changed, _new, removed) do
Trello.Endpoint.config_change(changed, removed)
:ok
end
end
| 31.1875 | 84 | 0.707415 |
08f9d2b7c0a1af5605cacdac2814a455c638bf9b | 9,527 | ex | Elixir | apps/ewallet_config/lib/ewallet_config/setting.ex | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 322 | 2018-02-28T07:38:44.000Z | 2020-05-27T23:09:55.000Z | apps/ewallet_config/lib/ewallet_config/setting.ex | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 643 | 2018-02-28T12:05:20.000Z | 2020-05-22T08:34:38.000Z | apps/ewallet_config/lib/ewallet_config/setting.ex | AndonMitev/EWallet | 898cde38933d6f134734528b3e594eedf5fa50f3 | [
"Apache-2.0"
] | 63 | 2018-02-28T10:57:06.000Z | 2020-05-27T23:10:38.000Z | # Copyright 2018-2019 OmiseGO Pte Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule EWalletConfig.Setting do
@moduledoc """
Schema overlay acting as an interface to the StoredSetting schema.
This is needed because some transformation is applied to the
attributes before saving them to the database. Indeed, value is stored
in a map to allow any type to be saved, but is for simplicity, the
users never need to know that - all they need to care is the "value"
field.
Here are some explanations about some of the fields of the settings:
- position
Position is used to have a constant order for settings that we define.
It cannot be updated and is only set at creation. We can add more settings
in the seeds later and fix their positions.
- parent and parent_value
Those are used to link settings together in a very simple way. No logic is
actually implemented for those, it's mostly intended to be used by clients
(like the admin panel) to show settings in a logical way. So if someone
selects gcs for file_storage, you can show all settings that have file_storage
as a parent were parent_value=gcs.
"""
require Ecto.Query
use ActivityLogger.ActivityLogging
alias EWalletConfig.{Repo, StoredSetting, Setting}
alias Ecto.{Changeset, Query}
defstruct [
:uuid,
:id,
:key,
:value,
:type,
:description,
:options,
:parent,
:parent_value,
:secret,
:position,
:inserted_at,
:updated_at
]
@spec get_setting_mappings() :: [map()]
def get_setting_mappings, do: Application.get_env(:ewallet_config, :settings_mappings)
@spec get_default_settings() :: [map()]
def get_default_settings, do: Application.get_env(:ewallet_config, :default_settings)
@spec types() :: [String.t()]
def types, do: StoredSetting.types()
@doc """
Retrieves all settings.
"""
@spec all() :: [%Setting{}]
def all do
StoredSetting
|> Query.order_by(asc: :position)
|> Repo.all()
|> Enum.map(&build/1)
end
def query do
StoredSetting
end
@doc """
Retrieves a setting by its string name.
"""
@spec get(String.t()) :: %Setting{}
def get(key) when is_atom(key) do
get(Atom.to_string(key))
end
def get(key) when is_binary(key) do
case Repo.get_by(StoredSetting, key: key) do
nil -> nil
stored_setting -> build(stored_setting)
end
end
def get(_), do: nil
@doc """
Retrieves a setting's value by its string name.
"""
@spec get_value(String.t() | atom()) :: any()
def get_value(key, default \\ nil)
def get_value(key, default) when is_atom(key) do
key
|> Atom.to_string()
|> get_value(default)
end
def get_value(key, default) when is_binary(key) do
case Repo.get_by(StoredSetting, key: key) do
nil -> default
stored_setting -> extract_value(stored_setting)
end
end
def get_value(nil, _), do: nil
@doc """
Creates a new setting with the passed attributes.
"""
@spec insert(map()) :: {:ok, %Setting{}} | {:error, %Changeset{}}
def insert(attrs) do
attrs = cast_attrs(attrs)
%StoredSetting{}
|> StoredSetting.changeset(attrs)
|> Repo.insert_record_with_activity_log()
|> return_from_change()
end
@doc """
Inserts all the default settings.
"""
@spec insert_all_defaults(map(), map()) :: [{:ok, %Setting{}}] | [{:error, %Changeset{}}]
def insert_all_defaults(originator, overrides \\ %{}) do
Repo.transaction(fn ->
get_default_settings()
|> Enum.map(fn data ->
insert_default(data, originator, overrides)
end)
|> all_defaults_inserted?()
end)
|> return_tx_result()
end
defp insert_default({key, data}, originator, overrides) do
data = Map.put(data, :originator, originator)
case overrides[key] do
nil ->
insert(data)
override ->
data
|> Map.put(:value, override)
|> insert()
end
end
defp all_defaults_inserted?(list) do
Enum.all?(list, fn res ->
case res do
{:ok, _} -> true
_ -> false
end
end)
end
defp return_tx_result({:ok, true}), do: :ok
defp return_tx_result({:ok, false}), do: :error
defp return_tx_result({:error, _}), do: {:error, :setting_insert_failed}
@spec update(String.t(), map()) ::
{:ok, %Setting{}} | {:error, atom()} | {:error, Changeset.t()}
def update(nil, _), do: {:error, :setting_not_found}
def update(key, attrs) when is_atom(key) do
key
|> Atom.to_string()
|> update(attrs)
end
def update(key, attrs) do
case Repo.get_by(StoredSetting, key: key) do
nil ->
{:error, :setting_not_found}
setting ->
attrs = cast_attrs(setting, attrs)
setting
|> StoredSetting.update_changeset(attrs)
|> Repo.update_record_with_activity_log()
|> return_from_change()
end
end
@spec update_all(List.t()) :: [{:ok, %Setting{}} | {:error, atom()} | {:error, Changeset.t()}]
def update_all(attrs) when is_list(attrs) do
case Keyword.keyword?(attrs) do
true -> update_all_with_keyword_list(attrs)
false -> update_all_with_map_list(attrs)
end
end
@spec update_all(map()) :: [{:ok, %Setting{}} | {:error, atom()} | {:error, Changeset.t()}]
def update_all(attrs) do
originator = attrs[:originator]
attrs
|> Map.delete(:originator)
|> Enum.map(fn {key, value} ->
{key,
update(key, %{
value: value,
originator: originator
})}
end)
end
def lock_all do
StoredSetting
|> Query.lock("FOR UPDATE")
|> Repo.all()
end
def lock(keys) do
StoredSetting
|> Query.lock("FOR UPDATE")
|> Query.where([s], s.key in ^keys)
|> Repo.all()
end
def build(stored_setting) do
%Setting{
uuid: stored_setting.uuid,
id: stored_setting.id,
key: stored_setting.key,
value: extract_value(stored_setting),
type: stored_setting.type,
description: stored_setting.description,
options: get_options(stored_setting),
parent: stored_setting.parent,
parent_value: stored_setting.parent_value,
secret: stored_setting.secret,
position: stored_setting.position,
inserted_at: stored_setting.inserted_at,
updated_at: stored_setting.updated_at
}
end
defp update_all_with_keyword_list(attrs) do
originator = attrs[:originator]
attrs
|> Keyword.delete(:originator)
|> Enum.map(fn {key, value} ->
{key, update(key, %{value: value, originator: originator})}
end)
end
defp update_all_with_map_list(attrs) do
Enum.map(attrs, fn data ->
key = data[:key] || data["key"]
{key, update(key, data)}
end)
end
defp cast_attrs(attrs) do
attrs
|> cast_value()
|> cast_options()
|> add_position()
end
defp cast_attrs(setting, attrs) do
attrs
|> cast_value(setting)
|> cast_options()
|> add_position()
end
defp return_from_change({:ok, stored_setting}) do
{:ok, build(stored_setting)}
end
defp return_from_change({:error, changeset}) do
{:error, changeset}
end
defp extract_value(%{secret: true, encrypted_data: nil}), do: nil
defp extract_value(%{secret: true, encrypted_data: data}) do
case Map.get(data, :value) do
nil -> Map.get(data, "value")
value -> value
end
end
defp extract_value(%{secret: false, data: nil}), do: nil
defp extract_value(%{secret: false, data: data}) do
case Map.get(data, :value) do
nil -> Map.get(data, "value")
value -> value
end
end
defp get_options(%{options: nil}), do: nil
defp get_options(%{options: options}) do
Map.get(options, :array) || Map.get(options, "array")
end
defp cast_value(%{value: value} = attrs, %{secret: true}) do
Map.put(attrs, :encrypted_data, %{value: value})
end
defp cast_value(%{value: value} = attrs, _) do
Map.put(attrs, :data, %{value: value})
end
defp cast_value(attrs, _), do: attrs
defp cast_value(%{secret: true, value: value} = attrs) do
Map.put(attrs, :encrypted_data, %{value: value})
end
defp cast_value(%{value: value} = attrs) do
Map.put(attrs, :data, %{value: value})
end
defp cast_value(attrs) do
Map.put(attrs, :data, %{value: nil})
end
defp cast_options(%{options: nil} = attrs), do: attrs
defp cast_options(%{options: options} = attrs) do
Map.put(attrs, :options, %{array: options})
end
defp cast_options(attrs), do: attrs
defp add_position(%{position: position} = attrs)
when not is_nil(position) and is_integer(position) do
attrs
end
defp add_position(attrs) do
case get_last_setting() do
nil ->
Map.put(attrs, :position, 0)
latest_setting ->
Map.put(attrs, :position, latest_setting.position + 1)
end
end
defp get_last_setting do
StoredSetting
|> Query.order_by(desc: :position)
|> Query.limit(1)
|> Repo.one()
end
end
| 25.610215 | 96 | 0.647423 |
08f9e57ff1e0f3c42fa627dac08e295e9336df02 | 353 | ex | Elixir | programming_elixir/part14/spawn2.ex | youknowcast/training | 5c79671b397f6397c52ae642fc1f16aa0a6dbb09 | [
"MIT"
] | null | null | null | programming_elixir/part14/spawn2.ex | youknowcast/training | 5c79671b397f6397c52ae642fc1f16aa0a6dbb09 | [
"MIT"
] | 2 | 2019-09-06T13:24:21.000Z | 2019-09-06T15:04:50.000Z | programming_elixir/part14/spawn2.ex | youknowcast/training | 5c79671b397f6397c52ae642fc1f16aa0a6dbb09 | [
"MIT"
] | null | null | null | defmodule Spawn2 do
def greet do
receive do
{sender, msg} ->
send sender, { :ok, "Hello, #{msg}" }
end
end
end
pid = spawn(Spawn2, :greet, [])
send pid, { self, "World" }
receive do
{:ok, message} ->
IO.puts message
end
# 以下は実行されない
send pid, { self, "Kermit" }
receive do
{:ok, message} ->
IO.puts message
end
| 13.576923 | 45 | 0.575071 |
08fa0d455eecee02327670719eef57cccb43b85d | 5,129 | ex | Elixir | lib/ex_admin/csv.ex | andriybohdan/ex_admin | e31c725078ac4e7390204a87d96360a21ffe7b90 | [
"MIT"
] | 1 | 2018-08-30T20:20:56.000Z | 2018-08-30T20:20:56.000Z | lib/ex_admin/csv.ex | 8thlight/ex_admin | 314d4068270c47799ec54f719073a565222bcfad | [
"MIT"
] | null | null | null | lib/ex_admin/csv.ex | 8thlight/ex_admin | 314d4068270c47799ec54f719073a565222bcfad | [
"MIT"
] | 2 | 2018-07-12T07:44:50.000Z | 2018-07-19T11:45:09.000Z | defmodule ExAdmin.CSV do
@moduledoc """
ExAdmin provides a CSV export link on the index page of each resource.
The CSV file format can be customized with the `csv` macro.
For example, give the following ecto model for Example.Contact:
defmodule Example.Contact do
use Ecto.Model
schema "contacts" do
field :first_name, :string, default: ""
field :last_name, :string, default: ""
field :email, :string, default: ""
belongs_to :category, Example.Category
has_many :contacts_phone_numbers, Example.ContactPhoneNumber
has_many :phone_numbers, through: [:contacts_phone_numbers, :phone_number]
has_many :contacts_groups, Example.ContactGroup
has_many :groups, through: [:contacts_groups, :group]
end
...
end
The following resource file will export the contact list as shown below:
defmodule Example.ExAdmin.Contact do
use ExAdmin.Register
alias Example.PhoneNumber
register_resource Example.Contact do
csv [
{"Surname", &(&1.last_name)},
{:category, &(&1.category.name)},
{"Groups", &(Enum.map(&1.groups, fn g -> g.name end) |> Enum.join("; "))},
] ++
(for label <- PhoneNumber.all_labels do
fun = fn c ->
c.phone_numbers
|> PhoneNumber.find_by_label(label)
|> Map.get(:number, "")
end
{label, fun}
end)
end
end
# output.csv
Surname,Given,Category,Groups,Home Phone,Business Phone,Mobile Phone
Pallen,Steve,R&D,Groop 1;Groop2,555-555-5555,555,555,1234
The macros available in the csv do block include
* `column` - Define a column in the exported CSV file
## Examples
# List format
csv [:name, :description]
# List format with functions
csv [:id, {:name, fn item -> "Mr. " <> item.name end}, :description]
# No header
csv header: false do
column :id
column :name
end
# Don't humanize the header name
csv [:name, :created_at], humanize: false
"""
require Logger
defmacro __using__(_opts \\ []) do
quote do
import unquote(__MODULE__), only: [csv: 1]
end
end
@doc """
Customize the exported CSV file.
"""
defmacro csv(opts \\ [], block \\ [])
defmacro csv(block_or_opts, block) do
{block, opts} = if block == [], do: {block_or_opts, block}, else: {block, block_or_opts}
quote location: :keep do
import ExAdmin.Register, except: [column: 1, column: 2]
import unquote(__MODULE__)
def build_csv(resources) do
var!(columns, ExAdmin.CSV) = []
unquote(block)
case var!(columns, ExAdmin.CSV) do
[] ->
unquote(block)
|> ExAdmin.CSV.build_csv(resources, unquote(opts))
schema ->
schema
|> Enum.reverse
|> ExAdmin.CSV.build_csv(resources, unquote(opts))
end
end
end
end
@doc """
Configure a column in the exported CSV file.
## Examples
csv do
column :id
column :name, fn user -> "#\{user.first_name} #\{user.last_name}" end
column :age
end
"""
defmacro column(field, fun \\ nil) do
quote do
entry = %{field: unquote(field), fun: unquote(fun)}
var!(columns, ExAdmin.CSV) = [entry | var!(columns, ExAdmin.CSV)]
end
end
@doc false
def default_schema([]), do: []
@doc false
def default_schema([resource | _]) do
resource.__struct__.__schema__(:fields)
|> Enum.map(&(build_default_column(&1)))
end
@doc false
def build_default_column(name) do
%{field: name, fun: nil}
end
@doc false
def build_csv(schema, resources, opts) do
schema = normalize_schema schema
Enum.reduce(resources, build_header_row(schema, opts), &(build_row(&2, &1, schema)))
|> Enum.reverse
|> CSVLixir.write
end
def build_csv(resources) do
default_schema(resources)
|> build_csv(resources, [])
end
defp normalize_schema(schema) do
Enum.map schema, fn
{name, fun} -> %{field: name, fun: fun}
name when is_atom(name) -> %{field: name, fun: nil}
map -> map
end
end
@doc false
def build_header_row(schema, opts) do
if Keyword.get(opts, :header, true) do
humanize? = Keyword.get(opts, :humanize, true)
[(for field <- schema, do: column_name(field[:field], humanize?))]
else
[]
end
end
defp column_name(field, true), do: ExAdmin.Utils.humanize(field)
defp column_name(field, _), do: Atom.to_string(field)
@doc false
def build_row(acc, resource, schema) do
row = Enum.reduce(schema, [], fn
%{field: name, fun: nil}, acc ->
[(Map.get(resource, name) |> ExAdmin.Render.to_string) | acc]
%{field: _name, fun: fun}, acc ->
[(fun.(resource) |> ExAdmin.Render.to_string) | acc]
end)
|> Enum.reverse
[row | acc]
end
@doc false
def write_csv(csv) do
csv
|> CSVLixir.write
end
end
| 26.853403 | 92 | 0.597192 |
08fa77634fa58809f0ed205bc3ce4e7385fe01bc | 713 | ex | Elixir | lib/chankins/web/gettext.ex | no0x9d/chankins | b4fd37d3145a001e4ebbe86eea91742d5a812858 | [
"MIT"
] | null | null | null | lib/chankins/web/gettext.ex | no0x9d/chankins | b4fd37d3145a001e4ebbe86eea91742d5a812858 | [
"MIT"
] | null | null | null | lib/chankins/web/gettext.ex | no0x9d/chankins | b4fd37d3145a001e4ebbe86eea91742d5a812858 | [
"MIT"
] | null | null | null | defmodule Chankins.Web.Gettext do
@moduledoc """
A module providing Internationalization with a gettext-based API.
By using [Gettext](https://hexdocs.pm/gettext),
your module gains a set of macros for translations, for example:
import Chankins.Web.Gettext
# Simple translation
gettext "Here is the string to translate"
# Plural translation
ngettext "Here is the string to translate",
"Here are the strings to translate",
3
# Domain-based translation
dgettext "errors", "Here is the error message to translate"
See the [Gettext Docs](https://hexdocs.pm/gettext) for detailed usage.
"""
use Gettext, otp_app: :chankins
end
| 28.52 | 72 | 0.680224 |
08faac4cdb1d53a056118f952259693fafe443bc | 2,571 | ex | Elixir | clients/content/lib/google_api/content/v21/model/pos_store.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | clients/content/lib/google_api/content/v21/model/pos_store.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | clients/content/lib/google_api/content/v21/model/pos_store.ex | MMore/elixir-google-api | 0574ec1439d9bbfe22d63965be1681b0f45a94c9 | [
"Apache-2.0"
] | null | null | null | # Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Content.V21.Model.PosStore do
@moduledoc """
Store resource.
## Attributes
* `gcidCategory` (*type:* `list(String.t)`, *default:* `nil`) - The business type of the store.
* `kind` (*type:* `String.t`, *default:* `nil`) - Identifies what kind of resource this is. Value: the fixed string "`content#posStore`"
* `phoneNumber` (*type:* `String.t`, *default:* `nil`) - The store phone number.
* `placeId` (*type:* `String.t`, *default:* `nil`) - The Google Place Id of the store location.
* `storeAddress` (*type:* `String.t`, *default:* `nil`) - Required. The street address of the store.
* `storeCode` (*type:* `String.t`, *default:* `nil`) - Required. A store identifier that is unique for the given merchant.
* `storeName` (*type:* `String.t`, *default:* `nil`) - The merchant or store name.
* `websiteUrl` (*type:* `String.t`, *default:* `nil`) - The website url for the store or merchant.
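  For example (illustrative values only), a minimal entry covering the two
  required fields might look like:

      %GoogleApi.Content.V21.Model.PosStore{
        storeCode: "store-001",
        storeAddress: "123 Example St, Anytown"
      }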
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:gcidCategory => list(String.t()) | nil,
:kind => String.t() | nil,
:phoneNumber => String.t() | nil,
:placeId => String.t() | nil,
:storeAddress => String.t() | nil,
:storeCode => String.t() | nil,
:storeName => String.t() | nil,
:websiteUrl => String.t() | nil
}
field(:gcidCategory, type: :list)
field(:kind)
field(:phoneNumber)
field(:placeId)
field(:storeAddress)
field(:storeCode)
field(:storeName)
field(:websiteUrl)
end
defimpl Poison.Decoder, for: GoogleApi.Content.V21.Model.PosStore do
def decode(value, options) do
GoogleApi.Content.V21.Model.PosStore.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Content.V21.Model.PosStore do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 37.808824 | 140 | 0.667445 |
08fab2ffd46b8eb60c63b2b126e8a4e024a95d19 | 1,783 | ex | Elixir | web/controllers/episode_controller.ex | thluiz/yehudimtv | 71aba0ee537b4bba28474fb20d3209fc261a03d7 | [
"MIT"
] | null | null | null | web/controllers/episode_controller.ex | thluiz/yehudimtv | 71aba0ee537b4bba28474fb20d3209fc261a03d7 | [
"MIT"
] | null | null | null | web/controllers/episode_controller.ex | thluiz/yehudimtv | 71aba0ee537b4bba28474fb20d3209fc261a03d7 | [
"MIT"
] | null | null | null | defmodule Yehudimtv.EpisodeController do
use Yehudimtv.Web, :controller
alias Yehudimtv.Episode
plug :scrub_params, "episode" when action in [:create, :update]
plug :action
def index(conn, _params) do
episodes = Repo.all(Episode)
render(conn, "index.html", episodes: episodes)
end
def new(conn, _params) do
changeset = Episode.changeset(%Episode{})
render(conn, "new.html", changeset: changeset)
end
def create(conn, %{"episode" => episode_params}) do
changeset = Episode.changeset(%Episode{}, episode_params)
if changeset.valid? do
Repo.insert(changeset)
conn
|> put_flash(:info, "Episode created successfully.")
|> redirect(to: episode_path(conn, :index))
else
render(conn, "new.html", changeset: changeset)
end
end
def show(conn, %{"id" => id}) do
episode = Repo.get(Episode, id)
render(conn, "show.html", episode: episode)
end
def edit(conn, %{"id" => id}) do
episode = Repo.get(Episode, id)
changeset = Episode.changeset(episode)
render(conn, "edit.html", episode: episode, changeset: changeset)
end
def update(conn, %{"id" => id, "episode" => episode_params}) do
episode = Repo.get(Episode, id)
changeset = Episode.changeset(episode, episode_params)
if changeset.valid? do
Repo.update(changeset)
conn
|> put_flash(:info, "Episode updated successfully.")
|> redirect(to: episode_path(conn, :index))
else
render(conn, "edit.html", episode: episode, changeset: changeset)
end
end
def delete(conn, %{"id" => id}) do
episode = Repo.get(Episode, id)
Repo.delete(episode)
conn
|> put_flash(:info, "Episode deleted successfully.")
|> redirect(to: episode_path(conn, :index))
end
end
| 26.220588 | 71 | 0.654515 |
08facf6fa3b3ca10094b38845f037c10c54ad8c8 | 1,920 | exs | Elixir | elixir/test/homework/account_test.exs | ceejaay/web-homework | e5844609b62bdfa79a9b5b8f302c0d7ba81dc75d | [
"MIT"
] | null | null | null | elixir/test/homework/account_test.exs | ceejaay/web-homework | e5844609b62bdfa79a9b5b8f302c0d7ba81dc75d | [
"MIT"
] | null | null | null | elixir/test/homework/account_test.exs | ceejaay/web-homework | e5844609b62bdfa79a9b5b8f302c0d7ba81dc75d | [
"MIT"
] | null | null | null | defmodule Homework.AccountTest do
use Homework.DataCase
alias Homework.Account
describe "cards" do
alias Homework.Account.Card
@valid_attrs %{description: "some description"}
@update_attrs %{description: "some updated description"}
@invalid_attrs %{description: nil}
def card_fixture(attrs \\ %{}) do
{:ok, card} =
attrs
|> Enum.into(@valid_attrs)
|> Account.create_card()
card
end
test "list_cards/0 returns all cards" do
card = card_fixture()
assert Account.list_cards() == [card]
end
test "get_card!/1 returns the card with given id" do
card = card_fixture()
assert Account.get_card!(card.id) == card
end
test "create_card/1 with valid data creates a card" do
assert {:ok, %Card{} = card} = Account.create_card(@valid_attrs)
assert card.description == "some description"
end
test "create_card/1 with invalid data returns error changeset" do
assert {:error, %Ecto.Changeset{}} = Account.create_card(@invalid_attrs)
end
test "update_card/2 with valid data updates the card" do
card = card_fixture()
assert {:ok, %Card{} = card} = Account.update_card(card, @update_attrs)
assert card.description == "some updated description"
end
test "update_card/2 with invalid data returns error changeset" do
card = card_fixture()
assert {:error, %Ecto.Changeset{}} = Account.update_card(card, @invalid_attrs)
assert card == Account.get_card!(card.id)
end
test "delete_card/1 deletes the card" do
card = card_fixture()
assert {:ok, %Card{}} = Account.delete_card(card)
assert_raise Ecto.NoResultsError, fn -> Account.get_card!(card.id) end
end
test "change_card/1 returns a card changeset" do
card = card_fixture()
assert %Ecto.Changeset{} = Account.change_card(card)
end
end
end
| 29.538462 | 84 | 0.659896 |
08fad7ea6fd5776f4f9bb879fe8e96c0f19c9c1f | 486 | ex | Elixir | lib/flix/protocol/events/no_space_for_new_connection.ex | efcasado/flix | 945fe84e6dba31b7f47d07279a97559e1094da46 | [
"Unlicense",
"MIT"
] | 1 | 2021-07-24T09:44:54.000Z | 2021-07-24T09:44:54.000Z | lib/flix/protocol/events/no_space_for_new_connection.ex | efcasado/flix | 945fe84e6dba31b7f47d07279a97559e1094da46 | [
"Unlicense",
"MIT"
] | 1 | 2021-07-24T07:13:40.000Z | 2021-08-02T13:44:44.000Z | lib/flix/protocol/events/no_space_for_new_connection.ex | efcasado/flix | 945fe84e6dba31b7f47d07279a97559e1094da46 | [
"Unlicense",
"MIT"
] | null | null | null | defmodule Flix.Protocol.Events.NoSpaceForNewConnection do
defstruct max_concurrently_connected_buttons: 0
  @type t :: %__MODULE__{
          max_concurrently_connected_buttons: non_neg_integer()
        }
def decode(
<<
max_concurrently_connected_buttons::unsigned-little-integer-16
>> = _binary
) do
%__MODULE__{
max_concurrently_connected_buttons: max_concurrently_connected_buttons
}
end
  def encode(%__MODULE__{} = event) do
    # Mirrors decode/1: a single unsigned little-endian 16-bit counter.
    <<event.max_concurrently_connected_buttons::unsigned-little-integer-16>>
  end
end
| 19.44 | 76 | 0.658436 |
08fae09e15f71ec67478206af7f79cda99c22cfe | 6,831 | ex | Elixir | lib/day14/nanofactory.ex | anamba/adventofcode2019 | a5de43ddce8b40f67c3017f349d8563c73c94e20 | [
"MIT"
] | null | null | null | lib/day14/nanofactory.ex | anamba/adventofcode2019 | a5de43ddce8b40f67c3017f349d8563c73c94e20 | [
"MIT"
] | null | null | null | lib/day14/nanofactory.ex | anamba/adventofcode2019 | a5de43ddce8b40f67c3017f349d8563c73c94e20 | [
"MIT"
] | null | null | null | defmodule Day14.Nanofactory do
@doc """
iex> Day14.Nanofactory.part1("day14-sample0.txt")
31
iex> Day14.Nanofactory.part1("day14-sample1.txt")
165
iex> Day14.Nanofactory.part1("day14-sample2.txt")
13312
iex> Day14.Nanofactory.part1("day14-sample3.txt")
180697
iex> Day14.Nanofactory.part1("day14-sample4.txt")
2210736
iex> Day14.Nanofactory.part1("day14.txt")
504284
"""
def part1(filename) do
for _i <- 1..100 do
parse_input_into_recipe_tree(filename)
|> find_lowest_cost
end
|> Enum.sort()
|> List.first()
end
@doc """
iex> Day14.Nanofactory.parse_input_into_recipe_tree("day14-sample1.txt")
%{"A" => [{2, [{"ORE", 9}]}], "AB" => [{1, [{"A", 3}, {"B", 4}]}], "B" => [{3, [{"ORE", 8}]}],
"BC" => [{1, [{"B", 5}, {"C", 7}]}], "C" => [{5, [{"ORE", 7}]}], "CA" => [{1, [{"C", 4}, {"A", 1}]}],
"FUEL" => [{1, [{"AB", 2}, {"BC", 3}, {"CA", 4}]}]}
"""
def parse_input_into_recipe_tree(filename) do
    "inputs/#{filename}"
|> File.stream!()
|> Enum.map(fn line ->
line |> String.trim() |> String.split(" => ")
end)
|> Enum.map(&reaction_to_recipe/1)
|> master_recipe_tree
end
def reaction_to_recipe([input_str, output_str]) do
{output, qty} = parse_recipe_output(output_str)
{output, {qty, parse_recipe_inputs(input_str)}}
end
@doc """
iex> Day14.Nanofactory.parse_recipe_output("2 A")
{"A", 2}
"""
def parse_recipe_output(output_str) do
[qty_str, output] = String.split(output_str, " ")
{output, String.to_integer(qty_str)}
end
@doc """
iex> Day14.Nanofactory.parse_recipe_inputs("3 A, 4 B")
[{"A", 3}, {"B", 4}]
"""
def parse_recipe_inputs(input_str) do
input_str
|> String.split(", ")
|> Enum.map(fn str ->
[qty, chemical] = String.split(str, " ")
{chemical, String.to_integer(qty)}
end)
end
def master_recipe_tree(recipes, master_list \\ %{})
def master_recipe_tree([], master_list), do: master_list
def master_recipe_tree([{output, recipe} | recipes], master_list) do
recipes_for_output = [recipe | Map.get(master_list, output, [])]
master_recipe_tree(recipes, Map.put(master_list, output, recipes_for_output))
end
@doc """
iex> Day14.Nanofactory.find_lowest_cost(%{"FUEL" => [{1, [{"ORE", 123}]}]})
123
iex> Day14.Nanofactory.find_lowest_cost(%{"FUEL" => [{1, [{"ORE", 123}]}, {1, [{"ORE", 234}]}]})
123
iex> Day14.Nanofactory.find_lowest_cost(%{"FUEL" => [{1, [{"A", 2}]}], "A" => [{1, [{"ORE", 2}]}]})
4
iex> Day14.Nanofactory.find_lowest_cost(%{"FUEL" => [{1, [{"A", 2}, {"B", 1}]}], "A" => [{1, [{"ORE", 2}]}], "B" => [{1, [{"ORE", 3}]}]})
7
iex> Day14.Nanofactory.find_lowest_cost(%{"FUEL" => [{1, [{"A", 7}, {"C", 1}]}], "A" => [{10, [{"ORE", 10}]}], "B" => [{1, [{"ORE", 1}]}], "C" => [{1, [{"A", 7}, {"B", 1}]}]})
21
"""
def find_lowest_cost(master_recipe_tree, requirements \\ [{"FUEL", 1}], stock \\ %{})
def find_lowest_cost(_master_recipe_tree, [], _stock), do: 0
def find_lowest_cost(master_recipe_tree, [requirement | requirements], stock) do
{required_chemical, required_qty} = requirement
# IO.inspect([requirement | requirements], label: "WANT")
# IO.inspect(stock, label: "HAVE")
    # if multiple recipes can produce this chemical, evaluate each alternative
    # (the cheapest result is selected below)
master_recipe_tree[required_chemical]
|> Enum.map(fn recipe -> recipe_with_multiplier(recipe, required_qty) end)
|> Enum.map(fn {multiplier, {produced_qty, inputs}} ->
# from this point, it's considered that we ran the reaction with the multiplier
stock_of_required_chemical = Map.get(stock, required_chemical, 0)
stock =
Map.put(stock, required_chemical, stock_of_required_chemical + produced_qty * multiplier)
# IO.inspect(stock, label: "HAVE (following reaction)")
# IO.inspect({required_chemical, required_qty}, label: "Using")
stock =
Map.put(
stock,
required_chemical,
stock_of_required_chemical + produced_qty * multiplier - required_qty
)
# IO.inspect(stock, label: "HAVE (after using what we need)")
case inputs do
# if we are down to only ore, then can start evaluating
[{"ORE", ore_qty}] ->
# IO.puts(
# "=== #{ore_qty * multiplier} ORE USED to produce #{produced_qty * multiplier} #{
# required_chemical
# } ==="
# )
ore_qty * multiplier + find_lowest_cost(master_recipe_tree, requirements, stock)
# otherwise, proceed down the rabbit hole
[{chemical, qty} | inputs] ->
qty_needed = qty * multiplier
stock_of_chemical = Map.get(stock, chemical, 0)
cond do
# if we have enough in stock, use that
qty_needed < stock_of_chemical ->
stock = Map.put(stock, chemical, stock_of_chemical - qty_needed)
find_lowest_cost(
master_recipe_tree,
combine_requirements(requirements, multiply_inputs(inputs, multiplier)),
stock
)
true ->
# use what we have (if any) and make the rest
qty_needed = qty_needed - stock_of_chemical
stock = Map.put(stock, chemical, 0)
find_lowest_cost(
master_recipe_tree,
combine_requirements(
requirements,
[{chemical, qty_needed} | multiply_inputs(inputs, multiplier)]
),
stock
)
end
end
end)
|> List.flatten()
|> Enum.sort()
# |> IO.inspect(charlists: false)
|> List.first()
end
@doc """
iex> Day14.Nanofactory.recipe_with_multiplier({2, [{"ORE", 9}]}, 4)
{2, {2, [{"ORE", 9}]}}
iex> Day14.Nanofactory.recipe_with_multiplier({10, [{"ORE", 10}]}, 14)
{2, {10, [{"ORE", 10}]}}
"""
def recipe_with_multiplier({produced_qty, _} = recipe, desired_qty),
do: {ceil(desired_qty / produced_qty), recipe}
def multiply_inputs([], _multiplier), do: []
def multiply_inputs([{chemical, qty} | inputs], multiplier) do
[{chemical, qty * multiplier} | multiply_inputs(inputs, multiplier)]
end
@doc """
iex> Day14.Nanofactory.combine_requirements([{"A", 2}], [{"A", 3}])
[{"A", 5}]
"""
def combine_requirements(reqs1, reqs2) do
keys =
(reqs1 ++ reqs2)
|> Enum.map(fn {key, _} -> key end)
|> Enum.uniq()
for key <- keys do
combined_qty =
(reqs1 ++ reqs2)
|> Enum.reduce(0, fn {k, qty}, acc -> if k == key, do: acc + qty, else: acc end)
{key, combined_qty}
end
|> Enum.shuffle()
end
end
| 33.816832 | 181 | 0.569024 |
08fb089fa035f8b09115232c16d4a0eec0db2c4d | 13,035 | exs | Elixir | test/elixir/test/attachments_test.exs | frapa/couchdb | 6c28960f0fe2eec06aca7d58fd73f3c7cdbe1112 | [
"Apache-2.0"
] | null | null | null | test/elixir/test/attachments_test.exs | frapa/couchdb | 6c28960f0fe2eec06aca7d58fd73f3c7cdbe1112 | [
"Apache-2.0"
] | null | null | null | test/elixir/test/attachments_test.exs | frapa/couchdb | 6c28960f0fe2eec06aca7d58fd73f3c7cdbe1112 | [
"Apache-2.0"
] | null | null | null | defmodule AttachmentsTest do
use CouchTestCase
@moduletag :attachments
@moduletag kind: :single_node
# MD5 Digests of compressible attachments and therefore Etags
# will vary depending on platform gzip implementation.
# These MIME types are defined in [attachments] compressible_types
@bin_att_doc %{
_id: "bin_doc",
_attachments: %{
"foo.txt": %{
content_type: "application/octet-stream",
data: "VGhpcyBpcyBhIGJhc2U2NCBlbmNvZGVkIHRleHQ="
}
}
}
@moduledoc """
Test CouchDB attachments
This is a port of the attachments.js suite
"""
@tag :with_db
test "saves attachment successfully", context do
db_name = context[:db_name]
resp = Couch.put("/#{db_name}/bin_doc", body: @bin_att_doc, query: %{w: 3})
assert resp.status_code in [201, 202]
assert resp.body["ok"]
end
@tag :with_db
test "errors for bad attachment", context do
db_name = context[:db_name]
bad_att_doc = %{
_id: "bad_doc",
_attachments: %{
"foo.txt": %{
content_type: "text/plain",
data: "notBase64Encoded="
}
}
}
resp = Couch.put("/#{db_name}/bad_doc", body: bad_att_doc, query: %{w: 3})
assert resp.status_code == 400
end
@tag :with_db
test "reads attachment successfully", context do
db_name = context[:db_name]
resp = Couch.put("/#{db_name}/bin_doc", body: @bin_att_doc, query: %{w: 3})
assert resp.status_code in [201, 202]
resp = Couch.get("/#{db_name}/bin_doc/foo.txt", body: @bin_att_doc)
assert resp.body == "This is a base64 encoded text"
assert resp.headers["Content-Type"] == "application/octet-stream"
assert resp.headers["Etag"] == "\"aEI7pOYCRBLTRQvvqYrrJQ==\""
end
@tag :with_db
test "update attachment", context do
db_name = context[:db_name]
bin_att_doc2 = %{
_id: "bin_doc2",
_attachments: %{
"foo.txt": %{
content_type: "text/plain",
data: ""
}
}
}
resp = Couch.put("/#{db_name}/bin_doc2", body: bin_att_doc2, query: %{w: 3})
assert resp.status_code in [201, 202]
rev = resp.body["rev"]
resp = Couch.get("/#{db_name}/bin_doc2/foo.txt")
assert resp.headers["Content-Type"] == "text/plain"
assert resp.body == ""
resp =
Couch.put(
"/#{db_name}/bin_doc2/foo2.txt",
query: %{rev: rev, w: 3},
body: "This is no base64 encoded text",
headers: ["Content-Type": "text/plain;charset=utf-8"]
)
assert resp.status_code in [201, 202]
assert Regex.match?(~r/bin_doc2\/foo2.txt/, resp.headers["location"])
end
@tag :with_db
test "delete attachment", context do
db_name = context[:db_name]
resp = Couch.put("/#{db_name}/bin_doc", body: @bin_att_doc, query: %{w: 3})
assert resp.status_code in [201, 202]
rev = resp.body["rev"]
resp = Couch.delete("/#{db_name}/bin_doc/foo.txt", query: %{w: 3})
assert resp.status_code == 409
resp = Couch.delete("/#{db_name}/bin_doc/foo.txt", query: %{w: 3, rev: "4-garbage"})
assert resp.status_code == 409
assert resp.body["error"] == "not_found"
assert resp.body["reason"] == "missing_rev"
resp = Couch.delete("/#{db_name}/bin_doc/notexisting.txt", query: %{w: 3, rev: rev})
assert resp.status_code == 404
assert resp.body["error"] == "not_found"
assert resp.body["reason"] == "Document is missing attachment"
resp = Couch.delete("/#{db_name}/bin_doc/foo.txt", query: %{w: 3, rev: rev})
assert resp.status_code == 200
assert resp.headers["location"] == nil
end
@tag :pending # Wrong Content-Type
@tag :with_db
test "saves binary", context do
db_name = context[:db_name]
bin_data = "JHAPDO*AU£PN ){(3u[d 93DQ9¡€])} ææøo'∂ƒæ≤çæππ•¥∫¶®#†π¶®¥π€ª®˙π8np"
resp =
Couch.put(
"/#{db_name}/bin_doc3/attachment.txt",
body: bin_data,
headers: ["Content-Type": "text/plain;charset=utf-8"],
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
resp = Couch.get("/#{db_name}/bin_doc3/attachment.txt")
assert resp.body == bin_data
resp =
Couch.put("/#{db_name}/bin_doc3/attachment.txt", body: bin_data, query: %{w: 3})
assert resp.status_code == 409
# non-existent rev
resp =
Couch.put(
"/#{db_name}/bin_doc3/attachment.txt",
query: %{rev: "1-adae8575ecea588919bd08eb020c708e", w: 3},
headers: ["Content-Type": "text/plain;charset=utf-8"],
body: bin_data
)
assert resp.status_code == 409
# current rev
resp =
Couch.put(
"/#{db_name}/bin_doc3/attachment.txt",
query: %{rev: rev, w: 3},
headers: ["Content-Type": "text/plain;charset=utf-8"],
body: bin_data
)
assert resp.status_code in [201, 202]
rev = resp.body["rev"]
resp = Couch.get("/#{db_name}/bin_doc3/attachment.txt")
assert String.downcase(resp.headers["Content-Type"]) == "text/plain;charset=utf-8"
assert resp.body == bin_data
resp = Couch.get("/#{db_name}/bin_doc3/attachment.txt", query: %{rev: rev})
assert String.downcase(resp.headers["Content-Type"]) == "text/plain;charset=utf-8"
assert resp.body == bin_data
resp = Couch.delete("/#{db_name}/bin_doc3/attachment.txt", query: %{rev: rev, w: 3})
assert resp.status_code == 200
resp = Couch.get("/#{db_name}/bin_doc3/attachment.txt")
assert resp.status_code == 404
resp = Couch.get("/#{db_name}/bin_doc3/attachment.txt", query: %{rev: rev})
assert String.downcase(resp.headers["Content-Type"]) == "text/plain;charset=utf-8"
assert resp.body == bin_data
end
@tag :with_db
test "empty attachments", context do
db_name = context[:db_name]
resp =
Couch.put(
"/#{db_name}/bin_doc4/attachment.txt",
body: "",
headers: ["Content-Type": "text/plain;charset=utf-8"],
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
resp = Couch.get("/#{db_name}/bin_doc4/attachment.txt")
assert resp.status_code == 200
assert resp.body == ""
resp =
Couch.put(
"/#{db_name}/bin_doc4/attachment.txt",
query: %{rev: rev, w: 3},
headers: ["Content-Type": "text/plain;charset=utf-8"],
body: "This is a string"
)
assert resp.status_code in [201, 202]
resp = Couch.get("/#{db_name}/bin_doc4/attachment.txt")
assert resp.status_code == 200
assert resp.body == "This is a string"
end
@tag :with_db
test "large attachments COUCHDB-366", context do
db_name = context[:db_name]
lorem = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. "
range = 1..10
large_att = Enum.reduce(range, lorem, fn _, acc -> lorem <> acc end)
resp =
Couch.put(
"/#{db_name}/bin_doc5/attachment.txt",
body: large_att,
query: %{w: 3},
headers: ["Content-Type": "text/plain;charset=utf-8"]
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
resp = Couch.get("/#{db_name}/bin_doc5/attachment.txt")
assert String.downcase(resp.headers["Content-Type"]) == "text/plain;charset=utf-8"
assert resp.body == large_att
lorem_b64 =
"TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQsIGNvbnNlY3RldHVyIGFkaXBpc2NpbmcgZWxpdC4g"
range = 1..10
large_b64_att = Enum.reduce(range, lorem_b64, fn _, acc -> lorem_b64 <> acc end)
resp =
Couch.get(
"/#{db_name}/bin_doc5",
query: %{attachments: true},
headers: [Accept: "application/json"]
)
assert large_b64_att == resp.body["_attachments"]["attachment.txt"]["data"]
end
@tag :with_db
test "etags for attachments", context do
db_name = context[:db_name]
lorem_att = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. "
resp =
Couch.put(
"/#{db_name}/bin_doc6/attachment.txt",
body: lorem_att,
headers: ["Content-Type": "text/plain;charset=utf-8"],
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
resp = Couch.get("/#{db_name}/bin_doc6/attachment.txt")
assert resp.status_code == 200
etag = resp.headers["etag"]
resp =
Couch.get("/#{db_name}/bin_doc6/attachment.txt", headers: ["if-none-match": etag])
assert resp.status_code == 304
end
@tag :with_db
test "COUCHDB-497 - empty attachments", context do
db_name = context[:db_name]
lorem_att = "Lorem ipsum dolor sit amet, consectetur adipiscing elit. "
resp =
Couch.put(
"/#{db_name}/bin_doc7/attachment.txt",
body: lorem_att,
headers: ["Content-Type": "text/plain;charset=utf-8"],
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
resp =
Couch.put(
"/#{db_name}/bin_doc7/empty.txt",
query: %{rev: rev, w: 3},
body: "",
headers: ["Content-Type": "text/plain;charset=utf-8"]
)
assert resp.status_code in [201, 202]
rev = resp.body["rev"]
resp =
Couch.put(
"/#{db_name}/bin_doc7/empty.txt",
query: %{rev: rev, w: 3},
body: "",
headers: ["Content-Type": "text/plain;charset=utf-8"]
)
assert resp.status_code in [201, 202]
end
@tag :with_db
test "implicit doc creation allows creating docs with a reserved id. COUCHDB-565",
context do
db_name = context[:db_name]
resp =
Couch.put(
"/#{db_name}/_nonexistant/attachment.txt",
body: "ATTACHMENT INFO",
headers: ["Content-Type": "text/plain;charset=utf-8"],
query: %{w: 3}
)
assert resp.status_code == 400
end
@tag :pending # HTTP 500
@tag :with_db
test "COUCHDB-809 - stubs should only require the 'stub' field", context do
db_name = context[:db_name]
stub_doc = %{
_id: "stub_doc",
_attachments: %{
"foo.txt": %{
content_type: "text/plain",
data: "VGhpcyBpcyBhIGJhc2U2NCBlbmNvZGVkIHRleHQ="
}
}
}
resp =
Couch.put(
"/#{db_name}/stub_doc",
body: stub_doc,
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
stub_doc =
Map.merge(stub_doc, %{
_rev: rev,
_attachments: %{"foo.txt": %{stub: true}}
})
resp =
Couch.put(
"/#{db_name}/stub_doc",
query: %{rev: rev, w: 3},
body: stub_doc
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
stub_doc =
Map.merge(stub_doc, %{
_rev: rev,
_attachments: %{"foo.txt": %{stub: true, revpos: 10}}
})
resp =
Couch.put(
"/#{db_name}/stub_doc",
query: %{rev: rev},
body: stub_doc
)
assert resp.status_code == 412
assert resp.body["error"] == "missing_stub"
end
@tag :with_db
test "md5 header for attachments", context do
db_name = context[:db_name]
md5 = "MntvB0NYESObxH4VRDUycw=="
bin_data = "foo bar"
resp =
Couch.put(
"/#{db_name}/bin_doc8/attachment.txt",
body: bin_data,
headers: ["Content-Type": "application/octet-stream", "Content-MD5": md5],
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
resp = Couch.get("/#{db_name}/bin_doc8/attachment.txt")
assert resp.status_code == 200
assert md5 == resp.headers["Content-MD5"]
end
@tag :with_db
test "attachment via multipart/form-data", context do
db_name = context[:db_name]
form_data_doc = %{
_id: "form-data-doc"
}
resp =
Couch.put(
"/#{db_name}/form_data_doc",
body: form_data_doc,
query: %{w: 3}
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
rev = resp.body["rev"]
body =
"------TF\r\n" <>
"Content-Disposition: form-data; name=\"_rev\"\r\n\r\n" <>
rev <>
"\r\n" <>
"------TF\r\n" <>
"Content-Disposition: form-data; name=\"_attachments\"; filename=\"file.txt\"\r\n" <>
"Content-Type: text/plain\r\n\r\n" <>
"contents of file.txt\r\n\r\n" <> "------TF--"
resp =
Couch.post(
"/#{db_name}/form_data_doc",
body: body,
query: %{w: 3},
headers: [
Referer: "http://127.0.0.1:15984",
"Content-Type": "multipart/form-data; boundary=----TF",
"Content-Length": byte_size(body)
]
)
assert resp.status_code in [201, 202]
assert resp.body["ok"]
resp = Couch.get("/#{db_name}/form_data_doc")
assert resp.status_code == 200
doc = resp.body
assert doc["_attachments"]["file.txt"]["length"] == 22
end
end
| 26.333333 | 93 | 0.589336 |
08fb5e057eefa16c33ef21fb633d4f2e9b0a0ace | 333 | exs | Elixir | test/nerves_livebook_test.exs | trarbr/nerves_livebook | ac5a5f7f8b80fb0c63cfe81e565c439e912973dc | [
"Apache-2.0"
] | 76 | 2021-04-23T15:51:52.000Z | 2021-09-20T19:35:26.000Z | test/nerves_livebook_test.exs | trarbr/nerves_livebook | ac5a5f7f8b80fb0c63cfe81e565c439e912973dc | [
"Apache-2.0"
] | 79 | 2021-11-03T08:33:32.000Z | 2022-03-22T08:29:37.000Z | test/nerves_livebook_test.exs | trarbr/nerves_livebook | ac5a5f7f8b80fb0c63cfe81e565c439e912973dc | [
"Apache-2.0"
] | 14 | 2021-04-30T17:32:50.000Z | 2021-09-19T06:03:16.000Z | defmodule NervesLivebookTest do
use ExUnit.Case
test "check_internet!/0 doesn't raise on host" do
NervesLivebook.check_internet!()
end
test "version/0" do
# Just check that it's a parsable version since
# there's no second source of truth to my knowledge.
Version.parse!(NervesLivebook.version())
end
end
| 23.785714 | 56 | 0.723724 |
08fb64e632519ef3efb66331ad964f629974f136 | 5,779 | ex | Elixir | lib/ecto/query/builder/filter.ex | jccf091/ecto | 42d47a6da0711f842e1a0e6724a89b318b9b2144 | [
"Apache-2.0"
] | 1 | 2017-11-27T06:00:32.000Z | 2017-11-27T06:00:32.000Z | lib/ecto/query/builder/filter.ex | jccf091/ecto | 42d47a6da0711f842e1a0e6724a89b318b9b2144 | [
"Apache-2.0"
] | null | null | null | lib/ecto/query/builder/filter.ex | jccf091/ecto | 42d47a6da0711f842e1a0e6724a89b318b9b2144 | [
"Apache-2.0"
] | null | null | null | import Kernel, except: [apply: 3]
defmodule Ecto.Query.Builder.Filter do
@moduledoc false
alias Ecto.Query.Builder
@doc """
Escapes a where or having clause.
It allows query expressions that evaluate to a boolean
or a keyword list of field names and values. In a keyword
list multiple key value pairs will be joined with "and".
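  For example (illustrative; `Post` is a hypothetical schema), these two
  filters escape to equivalent expressions, since each key/value pair becomes
  an `==` comparison and the pairs are joined with `and`:

      from p in Post, where: [title: "hello", public: true]
      from p in Post, where: p.title == "hello" and p.public == true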
"""
@spec escape(:where | :having, Macro.t, non_neg_integer, Keyword.t, Macro.Env.t) :: {Macro.t, %{}}
def escape(_kind, [], _binding, _vars, _env) do
{true, %{}}
end
def escape(kind, expr, binding, vars, env) when is_list(expr) do
{parts, params} =
Enum.map_reduce(expr, %{}, fn
{field, nil}, _params ->
Builder.error! "nil given for #{inspect field}. Comparison with nil is forbidden as it is unsafe. " <>
"Instead write a query with is_nil/1, for example: is_nil(s.#{field})"
{field, value}, params when is_atom(field) ->
{value, {params, :acc}} = Builder.escape(value, {binding, field}, {params, :acc}, vars, env)
{{:{}, [], [:==, [], [to_escaped_field(binding, field), value]]}, params}
_, _params ->
Builder.error! "expected a keyword list at compile time in #{kind}, " <>
"got: `#{Macro.to_string expr}`. If you would like to " <>
"pass a list dynamically, please interpolate the whole list with ^"
end)
expr = Enum.reduce parts, &{:{}, [], [:and, [], [&2, &1]]}
{expr, params}
end
def escape(_kind, expr, _binding, vars, env) do
{expr, {params, :acc}} = Builder.escape(expr, :boolean, {%{}, :acc}, vars, env)
{expr, params}
end
@doc """
Builds a quoted expression.
The quoted expression should evaluate to a query at runtime.
If possible, it does all calculations at compile time to avoid
runtime work.
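  As a sketch of how this builder is invoked (this mirrors how
  `Ecto.Query.where/3` delegates to it at macro-expansion time; details vary
  between Ecto versions):

      defmacro where(query, binding \\ [], expr) do
        Builder.Filter.build(:where, :and, query, binding, expr, __CALLER__)
      end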
"""
@spec build(:where | :having, :and | :or, Macro.t, [Macro.t], Macro.t, Macro.Env.t) :: Macro.t
def build(kind, op, query, _binding, {:^, _, [var]}, env) do
quote do
Ecto.Query.Builder.Filter.filter!(unquote(kind), unquote(op), unquote(query),
unquote(var), 0, unquote(env.file), unquote(env.line))
end
end
def build(kind, op, query, binding, expr, env) do
{query, binding} = Builder.escape_binding(query, binding)
{expr, params} = escape(kind, expr, 0, binding, env)
params = Builder.escape_params(params)
expr = quote do: %Ecto.Query.BooleanExpr{
expr: unquote(expr),
op: unquote(op),
params: unquote(params),
file: unquote(env.file),
line: unquote(env.line)}
Builder.apply_query(query, __MODULE__, [kind, expr], env)
end
@doc """
The callback applied by `build/4` to build the query.
"""
@spec apply(Ecto.Queryable.t, :where | :having, term) :: Ecto.Query.t
def apply(query, _, %{expr: true}) do
query
end
def apply(%Ecto.Query{wheres: wheres} = query, :where, expr) do
%{query | wheres: wheres ++ [expr]}
end
def apply(%Ecto.Query{havings: havings} = query, :having, expr) do
%{query | havings: havings ++ [expr]}
end
def apply(query, kind, expr) do
apply(Ecto.Queryable.to_query(query), kind, expr)
end
@doc """
Builds a filter based on the given arguments.
"""
def filter!(_kind, query, %Ecto.Query.DynamicExpr{} = dynamic, _binding, _file, _line) do
{expr, _binding, params, file, line} = Ecto.Query.Builder.Dynamic.fully_expand(query, dynamic)
{expr, params, file, line}
end
def filter!(_kind, _query, bool, _binding, file, line) when is_boolean(bool) do
{bool, [], file, line}
end
def filter!(kind, _query, kw, binding, file, line) when is_list(kw) do
{expr, params} = kw!(kind, kw, binding)
{expr, params, file, line}
end
def filter!(kind, _query, other, _binding, _file, _line) do
raise ArgumentError, "expected a keyword list or dynamic expression in `#{kind}`, got: `#{inspect other}`"
end
@doc """
Builds the filter and applies it to the given query as boolean operator.
"""
def filter!(kind, op, query, expr, binding, file, line) do
{expr, params, file, line} = filter!(kind, query, expr, binding, file, line)
boolean = %Ecto.Query.BooleanExpr{expr: expr, params: params, line: line, file: file, op: op}
apply(query, kind, boolean)
end
defp kw!(kind, kw, binding) do
case kw!(kw, binding, 0, [], [], kind, kw) do
{[], params} -> {true, params}
{parts, params} -> {Enum.reduce(parts, &{:and, [], [&2, &1]}), params}
end
end
defp kw!([{field, nil}|_], _binding, _counter, _exprs, _params, _kind, _original) when is_atom(field) do
raise ArgumentError, "nil given for #{inspect field}. Comparison with nil is forbidden as it is unsafe. " <>
"Instead write a query with is_nil/1, for example: is_nil(s.#{field})"
end
defp kw!([{field, value}|t], binding, counter, exprs, params, kind, original) when is_atom(field) do
kw!(t, binding, counter + 1,
[{:==, [], [to_field(binding, field), {:^, [], [counter]}]}|exprs],
[{value, {binding, field}}|params],
kind, original)
end
defp kw!([], _binding, _counter, exprs, params, _kind, _original) do
{Enum.reverse(exprs), Enum.reverse(params)}
end
defp kw!(_, _binding, _counter, _exprs, _params, kind, original) do
raise ArgumentError, "expected a keyword list in `#{kind}`, got: `#{inspect original}`"
end
defp to_field(binding, field),
do: {{:., [], [{:&, [], [binding]}, field]}, [], []}
defp to_escaped_field(binding, field),
do: {:{}, [], [{:{}, [], [:., [], [{:{}, [], [:&, [], [binding]]}, field]]}, [], []]}
end
| 38.785235 | 112 | 0.602872 |
08fb8cfc48f5c7b1fcd32593f754184c2627da82 | 87,141 | ex | Elixir | lib/ash/changeset/changeset.ex | tlietz/ash | 12cfe0d8486b4bf9d750f428b43d8adb87d5c964 | [
"MIT"
] | null | null | null | lib/ash/changeset/changeset.ex | tlietz/ash | 12cfe0d8486b4bf9d750f428b43d8adb87d5c964 | [
"MIT"
] | null | null | null | lib/ash/changeset/changeset.ex | tlietz/ash | 12cfe0d8486b4bf9d750f428b43d8adb87d5c964 | [
"MIT"
] | null | null | null | defmodule Ash.Changeset do
@moduledoc """
Changesets are used to create and update data in Ash.
Create a changeset with `new/1` or `new/2`, and alter the attributes
and relationships using the functions provided in this module. Nothing in this module
actually incurs changes in a data layer. To commit a changeset, see `c:Ash.Api.create/2`
and `c:Ash.Api.update/2`.
For example:
```elixir
Ash.Changeset.replace_relationship(changeset, :linked_tickets, [
{1, %{link_type: "blocking"}},
{a_ticket, %{link_type: "caused_by"}},
{%{id: 2}, %{link_type: "related_to"}}
])
```
`Ash.Changeset.replace_relationship/3`, `Ash.Changeset.append_to_relationship/3` and `Ash.Changeset.remove_from_relationship/3`
are simply about managing what data is/isn't related. A simple example might be updating the *tags* of a post, where all the tags
already exist, we simply want to edit the information. They are shorthands for calling `Ash.Changeset.manage_relationship/4` with
a specific set of options.
See the action DSL documentation for more.
"""
defstruct [
:data,
:action_type,
:action,
:resource,
:api,
:tenant,
:__validated_for_action__,
:handle_errors,
select: nil,
params: %{},
action_failed?: false,
arguments: %{},
context: %{},
defaults: [],
after_action: [],
before_action: [],
errors: [],
valid?: true,
attributes: %{},
relationships: %{},
change_dependencies: []
]
defimpl Inspect do
import Inspect.Algebra
@spec inspect(Ash.Changeset.t(), Inspect.Opts.t()) ::
{:doc_cons, :doc_line | :doc_nil | binary | tuple,
:doc_line | :doc_nil | binary | tuple}
| {:doc_group,
:doc_line
| :doc_nil
| binary
| {:doc_collapse, pos_integer}
| {:doc_force, any}
| {:doc_break | :doc_color | :doc_cons | :doc_fits | :doc_group | :doc_string, any,
any}
| {:doc_nest, any, :cursor | :reset | non_neg_integer, :always | :break},
:inherit | :self}
def inspect(changeset, opts) do
context = Map.delete(changeset.context, :private)
context =
if context == %{} do
empty()
else
concat("context: ", to_doc(sanitize_context(context), opts))
end
tenant =
if changeset.tenant do
concat("tenant: ", to_doc(changeset.tenant, opts))
else
empty()
end
api =
if changeset.api do
concat("api: ", to_doc(changeset.api, opts))
else
empty()
end
container_doc(
"#Ash.Changeset<",
[
api,
concat("action_type: ", inspect(changeset.action_type)),
concat("action: ", inspect(changeset.action && changeset.action.name)),
tenant,
concat("attributes: ", to_doc(changeset.attributes, opts)),
concat("relationships: ", to_doc(changeset.relationships, opts)),
arguments(changeset, opts),
concat("errors: ", to_doc(changeset.errors, opts)),
concat("data: ", to_doc(changeset.data, opts)),
context,
concat("valid?: ", to_doc(changeset.valid?, opts))
],
">",
opts,
fn str, _ -> str end
)
end
defp sanitize_context(%{manage_relationship_source: manage_relationship_source} = context) do
sanitized_managed_relationship_source =
manage_relationship_source
|> Enum.reverse()
|> Enum.map_join(" -> ", fn {resource, rel, _} ->
"#{inspect(resource)}.#{rel}"
end)
%{
context
| manage_relationship_source:
"#manage_relationship_source<#{sanitized_managed_relationship_source}>"
}
end
defp sanitize_context(context), do: context
defp arguments(changeset, opts) do
if changeset.action do
if Enum.empty?(changeset.action.arguments) do
empty()
else
arg_string =
changeset.action.arguments
|> Enum.filter(fn argument ->
match?({:ok, _}, Ash.Changeset.fetch_argument(changeset, argument.name))
end)
|> Map.new(fn argument ->
if argument.sensitive? do
{argument.name, "**redacted**"}
else
{argument.name, Ash.Changeset.get_argument(changeset, argument.name)}
end
end)
|> to_doc(opts)
concat(["arguments: ", arg_string])
end
else
empty()
end
end
end
@type t :: %__MODULE__{data: Ash.Resource.record()}
alias Ash.Error.{
Changes.InvalidArgument,
Changes.InvalidAttribute,
Changes.InvalidChanges,
Changes.InvalidRelationship,
Changes.NoSuchAttribute,
Changes.NoSuchRelationship,
Changes.Required,
Invalid.NoSuchResource
}
require Ash.Query
@doc """
Return a changeset over a resource or a record. `params` can be either attributes, relationship values or arguments.
If you are using external input, you almost certainly want to use `Ash.Changeset.for_<action_type>`. However, you can
use `Ash.Changeset.new/2` to start a changeset and make a few changes prior to calling `for_action`. For example:
```elixir
Ash.Changeset.new()
|> Ash.Changeset.change_attribute(:name, "foobar")
|> Ash.Changeset.for_action(...)
```
Anything that is modified prior to `for_action` is validated against the rules of the action, while *anything after it is not*.
This changeset does not consider an action, and so allows you to change things with minimal validation. Values are
validated when changed, and the existence of attributes and relationships are validated. If you want to essentially
"run an action", and get back a changeset with any errors that would be generated by that action (with the exception
of errors that can only be generated by the data layer), use `for_action/4`.
Additionally, this format only supports supplying attributes in the params. This is because we don't know what the
behavior should be for relationship changes, nor what arguments are available. You can manage them yourself with
the functions that allow managing arguments/relationships that are provided in this module, e.g `set_argument/3` and
`replace_relationship/3`
"""
@spec new(Ash.Resource.t() | Ash.Resource.record(), params :: map) :: t
def new(resource, params \\ %{})
def new(%resource{} = record, params) do
tenant =
record
|> Map.get(:__metadata__, %{})
|> Map.get(:tenant, nil)
context = Ash.Resource.Info.default_context(resource) || %{}
if Ash.Resource.Info.resource?(resource) do
%__MODULE__{resource: resource, data: record, action_type: :update}
|> change_attributes(params)
|> set_context(context)
|> set_tenant(tenant)
else
%__MODULE__{
resource: resource,
action_type: :update,
data: struct(resource)
}
|> add_error(NoSuchResource.exception(resource: resource))
|> set_tenant(tenant)
|> set_context(context)
end
end
def new(resource, params) do
if Ash.Resource.Info.resource?(resource) do
%__MODULE__{
resource: resource,
action_type: :create,
data: struct(resource)
}
|> change_attributes(params)
else
%__MODULE__{resource: resource, action_type: :create, data: struct(resource)}
|> add_error(NoSuchResource.exception(resource: resource))
end
end
@doc """
Ensure that only the specified attributes are present in the results.
The first call to `select/2` will replace the default behavior of selecting
all attributes. Subsequent calls to `select/2` will combine the provided
fields unless the `replace?` option is provided with a value of `true`.
  If a field has been deselected, selecting it again will override that (because a single list of fields is tracked for selection).
  Primary key attributes are always selected and cannot be deselected.
When attempting to load a relationship (or manage it with `Ash.Changeset.manage_relationship/3`),
if the source field is not selected on the query/provided data an error will be produced. If loading
a relationship with a query, an error is produced if the query does not select the destination field
of the relationship.
  Data layers are currently not notified of the `select` for a changeset (unlike queries), and creates/updates select all fields when they are performed.
A select provided on a changeset simply sets the unselected fields to `nil` before returning the result.
Use `ensure_selected/2` if you simply wish to make sure a field has been selected, without deselecting any other fields.
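  For example, with a hypothetical `Post` resource:

  ```elixir
  Post
  |> Ash.Changeset.new(%{title: "foo"})
  |> Ash.Changeset.select([:title])
  # After the action runs, only :title (plus primary key and private
  # attributes) is populated; other public attributes are set to nil.
  ```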
"""
def select(changeset, fields, opts \\ []) do
if opts[:replace?] do
%{changeset | select: Enum.uniq(List.wrap(fields))}
else
%{changeset | select: Enum.uniq(List.wrap(fields) ++ (changeset.select || []))}
end
end
@doc """
Ensures that the given attributes are selected.
The first call to `select/2` will *limit* the fields to only the provided fields.
Use `ensure_selected/2` to say "select this field (or these fields) without deselecting anything else".
See `select/2` for more.
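  For example:

  ```elixir
  # Keeps whatever is already selected and additionally selects :updated_at
  Ash.Changeset.ensure_selected(changeset, :updated_at)
  ```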
"""
def ensure_selected(changeset, fields) do
if changeset.select do
Ash.Changeset.select(changeset, List.wrap(fields))
else
to_select =
changeset.resource
|> Ash.Resource.Info.attributes()
|> Enum.map(& &1.name)
Ash.Changeset.select(changeset, to_select)
end
end
@doc """
  Ensure that the specified attributes are `nil` in the changeset results.
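  For example, assuming a resource with a `:body` attribute:

  ```elixir
  # :body will be nil on the result, regardless of its stored value
  Ash.Changeset.deselect(changeset, :body)
  ```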
"""
def deselect(changeset, fields) do
select =
if changeset.select do
changeset.select
else
changeset.resource
|> Ash.Resource.Info.attributes()
|> Enum.map(& &1.name)
end
select = select -- List.wrap(fields)
select(changeset, select, replace?: true)
end
def selecting?(changeset, field) do
case changeset.select do
nil ->
not is_nil(Ash.Resource.Info.attribute(changeset.resource, field))
select ->
if field in select do
true
else
attribute = Ash.Resource.Info.attribute(changeset.resource, field)
attribute && (attribute.private? || attribute.primary_key?)
end
end
end
@manage_types [:replace, :append, :remove, :direct_control, :create]
@for_create_opts [
relationships: [
type: :any,
doc: """
customize relationship behavior.
By default, any relationships are ignored. There are three ways to change relationships with this function:
### Action Arguments (preferred)
Create an argument on the action and add a `Ash.Resource.Change.Builtins.manage_relationship/3` change to the action.
### Overrides
You can pass the `relationships` option to specify the behavior. It is a keyword list of relationship and either
* one of the preset manage types: #{inspect(@manage_types)}
* explicit options, in the form of `{:manage, [...opts]}`
```elixir
Ash.Changeset.for_create(MyResource, :create, params, relationships: [relationship: :append, other_relationship: {:manage, [...opts]}])
```
You can also use explicit calls to `manage_relationship/4`.
"""
],
    require?: [
      type: :boolean,
      default: true,
      doc:
        "If set to `false`, values are only required when the action is run (instead of immediately)."
    ],
actor: [
type: :any,
doc:
"set the actor, which can be used in any `Ash.Resource.Change`s configured on the action. (in the `context` argument)"
],
tenant: [
type: :any,
doc: "set the tenant on the changeset"
]
]
@doc false
def for_create_opts, do: @for_create_opts
@doc """
Constructs a changeset for a given create action, and validates it.
Anything that is modified prior to `for_create/4` is validated against the rules of the action, while *anything after it is not*.
This runs any `change`s contained on your action. To have your logic execute *only* during the action, you can use `after_action/2`
or `before_action/2`.
Multitenancy is *not* validated until an action is called. This allows you to avoid specifying a tenant until just before calling
the api action.
### Params
`params` may be attributes, relationships, or arguments. You can safely pass user/form input directly into this function.
Only public attributes and relationships are supported. If you want to change private attributes as well, see the
  Customization section below. `params` are stored directly as given in the `params` field of the changeset.
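  For example, with a hypothetical `Post` resource and api module `MyApi`:

  ```elixir
  Post
  |> Ash.Changeset.for_create(:create, %{title: "foo", body: "bar"})
  |> MyApi.create!()
  ```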
### Opts
#{Ash.OptionsHelpers.docs(@for_create_opts)}
### Customization
A changeset can be provided as the first argument, instead of a resource, to allow
setting specific attributes ahead of time.
For example:
MyResource
|> Ash.Changeset.new()
|> Ash.Changeset.change_attribute(:foo, 1)
|> Ash.Changeset.for_create(:create, ...opts)
Once a changeset has been validated by `for_create/4` (or `for_update/4`), it isn't validated again in the action.
New changes added are validated individually, though. This allows you to create a changeset according
to a given action, and then add custom changes if necessary.
"""
def for_create(initial, action, params \\ %{}, opts \\ []) do
changeset =
case initial do
%__MODULE__{action_type: :create} = changeset ->
changeset
resource when is_atom(resource) ->
new(resource)
other ->
raise ArgumentError,
message: """
Initial must be a changeset with the action type of `:create`, or a resource.
Got: #{inspect(other)}
"""
end
for_action(changeset, action, params, opts)
end
@doc """
Constructs a changeset for a given update action, and validates it.
Anything that is modified prior to `for_update/4` is validated against the rules of the action, while *anything after it is not*.
See `for_create/4` for more information
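  For example, with a hypothetical `post` record and api module `MyApi`:

  ```elixir
  post
  |> Ash.Changeset.for_update(:update, %{title: "new title"})
  |> MyApi.update!()
  ```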
"""
def for_update(initial, action, params \\ %{}, opts \\ []) do
changeset =
case initial do
# We accept :destroy here to support soft deletes
%__MODULE__{action_type: type} = changeset when type in [:update, :destroy] ->
changeset
%mod{} = struct when mod != __MODULE__ ->
new(struct)
other ->
raise ArgumentError,
message: """
Initial must be a changeset with the action type of `:update` or `:destroy`, or a record.
Got: #{inspect(other)}
"""
end
for_action(changeset, action, params, opts)
end
@doc """
Constructs a changeset for a given destroy action, and validates it.
### Opts
* `:actor` - set the actor, which can be used in any `Ash.Resource.Change`s configured on the action. (in the `context` argument)
* `:tenant` - set the tenant on the changeset
Anything that is modified prior to `for_destroy/4` is validated against the rules of the action, while *anything after it is not*.
Once a changeset has been validated by `for_destroy/4`, it isn't validated again in the action.
New changes added are validated individually, though. This allows you to create a changeset according
to a given action, and then add custom changes if necessary.
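  For example, with a hypothetical `post` record and api module `MyApi`:

  ```elixir
  post
  |> Ash.Changeset.for_destroy(:destroy)
  |> MyApi.destroy!()
  ```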
"""
def for_destroy(initial, action_name, params \\ %{}, opts \\ []) do
changeset =
case initial do
%__MODULE__{} = changeset ->
changeset
|> Map.put(:action_type, :destroy)
%_{} = struct ->
struct
|> new()
|> Map.put(:action_type, :destroy)
other ->
raise ArgumentError,
message: """
Initial must be a changeset with the action type of `:destroy`, or a record.
Got: #{inspect(other)}
"""
end
if changeset.valid? do
action = Ash.Resource.Info.action(changeset.resource, action_name, changeset.action_type)
if action do
if action.soft? do
for_action(%{changeset | action_type: :destroy}, action.name, params, opts)
else
changeset
|> handle_errors(action.error_handler)
|> set_actor(opts)
|> set_tenant(opts[:tenant] || changeset.tenant)
|> Map.put(:__validated_for_action__, action.name)
|> Map.put(:action, action)
|> cast_params(action, params, opts)
|> set_argument_defaults(action)
|> run_action_changes(action, opts[:actor])
|> add_validations()
|> mark_validated(action.name)
|> require_arguments(action)
end
else
raise_no_action(changeset.resource, action_name, :destroy)
end
else
changeset
end
end
defp for_action(changeset, action_name, params, opts) do
if changeset.valid? do
action = Ash.Resource.Info.action(changeset.resource, action_name, changeset.action_type)
if action do
changeset =
changeset
|> handle_errors(action.error_handler)
|> set_actor(opts)
|> set_tenant(opts[:tenant] || changeset.tenant)
|> Map.put(:action, action)
|> Map.put(:__validated_for_action__, action.name)
|> cast_params(action, params || %{}, opts)
|> set_argument_defaults(action)
|> validate_attributes_accepted(action)
|> require_values(action.type, false, action.require_attributes)
|> run_action_changes(action, opts[:actor])
|> set_defaults(changeset.action_type, false)
|> add_validations()
|> mark_validated(action.name)
|> require_arguments(action)
if Keyword.get(opts, :require?, true) do
require_values(changeset, action.type)
else
changeset
end
else
raise_no_action(changeset.resource, action_name, changeset.action_type)
end
else
changeset
end
end
defp require_arguments(changeset, action) do
changeset
|> do_require_arguments(action)
end
defp do_require_arguments(changeset, action) do
action.arguments
|> Enum.filter(&(&1.allow_nil? == false))
|> Enum.reduce(changeset, fn argument, changeset ->
case fetch_argument(changeset, argument.name) do
{:ok, value} when not is_nil(value) ->
changeset
_ ->
add_error(
changeset,
Ash.Error.Changes.Required.exception(
resource: changeset.resource,
field: argument.name,
type: :argument
)
)
end
end)
end
defp set_argument_defaults(changeset, action) do
Enum.reduce(action.arguments, changeset, fn argument, changeset ->
case fetch_argument(changeset, argument.name) do
:error ->
if is_nil(argument.default) do
changeset
else
%{
changeset
| arguments: Map.put(changeset.arguments, argument.name, default(:create, argument))
}
end
_ ->
changeset
end
end)
end
defp set_actor(changeset, opts) do
if Keyword.has_key?(opts, :actor) do
put_context(changeset, :private, %{actor: opts[:actor]})
else
changeset
end
end
defp raise_no_action(resource, action, type) do
available_actions =
resource
|> Ash.Resource.Info.actions()
|> Enum.filter(&(&1.type == type))
|> Enum.map_join("\n", &" - `#{inspect(&1.name)}`")
raise ArgumentError,
message: """
No such #{type} action on resource #{inspect(resource)}: #{String.slice(inspect(action), 0..50)}
Example Call:
Ash.Changeset.for_#{type}(changeset_or_record, :action_name, input, options)
Available #{type} actions:
#{available_actions}
"""
end
defp mark_validated(changeset, action_name) do
%{changeset | __validated_for_action__: action_name}
end
@doc false
def validate_multitenancy(changeset) do
if Ash.Resource.Info.multitenancy_strategy(changeset.resource) &&
not Ash.Resource.Info.multitenancy_global?(changeset.resource) &&
is_nil(changeset.tenant) do
add_error(
changeset,
"#{inspect(changeset.resource)} changesets require a tenant to be specified"
)
else
changeset
end
end
defp cast_params(changeset, action, params, opts) do
changeset = %{
changeset
| params: Map.merge(changeset.params || %{}, Enum.into(params || %{}, %{}))
}
Enum.reduce(params, changeset, fn {name, value}, changeset ->
cond do
has_argument?(action, name) ->
set_argument(changeset, name, value)
attr = Ash.Resource.Info.public_attribute(changeset.resource, name) ->
if attr.writable? do
change_attribute(changeset, attr.name, value)
else
changeset
end
rel = Ash.Resource.Info.public_relationship(changeset.resource, name) ->
if rel.writable? do
behaviour = opts[:relationships][rel.name]
case behaviour do
nil ->
changeset
type when is_atom(type) ->
manage_relationship(changeset, rel.name, value, type: type)
{:manage, manage_opts} ->
manage_relationship(changeset, rel.name, value, manage_opts)
end
else
changeset
end
true ->
changeset
end
end)
end
defp has_argument?(action, name) when is_atom(name) do
Enum.any?(action.arguments, &(&1.private? == false && &1.name == name))
end
defp has_argument?(action, name) when is_binary(name) do
Enum.any?(action.arguments, &(&1.private? == false && to_string(&1.name) == name))
end
defp validate_attributes_accepted(changeset, %{accept: nil}), do: changeset
defp validate_attributes_accepted(changeset, %{accept: accepted_attributes}) do
changeset.attributes
|> Enum.reject(fn {key, _value} ->
key in accepted_attributes
end)
|> Enum.reduce(changeset, fn {key, _}, changeset ->
add_error(
changeset,
InvalidAttribute.exception(field: key, message: "cannot be changed")
)
end)
end
defp run_action_changes(changeset, %{changes: changes}, actor) do
changes = changes ++ Ash.Resource.Info.changes(changeset.resource, changeset.action_type)
Enum.reduce(changes, changeset, fn
%{only_when_valid?: true}, %{valid?: false} = changeset ->
changeset
%{change: {module, opts}, where: where}, changeset ->
if Enum.all?(where || [], fn {module, opts} ->
module.validate(changeset, opts) == :ok
end) do
{:ok, opts} = module.init(opts)
module.change(changeset, opts, %{actor: actor})
else
changeset
end
%{validation: _} = validation, changeset ->
validate(changeset, validation)
end)
end
@doc false
def set_defaults(changeset, action_type, lazy? \\ false)
def set_defaults(changeset, :create, lazy?) do
changeset.resource
|> Ash.Resource.Info.attributes()
|> Enum.filter(fn attribute ->
not is_nil(attribute.default) &&
(lazy? or
not (is_function(attribute.default) or match?({_, _, _}, attribute.default)))
end)
|> Enum.reduce(changeset, fn attribute, changeset ->
changeset
|> force_change_new_attribute_lazy(attribute.name, fn ->
default(:create, attribute)
end)
|> Map.update!(:defaults, fn defaults ->
[attribute.name | defaults]
end)
end)
|> Map.update!(:defaults, &Enum.uniq/1)
end
def set_defaults(changeset, :update, lazy?) do
changeset.resource
|> Ash.Resource.Info.attributes()
|> Enum.filter(fn attribute ->
not is_nil(attribute.update_default) &&
(lazy? or
not (is_function(attribute.update_default) or
match?({_, _, _}, attribute.update_default)))
end)
|> Enum.reduce(changeset, fn attribute, changeset ->
changeset
|> force_change_new_attribute_lazy(attribute.name, fn ->
default(:update, attribute)
end)
|> Map.update!(:defaults, fn defaults ->
[attribute.name | defaults]
end)
end)
|> Map.update!(:defaults, &Enum.uniq/1)
end
def set_defaults(changeset, _, _) do
changeset
end
defp default(:create, %{default: {mod, func, args}}), do: apply(mod, func, args)
defp default(:create, %{default: function}) when is_function(function, 0), do: function.()
defp default(:create, %{default: value}), do: value
defp default(:update, %{update_default: {mod, func, args}}), do: apply(mod, func, args)
defp default(:update, %{update_default: function}) when is_function(function, 0),
do: function.()
defp default(:update, %{update_default: value}), do: value
defp add_validations(changeset) do
changeset.resource
    # We use the `changeset.action_type` to support soft deletes,
    # which run as updates but keep the `:destroy` action type
|> Ash.Resource.Info.validations(changeset.action_type)
|> Enum.reduce(changeset, &validate(&2, &1))
end
defp validate(changeset, validation) do
if validation.before_action? do
before_action(changeset, fn changeset ->
if validation.expensive? and not changeset.valid? do
changeset
else
do_validation(changeset, validation)
end
end)
else
if validation.expensive? and not changeset.valid? do
changeset
else
do_validation(changeset, validation)
end
end
end
defp do_validation(changeset, validation) do
if Enum.all?(validation.where || [], fn {module, opts} ->
module.validate(changeset, opts) == :ok
end) do
case validation.module.validate(changeset, validation.opts) do
:ok ->
changeset
{:error, error} when is_binary(error) ->
Ash.Changeset.add_error(changeset, validation.message || error)
{:error, error} when is_exception(error) ->
if validation.message do
error =
case error do
%{field: field} when not is_nil(field) ->
error
|> Map.take([:field, :vars])
|> Map.to_list()
|> Keyword.put(:message, validation.message)
|> InvalidAttribute.exception()
%{fields: fields} when fields not in [nil, []] ->
error
|> Map.take([:fields, :vars])
|> Map.to_list()
|> Keyword.put(:message, validation.message)
|> InvalidChanges.exception()
_ ->
validation.message
end
Ash.Changeset.add_error(changeset, error)
else
Ash.Changeset.add_error(changeset, error)
end
{:error, error} ->
error =
if Keyword.keyword?(error) do
Keyword.put(error, :message, validation.message || error[:message])
else
validation.message || error
end
Ash.Changeset.add_error(changeset, error)
end
else
changeset
end
end
@doc false
def require_values(changeset, action_type, private_and_belongs_to? \\ false, attrs \\ nil)
def require_values(changeset, :create, private_and_belongs_to?, attrs) do
attributes =
attrs ||
attributes_to_require(changeset.resource, changeset.action, private_and_belongs_to?)
Enum.reduce(attributes, changeset, fn required_attribute, changeset ->
if is_atom(required_attribute) do
if is_nil(get_attribute(changeset, required_attribute)) do
add_error(
changeset,
Required.exception(
resource: changeset.resource,
field: required_attribute,
type: :attribute
)
)
else
changeset
end
else
if private_and_belongs_to? || changing_attribute?(changeset, required_attribute.name) do
if is_nil(get_attribute(changeset, required_attribute.name)) do
add_error(
changeset,
Required.exception(
resource: changeset.resource,
field: required_attribute.name,
type: :attribute
)
)
else
changeset
end
else
if is_nil(required_attribute.default) do
add_error(
changeset,
Required.exception(
resource: changeset.resource,
field: required_attribute.name,
type: :attribute
)
)
else
changeset
end
end
end
end)
end
def require_values(changeset, :update, private_and_belongs_to?, attrs) do
attributes =
attrs ||
attributes_to_require(changeset.resource, changeset.action, private_and_belongs_to?)
Enum.reduce(attributes, changeset, fn required_attribute, changeset ->
if is_atom(required_attribute) do
if is_nil(get_attribute(changeset, required_attribute)) do
add_error(
changeset,
Required.exception(
resource: changeset.resource,
field: required_attribute,
type: :attribute
)
)
else
changeset
end
else
if changing_attribute?(changeset, required_attribute.name) do
if is_nil(get_attribute(changeset, required_attribute.name)) do
add_error(
changeset,
Required.exception(
resource: changeset.resource,
field: required_attribute.name,
type: :attribute
)
)
else
changeset
end
else
changeset
end
end
end)
end
def require_values(changeset, _, _, _), do: changeset
# Attributes that are private and/or are the source field of a belongs_to relationship
# are typically not set by input, so they aren't required until the actual action
# is run.
defp attributes_to_require(resource, _action, true = _private_and_belongs_to?) do
resource
|> Ash.Resource.Info.attributes()
|> Enum.reject(&(&1.allow_nil? || &1.generated?))
end
defp attributes_to_require(resource, action, false = _private_and_belongs_to?) do
belongs_to =
resource
|> Ash.Resource.Info.relationships()
|> Enum.filter(&(&1.type == :belongs_to))
|> Enum.map(& &1.source_field)
action =
case action do
action when is_atom(action) ->
Ash.Resource.Info.action(resource, action)
_ ->
action
end
allow_nil_input =
case action do
%{allow_nil_input: allow_nil_input} ->
allow_nil_input
_ ->
[]
end
masked_argument_names = Enum.map(action.arguments, & &1.name)
resource
|> Ash.Resource.Info.attributes()
|> Enum.reject(
&(&1.allow_nil? || &1.private? || !&1.writable? || &1.generated? ||
&1.name in masked_argument_names || &1.name in belongs_to ||
&1.name in allow_nil_input)
)
end
@doc """
Wraps a function in the before/after action hooks of a changeset.
The function takes a changeset and if it returns
`{:ok, result}`, the result will be passed through the after
action hooks.
"""
@spec with_hooks(
t(),
(t() ->
{:ok, term, %{notifications: list(Ash.Notifier.Notification.t())}}
| {:error, term})
) ::
{:ok, term, t(), %{notifications: list(Ash.Notifier.Notification.t())}} | {:error, term}
def with_hooks(%{valid?: false} = changeset, _func) do
{:error, changeset.errors}
end
def with_hooks(changeset, func) do
{changeset, %{notifications: before_action_notifications}} =
Enum.reduce_while(
changeset.before_action,
{changeset, %{notifications: []}},
fn before_action, {changeset, instructions} ->
case before_action.(changeset) do
{changeset, %{notifications: notifications}} ->
cont =
if changeset.valid? do
:cont
else
:halt
end
{cont,
{changeset,
%{
instructions
| notifications: instructions.notifications ++ List.wrap(notifications)
}}}
changeset ->
cont =
if changeset.valid? do
:cont
else
:halt
end
{cont, {changeset, instructions}}
end
end
)
if changeset.valid? do
case func.(changeset) do
{:ok, result, instructions} ->
run_after_actions(
result,
instructions[:new_changeset] || changeset,
List.wrap(instructions[:notifications]) ++ before_action_notifications
)
{:ok, result} ->
run_after_actions(result, changeset, before_action_notifications)
{:error, error} ->
{:error, error}
end
else
{:error, changeset.errors}
end
end
defp run_after_actions(result, changeset, before_action_notifications) do
Enum.reduce_while(
changeset.after_action,
{:ok, result, changeset, %{notifications: before_action_notifications}},
fn after_action, {:ok, result, changeset, %{notifications: notifications} = acc} ->
case after_action.(changeset, result) do
{:ok, new_result, new_notifications} ->
all_notifications =
Enum.map(notifications ++ new_notifications, fn notification ->
%{
notification
| resource: notification.resource || changeset.resource,
action:
notification.action ||
Ash.Resource.Info.action(
changeset.resource,
changeset.action,
changeset.action_type
),
data: notification.data || new_result,
changeset: notification.changeset || changeset,
actor: notification.actor || changeset.context[:private][:actor]
}
end)
{:cont, {:ok, new_result, changeset, %{acc | notifications: all_notifications}}}
{:ok, new_result} ->
{:cont, {:ok, new_result, changeset, acc}}
{:error, error} ->
{:halt, {:error, error}}
end
end
)
end
@doc "Gets the value of an argument provided to the changeset"
@spec get_argument(t, atom) :: term
def get_argument(changeset, argument) when is_atom(argument) do
Map.get(changeset.arguments, argument) || Map.get(changeset.arguments, to_string(argument))
end
  @doc "Fetches the value of an argument provided to the changeset, or returns `:error`"
@spec fetch_argument(t, atom) :: {:ok, term} | :error
def fetch_argument(changeset, argument) when is_atom(argument) do
case Map.fetch(changeset.arguments, argument) do
{:ok, value} ->
{:ok, value}
:error ->
case Map.fetch(changeset.arguments, to_string(argument)) do
{:ok, value} -> {:ok, value}
:error -> :error
end
end
end
@doc "Gets the changing value or the original value of an attribute"
@spec get_attribute(t, atom) :: term
def get_attribute(changeset, attribute) do
case fetch_change(changeset, attribute) do
{:ok, value} ->
value
:error ->
get_data(changeset, attribute)
end
end
@doc "Gets the value of an argument provided to the changeset, falling back to `Ash.Changeset.get_attribute/2` if nothing was provided"
@spec get_argument_or_attribute(t, atom) :: term
def get_argument_or_attribute(changeset, attribute) do
case fetch_argument(changeset, attribute) do
{:ok, value} -> value
:error -> get_attribute(changeset, attribute)
end
end
@doc "Gets the new value for an attribute, or `:error` if it is not being changed"
@spec fetch_change(t, atom) :: {:ok, any} | :error
def fetch_change(changeset, attribute) do
Map.fetch(changeset.attributes, attribute)
end
@doc "Gets the value of an argument provided to the changeset, falling back to `Ash.Changeset.fetch_change/2` if nothing was provided"
@spec fetch_argument_or_change(t, atom) :: {:ok, any} | :error
def fetch_argument_or_change(changeset, attribute) do
case fetch_argument(changeset, attribute) do
{:ok, value} -> {:ok, value}
:error -> fetch_change(changeset, attribute)
end
end
@doc "Gets the original value for an attribute"
@spec get_data(t, atom) :: term
def get_data(changeset, attribute) do
Map.get(changeset.data, attribute)
end
@doc """
Puts a key/value in the changeset context that can be used later
Do not use the `private` key in your custom context, as that is reserved for internal use.
"""
@spec put_context(t(), atom, term) :: t()
def put_context(changeset, key, value) do
set_context(changeset, %{key => value})
end
@spec set_tenant(t(), String.t()) :: t()
def set_tenant(changeset, tenant) do
%{changeset | tenant: tenant}
end
@doc """
Deep merges the provided map into the changeset context that can be used later
Do not use the `private` key in your custom context, as that is reserved for internal use.
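  Because the merge is deep, nested maps are combined rather than replaced:

  ```elixir
  changeset
  |> Ash.Changeset.set_context(%{shared: %{locale: "en"}})
  |> Ash.Changeset.set_context(%{shared: %{source: "import"}})
  # changeset.context now includes %{shared: %{locale: "en", source: "import"}}
  ```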
"""
@spec set_context(t(), map | nil) :: t()
def set_context(changeset, nil), do: changeset
def set_context(changeset, map) do
%{changeset | context: Ash.Helpers.deep_merge_maps(changeset.context, map)}
end
@type manage_relationship_type :: :replace | :append | :remove | :direct_control
@spec manage_relationship_opts(manage_relationship_type()) :: Keyword.t()
def manage_relationship_opts(:replace) do
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :unrelate
]
end
def manage_relationship_opts(:append) do
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :ignore
]
end
def manage_relationship_opts(:remove) do
[
on_no_match: :error,
on_match: :unrelate,
on_missing: :ignore
]
end
def manage_relationship_opts(:create) do
[
on_no_match: :create,
on_match: :ignore
]
end
def manage_relationship_opts(:direct_control) do
[
on_lookup: :ignore,
on_no_match: :create,
on_match: :update,
on_missing: :destroy
]
end
@manage_opts [
type: [
type: {:one_of, @manage_types},
doc: """
If the `type` is specified, the default values of each option is modified to match that `type` of operation.
This allows for specifying certain operations much more succinctly. The defaults that are modified are listed below
## `:replace`
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :unrelate
]
## `:append`
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :ignore
]
## `:remove`
[
on_no_match: :error,
on_match: :unrelate,
on_missing: :ignore
]
## `:direct_control`
[
on_lookup: :ignore,
on_no_match: :create,
on_match: :update,
on_missing: :destroy
]
## `:create`
[
on_no_match: :create,
on_match: :ignore
]
"""
],
authorize?: [
type: :boolean,
default: true,
doc:
"Authorize reads and changes to the destination records, if the primary change is being authorized as well."
],
eager_validate_with: [
type: :atom,
default: false,
doc:
"Validates that any referenced entities exist *before* the action is being performed, using the provided api for the read."
],
on_no_match: [
type: :any,
default: :ignore,
doc: """
Instructions for handling records where no matching record existed in the relationship.
* `:ignore` (default) - those inputs are ignored
* `:match` - For "has_one" and "belongs_to" only, any input is treated as a match for an existing value. For has_many and many_to_many, this is the same as :ignore.
* `:create` - the records are created using the destination's primary create action
* `{:create, :action_name}` - the records are created using the specified action on the destination resource
* `{:create, :action_name, :join_table_action_name, [:list, :of, :join_table, :params]}` - Same as `{:create, :action_name}` but takes
the list of params specified out and applies them when creating the join table row, with the provided join_table_action_name.
* `:error` - an error is returned indicating that a record would have been created
* If `on_lookup` is set, and the data contained a primary key or identity, then the error is a `NotFound` error
* Otherwise, an `InvalidRelationship` error is returned
"""
],
on_lookup: [
type: :any,
default: :ignore,
doc: """
Before creating a record (because no match was found in the relationship), the record can be looked up and related.
* `:ignore` (default) - Does not look for existing entries (matches in the current relationship are still considered updates)
* `:relate` - Same as calling `{:relate, primary_action_name}`
* `{:relate, :action_name}` - the records are looked up by primary key/the first identity that is found (using the primary read action), and related. The action should be:
* many_to_many - a create action on the join resource
* has_many - an update action on the destination resource
* has_one - an update action on the destination resource
* belongs_to - an update action on the source resource
* `{:relate, :action_name, :read_action_name}` - Same as the above, but customizes the read action called to search for matches.
* `:relate_and_update` - Same as `:relate`, but the remaining parameters from the lookup are passed into the action that is used to change the relationship key
* `{:relate_and_update, :action_name}` - Same as the above, but customizes the action used. The action should be:
* many_to_many - a create action on the join resource
* has_many - an update action on the destination resource
* has_one - an update action on the destination resource
* belongs_to - an update action on the source resource
* `{:relate_and_update, :action_name, :read_action_name}` - Same as the above, but customizes the read action called to search for matches.
* `{:relate_and_update, :action_name, :read_action_name, [:list, :of, :join_table, :params]}` - Same as the above, but uses the provided list of parameters when creating
the join row (only relevant for many to many relationships). Use `:all` to *only* update the join table row, and pass all parameters to its action
"""
],
on_match: [
type: :any,
default: :ignore,
doc: """
Instructions for handling records where a matching record existed in the relationship already.
* `:ignore` (default) - those inputs are ignored
* `:update` - the record is updated using the destination's primary update action
* `{:update, :action_name}` - the record is updated using the specified action on the destination resource
* `{:update, :action_name, :join_table_action_name, [:list, :of, :params]}` - Same as `{:update, :action_name}` but takes
the list of params specified out and applies them as an update to the join table row (only valid for many to many).
* `{:destroy, :action_name}` - the record is destroyed using the specified action on the destination resource. The action should be:
* many_to_many - a destroy action on the join table
* has_many - a destroy action on the destination resource
* has_one - a destroy action on the destination resource
* belongs_to - a destroy action on the destination resource
* `:error` - an error is returned indicating that a record would have been updated
* `:no_match` - ignores the primary key match and follows the on_no_match instructions with these records instead.
* `:unrelate` - the related item is not destroyed, but the data is "unrelated", making this behave like `remove_from_relationship/3`. The action should be:
* many_to_many - the join resource row is destroyed
* has_many - the destination_field (on the related record) is set to `nil`
* has_one - the destination_field (on the related record) is set to `nil`
* belongs_to - the source_field (on this record) is set to `nil`
* `{:unrelate, :action_name}` - the record is unrelated using the provided update action. The action should be:
* many_to_many - a destroy action on the join resource
* has_many - an update action on the destination resource
* has_one - an update action on the destination resource
* belongs_to - an update action on the source resource
"""
],
on_missing: [
type: :any,
default: :ignore,
doc: """
Instructions for handling records that existed in the current relationship but not in the input.
* `:ignore` (default) - those inputs are ignored
* `:destroy` - the record is destroyed using the destination's primary destroy action
* `{:destroy, :action_name}` - the record is destroyed using the specified action on the destination resource
* `{:destroy, :action_name, :join_resource_action_name, [:join, :keys]}` - the record is destroyed using the specified action on the destination resource,
but first the join resource is destroyed with its specified action
* `:error` - an error is returned indicating that a record would have been updated
* `:unrelate` - the related item is not destroyed, but the data is "unrelated", making this behave like `remove_from_relationship/3`. The action should be:
* many_to_many - the join resource row is destroyed
* has_many - the destination_field (on the related record) is set to `nil`
* has_one - the destination_field (on the related record) is set to `nil`
* belongs_to - the source_field (on this record) is set to `nil`
* `{:unrelate, :action_name}` - the record is unrelated using the provided update action. The action should be:
* many_to_many - a destroy action on the join resource
* has_many - an update action on the destination resource
* has_one - an update action on the destination resource
* belongs_to - an update action on the source resource
"""
],
relationships: [
type: :any,
default: [],
doc: "A keyword list of instructions for nested relationships."
],
error_path: [
type: :any,
doc: """
By default, errors added to the changeset will use the path `[:relationship_name]`, or `[:relationship_name, <index>]`.
If you want to modify this path, you can specify `error_path`, e.g if had a `change` on an action that takes an argument
and uses that argument data to call `manage_relationship`, you may want any generated errors to appear under the name of that
argument, so you could specify `error_path: :argument_name` when calling `manage_relationship`.
"""
],
meta: [
type: :any,
doc:
"Freeform data that will be retained along with the options, which can be used to track/manage the changes that are added to the `relationships` key."
],
ignore?: [
type: :any,
default: false,
doc: """
This tells Ash to ignore the provided inputs when actually running the action. This can be useful for
building up a set of instructions that you intend to handle manually
"""
]
]
@doc false
def manage_relationship_schema, do: @manage_opts
@doc """
Manages the related records by creating, updating, or destroying them as necessary.
Keep in mind that the default values for all `on_*` are `:ignore`, meaning nothing will happen
unless you provide instructions.
The input provided to `manage_relationship` should be a map, in the case of to_one relationships, or a list of maps
in the case of to_many relationships. The following steps are followed for each input provided:
- The input is checked against the currently related records to find any matches. The primary key and unique identities are used to find matches.
- For any input that had a match in the current relationship, the `:on_match` behavior is triggered
- For any input that does not have a match:
- if there is `on_lookup` behavior:
- we try to find the record in the data layer.
- if the record is found, the on_lookup behavior is triggered
- if the record is not found, the `on_no_match` behavior is triggered
- if there is no `on_lookup` behavior:
- the `on_no_match` behavior is triggered
- finally, for any records present in the *current relationship* that had no match *in the input*, the `on_missing` behavior is triggered
## Options
#{Ash.OptionsHelpers.docs(@manage_opts)}
Each call to this function adds new records that will be handled according to their options. For example,
if you tracked "tags to add" and "tags to remove" in separate fields, you could input them like so:
```elixir
changeset
|> Ash.Changeset.manage_relationship(
:tags,
[%{name: "backend"}],
on_lookup: :relate, #relate that tag if it exists in the database
on_no_match: :error # error if a tag with that name doesn't exist
)
|> Ash.Changeset.manage_relationship(
:tags,
[%{name: "frontend"}],
on_no_match: :error, # error if a tag with that name doesn't exist in the relationship
on_match: :unrelate # if a tag with that name is related, unrelate it
)
```
When calling this multiple times with the `on_missing` option set, the list of records that are considered missing are checked
after each set of inputs is processed. For example, if you manage the relationship once with `on_missing: :unrelate`, the records
missing from your input will be removed, and *then* your next call to `manage_relationship` will be resolved (with those records unrelated).
For this reason, it is suggested that you don't call this function multiple times with an `on_missing` instruction, as you may be
surprised by the result.
If you want the input to update existing entities, you need to ensure that the primary key (or unique identity) is provided as
part of the input. See the example below:
changeset
|> Ash.Changeset.manage_relationship(
:comments,
[%{rating: 10, contents: "foo"}],
on_no_match: {:create, :create_action},
on_missing: :ignore
)
|> Ash.Changeset.manage_relationship(
:comments,
[%{id: 10, rating: 10, contents: "foo"}],
on_match: {:update, :update_action},
on_no_match: {:create, :create_action})
This is a simple way to manage a relationship. If you need custom behavior, you can customize the action that is
called, which allows you to add arguments/changes. However, at some point you may want to forego this function
and make the changes yourself. For example:
input = [%{id: 10, rating: 10, contents: "foo"}]
changeset
|> Ash.Changeset.after_action(fn _changeset, result ->
# An example of updating comments based on a result of other changes
for comment <- input do
comment = MyApi.get(Comment, comment.id)
comment
|> Map.update(:rating, 0, &(&1 * result.rating_weight))
|> MyApi.update!()
end
{:ok, result}
end)
## Using records as input
Records can be supplied as the input values. If you do:
* if it would be looked up due to `on_lookup`, the record is used as-is
* if it would be created due to `on_no_match`, the record is used as-is
* Instead of specifying `join_keys`, those keys must go in `__metadata__.join_keys`. If `join_keys` is specified in the options, it is ignored.
For example:
```elixir
post1 =
changeset
|> Api.create!()
|> Ash.Resource.Info.put_metadata(:join_keys, %{type: "a"})
post2 =
changeset2
|> Api.create!()
|> Ash.Resource.Info.put_metadata(:join_keys, %{type: "b"})
author = Api.create!(author_changeset)
Ash.Changeset.manage_relationship(
author,
:posts,
[post1, post2],
on_lookup: :relate
)
```
"""
def manage_relationship(changeset, relationship, input, opts \\ [])
def manage_relationship(changeset, relationship, "", opts) do
manage_relationship(changeset, relationship, nil, opts)
end
def manage_relationship(changeset, relationship, input, opts) do
inputs_was_list? = is_list(input)
manage_opts =
if opts[:type] do
defaults = manage_relationship_opts(opts[:type])
Enum.reduce(defaults, @manage_opts, fn {key, value}, manage_opts ->
Ash.OptionsHelpers.set_default!(manage_opts, key, value)
end)
else
@manage_opts
end
opts = Ash.OptionsHelpers.validate!(opts, manage_opts)
opts =
if Keyword.has_key?(opts[:meta] || [], :inputs_was_list?) do
opts
else
Keyword.update(
opts,
:meta,
[inputs_was_list?: inputs_was_list?],
&Keyword.put(&1, :inputs_was_list?, inputs_was_list?)
)
end
case Ash.Resource.Info.relationship(changeset.resource, relationship) do
nil ->
error =
NoSuchRelationship.exception(
resource: changeset.resource,
name: relationship
)
add_error(changeset, error)
%{manual: manual} = relationship when not is_nil(manual) ->
error =
InvalidRelationship.exception(
relationship: relationship.name,
message: "Cannot manage a manual relationship"
)
add_error(changeset, error)
%{cardinality: :one, type: type} = relationship when is_list(input) and length(input) > 1 ->
error =
InvalidRelationship.exception(
relationship: relationship.name,
message: "Cannot manage a #{type} relationship with a list of records"
)
add_error(changeset, error)
%{writable?: false} = relationship ->
error =
InvalidRelationship.exception(
relationship: relationship.name,
message: "Relationship is not editable"
)
add_error(changeset, error)
relationship ->
if relationship.cardinality == :many && is_map(input) && !is_struct(input) do
case map_input_to_list(input) do
{:ok, input} ->
manage_relationship(changeset, relationship.name, input, opts)
:error ->
manage_relationship(changeset, relationship.name, List.wrap(input), opts)
end
else
input =
changeset.resource
|> Ash.Resource.Info.related(relationship.name)
|> Ash.Resource.Info.primary_key()
|> case do
[key] ->
input
|> List.wrap()
|> Enum.map(fn input ->
if is_map(input) || is_list(input) do
input
else
%{key => input}
end
end)
_ ->
input
end
if Enum.any?(
List.wrap(input),
&(is_struct(&1) && &1.__struct__ != relationship.destination)
) do
add_error(
changeset,
InvalidRelationship.exception(
relationship: relationship.name,
message: "Cannot provide structs that don't match the destination"
)
)
else
relationships =
changeset.relationships
|> Map.put_new(relationship.name, [])
|> Map.update!(relationship.name, &(&1 ++ [{input, opts}]))
changeset = %{changeset | relationships: relationships}
if opts[:eager_validate_with] do
eager_validate_relationship_input(
relationship,
input,
changeset,
opts[:eager_validate_with],
opts[:error_path] || opts[:meta][:id] || relationship.name
)
else
changeset
end
end
end
end
end
defp eager_validate_relationship_input(_relationship, [], _changeset, _api, _error_path),
do: :ok
defp eager_validate_relationship_input(relationship, input, changeset, api, error_path) do
pkeys = Ash.Actions.ManagedRelationships.pkeys(relationship)
pkeys =
Enum.map(pkeys, fn pkey ->
Enum.map(pkey, fn key ->
Ash.Resource.Info.attribute(relationship.destination, key)
end)
end)
search =
Enum.reduce(input, false, fn item, expr ->
filter =
Enum.find_value(pkeys, fn pkey ->
this_filter =
pkey
|> Enum.reject(&(&1.name == relationship.destination_field))
|> Enum.all?(fn key ->
case fetch_identity_field(
item,
changeset.data,
key,
relationship
) do
{:ok, _value} ->
true
:error ->
false
end
end)
|> case do
true ->
case Map.take(
item,
Enum.map(pkey, & &1.name) ++ Enum.map(pkey, &to_string(&1.name))
) do
empty when empty == %{} -> nil
filter -> filter
end
false ->
nil
end
if Enum.any?(pkey, &(&1.name == relationship.destination_field)) &&
relationship.type in [:has_many, :has_one] do
destination_value = Map.get(changeset.data, relationship.source_field)
Ash.Query.expr(
^this_filter and
(is_nil(ref(^relationship.destination_field, [])) or
ref(^relationship.destination_field, []) == ^destination_value)
)
else
this_filter
end
end)
if filter && filter != %{} do
Ash.Query.expr(^expr or ^filter)
else
expr
end
end)
results =
if search == false do
{:ok, []}
else
action =
relationship.read_action ||
Ash.Resource.Info.primary_action!(relationship.destination, :read).name
relationship.destination
|> Ash.Query.for_read(action, %{},
actor: changeset.context[:private][:actor],
tenant: changeset.tenant
)
|> Ash.Query.limit(Enum.count(input))
|> Ash.Query.filter(^search)
|> api.read()
end
case results do
{:error, error} ->
{:error, error}
{:ok, results} ->
case Enum.find(input, fn item ->
no_pkey_all_matches(results, pkeys, fn result, key ->
case fetch_identity_field(item, changeset.data, key, relationship) do
{:ok, value} ->
Ash.Type.equal?(
key.type,
value,
Map.get(result, key.name)
)
:error ->
false
end
end)
end) do
nil ->
:ok
item ->
pkey_search =
Enum.find_value(pkeys, fn pkey ->
if Enum.all?(pkey, fn key ->
Map.has_key?(item, key.name) || Map.has_key?(item, to_string(key.name))
end) do
Map.take(item, Enum.map(pkey, & &1.name) ++ Enum.map(pkey, &to_string(&1.name)))
end
end)
{:error,
Ash.Error.Query.NotFound.exception(
primary_key: pkey_search,
resource: relationship.destination
)}
end
end
|> case do
:ok ->
changeset
{:error, error} ->
add_error(changeset, Ash.Error.set_path(error, error_path))
end
end
defp no_pkey_all_matches(results, pkeys, func) do
!Enum.any?(results, fn result ->
Enum.any?(pkeys, fn pkey ->
Enum.all?(pkey, fn key ->
func.(result, key)
end)
end)
end)
end
defp fetch_identity_field(item, data, attribute, relationship) do
if attribute.name == relationship.destination_field &&
relationship.type in [:has_many, :has_one] do
{:ok, Map.get(data, relationship.source_field)}
else
string_attribute = to_string(attribute.name)
if Map.has_key?(item, attribute.name) || Map.has_key?(item, string_attribute) do
input_value = Map.get(item, attribute.name) || Map.get(item, string_attribute)
case Ash.Type.cast_input(attribute.type, input_value, attribute.name) do
{:ok, casted_input_value} ->
{:ok, casted_input_value}
_ ->
:error
end
else
:error
end
end
end
defp map_input_to_list(input) when input == %{} do
:error
end
defp map_input_to_list(input) do
input
|> Enum.reduce_while({:ok, []}, fn
{key, value}, {:ok, acc} when is_integer(key) ->
{:cont, {:ok, [{key, value} | acc]}}
{key, value}, {:ok, acc} when is_binary(key) ->
case Integer.parse(key) do
{int, ""} ->
{:cont, {:ok, [{int, value} | acc]}}
_ ->
{:halt, :error}
end
_, _ ->
{:halt, :error}
end)
|> case do
{:ok, value} ->
{:ok,
value
|> Enum.sort_by(&elem(&1, 0))
|> Enum.map(&elem(&1, 1))}
:error ->
:error
end
end
@doc """
Appends a record or a list of records to a relationship.
Alias for:
```elixir
manage_relationship(changeset, relationship, input,
on_lookup: :relate, # If a record is not in the relationship, and can be found, relate it
on_no_match: :error, # If a record is not found in the relationship or the database, we error
on_match: :ignore, # If a record is found in the relationship we don't change it
on_missing: :ignore, # If a record in the relationship is not found in the input, we ignore it
)
```
Provide `opts` to customize/override the behavior.
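For example (a sketch; the `:tags` relationship, the tag ids, and `MyApi` are hypothetical):

```elixir
post
|> Ash.Changeset.new()
|> Ash.Changeset.append_to_relationship(:tags, [%{id: tag_one_id}, %{id: tag_two_id}])
|> MyApi.update!()
```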
"""
@spec append_to_relationship(
t,
atom,
Ash.Resource.record() | map | term | [Ash.Resource.record() | map | term],
Keyword.t()
) ::
t()
def append_to_relationship(changeset, relationship, record_or_records, opts \\ []) do
manage_relationship(
changeset,
relationship,
record_or_records,
Keyword.merge(
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :ignore,
authorize?: false
],
opts
)
)
end
@doc """
Removes a record or a list of records from a relationship.
Alias for:
```elixir
manage_relationship(changeset, relationship, record_or_records,
on_no_match: :error, # If a record is not found in the relationship, we error
on_match: :unrelate, # If a record is found in the relationship we unrelate it
on_missing: :ignore, # If a record in the relationship is not found in the input, we ignore it
authorize?: false
)
```
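For example (a sketch; the `:tags` relationship, `tag_id`, and `MyApi` are hypothetical):

```elixir
# Unrelates the matching tag, leaving any other related tags in place
post
|> Ash.Changeset.new()
|> Ash.Changeset.remove_from_relationship(:tags, [%{id: tag_id}])
|> MyApi.update!()
```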
"""
@spec remove_from_relationship(
t,
atom,
Ash.Resource.record() | map | term | [Ash.Resource.record() | map | term],
Keyword.t()
) ::
t()
def remove_from_relationship(changeset, relationship, record_or_records, opts \\ []) do
manage_relationship(
changeset,
relationship,
record_or_records,
Keyword.merge(
[
on_no_match: :error,
on_match: :unrelate,
on_missing: :ignore,
authorize?: false
],
opts
)
)
end
@doc """
Alias for:
```elixir
manage_relationship(
changeset,
relationship,
record_or_records,
on_lookup: :relate, # If a record is not found in the relationship, but is found in the database, relate it and apply the input as an update
on_no_match: :error, # If a record is not found in the relationship or the database, we error
on_match: :ignore, # If a record is found in the relationship we make no changes to it
on_missing: :unrelate, # If a record in the relationship is not found in the input, we unrelate it
authorize?: false
)
```
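For example (a sketch; the `:tags` relationship, the tag records, and `MyApi` are hypothetical):

```elixir
# After this update, the post is related to exactly these two tags:
# missing tags are unrelated, and any currently related tags are kept
post
|> Ash.Changeset.new()
|> Ash.Changeset.replace_relationship(:tags, [tag_one, tag_two])
|> MyApi.update!()
```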
"""
@spec replace_relationship(
t(),
atom(),
Ash.Resource.record() | map | term | [Ash.Resource.record() | map | term] | nil,
Keyword.t()
) :: t()
def replace_relationship(changeset, relationship, record_or_records, opts \\ []) do
manage_relationship(
changeset,
relationship,
record_or_records,
Keyword.merge(
[
on_lookup: :relate,
on_no_match: :error,
on_match: :ignore,
on_missing: :unrelate,
authorize?: false
],
opts
)
)
end
@doc "Returns true if any attributes on the resource are being changed."
@spec changing_attributes?(t()) :: boolean
def changing_attributes?(changeset) do
changeset.resource
|> Ash.Resource.Info.attributes()
|> Enum.any?(&changing_attribute?(changeset, &1.name))
end
@doc "Returns true if an attribute exists in the changes"
@spec changing_attribute?(t(), atom) :: boolean
def changing_attribute?(changeset, attribute) do
Map.has_key?(changeset.attributes, attribute)
end
@doc "Returns true if a relationship exists in the changes"
@spec changing_relationship?(t(), atom) :: boolean
def changing_relationship?(changeset, relationship) do
Map.has_key?(changeset.relationships, relationship)
end
@doc "Change an attribute only if it is not currently being changed"
@spec change_new_attribute(t(), atom, term) :: t()
def change_new_attribute(changeset, attribute, value) do
if changing_attribute?(changeset, attribute) do
changeset
else
change_attribute(changeset, attribute, value)
end
end
@doc """
Change an attribute if it is not currently being changed, by calling the provided function.
Use this if you want to perform some expensive calculation for an attribute value
only if there isn't already a change for that attribute.
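For example (a sketch; `expensive_default/0` is a hypothetical function):

```elixir
changeset
|> Ash.Changeset.change_new_attribute_lazy(:token, fn ->
  # only invoked when :token is not already being changed
  expensive_default()
end)
```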
"""
@spec change_new_attribute_lazy(t(), atom, (() -> any)) :: t()
def change_new_attribute_lazy(changeset, attribute, func) do
if changing_attribute?(changeset, attribute) do
changeset
else
change_attribute(changeset, attribute, func.())
end
end
@doc """
Add an argument to the changeset, which will be provided to the action
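For example (a sketch; the `:review` action and its `:approve?` argument are hypothetical and must be declared on the resource):

```elixir
post
|> Ash.Changeset.for_update(:review)
|> Ash.Changeset.set_argument(:approve?, true)
```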
"""
def set_argument(changeset, argument, value) do
if changeset.action do
argument =
Enum.find(
changeset.action.arguments,
&(&1.name == argument || to_string(&1.name) == argument)
)
if argument do
with {:ok, casted} <- cast_input(argument.type, value, argument.constraints, changeset),
{:constrained, {:ok, casted}, argument} when not is_nil(casted) <-
{:constrained,
Ash.Type.apply_constraints(argument.type, casted, argument.constraints),
argument} do
%{changeset | arguments: Map.put(changeset.arguments, argument.name, casted)}
else
{:constrained, {:ok, nil}, _argument} ->
%{changeset | arguments: Map.put(changeset.arguments, argument.name, nil)}
{:constrained, {:error, error}, argument} ->
changeset = %{
changeset
| arguments: Map.put(changeset.arguments, argument.name, value)
}
add_invalid_errors(:argument, changeset, argument, error)
{:error, error} ->
changeset = %{
changeset
| arguments: Map.put(changeset.arguments, argument.name, value)
}
add_invalid_errors(:argument, changeset, argument, error)
end
else
%{changeset | arguments: Map.put(changeset.arguments, argument, value)}
end
else
%{changeset | arguments: Map.put(changeset.arguments, argument, value)}
end
end
@doc """
Remove an argument from the changeset
"""
def delete_argument(changeset, argument_or_arguments) do
argument_or_arguments
|> List.wrap()
|> Enum.reduce(changeset, fn argument, changeset ->
%{changeset | arguments: Map.delete(changeset.arguments, argument)}
end)
end
@doc """
Merge a map of arguments to the arguments list
"""
def set_arguments(changeset, map) do
Enum.reduce(map, changeset, fn {key, value}, changeset ->
set_argument(changeset, key, value)
end)
end
@doc """
Force change an attribute if it is not currently being changed, by calling the provided function.
See `change_new_attribute_lazy/3` for more.
"""
@spec force_change_new_attribute_lazy(t(), atom, (() -> any)) :: t()
def force_change_new_attribute_lazy(changeset, attribute, func) do
if changing_attribute?(changeset, attribute) do
changeset
else
force_change_attribute(changeset, attribute, func.())
end
end
@doc "Calls `change_attribute/3` for each key/value pair provided"
@spec change_attributes(t(), map | Keyword.t()) :: t()
def change_attributes(changeset, changes) do
Enum.reduce(changes, changeset, fn {key, value}, changeset ->
change_attribute(changeset, key, value)
end)
end
@doc "Adds a change to the changeset, unless the value matches the existing value"
@spec change_attribute(t(), atom, any) :: t()
def change_attribute(changeset, attribute, value) do
case Ash.Resource.Info.attribute(changeset.resource, attribute) do
nil ->
error =
NoSuchAttribute.exception(
resource: changeset.resource,
name: attribute
)
add_error(changeset, error)
%{writable?: false} = attribute ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, value)
}
add_invalid_errors(:attribute, changeset, attribute, "Attribute is not writable")
attribute ->
with value <- handle_indexed_maps(attribute.type, value),
{:ok, prepared} <-
prepare_change(changeset, attribute, value, attribute.constraints),
{:ok, casted} <-
cast_input(attribute.type, prepared, attribute.constraints, changeset, true),
{:ok, casted} <-
handle_change(changeset, attribute, casted, attribute.constraints),
{:ok, casted} <-
Ash.Type.apply_constraints(attribute.type, casted, attribute.constraints) do
data_value = Map.get(changeset.data, attribute.name)
changeset = remove_default(changeset, attribute.name)
cond do
changeset.action_type == :create ->
%{changeset | attributes: Map.put(changeset.attributes, attribute.name, casted)}
is_nil(data_value) and is_nil(casted) ->
%{changeset | attributes: Map.delete(changeset.attributes, attribute.name)}
Ash.Type.equal?(attribute.type, casted, data_value) ->
%{changeset | attributes: Map.delete(changeset.attributes, attribute.name)}
true ->
%{changeset | attributes: Map.put(changeset.attributes, attribute.name, casted)}
end
else
{{:error, error_or_errors}, last_val} ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, last_val)
}
add_invalid_errors(:attribute, changeset, attribute, error_or_errors)
:error ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, value)
}
add_invalid_errors(:attribute, changeset, attribute)
{:error, error_or_errors} ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, value)
}
add_invalid_errors(:attribute, changeset, attribute, error_or_errors)
end
end
end
@doc """
The same as `change_attribute`, but annotates that the attribute is currently holding a default value.
This information can be used in changes to see if a value was explicitly set or if it was set by being the default.
Additionally, this is used in `upsert` actions to not overwrite existing values with the default
"""
@spec change_default_attribute(t(), atom, any) :: t()
def change_default_attribute(changeset, attribute, value) do
case Ash.Resource.Info.attribute(changeset.resource, attribute) do
nil ->
error =
NoSuchAttribute.exception(
resource: changeset.resource,
name: attribute
)
add_error(changeset, error)
attribute ->
changeset
|> change_attribute(attribute.name, value)
|> Map.update!(:defaults, fn defaults ->
Enum.uniq([attribute.name | defaults])
end)
end
end
@doc false
def cast_input(type, term, constraints, changeset, return_value? \\ false)
def cast_input(type, value, constraints, changeset, return_value?) do
value = handle_indexed_maps(type, value)
constraints = Ash.Type.constraints(changeset, type, constraints)
case Ash.Type.cast_input(type, value, constraints) do
{:ok, value} ->
{:ok, value}
{:error, error} ->
if return_value? do
{{:error, error}, value}
else
{:error, error}
end
end
end
@doc false
def handle_indexed_maps({:array, type}, term) when is_map(term) and term != %{} do
term
|> Enum.reduce_while({:ok, []}, fn
{key, value}, {:ok, acc} when is_integer(key) ->
{:cont, {:ok, [{key, value} | acc]}}
{key, value}, {:ok, acc} when is_binary(key) ->
case Integer.parse(key) do
{int, ""} ->
{:cont, {:ok, [{int, value} | acc]}}
_ ->
{:halt, :error}
end
_, _ ->
{:halt, :error}
end)
|> case do
{:ok, value} ->
value
|> Enum.sort_by(&elem(&1, 0))
|> Enum.map(&elem(&1, 1))
|> Enum.map(&handle_indexed_maps(type, &1))
:error ->
term
end
end
def handle_indexed_maps(_, value), do: value
@doc "Calls `force_change_attribute/3` for each key/value pair provided"
@spec force_change_attributes(t(), map) :: t()
def force_change_attributes(changeset, changes) do
Enum.reduce(changes, changeset, fn {key, value}, changeset ->
force_change_attribute(changeset, key, value)
end)
end
@doc "Changes an attribute even if it isn't writable"
@spec force_change_attribute(t(), atom, any) :: t()
def force_change_attribute(changeset, attribute, value) do
case Ash.Resource.Info.attribute(changeset.resource, attribute) do
nil ->
error =
NoSuchAttribute.exception(
resource: changeset.resource,
name: attribute
)
add_error(changeset, error)
attribute when is_nil(value) ->
%{changeset | attributes: Map.put(changeset.attributes, attribute.name, nil)}
attribute ->
with value <- handle_indexed_maps(attribute.type, value),
{:ok, prepared} <-
prepare_change(changeset, attribute, value, attribute.constraints),
{:ok, casted} <-
cast_input(attribute.type, prepared, attribute.constraints, changeset),
{:ok, casted} <- handle_change(changeset, attribute, casted, attribute.constraints),
{:ok, casted} <-
Ash.Type.apply_constraints(attribute.type, casted, attribute.constraints) do
data_value = Map.get(changeset.data, attribute.name)
changeset = remove_default(changeset, attribute.name)
cond do
is_nil(data_value) and is_nil(casted) ->
changeset
Ash.Type.equal?(attribute.type, casted, data_value) ->
changeset
true ->
%{changeset | attributes: Map.put(changeset.attributes, attribute.name, casted)}
end
else
:error ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, value)
}
add_invalid_errors(:attribute, changeset, attribute)
{:error, error_or_errors} ->
changeset = %{
changeset
| attributes: Map.put(changeset.attributes, attribute.name, value)
}
add_invalid_errors(:attribute, changeset, attribute, error_or_errors)
end
end
end
@doc "Adds a before_action hook to the changeset."
@spec before_action(
t(),
(t() -> t() | {t(), %{notifications: list(Ash.Notifier.Notification.t())}})
) ::
t()
def before_action(changeset, func) do
%{changeset | before_action: [func | changeset.before_action]}
end
@doc "Adds an after_action hook to the changeset."
@spec after_action(
t(),
(t(), Ash.Resource.record() ->
{:ok, Ash.Resource.record()}
| {:ok, Ash.Resource.record(), list(Ash.Notifier.Notification.t())}
| {:error, term})
) :: t()
def after_action(changeset, func) do
%{changeset | after_action: changeset.after_action ++ [func]}
end
@doc """
Returns the original data with attribute changes merged, if the changeset is valid.
Options:
* force? - applies current attributes even if the changeset is not valid
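  For example (an illustrative sketch — `post` stands in for a record of a
  hypothetical resource with a writable `:title` attribute):

  ```elixir
  {:ok, updated} =
    post
    |> Ash.Changeset.for_update(:update, %{title: "New title"})
    |> Ash.Changeset.apply_attributes()

  updated.title
  #=> "New title"
  ```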
"""
@spec apply_attributes(t(), opts :: Keyword.t()) :: {:ok, Ash.Resource.record()} | {:error, t()}
def apply_attributes(changeset, opts \\ [])
def apply_attributes(%{valid?: true} = changeset, _opts) do
{:ok,
Enum.reduce(changeset.attributes, changeset.data, fn {attribute, value}, data ->
Map.put(data, attribute, value)
end)}
end
def apply_attributes(changeset, opts) do
if opts[:force?] do
apply_attributes(%{changeset | valid?: true}, opts)
else
{:error, changeset}
end
end
defp remove_default(changeset, attribute) do
%{changeset | defaults: changeset.defaults -- [attribute]}
end
@doc "Clears an attribute or relationship change off of the changeset"
def clear_change(changeset, field) do
cond do
attr = Ash.Resource.Info.attribute(changeset.resource, field) ->
%{changeset | attributes: Map.delete(changeset.attributes, attr.name)}
rel = Ash.Resource.Info.relationship(changeset.resource, field) ->
%{changeset | relationships: Map.delete(changeset.relationships, rel.name)}
true ->
changeset
end
end
@doc """
Sets a custom error handler on the changeset.
  The error handler should be a two-argument function or an MFA, in which case the first two arguments will be
  the changeset and the error, with the supplied arguments following those.
Any errors generated are passed to `handle_errors`, which can return any of the following:
* `:ignore` - the error is discarded, and the changeset is not marked as invalid
* `changeset` - a new (or the same) changeset. The error is not added (you'll want to add an error yourself), but the changeset *is* marked as invalid.
* `{changeset, error}` - a new (or the same) error and changeset. The error is added to the changeset, and the changeset is marked as invalid.
* `anything_else` - is treated as a new, transformed version of the error. The result is added as an error to the changeset, and the changeset is marked as invalid.
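
  For example, to silently drop one kind of error while keeping all others (an
  illustrative sketch; `Ash.Error.Changes.StaleRecord` is used here only as an
  example error module):

  ```elixir
  Ash.Changeset.handle_errors(changeset, fn
    _changeset, %Ash.Error.Changes.StaleRecord{} -> :ignore
    _changeset, error -> error
  end)
  ```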
"""
@spec handle_errors(
t(),
          (t(), error :: term -> :ignore | t() | (error :: term) | {t(), error :: term})
| {module, atom, [term]}
) :: t()
def handle_errors(changeset, {m, f, a}) do
%{changeset | handle_errors: &apply(m, f, [&1, &2 | a])}
end
def handle_errors(changeset, func) do
%{changeset | handle_errors: func}
end
@doc "Adds an error to the changesets errors list, and marks the change as `valid?: false`"
@spec add_error(t(), term | String.t() | list(term | String.t())) :: t()
def add_error(changeset, errors, path \\ [])
def add_error(changeset, errors, path) when is_list(errors) do
if Keyword.keyword?(errors) do
errors
|> to_change_error()
|> handle_error(changeset)
else
Enum.reduce(errors, changeset, &add_error(&2, &1, path))
end
end
def add_error(changeset, error, path) when is_binary(error) do
add_error(
changeset,
InvalidChanges.exception(message: error),
path
)
end
def add_error(changeset, error, path) do
error
|> Ash.Error.set_path(path)
|> handle_error(changeset)
end
defp handle_error(error, %{handle_errors: nil} = changeset) do
%{changeset | valid?: false, errors: [error | changeset.errors]}
end
defp handle_error(error, changeset) do
changeset
|> changeset.handle_errors.(error)
|> case do
:ignore ->
changeset
%__MODULE__{} = changeset ->
%{changeset | valid?: false}
{changeset, error} ->
%{changeset | valid?: false, errors: [error | changeset.errors]}
error ->
%{changeset | valid?: false, errors: [error | changeset.errors]}
end
end
defp to_change_error(keyword) do
error =
if keyword[:field] do
InvalidAttribute.exception(
field: keyword[:field],
message: keyword[:message],
vars: keyword
)
else
InvalidChanges.exception(
fields: keyword[:fields] || [],
message: keyword[:message],
vars: keyword
)
end
if keyword[:path] do
Ash.Error.set_path(error, keyword[:path])
else
error
end
end
defp prepare_change(%{action_type: :create}, _attribute, value, _constraints), do: {:ok, value}
defp prepare_change(changeset, attribute, value, constraints) do
old_value = Map.get(changeset.data, attribute.name)
Ash.Type.prepare_change(attribute.type, old_value, value, constraints)
end
defp handle_change(%{action_type: :create}, _attribute, value, _constraints), do: {:ok, value}
defp handle_change(changeset, attribute, value, constraints) do
old_value = Map.get(changeset.data, attribute.name)
Ash.Type.handle_change(attribute.type, old_value, value, constraints)
end
defp add_invalid_errors(type, changeset, attribute, message \\ nil) do
messages =
if Keyword.keyword?(message) do
[message]
else
List.wrap(message)
end
Enum.reduce(messages, changeset, fn message, changeset ->
opts = error_to_exception_opts(message, attribute)
exception =
case type do
:attribute -> InvalidAttribute
:argument -> InvalidArgument
end
Enum.reduce(opts, changeset, fn opts, changeset ->
error =
exception.exception(
field: Keyword.get(opts, :field),
message: Keyword.get(opts, :message),
vars: opts
)
error =
if opts[:path] do
Ash.Error.set_path(error, opts[:path])
else
error
end
add_error(changeset, error)
end)
end)
end
@doc false
def error_to_exception_opts(message, attribute) do
case message do
keyword when is_list(keyword) ->
fields =
case List.wrap(keyword[:fields]) do
[] ->
List.wrap(keyword[:field])
fields ->
fields
end
fields
|> case do
[] ->
[
keyword
|> Keyword.put(
:message,
add_index(keyword[:message], keyword)
)
|> Keyword.put(:field, attribute.name)
]
fields ->
Enum.map(
fields,
&Keyword.merge(message,
field: attribute.name,
message: add_index(add_field(keyword[:message], "#{&1}"), keyword)
)
)
end
message when is_binary(message) ->
[[field: attribute.name, message: message]]
_ ->
[[field: attribute.name]]
end
end
defp add_field(message, field) do
"at field #{field} " <> (message || "")
end
defp add_index(message, opts) do
if opts[:index] do
"at index #{opts[:index]} " <> (message || "")
else
message
end
end
end
# File: lib/lattice_observer/observed/link_definition.ex (repo: janitha09/lattice-observer, license: Apache-2.0)
defmodule LatticeObserver.Observed.LinkDefinition do
alias __MODULE__
@enforce_keys [:actor_id, :provider_id, :contract_id, :link_name]
defstruct [:actor_id, :provider_id, :contract_id, :link_name, :values]
@typedoc """
A representation of an observed link definition. Link definitions are considered
a global entity and are always shareable by all `AppSpec` models. The link definition
  trait can be satisfied for an `AppSpec` model by the existence of the link definition,
without regard for _how_ the definition came into being.
Link definitions are uniquely identified by the actor ID, provider ID, and link name.
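
  For example (illustrative, made-up identifiers):

  ```elixir
  %LatticeObserver.Observed.LinkDefinition{
    actor_id: "actor-1",
    provider_id: "provider-1",
    contract_id: "wasmcloud:httpserver",
    link_name: "default",
    values: %{"PORT" => "8080"}
  }
  ```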
"""
@type t :: %LinkDefinition{
actor_id: String.t(),
provider_id: String.t(),
contract_id: String.t(),
link_name: String.t(),
values: Map.t()
}
end
# File: lib/admin_web.ex (repo: jeantsai/phoenix-admin, license: MIT)
defmodule AdminWeb do
@moduledoc """
The entrypoint for defining your web interface, such
as controllers, views, channels and so on.
This can be used in your application as:
use AdminWeb, :controller
use AdminWeb, :view
The definitions below will be executed for every view,
controller, etc, so keep them short and clean, focused
on imports, uses and aliases.
Do NOT define functions inside the quoted expressions
below. Instead, define any helper function in modules
and import those modules here.
"""
def controller do
quote do
use Phoenix.Controller, namespace: AdminWeb
import Plug.Conn
import AdminWeb.Gettext
alias AdminWeb.Router.Helpers, as: Routes
end
end
def view do
quote do
use Phoenix.View, root: "lib/admin_web/templates",
namespace: AdminWeb
# Import convenience functions from controllers
import Phoenix.Controller, only: [get_flash: 1, get_flash: 2, view_module: 1]
# Use all HTML functionality (forms, tags, etc)
use Phoenix.HTML
import AdminWeb.ErrorHelpers
import AdminWeb.Gettext
alias AdminWeb.Router.Helpers, as: Routes
end
end
def router do
quote do
use Phoenix.Router
import Plug.Conn
import Phoenix.Controller
end
end
def channel do
quote do
use Phoenix.Channel
import AdminWeb.Gettext
end
end
@doc """
When used, dispatch to the appropriate controller/view/etc.
"""
defmacro __using__(which) when is_atom(which) do
apply(__MODULE__, which, [])
end
end
# File: lib/ash/actions/update.ex (repo: axelson/ash, license: MIT)
defmodule Ash.Actions.Update do
@moduledoc false
require Logger
alias Ash.Actions.Helpers
alias Ash.Engine
alias Ash.Engine.Request
@spec run(Ash.Api.t(), Ash.Resource.record(), Ash.Resource.Actions.action(), Keyword.t()) ::
{:ok, Ash.Resource.record()} | {:error, Ash.Changeset.t()} | {:error, term}
def run(api, changeset, action, opts) do
authorize? =
if opts[:authorize?] == false do
false
else
opts[:authorize?] || Keyword.has_key?(opts, :actor)
end
opts = Keyword.put(opts, :authorize?, authorize?)
engine_opts =
opts
|> Keyword.take([:verbose?, :actor, :authorize?])
|> Keyword.put(:transaction?, true)
resource = changeset.resource
changeset = changeset(changeset, api, action, opts[:actor])
with %{valid?: true} <- Ash.Changeset.validate_multitenancy(changeset),
{:ok, %{data: %{commit: %^resource{} = updated}} = engine_result} <-
do_run_requests(
changeset,
engine_opts,
action,
resource,
api
) do
updated
|> add_tenant(changeset)
|> add_notifications(engine_result, opts)
else
%Ash.Changeset{errors: errors} = changeset ->
{:error, Ash.Error.to_error_class(errors, changeset: changeset)}
{:error, %Ash.Engine.Runner{errors: errors, changeset: changeset}} ->
{:error, Ash.Error.to_error_class(errors, changeset: changeset)}
{:error, error} ->
{:error, Ash.Error.to_error_class(error, changeset: changeset)}
end
end
defp add_tenant(data, changeset) do
if Ash.Resource.Info.multitenancy_strategy(changeset.resource) do
%{data | __metadata__: Map.put(data.__metadata__, :tenant, changeset.tenant)}
else
data
end
end
defp add_notifications(result, engine_result, opts) do
if opts[:return_notifications?] do
{:ok, result, Map.get(engine_result, :resource_notifications, [])}
else
{:ok, result}
end
end
defp changeset(changeset, api, action, actor) do
changeset = %{changeset | api: api}
if changeset.__validated_for_action__ == action.name do
changeset
else
Ash.Changeset.for_update(changeset, action.name, %{}, actor: actor)
end
|> Ash.Changeset.set_defaults(:update)
|> Ash.Changeset.cast_arguments(action)
end
defp do_run_requests(
changeset,
engine_opts,
action,
resource,
api
) do
authorization_request =
Request.new(
api: api,
changeset: changeset,
action: action,
resource: resource,
data: changeset.data,
authorize?: false,
path: :data,
name: "#{action.type} - `#{action.name}`: prepare"
)
commit_request =
Request.new(
api: api,
changeset:
Request.resolve([[:data, :changeset]], fn %{data: %{changeset: changeset}} ->
{:ok, changeset}
end),
action: action,
resource: resource,
notify?: true,
manage_changeset?: true,
authorize?: false,
data:
Request.resolve(
[[:data, :changeset]],
fn %{data: %{changeset: changeset}} ->
result =
changeset
|> Ash.Changeset.put_context(:private, %{actor: engine_opts[:actor]})
|> Ash.Changeset.before_action(fn changeset ->
{changeset, instructions} =
Ash.Actions.ManagedRelationships.setup_managed_belongs_to_relationships(
changeset,
engine_opts[:actor]
)
{changeset, instructions}
end)
|> Ash.Changeset.require_values(:update, true)
|> Ash.Changeset.with_hooks(fn changeset ->
changeset = set_tenant(changeset)
Ash.DataLayer.update(resource, changeset)
end)
with {:ok, updated, changeset, %{notifications: notifications}} <- result,
{:ok, loaded} <-
Ash.Actions.ManagedRelationships.load(api, updated, changeset, engine_opts),
{:ok, with_relationships, new_notifications} <-
Ash.Actions.ManagedRelationships.manage_relationships(
loaded,
changeset,
engine_opts[:actor]
) do
{:ok, Helpers.select(with_relationships, changeset),
%{notifications: notifications ++ new_notifications}}
end
end
),
path: [:commit],
name: "#{action.type} - `#{action.name}` commit"
)
Engine.run(
[authorization_request, commit_request],
api,
engine_opts
)
end
defp set_tenant(changeset) do
if changeset.tenant &&
Ash.Resource.Info.multitenancy_strategy(changeset.resource) == :attribute do
attribute = Ash.Resource.Info.multitenancy_attribute(changeset.resource)
{m, f, a} = Ash.Resource.Info.multitenancy_parse_attribute(changeset.resource)
attribute_value = apply(m, f, [changeset.tenant | a])
Ash.Changeset.force_change_attribute(changeset, attribute, attribute_value)
else
changeset
end
end
end
# File: src/server/test/views/page_view_test.exs (repo: bryceklinker/trader, license: MIT)
defmodule Server.PageViewTest do
use Server.ConnCase, async: true
end
# File: other-benchmarks/elixir-phoenix-absinthe/mix.exs (repo: uladzislau-hlebovich/node-graphql-benchmarks, license: MIT)
defmodule Benchmark.MixProject do
use Mix.Project
def project do
[
app: :benchmark,
version: "0.1.0",
elixir: "~> 1.12",
elixirc_paths: elixirc_paths(Mix.env()),
compilers: Mix.compilers(),
start_permanent: Mix.env() == :prod,
aliases: aliases(),
deps: deps()
]
end
# Configuration for the OTP application.
#
# Type `mix help compile.app` for more information.
def application do
[
mod: {Benchmark.Application, []},
extra_applications: [:logger, :runtime_tools]
]
end
# Specifies which paths to compile per environment.
defp elixirc_paths(:test), do: ["lib", "test/support"]
defp elixirc_paths(_), do: ["lib"]
# Specifies your project dependencies.
#
# Type `mix help deps` for examples and options.
defp deps do
[
{:phoenix, "~> 1.6.6"},
{:telemetry_metrics, "~> 0.6"},
{:telemetry_poller, "~> 1.0"},
{:jason, "~> 1.2"},
{:plug_cowboy, "~> 2.5"},
{:absinthe, "~> 1.6"},
{:absinthe_plug, "~> 1.5"},
{:faker, "~> 0.17"}
]
end
# Aliases are shortcuts or tasks specific to the current project.
# For example, to install project dependencies and perform other setup tasks, run:
#
# $ mix setup
#
# See the documentation for `Mix` for more info on aliases.
defp aliases do
[
setup: ["deps.get"]
]
end
end
# File: lib/events_tools_web/views/comment_view.ex (repo: Apps-Team/conferencetools, license: Apache-2.0)
defmodule EventsToolsWeb.CommentView do
use EventsToolsWeb, :view
end
# File: priv/repo/migrations/20210427210239_add_unique_index_to_alert_node_table.exs (repo: maco2035/console, license: Apache-2.0)
defmodule Console.Repo.Migrations.AddUniqueIndexToAlertNodeTable do
use Ecto.Migration
def change do
create unique_index(:alert_nodes, [:alert_id, :node_id, :node_type])
end
end
# File: lib/oli_web/controllers/payment_controller.ex (repo: candert1/oli-torus, license: MIT)
defmodule OliWeb.PaymentController do
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.ContainerAnalysis.V1beta1.Model.ArtifactHashes do
@moduledoc """
Defines a hash object for use in Materials and Products.
## Attributes
* `sha256` (*type:* `String.t`, *default:* `nil`) -
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:sha256 => String.t() | nil
}
field(:sha256)
end
defimpl Poison.Decoder, for: GoogleApi.ContainerAnalysis.V1beta1.Model.ArtifactHashes do
def decode(value, options) do
GoogleApi.ContainerAnalysis.V1beta1.Model.ArtifactHashes.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.ContainerAnalysis.V1beta1.Model.ArtifactHashes do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
| 30.595745 | 88 | 0.741307 |
08fc5684767afbe81a7155044d1c826ecf970582 | 4,788 | ex | Elixir | lib/oli_web/controllers/payment_controller.ex | candert1/oli-torus | b7408f7d7c04cc3e9cf537873d98c3a586ec3a66 | [
"MIT"
] | null | null | null | lib/oli_web/controllers/payment_controller.ex | candert1/oli-torus | b7408f7d7c04cc3e9cf537873d98c3a586ec3a66 | [
"MIT"
] | null | null | null | lib/oli_web/controllers/payment_controller.ex | candert1/oli-torus | b7408f7d7c04cc3e9cf537873d98c3a586ec3a66 | [
"MIT"
] | null | null | null | defmodule OliWeb.PaymentController do
use OliWeb, :controller
require Logger
@doc """
Render the page to show a student that they do not have access because
of the paywall state. This is the route that the enforce paywall plug
redirects to.
"""
def guard(conn, %{"section_slug" => section_slug}) do
user = conn.assigns.current_user
if user.guest do
render(conn, "require_account.html", section_slug: section_slug)
else
render(conn, "guard.html",
section_slug: section_slug,
direct_payments_enabled: direct_payments_enabled?()
)
end
end
# Returns the module for the configured payment provider
defp get_provider_module() do
validate = fn a ->
case a do
:stripe ->
OliWeb.PaymentProviders.StripeController
:none ->
OliWeb.PaymentProviders.NoProviderController
e ->
          Logger.warning("Payment provider #{inspect(e)} is not valid.")
OliWeb.PaymentProviders.NoProviderController
end
end
case Application.fetch_env!(:oli, :payment_provider) do
a when is_atom(a) -> validate.(a)
s when is_binary(s) -> String.to_existing_atom(s) |> validate.()
end
end
defp direct_payments_enabled?() do
get_provider_module() != OliWeb.PaymentProviders.NoProviderController
end
@doc """
Renders the page to start the direct payment processing flow.
"""
def make_payment(conn, %{"section_slug" => section_slug}) do
# Dynamically dispatch to the "index" method of the registered
# payment provider implementation
section = conn.assigns.section
user = conn.assigns.current_user
if user.guest do
render(conn, "require_account.html", section_slug: section_slug)
else
if Oli.Delivery.Paywall.can_access?(user, section) do
conn
|> redirect(to: Routes.page_delivery_path(conn, :index, section.slug))
else
case determine_cost(section) do
{:ok, amount} ->
get_provider_module()
|> apply(:show, [conn, section, user, amount])
_ ->
conn
|> redirect(to: Routes.page_delivery_path(conn, :index, section.slug))
end
end
end
# perform this check in the case that a user refreshes the payment page
# after already paying. This will simply redirect them to their course.
end
defp determine_product(section) do
if is_nil(section.blueprint_id) do
section
else
section.blueprint
end
end
defp determine_cost(section) do
section = Oli.Repo.preload(section, [:institution, :blueprint])
product = determine_product(section)
Oli.Delivery.Paywall.calculate_product_cost(product, section.institution)
end
@doc """
Renders the page to allow payment code redemption.
"""
def use_code(conn, %{"section_slug" => section_slug}) do
user = conn.assigns.current_user
if user.guest do
render(conn, "require_account.html", section_slug: section_slug)
else
render(conn, "code.html", section_slug: section_slug)
end
end
@doc """
  Endpoint that triggers creation and download of a batch of payment codes.
"""
def download_codes(conn, %{"count" => count, "product_id" => product_slug}) do
case Oli.Delivery.Paywall.create_payment_codes(product_slug, String.to_integer(count)) do
{:ok, payments} ->
contents =
Enum.map(payments, fn p ->
Oli.Delivery.Paywall.Payment.to_human_readable(p.code)
end)
|> Enum.join("\n")
conn
|> send_download({:binary, contents},
filename: "codes_#{product_slug}.txt"
)
_ ->
conn
|> send_download({:binary, "Error in generating codes"},
filename: "ERROR_codes_#{product_slug}.txt"
)
end
end
@doc """
Handles applying a user supplied code as a payment code.
"""
def apply_code(
conn,
%{
"section_slug" => section_slug,
"code" => %{"value" => code}
} = params
) do
if Map.get(params, "g-recaptcha-response", "") |> recaptcha_verified?() do
user = conn.assigns.current_user
case Oli.Delivery.Paywall.redeem_code(code, user, section_slug) do
{:ok, _} ->
render(conn, "code_success.html", section_slug: section_slug)
{:error, _} ->
render(conn, "code.html", error: "This is an invalid code", section_slug: section_slug)
end
else
render(conn, "code.html",
recaptcha_error: "ReCaptcha failed, please try again",
section_slug: section_slug
)
end
end
defp recaptcha_verified?(g_recaptcha_response) do
Oli.Utils.Recaptcha.verify(g_recaptcha_response) == {:success, true}
end
end
# File: web/router.ex (repo: parndt/git_ecto_sandbox, license: MIT)
defmodule GitEctoSandbox.Router do
use GitEctoSandbox.Web, :router
pipeline :browser do
plug :accepts, ["html"]
plug :fetch_session
plug :fetch_flash
plug :protect_from_forgery
plug :put_secure_browser_headers
end
pipeline :api do
plug :accepts, ["json"]
end
scope "/", GitEctoSandbox do
pipe_through :browser # Use the default browser stack
get "/", PageController, :index
resources "/commits", CommitController
end
# Other scopes may use custom stacks.
# scope "/api", GitEctoSandbox do
# pipe_through :api
# end
end
# File: lib/elixible/client/connection.ex (repo: gabrielgatu/elixible, license: MIT)
defmodule Elixible.Connection do
use GenServer
  def start_link(host, port \\ 5222) when is_binary(host) and is_integer(port) do
    host = String.to_charlist(host)
GenServer.start_link __MODULE__, {host, port}
end
def command(pid, command) do
GenServer.cast(pid, {:command, command})
end
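
  # Example usage (illustrative host and stanza):
  #
  #     {:ok, pid} = Elixible.Connection.start_link("example.com")
  #     Elixible.Connection.command(pid, "<presence/>")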
# Server API
def init({host, port}) do
{:ok, socket} = :gen_tcp.connect host, port, [:binary, active: true]
{:ok, %{socket: socket, host: host}}
end
def handle_cast({:command, command}, %{socket: socket} = state) do
:gen_tcp.send socket, command
{:noreply, state}
end
def handle_info({:tcp, _socket, "<message" <> _ = xml}, state) do
xml
|> String.split("</message>")
|> Enum.map(fn message_xml -> message_xml <> "</message>" end)
|> Enum.each(&Elixible.Client.Handler.handle_response/1)
{:noreply, state}
end
def handle_info({:tcp, _socket, xml}, state) do
Elixible.Client.Handler.handle_response(xml)
{:noreply, state}
end
end
# File: test/order_api/services/process_test.exs (repo: gissandrogama/delivery_order, license: MIT)
defmodule OrderApi.Services.ProcessTest do
use OrderApi.DataCase
  doctest OrderApi.Services.Process
import OrderApi.PayloadFixture
alias OrderApi.Services.Process
describe "structure/0" do
test "return structure in json" do
build()
result = Process.structure() |> Jason.decode!()
assert result["deliveryFee"] == "5.14"
end
end
end
# File: lib/postgrex/messages.ex (repo: activeprospect/postgrex, license: Apache-2.0)
defmodule Postgrex.Messages do
@moduledoc false
import Postgrex.BinaryUtils
import Record, only: [defrecord: 2]
@protocol_vsn_major 3
@protocol_vsn_minor 0
@auth_types [ ok: 0, kerberos: 2, cleartext: 3, md5: 5, scm: 6, gss: 7,
gss_cont: 8, sspi: 9, sasl: 10, sasl_cont: 11, sasl_fin: 12 ]
@error_fields [ severity: ?S, code: ?C, message: ?M, detail: ?D, hint: ?H,
position: ?P, internal_position: ?p, internal_query: ?q,
where: ?W, schema: ?s, table: ?t, column: ?c, data_type: ?d,
constraint: ?n, file: ?F, line: ?L, routine: ?R ]
defrecord :msg_auth, [:type, :data]
defrecord :msg_startup, [:params]
defrecord :msg_password, [:pass]
defrecord :msg_error, [:fields]
defrecord :msg_parameter, [:name, :value]
defrecord :msg_backend_key, [:pid, :key]
defrecord :msg_ready, [:status]
defrecord :msg_notice, [:fields]
defrecord :msg_query, [:statement]
defrecord :msg_parse, [:name, :statement, :type_oids]
defrecord :msg_describe, [:type, :name]
defrecord :msg_flush, []
defrecord :msg_close, [:type, :name]
defrecord :msg_parse_complete, []
defrecord :msg_parameter_desc, [:type_oids]
defrecord :msg_too_many_parameters, [:len, :max_len]
defrecord :msg_row_desc, [:fields]
defrecord :msg_no_data, []
defrecord :msg_notify, [:pg_pid, :channel, :payload]
defrecord :msg_bind, [:name_port, :name_stat, :param_formats, :params,
:result_formats]
defrecord :msg_execute, [:name_port, :max_rows]
defrecord :msg_sync, []
defrecord :msg_bind_complete, []
defrecord :msg_close_complete, []
defrecord :msg_portal_suspend, []
defrecord :msg_data_row, [:values]
defrecord :msg_command_complete, [:tag]
defrecord :msg_empty_query, []
defrecord :msg_copy_data, [:data]
defrecord :msg_copy_done, []
defrecord :msg_copy_fail, [:message]
defrecord :msg_copy_in_response, [:format, :columns]
defrecord :msg_copy_both_response, [:format, :columns]
defrecord :msg_copy_out_response, [:format, :columns]
defrecord :msg_terminate, []
defrecord :msg_ssl_request, []
defrecord :msg_cancel_request, [:pid, :key]
defrecord :row_field, [:name, :table_oid, :column, :type_oid, :type_size,
:type_mod, :format]
### decoders ###
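
  # Example (illustrative): each defrecord above is a plain tagged tuple, so a
  # ReadyForQuery message (tag ?Z) whose one-byte body is ?I decodes as
  #
  #     parse(<<?I>>, ?Z, 5)
  #     #=> {:msg_ready, :idle}   # i.e. msg_ready(status: :idle)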
# auth
def parse(<<type :: int32, rest :: binary>>, ?R, size) do
type = decode_auth_type(type)
data =
case type do
:md5 ->
<<data :: binary-size(4)>> = rest
data
:gss_cont ->
rest_size = size - 2
<<data :: size(rest_size)>> = rest
data
:sasl ->
rest
:sasl_cont ->
rest
:sasl_fin ->
rest
_ ->
nil
end
msg_auth(type: type, data: data)
end
# backend_key
def parse(<<pid :: int32, key :: int32>>, ?K, _size) do
msg_backend_key(pid: pid, key: key)
end
# ready
def parse(<<status :: int8>>, ?Z, _size) do
status = case status do
?I -> :idle
?T -> :transaction
?E -> :error
end
msg_ready(status: status)
end
# parameter_desc
def parse(<<len :: uint16, rest :: binary(len, 32)>>, ?t, _size) do
oids = for <<oid :: size(32) <- rest>>, do: oid
msg_parameter_desc(type_oids: oids)
end
def parse(<<overflow_len :: uint16, _ :: binary>>, ?t, size) do
len = div(size - 2, 4)
    case <<len :: uint16>> do
<<^overflow_len :: uint16>> ->
msg_too_many_parameters(len: len, max_len: 0xFFFF)
_ ->
raise "invalid parameter description"
end
end
# row_desc
def parse(<<len :: uint16, rest :: binary>>, ?T, _size) do
fields = decode_row_fields(rest, len)
msg_row_desc(fields: fields)
end
# data_row
def parse(<<_ :: uint16, rest :: binary>>, ?D, _size) do
msg_data_row(values: rest)
end
# notify
def parse(<<pg_pid :: int32, rest :: binary>>, ?A, _size) do
{channel, rest} = decode_string(rest)
{payload, ""} = decode_string(rest)
msg_notify(pg_pid: pg_pid, channel: channel, payload: payload)
end
# error
def parse(rest, ?E, _size) do
fields = decode_fields(rest)
msg_error(fields: Map.new(fields))
end
# notice
def parse(rest, ?N, _size) do
fields = decode_fields(rest)
msg_notice(fields: Map.new(fields))
end
# parameter
def parse(rest, ?S, _size) do
{name, rest} = decode_string(rest)
{value, ""} = decode_string(rest)
msg_parameter(name: name, value: value)
end
# parse_complete
def parse(_rest, ?1, _size) do
msg_parse_complete()
end
# no_data
def parse(_rest, ?n, _size) do
msg_no_data()
end
# bind_complete
def parse(_rest, ?2, _size) do
msg_bind_complete()
end
# close_complete
def parse(_rest, ?3, _size) do
msg_close_complete()
end
# portal_suspended
def parse(_rest, ?s, _size) do
msg_portal_suspend()
end
# command_complete
def parse(rest, ?C, _size) do
{tag, ""} = decode_string(rest)
msg_command_complete(tag: tag)
end
# empty_query
def parse(_rest, ?I, _size) do
msg_empty_query()
end
# msg_copy_data
def parse(data, ?d, _size) do
msg_copy_data(data: data)
end
# msg_copy_done
def parse(_rest, ?c, _size) do
msg_copy_done()
end
# msg_copy_fail
def parse(message, ?f, _size) do
msg_copy_fail(message: message)
end
# msg_copy_in_response
def parse(rest, ?G, _size) do
{format, columns} = decode_copy(rest)
msg_copy_in_response(format: format, columns: columns)
end
# msg_copy_out_response
def parse(rest, ?H, _size) do
{format, columns} = decode_copy(rest)
msg_copy_out_response(format: format, columns: columns)
end
# msg_copy_both_response
def parse(rest, ?W, _size) do
{format, columns} = decode_copy(rest)
msg_copy_both_response(format: format, columns: columns)
end
### encoders ###
def encode_msg(msg) do
{first, data} = encode(msg)
size = IO.iodata_length(data) + 4
if first do
[first, <<size :: int32>>, data]
else
[<<size :: int32>>, data]
end
end
# startup
defp encode(msg_startup(params: params)) do
params = Enum.reduce(params, [], fn {key, value}, acc ->
[acc, to_string(key), 0, value, 0]
end)
vsn = <<@protocol_vsn_major :: int16, @protocol_vsn_minor :: int16>>
{nil, [vsn, params, 0]}
end
# password
defp encode(msg_password(pass: pass)) do
{?p, [pass]}
end
# query
defp encode(msg_query(statement: statement)) do
{?Q, [statement, 0]}
end
# parse
defp encode(msg_parse(name: name, statement: statement, type_oids: oids)) do
oids = for oid <- oids, into: "", do: <<oid :: uint32>>
len = <<div(byte_size(oids), 4) :: int16>>
{?P, [name, 0, statement, 0, len, oids]}
end
# describe
defp encode(msg_describe(type: type, name: name)) do
byte = case type do
:statement -> ?S
:portal -> ?P
end
{?D, [byte, name, 0]}
end
# flush
defp encode(msg_flush()) do
{?H, ""}
end
# close
defp encode(msg_close(type: type, name: name)) do
byte = case type do
:statement -> ?S
:portal -> ?P
end
{?C, [byte, name, 0]}
end
# bind
defp encode(msg_bind(name_port: port, name_stat: stat, param_formats: param_formats,
params: params, result_formats: result_formats)) do
pfs = for format <- param_formats, into: "", do: <<format(format) :: int16>>
rfs = for format <- result_formats, into: "", do: <<format(format) :: int16>>
len_pfs = <<div(byte_size(pfs), 2) :: int16>>
len_rfs = <<div(byte_size(rfs), 2) :: int16>>
len_params = <<length(params) :: int16>>
{?B, [port, 0, stat, 0, len_pfs, pfs, len_params, params, len_rfs, rfs]}
end
# execute
defp encode(msg_execute(name_port: port, max_rows: rows)) do
{?E, [port, 0, <<rows :: int32>>]}
end
# sync
defp encode(msg_sync()) do
{?S, ""}
end
# terminate
defp encode(msg_terminate()) do
{?X, ""}
end
# ssl_request
defp encode(msg_ssl_request()) do
{nil, <<1234 :: int16, 5679 :: int16>>}
end
# cancel_request
defp encode(msg_cancel_request(pid: pid, key: key)) do
{nil, <<1234 :: int16, 5678 :: int16, pid :: int32, key :: int32>>}
end
# copy_data
defp encode(msg_copy_data(data: data)) do
{?d, data}
end
# copy_done
defp encode(msg_copy_done()) do
{?c, ""}
end
# copy_fail
defp encode(msg_copy_fail(message: msg)) do
{?f, [msg, 0]}
end
### encode helpers ###
defp format(:text), do: 0
defp format(:binary), do: 1
### decode helpers ###
defp decode_fields(<<0>>), do: []
defp decode_fields(<<field :: int8, rest :: binary>>) do
type = decode_field_type(field)
{string, rest} = decode_string(rest)
[{type, string} | decode_fields(rest)]
end
defp decode_string(bin) do
{pos, 1} = :binary.match(bin, <<0>>)
{string, <<0, rest :: binary>>} = :erlang.split_binary(bin, pos)
{string, rest}
end
defp decode_row_fields("", 0), do: []
defp decode_row_fields(rest, count) do
{field, rest} = decode_row_field(rest)
[field | decode_row_fields(rest, count - 1)]
end
defp decode_row_field(rest) do
{name, rest} = decode_string(rest)
<<table_oid :: uint32, column :: int16, type_oid :: uint32,
type_size :: int16, type_mod :: int32, format :: int16,
rest :: binary>> = rest
field = row_field(name: name, table_oid: table_oid, column: column, type_oid: type_oid,
type_size: type_size, type_mod: type_mod, format: format)
{field, rest}
end
Enum.each(@auth_types, fn {type, value} ->
def decode_auth_type(unquote(value)), do: unquote(type)
end)
Enum.each(@error_fields, fn {field, char} ->
def decode_field_type(unquote(char)), do: unquote(field)
end)
def decode_field_type(_), do: :unknown
defp decode_format(0), do: :text
defp decode_format(1), do: :binary
defp decode_copy(<<format::int8, len::uint16, rest::binary(len, 16)>>) do
format = decode_format(format)
columns = for <<column::uint16 <- rest>>, do: decode_format(column)
{format, columns}
end
end
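A quick standalone sketch of the two core wire-format conventions used above: every frontend message is an optional tag byte followed by an int32 length that counts itself, and strings travel as null-terminated byte sequences. These few lines are illustrative only and not part of the module:

```elixir
# Frame a Sync message by hand: tag ?S, then an int32 length that
# includes its own 4 bytes plus the (empty) payload -- the same shape
# encode_msg/1 produces.
payload = ""
size = IO.iodata_length([payload]) + 4
frame = IO.iodata_to_binary([?S, <<size::signed-32>>, payload])
true = frame == <<?S, 0, 0, 0, 4>>

# Split a null-terminated string off the front of a binary, exactly as
# decode_string/1 does.
bin = <<"public", 0, "rest">>
{pos, 1} = :binary.match(bin, <<0>>)
{string, <<0, rest::binary>>} = :erlang.split_binary(bin, pos)
true = {string, rest} == {"public", "rest"}
```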
# ----------------------------------------------------------------------
# config/config.exs (repo: Tomboyo/identicon, license: MIT)
# ----------------------------------------------------------------------
# This file is responsible for configuring your application
# and its dependencies with the aid of the Mix.Config module.
use Mix.Config
# This configuration is loaded before any dependency and is restricted
# to this project. If another project depends on this project, this
# file won't be loaded nor affect the parent project. For this reason,
# if you want to provide default values for your application for
# 3rd-party users, it should be done in your "mix.exs" file.
# You can configure your application as:
#
# config :identicon, key: :value
#
# and access this configuration in your application as:
#
# Application.get_env(:identicon, :key)
#
# You can also configure a 3rd-party app:
#
# config :logger, level: :info
#
# It is also possible to import configuration files, relative to this
# directory. For example, you can emulate configuration per environment
# by uncommenting the line below and defining dev.exs, test.exs and such.
# Configuration from the imported file will override the ones defined
# here (which is why it is important to import them last).
#
# import_config "#{Mix.env()}.exs"
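As the comments above describe, values set with `config :app, key: value` are read back at runtime with `Application.get_env/3`. A minimal runnable sketch of that round trip (the `:identicon` keys here are chosen for illustration only):

```elixir
# Simulate what `config :identicon, key: :value` would set, then read
# it back the way application code does.
Application.put_env(:identicon, :key, :value)
true = Application.get_env(:identicon, :key) == :value

# A missing key falls back to the supplied default.
true = Application.get_env(:identicon, :missing_key, :fallback) == :fallback
```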
# ----------------------------------------------------------------------
# lib/elixir/lib/list/chars.ex (repo: doughsay/elixir, license: Apache-2.0)
# ----------------------------------------------------------------------
defprotocol List.Chars do
@moduledoc ~S"""
The `List.Chars` protocol is responsible for
converting a structure to a charlist (only if applicable).
The only function that must be implemented is
`to_charlist/1` which does the conversion.
The `to_charlist/1` function automatically imported
by `Kernel` invokes this protocol.
"""
@doc """
Converts `term` to a charlist.
"""
@spec to_charlist(t) :: charlist
def to_charlist(term)
@doc false
@deprecated "Use List.Chars.to_charlist/1 instead"
Kernel.def to_char_list(term) do
__MODULE__.to_charlist(term)
end
end
defimpl List.Chars, for: Atom do
def to_charlist(nil), do: ''
def to_charlist(atom), do: Atom.to_charlist(atom)
end
defimpl List.Chars, for: BitString do
@doc """
Returns the given binary `term` converted to a charlist.
"""
def to_charlist(term) when is_binary(term) do
String.to_charlist(term)
end
def to_charlist(term) do
raise Protocol.UndefinedError,
protocol: @protocol,
value: term,
description: "cannot convert a bitstring to a charlist"
end
end
defimpl List.Chars, for: List do
# Note that same inlining is used for the rewrite rule.
def to_charlist(list), do: list
end
defimpl List.Chars, for: Integer do
def to_charlist(term) do
Integer.to_charlist(term)
end
end
defimpl List.Chars, for: Float do
def to_charlist(term) do
:io_lib_format.fwrite_g(term)
end
end
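The protocol above dispatches on the type of its argument, with each `defimpl` handling one case; for instance (values chosen purely for illustration):

```elixir
# Each implementation converts its own type; nil maps to the empty
# charlist via the Atom implementation.
true = List.Chars.to_charlist("hello") == ~c"hello"
true = List.Chars.to_charlist(:world) == ~c"world"
true = List.Chars.to_charlist(42) == ~c"42"
true = List.Chars.to_charlist(nil) == ~c""
# A charlist is just a list of codepoints underneath.
true = List.Chars.to_charlist("he") == [104, 101]
```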
# ----------------------------------------------------------------------
# lib/integrate_web/controllers/fallback_controller.ex
# (repo: integratedb/core, license: MIT)
# ----------------------------------------------------------------------
defmodule IntegrateWeb.FallbackController do
@moduledoc """
Translates controller action results into valid `Plug.Conn` responses.
See `Phoenix.Controller.action_fallback/1` for more details.
"""
use IntegrateWeb, :controller
# This clause handles errors returned by Ecto's insert/update/delete.
def call(conn, {:error, %Ecto.Changeset{} = changeset}) do
conn
|> put_status(:unprocessable_entity)
|> put_view(IntegrateWeb.ChangesetView)
|> render("error.json", changeset: changeset)
end
  # These clauses handle errors returned by Ecto.Multi.
def call(conn, {:error, _key, %Ecto.Changeset{} = changeset, _changes_so_far}) do
conn
|> put_status(:unprocessable_entity)
|> put_view(IntegrateWeb.ChangesetView)
|> render("error.json", changeset: changeset)
end
def call(conn, {:error, _key, exception, _changes_so_far}) do
IO.inspect(exception)
conn
|> put_status(:unprocessable_entity)
|> put_view(IntegrateWeb.ErrorView)
|> render(:"500")
end
def call(conn, {:error, errors}) when is_list(errors) do
conn
|> put_status(:unprocessable_entity)
|> put_view(IntegrateWeb.ErrorView)
|> render(:"422", errors: errors)
end
# This clause is an example of how to handle resources that cannot be found.
def call(conn, {:error, :not_found}) do
conn
|> put_status(:not_found)
|> put_view(IntegrateWeb.ErrorView)
|> render(:"404")
end
  # This clause is an example of how to handle access to forbidden resources.
def call(conn, {:error, :forbidden}) do
conn
|> put_status(:forbidden)
|> put_view(IntegrateWeb.ErrorView)
|> render(:"403")
end
def call(conn, other) do
IO.inspect({"XXX controller did not expect argument:", other})
conn
|> put_status(:internal_server_error)
|> put_view(IntegrateWeb.ErrorView)
|> render(:"500")
end
end
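Phoenix reaches this module when a controller action returns something other than a `%Plug.Conn{}`: `action_fallback IntegrateWeb.FallbackController` in the controller wires it up, and the returned value is handed to the matching `call/2` clause. The dispatch can be mimicked with plain pattern matching (a toy stand-in with hypothetical status codes, no Phoenix involved):

```elixir
# A toy stand-in for call/2: match the same error shapes the clauses
# above handle and map each to the HTTP status it would render.
fallback = fn
  {:error, :not_found} -> 404
  {:error, :forbidden} -> 403
  {:error, errors} when is_list(errors) -> 422
  _other -> 500
end

true = fallback.({:error, :not_found}) == 404
true = fallback.({:error, :forbidden}) == 403
true = fallback.({:error, ["name can't be blank"]}) == 422
true = fallback.(:unexpected) == 500
```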
# ----------------------------------------------------------------------
# test/web/controllers/submission_controller_test.exs
# (repo: smartlogic/Challenge_gov, license: CC0-1.0)
# ----------------------------------------------------------------------
defmodule Web.SubmissionControllerTest do
use Web.ConnCase
alias ChallengeGov.Submissions
alias ChallengeGov.TestHelpers.AccountHelpers
alias ChallengeGov.TestHelpers.ChallengeHelpers
alias ChallengeGov.TestHelpers.SubmissionHelpers
describe "index under challenge" do
test "successfully retrieve all submissions for current solver user", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user1} = conn.assigns
user2 = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge = ChallengeHelpers.create_single_phase_challenge(user1, %{user_id: user1.id})
challenge_2 = ChallengeHelpers.create_single_phase_challenge(user1, %{user_id: user1.id})
SubmissionHelpers.create_submitted_submission(%{}, user1, challenge)
SubmissionHelpers.create_submitted_submission(%{}, user1, challenge_2)
SubmissionHelpers.create_submitted_submission(%{}, user2, challenge_2)
conn = get(conn, Routes.submission_path(conn, :index))
%{submissions: submissions, pagination: _pagination} = conn.assigns
assert length(submissions) === 2
end
test "successfully retrieve filtered submissions for challenge", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
challenge_2 = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
SubmissionHelpers.create_submitted_submission(
%{
"title" => "Filtered title"
},
user,
challenge
)
SubmissionHelpers.create_submitted_submission(%{}, user, challenge_2)
SubmissionHelpers.create_submitted_submission(
%{
"title" => "Filtered title"
},
user,
challenge_2
)
conn = get(conn, Routes.submission_path(conn, :index), filter: %{title: "Filtered"})
%{submissions: submissions, pagination: _pagination, filter: filter} = conn.assigns
assert length(submissions) === 2
assert filter["title"] === "Filtered"
end
test "redirect to sign in when signed out", %{conn: conn} do
user = AccountHelpers.create_user()
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
conn = get(conn, Routes.challenge_submission_path(conn, :index, challenge.id))
assert conn.status === 302
assert conn.halted
end
end
describe "show action" do
test "successfully viewing a submission", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{
"title" => "Filtered title"
},
user,
challenge
)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
%{submission: fetched_submission} = conn.assigns
assert fetched_submission.id === submission.id
end
test "success: viewing a submission of single phase challenge as challenge_owner", %{
conn: conn
} do
conn = prep_conn_challenge_owner(conn)
%{current_user: challenge_owner} = conn.assigns
submission_owner =
AccountHelpers.create_user(%{email: "submission_owner@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(challenge_owner, %{
user_id: challenge_owner.id
})
submission =
SubmissionHelpers.create_submitted_submission(
%{},
submission_owner,
challenge
)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
%{submission: fetched_submission} = conn.assigns
assert fetched_submission.id === submission.id
assert html_response(conn, 200) =~ "Back to submissions"
assert html_response(conn, 200) =~ "Submission ID:"
assert html_response(conn, 200) =~
"<i>#{challenge.title}</i>"
end
test "success: viewing a submission of multi phase challenge as challenge_owner", %{
conn: conn
} do
conn = prep_conn_challenge_owner(conn)
%{current_user: challenge_owner} = conn.assigns
submission_owner =
AccountHelpers.create_user(%{email: "submission_owner@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_multi_phase_challenge(challenge_owner, %{
user_id: challenge_owner.id
})
_phase = Enum.at(challenge.phases, 0)
submission =
SubmissionHelpers.create_submitted_submission(
%{},
submission_owner,
challenge
)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
%{submission: fetched_submission} = conn.assigns
assert fetched_submission.id === submission.id
assert html_response(conn, 200) =~ "Back to submissions"
assert html_response(conn, 200) =~ "Submission ID:"
assert html_response(conn, 200) =~
"<i>#{challenge.title}</i>"
end
test "success: viewing a submission of single phase challenge as admin", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: admin} = conn.assigns
submission_owner =
AccountHelpers.create_user(%{email: "submission_owner@example.com", role: "solver"})
challenge = ChallengeHelpers.create_single_phase_challenge(admin, %{user_id: admin.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{},
submission_owner,
challenge
)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
%{submission: fetched_submission} = conn.assigns
assert fetched_submission.id === submission.id
assert html_response(conn, 200) =~ "Back to submissions"
assert html_response(conn, 200) =~ "Submission ID:"
assert html_response(conn, 200) =~
"<i>#{challenge.title}</i>"
end
test "success: viewing a submission of multi phase challenge as admin", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: admin} = conn.assigns
submission_owner =
AccountHelpers.create_user(%{email: "submission_owner@example.com", role: "solver"})
challenge = ChallengeHelpers.create_multi_phase_challenge(admin, %{user_id: admin.id})
_phase = Enum.at(challenge.phases, 0)
submission =
SubmissionHelpers.create_submitted_submission(
%{},
submission_owner,
challenge
)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
%{submission: fetched_submission} = conn.assigns
assert fetched_submission.id === submission.id
assert html_response(conn, 200) =~ "Back to submissions"
assert html_response(conn, 200) =~ "Submission ID:"
assert html_response(conn, 200) =~
"<i>#{challenge.title}</i>"
end
test "not found viewing a deleted submission", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
Submissions.delete(submission)
conn = get(conn, Routes.submission_path(conn, :show, submission.id))
assert conn.status === 302
assert get_flash(conn, :error) === "Submission not found"
end
end
describe "new action" do
test "viewing the new submission form from phases as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"challenge_id" => challenge.id,
"phase_id" => phase.id
}
conn = get(conn, Routes.challenge_submission_path(conn, :new, challenge.id), params)
%{changeset: changeset} = conn.assigns
assert changeset
end
test "viewing the new submission form without phase id", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
_phase = challenge.phases |> Enum.at(0)
params = %{
"challenge_id" => challenge.id
}
conn = get(conn, Routes.challenge_submission_path(conn, :new, challenge.id), params)
%{changeset: changeset} = conn.assigns
assert changeset
end
end
describe "create action" do
test "saving as a draft as a solver", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "draft",
"submission" => %{
"title" => "Test title"
},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
assert %{id: id} = redirected_params(conn)
assert redirected_to(conn) === Routes.submission_path(conn, :edit, id)
end
test "saving as a draft as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "draft",
"submission" => %{
"solver_addr" => solver_user.email,
"title" => "Test title"
},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
assert %{id: id} = redirected_params(conn)
assert redirected_to(conn) === Routes.submission_path(conn, :edit, id)
end
test "creating a submission and review as a solver", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "review",
"submission" => %{
"title" => "Test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => "Test external url",
"terms_accepted" => "true",
"review_verified" => "true"
},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
assert %{id: id} = redirected_params(conn)
assert redirected_to(conn) === Routes.submission_path(conn, :show, id)
end
test "creating a submission and review as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "review",
"submission" => %{
"solver_addr" => solver_user.email,
"title" => "Test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => "Test external url"
},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
assert %{id: id} = redirected_params(conn)
assert redirected_to(conn) === Routes.submission_path(conn, :show, id)
end
test "creating a submission and review with missing params", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "review",
"submission" => %{},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
%{changeset: changeset} = conn.assigns
assert conn.status === 422
assert changeset.errors[:title]
assert changeset.errors[:brief_description]
assert changeset.errors[:description]
end
test "creating a submission and review as a challenge owner", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
admin_user = AccountHelpers.create_user(%{email: "admin_user_2@example.com", role: "admin"})
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
phase = challenge.phases |> Enum.at(0)
params = %{
"action" => "review",
"submission" => %{
"title" => "Test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => "Test external url"
},
"challenge_id" => challenge.id,
"phase_id" => "#{phase.id}"
}
conn = post(conn, Routes.challenge_submission_path(conn, :create, challenge.id), params)
assert get_flash(conn, :error) === "You are not authorized"
assert redirected_to(conn) === Routes.dashboard_path(conn, :index)
end
end
describe "edit action" do
test "viewing the edit submission form for a draft", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_draft_submission(%{}, user, challenge)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
%{submission: submission, changeset: changeset} = conn.assigns
assert changeset
assert submission.status === "draft"
end
test "viewing the edit submission form for a submitted submission", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
%{submission: submission, changeset: changeset} = conn.assigns
assert changeset
assert submission.status === "submitted"
end
test "viewing the edit submission form for a submission you don't own", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: solver_user} = conn.assigns
solver_user_2 = AccountHelpers.create_user(%{email: "user_2@example.com"})
challenge =
ChallengeHelpers.create_single_phase_challenge(solver_user, %{user_id: solver_user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, solver_user_2, challenge)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
assert get_flash(conn, :error) === "You are not authorized to edit this submission"
assert redirected_to(conn) === Routes.submission_path(conn, :index)
end
test "failure: viewing the edit submission form for a solver created submission as an admin",
%{conn: conn} do
conn = prep_conn_admin(conn)
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(solver_user, %{user_id: solver_user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{
"terms_accepted" => "true",
"review_verified" => "true"
},
solver_user,
challenge
)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
assert get_flash(conn, :error) === "You are not authorized to edit this submission"
assert redirected_to(conn) ===
Routes.challenge_phase_managed_submission_path(
conn,
:managed_submissions,
submission.challenge_id,
submission.phase_id
)
end
test "failure: viewing the edit submission form for an admin created submission that's been verified as an admin",
%{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(solver_user, %{user_id: solver_user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{
"manager_id" => admin_user.id,
"terms_accepted" => "true",
"review_verified" => "true"
},
solver_user,
challenge
)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
assert get_flash(conn, :error) === "Submission cannot be edited"
assert redirected_to(conn) ===
Routes.challenge_phase_managed_submission_path(
conn,
:managed_submissions,
submission.challenge_id,
submission.phase_id
)
end
test "failure: viewing the edit submission form as a Challenge Owner", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: challenge_owner} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com"})
challenge =
ChallengeHelpers.create_single_phase_challenge(challenge_owner, %{
user_id: challenge_owner.id
})
submission = SubmissionHelpers.create_submitted_submission(%{}, solver_user, challenge)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
assert get_flash(conn, :error) === "You are not authorized"
assert redirected_to(conn) === Routes.dashboard_path(conn, :index)
end
end
describe "update action" do
test "updating a draft submission and saving as draft as a solver", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: solver_user} = conn.assigns
admin_user = AccountHelpers.create_user(%{email: "admin_user_2@example.com", role: "admin"})
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_draft_submission(
%{"manager_id" => admin_user.id},
solver_user,
challenge
)
params = %{
"action" => "draft",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => nil
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "draft"
assert get_flash(conn, :info) === "Submission draft saved"
assert redirected_to(conn) === Routes.submission_path(conn, :edit, submission.id)
end
test "updating a draft submission and saving as draft as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_draft_submission(
%{"manager_id" => admin_user.id},
admin_user,
challenge
)
params = %{
"action" => "draft",
"submission" => %{
"solver_addr" => solver_user.email,
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => nil
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "draft"
assert get_flash(conn, :info) === "Submission draft saved"
assert redirected_to(conn) === Routes.submission_path(conn, :edit, submission.id)
end
test "updating a submitted submission--no save draft option", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
conn = get(conn, Routes.submission_path(conn, :edit, submission.id))
assert html_response(conn, 200) =~ "Review and submit"
refute html_response(conn, 200) =~ "Save draft"
end
test "failure: updating a submitted submission and saving as draft", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
params = %{
"action" => "draft",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "Test description",
"external_url" => nil
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "submitted"
assert get_flash(conn, :error) === "Submission cannot be saved as a draft"
assert redirected_to(conn) === Routes.submission_path(conn, :edit, submission.id)
end
test "updating a submission and sending to review with errors", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => nil,
"external_url" => nil
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
%{changeset: changeset} = conn.assigns
assert changeset.errors[:description]
end
test "failure: updating a submission as a solver without accepting the terms", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "New test description",
"terms_accepted" => "false"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
%{changeset: changeset} = conn.assigns
assert changeset.errors[:terms_accepted]
end
test "failure: updating an unverified submission as a solver without verifying it", %{
conn: conn
} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{"manager_id" => user.id},
solver_user,
challenge
)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "New test description",
"terms_accepted" => "true",
"review_verified" => "false"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
%{changeset: changeset} = conn.assigns
assert changeset.errors[:review_verified]
end
test "failure: updating a verified submission as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{
"manager_id" => admin_user.id,
"terms_accepted" => "true",
"review_verified" => "true"
},
admin_user,
challenge
)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "Test brief description",
"description" => "New test description"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
assert get_flash(conn, :error) === "Submission cannot be edited"
assert redirected_to(conn) === Routes.submission_path(conn, :index)
end
test "updating a submission and sending to review as a solver", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "New test brief description",
"description" => "New test description",
"external_url" => "www.test_example.com",
"terms_accepted" => "true",
"review_verified" => "true"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "draft"
assert redirected_to(conn) === Routes.submission_path(conn, :show, submission.id)
end
test "updating an admin created submission and sending to review as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{"manager_id" => admin_user.id},
admin_user,
challenge
)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "New test brief description",
"description" => "New test description",
"external_url" => "www.test_example.com"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "draft"
assert redirected_to(conn) === Routes.submission_path(conn, :show, submission.id)
end
test "attempting to update a submission that you don't own", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
user_2 = AccountHelpers.create_user(%{email: "user_2@example.com"})
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user_2, challenge)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "New test brief description",
"description" => "New test description",
"external_url" => "www.test_example.com"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
assert get_flash(conn, :error) === "You are not authorized to edit this submission"
assert redirected_to(conn) === Routes.submission_path(conn, :index)
end
test "updating a submission to submitted", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_draft_submission(%{}, user, challenge)
conn = put(conn, Routes.submission_path(conn, :submit, submission.id))
{:ok, submission} = Submissions.get(submission.id)
assert submission.status === "submitted"
assert redirected_to(conn) === Routes.submission_path(conn, :show, submission.id)
end
test "attempting to update a submission that was deleted", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
{:ok, submission} = Submissions.delete(submission)
conn = put(conn, Routes.submission_path(conn, :update, submission.id), %{})
assert conn.status === 302
assert get_flash(conn, :error) === "Submission not found"
end
test "attempting to update a submission that doesn't exist", %{conn: conn} do
conn = prep_conn(conn)
conn = put(conn, Routes.submission_path(conn, :update, 1), %{})
assert conn.status === 302
assert get_flash(conn, :error) === "Submission not found"
end
test "failure: attempting to update a submission as a Challenge Owner", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: challenge_owner} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com"})
challenge =
ChallengeHelpers.create_single_phase_challenge(challenge_owner, %{
user_id: challenge_owner.id
})
submission = SubmissionHelpers.create_submitted_submission(%{}, solver_user, challenge)
params = %{
"action" => "review",
"submission" => %{
"title" => "New test title",
"brief_description" => "New test brief description",
"description" => "New test description",
"external_url" => "www.test_example.com"
}
}
conn = put(conn, Routes.submission_path(conn, :update, submission.id), params)
assert get_flash(conn, :error) === "You are not authorized"
assert redirected_to(conn) === Routes.dashboard_path(conn, :index)
end
end
describe "updating judging status" do
test "success: selecting for judging", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = Enum.at(challenge.phases, 0)
referer = Routes.challenge_phase_path(conn, :show, challenge.id, phase.id)
conn = Plug.Conn.update_req_header(conn, "referer", referer, &(&1 <> "; charset=utf-8"))
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
assert submission.judging_status === "not_selected"
conn =
put(
conn,
Routes.api_submission_path(
conn,
:update_judging_status,
submission.id,
"selected"
)
)
{:ok, updated_submission} = Submissions.get(submission.id)
assert updated_submission.judging_status === "selected"
assert response(conn, 200) ===
Jason.encode!(
Web.PhaseView.get_judging_status_button_values(
conn,
challenge,
phase,
updated_submission,
submission.judging_status,
%{}
)
)
end
test "success: unselecting for judging", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = Enum.at(challenge.phases, 0)
referer = Routes.challenge_phase_path(conn, :show, challenge.id, phase.id)
conn = Plug.Conn.update_req_header(conn, "referer", referer, &(&1 <> "; charset=utf-8"))
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
{:ok, submission} = Submissions.update_judging_status(submission, "selected")
assert submission.judging_status === "selected"
conn =
put(
conn,
Routes.api_submission_path(
conn,
:update_judging_status,
submission.id,
"not_selected"
)
)
{:ok, updated_submission} = Submissions.get(submission.id)
assert updated_submission.judging_status === "not_selected"
assert response(conn, 200) ===
Jason.encode!(
Web.PhaseView.get_judging_status_button_values(
conn,
challenge,
phase,
updated_submission,
submission.judging_status,
%{}
)
)
end
test "failure: invalid status", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = Enum.at(challenge.phases, 0)
referer = Routes.challenge_phase_path(conn, :show, challenge.id, phase.id)
conn = Plug.Conn.update_req_header(conn, "referer", referer, &(&1 <> "; charset=utf-8"))
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
assert submission.judging_status === "not_selected"
conn =
put(
conn,
Routes.api_submission_path(
conn,
:update_judging_status,
submission.id,
"invalid"
)
)
assert response(conn, 400) === ""
{:ok, updated_submission} = Submissions.get(submission.id)
assert updated_submission.judging_status === "not_selected"
end
test "failure: solver not authorized", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
phase = Enum.at(challenge.phases, 0)
referer = Routes.challenge_phase_path(conn, :show, challenge.id, phase.id)
conn = Plug.Conn.update_req_header(conn, "referer", referer, &(&1 <> "; charset=utf-8"))
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
assert submission.judging_status === "not_selected"
conn =
put(
conn,
Routes.api_submission_path(
conn,
:update_judging_status,
submission.id,
"selected"
)
)
assert get_flash(conn, :error) === "You are not authorized"
assert redirected_to(conn) === Routes.dashboard_path(conn, :index)
{:ok, updated_submission} = Submissions.get(submission.id)
assert updated_submission.judging_status === "not_selected"
end
test "failure: challenge owner not authorized", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: user} = conn.assigns
different_challenge_owner =
AccountHelpers.create_user(%{
email: "challenge_owner2@example.com",
role: "challenge_owner"
})
challenge =
ChallengeHelpers.create_single_phase_challenge(different_challenge_owner, %{
user_id: user.id
})
phase = Enum.at(challenge.phases, 0)
referer = Routes.challenge_phase_path(conn, :show, challenge.id, phase.id)
conn = Plug.Conn.update_req_header(conn, "referer", referer, &(&1 <> "; charset=utf-8"))
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
assert submission.judging_status === "not_selected"
conn =
put(
conn,
Routes.api_submission_path(
conn,
:update_judging_status,
submission.id,
"selected"
)
)
assert response(conn, 403) === ""
{:ok, updated_submission} = Submissions.get(submission.id)
assert updated_submission.judging_status === "not_selected"
end
end
describe "delete action" do
test "deleting a draft submission you own", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_draft_submission(%{}, user, challenge)
conn = delete(conn, Routes.submission_path(conn, :delete, submission.id))
assert {:error, :not_found} === Submissions.get(submission.id)
assert get_flash(conn, :info) === "Submission deleted"
assert redirected_to(conn) === Routes.submission_path(conn, :index)
end
test "deleting a submitted submission you own", %{conn: conn} do
conn = prep_conn(conn)
%{current_user: user} = conn.assigns
challenge = ChallengeHelpers.create_single_phase_challenge(user, %{user_id: user.id})
submission = SubmissionHelpers.create_submitted_submission(%{}, user, challenge)
conn = delete(conn, Routes.submission_path(conn, :delete, submission.id))
assert {:error, :not_found} === Submissions.get(submission.id)
assert get_flash(conn, :info) === "Submission deleted"
assert redirected_to(conn) === Routes.submission_path(conn, :index)
end
test "deleting a draft submission as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_draft_submission(
%{"manager_id" => admin_user.id},
solver_user,
challenge
)
conn = delete(conn, Routes.submission_path(conn, :delete, submission))
assert {:error, :not_found} === Submissions.get(submission.id)
assert get_flash(conn, :info) === "Submission deleted"
assert redirected_to(conn) ===
Routes.challenge_phase_managed_submission_path(
conn,
:managed_submissions,
submission.challenge_id,
submission.phase_id
)
end
test "deleting a submitted submission as an admin", %{conn: conn} do
conn = prep_conn_admin(conn)
%{current_user: admin_user} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(admin_user, %{user_id: admin_user.id})
submission =
SubmissionHelpers.create_submitted_submission(
%{"manager_id" => admin_user.id},
solver_user,
challenge
)
conn = delete(conn, Routes.submission_path(conn, :delete, submission))
assert {:error, :not_found} === Submissions.get(submission.id)
assert get_flash(conn, :info) === "Submission deleted"
assert redirected_to(conn) ===
Routes.challenge_phase_managed_submission_path(
conn,
:managed_submissions,
submission.challenge_id,
submission.phase_id
)
end
test "failure: deleting a submitted submission as a Challenge Owner", %{conn: conn} do
conn = prep_conn_challenge_owner(conn)
%{current_user: challenge_owner} = conn.assigns
solver_user = AccountHelpers.create_user(%{email: "solver@example.com", role: "solver"})
challenge =
ChallengeHelpers.create_single_phase_challenge(challenge_owner, %{
user_id: challenge_owner.id
})
submission =
SubmissionHelpers.create_submitted_submission(
%{},
solver_user,
challenge
)
conn = delete(conn, Routes.submission_path(conn, :delete, submission))
assert get_flash(conn, :error) === "You are not authorized"
assert redirected_to(conn) === Routes.dashboard_path(conn, :index)
end
end
defp prep_conn(conn) do
user = AccountHelpers.create_user()
assign(conn, :current_user, user)
end
defp prep_conn_admin(conn) do
user = AccountHelpers.create_user(%{email: "admin@example.com", role: "admin"})
assign(conn, :current_user, user)
end
defp prep_conn_challenge_owner(conn) do
user =
AccountHelpers.create_user(%{email: "challenge_owner@example.com", role: "challenge_owner"})
assign(conn, :current_user, user)
end
end
# File: lib/atecc508a/transport/i2c_supervisor.ex
# Repo: bcdevices/atecc508a (Apache-2.0)
defmodule ATECC508A.Transport.I2CSupervisor do
use DynamicSupervisor
@moduledoc false
def start_link(args) do
DynamicSupervisor.start_link(__MODULE__, args, name: __MODULE__)
end
@spec start_child(binary() | charlist(), Circuits.I2C.address(), atom()) ::
DynamicSupervisor.on_start_child()
def start_child(bus_name, address, name) do
spec = {ATECC508A.Transport.I2CServer, [bus_name, address, name]}
DynamicSupervisor.start_child(__MODULE__, spec)
end
@impl true
def init(_args) do
DynamicSupervisor.init(strategy: :one_for_one)
end
end
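# Hypothetical usage sketch (not from the original file): once this supervisor
# is running, a transport process can be started for a device. The bus name
# "i2c-1", the address 0x60, and the registered name below are illustrative
# assumptions, not values taken from the repo.
#
#     {:ok, _pid} =
#       ATECC508A.Transport.I2CSupervisor.start_child("i2c-1", 0x60, MyApp.ATECC)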
# File: lib/ex_unit/lib/ex_unit/cli_formatter.ex
# Repo: davidsulc/elixir (Apache-2.0)
defmodule ExUnit.CLIFormatter do
@moduledoc false
use GenServer
import ExUnit.Formatter, only: [format_time: 2, format_filters: 2, format_test_failure: 5,
format_test_all_failure: 5]
## Callbacks
def init(opts) do
print_filters(Keyword.take(opts, [:include, :exclude]))
config = %{
seed: opts[:seed],
trace: opts[:trace],
colors: Keyword.put_new(opts[:colors], :enabled, IO.ANSI.enabled?),
width: get_terminal_width(),
slowest: opts[:slowest],
test_counter: %{},
test_timings: [],
failure_counter: 0,
skipped_counter: 0,
invalid_counter: 0
}
{:ok, config}
end
def handle_cast({:suite_started, _opts}, config) do
{:noreply, config}
end
def handle_cast({:suite_finished, run_us, load_us}, config) do
print_suite(config, run_us, load_us)
{:noreply, config}
end
def handle_cast({:test_started, %ExUnit.Test{} = test}, config) do
if config.trace, do: IO.write " * #{test.name}"
{:noreply, config}
end
def handle_cast({:test_finished, %ExUnit.Test{state: nil} = test}, config) do
if config.trace do
IO.puts success(trace_test_result(test), config)
else
IO.write success(".", config)
end
{:noreply, %{config | test_counter: update_test_counter(config.test_counter, test),
test_timings: update_test_timings(config.test_timings, test)}}
end
def handle_cast({:test_finished, %ExUnit.Test{state: {:skip, _}} = test}, config) do
if config.trace, do: IO.puts trace_test_skip(test)
{:noreply, %{config | test_counter: update_test_counter(config.test_counter, test),
skipped_counter: config.skipped_counter + 1}}
end
def handle_cast({:test_finished, %ExUnit.Test{state: {:invalid, _}} = test}, config) do
if config.trace do
IO.puts invalid(trace_test_result(test), config)
else
IO.write invalid("?", config)
end
{:noreply, %{config | test_counter: update_test_counter(config.test_counter, test),
invalid_counter: config.invalid_counter + 1}}
end
def handle_cast({:test_finished, %ExUnit.Test{state: {:failed, failures}} = test}, config) do
if config.trace do
IO.puts failure(trace_test_result(test), config)
end
formatted = format_test_failure(test, failures, config.failure_counter + 1,
config.width, &formatter(&1, &2, config))
print_failure(formatted, config)
print_logs(test.logs)
{:noreply, %{config | test_counter: update_test_counter(config.test_counter, test),
test_timings: update_test_timings(config.test_timings, test),
failure_counter: config.failure_counter + 1}}
end
def handle_cast({:module_started, %ExUnit.TestModule{name: name}}, config) do
if config.trace do
IO.puts("\n#{inspect name}")
end
{:noreply, config}
end
def handle_cast({:module_finished, %ExUnit.TestModule{state: nil}}, config) do
{:noreply, config}
end
def handle_cast({:module_finished, %ExUnit.TestModule{state: {:failed, failures}} = test_module}, config) do
tests_length = length(test_module.tests)
formatted = format_test_all_failure(test_module, failures, config.failure_counter + tests_length,
config.width, &formatter(&1, &2, config))
print_failure(formatted, config)
test_counter = Enum.reduce(test_module.tests, config.test_counter, &update_test_counter(&2, &1))
{:noreply, %{config | test_counter: test_counter, failure_counter: config.failure_counter + tests_length}}
end
def handle_cast(_, config) do
{:noreply, config}
end
## Tracing
defp trace_test_time(%ExUnit.Test{time: time}) do
"#{format_us(time)}ms"
end
defp trace_test_result(test) do
"\r * #{test.name} (#{trace_test_time(test)})"
end
defp trace_test_skip(test) do
"\r * #{test.name} (skipped)"
end
defp normalize_us(us) do
div(us, 1000)
end
defp format_us(us) do
us = div(us, 10)
if us < 10 do
"0.0#{us}"
else
us = div us, 10
"#{div(us, 10)}.#{rem(us, 10)}"
end
end
defp update_test_counter(test_counter, %{tags: %{type: type}}) do
Map.update(test_counter, type, 1, &(&1 + 1))
end
## Slowest
defp format_slowest_total(%{slowest: slowest} = config, run_us) do
slowest_us =
config
|> extract_slowest_tests()
|> Enum.reduce(0, & &1.time + &2)
slowest_time =
slowest_us
|> normalize_us()
|> format_us()
percentage = Float.round(((slowest_us / run_us) * 100), 1)
"Top #{slowest} slowest (#{slowest_time}s), #{percentage}% of total time:\n"
end
defp format_slowest_times(config) do
config
|> extract_slowest_tests()
|> Enum.map(&format_slow_test/1)
end
defp format_slow_test(%ExUnit.Test{name: name, time: time, module: module}) do
" * #{name} (#{format_us(time)}ms) (#{inspect module})\n"
end
defp extract_slowest_tests(%{slowest: slowest, test_timings: timings} = _config) do
timings
|> Enum.sort_by(fn %{time: time} -> -time end)
|> Enum.take(slowest)
end
defp update_test_timings(timings, %ExUnit.Test{} = test) do
[test | timings]
end
## Printing
defp print_suite(config, run_us, load_us) do
IO.write "\n\n"
IO.puts format_time(run_us, load_us)
if config.slowest > 0 do
IO.write "\n"
IO.puts format_slowest_total(config, run_us)
IO.puts format_slowest_times(config)
end
# singular/plural
test_type_counts = format_test_type_counts(config)
failure_pl = pluralize(config.failure_counter, "failure", "failures")
message =
"#{test_type_counts}#{config.failure_counter} #{failure_pl}"
|> if_true(config.skipped_counter > 0, & &1 <> ", #{config.skipped_counter} skipped")
|> if_true(config.invalid_counter > 0, & &1 <> ", #{config.invalid_counter} invalid")
cond do
config.failure_counter > 0 -> IO.puts failure(message, config)
config.invalid_counter > 0 -> IO.puts invalid(message, config)
true -> IO.puts success(message, config)
end
IO.puts "\nRandomized with seed #{config.seed}"
end
defp if_true(value, false, _fun), do: value
defp if_true(value, true, fun), do: fun.(value)
defp print_filters([include: [], exclude: []]) do
:ok
end
defp print_filters([include: include, exclude: exclude]) do
if include != [], do: IO.puts format_filters(include, :include)
if exclude != [], do: IO.puts format_filters(exclude, :exclude)
IO.puts("")
:ok
end
defp print_failure(formatted, config) do
cond do
config.trace -> IO.puts ""
true -> IO.puts "\n"
end
IO.puts formatted
end
defp format_test_type_counts(%{test_counter: test_counter} = _config) do
Enum.map test_counter, fn {type, count} ->
type_pluralized = pluralize(count, type, ExUnit.plural_rule(type |> to_string()))
"#{count} #{type_pluralized}, "
end
end
# Color styles
defp colorize(escape, string, %{colors: colors}) do
if colors[:enabled] do
[escape, string, :reset]
|> IO.ANSI.format_fragment(true)
|> IO.iodata_to_binary
else
string
end
end
defp success(msg, config) do
colorize(:green, msg, config)
end
defp invalid(msg, config) do
colorize(:yellow, msg, config)
end
defp failure(msg, config) do
colorize(:red, msg, config)
end
defp formatter(:diff_enabled?, _, %{colors: colors}),
do: colors[:enabled]
defp formatter(:error_info, msg, config),
do: colorize(:red, msg, config)
defp formatter(:extra_info, msg, config),
do: colorize(:cyan, msg, config)
defp formatter(:location_info, msg, config),
do: colorize([:bright, :black], msg, config)
defp formatter(:diff_delete, msg, config),
do: colorize(:red, msg, config)
defp formatter(:diff_delete_whitespace, msg, config),
do: colorize(IO.ANSI.color_background(2, 0, 0), msg, config)
defp formatter(:diff_insert, msg, config),
do: colorize(:green, msg, config)
defp formatter(:diff_insert_whitespace, msg, config),
do: colorize(IO.ANSI.color_background(0, 2, 0), msg, config)
defp formatter(:blame_diff, msg, %{colors: colors} = config) do
if colors[:enabled] do
colorize(:red, msg, config)
else
"-" <> msg <> "-"
end
end
defp formatter(_, msg, _config),
do: msg
defp pluralize(1, singular, _plural), do: singular
defp pluralize(_, _singular, plural), do: plural
defp get_terminal_width do
case :io.columns do
{:ok, width} -> max(40, width)
_ -> 80
end
end
defp print_logs(""), do: nil
defp print_logs(output) do
indent = "\n "
output = String.replace(output, "\n", indent)
IO.puts([" The following output was logged:", indent | output])
end
end
# File: lib/forget/table.ex
# Repo: MaethorNaur/forget (MIT)
defmodule Forget.Table do
defmacro __using__(_opts) do
quote do
import Forget.Table, only: [deftable: 2]
require Record
end
end
defmacro deftable(name, [_ | _] = do_block) when is_atom(name) and not is_nil(name) do
# Valid name/do-block combination; table generation is not implemented yet,
# so this clause is intentionally a stub.
end
defmacro deftable(_name, _do_block) do
description = """
deftable/2 requires an atom as the name and a non-empty do block, e.g.
deftable Test do
fields :name
end
"""
raise %SyntaxError{
file: __ENV__.file,
line: __ENV__.line,
description: description
}
end
end
# File: lib/elixir_ex_aliyun_ots_table_store_search_term_query.ex
# Repo: hou8/tablestore_protos (MIT)
# credo:disable-for-this-file
defmodule(ExAliyunOts.TableStoreSearch.TermQuery) do
@moduledoc false
(
defstruct(field_name: nil, term: nil)
(
(
@spec encode(struct) :: {:ok, iodata} | {:error, any}
def(encode(msg)) do
try do
{:ok, encode!(msg)}
rescue
e in [Protox.EncodingError, Protox.RequiredFieldsError] ->
{:error, e}
end
end
@spec encode!(struct) :: iodata | no_return
def(encode!(msg)) do
[] |> encode_field_name(msg) |> encode_term(msg)
end
)
[]
[
defp(encode_field_name(acc, msg)) do
try do
case(msg.field_name) do
nil ->
acc
_ ->
[acc, "\n", Protox.Encode.encode_string(msg.field_name)]
end
rescue
ArgumentError ->
reraise(
Protox.EncodingError.new(:field_name, "invalid field value"),
__STACKTRACE__
)
end
end,
defp(encode_term(acc, msg)) do
try do
case(msg.term) do
nil ->
acc
_ ->
[acc, <<18>>, Protox.Encode.encode_bytes(msg.term)]
end
rescue
ArgumentError ->
reraise(Protox.EncodingError.new(:term, "invalid field value"), __STACKTRACE__)
end
end
]
[]
)
(
(
@spec decode(binary) :: {:ok, struct} | {:error, any}
def(decode(bytes)) do
try do
{:ok, decode!(bytes)}
rescue
e in [Protox.DecodingError, Protox.IllegalTagError, Protox.RequiredFieldsError] ->
{:error, e}
end
end
(
@spec decode!(binary) :: struct | no_return
def(decode!(bytes)) do
parse_key_value(bytes, struct(ExAliyunOts.TableStoreSearch.TermQuery))
end
)
)
(
@spec parse_key_value(binary, struct) :: struct
defp(parse_key_value(<<>>, msg)) do
msg
end
defp(parse_key_value(bytes, msg)) do
{field, rest} =
case(Protox.Decode.parse_key(bytes)) do
{0, _, _} ->
raise(%Protox.IllegalTagError{})
{1, _, bytes} ->
{len, bytes} = Protox.Varint.decode(bytes)
{delimited, rest} = Protox.Decode.parse_delimited(bytes, len)
{[field_name: delimited], rest}
{2, _, bytes} ->
{len, bytes} = Protox.Varint.decode(bytes)
{delimited, rest} = Protox.Decode.parse_delimited(bytes, len)
{[term: delimited], rest}
{tag, wire_type, rest} ->
{_, rest} = Protox.Decode.parse_unknown(tag, wire_type, rest)
{[], rest}
end
msg_updated = struct(msg, field)
parse_key_value(rest, msg_updated)
end
)
[]
)
(
@spec json_decode(iodata(), keyword()) :: {:ok, struct()} | {:error, any()}
def(json_decode(input, opts \\ [])) do
try do
{:ok, json_decode!(input, opts)}
rescue
e in Protox.JsonDecodingError ->
{:error, e}
end
end
@spec json_decode!(iodata(), keyword()) :: struct() | no_return()
def(json_decode!(input, opts \\ [])) do
{json_library_wrapper, json_library} = Protox.JsonLibrary.get_library(opts, :decode)
Protox.JsonDecode.decode!(
input,
ExAliyunOts.TableStoreSearch.TermQuery,
&json_library_wrapper.decode!(json_library, &1)
)
end
@spec json_encode(struct(), keyword()) :: {:ok, iodata()} | {:error, any()}
def(json_encode(msg, opts \\ [])) do
try do
{:ok, json_encode!(msg, opts)}
rescue
e in Protox.JsonEncodingError ->
{:error, e}
end
end
@spec json_encode!(struct(), keyword()) :: iodata() | no_return()
def(json_encode!(msg, opts \\ [])) do
{json_library_wrapper, json_library} = Protox.JsonLibrary.get_library(opts, :encode)
Protox.JsonEncode.encode!(msg, &json_library_wrapper.encode!(json_library, &1))
end
)
@deprecated "Use fields_defs()/0 instead"
@spec defs() :: %{
required(non_neg_integer) => {atom, Protox.Types.kind(), Protox.Types.type()}
}
def(defs()) do
%{1 => {:field_name, {:scalar, ""}, :string}, 2 => {:term, {:scalar, ""}, :bytes}}
end
@deprecated "Use fields_defs()/0 instead"
@spec defs_by_name() :: %{
required(atom) => {non_neg_integer, Protox.Types.kind(), Protox.Types.type()}
}
def(defs_by_name()) do
%{field_name: {1, {:scalar, ""}, :string}, term: {2, {:scalar, ""}, :bytes}}
end
@spec fields_defs() :: list(Protox.Field.t())
def(fields_defs()) do
[
%{
__struct__: Protox.Field,
json_name: "fieldName",
kind: {:scalar, ""},
label: :optional,
name: :field_name,
tag: 1,
type: :string
},
%{
__struct__: Protox.Field,
json_name: "term",
kind: {:scalar, ""},
label: :optional,
name: :term,
tag: 2,
type: :bytes
}
]
end
[
@spec(field_def(atom) :: {:ok, Protox.Field.t()} | {:error, :no_such_field}),
(
def(field_def(:field_name)) do
{:ok,
%{
__struct__: Protox.Field,
json_name: "fieldName",
kind: {:scalar, ""},
label: :optional,
name: :field_name,
tag: 1,
type: :string
}}
end
def(field_def("fieldName")) do
{:ok,
%{
__struct__: Protox.Field,
json_name: "fieldName",
kind: {:scalar, ""},
label: :optional,
name: :field_name,
tag: 1,
type: :string
}}
end
def(field_def("field_name")) do
{:ok,
%{
__struct__: Protox.Field,
json_name: "fieldName",
kind: {:scalar, ""},
label: :optional,
name: :field_name,
tag: 1,
type: :string
}}
end
),
(
def(field_def(:term)) do
{:ok,
%{
__struct__: Protox.Field,
json_name: "term",
kind: {:scalar, ""},
label: :optional,
name: :term,
tag: 2,
type: :bytes
}}
end
def(field_def("term")) do
{:ok,
%{
__struct__: Protox.Field,
json_name: "term",
kind: {:scalar, ""},
label: :optional,
name: :term,
tag: 2,
type: :bytes
}}
end
[]
),
def(field_def(_)) do
{:error, :no_such_field}
end
]
[]
@spec required_fields() :: []
def(required_fields()) do
[]
end
@spec syntax() :: atom
def(syntax()) do
:proto2
end
[
@spec(default(atom) :: {:ok, boolean | integer | String.t() | float} | {:error, atom}),
def(default(:field_name)) do
{:ok, ""}
end,
def(default(:term)) do
{:ok, ""}
end,
def(default(_)) do
{:error, :no_such_field}
end
]
)
end
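# Hypothetical round-trip sketch (not part of the generated file): encoding a
# TermQuery and decoding it back. The field values below are illustrative
# assumptions.
#
#     msg = %ExAliyunOts.TableStoreSearch.TermQuery{field_name: "age", term: <<1>>}
#     {:ok, iodata} = ExAliyunOts.TableStoreSearch.TermQuery.encode(msg)
#     ExAliyunOts.TableStoreSearch.TermQuery.decode!(IO.iodata_to_binary(iodata))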
# File: priv/cabbage/apps/itest/test/itest/watcher_status_test.exs
# Repo: boolafish/elixir-omg (Apache-2.0)
# Copyright 2019-2020 OmiseGO Pte Ltd
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
defmodule WatcherStatusTests do
use Cabbage.Feature, async: false, file: "watcher_status.feature"
require Logger
defwhen ~r/^Operator requests the watcher's status$/, %{}, state do
{:ok, response} = WatcherSecurityCriticalAPI.Api.Status.status_get(WatcherSecurityCriticalAPI.Connection.new())
body = Jason.decode!(response.body)
{:ok, Map.put(state, :service_response, body)}
end
defthen ~r/^Operator can read "(?<contract_name>[^"]+)" contract address$/, %{contract_name: contract_name}, state do
assert "0x" <> _ = state.service_response["data"]["contract_addr"][contract_name]
assert byte_size(state.service_response["data"]["contract_addr"][contract_name]) == 42
{:ok, state}
end
defthen ~r/^Operator can read byzantine_events$/, %{}, state do
assert is_list(state.service_response["data"]["byzantine_events"])
{:ok, state}
end
defthen ~r/^Operator can read eth_syncing$/, %{}, state do
assert is_boolean(state.service_response["data"]["eth_syncing"])
{:ok, state}
end
defthen ~r/^Operator can read in_flight_exits$/, %{}, state do
assert is_list(state.service_response["data"]["in_flight_exits"])
{:ok, state}
end
defthen ~r/^Operator can read last_mined_child_block_number$/, %{}, state do
assert is_integer(state.service_response["data"]["last_mined_child_block_number"])
{:ok, state}
end
defthen ~r/^Operator can read last_mined_child_block_timestamp$/, %{}, state do
assert is_integer(state.service_response["data"]["last_mined_child_block_timestamp"])
{:ok, state}
end
defthen ~r/^Operator can read last_seen_eth_block_number$/, %{}, state do
assert is_integer(state.service_response["data"]["last_seen_eth_block_number"])
{:ok, state}
end
defthen ~r/^Operator can read last_seen_eth_block_timestamp$/, %{}, state do
assert is_integer(state.service_response["data"]["last_seen_eth_block_timestamp"])
{:ok, state}
end
defthen ~r/^Operator can read last_validated_child_block_number$/, %{}, state do
assert is_integer(state.service_response["data"]["last_validated_child_block_number"])
{:ok, state}
end
defthen ~r/^Operator can read last_validated_child_block_timestamp$/, %{}, state do
assert is_integer(state.service_response["data"]["last_validated_child_block_timestamp"])
{:ok, state}
end
defthen ~r/^Operator can read services_synced_heights$/, %{}, state do
services = state.service_response["data"]["services_synced_heights"]
_ =
Enum.each(services, fn service ->
assert is_binary(service["service"])
assert is_integer(service["height"])
end)
{:ok, state}
end
end
| 36.333333 | 119 | 0.719572 |
08fe896552210a0236c176bb2b8d465fe8276861 | 368 | exs | Elixir | test/wallaby/browser/stale_nodes_test.exs | schnittchen/wallaby | 30be89cc78087e53e5b47a86043c2bbe8566bbf4 | ["MIT"] | null | null | null | test/wallaby/browser/stale_nodes_test.exs | schnittchen/wallaby | 30be89cc78087e53e5b47a86043c2bbe8566bbf4 | ["MIT"] | null | null | null | test/wallaby/browser/stale_nodes_test.exs | schnittchen/wallaby | 30be89cc78087e53e5b47a86043c2bbe8566bbf4 | ["MIT"] | null | null | null |
defmodule Wallaby.Browser.StaleElementsTest do
use Wallaby.SessionCase, async: true
describe "when a DOM element becomes stale" do
test "the query is retried", %{session: session} do
element =
session
|> visit("stale_nodes.html")
|> find(Query.css(".stale-node", text: "Stale", count: 1))
assert element
end
end
end
| 24.533333 | 66 | 0.646739 |
08febdcba1bed94a4b760f7e20fe356f6e82d2a8 | 27,354 | exs | Elixir | test/floki/html/generated/tokenizer/namedEntities_part29_test.exs | nathanl/floki | 042b3f60f4d9a6218ec85d558d13cc6dac30c587 | ["MIT"] | 1,778 | 2015-01-07T14:12:31.000Z | 2022-03-29T22:42:48.000Z | test/floki/html/generated/tokenizer/namedEntities_part29_test.exs | nathanl/floki | 042b3f60f4d9a6218ec85d558d13cc6dac30c587 | ["MIT"] | 279 | 2015-01-01T15:54:50.000Z | 2022-03-28T18:06:03.000Z | test/floki/html/generated/tokenizer/namedEntities_part29_test.exs | nathanl/floki | 042b3f60f4d9a6218ec85d558d13cc6dac30c587 | ["MIT"] | 166 | 2015-04-24T20:48:02.000Z | 2022-03-28T17:29:05.000Z |
defmodule Floki.HTML.Generated.Tokenizer.NamedentitiesPart29Test do
use ExUnit.Case, async: true
# NOTE: This file was generated by "mix generate_tokenizer_tests namedEntities.test".
# html5lib-tests rev: e52ff68cc7113a6ef3687747fa82691079bf9cc5
alias Floki.HTML.Tokenizer
test "tokenize/1 Named entity: capand; with a semi-colon" do
input = "⩄"
output = [["Character", "⩄"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: capbrcup; with a semi-colon" do
input = "⩉"
output = [["Character", "⩉"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: capcap; with a semi-colon" do
input = "⩋"
output = [["Character", "⩋"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: capcup; with a semi-colon" do
input = "⩇"
output = [["Character", "⩇"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: capdot; with a semi-colon" do
input = "⩀"
output = [["Character", "⩀"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: caps; with a semi-colon" do
input = "∩︀"
output = [["Character", "∩︀"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: caret; with a semi-colon" do
input = "⁁"
output = [["Character", "⁁"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: caron; with a semi-colon" do
input = "ˇ"
output = [["Character", "ˇ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccaps; with a semi-colon" do
input = "⩍"
output = [["Character", "⩍"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccaron; with a semi-colon" do
input = "č"
output = [["Character", "č"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccedil without a semi-colon" do
input = "ç"
output = [["Character", "ç"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccedil; with a semi-colon" do
input = "ç"
output = [["Character", "ç"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccirc; with a semi-colon" do
input = "ĉ"
output = [["Character", "ĉ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccups; with a semi-colon" do
input = "⩌"
output = [["Character", "⩌"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ccupssm; with a semi-colon" do
input = "⩐"
output = [["Character", "⩐"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cdot; with a semi-colon" do
input = "ċ"
output = [["Character", "ċ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cedil without a semi-colon" do
input = "¸"
output = [["Character", "¸"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cedil; with a semi-colon" do
input = "¸"
output = [["Character", "¸"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cemptyv; with a semi-colon" do
input = "⦲"
output = [["Character", "⦲"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cent without a semi-colon" do
input = "¢"
output = [["Character", "¢"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cent; with a semi-colon" do
input = "¢"
output = [["Character", "¢"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: centerdot; with a semi-colon" do
input = "·"
output = [["Character", "·"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cfr; with a semi-colon" do
input = "𝔠"
output = [["Character", "𝔠"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: chcy; with a semi-colon" do
input = "ч"
output = [["Character", "ч"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: check; with a semi-colon" do
input = "✓"
output = [["Character", "✓"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: checkmark; with a semi-colon" do
input = "✓"
output = [["Character", "✓"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: chi; with a semi-colon" do
input = "χ"
output = [["Character", "χ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cir; with a semi-colon" do
input = "○"
output = [["Character", "○"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cirE; with a semi-colon" do
input = "⧃"
output = [["Character", "⧃"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circ; with a semi-colon" do
input = "ˆ"
output = [["Character", "ˆ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circeq; with a semi-colon" do
input = "≗"
output = [["Character", "≗"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circlearrowleft; with a semi-colon" do
input = "↺"
output = [["Character", "↺"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circlearrowright; with a semi-colon" do
input = "↻"
output = [["Character", "↻"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circledR; with a semi-colon" do
input = "®"
output = [["Character", "®"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circledS; with a semi-colon" do
input = "Ⓢ"
output = [["Character", "Ⓢ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circledast; with a semi-colon" do
input = "⊛"
output = [["Character", "⊛"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circledcirc; with a semi-colon" do
input = "⊚"
output = [["Character", "⊚"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: circleddash; with a semi-colon" do
input = "⊝"
output = [["Character", "⊝"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cire; with a semi-colon" do
input = "≗"
output = [["Character", "≗"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cirfnint; with a semi-colon" do
input = "⨐"
output = [["Character", "⨐"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cirmid; with a semi-colon" do
input = "⫯"
output = [["Character", "⫯"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cirscir; with a semi-colon" do
input = "⧂"
output = [["Character", "⧂"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: clubs; with a semi-colon" do
input = "♣"
output = [["Character", "♣"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: clubsuit; with a semi-colon" do
input = "♣"
output = [["Character", "♣"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: colon; with a semi-colon" do
input = ":"
output = [["Character", ":"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: colone; with a semi-colon" do
input = "≔"
output = [["Character", "≔"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: coloneq; with a semi-colon" do
input = "≔"
output = [["Character", "≔"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: comma; with a semi-colon" do
input = ","
output = [["Character", ","]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: commat; with a semi-colon" do
input = "@"
output = [["Character", "@"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: comp; with a semi-colon" do
input = "∁"
output = [["Character", "∁"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: compfn; with a semi-colon" do
input = "∘"
output = [["Character", "∘"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: complement; with a semi-colon" do
input = "∁"
output = [["Character", "∁"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: complexes; with a semi-colon" do
input = "ℂ"
output = [["Character", "ℂ"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cong; with a semi-colon" do
input = "≅"
output = [["Character", "≅"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: congdot; with a semi-colon" do
input = "⩭"
output = [["Character", "⩭"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: conint; with a semi-colon" do
input = "∮"
output = [["Character", "∮"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: copf; with a semi-colon" do
input = "𝕔"
output = [["Character", "𝕔"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: coprod; with a semi-colon" do
input = "∐"
output = [["Character", "∐"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: copy without a semi-colon" do
input = "©"
output = [["Character", "©"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: copy; with a semi-colon" do
input = "©"
output = [["Character", "©"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: copysr; with a semi-colon" do
input = "℗"
output = [["Character", "℗"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: crarr; with a semi-colon" do
input = "↵"
output = [["Character", "↵"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cross; with a semi-colon" do
input = "✗"
output = [["Character", "✗"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cscr; with a semi-colon" do
input = "𝒸"
output = [["Character", "𝒸"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: csub; with a semi-colon" do
input = "⫏"
output = [["Character", "⫏"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: csube; with a semi-colon" do
input = "⫑"
output = [["Character", "⫑"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: csup; with a semi-colon" do
input = "⫐"
output = [["Character", "⫐"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: csupe; with a semi-colon" do
input = "⫒"
output = [["Character", "⫒"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: ctdot; with a semi-colon" do
input = "⋯"
output = [["Character", "⋯"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cudarrl; with a semi-colon" do
input = "⤸"
output = [["Character", "⤸"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cudarrr; with a semi-colon" do
input = "⤵"
output = [["Character", "⤵"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cuepr; with a semi-colon" do
input = "⋞"
output = [["Character", "⋞"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cuesc; with a semi-colon" do
input = "⋟"
output = [["Character", "⋟"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cularr; with a semi-colon" do
input = "↶"
output = [["Character", "↶"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cularrp; with a semi-colon" do
input = "⤽"
output = [["Character", "⤽"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cup; with a semi-colon" do
input = "∪"
output = [["Character", "∪"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cupbrcap; with a semi-colon" do
input = "⩈"
output = [["Character", "⩈"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cupcap; with a semi-colon" do
input = "⩆"
output = [["Character", "⩆"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cupcup; with a semi-colon" do
input = "⩊"
output = [["Character", "⩊"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cupdot; with a semi-colon" do
input = "⊍"
output = [["Character", "⊍"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cupor; with a semi-colon" do
input = "⩅"
output = [["Character", "⩅"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cups; with a semi-colon" do
input = "∪︀"
output = [["Character", "∪︀"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curarr; with a semi-colon" do
input = "↷"
output = [["Character", "↷"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curarrm; with a semi-colon" do
input = "⤼"
output = [["Character", "⤼"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curlyeqprec; with a semi-colon" do
input = "⋞"
output = [["Character", "⋞"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curlyeqsucc; with a semi-colon" do
input = "⋟"
output = [["Character", "⋟"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curlyvee; with a semi-colon" do
input = "⋎"
output = [["Character", "⋎"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curlywedge; with a semi-colon" do
input = "⋏"
output = [["Character", "⋏"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curren without a semi-colon" do
input = "¤"
output = [["Character", "¤"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curren; with a semi-colon" do
input = "¤"
output = [["Character", "¤"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curvearrowleft; with a semi-colon" do
input = "↶"
output = [["Character", "↶"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: curvearrowright; with a semi-colon" do
input = "↷"
output = [["Character", "↷"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cuvee; with a semi-colon" do
input = "⋎"
output = [["Character", "⋎"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cuwed; with a semi-colon" do
input = "⋏"
output = [["Character", "⋏"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cwconint; with a semi-colon" do
input = "∲"
output = [["Character", "∲"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cwint; with a semi-colon" do
input = "∱"
output = [["Character", "∱"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: cylcty; with a semi-colon" do
input = "⌭"
output = [["Character", "⌭"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: dArr; with a semi-colon" do
input = "⇓"
output = [["Character", "⇓"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: dHar; with a semi-colon" do
input = "⥥"
output = [["Character", "⥥"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
test "tokenize/1 Named entity: dagger; with a semi-colon" do
input = "†"
output = [["Character", "†"]]
result =
input
|> Tokenizer.tokenize()
|> TokenizerTestLoader.tokenization_result()
assert result.tokens == output
end
end
| 22.62531 | 87 | 0.615011 |
08fee21db4ff4f6d528a70d7d3dc68fc71943c9b | 9,642 | ex | Elixir | lib/site_generator/add_stuff.ex | andersju/municipality-privacy | a9d4a1ae83a0aab3e480ee8ddd35ec897cf10265 | ["MIT"] | 11 | 2016-05-31T18:20:37.000Z | 2020-05-27T21:26:21.000Z | lib/site_generator/add_stuff.ex | andersju/municipality-privacy | a9d4a1ae83a0aab3e480ee8ddd35ec897cf10265 | ["MIT"] | null | null | null | lib/site_generator/add_stuff.ex | andersju/municipality-privacy | a9d4a1ae83a0aab3e480ee8ddd35ec897cf10265 | ["MIT"] | 2 | 2016-06-07T20:33:10.000Z | 2019-01-12T10:59:41.000Z |
defmodule SiteGenerator.AddStuff do
import SiteGenerator.Helpers
import Sqlitex.{Query, Statement}
def start do
{:ok, db} = Sqlitex.open("data/crawl-data.sqlite")
add_columns(db, "http_requests", ~w(base_domain scheme), "TEXT")
add_columns(db, "javascript_cookies", ~w(base_domain), "TEXT")
add_columns(db, "site_visits", ~w(municipality_name scheme hsts referrer_policy score), "TEXT")
add_columns(db, "site_visits", ~w(first_party_persistent_cookies third_party_persistent_cookies
first_party_session_cookies third_party_session_cookies
third_party_requests insecure_requests), "INTEGER")
add_domain_meta(db)
add_cookie_meta(db)
update_sites(db)
Sqlitex.close(db)
end
defp add_columns(db, table, columns_to_add, type) do
existing_columns =
db
|> Sqlitex.query!("PRAGMA table_info(#{table})")
|> MapSet.new(fn(x) -> x[:name] end)
columns_to_add
|> MapSet.new
|> MapSet.difference(existing_columns)
|> Enum.each(fn(column) ->
IO.puts "Adding column #{column} #{type} to table #{table}"
Sqlitex.exec(db, "ALTER TABLE #{table} ADD COLUMN #{column} #{type}")
end)
end
# TODO: Let OpenWPM do this instead.
defp add_domain_meta(db) do
IO.puts "Setting base_domain and scheme in all rows in http_requests"
db
|> query!("SELECT id, url FROM http_requests")
|> Enum.each(fn(x) ->
scheme = URI.parse(x[:url]).scheme
prepare!(db, "UPDATE http_requests SET base_domain = ?, scheme = ? WHERE id = ?")
|> bind_values!([get_base_domain(x[:url]), scheme, x[:id]])
|> exec!
end)
end
# TODO: Let OpenWPM do this instead.
defp add_cookie_meta(db) do
IO.puts "Setting base_domain in all rows in javascript_cookies"
db
|> query!("SELECT id, raw_host FROM javascript_cookies")
|> Enum.each(fn(x) ->
prepare!(db, "UPDATE javascript_cookies SET base_domain = ? WHERE id = ?")
|> bind_values!([PublicSuffix.registrable_domain(x[:raw_host]), x[:id]])
|> exec!
end)
end
defp get_municipalities do
"data/municipalities.txt"
|> File.read!
|> String.split("\n")
|> List.delete("")
|> Enum.reduce(%{}, fn(x), acc ->
[url, name] = String.split(x, "|")
base_name = get_base_domain(url)
Map.put_new(acc, base_name, name)
end)
end
defp update_sites(db) do
municipalities = get_municipalities()
db
|> query!("SELECT visit_id, site_url FROM site_visits")
|> Enum.each(&(update_site(db, municipalities, &1)))
end
defp update_site(db, municipalities, [visit_id: visit_id, site_url: site_url]) do
IO.puts "Updating site #{site_url} (visit_id #{visit_id})"
base_domain = get_base_domain(site_url)
scheme = URI.parse(site_url).scheme
third_party_requests = get_third_party_requests(db, visit_id, base_domain)
{insecure_requests, _secure_requests} = get_request_types(db, visit_id)
headers = get_headers(db, visit_id)
referrer_policy = get_referrer_policy(site_url, headers)
hsts_header = Map.get(headers, "strict-transport-security", 0)
{persistent_cookies_first, persistent_cookies_third} = get_persistent_cookie_count(db, visit_id, base_domain)
{session_cookies_first, session_cookies_third} = get_session_cookie_count(db, visit_id, base_domain)
# It could be that the HSTS header is set but with value max-age=0, which
# "signals the UA to cease regarding the host as a Known HSTS Host" (RFC 6797 6.1.1)
# In that case, we set hsts to 0; otherwise we set the actual value.
hsts =
cond do
hsts_header !== 0 && hsts_header =~ ~r/max-age=0/i -> 0
true -> hsts_header
end
score =
cond do
scheme == "https" && persistent_cookies_first == 0
&& persistent_cookies_third == 0
&& String.contains?(referrer_policy, ["no-referrer", "never"])
&& hsts !== 0 && third_party_requests == 0 && insecure_requests == 0
-> "a"
scheme == "https" && persistent_cookies_third == 0
&& third_party_requests == 0
&& insecure_requests == 0
-> "b"
(scheme == "https" && third_party_requests > 0 && insecure_requests == 0)
|| (third_party_requests == 0)
-> "c"
(scheme == "https" && insecure_requests)
|| (persistent_cookies_third == 0)
-> "d"
true
-> "e"
end
db
|> prepare!(
"UPDATE site_visits
SET municipality_name = ?,
scheme = ?,
hsts = ?,
first_party_persistent_cookies = ?,
third_party_persistent_cookies = ?,
first_party_session_cookies = ?,
third_party_session_cookies = ?,
third_party_requests = ?,
insecure_requests = ?,
referrer_policy = ?,
score = ?
WHERE visit_id = ?")
|> bind_values!([municipalities[base_domain],
scheme,
hsts,
persistent_cookies_first,
persistent_cookies_third,
session_cookies_first,
session_cookies_third,
third_party_requests,
insecure_requests,
referrer_policy,
score,
visit_id])
|> exec!
end
  # Returns tuple with number of first-party persistent cookies and
  # number of third-party persistent cookies.
defp get_persistent_cookie_count(db, visit_id, base_domain) do
db
|> prepare!("SELECT *
FROM javascript_cookies
WHERE visit_id = ?
AND is_session = 0
AND change = 'added'
GROUP BY host, name")
|> bind_values!([visit_id])
|> fetch_all!
    |> Enum.split_with(&(&1[:base_domain] == base_domain))
    |> (fn {first_party, third_party} -> {Enum.count(first_party), Enum.count(third_party)} end).()
end
# Returns tuple with number of first-party session cookies and number
# of third-party session cookies.
defp get_session_cookie_count(db, visit_id, base_domain) do
db
|> prepare!("SELECT DISTINCT host, name, base_domain
FROM javascript_cookies
WHERE visit_id = ?
AND is_session = 1
AND change = 'added'
GROUP BY host, name")
|> bind_values!([visit_id])
|> fetch_all!
    |> Enum.split_with(&(&1[:base_domain] == base_domain))
    |> (fn {first_party, third_party} -> {Enum.count(first_party), Enum.count(third_party)} end).()
end
# Returns tuple with number of insecure and number of secure
# requests.
defp get_request_types(db, visit_id) do
db
|> prepare!(
"SELECT
(SELECT count(DISTINCT id)
FROM http_requests
WHERE visit_id = ?
AND scheme = 'http')
AS insecure,
(SELECT count(DISTINCT id)
FROM http_requests
WHERE visit_id = ?
AND scheme = 'https')
AS secure")
|> bind_values!([visit_id, visit_id])
|> fetch_all!
|> List.first
|> Keyword.values
|> List.to_tuple
end
# Returns integer with number of third-party requests.
defp get_third_party_requests(db, visit_id, base_domain) do
db
|> prepare!(
"SELECT count(DISTINCT base_domain) AS count
FROM http_requests
WHERE visit_id = ?
AND base_domain != ?")
|> bind_values!([visit_id, base_domain])
|> fetch_all!
|> List.first
|> Keyword.get(:count)
end
# Returns map with HTTP headers (in lowercase) and values from the
# first response from the given site. %{"connection" => "Keep-Alive", ..}
defp get_headers(db, visit_id) do
db
|> prepare!("SELECT headers FROM http_responses WHERE visit_id = ? LIMIT 1")
|> bind_values!([visit_id])
|> fetch_all!
|> List.first
|> Keyword.get(:headers)
|> Poison.decode!
|> Enum.into(%{}, fn([key, value]) -> {String.downcase(key), value} end)
end
defp get_referrer_policy(site_url, headers) do
csp_referrer = check_csp_referrer(headers)
referrer_header = check_referrer_header(headers)
meta_referrer = check_meta_referrer(site_url)
# Precedence in Firefox 50
cond do
meta_referrer -> meta_referrer
csp_referrer -> csp_referrer
referrer_header -> referrer_header
true -> nil
end
end
defp check_csp_referrer(headers) do
if Map.has_key?(headers, "content-security-policy") do
case Regex.run(~r/\breferrer ([\w-]+)\b/, headers["content-security-policy"]) do
[_, value] -> value
nil -> nil
end
else
nil
end
end
defp check_referrer_header(headers) do
if Map.has_key?(headers, "referrer-policy") do
case Regex.run(~r/^([\w-]+)$/i, headers["referrer-policy"]) do
[_, value] -> value
nil -> nil
end
else
nil
end
end
defp check_meta_referrer(site_url) do
site_url
|> HTTPoison.get!([], recv_timeout: 30_000)
|> Map.get(:body)
|> Floki.find("meta[name='referrer']")
|> Floki.attribute("content")
|> List.to_string
|> String.downcase
|> case do
"" -> nil
value -> value
end
end
end
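The grade ladder in `update_site/3` above is easiest to follow in isolation. Below is a minimal, self-contained sketch of the same `cond`, fed with hypothetical visit metrics — the `grade` helper and every input value are illustrative only, not part of the crawler:

```elixir
# Mirrors the scoring cond in update_site/3; all inputs are invented.
grade = fn scheme, p1_cookies, p3_cookies, referrer_policy, hsts, third_party, insecure ->
  cond do
    scheme == "https" && p1_cookies == 0 && p3_cookies == 0 &&
      String.contains?(referrer_policy, ["no-referrer", "never"]) &&
        hsts !== 0 && third_party == 0 && insecure == 0 ->
      "a"
    scheme == "https" && p3_cookies == 0 && third_party == 0 && insecure == 0 -> "b"
    (scheme == "https" && third_party > 0 && insecure == 0) || third_party == 0 -> "c"
    (scheme == "https" && insecure > 0) || p3_cookies == 0 -> "d"
    true -> "e"
  end
end

# A fully locked-down HTTPS site earns an "a"; a leaky HTTP site falls to "e".
IO.inspect(grade.("https", 0, 0, "no-referrer", "max-age=31536000", 0, 0))  # => "a"
IO.inspect(grade.("http", 2, 5, "", 0, 12, 30))                             # => "e"
```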
# File: lib/git.ex (repo: fivetwentysix/elixir-git-cli, license: MIT)
defmodule Git do
@type error :: {:error, Git.Error}
@type cli_arg :: String.t() | [String.t()]
@type path :: String.t()
defp get_repo_path(args) when not is_list(args), do: get_repo_path([args])
defp get_repo_path(args) when is_list(args) do
{_options, positional, _rest} = OptionParser.parse(args, strict: [])
case positional do
[_url, path] -> path
[url] -> url |> Path.basename() |> Path.rootname()
_ -> raise "invalid arguments for clones: #{inspect(args)}"
end
end
@doc """
Clones the repository. The first argument can be `url` or `[url, path]`.
Returns `{:ok, repository}` on success and `{:error, reason}` on failure.
"""
@spec clone(cli_arg) :: {:ok, Git.Repository.t()} | error
def clone(args) do
execute_command(nil, "clone", args, fn _ ->
path = args |> get_repo_path() |> Path.expand()
{:ok, %Git.Repository{path: path}}
end)
end
@doc """
  Same as clone/1 but raises an exception on failure.
"""
@spec clone!(cli_arg) :: Git.Repository.t()
def clone!(args), do: result_or_fail(clone(args))
@spec init() :: {:ok, Git.Repository.t()} | error
@spec init(Git.Repository.t()) :: {:ok, Git.Repository.t()} | error
@spec init(cli_arg) :: {:ok, Git.Repository.t()} | error
def init(), do: init([])
def init(repo = %Git.Repository{}) do
Git.execute_command(repo, "init", [], fn _ -> {:ok, repo} end)
end
def init(args) do
execute_command(nil, "init", args, fn _ ->
args = if is_list(args), do: args, else: [args]
path = (Enum.at(args, 0) || ".") |> Path.expand()
{:ok, %Git.Repository{path: path}}
end)
end
@doc """
Run `git init` in the given directory
Returns `{:ok, repository}` on success and `{:error, reason}` on failure.
"""
@spec init!(cli_arg) :: Git.Repository.t()
@spec init!() :: Git.Repository.t()
def init!(args \\ []), do: result_or_fail(init(args))
commands =
File.read!(Path.join(__DIR__, "../git-commands.txt"))
|> String.split("\n")
|> Enum.filter(fn x ->
trim = if function_exported?(String, :trim, 1), do: :trim, else: :strip
x = apply(String, trim, [x])
not (String.length(x) == 0 or String.starts_with?(x, "#"))
end)
Enum.each(commands, fn name ->
normalized_name = String.to_atom(String.replace(name, "-", "_"))
bang_name = String.to_atom("#{normalized_name}!")
@doc """
Run `git #{name}` in the given repository
Returns `{:ok, output}` on success and `{:error, reason}` on failure.
"""
@spec unquote(normalized_name)(Git.Repository.t(), cli_arg) :: {:ok, binary} | error
def unquote(normalized_name)(repository, args \\ []) do
execute_command(repository, unquote(name), args, fn n -> {:ok, n} end)
end
@doc """
Same as `#{normalized_name}/2` but raises an exception on error.
"""
@spec unquote(bang_name)(Git.Repository.t(), cli_arg) :: binary | no_return
def unquote(bang_name)(repository, args \\ []) do
result_or_fail(unquote(normalized_name)(repository, args))
end
end)
@doc """
Return a Git.Repository struct with the specified or defaulted path.
For use with an existing repo (when Git.init and Git.clone would not be appropriate).
"""
@spec new(path) :: Git.Repository.t()
def new(path \\ "."), do: %Git.Repository{path: path}
@doc """
Execute the git command in the given repository.
"""
@spec execute_command(
Git.Repository.t() | nil,
String.t(),
cli_arg,
(String.t() -> {:ok, any} | error)
) :: {:ok, any} | {:error, any}
def execute_command(repo, command, args, callback) when is_list(args) do
options =
case repo do
nil -> [stderr_to_stdout: true]
_ -> [stderr_to_stdout: true, cd: repo.path]
end
case System.cmd("git", [command | args], options) do
{output, 0} -> callback.(output)
{err, code} -> {:error, %Git.Error{message: err, command: command, args: args, code: code}}
end
end
def execute_command(repo, command, args, callback) do
execute_command(repo, command, [args], callback)
end
@spec result_or_fail({:ok, t}) :: t when t: Git.Repository.t() | String.t()
defp result_or_fail({:ok, res}), do: res
defp result_or_fail({:error, res}), do: raise(res)
end
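`clone/1` above derives the checkout directory from the URL when only a URL is given. A small standalone sketch of that derivation follows; the URL is an arbitrary example:

```elixir
# Mirrors the single-argument branch of get_repo_path/1: take the URL's
# basename and strip its extension (typically ".git").
url = "https://github.com/fivetwentysix/elixir-git-cli.git"
dir = url |> Path.basename() |> Path.rootname()
IO.puts(dir)  # => elixir-git-cli

# Typical API usage (needs git on PATH, so it is not executed here):
#   {:ok, repo} = Git.clone(url)
#   Git.log!(repo, ["--oneline", "-5"])
```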
# File: test/membrane_caps_audio_raw_test.exs (repo: membraneframework/membrane-caps-audio-raw, license: Apache-2.0)
defmodule Membrane.RawAudioTest do
use ExUnit.Case, async: true
alias Membrane.Caps.Audio.Raw, as: RawAudio
@all_formats [
:s8,
:u8,
:s16le,
:u16le,
:s16be,
:u16be,
:s24le,
:u24le,
:s24be,
:u24be,
:s32le,
:u32le,
:s32be,
:u32be,
:f32le,
:f32be,
:f64le,
:f64be
]
defp format_to_caps(format) do
%RawAudio{format: format, channels: 2, sample_rate: 44_100}
end
test "sample_size/1" do
sizes = [1, 1, 2, 2, 2, 2, 3, 3, 3, 3, 4, 4, 4, 4, 4, 4, 8, 8]
assert length(@all_formats) == length(sizes)
@all_formats
|> Enum.map(&format_to_caps/1)
|> Enum.zip(sizes)
|> Enum.each(fn {caps, size} ->
assert RawAudio.sample_size(caps) == size
end)
end
test "frame_size/1" do
sizes = [2, 2, 4, 4, 4, 4, 6, 6, 6, 6, 8, 8, 8, 8, 8, 8, 16, 16]
assert length(@all_formats) == length(sizes)
@all_formats
|> Enum.map(&format_to_caps/1)
|> Enum.zip(sizes)
|> Enum.each(fn {caps, size} ->
assert RawAudio.frame_size(caps) == size
end)
end
@float_formats [:f32be, :f32le, :f64le, :f64be]
@non_float_caps [
:s8,
:u8,
:s16le,
:u16le,
:s16be,
:u16be,
:s24le,
:u24le,
:s24be,
:u24be,
:s32le,
:u32le,
:s32be,
:u32be
]
test "sample_type_float?" do
@float_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> assert RawAudio.sample_type_float?(caps) == true end)
@non_float_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> assert RawAudio.sample_type_float?(caps) == false end)
end
test "sample_type_int?" do
@float_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> assert RawAudio.sample_type_int?(caps) == false end)
@non_float_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> assert RawAudio.sample_type_int?(caps) == true end)
end
@little_endian_caps [
:s16le,
:u16le,
:s24le,
:u24le,
:s32le,
:u32le,
:f32le,
:f64le
]
@big_endian_caps [
:s16be,
:u16be,
:s24be,
:u24be,
:s32be,
:u32be,
:f32be,
:f64be
]
@one_byte_caps [:s8, :u8]
test "little_endian?" do
@little_endian_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.little_endian?(caps) == true end)
@big_endian_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.little_endian?(caps) == false end)
@one_byte_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.little_endian?(caps) == true end)
end
test "big_endian?" do
@little_endian_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.big_endian?(caps) == false end)
@big_endian_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.big_endian?(caps) == true end)
@one_byte_caps
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.big_endian?(caps) == true end)
end
@signed_formats [
:s8,
:s16le,
:s16be,
:s24le,
:s24be,
:s32le,
:s32be
]
@unsigned_formats [
:u8,
:u16le,
:u16be,
:u24le,
:u24be,
:u32le,
:u32be
]
test "signed?" do
@signed_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.signed?(caps) == true end)
@unsigned_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.signed?(caps) == false end)
@float_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.signed?(caps) == true end)
end
test "unsigned?/1" do
@signed_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.unsigned?(caps) == false end)
@unsigned_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.unsigned?(caps) == true end)
@float_formats
|> Enum.map(&format_to_caps/1)
|> Enum.each(fn caps -> RawAudio.unsigned?(caps) == false end)
end
@example_value 42
defp assert_value_to_sample(format, result) do
assert RawAudio.value_to_sample(@example_value, format_to_caps(format)) == result
end
test "value_to_sample/2" do
assert_value_to_sample(:s8, <<@example_value>>)
assert_value_to_sample(:u8, <<@example_value>>)
assert_value_to_sample(:s16le, <<@example_value, 0>>)
assert_value_to_sample(:s16le, <<@example_value, 0>>)
assert_value_to_sample(:u16be, <<0, @example_value>>)
assert_value_to_sample(:u16be, <<0, @example_value>>)
assert_value_to_sample(:s24le, <<@example_value, 0, 0>>)
assert_value_to_sample(:s24be, <<0, 0, @example_value>>)
assert_value_to_sample(:u24le, <<@example_value, 0, 0>>)
assert_value_to_sample(:u24be, <<0, 0, @example_value>>)
assert_value_to_sample(:s32le, <<@example_value, 0, 0, 0>>)
assert_value_to_sample(:s32be, <<0, 0, 0, @example_value>>)
assert_value_to_sample(:u32le, <<@example_value, 0, 0, 0>>)
assert_value_to_sample(:u32be, <<0, 0, 0, @example_value>>)
end
defp assert_value_to_sample_check_overflow_ok(format, result) do
assert RawAudio.value_to_sample_check_overflow(@example_value, format_to_caps(format)) ==
{:ok, result}
end
test "value_to_sample_check_overflow/2 with value in range" do
assert_value_to_sample_check_overflow_ok(:s8, <<@example_value>>)
assert_value_to_sample_check_overflow_ok(:u8, <<@example_value>>)
assert_value_to_sample_check_overflow_ok(:s16le, <<@example_value, 0>>)
assert_value_to_sample_check_overflow_ok(:s16le, <<@example_value, 0>>)
assert_value_to_sample_check_overflow_ok(:u16be, <<0, @example_value>>)
assert_value_to_sample_check_overflow_ok(:u16be, <<0, @example_value>>)
assert_value_to_sample_check_overflow_ok(:s24le, <<@example_value, 0, 0>>)
assert_value_to_sample_check_overflow_ok(:s24be, <<0, 0, @example_value>>)
assert_value_to_sample_check_overflow_ok(:u24le, <<@example_value, 0, 0>>)
assert_value_to_sample_check_overflow_ok(:u24be, <<0, 0, @example_value>>)
assert_value_to_sample_check_overflow_ok(:s32le, <<@example_value, 0, 0, 0>>)
assert_value_to_sample_check_overflow_ok(:s32be, <<0, 0, 0, @example_value>>)
assert_value_to_sample_check_overflow_ok(:u32le, <<@example_value, 0, 0, 0>>)
assert_value_to_sample_check_overflow_ok(:u32be, <<0, 0, 0, @example_value>>)
end
defp assert_value_to_sample_check_overflow_error(value, format) do
assert RawAudio.value_to_sample_check_overflow(value, format_to_caps(format)) ==
{:error, :overflow}
end
test "value_to_sample_check_overflow/2 when value is not in valid range" do
assert_value_to_sample_check_overflow_error(257, :u8)
assert_value_to_sample_check_overflow_error(-1, :u8)
assert_value_to_sample_check_overflow_error(129, :s8)
assert_value_to_sample_check_overflow_error(-129, :s8)
assert_value_to_sample_check_overflow_error(65_536, :u16le)
assert_value_to_sample_check_overflow_error(-1, :u16le)
assert_value_to_sample_check_overflow_error(32_768, :s16le)
assert_value_to_sample_check_overflow_error(-32_769, :s16le)
assert_value_to_sample_check_overflow_error(:math.pow(2, 23), :s24le)
assert_value_to_sample_check_overflow_error(-(:math.pow(2, 23) + 1), :s24be)
assert_value_to_sample_check_overflow_error(:math.pow(2, 24), :u24le)
assert_value_to_sample_check_overflow_error(-1, :u24be)
assert_value_to_sample_check_overflow_error(:math.pow(2, 31), :s32le)
assert_value_to_sample_check_overflow_error(-(:math.pow(2, 31) + 1), :s32be)
assert_value_to_sample_check_overflow_error(:math.pow(2, 32), :u32le)
assert_value_to_sample_check_overflow_error(-1, :u32be)
end
  defp assert_sample_to_value_ok(sample, format, value) do
    assert RawAudio.sample_to_value(sample, format |> format_to_caps) == {:ok, value}
  end
test "sample_to_value/2" do
value = -123
assert_sample_to_value_ok(<<value::integer-unit(8)-size(1)-signed>>, :s8, value)
value = 123
assert_sample_to_value_ok(<<value::integer-unit(8)-size(1)-unsigned>>, :u8, value)
value = -1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(2)-little-signed>>, :s16le, value)
value = -1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(4)-little-signed>>, :s32le, value)
value = 1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(2)-little-unsigned>>, :u16le, value)
value = 1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(4)-little-unsigned>>, :u32le, value)
value = 1234.0
assert_sample_to_value_ok(<<value::float-unit(8)-size(4)-little>>, :f32le, value)
value = -1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(2)-big-signed>>, :s16be, value)
value = -1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(4)-big-signed>>, :s32be, value)
value = 1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(2)-big-unsigned>>, :u16be, value)
value = 1234
assert_sample_to_value_ok(<<value::integer-unit(8)-size(4)-big-unsigned>>, :u32be, value)
value = 1234.0
assert_sample_to_value_ok(<<value::float-unit(8)-size(4)-big>>, :f32be, value)
end
test "sample_min/1" do
[
{:s8, -128},
{:u8, 0},
{:s16le, -32_768},
{:s32le, -2_147_483_648},
{:u16le, 0},
{:u32le, 0},
{:s16be, -32_768},
{:s32be, -2_147_483_648},
{:u16be, 0},
{:u32be, 0},
{:f32le, -1.0},
{:f32be, -1.0},
{:f64le, -1.0},
{:f64be, -1.0}
]
|> Enum.each(fn {format, min_sample} ->
assert RawAudio.sample_min(format |> format_to_caps()) == min_sample
end)
end
test "sample_max/1" do
[
{:s8, 127},
{:u8, 255},
{:s16le, 32_767},
{:s32le, 2_147_483_647},
{:u16le, 65_535},
{:u32le, 4_294_967_295},
{:s16be, 32_767},
{:s32be, 2_147_483_647},
{:u16be, 65_535},
{:u32be, 4_294_967_295},
{:f32le, 1.0},
{:f32be, 1.0},
{:f64le, 1.0},
{:f64be, 1.0}
]
|> Enum.each(fn {format, max_sample} ->
assert RawAudio.sample_max(format |> format_to_caps()) == max_sample
end)
end
end
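The `frame_size/1` expectations above are just `sample_size * channels` for the two-channel caps used throughout this file. A quick illustrative check — the byte sizes below are copied from the `sample_size/1` test, and the map is purely for demonstration:

```elixir
# frame size = bytes per sample * number of channels
channels = 2
sample_bytes = %{s8: 1, u8: 1, s16le: 2, s24be: 3, s32le: 4, f32le: 4, f64be: 8}
frame_bytes = for {fmt, bytes} <- sample_bytes, into: %{}, do: {fmt, bytes * channels}
IO.inspect(frame_bytes[:s24be])  # => 6
IO.inspect(frame_bytes[:f64be])  # => 16
```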
# File: clients/private_ca/lib/google_api/private_ca/v1/model/certificate_description.ex (repo: renovate-bot/elixir-google-api, license: Apache-2.0)
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.PrivateCA.V1.Model.CertificateDescription do
@moduledoc """
A CertificateDescription describes an X.509 certificate or CSR that has been issued, as an alternative to using ASN.1 / X.509.
## Attributes
* `aiaIssuingCertificateUrls` (*type:* `list(String.t)`, *default:* `nil`) - Describes lists of issuer CA certificate URLs that appear in the "Authority Information Access" extension in the certificate.
* `authorityKeyId` (*type:* `GoogleApi.PrivateCA.V1.Model.KeyId.t`, *default:* `nil`) - Identifies the subject_key_id of the parent certificate, per https://tools.ietf.org/html/rfc5280#section-4.2.1.1
* `certFingerprint` (*type:* `GoogleApi.PrivateCA.V1.Model.CertificateFingerprint.t`, *default:* `nil`) - The hash of the x.509 certificate.
* `crlDistributionPoints` (*type:* `list(String.t)`, *default:* `nil`) - Describes a list of locations to obtain CRL information, i.e. the DistributionPoint.fullName described by https://tools.ietf.org/html/rfc5280#section-4.2.1.13
* `publicKey` (*type:* `GoogleApi.PrivateCA.V1.Model.PublicKey.t`, *default:* `nil`) - The public key that corresponds to an issued certificate.
* `subjectDescription` (*type:* `GoogleApi.PrivateCA.V1.Model.SubjectDescription.t`, *default:* `nil`) - Describes some of the values in a certificate that are related to the subject and lifetime.
  * `subjectKeyId` (*type:* `GoogleApi.PrivateCA.V1.Model.KeyId.t`, *default:* `nil`) - Provides a means of identifying certificates that contain a particular public key, per https://tools.ietf.org/html/rfc5280#section-4.2.1.2.
* `x509Description` (*type:* `GoogleApi.PrivateCA.V1.Model.X509Parameters.t`, *default:* `nil`) - Describes some of the technical X.509 fields in a certificate.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:aiaIssuingCertificateUrls => list(String.t()) | nil,
:authorityKeyId => GoogleApi.PrivateCA.V1.Model.KeyId.t() | nil,
:certFingerprint => GoogleApi.PrivateCA.V1.Model.CertificateFingerprint.t() | nil,
:crlDistributionPoints => list(String.t()) | nil,
:publicKey => GoogleApi.PrivateCA.V1.Model.PublicKey.t() | nil,
:subjectDescription => GoogleApi.PrivateCA.V1.Model.SubjectDescription.t() | nil,
:subjectKeyId => GoogleApi.PrivateCA.V1.Model.KeyId.t() | nil,
:x509Description => GoogleApi.PrivateCA.V1.Model.X509Parameters.t() | nil
}
field(:aiaIssuingCertificateUrls, type: :list)
field(:authorityKeyId, as: GoogleApi.PrivateCA.V1.Model.KeyId)
field(:certFingerprint, as: GoogleApi.PrivateCA.V1.Model.CertificateFingerprint)
field(:crlDistributionPoints, type: :list)
field(:publicKey, as: GoogleApi.PrivateCA.V1.Model.PublicKey)
field(:subjectDescription, as: GoogleApi.PrivateCA.V1.Model.SubjectDescription)
field(:subjectKeyId, as: GoogleApi.PrivateCA.V1.Model.KeyId)
field(:x509Description, as: GoogleApi.PrivateCA.V1.Model.X509Parameters)
end
defimpl Poison.Decoder, for: GoogleApi.PrivateCA.V1.Model.CertificateDescription do
def decode(value, options) do
GoogleApi.PrivateCA.V1.Model.CertificateDescription.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.PrivateCA.V1.Model.CertificateDescription do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# File: clients/firebase/lib/google_api/firebase/v1beta1/model/analytics_details.ex (repo: medikent/elixir-google-api, license: Apache-2.0)
# Copyright 2019 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.Firebase.V1beta1.Model.AnalyticsDetails do
@moduledoc """
## Attributes
* `analyticsProperty` (*type:* `GoogleApi.Firebase.V1beta1.Model.AnalyticsProperty.t`, *default:* `nil`) - The Analytics Property object associated with the specified
`FirebaseProject`.
<br>
<br>This object contains the details of the Google Analytics property
associated with the specified `FirebaseProject`.
* `streamMappings` (*type:* `list(GoogleApi.Firebase.V1beta1.Model.StreamMapping.t)`, *default:* `nil`) - For Android Apps and iOS Apps: A map of `app` to `streamId` for each
Firebase App in the specified `FirebaseProject`. Each `app` and
`streamId` appears only once.<br>
<br>
For Web Apps: A map of `app` to `streamId` and `measurementId` for each
Firebase App in the specified `FirebaseProject`. Each `app`, `streamId`,
and `measurementId` appears only once.
"""
use GoogleApi.Gax.ModelBase
@type t :: %__MODULE__{
:analyticsProperty => GoogleApi.Firebase.V1beta1.Model.AnalyticsProperty.t(),
:streamMappings => list(GoogleApi.Firebase.V1beta1.Model.StreamMapping.t())
}
field(:analyticsProperty, as: GoogleApi.Firebase.V1beta1.Model.AnalyticsProperty)
field(:streamMappings, as: GoogleApi.Firebase.V1beta1.Model.StreamMapping, type: :list)
end
defimpl Poison.Decoder, for: GoogleApi.Firebase.V1beta1.Model.AnalyticsDetails do
def decode(value, options) do
GoogleApi.Firebase.V1beta1.Model.AnalyticsDetails.decode(value, options)
end
end
defimpl Poison.Encoder, for: GoogleApi.Firebase.V1beta1.Model.AnalyticsDetails do
def encode(value, options) do
GoogleApi.Gax.ModelBase.encode(value, options)
end
end
# File: lib/groupher_server/cms/cms.ex (repo: coderplanets/coderplanets_server, license: Apache-2.0)
defmodule GroupherServer.CMS do
@moduledoc """
  This module defines basic methods to handle [CMS] content [CURD].
[CMS]: post, job, ...
[CURD]: create, update, delete ...
"""
alias GroupherServer.CMS.Delegate
alias Delegate.{
AbuseReport,
ArticleCURD,
BlogCURD,
WorksCURD,
ArticleCommunity,
ArticleEmotion,
CitedArtiment,
CommentCurd,
ArticleCollect,
ArticleUpvote,
CommentAction,
CommentEmotion,
ArticleTag,
CommunitySync,
CommunityCURD,
CommunityOperation,
PassportCURD,
Search,
Seeds
}
# do not pattern match in delegating func, do it on one delegating inside
# see https://github.com/elixir-lang/elixir/issues/5306
# Community CURD: editors, thread, tag
defdelegate read_community(args), to: CommunityCURD
defdelegate read_community(args, user), to: CommunityCURD
defdelegate create_community(args), to: CommunityCURD
defdelegate update_community(id, args), to: CommunityCURD
# >> editor ..
defdelegate update_editor(user, community, title), to: CommunityCURD
# >> geo info ..
defdelegate community_geo_info(community), to: CommunityCURD
# >> subscribers / editors
defdelegate community_members(type, community, filters), to: CommunityCURD
# >> category
defdelegate create_category(category_attrs, user), to: CommunityCURD
defdelegate update_category(category_attrs), to: CommunityCURD
# >> thread
defdelegate create_thread(attrs), to: CommunityCURD
defdelegate count(community, part), to: CommunityCURD
# >> tag
defdelegate create_article_tag(community, thread, attrs, user), to: ArticleTag
defdelegate update_article_tag(tag_id, attrs), to: ArticleTag
defdelegate delete_article_tag(tag_id), to: ArticleTag
defdelegate set_article_tag(thread, article_id, tag_id), to: ArticleTag
defdelegate unset_article_tag(thread, article_id, tag_id), to: ArticleTag
defdelegate paged_article_tags(filter), to: ArticleTag
# >> wiki & cheatsheet (sync with github)
defdelegate get_wiki(community), to: CommunitySync
defdelegate get_cheatsheet(community), to: CommunitySync
defdelegate sync_github_content(community, thread, attrs), to: CommunitySync
defdelegate add_contributor(content, attrs), to: CommunitySync
# CommunityOperation
# >> category
defdelegate set_category(community, category), to: CommunityOperation
defdelegate unset_category(community, category), to: CommunityOperation
# >> editor
defdelegate set_editor(community, title, user), to: CommunityOperation
defdelegate unset_editor(community, user), to: CommunityOperation
# >> thread
defdelegate set_thread(community, thread), to: CommunityOperation
defdelegate unset_thread(community, thread), to: CommunityOperation
# >> subscribe / unsubscribe
defdelegate subscribe_community(community, user), to: CommunityOperation
defdelegate subscribe_community(community, user, remote_ip), to: CommunityOperation
defdelegate unsubscribe_community(community, user), to: CommunityOperation
defdelegate unsubscribe_community(community, user, remote_ip), to: CommunityOperation
defdelegate subscribe_default_community_ifnot(user, remote_ip), to: CommunityOperation
defdelegate subscribe_default_community_ifnot(user), to: CommunityOperation
# ArticleCURD
defdelegate read_article(thread, id), to: ArticleCURD
defdelegate read_article(thread, id, user), to: ArticleCURD
defdelegate paged_articles(queryable, filter), to: ArticleCURD
defdelegate paged_articles(queryable, filter, user), to: ArticleCURD
defdelegate paged_published_articles(queryable, filter, user), to: ArticleCURD
defdelegate create_article(community, thread, attrs, user), to: ArticleCURD
defdelegate update_article(article, attrs), to: ArticleCURD
defdelegate mark_delete_article(thread, id), to: ArticleCURD
defdelegate undo_mark_delete_article(thread, id), to: ArticleCURD
defdelegate delete_article(article), to: ArticleCURD
defdelegate delete_article(article, reason), to: ArticleCURD
defdelegate update_active_timestamp(thread, article), to: ArticleCURD
defdelegate sink_article(thread, id), to: ArticleCURD
defdelegate undo_sink_article(thread, id), to: ArticleCURD
defdelegate archive_articles(thread), to: ArticleCURD
defdelegate create_blog(community, attrs, user), to: BlogCURD
defdelegate create_blog_rss(attrs), to: BlogCURD
defdelegate update_blog_rss(attrs), to: BlogCURD
defdelegate blog_rss_info(rss), to: BlogCURD
defdelegate create_works(attrs, user), to: WorksCURD
defdelegate update_works(attrs, user), to: WorksCURD
defdelegate paged_citing_contents(type, id, filter), to: CitedArtiment
defdelegate upvote_article(thread, article_id, user), to: ArticleUpvote
defdelegate undo_upvote_article(thread, article_id, user), to: ArticleUpvote
defdelegate upvoted_users(thread, article_id, filter), to: ArticleUpvote
defdelegate collect_article(thread, article_id, user), to: ArticleCollect
defdelegate collect_article_ifneed(thread, article_id, user), to: ArticleCollect
defdelegate undo_collect_article(thread, article_id, user), to: ArticleCollect
defdelegate undo_collect_article_ifneed(thread, article_id, user), to: ArticleCollect
defdelegate collected_users(thread, article_id, filter), to: ArticleCollect
defdelegate set_collect_folder(collect, folder), to: ArticleCollect
defdelegate undo_set_collect_folder(collect, folder), to: ArticleCollect
# ArticleCommunity
# >> set flag on article, like: pin / unpin article
defdelegate pin_article(thread, id, community_id), to: ArticleCommunity
defdelegate undo_pin_article(thread, id, community_id), to: ArticleCommunity
# >> community: set / unset
defdelegate mirror_article(thread, article_id, community_id), to: ArticleCommunity
defdelegate mirror_article(thread, article_id, community_id, article_ids), to: ArticleCommunity
defdelegate unmirror_article(thread, article_id, community_id), to: ArticleCommunity
defdelegate move_article(thread, article_id, community_id), to: ArticleCommunity
defdelegate move_article(thread, article_id, community_id, article_ids), to: ArticleCommunity
defdelegate move_to_blackhole(thread, article_id, article_ids), to: ArticleCommunity
defdelegate move_to_blackhole(thread, article_id), to: ArticleCommunity
defdelegate mirror_to_home(thread, article_id, article_ids), to: ArticleCommunity
defdelegate mirror_to_home(thread, article_id), to: ArticleCommunity
defdelegate emotion_to_article(thread, article_id, args, user), to: ArticleEmotion
defdelegate undo_emotion_to_article(thread, article_id, args, user), to: ArticleEmotion
# Comment CURD
defdelegate comments_state(thread, article_id), to: CommentCurd
defdelegate comments_state(thread, article_id, user), to: CommentCurd
defdelegate one_comment(id), to: CommentCurd
defdelegate one_comment(id, user), to: CommentCurd
defdelegate update_user_in_comments_participants(user), to: CommentCurd
defdelegate paged_comments(thread, article_id, filters, mode), to: CommentCurd
defdelegate paged_comments(thread, article_id, filters, mode, user), to: CommentCurd
defdelegate paged_published_comments(user, thread, filters), to: CommentCurd
defdelegate paged_published_comments(user, filters), to: CommentCurd
defdelegate paged_folded_comments(thread, article_id, filters), to: CommentCurd
defdelegate paged_folded_comments(thread, article_id, filters, user), to: CommentCurd
defdelegate paged_comment_replies(comment_id, filters), to: CommentCurd
defdelegate paged_comment_replies(comment_id, filters, user), to: CommentCurd
defdelegate paged_comments_participants(thread, content_id, filters), to: CommentCurd
defdelegate create_comment(thread, article_id, args, user), to: CommentCurd
defdelegate update_comment(comment, content), to: CommentCurd
defdelegate delete_comment(comment), to: CommentCurd
defdelegate mark_comment_solution(comment, user), to: CommentCurd
defdelegate undo_mark_comment_solution(comment, user), to: CommentCurd
defdelegate archive_comments(), to: CommentCurd
defdelegate upvote_comment(comment_id, user), to: CommentAction
defdelegate undo_upvote_comment(comment_id, user), to: CommentAction
defdelegate reply_comment(comment_id, args, user), to: CommentAction
defdelegate lock_article_comments(thread, article_id), to: CommentAction
defdelegate undo_lock_article_comments(thread, article_id), to: CommentAction
defdelegate pin_comment(comment_id), to: CommentAction
defdelegate undo_pin_comment(comment_id), to: CommentAction
defdelegate fold_comment(comment_id, user), to: CommentAction
defdelegate unfold_comment(comment_id, user), to: CommentAction
defdelegate emotion_to_comment(comment_id, args, user), to: CommentEmotion
defdelegate undo_emotion_to_comment(comment_id, args, user), to: CommentEmotion
  ###################
# TODO: move report to abuse report module
defdelegate report_article(thread, article_id, reason, attr, user), to: AbuseReport
defdelegate report_comment(comment_id, reason, attr, user), to: AbuseReport
defdelegate report_account(account_id, reason, attr, user), to: AbuseReport
defdelegate undo_report_account(account_id, user), to: AbuseReport
defdelegate undo_report_article(thread, article_id, user), to: AbuseReport
defdelegate paged_reports(filter), to: AbuseReport
defdelegate undo_report_comment(comment_id, user), to: AbuseReport
# Passport CURD
defdelegate stamp_passport(rules, user), to: PassportCURD
defdelegate erase_passport(rules, user), to: PassportCURD
defdelegate get_passport(user), to: PassportCURD
defdelegate paged_passports(community, key), to: PassportCURD
defdelegate delete_passport(user), to: PassportCURD
# search
defdelegate search_articles(thread, args), to: Search
defdelegate search_communities(args), to: Search
# seeds
defdelegate seed_communities(opt), to: Seeds
defdelegate seed_community(raw, type), to: Seeds
defdelegate seed_community(raw), to: Seeds
defdelegate seed_set_category(communities, category), to: Seeds
defdelegate seed_articles(community, type), to: Seeds
defdelegate seed_articles(community, type, count), to: Seeds
defdelegate clean_up_community(raw), to: Seeds
defdelegate clean_up_articles(community, type), to: Seeds
# defdelegate seed_bot, to: Seeds
end
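The module above is a facade: it re-exports functions from focused submodules (`CommentCurd`, `ArticleCollect`, …) so callers only ever touch one flat API. A minimal sketch of the pattern (module and function bodies below are illustrative stand-ins, not code from the original repo):

```elixir
# Sketch of the defdelegate facade pattern: a thin top-level module
# forwards calls to a focused submodule, keeping one public surface.
defmodule Demo.CommentCurd do
  # Stand-in implementation; the real CURD module would hit the DB.
  def one_comment(id), do: {:ok, %{id: id, body: "hello"}}
end

defmodule Demo.CMS do
  alias Demo.CommentCurd

  # Callers use Demo.CMS.one_comment/1; the work happens in CommentCurd.
  defdelegate one_comment(id), to: CommentCurd
end

{:ok, comment} = Demo.CMS.one_comment(1)
IO.inspect(comment.id)
```

Note that `defdelegate` generates a real function head, so the delegating module keeps its own docs, typespecs, and arity-based dispatch while the implementation lives elsewhere.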
# File: clients/security_center/lib/google_api/security_center/v1/metadata.ex
# Repo: MasashiYokota/elixir-google-api (license: Apache-2.0)
# Copyright 2020 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
# NOTE: This file is auto generated by the elixir code generator program.
# Do not edit this file manually.
defmodule GoogleApi.SecurityCenter.V1 do
@moduledoc """
API client metadata for GoogleApi.SecurityCenter.V1.
"""
@discovery_revision "20200918"
def discovery_revision(), do: @discovery_revision
end
# File: installer/templates/phx_umbrella/apps/app_name_web/config/dev.exs
# Repo: rafaelbiten/phoenix (license: MIT)
# For development, we disable any cache and enable
# debugging and code reloading.
#
# The watchers configuration can be used to run external
# watchers to your application. For example, we use it
# with webpack to recompile .js and .css sources.
config :<%= @web_app_name %>, <%= @endpoint_module %>,
# Binding to loopback ipv4 address prevents access from other machines.
# Change to `ip: {0, 0, 0, 0}` to allow access from other machines.
http: [ip: {127, 0, 0, 1}, port: 4000],
debug_errors: true,
code_reloader: true,
check_origin: false,
watchers: <%= if @webpack do %>[
node: [
"node_modules/webpack/bin/webpack.js",
"--mode",
"development",
"--watch",
"--watch-options-stdin",
cd: Path.expand("../apps/<%= @web_app_name %>/assets", __DIR__)
]
]<% else %>[]<% end %>
# ## SSL Support
#
# In order to use HTTPS in development, a self-signed
# certificate can be generated by running the following
# Mix task:
#
# mix phx.gen.cert
#
# Note that this task requires Erlang/OTP 20 or later.
# Run `mix help phx.gen.cert` for more information.
#
# The `http:` config above can be replaced with:
#
# https: [
# port: 4001,
# cipher_suite: :strong,
# keyfile: "priv/cert/selfsigned_key.pem",
# certfile: "priv/cert/selfsigned.pem"
# ],
#
# If desired, both `http:` and `https:` keys can be
# configured to run both http and https servers on
# different ports.<%= if @html do %>
# Watch static and templates for browser reloading.
config :<%= @web_app_name %>, <%= @endpoint_module %>,
live_reload: [
patterns: [
~r"priv/static/.*(js|css|png|jpeg|jpg|gif|svg)$",<%= if @gettext do %>
~r"priv/gettext/.*(po)$",<% end %>
~r"lib/<%= @web_app_name %>/(live|views)/.*(ex)$",
~r"lib/<%= @web_app_name %>/templates/.*(eex)$"
]
]<% end %>
| 31.576271 | 76 | 0.626409 |
# File: config/config.exs
# Repo: r26D/ueberauth_apple (license: MIT)
import Config